Email Attachment to Azure Data Warehouse Integration (P&L Automation)


Hourly: $85.00

We’re looking for a skilled developer to build an automated data pipeline that pulls weekly Profit & Loss (P&L) statement attachments from a designated email inbox and processes them into our Azure data environment.

Key Responsibilities:
• Monitor a specific email inbox and extract attached P&L files (Excel or CSV format)
• Land the files in Azure Blob Storage or directly into a staging database
• Normalize and transform the data into a clean, consistent structure (e.g., pivot/unpivot rows, align columns)
• Ensure the dataset is report-ready for downstream consumption
• Load transformed data into our Azure SQL Data Warehouse
• Push the cleaned data into an existing Azure Analysis Services tabular model for use in Power BI

Requirements:
• Strong experience with Azure tools (Blob Storage, Data Factory, Logic Apps, or Functions)
• Data wrangling and transformation expertise (Python, T-SQL, or ADF mapping data flows)
• Familiarity with data modeling and tabular model structures
• Ability to build fully automated and repeatable workflows

The solution should run on a weekly schedule and require zero manual intervention once deployed.
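The first step above (pulling P&L attachments out of the inbox) can be sketched with Python's standard-library email module. This is a minimal, hedged sketch: in production the message would come from an IMAP fetch or an Azure Logic Apps trigger, and the function name and sample filename here are hypothetical.

```python
from email.message import EmailMessage

def extract_pnl_attachments(msg):
    """Yield (filename, raw_bytes) for CSV/Excel attachments in a message."""
    for part in msg.iter_attachments():
        name = part.get_filename()
        # Only keep file types the pipeline expects (Excel or CSV).
        if name and name.lower().endswith((".csv", ".xlsx", ".xls")):
            yield name, part.get_payload(decode=True)

# Build a sample message to demonstrate the extraction end to end.
msg = EmailMessage()
msg["Subject"] = "Weekly P&L"
msg.set_content("See attached.")
msg.add_attachment(b"Account,Jan\nRevenue,1000\n",
                   maintype="text", subtype="csv", filename="pnl.csv")

files = list(extract_pnl_attachments(msg))
```

The extracted bytes would then be written to Azure Blob Storage (or a staging table) by the scheduled job.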
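The unpivot/normalize step could be done in pandas if the Python route is taken (ADF mapping data flows or T-SQL would work equally well). A minimal sketch, assuming a hypothetical wide-format export with one column per month; real column names will differ.

```python
import pandas as pd
from io import StringIO

# Hypothetical wide-format P&L export (one column per month).
raw = StringIO(
    "Account,Jan,Feb,Mar\n"
    "Revenue,1000,1100,1200\n"
    "COGS,-400,-450,-500\n"
)
df = pd.read_csv(raw)

# Unpivot the month columns into (Account, Month, Amount) rows so every
# period lands in a single column -- the report-ready, fact-table shape.
tidy = df.melt(id_vars="Account", var_name="Month", value_name="Amount")
```

From this tidy shape the rows can be bulk-loaded into the Azure SQL staging table and merged into the warehouse.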

Keyword: Data Cleaning

Price: $85.00

SQL, Microsoft SQL Server, Microsoft SQL Server Programming, Python

 

Contact info for 50 insulation companies with an ugly website

I need a list of 50 insulation companies that have an ugly/outdated website. This means you will need to manually look at each website homepage to see if it's outdated. Here is the information I need for each business: Name, URL, Phone Number, Owner's name. Do not add sites...

AI/ML Engineer – Document Parsing with LLMs

Job Summary: We’re building a lightweight MVP for an AI-powered tool that helps industrial companies make sense of complex regulatory documents. Your job: build the backend engine that ingests long, messy documents and produces clean, structured summaries with LLM assis...

Build a Data Warehouse: Kafka → PostgreSQL ELT Pipeline w/ dbt + Metabase (Multi-Tenant SaaS)

Project Overview: We are a SaaS company with a multi-tenant architecture and Kafka-based event pipelines. We are looking for a skilled data engineer to implement an ELT system using: - Kafka (event source) - Kafka Connect (for streaming data into a central warehouse) - ...
