Datahub Architect/Developer


Description:

We are seeking an experienced Datahub Architect to design and implement a centralized Data Hub for our client. The ideal candidate will have deep expertise in Azure Cloud Services, API integrations, ETL pipeline development, and Power BI to enable seamless data synchronization and reporting across multiple platforms. This role involves working with the Home Depot, ServiceTitan, and Sage APIs to automate data exchange, build a scalable Azure SQL database, and develop Power BI dashboards for operational insights.

Key Responsibilities:

API Integration Development:
• Assess and confirm API access & authentication for Home Depot, ServiceTitan, and Sage.
• Develop middleware to automate data exchange between systems (a sketch of this pattern follows the project details below).
• Ensure secure and reliable data synchronization, including logging and error handling.

Data Hub Development (Azure SQL):
• Design and implement a structured, scalable database for cross-platform data.
• Develop ETL pipelines for data extraction, transformation, and loading (Azure Data Factory).
• Store and normalize job details, customer records, and transactional data for real-time reporting.

Power BI Dashboards & Reporting:
• Develop interactive dashboards for operational insights.
• Enable self-service reporting with predefined data models.
• Implement role-based access controls for data security.

Deployment & Go-Live Support:
• Conduct integration testing to validate API workflows.
• Optimize database performance for efficiency.

Required Skills & Experience:
• Azure Cloud Services (Azure SQL, Azure Data Factory, Azure Logic Apps)
• API Integration & Middleware Development (REST APIs, JSON, OAuth)
• ETL & Data Pipeline Development (Azure Data Factory, SQL, Python)
• Database Design & Management (Azure SQL, Indexing, Optimization)
• Power BI Development (DAX, Data Modeling, Security Roles)
• Experience with third-party platforms (Home Depot APIs, ServiceTitan, Sage, Paylocity)
• Security & Compliance Knowledge (Role-based access, Data validation)
• Strong Problem-Solving & Troubleshooting Skills

Preferred Qualifications:
• Experience working with field service management systems.
• Prior experience in the construction, plumbing, or home services industries.
• Knowledge of batch data exports and manual data import processes as contingency measures.

Project Details:
• Project Duration: March 2025 - June 2025
• Work Location: Remote
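For illustration only, the sketch below shows the kind of middleware described under API Integration Development: an OAuth client-credentials call, a REST pull, and a load into an Azure SQL staging table, with basic logging. All URLs, environment variables, column mappings, and table names are hypothetical placeholders, not the actual Home Depot, ServiceTitan, or Sage endpoints or the client's schema.

```python
# Minimal middleware sketch: authenticate via OAuth, pull records from a REST API,
# and upsert them into an Azure SQL staging table. Endpoints and names are hypothetical.
import logging
import os

import pyodbc
import requests

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("datahub-sync")

TOKEN_URL = "https://example-fsm.com/oauth/token"   # hypothetical OAuth token endpoint
JOBS_URL = "https://example-fsm.com/api/v1/jobs"    # hypothetical jobs endpoint


def get_access_token() -> str:
    """Client-credentials OAuth flow; secrets are read from environment variables."""
    resp = requests.post(
        TOKEN_URL,
        data={
            "grant_type": "client_credentials",
            "client_id": os.environ["FSM_CLIENT_ID"],
            "client_secret": os.environ["FSM_CLIENT_SECRET"],
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]


def fetch_jobs(token: str) -> list[dict]:
    """Pull job records as JSON, with error handling and logging."""
    resp = requests.get(JOBS_URL, headers={"Authorization": f"Bearer {token}"}, timeout=60)
    resp.raise_for_status()
    jobs = resp.json().get("data", [])
    log.info("Fetched %d job records", len(jobs))
    return jobs


def load_to_azure_sql(jobs: list[dict]) -> None:
    """Upsert into a staging table; downstream ETL (e.g. Azure Data Factory) picks up from here."""
    conn = pyodbc.connect(os.environ["AZURE_SQL_CONN_STR"])  # standard ODBC connection string
    with conn, conn.cursor() as cur:
        for job in jobs:
            cur.execute(
                """
                MERGE stg.Jobs AS t
                USING (SELECT ? AS JobId, ? AS CustomerName, ? AS Total) AS s
                ON t.JobId = s.JobId
                WHEN MATCHED THEN UPDATE SET CustomerName = s.CustomerName, Total = s.Total
                WHEN NOT MATCHED THEN INSERT (JobId, CustomerName, Total)
                     VALUES (s.JobId, s.CustomerName, s.Total);
                """,
                job.get("id"), job.get("customer_name"), job.get("total"),
            )
    log.info("Loaded %d rows into stg.Jobs", len(jobs))


if __name__ == "__main__":
    load_to_azure_sql(fetch_jobs(get_access_token()))
```

In practice this logic would be scheduled (e.g. from Azure Logic Apps or a timer-triggered job) and extended with retry and alerting, with the normalized tables then feeding the Power BI data models described above.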

Tags: Data Extraction, ETL, ETL Pipeline, Data Analysis, Data Mining, Analytical Presentation

Keyword: Python Development

Job Type: Hourly

 
