We are seeking a hands-on Senior Data Engineer with deep expertise in dbt (data build tool) and Snowflake to design, develop, and optimize our modern analytics stack. You will own the transformation layer, enforce engineering best practices, and work closely with analysts, data scientists, and DevOps to deliver reliable, governed, high-performance data models. Experience orchestrating workloads in Apache Airflow (or another workflow scheduler) is highly desirable.

Requirements
- 11+ years building ELT pipelines on cloud data platforms (Snowflake, BigQuery, Redshift, etc.)
- 8+ years of hands-on dbt development (YAML configs, macros, tests; dbt Cloud or dbt Core)
- Proficiency in advanced SQL and performance tuning
- Strong Git workflow experience and familiarity with CI/CD tools (GitHub Actions, Azure Pipelines, GitLab CI, etc.)
- Solid understanding of dimensional modeling, schema design, and data-quality frameworks
- Excellent communication skills: able to explain complex data concepts to non-technical stakeholders

Preferred ("Plus") Skills
- Production experience with Apache Airflow
- Exposure to Python for utility scripting and custom dbt macros
- Familiarity with data observability tools (e.g., Monte Carlo, Great Expectations)
- Knowledge of Snowflake features such as Dynamic Tables, Streams & Tasks, and Snowpark
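For candidates unfamiliar with the "tests" mentioned above: dbt's built-in schema tests (not_null, unique, accepted_values) reduce to simple column-level assertions. The following plain-Python sketch mirrors that logic purely for illustration; it is not dbt's actual implementation, and the function names and sample data are invented for this example.

```python
from collections import Counter

def failing_not_null(rows, column):
    """Return rows where `column` is NULL (dbt's not_null test fails on any)."""
    return [r for r in rows if r.get(column) is None]

def failing_unique(rows, column):
    """Return non-NULL values of `column` that appear more than once."""
    counts = Counter(r.get(column) for r in rows)
    return [v for v, n in counts.items() if v is not None and n > 1]

# Hypothetical sample data for the sketch.
orders = [
    {"order_id": 1, "status": "shipped"},
    {"order_id": 2, "status": None},
    {"order_id": 2, "status": "pending"},
]

print(failing_not_null(orders, "status"))   # → [{'order_id': 2, 'status': None}]
print(failing_unique(orders, "order_id"))   # → [2]
```

In dbt itself these checks are declared in YAML next to the model and compiled to SQL; a test "passes" when the failing-rows query returns zero rows, which is the same contract the two functions above express.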
Keyword: Data Analysis
Price: $45.0
Tags: dbt, Snowflake, Apache Airflow, ETL Pipeline