We are seeking a hands-on Senior Data Engineer with deep expertise in dbt (data build tool) and Snowflake to design, develop, and optimize our modern analytics stack. You will own the transformation layer, enforce engineering best practices, and work closely with analysts, data scientists, and DevOps to deliver reliable, governed, and high-performance data models. Experience orchestrating workloads in Apache Airflow (or other workflow schedulers) is highly desirable.

Required Qualifications
- 11+ years building ELT pipelines on cloud data platforms (Snowflake, BigQuery, Redshift, etc.).
- 8+ years of hands-on dbt development (YAML configs, macros, tests; dbt Cloud or dbt Core). A sketch of this kind of work appears below the lists.
- Proficiency in advanced SQL and performance tuning.
- Strong Git workflow experience and familiarity with CI/CD tools (GitHub Actions, Azure Pipelines, GitLab CI, etc.).
- Solid understanding of dimensional modeling, schema design, and data-quality frameworks.
- Excellent communication skills: able to explain complex data concepts to non-technical stakeholders.

Preferred / "Plus" Skills
- Production experience with Apache Airflow.
- Exposure to Python for utility scripting and custom dbt macros.
- Familiarity with data observability tools (e.g., Monte Carlo, Great Expectations).
- Knowledge of Snowflake features such as Dynamic Tables, Streams & Tasks, and Snowpark (see the second sketch below).
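For a flavor of the day-to-day dbt work described above, here is a minimal sketch of an incremental model. The model and column names (fct_orders, stg_orders, order_id, amount) are hypothetical illustrations, not part of our actual project:

    -- models/marts/fct_orders.sql (hypothetical model name)
    {{ config(materialized='incremental', unique_key='order_id') }}

    select
        order_id,
        customer_id,
        order_date,
        sum(amount) as order_total
    from {{ ref('stg_orders') }}
    {% if is_incremental() %}
    -- on incremental runs, only process rows newer than what is already loaded
    where order_date > (select max(order_date) from {{ this }})
    {% endif %}
    group by 1, 2, 3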
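And a minimal sketch of one Snowflake feature named in the "Plus" list, a Dynamic Table that keeps an aggregate fresh without a separate scheduler. The table and warehouse names (daily_revenue, transform_wh) are again hypothetical:

    -- Declaratively maintained aggregate: Snowflake refreshes it whenever
    -- results drift more than TARGET_LAG behind the source table.
    create or replace dynamic table daily_revenue
        target_lag = '1 hour'
        warehouse = transform_wh
        as
        select order_date, sum(amount) as revenue
        from fct_orders
        group by order_date;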