Description:
We're seeking a Senior Data Engineer to design and implement robust data infrastructure that powers AI systems. You'll build scalable data pipelines and storage solutions while ensuring data security and compliance, and you'll collaborate with data scientists and analysts to enable efficient data access and processing.

Responsibilities:
- Design, develop, and maintain scalable ETL/ELT pipelines for processing large datasets
- Build and optimize data storage solutions (data warehouses, data lakes, etc.)
- Work with relational and NoSQL databases to ensure data integrity and performance
- Automate workflows using Apache Airflow, Prefect, or similar tools (a minimal DAG sketch appears at the end of this listing)
- Ensure data security, governance, and compliance (GDPR, HIPAA, etc.)
- Collaborate with data scientists and analysts to enable efficient data access
- Work with cloud platforms (AWS, Azure, GCP) for scalable data solutions

Requirements:
- 5+ years of experience in data engineering, big data processing, or software development
- Strong programming skills in Python, SQL, or Scala
- Experience with ETL tools (dbt, Apache NiFi, Talend, etc.)
- Knowledge of database systems (PostgreSQL, MySQL, Snowflake, Redshift, BigQuery, etc.)
- Experience with cloud services (AWS S3, Lambda, Glue, Azure Data Factory, GCP BigQuery)
- Familiarity with data modeling and schema design
- Understanding of distributed computing frameworks (Hadoop, Spark, etc.)

Category: IT & Programming
Subcategory: Data Science
Is this a project or a position?: Project
Required availability: Full time
Tags: Amazon Web Services (AWS), Data Warehousing, Microsoft Azure, Python, SQL, Google Cloud Platform (GCP), Apache Hadoop, Data Science, Extract Transform Load (ETL)
Keyword: NoSQL Databases
Price: $45.0
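
For context on the workflow-automation responsibility above, here is a minimal, hypothetical sketch of an Apache Airflow DAG (Airflow 2.x TaskFlow API). The DAG name, schedule, and task bodies are illustrative assumptions, not part of this listing:

```python
# Minimal Airflow DAG sketch. All names, schedules, and task logic below are
# illustrative assumptions; they are not taken from the job listing.
from datetime import datetime

from airflow.decorators import dag, task


@dag(
    dag_id="example_daily_etl",      # hypothetical DAG name
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
    tags=["example", "etl"],
)
def example_daily_etl():
    @task
    def extract() -> list[dict]:
        # Stand-in for pulling raw records from a source (API, S3 bucket, etc.)
        return [{"id": 1, "amount": 42.0}, {"id": 2, "amount": 13.5}]

    @task
    def transform(records: list[dict]) -> list[dict]:
        # Stand-in for cleaning/filtering records before loading
        return [r for r in records if r["amount"] > 0]

    @task
    def load(records: list[dict]) -> None:
        # Stand-in for writing to a warehouse table (Snowflake, BigQuery, Redshift, ...)
        print(f"Would load {len(records)} records")

    load(transform(extract()))


example_daily_etl()
```

Placing the file in the Airflow dags/ folder registers the DAG; the extract -> transform -> load dependency is inferred from the task calls, which is the typical shape of the ETL/ELT pipelines this role describes.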