
Assignments
Info
Role: Data Engineer
Location: Amsterdam
Hours per week: 40
Duration: 21.06.2025 - 30.12.2025
Assignment number: 234576
Closing date:
Do not put forward self-employed contractors (zzp'ers). Maximum of 2 candidates per supplier.
Duties:
Responsible for building data pipelines for the Engine Project (BtC):
Role Summary:
We are looking for a medior/senior Data Engineer to join our growing Data & Analytics team. The successful candidate will be responsible for designing, developing, and maintaining data pipelines and architectures on the Azure platform, with a focus on Snowflake as the data warehouse and Matillion as the primary ETL tool. This role will play a key part in enabling scalable, high-performance data solutions that support business intelligence, analytics, and data science initiatives.
Key Responsibilities:
Design and develop robust, scalable, and efficient data pipelines using Matillion ETL on Snowflake.
Build and maintain secure and optimized data models within Snowflake, ensuring best practices in schema design and data governance.
Utilize Azure services (e.g., Azure Data Factory, Azure Data Lake, Azure Blob Storage, Azure Synapse Analytics) to manage and orchestrate data flows.
Collaborate with Data Architects, Analysts, and Business Stakeholders to understand requirements and deliver high-quality data solutions.
Monitor and troubleshoot ETL/ELT workflows and performance-tune processes for efficiency and reliability.
Implement and maintain data quality, lineage, and documentation practices.
Participate in code reviews and adhere to DevOps/CI-CD practices for deploying data pipelines.
Stay current with industry trends and emerging tools in the data engineering and cloud space.
Languages: Dutch and English.
Skills:
Snowflake, Matillion, Azure, SQL, Python.
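Purely as an illustration of the stack listed above (not part of the vacancy requirements), the sketch below shows the kind of ELT step this role would typically automate: loading Parquet files from an Azure Blob Storage-backed external stage into a Snowflake staging table using snowflake-connector-python. All object names (account, warehouse, stage, table) are hypothetical placeholders.

```python
# Minimal illustrative sketch, not a definitive implementation.
# Assumes the snowflake-connector-python package and hypothetical object names.
import snowflake.connector

# Hypothetical connection details; real credentials would come from a secrets store.
conn = snowflake.connector.connect(
    account="my_account",        # placeholder account identifier (assumption)
    user="etl_service_user",     # placeholder service account (assumption)
    password="***",              # injected from a secret in practice, never hard-coded
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Load new Parquet files from an (assumed) external stage backed by Azure Blob Storage.
    cur.execute("""
        COPY INTO STAGING.CUSTOMER_EVENTS
        FROM @AZURE_BLOB_STAGE/customer_events/
        FILE_FORMAT = (TYPE = PARQUET)
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    """)
finally:
    conn.close()
```

In practice, a step like this would more likely be built and orchestrated as a Matillion job (with Azure Data Factory or Matillion schedules handling orchestration) rather than as a standalone script; the snippet only illustrates the underlying Snowflake load pattern.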
Education:
Academic.
Vattenfall AB
To be considered for this assignment, you must place a bid on Striive. Striive is the largest assignment platform in the Benelux, with more than 20,000 assignments published annually.