Job Title: Data Engineer (DBT & Snowflake)
Location: Düsseldorf, Germany
Work Model: Hybrid (2-3 days per week in the office)
Pay Rate: €400/day
Duration: 6+ Months
________________________________________
Key Responsibilities:
• Design, develop, and optimize data pipelines and transformations using DBT.
• Work with Azure Data Factory (ADF) or similar ETL tools for data orchestration.
• Migrate pipelines from SAP BODS (or other legacy ETL tools) to DBT and transition data from Oracle to Snowflake.
• Develop and optimize advanced SQL queries for efficient data processing.
• Apply Data Warehousing (DWH) principles and best practices.
• Collaborate with stakeholders to understand business and technical requirements.
• Provide technical leadership to the team and engage with clients to shape solutions.
• Ensure adherence to data quality, governance, and compliance standards.
Must-Have Skills:
• DBT, ADF, Advanced SQL, and foundational Cloud knowledge (AWS/Azure/GCP).
• Strong understanding of Data Warehousing (DWH) concepts and Data Engineering principles.
• Experience in ETL/ELT migration projects (BODS to DBT, Oracle to Snowflake).
• Analytical & problem-solving skills for data analysis and debugging.
• Technical leadership & client-facing experience, with the ability to propose solutions quickly.
• Hands-on experience with Snowflake.
Nice-to-Have Skills:
• Azure Cloud Services & Azure DevOps (CI/CD).
• Kafka for real-time data streaming.
• Terraform & containers for cloud infrastructure automation.
• Python for scripting and automation.