This job has expired

ThoughtLabs Belgium

Data Engineer

Vienna, Austria

8 days ago

Summary

Position: Data Engineer (Snowflake)


For this specific position only, the candidate can choose to work either:

a) Onsite on the premises (Vienna, Austria)

OR

b) Fully remote


Your tasks:

  • Design and develop Snowflake and Azure Data Factory data pipelines that extract data from various sources, transform it into the desired format, and load it into the appropriate data storage systems;
  • Integrate data from different sources, including cloud and on-premise SQL and Oracle databases, data warehouses, APIs, and external systems;
  • Ensure data consistency and integrity during the integration process, performing data validation and cleaning as needed;
  • Transform raw data into a usable format by applying data cleansing, aggregation, filtering, and enrichment techniques;
  • Implement data quality checks and validations within data pipelines to ensure the accuracy, consistency, and completeness of data;
  • Establish the governance of data and algorithms used for analysis, analytical applications, and automated decision-making;
  • Monitor and tune data systems, identify and resolve performance bottlenecks, and implement caching and indexing strategies to enhance query performance.
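
The pipeline duties above can be sketched as a minimal extract-transform-load loop with an inline data quality check. This is an illustrative sketch only: the function names and sample records are hypothetical and do not reflect the employer's actual systems, which would use Snowflake and Azure Data Factory rather than in-memory Python.

```python
# Minimal ETL sketch: extract, validate (data quality check), transform, load.
# All data and names here are illustrative, not from the actual role.

def extract():
    # In practice this would pull from SQL/Oracle databases or APIs;
    # here we return static sample rows.
    return [
        {"id": 1, "amount": "120.50", "country": "AT"},
        {"id": 2, "amount": None,     "country": "at"},  # incomplete -> rejected
        {"id": 3, "amount": "75.00",  "country": "DE"},
    ]

def validate(row):
    # Completeness check on the 'amount' field.
    return row["amount"] is not None

def transform(row):
    # Cleansing/enrichment: cast amount to float, normalize country code.
    return {
        "id": row["id"],
        "amount": float(row["amount"]),
        "country": row["country"].upper(),
    }

def run_pipeline():
    raw = extract()
    # Only rows passing validation are transformed and "loaded".
    return [transform(r) for r in raw if validate(r)]

if __name__ == "__main__":
    for record in run_pipeline():
        print(record)
```

In a production pipeline the load step would write to a warehouse table and rejected rows would be logged for review rather than silently dropped.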


Requirements

  • University degree in IT or a relevant discipline, combined with a minimum of eight years' relevant experience in Information Technology, of which at least three years should have been in managing the development or deployment of data analysis, management, and integration solutions;
  • Proficiency in the design and implementation of modern data architectures and concepts such as cloud services (AWS, Azure, GCP) and modern data warehouse tools (Snowflake);
  • Hands-on experience with Apache technologies such as Kafka, Airflow, and Spark to build scalable and efficient data pipelines;
  • Hands-on experience designing, building, and deploying data solutions that capture, explore, transform, and utilize data to support AI, ML, and BI;
  • Expertise in ETL languages/tools such as Python, SQL, R, SAS, or Excel;
  • Expertise in database technologies such as SQL, NoSQL, Oracle, Hadoop, or Teradata;
  • Excellent command of the English language.
