Overview
- Ability to design database structures and proficiency in the SQL programming language (a minimal sketch of this kind of SQL work follows this list)
- Experience in data modelling, data cleansing, and ETL (extract, transform, load) processes
- Experience with data integration tools such as SSIS
- Familiarity with Business Intelligence tools such as Tableau
- Understanding of the insurance and pensions domain
- Experience working in Agile environments
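To illustrate the level of SQL and schema design expected, here is a minimal sketch using Python's built-in sqlite3 module. The policy and premium tables, column names, and sample rows are hypothetical insurance-domain examples, not part of this posting.

```python
import sqlite3

# A minimal sketch, assuming an illustrative insurance schema.
# Table and column names are hypothetical, not taken from this posting.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE policy (
        policy_id   INTEGER PRIMARY KEY,
        holder_name TEXT NOT NULL
    );
    CREATE TABLE premium (
        policy_id INTEGER NOT NULL REFERENCES policy(policy_id),
        amount    REAL    NOT NULL,
        paid_on   TEXT    NOT NULL
    );
    INSERT INTO policy VALUES (1, 'A. Smith'), (2, 'B. Jones');
    INSERT INTO premium VALUES (1, 120.0, '2024-01-01'),
                               (1, 120.0, '2024-02-01'),
                               (2,  95.5, '2024-01-15');
""")

# The kind of aggregate query the role calls for:
# total premiums collected per policy holder.
for row in conn.execute("""
    SELECT p.policy_id, p.holder_name, SUM(pr.amount) AS total_paid
    FROM policy p
    JOIN premium pr ON pr.policy_id = p.policy_id
    GROUP BY p.policy_id, p.holder_name
    ORDER BY total_paid DESC
"""):
    print(row)  # (1, 'A. Smith', 240.0) then (2, 'B. Jones', 95.5)
```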
Key Responsibilities
- Design and implement data models to ensure high data integrity.
- Build and maintain efficient ETL pipelines for data extraction, transformation, and loading (see the ETL sketch after this list).
- Optimize data storage solutions for performance and scalability.
- Work with data scientists to define data requirements and ensure access to necessary data.
- Collaborate with IT teams to ensure infrastructure is aligned with data storage and processing requirements.
- Implement data governance and compliance measures to safeguard data security.
- Monitor and troubleshoot data pipeline systems to ensure reliability.
- Document data engineering processes and architectures for future reference.
- Keep up to date with emerging technologies and trends in data engineering.
- Automate repetitive tasks using scripting languages.
- Evaluate and recommend tools and technologies for data management purposes.
- Participate in system design reviews to ensure data integration effectiveness.
- Provide support for data-related incident resolution and troubleshooting.
- Conduct performance tuning of existing data processes.
- Contribute to the establishment of best practices for data engineering.
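As a concrete illustration of the ETL responsibility above, here is a minimal sketch in plain Python. The member-contribution records, field names, and in-memory CSV source are hypothetical stand-ins for a real source system.

```python
import csv
import sqlite3
from io import StringIO

# Extract: a hypothetical in-memory CSV stands in for the source system.
raw = StringIO("member_id,contribution\n1001, 250.00\n1002,\n1003,310.50\n")
records = list(csv.DictReader(raw))

# Transform: cleanse the rows -- drop records with a missing
# contribution and normalise the amount to a float.
clean = [
    {"member_id": int(r["member_id"]), "contribution": float(r["contribution"])}
    for r in records
    if r["contribution"].strip()
]

# Load: write the cleansed rows into a target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE contributions (member_id INTEGER, contribution REAL)")
conn.executemany(
    "INSERT INTO contributions VALUES (:member_id, :contribution)", clean
)
conn.commit()
print(conn.execute("SELECT COUNT(*) FROM contributions").fetchone()[0])  # 2
```

In production the same extract-transform-load shape holds; only the source, cleansing rules, and target change, and scheduling is typically delegated to an orchestrator such as Airflow (sketched after the qualifications list).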
Required Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 3+ years of experience in data engineering or a related role.
- Strong proficiency in SQL and relational databases.
- Experience with big data technologies such as Hadoop, Spark, or similar frameworks.
- Knowledge of cloud platforms like AWS, Azure, or Google Cloud.
- Proficiency in programming languages such as Python or Java.
- Experience in designing and implementing data warehouses.
- Familiarity with data pipeline orchestration tools such as Apache Airflow (see the DAG sketch after this list).
- Understanding of data architecture best practices.
- Experience with data visualization tools is a plus.
- Knowledge of data governance and data quality frameworks.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.
- Ability to work in fast-paced environments and manage multiple priorities.
- Familiarity with version control systems like Git.
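For the Airflow familiarity mentioned above, a minimal DAG sketch follows. It assumes Airflow 2.4+ (for the `schedule` parameter); the DAG id, task names, and empty task bodies are hypothetical placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Placeholder task bodies -- real extract/load logic would live elsewhere.
def extract():
    ...

def load():
    ...

# A minimal daily pipeline: extract runs before load.
with DAG(
    dag_id="nightly_contributions",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```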
Skills: sql,java,google cloud,azure,aws,cloud technologies,data modelling,data quality,data engineering,data cleansing,ssis,data integration,etl,python,hadoop,tableau,data governance,apache airflow,spark,data warehousing