DigiHaul

Data Engineer

Federal Territory of Kuala Lumpur, MY

4 days ago

Summary

The Data Engineer will be responsible for maintaining and further developing DigiHaul's data platform. This includes managing existing ETL processes, building new data pipelines, troubleshooting and resolving issues as required, and helping scale our data infrastructure to support analytics and data science initiatives.
Main Accountabilities

  • Develop and Maintain Data Solutions: Support data lake and warehouse systems for efficient data collection, storage, processing, and transformation.
  • Assist in Building and Optimizing Data Pipelines: Help design and develop reliable ETL/ELT pipelines for automated data ingestion and reporting.
  • Monitor and Troubleshoot System Performance: Monitor cloud systems, identify issues, and resolve them with senior team support.
  • Support Data Infrastructure and Scalability: Assist in scaling and optimizing data infrastructure, refining data models, and ensuring performance best practices.
  • Participate in Learning and Development: Learn new tools and best practices, seeking guidance to improve skills and contribute to team goals.
  • Manage Workload and Updates: Prioritize tasks, meet deadlines, and provide clear updates during team meetings.
  • Collaborate and Communicate Effectively: Work with cross-functional teams to deliver data solutions and provide progress updates.


Desirable Experiences and Skills

  • Minimum 5 years of relevant work experience.
  • Bachelor's degree in Computer Science (or related field).
  • Minimum 4 years of experience with SQL and Python for data manipulation, analysis, and scripting.
  • Minimum 3 years of hands-on experience building and managing ETL and ELT pipelines.
  • Minimum 3 years of experience in cloud data engineering with a cloud data warehouse such as Snowflake or Databricks; Snowflake experience is preferred.
  • Minimum 2 years of experience in Power BI.
  • Minimum 2 years of experience in Git or any version control tools.
  • Minimum 1 year of experience with orchestration tools such as Apache Airflow.
  • Familiarity with real-time streaming technologies such as Apache Kafka is a plus.
  • Familiarity with Docker, containerization, and Azure Kubernetes Service (AKS) is a plus.
  • Familiarity with dbt is a plus.


Benefits

  • Hybrid Work Arrangement
  • Medical and Insurance Benefits
  • Dental and Optical Benefits
  • Annual Performance Bonus
