Digile

Data Engineer/Architect with Hadoop

Federal Territory of Kuala Lumpur, MY


Summary

3 positions open for candidates with 5–8 years of experience

3 positions open for candidates with 8–12 years of experience

Location: Kuala Lumpur, Malaysia. Malaysian citizens preferred.

Responsibilities

  • Design, construct, install, test, and maintain highly scalable data management systems and data pipelines.
  • Ensure systems meet business requirements and industry best practices.
  • Build high-performance algorithms, prototypes, predictive models, and proofs of concept.
  • Research opportunities for data acquisition and new uses for existing data.
  • Develop data set processes for data modeling, mining, and production.
  • Integrate new data management technologies and software engineering tools into existing structures.
  • Create custom software components and analytics applications.
  • Implement and update disaster recovery procedures.
  • Collaborate with data architects, modelers, and IT team members on project goals.
  • Provide senior-level technical consulting to peer data engineers during data application design and development for highly complex and critical data projects.

Qualifications

  • Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent work experience.
  • Proven 5–8 years of experience as a Senior Data Engineer or in a similar role.
  • Experience with big data tools: Hadoop, Hive, Spark, Airflow.
  • Good to have: Kafka, NiFi.
  • Expert-level SQL skills for data manipulation (DML) and validation (DB2).
  • Experience with data pipeline and workflow management tools.
  • Experience with object-oriented/functional scripting languages: Python, Java, Go, etc.
  • Strong problem-solving and analytical skills.
  • Excellent verbal communication skills.
  • Good interpersonal skills.
  • Ability to provide technical leadership for the team.
  • Experience with DataHub is preferred.
