Rikkeisoft

Data Engineer (Python, Snowflake, Airflow) - Global Market

Ho Chi Minh City, Ho Chi Minh City, VN

Responsibilities

  • Design, develop, test, and deploy new ETL pipelines or enhancements to existing pipelines using Talend in a Big Data environment
  • Perform application software development, including new development, maintenance, and support of applications, and provide production support
  • Translate functional requirements into technical designs
  • Perform end-to-end automation of ETL processes for the various datasets ingested into the big data platform
  • Develop and integrate software applications using suitable development methodologies and standards, applying standard architectural patterns and considering critical performance characteristics and security measures
  • Resolve customer complaints regarding data and respond to suggestions for improvements and enhancements
  • Recommend strategies for technical aspects of projects as well as broad system improvements
  • Ensure adherence to established standards and consult with senior managers on technology solutions as needed


Qualifications

  • At least 3 years of experience as a Data Engineer in the IT industry
  • Bachelor’s degree in Computer Science, Software Engineering, or Mathematics
  • Hands-on experience with and knowledge of Linux, ETL pipelines, Python, SQL, and Snowflake
  • Experience working with cloud services: AWS, Azure, or GCP
  • Nice to have: experience with Java, Airflow, GCP, Git, and Databricks
  • Familiarity with CI/CD pipelines, Docker, and current data technologies and tools (Tableau, Power BI)
  • Strong experience with data manipulation tools and frameworks such as Apache Spark or Apache Kafka
  • Good understanding of data science and machine learning concepts, with experience implementing data models and data warehouses
  • Analytical thinking, teamwork, and an automation mindset
  • Good English communication skills
