Working location: Remote, full-time (based in Ha Noi, Da Nang, or Ho Chi Minh City)
Salary range: USD 1,500 – 5,500 gross
Company Overview: A startup laser-focused on delivering specialized tech solutions
RESPONSIBILITIES
As a Data Engineer, you will play a pivotal role in designing and implementing modern data
platforms to support data-driven decision-making. This is a hands-on role that requires expertise in
building scalable, efficient, and high-performing data architectures.
Beyond project execution, you will contribute to growing the company’s consulting practice: recruiting, creating technical collateral, and staying at the forefront of technology trends through training and certifications. You will also be responsible for building long-term strategic relationships with clients while participating in all aspects of project delivery.
• Lead discovery sessions with clients to understand their business needs, data requirements, and challenges.
• Design and develop data architectures, ensuring scalability, security, and efficiency.
• Build data pipelines to collect, process, and analyze structured and unstructured data from multiple sources.
• Implement data validation and testing processes to ensure accuracy and reliability.
• Automate data collection, processing, and reporting to improve efficiency and reduce manual effort.
• Create high-quality documentation for problem statements, requirements, solutions, and designs.
• Support pre-sales activities, including whiteboarding sessions, solution architecture design, and proposal development.
• Develop reusable and repeatable collateral for use across the practice.
• Obtain and maintain certifications in relevant cloud and data technologies.
• Collaborate with the marketing team to produce technical content promoting the company’s expertise in data engineering.
REQUIREMENTS
• Bachelor’s degree in Computer Science, Information Technology, or a related field.
• 5–8 years of experience in data engineering, database architecture, and data management.
• Strong problem-solving, debugging, and analytical skills.
• Ability to work both independently and in a team environment.
• Excellent communication skills to convey complex ideas to team members and clients.
• Strong SQL skills and understanding of relational database concepts.
• Experience integrating structured and unstructured data from multiple sources in batch and streaming modes.
• Proficiency in cloud computing platforms: AWS, GCP, or Azure.
• Hands-on experience with ETL tools or cloud data services such as Azure Data Factory, dbt, AWS Glue, or Matillion.
• Familiarity with modern data warehousing solutions like Snowflake, Redshift, BigQuery, or Synapse.
• Experience with data visualization tools like Power BI, Looker, Tableau, or QuickSight.
• Knowledge of Docker and Kubernetes for containerized deployments.
• Ability to debug and optimize existing data infrastructure and processes.
• Proficient in English.
NICE TO HAVE
• Experience with high-throughput, large-scale data systems.
• Relevant certifications in cloud, data engineering, or data visualization.
• Proficiency in at least one programming language (Python, Java, or Scala).
• Exposure to machine learning, AI, and LLMs with practical implementations.
• Familiarity with legacy data systems (Hadoop, Informatica).