We are looking for an experienced Data Engineer to design, build, and optimize scalable data systems that support a high-performance live-streaming platform. You will work with large-scale data, develop ETL workflows, and ensure data reliability and accessibility.
Responsibilities
Design and develop scalable, high-performance data systems from scratch
Build and maintain ETL workflows for various data sources
Optimize query execution and performance in large-scale environments
Troubleshoot and resolve data integration issues
Maintain datasets and ensure data quality and accessibility
Collaborate with cross-functional teams to improve data-driven decision-making
Stay up to date with best practices in data engineering and backend development
Requirements
5+ years of experience as a Data Engineer or Software Engineer in data-related roles
Strong proficiency in Python and data libraries
3+ years of experience with cloud platforms (GCP and/or AWS)
Expertise in data modeling, ETL processes, data warehousing, and SQL scripting
Experience with big data technologies and data streaming (e.g., Kafka)
Knowledge of microservices, GitOps, and CI/CD pipelines
Strong problem-solving skills, adaptability, and ability to work in a fast-paced environment
Nice to have
Experience with BigQuery
Background in large-scale infrastructure or distributed systems
Familiarity with Terraform or other infrastructure-as-code tools
Understanding of functional programming
If you're passionate about data engineering and want to work on cutting-edge technology, we'd love to hear from you.