Position Title: Data Engineer
Experience Required: 8–10 Years
Work Location: Hybrid (3 days in office) – Pune, Bangalore, Noida, Mumbai, Hyderabad
Working Hours: 11:00 AM – 8:00 PM (Flexibility required for project overlaps)
Role Overview
We are seeking a highly skilled and experienced Data Engineer to join our dynamic and growing team. This role is ideal for someone passionate about working with large datasets, designing scalable data pipelines, and driving data architecture for high-impact business outcomes. The successful candidate will play a key role in developing and maintaining our data infrastructure while collaborating closely with analytics, engineering, and business teams.
Key Responsibilities
- Design, develop, and maintain robust, scalable, and efficient data pipelines to process large volumes of structured and unstructured data from multiple sources.
- Write complex SQL queries and Python scripts to clean, transform, and validate data, ensuring data quality and consistency (a minimal illustration follows this list).
- Work with cloud platforms, preferably Google Cloud Platform (GCP) or Amazon Web Services (AWS), to develop and manage cloud-native data processing solutions.
- Collaborate with stakeholders to understand data requirements and translate them into technical solutions, including data ingestion, transformation, and storage strategies.
- Implement effective data modeling techniques to support analytics and reporting needs while ensuring optimized performance and cost-efficiency.
- Participate in code reviews and system design discussions, and provide technical mentorship to junior team members.
- Monitor and troubleshoot data workflows and pipelines to maintain a high level of reliability and performance.
- Support continuous improvement by investigating alternative tools and technologies and presenting recommendations for architectural review.
- Maintain thorough documentation of data engineering processes and systems for transparency and knowledge sharing.
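To give a sense of the day-to-day transformation work described above, here is a minimal sketch in Python; the file paths, column names, and validation rule are illustrative assumptions, not project specifics:

```python
import pandas as pd

# Illustrative only: source path and column names are hypothetical.
SOURCE_CSV = "raw_orders.csv"
TARGET_PARQUET = "clean_orders.parquet"

def clean_orders(path: str) -> pd.DataFrame:
    """Load raw order records, validate them, and standardize types."""
    df = pd.read_csv(path)

    # Basic data-quality checks: drop duplicates and rows missing keys.
    df = df.drop_duplicates(subset=["order_id"])
    df = df.dropna(subset=["order_id", "customer_id"])

    # Type normalization: parse timestamps and enforce numeric amounts.
    df["order_ts"] = pd.to_datetime(df["order_ts"], errors="coerce")
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce").fillna(0.0)

    # Simple validation rule (assumed): amounts must be non-negative.
    return df[df["amount"] >= 0]

if __name__ == "__main__":
    clean = clean_orders(SOURCE_CSV)
    # Columnar output keeps downstream analytics queries cheap.
    clean.to_parquet(TARGET_PARQUET, index=False)
```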
Must-Have Skills
- 8–10 years of relevant experience as a Data Engineer or in a similar data-centric engineering role.
- Strong command of Python programming for data transformation, automation, and scripting.
- Expertise in SQL, including query optimization and database performance tuning.
- Solid hands-on experience with at least one major cloud platform, with a preference for GCP or AWS.
- Demonstrated capability in building end-to-end data pipelines for ETL/ELT processes in production environments.
- Experience in data modeling and designing scalable data structures to support business intelligence and analytics (see the sketch after this list).
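As a rough illustration of the data-modeling expectation in the last bullet, the sketch below builds a tiny star schema; the table and column names are hypothetical, and SQLite is used only so the example is self-contained:

```python
import sqlite3

# Hypothetical star schema: one fact table plus two dimensions.
DDL = """
CREATE TABLE IF NOT EXISTS dim_customer (
    customer_key INTEGER PRIMARY KEY,
    customer_name TEXT NOT NULL,
    region TEXT
);
CREATE TABLE IF NOT EXISTS dim_date (
    date_key INTEGER PRIMARY KEY,   -- e.g. 20240131
    full_date TEXT NOT NULL
);
CREATE TABLE IF NOT EXISTS fact_orders (
    order_id INTEGER PRIMARY KEY,
    customer_key INTEGER NOT NULL REFERENCES dim_customer(customer_key),
    date_key INTEGER NOT NULL REFERENCES dim_date(date_key),
    amount REAL NOT NULL
);
"""

conn = sqlite3.connect("analytics.db")
conn.executescript(DDL)

# Typical BI query against the model: revenue by region per day.
query = """
SELECT d.full_date, c.region, SUM(f.amount) AS revenue
FROM fact_orders f
JOIN dim_customer c ON c.customer_key = f.customer_key
JOIN dim_date d ON d.date_key = f.date_key
GROUP BY d.full_date, c.region
"""
for row in conn.execute(query):
    print(row)
conn.close()
```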
Nice-to-Have
- Familiarity with DevOps practices and CI/CD pipelines for data applications.
- Experience with modern data tools such as Apache Airflow, dbt, BigQuery, Redshift, or Snowflake (a short orchestration sketch follows this list).
- Knowledge of version control systems like Git.
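For candidates less familiar with Apache Airflow, the orchestration style referenced above looks roughly like the following minimal sketch; the DAG id, schedule, and task functions are assumptions, and the schedule parameter name varies slightly across Airflow versions:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Placeholder: pull raw data from a source system.
    print("extracting")

def transform():
    # Placeholder: clean and reshape the extracted data.
    print("transforming")

def load():
    # Placeholder: write the result to the warehouse.
    print("loading")

with DAG(
    dag_id="example_daily_etl",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",            # 'schedule_interval' on older Airflow 2.x
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```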
Skills: SQL, Python, ETL, data modeling, data pipelines, cloud, GCP, AWS, BigQuery, Redshift, Snowflake, dbt, Apache Airflow, Git, analytics, design