Role: GCP Data Engineer
Experience: 5-8 years (Mid-Level)
Work Mode: Onsite (Client Office)
Preferred Locations: Delhi, Pune, Bangalore (Hyderabad & Chennai also acceptable)
Time Zone: Aligned to UK time zone
JD: We are looking for experienced Data Engineers to design, develop, and optimize large-scale data processing systems. The ideal candidate will have strong experience in data ingestion and a solid understanding of distributed computing, data pipelines, and performance optimization techniques.
Technical Skills Required:
GCP, Dataproc, BigQuery, Dataflow, PySpark, Python, Java, CDC, Data Ingestion
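To give a flavour of the ingestion work this stack implies, below is a minimal PySpark sketch that reads raw files from a GCS bucket and loads them into BigQuery. The bucket, dataset, and table names are hypothetical, and it assumes the job runs on a Dataproc cluster where the Spark BigQuery connector is available.

```python
# Minimal PySpark ingestion sketch: read raw CSV files from GCS and load them
# into BigQuery. Bucket, dataset, and table names below are hypothetical.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("gcs-to-bigquery-ingestion")
    .getOrCreate()
)

# Read raw files landed in a GCS bucket (path is an assumption for illustration).
raw_df = (
    spark.read
    .option("header", "true")
    .csv("gs://example-landing-bucket/orders/2024-01-01/*.csv")
)

# Light transformation: drop exact duplicates before loading.
clean_df = raw_df.dropDuplicates()

# Write to BigQuery via the Spark BigQuery connector (available on Dataproc).
# temporaryGcsBucket stages the data when using the indirect write method.
(
    clean_df.write
    .format("bigquery")
    .option("table", "example_dataset.orders")
    .option("temporaryGcsBucket", "example-temp-bucket")
    .mode("append")
    .save()
)
```

On Dataproc, a job of this shape would typically be submitted with `gcloud dataproc jobs submit pyspark`.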
Key Responsibilities:
Design, build, and manage scalable and reliable data pipelines.
Implement data processing solutions using Dataproc, Dataflow, and BigQuery.
Develop data models and analytics queries.
Optimize data processing for performance and cost-efficiency (see the query sketch after this list).
Collaborate with data scientists, analysts, and other engineers to integrate and transform data.
Ensure high data quality, validation, and monitoring.
Participate in code reviews, deployment activities, and production support.
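For the analytics and cost-efficiency duties above, here is a minimal sketch using the google-cloud-bigquery Python client. The project, dataset, table, and column names are hypothetical; the partition filter and dry run illustrate one way to keep scanned bytes, and therefore cost, in check on a date-partitioned table.

```python
# Illustrative cost-aware analytics query with the google-cloud-bigquery client.
# Project, dataset, table, and column names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project ID

query = """
    SELECT customer_id, SUM(order_total) AS total_spend
    FROM `example-project.example_dataset.orders`
    WHERE order_date BETWEEN '2024-01-01' AND '2024-01-31'  -- prune partitions
    GROUP BY customer_id
    ORDER BY total_spend DESC
    LIMIT 100
"""

# A dry run estimates bytes processed before the query spends anything.
dry_run = client.query(query, job_config=bigquery.QueryJobConfig(dry_run=True))
print(f"Estimated bytes processed: {dry_run.total_bytes_processed}")

# Run the query and iterate over the results.
for row in client.query(query).result():
    print(row.customer_id, row.total_spend)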
Required Skills:
Strong hands-on experience with GCP services, including Dataproc, BigQuery, and Dataflow.
Proficient in building scalable, distributed data processing pipelines with PySpark.
Solid understanding of data structures, algorithms, and performance tuning.
Familiarity with data governance and security practices in cloud environments.
Excellent problem-solving and communication skills.