Nagarro

Big Data GCP Engineer

Bengaluru, KA, IN


Summary

Role: GCP Data Engineer

Required Experience: 5 - 8 years

Job Location: Bengaluru


Job Description:

Must-have skills: Spark, Python, GCP (Dataproc, Pub/Sub, BigQuery, etc.), Hadoop, Kafka, SQL Server/NoSQL.


Details

  • Bachelor's Degree in Engineering, Computer Science, CIS, or related field (or equivalent work experience in a related field)
  • 4 years of experience in Data Engineering, Data Lake, Data Mesh, Data Warehousing/ETL
  • 4 years of experience working on project(s) involving the implementation of solutions applying development life cycles (SDLC)
  • Experience working with Continuous Integration/Continuous Deployment tools with Git, Bitbucket, Jenkins
  • 4 years of experience in systems analysis, including defining technical requirements and performing high-level design for complex solutions


Required Skills

  • Strong expertise in GCP services: Google Cloud Storage, BigQuery, Dataflow, Cloud Composer, Dataproc, and Pub/Sub
  • 4 years of experience in Hadoop, Google Cloud (GCP), and Big Data components (specific to the Data Engineering role)
  • Expertise in Python, advanced SQL, PySpark (Spark Batch and Streaming), Airflow, Kafka
  • Exposure to REST API creation and usage
  • Experience in functional programming, test-first development (unit testing and regression testing), and OOP concepts (classes and objects)
  • Exposure to DB2 and Teradata is a plus
