Please go through the role below and share your detailed profile -
Role: GCP Data Engineer
Location: Richardson, TX (Dallas area; Day 1 onsite)
Job Type: Open to C2C / C2H / Full Time
Responsibilities and Required Skills
Experience with GCP services such as Compute Engine, Dataproc, Google Kubernetes Engine (GKE), Cloud Storage, BigQuery, Pub/Sub, Cloud Functions, and Dataflow.
Experience with Cloud Composer and ETL over large data sets using PySpark, Python, Spark SQL, DataFrames, and PyTest.
Develop and implement proactive monitoring and alerting mechanisms for data issues.
Familiarity with CI/CD pipelines and automation tools such as Jenkins, GitHub, and GitHub Actions.
Ability to write complex SQL queries to compute business results.
Develop architecture recommendations based on GCP best practices and industry standards.
Work through all stages of the data solution life cycle: analyze and profile data; create conceptual, logical, and physical data model designs; and architect and design ETL, reporting, and analytics solutions.
Conduct technical reviews and ensure that GCP solutions meet functional and non-functional requirements.
Strong knowledge of GCP architecture and design patterns.