UST

Senior Big Data Engineer

Pune, MH, IN


Summary

Job Title: Senior Big Data Engineer

Experience Range: 5 to 10 years

Hiring Location: Pune


Job Description:

We are seeking a Senior Big Data Engineer with 5+ years of experience to play a key role in building, managing, and optimizing our big data pipelines. This role involves working with cutting-edge technologies to enhance product capabilities while collaborating with cross-functional teams.


Key Responsibilities:

  • Work closely with Technical Leaders to design and implement robust Big Data solutions.
  • Participate in design discussions and brainstorming sessions to evaluate tools and frameworks for Big Data processing.
  • Develop and maintain data processing pipelines using distributed frameworks like Apache Spark and Hadoop.
  • Implement ETL processes and optimize data workflows using Apache Airflow, dbt, and related tools.
  • Ensure data quality, security, and performance in large-scale data processing environments.
  • Collaborate with multi-region software and support teams to ensure seamless data integration.
  • Stay up to date with the latest advancements in Big Data technologies and contribute to continuous improvements.


Must-Have Skills:

  • 5+ years of experience in Data Engineering.
  • Strong understanding of Big Data ecosystems, including Spark and Hadoop.
  • Proficiency in ETL processes and data pipeline orchestration using Apache Airflow, dbt, or similar tools.
  • Strong programming skills in Python, Java, or Scala, along with SQL expertise.
  • Experience with version control tools (Git) and build tools like Gradle, Maven, or SBT.
  • Experience in cloud-based data platforms (AWS, GCP, or Azure).
  • Strong understanding of object-oriented programming, data structures, algorithms, and performance optimization.
  • Excellent problem-solving, communication, and collaboration skills.


Good-to-Have Skills:

  • Experience with data warehouse platforms (Snowflake, Redshift, BigQuery).
  • Familiarity with streaming data processing frameworks (Kafka, Flink).
  • Knowledge of containerization and orchestration tools (Docker, Kubernetes).
  • Exposure to DevOps practices and CI/CD pipelines.


Qualifications:

  • Bachelor’s/Master’s degree in Computer Science, Information Technology, or a related field.
  • 5-10 years of professional experience in Big Data Engineering.
