Edgematics Group

Edgematics - Technical Lead - Data Engineering

Pune, MH, IN


Summary

Job Title: Tech Lead (Sr. Data Engineer with 9+ Years of Experience)

Location: Pune

Job Type: Full-time, Onsite

Edgematics Technologies is a leading global consulting and solutions integrator in the field of data management, business intelligence, and analytics.

We specialize in delivering end-to-end data management solutions to help organizations make data-driven decisions and achieve their business objectives.

About This Role

We are looking for a talented and experienced Tech Lead (Senior Data Engineer) with hands-on expertise in an ETL tool and a thorough knowledge of CI/CD practices. The role requires technically leading a team of more than 5 engineers, working directly with clients, and creating Data Engineering and Data Quality frameworks.

As a tech lead, you will build ETL jobs, Data Quality jobs, and Big Data jobs; optimize their performance based on a clear understanding of the requirements; create reusable assets; and perform production deployments. Prior experience with DWH platforms such as Snowflake, Redshift, or Synapse is preferred.

Responsibilities

  • Work with a team of engineers to design, develop, and maintain scalable and efficient data solutions using any data integration/ETL tool (such as Talend or Informatica) and Big Data technologies.
  • Design, develop, and maintain end-to-end data pipelines using an ETL/data integration tool (such as Talend or Informatica) to ingest, process, and transform large volumes of data from heterogeneous sources.
  • Design cloud pipelines using Azure Data Factory or AWS Glue/Lambda.
  • Implement end-to-end data integration with ETL technologies.
  • Implement database solutions for storing, processing, and querying large volumes of structured, semi-structured, and unstructured data.
  • Migrate ETL jobs from older versions to newer versions.
  • Write advanced SQL scripts at a medium-to-expert level.
  • Work with the client's technical team and provide guidance during technical challenges.
  • Integrate and optimize data flows between various databases, data warehouses, and Big Data platforms.
  • Collaborate with cross-functional teams to gather data requirements and translate them into scalable and efficient data solutions.
  • Optimize ETL and data-load performance, scalability, and cost-effectiveness through tuning techniques.
  • Interact with the client daily, report technical progress, and respond to technical questions.
  • Implement best practices for data integration.
  • Implement complex ETL data pipelines or similar frameworks to process and analyze massive datasets.
  • Ensure data quality, reliability, and security across all stages of the data pipeline.
  • Troubleshoot and debug data-related issues in production systems and provide timely resolution.
  • Stay current with emerging technologies and industry trends in data engineering and CI/CD, and incorporate them into our data architecture and processes.
  • Optimize data processing workflows and infrastructure for performance, scalability, and cost-effectiveness.
  • Provide technical guidance and foster a culture of continuous learning and improvement.
  • Implement and automate CI/CD pipelines for data engineering workflows, including testing, deployment, and monitoring.
  • Migrate deployments from lower environments to production, then test and validate.

Must Have Skills

  • Must be certified in an ETL tool, a database, and a cloud platform (Snowflake certification preferred).

  • Must have implemented at least 3 end-to-end projects in Data Engineering.
  • Must have worked on performance optimization and tuning for data loads, data processing, and data transformation in Big Data environments.
  • Must be flexible to write code in Java, Scala, Python, etc., as required.
  • Must have implemented CI/CD pipelines using tools like Jenkins, GitLab CI, or AWS CodePipeline.
  • Must have technically led a team of at least 5 members and provided them with technical guidance.
  • Must be able to take technical ownership of Data Engineering delivery.
  • Strong communication skills in client-facing engagements.
  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
  • 5 years of experience in software engineering or a related role, with a strong focus on ETL tools, databases, and integration.
  • Proficiency in ETL tools such as Talend or Informatica for building and orchestrating data pipelines.
  • Hands-on experience with relational databases such as MySQL, PostgreSQL, or Oracle, and NoSQL databases such as MongoDB, Cassandra, or Redis.
  • Solid understanding of database design principles, data modeling, and SQL query optimization.
  • Experience with data warehousing, Data Lake, and Delta Lake concepts and technologies, data modeling, and relational databases.

(ref:hirist.tech)
