Devlats Pvt Ltd

ETL Integration Engineer - Apache Airflow/Informatica

SC, TS, IN

about 1 month ago

Summary

We are seeking a highly skilled ETL Integration Engineer with 6-8 years of experience in data integration, cloud data platforms, and ETL/ELT pipeline development. The ideal candidate should have expertise in ETL tools such as dbt, Apache Airflow, Informatica, Talend, or Fivetran, strong SQL proficiency, and hands-on experience in cloud data platforms (AWS Redshift, Snowflake, Google BigQuery, Azure Synapse). The role involves integrating enterprise applications like ERP, CRM, OMS, and PIM systems into a centralized data ecosystem while ensuring data security and governance.

Key Responsibilities

  • Design, develop, and maintain ETL/ELT pipelines for large-scale data ingestion, transformation, and processing (a minimal Airflow sketch follows this list).
  • Work with ETL tools such as dbt, Apache Airflow, Informatica, Talend, or Fivetran in cloud or hybrid environments.
  • Optimize data workflows to improve efficiency, scalability, and reliability.
  • Work with SQL, NoSQL, and Graph databases for data modeling, schema design, and query optimization.
  • Optimize SQL queries for performance, indexing, and partitioning.
  • Manage cloud-based data platforms such as AWS Redshift, Snowflake, Google BigQuery, and Azure Synapse.
  • Implement role-based access control (RBAC), row-level security (RLS), column-level security (CLS), and dynamic data masking (DDM) to enforce data security.
  • Ensure compliance with data governance frameworks, industry standards, and privacy regulations (GDPR, CCPA).
  • Monitor and maintain data quality, lineage, and security policies.
  • Integrate ERP, CRM, OMS, and PIM systems with cloud data platforms for centralized data management.
  • Develop and maintain real-time and batch data processing pipelines for enterprise applications.
  • Work with APIs, webhooks, and data streaming technologies for seamless system integration.
  • Enable commerce analytics and BI solutions by ensuring timely and accurate data availability.
  • Support BI teams by optimizing data structures for reporting and dashboarding in Power BI, Tableau, or Looker.
  • Collaborate with data science teams to facilitate predictive analytics and machine learning workflows.
  • Deploy data pipelines in cloud environments (AWS, Azure, GCP) using Infrastructure as Code (IaC) tools like Terraform.
  • Automate data pipeline deployments using CI/CD pipelines (Jenkins, GitHub Actions, or AWS CodePipeline).
  • Monitor pipeline performance, failures, and optimizations with logging and alerting tools (Datadog, Prometheus, Grafana).
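
For illustration only, the sketch below shows the kind of pipeline this role would own: a minimal Airflow DAG with placeholder extract/transform/load steps. The DAG name, schedule, and sample data are hypothetical and assume Airflow 2.4 or newer; they are not taken from the posting.

# A minimal, illustrative Airflow DAG: a hypothetical daily_orders_etl pipeline
# with placeholder extract/transform/load steps (assumes Airflow 2.4+).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull raw records from a source system (ERP, CRM, OMS, PIM).
    return [{"order_id": 1, "amount": 100.0}]


def transform(**context):
    # Placeholder: clean and reshape the records returned by the extract task.
    rows = context["ti"].xcom_pull(task_ids="extract")
    return [{**row, "amount_rounded": round(row["amount"], 2)} for row in rows]


def load(**context):
    # Placeholder: a real load would write to Redshift, Snowflake, BigQuery, or Synapse.
    rows = context["ti"].xcom_pull(task_ids="transform")
    print(f"loaded {len(rows)} rows")


with DAG(
    dag_id="daily_orders_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task

In practice, the load step would target one of the warehouses listed above and the DAG itself would be deployed through the IaC and CI/CD tooling described in the responsibilities.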

Required Skills & Qualifications

  • 6+ years of hands-on experience with ETL/ELT tools such as dbt, Apache Airflow, Informatica, Talend, or Fivetran.
  • Expertise in SQL and database technologies (SQL, NoSQL, Graph databases) with experience in data modeling, schema design, and query optimization.
  • Strong cloud experience with platforms like AWS Redshift, Snowflake, Google BigQuery, or Azure Synapse.
  • Knowledge of data security measures including RBAC, RLS, CLS, and DDM.
  • Experience in integrating enterprise applications (ERP, CRM, OMS, PIM) into a centralized data ecosystem.
  • Strong understanding of data architecture, governance, and compliance standards.
  • Experience with APIs, data streaming (Kafka, Kinesis, or Pub/Sub), and batch processing (a minimal streaming sketch follows this list).
  • Familiarity with Python, Scala, or Java for scripting and automation in ETL workflows.
  • Experience with Data Mesh and Data Fabric architectures.
  • Knowledge of GraphQL and REST API-based data integration.
  • Hands-on experience with Kubernetes, Docker, and serverless computing.
  • Background in eCommerce data solutions and commerce analytics.
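
As a companion sketch of the streaming-ingestion experience mentioned above, the snippet below micro-batches messages from a hypothetical "orders" topic before a warehouse write. It assumes the kafka-python client and a broker at localhost:9092; the flush is a print placeholder rather than a real load.

# A minimal, illustrative streaming consumer: micro-batches messages from a
# hypothetical "orders" topic before a warehouse write (assumes kafka-python
# and a broker at localhost:9092).
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
    enable_auto_commit=True,
)

batch = []
for message in consumer:
    batch.append(message.value)
    if len(batch) >= 500:
        # Placeholder: a real pipeline would COPY/MERGE this micro-batch into
        # Redshift, Snowflake, BigQuery, or Synapse.
        print(f"flushing {len(batch)} records")
        batch.clear()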

(ref:hirist.tech)
