Job Title: Senior Data Engineer - Snowflake + dbt
Location: Hyderabad, India
Job Type: Full-time
Job Description
We are seeking a highly skilled and experienced Senior Data Engineer to join our growing data team at Logic Pursuits. In this role, you will lead the design and implementation of scalable, high-performance data pipelines using Snowflake and dbt, define architectural best practices, and drive data transformation at scale. You’ll work closely with clients to translate business needs into robust data solutions and play a key role in mentoring junior engineers, enforcing standards, and delivering production-grade data platforms.
Key Responsibilities
- Architect and implement modular, test-driven ELT pipelines using dbt on Snowflake.
- Design layered data models (e.g., staging, intermediate, and mart layers, or a medallion architecture) aligned with dbt best practices.
- Lead ingestion of structured and semi-structured data from APIs, flat files, cloud storage (Azure Data Lake, AWS S3), and databases into Snowflake.
- Optimize Snowflake for performance and cost: warehouse sizing, clustering, materializations, query profiling, and credit monitoring.
- Apply advanced dbt capabilities including macros, packages, custom tests, sources, exposures, and documentation using dbt docs.
- Orchestrate workflows using dbt Cloud, Airflow, or Azure Data Factory, integrated with CI/CD pipelines.
- Define and enforce data governance and compliance practices using Snowflake RBAC, secure data sharing, and encryption strategies.
- Collaborate with analysts, data scientists, architects, and business stakeholders to deliver validated, business-ready data assets.
- Mentor junior engineers, lead architectural/code reviews, and help establish reusable frameworks and standards.
- Engage with clients to gather requirements, present solutions, and manage end-to-end project delivery in a consulting environment.
Required Qualifications:
- 5 to 8 years of experience in data engineering roles, with 3+ years of hands-on experience working with Snowflake and dbt in production environments.
Technical Skills:
Cloud Data Warehouse & Transformation Stack:
- Expert-level knowledge of SQL and Snowflake, including performance optimization, storage layers, query profiling, clustering, and cost management.
- Experience in dbt development: modular model design, macros, tests, documentation, and version control using Git.
Orchestration and Integration:
- Proficiency in orchestrating workflows using dbt Cloud, Airflow, or Azure Data Factory.
- Comfortable working with data ingestion from cloud storage (e.g., Azure Data Lake, AWS S3) and APIs.
Data Modelling and Architecture:
- Dimensional modelling (star and snowflake schemas) and slowly changing dimensions (SCDs).
- Knowledge of modern data warehousing principles.
- Experience implementing Medallion Architecture (Bronze/Silver/Gold layers).
- Experience working with Parquet, JSON, CSV, or other data formats.
Programming Languages:
- Python: For data transformation, notebook development, and automation.
- SQL: Strong command of querying and performance tuning.
- Jinja (nice to have): Exposure to Jinja templating for advanced dbt development.
Data Engineering & Analytical Skills:
- ETL/ELT pipeline design and optimization.
- Exposure to AI/ML data pipelines, feature stores, or MLflow for model tracking (nice to have).
- Exposure to data quality and validation frameworks, including implementing data quality checks with dbt tests.
Security & Governance:
- Data encryption, secure key management, and security best practices for Snowflake and dbt.
Soft Skills & Leadership:
- Adaptability: Thrive in client-facing roles with competing and changing priorities and fast-paced delivery cycles.
- Stakeholder Communication: Collaborate with business stakeholders to understand objectives and convert them into actionable data engineering designs.
- Project Ownership: End-to-end delivery including design, implementation, and monitoring.
- Mentorship: Guide junior engineers, establish best practices, and build new skills within the team.
- Agile Practices: Work in sprints, participate in Scrum ceremonies, and contribute to story estimation.
Education:
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
- Certifications such as Snowflake SnowPro Advanced or dbt Certified Developer are a plus.
Why Join Us?
- Work on modern data architecture at scale with Snowflake, dbt, and cloud platforms such as Azure and AWS.
- Collaborate with a dynamic and talented team of data professionals.
- Grow your leadership skills through hands-on project ownership and mentorship.
- Competitive compensation, flexible work options, and continuous learning culture.