The Data Engineer will play a key role in building and maintaining a modern data infrastructure. They will design, develop, and optimize ELT/ETL pipelines and implement efficient data models to support analytics, reporting, and business decision-making. The role involves deep collaboration with cross-functional teams to deliver scalable and high-performing data solutions using tools such as Snowflake, ADF, dbt, and Python.
Responsibilities include:
Design, build, and maintain scalable and robust data pipelines using ELT/ETL patterns to ingest, transform, and integrate data
Architect and implement efficient data models using Star, Snowflake, and One Wide Table (OWT) design patterns (see the modeling sketch after this list)
Create and maintain documentation of data architecture, data pipelines, and processes to ensure transparency and reproducibility
Integrate data from multiple sources including databases, APIs, and third-party platforms using tools like Azure Data Factory (ADF) and dbt
Lead technical discussions, advocate for best practices, and ensure solid data foundations and high standards in data engineering workflows
Optimize data systems for performance and cost efficiency using partitioning, clustering, caching, indexing, and query-tuning techniques (see the optimization sketch after this list)
Perform QA audits, manage data loads, generate memo files, and handle ad hoc data requests to ensure data integrity and reliability
Support analytics and reporting by developing reusable metrics, dashboards, and self-service tools in Power BI and/or Sisense
Enhance the SDLC by incorporating CI/CD pipelines, version control (e.g., Git), and continuous improvement practices into data engineering processes
Collaborate with internal and external stakeholders to gather requirements and deliver comprehensive data solutions
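
For illustration, the sketch below contrasts the star-schema and one-wide-table patterns named above. It is a minimal example, not a description of this role's actual warehouse; the table and column names (dim_customer, fct_orders, owt_orders) are hypothetical.

-- Star schema: a narrow fact table keyed to a conformed dimension.
CREATE TABLE dim_customer (
    customer_sk  INTEGER PRIMARY KEY,   -- surrogate key
    customer_id  STRING,
    region       STRING
);

CREATE TABLE fct_orders (
    order_sk     INTEGER PRIMARY KEY,
    customer_sk  INTEGER REFERENCES dim_customer (customer_sk),
    order_date   DATE,
    amount       NUMBER(12,2)
);

-- One Wide Table (OWT): dimension attributes denormalized onto the
-- fact grain so BI tools can query without joins.
CREATE TABLE owt_orders AS
SELECT f.order_sk,
       f.order_date,
       f.amount,
       d.customer_id,
       d.region
FROM fct_orders f
JOIN dim_customer d USING (customer_sk);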
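
Likewise, a minimal sketch of the clustering and pruning techniques from the optimization bullet, assuming a Snowflake warehouse; the event_log table is illustrative.

-- Define a clustering key so Snowflake co-locates micro-partitions
-- by date, improving partition pruning on range filters.
ALTER TABLE event_log CLUSTER BY (event_date);

-- Filters on the clustering key let the optimizer skip partitions.
SELECT COUNT(*)
FROM event_log
WHERE event_date >= DATEADD(day, -7, CURRENT_DATE);

-- Inspect how well the table is clustered on that key.
SELECT SYSTEM$CLUSTERING_INFORMATION('event_log', '(event_date)');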
Education:
Bachelor’s Degree in Computer Science, Mathematics, Statistics, Finance, Information Systems, or a related technical field, or equivalent experience
Experience:
5+ years of professional experience in data engineering, data analytics, or a similar technical role
Strong SQL skills, including advanced use of joins, unions, CTEs, aggregations, window functions such as LAG/LEAD, and query optimization techniques (see the SQL sketch after this list)
Proficiency in Python for data manipulation, scripting, and automation
Experience working with Snowflake, dbt, and Azure Data Factory (ADF)
Demonstrated experience in data modeling, including dimensional and modern approaches (Star, Snowflake, OWT)
Hands-on experience in building and maintaining data pipelines (ETL/ELT)
Understanding of cost optimization, caching, partitioning, and indexing strategies for performance tuning
Familiarity with BI tools such as Power BI, Sisense, Looker, Tableau, and Domo
Experience with customer personalization solutions and handling large datasets
Exposure to scripting languages such as Python, Perl, or shell
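
To make the SQL expectations above concrete, here is a small sketch combining a CTE with LAG to compute order-over-order deltas per customer; the orders table and its columns are hypothetical.

-- CTE plus a window function: compare each order with the
-- customer's previous order.
WITH ranked AS (
    SELECT customer_id,
           order_date,
           amount,
           LAG(amount) OVER (
               PARTITION BY customer_id
               ORDER BY order_date
           ) AS prev_amount
    FROM orders
)
SELECT customer_id,
       order_date,
       amount - prev_amount AS delta
FROM ranked
WHERE prev_amount IS NOT NULL;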
Tools & Skills:
Deep understanding of complex SQL and Snowflake SQL syntax, including Time Travel, Streams, Cloning, and Role-Based Access Control (see the Snowflake sketch after this list)
Strong knowledge of Snowflake, Azure Data Factory, and dbt
Experience with version control systems and CI/CD workflows
Knowledge of Databricks (Azure Databricks preferred) and the ability to interpret existing solutions
Familiarity with reporting tools, especially Power BI and/or Sisense
Advanced proficiency in Python and Excel for data analysis and transformation
Understanding of data warehousing, proactive data quality monitoring, and structured and semi-structured data formats such as JSON (see the JSON query sketch after this list)
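
As a hedged sketch of the Snowflake features listed above (Time Travel, Cloning, Streams); the orders table and object names are illustrative only.

-- Time Travel: query the table as it looked one hour ago.
SELECT * FROM orders AT (OFFSET => -3600);

-- Cloning: zero-copy restore of the pre-change state.
CREATE TABLE orders_restored CLONE orders AT (OFFSET => -3600);

-- Streams: capture row-level changes for incremental loads.
CREATE STREAM orders_stream ON TABLE orders;
SELECT * FROM orders_stream WHERE METADATA$ACTION = 'INSERT';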
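
And a short sketch of querying JSON in Snowflake, since the posting calls out semi-structured formats; the raw_events table and its VARIANT column payload are assumptions, not part of the posting.

-- Query a JSON payload stored in a VARIANT column, unnesting
-- the items array with LATERAL FLATTEN.
SELECT payload:customer.id::STRING AS customer_id,
       item.value:sku::STRING      AS sku,
       item.value:qty::NUMBER      AS qty
FROM raw_events,
     LATERAL FLATTEN(input => payload:items) AS item;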
Key Competencies:
Proven problem-solving skills and high attention to detail
Ability to partner with business stakeholders to define questions and build data sets to answer them
Capable of navigating ambiguity and balancing multiple priorities in a fast-paced environment
Excellent communication and presentation skills for technical and non-technical audiences
Self-starter with a spirit of innovation and consistent delivery
Demonstrated ability to work collaboratively in multi-disciplinary teams and produce results quickly
Assets:
Experience in the telecom or banking industries, especially related to data collection or retention
Hands-on experience with ADF data transformations for custom reporting models
Experience in scripting and automation using Python, Perl, or Shell
Familiarity with data transformations using tools like dbt
Experience with data analysis, report development, and business analysis
Experience with tools like Looker, Excel, Power BI, Tableau, R, SAS