Data Engineer
We are seeking a motivated and skilled Mid-Level Data Engineer to join our Data Engineering Team. In this role, you will assist in designing, developing, and maintaining data solutions primarily using Microsoft Azure data services. You will collaborate closely with senior data engineers, data and BI analysts, and business stakeholders to ensure reliable, efficient, and secure data processes.
Key Responsibilities:
* Implement and support ETL processes that ingest, transform, and store data from a variety of sources.
* Work with senior engineers, analysts, and stakeholders to translate business requirements into technical solutions.
* Monitor and troubleshoot data pipeline issues, ensuring data accuracy, quality, and availability.
* Participate in optimizing data workflows for enhanced performance and cost-efficiency.
* Assist in enforcing data governance, compliance, and security best practices.
* Create and manage detailed documentation for data pipelines and integration processes.
* Perform source data analysis, data mapping, and support data model implementations.
* Utilize orchestration tools such as Apache Airflow to automate and streamline data workflows.
* Participate in Agile ceremonies, including sprint planning, retrospectives, and code reviews.
* Provide support in testing, validation, and deployment phases of data solutions.
Qualifications:
* Bachelor's degree in Computer Science, Information Technology, or a related discipline.
* Minimum of 4 years of relevant experience in Data Engineering with a focus on Microsoft Azure.
Required Skills:
* Solid experience with Azure Data Factory, Azure Databricks, Azure SQL Database, and Azure Data Lake Storage.
* Proficiency in SQL and Python, with experience in Shell scripting.
* Hands-on experience in ETL processes, data warehousing, and data integration.
* Good understanding of data modeling, data governance, and data security practices.
* Experience with data orchestration tools like Apache Airflow.
* Familiarity with APIs, file transfer protocols (SFTP/FTPS), streaming data, and database connectors.
* Practical knowledge of Agile methodologies and the software development lifecycle (SDLC).
Preferred Skills:
* Exposure to BI tools such as Tableau or Power BI.
* Familiarity with performance tuning and optimization techniques.
Personal Attributes:
* Strong analytical and problem-solving abilities.
* Ability to collaborate effectively with teams and stakeholders across various disciplines.
* Good communication skills, with the ability to convey technical information clearly.
* Organized, detail-oriented, and committed to delivering high-quality results.
* Enthusiasm for continuous learning and professional growth.
Why Join Us at APCO Holdings
At APCO Holdings, we are a trusted leader in the automotive industry, providing finance and insurance (F&I) products and services through our renowned brands, including EasyCare, GWC Warranty, and MemberCare. With over 35 years of experience, we've protected more than 11 million drivers and paid out over $3.5 billion in claims, underscoring our commitment to excellence and customer satisfaction.
This role will be critical to executing complex data engineering initiatives, supporting scalable and secure data pipelines, and helping build a modern data ecosystem for one of the nation's leading automotive F&I products and services companies.