Responsibilities
Lead the design, configuration, implementation, monitoring, and management of data integration frameworks for optimal performance and reliability (20%).
Develop and maintain big data infrastructure (data warehouses, data lakes, APIs) using Hadoop or MapReduce platforms (20%).
Provide guidance and hands-on work in modeling data warehouse solutions (cloud and on-premises), including dimensional modeling and core data warehousing concepts (15%).
Oversee and support engineering data initiatives, both long-term and ad hoc, including data collection frameworks and ETL/ELT activities (15%).
Enforce best practices for data auditing, scalability, reliability, and application performance; develop ETL/ELT solutions (15%).
Analyze and interpret data using statistical techniques, provide actionable insights, and ensure data quality control (10%).
Continually improve job knowledge through professional development and networking (5%).
Required Skills & Experience
10+ years of experience in database design, ETL development, and leading cross-functional data engineering teams.
7+ years as a Big Data Engineer in the Hadoop and AWS ecosystems; healthcare industry experience required, BCBS experience preferred.
Hands-on experience with Ab Initio ETL development, Hadoop, AWS technologies, relational database design, and data modeling.
Experience developing both batch and streaming data applications using Cloudera/Hadoop and/or AWS.
Strong knowledge of data integration, modeling, and warehousing (dimensional modeling, OLAP, denormalized structures).
Proficiency in implementing scalable, performance-tuned ETL/ELT solutions.
Certifications (Must Have):
AWS Certified Big Data – Specialty
Cloudera Certified Developer for Apache Hadoop (CCDH)
Certifications (Good to Have):
OCP Java SE 6 Programmer Certification
Experience working with health insurance customers required.