WTW
Principal Data Engineer - H&B Data Platform
Arlington County, VA
Summary
About Us
Our Health and Benefits business helps large and mid-size clients control health and welfare plan costs, improve health outcomes, and promote employee engagement through broad-based, state-of-the-art interventions. We provide solutions encompassing creative plan design, vendor evaluation and management, pricing and funding strategies, data analytics, valuation support, legal compliance, and governance strategies. We also provide specialty consulting services, including clinical/health management program design, pharmacy solutions, disability/absence management strategies, and claims audit services. Product-based solutions, such as our pharmacy purchasing coalition, round out our broad-based suite of offerings.
Candidates in the following locations will be considered: United States, Canada, and United Kingdom
About The Role
We are seeking an experienced, hands-on, and innovative Principal Data Engineer with expertise in the Azure Cloud platform to design, implement, and optimize scalable data solutions. The ideal candidate will have deep expertise in data engineering, data architecture, and cloud solutions, along with experience with healthcare and benefits data systems. They will work closely with a cross-functional team of Data, DevOps, and Analytics engineers to architect a robust data platform for H&B, ensure efficient data management, and support enterprise-level decision-making processes. The role will work on products that include high-concurrency, highly transactional systems as well as big data systems with complex reporting and business intelligence requirements. The Principal Data Engineer will evaluate and improve existing data pipelines and architectures and play a crucial role in creating evolutionary data models that converge into one global data platform.
The Role
• Lead the end-to-end data engineering of the health and benefits data platform using Azure services, ensuring scalability, flexibility, and reliability.
• Develop a broad understanding of the data lakehouse architecture, built for analytic purposes, including the impact of changes on the whole system, the onboarding of client data assets, and the security implications of the solution.
• Design new architecture or improve upon existing architecture, including data ingestion, storage, transformation, and consumption layers.
• Provide technical direction and guidance to a team of data engineers.
• Define flexible, scalable, and dynamic ingestion pipelines that can handle all data formats and sources, such as CSV, XML, Parquet, replication services, and APIs.
• Define flexible, scalable, and repeatable ETL/ELT pipelines to load the data warehouse with precision and quality.
• Define data models, schemas, and database structures optimized for H&B use cases, including claims, census, placement, broking, and finance sources.
• Design solutions for seamless integration of diverse health and benefits data sources.
• Implement data governance and security best practices in compliance with industry standards and regulations using Microsoft Purview.
• Evaluate the data lakehouse architecture, within the medallion architecture method, to understand how technical decisions may impact business outcomes, and suggest new solutions/technologies that better align with the Health and Benefits data strategy.
• Draw on internal and external practices to establish data lakehouse architecture best practices and standards within the team, and ensure that they are shared and understood.
• Continuously develop technical knowledge and be recognised as a key resource across the global team.
• Collaborate with other specialists and/or technical experts to ensure the H&B Data Platform is delivering to the highest possible standards and that solutions support stakeholder needs and business requirements.
• Initiate practices that will increase code quality, performance, and security.
• Develop recommendations for continuous improvement initiatives, applying deep subject matter knowledge to provide guidance at all levels on the end-to-end implications of changes.
• Build the team's technical expertise, capabilities, and skills through the delivery of regular feedback, knowledge sharing, and coaching.
• Analyze existing database designs, pipelines, and code base to suggest improvements that promote performance, stability, and interoperability.
• Work with product management and business subject matter experts to translate business requirements into a scalable and flexible data lakehouse design that can be deployed globally.
• Maintain the governance model on the data lakehouse architecture through training, design reviews, code reviews, and progress reviews.
• Participate in the development of data lakehouse architecture and development roadmaps in support of business strategies and objectives.
• Enable business intelligence, reporting, and analytics stakeholders with a data warehouse that is performant, optimized, flexible, and scalable.
• Communicate with key stakeholders and development teams on technical solutions; present and advocate for proposals by way of high-level solutions to end users and/or stakeholders.
• Demonstrate high learning adaptability and an understanding of the implications of technical issues for business requirements and/or operations.