Mondo

Lead Data Engineer

United States

16 days ago

Summary

About the Role

We’re looking for a Lead Data Engineer who can balance hands-on engineering with team leadership. You’ll play a key role in managing and optimizing the performance of our enterprise data warehouse (EDW), ensuring timely and accurate data delivery across multiple systems. This role is a hybrid of technical execution and people leadership—ideal for someone who thrives in both domains and wants to help evolve and modernize our data infrastructure.

You’ll work in a complex data environment that blends legacy systems with modern technologies—maintaining current pipelines while helping transition to our future state powered by Spark, Iceberg, and Airflow.


What You'll Do (Day-to-Day):

  • Oversee and ensure all ETL/data jobs are completed on time and meet SLA requirements.
  • Lead and manage a small team, delegating incidents, requests, and backlog items efficiently.
  • Provide production support, resolving issues quickly and maintaining system stability.
  • Guide and contribute to the development of new ETL frameworks and data platform enhancements.
  • Interpret COBOL copybooks to process mainframe EBCDIC files (no coding in COBOL required).
  • Document and improve data engineering processes, reporting metrics, and internal policies.
  • Collaborate cross-functionally with product, business, and other data teams.
  • Balance leadership responsibilities (roughly one-third of your time) with technical execution (the remaining two-thirds).
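For candidates unfamiliar with the copybook-driven ingestion mentioned above, a minimal sketch of the idea follows. The copybook layout and field names here are hypothetical, for illustration only; real copybooks also involve packed-decimal (COMP-3) fields, OCCURS clauses, and redefines that this sketch does not cover.

```python
import codecs

# Hypothetical layout derived from a copybook such as:
#   01 CUSTOMER-REC.
#      05 CUST-ID   PIC X(6).
#      05 CUST-NAME PIC X(10).
# Mapping: field name -> (byte offset, byte length)
LAYOUT = {"CUST-ID": (0, 6), "CUST-NAME": (6, 10)}

def parse_record(raw: bytes, layout: dict = LAYOUT) -> dict:
    """Decode one fixed-width EBCDIC (code page 037) record into text fields."""
    return {
        name: codecs.decode(raw[off:off + length], "cp037").strip()
        for name, (off, length) in layout.items()
    }

# Build a sample EBCDIC record by encoding ASCII text to cp037.
sample = "A12345JANE DOE  ".encode("cp037")
print(parse_record(sample))
# {'CUST-ID': 'A12345', 'CUST-NAME': 'JANE DOE'}
```

In practice, teams generate the layout mapping automatically from the copybook (tools like Precisely Connect, listed below, do this) rather than hand-coding offsets.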


Must-Have Qualifications:

  • 7+ years of data engineering experience (flexible for the right candidate; capped at 15).
  • Proven experience in a production support environment managing ETL/ELT workflows in an EDW.
  • Strong knowledge of both legacy frameworks (COBOL/EBCDIC, Apache Hive, Impala) and modern platforms (Apache Spark, Iceberg, Airflow, Python).
  • Demonstrated leadership experience, including task delegation and mentoring team members.
  • Comfortable working in small to mid-sized teams or in siloed teams within large enterprises.
  • Ability to interpret and work with COBOL copybooks for mainframe file ingestion (no COBOL development needed).


Nice-to-Haves:

  • Experience with Precisely Connect, Cloudera, or Hadoop platforms.
  • Financial services, student lending, or loan processing background.
  • Bachelor’s degree in Computer Science, Engineering, or a related field.
