The United States Postal Service Office of Inspector General is seeking a highly qualified and versatile individual to fill our Data Engineer position in the Research and Insights Solution Center located in Arlington, Virginia.
Bring your skills and voice to our team!
About the Position:
The Research and Insights Solution Center (RISC) is the chief data and research component of the OIG, composed of data scientists, data analysts, programmatic subject matter experts, geographic information system professionals, data engineers, research specialists, economists, and public policy analysts. Our analytics group offers the opportunity to drive value to the organization by designing and developing analytical solutions for auditors, investigators, and researchers.
The RISC analytics group is currently seeking an experienced Data Engineer and Full Stack Developer who will provide expert-level advice on data engineering and build data pipelines using sound DevSecOps practices. In this role, you will be responsible for designing, developing, and maintaining data pipelines, web applications, and machine learning workflows.
The USPS OIG uses a Pay Banding system, which is equivalent to the Federal GS scale. Grade and salary determinations will be made based upon a candidate's education and professional experience.
This position is being advertised at the Journey Band level, equivalent to a GS-9 to GS-12. The salary range for this position is $69,923.00 - $131,826.00. The salary figures include locality pay. Promotion potential to a GS-13 equivalent is at management's discretion.
Position Responsibilities:
- Work with cross-functional teams to deploy scalable data and artificial intelligence (AI) solutions on cloud platforms such as Azure, ensuring alignment with organizational goals.
- Design and implement data warehousing solutions that support analytics and data science initiatives, using advanced knowledge of database structures, data models, and performance optimization techniques.
- Develop and manage automated data pipelines to maintain data integrity, deploy machine learning models, and facilitate collaboration with data scientists and analysts.
- Use programming languages and tools like Python, Databricks, and Azure Data Lake to manipulate structured and unstructured data, creating efficient and scalable data pipelines.
- Integrate diverse data sources, including flat files, relational databases, SaaS applications, and web services using techniques such as JDBC/ODBC connections, REST APIs, and web scraping.
- Implement monitoring solutions and troubleshoot and resolve performance and production issues within data pipelines, leveraging findings to propose process improvements.
- Assess new data engineering tools and technologies, providing management with recommendations for enhancing data operations.
- Apply agile methodologies using tools like Azure DevOps and Git to streamline development processes.
- Design and develop responsive web applications using modern front-end frameworks (e.g., React, Angular, Vue.js) as part of full-stack development initiatives.
- Oversee the software development lifecycle, ensuring automated testing and quality assurance for data and analytics products.
- Create and integrate RESTful APIs with back-end services to enhance system interoperability and data accessibility.
- Develop automated solutions to improve operational efficiency and data governance.
- Provide strategic recommendations regarding data architecture and integration to meet evolving needs.
- Work closely with data owners to establish and enforce data quality and documentation standards.
- Collaborate with data analysts, data scientists, investigators, auditors, and researchers to address and fulfill the data requirements of the organization.
Minimum Requirements for Position:
Note: You must meet all of the minimum qualifications listed below.
- Bachelor's degree from an accredited college or university
- Must have specialized experience in building and maintaining data pipelines in cloud-based tools such as Azure Databricks, Azure Data Lake, or similar platforms
- Must have specialized experience with Python or SQL
- Must have at least 1 year of specialized experience with containerization and building CI/CD pipelines using tools such as Docker, Kubernetes, Azure DevOps, or Jenkins
- Must have at least 1 year of specialized experience integrating REST and SOAP APIs to create and access data or to trigger procedures or commands
How to Apply:
Click Apply to view the official announcement. You will be able to download and review the announcement details, requirements, and additional information on submitting your application.
Job Announcement Notifications:
Want to be notified when new job announcements are posted? Subscribe to email updates for 'Employment Opportunities' at the U.S. Postal Service Office of Inspector General.