Comments/Special Instructions:
Only candidates with a green card or US citizenship will be accepted.
Key Responsibilities
Enterprise Data Lake Development
Contribute to building and scaling the enterprise data lake, which will support critical company operations.
Work with agile teams to identify source data from various applications, ingest it into designated landing zones, and ensure compliance with best practices.
Ensure data ingestion processes are robust and do not disrupt production systems.
Agile Collaboration
Function within cross-functional agile teams.
Pair with team members to provide support and share expertise when needed.
Data Processing
Enable and support both batch and real-time processing to drive data-centric decision-making.
Job Responsibilities
Design and implement robust, scalable, and secure data solutions on Azure Cloud to support real-time and batch processing
Architect and maintain data pipelines that handle high data volume, velocity, and variety
Collaborate with cross-functional teams to gather requirements and deliver data solutions aligned with business objectives
Apply cloud security best practices to ensure data compliance and protection
Build and maintain dashboards and custom reports for ongoing analysis and insights
Optimize workflows and processes to accelerate analytics implementation and improve efficiency
Troubleshoot, refine, and enhance data architecture to support business growth
Guide teams on analytics best practices, ensuring consistency and accuracy across implementations
Identify opportunities for process improvement and data innovation through actionable insights
Foster a collaborative team environment and engage with stakeholders at all levels to align goals and priorities
Qualifications & Skills
General Skills
Senior-level experience (7+ years). Candidates with 5-6 years are acceptable if they meet the skill set; also open to candidates with 1-3 years of Azure experience who have prior experience with AWS, GCP, or similar platforms.
Strong communication and interpersonal skills.
Ability to mentor, coach, and collaborate within the team.
Technical Skills
Azure Cloud: Expertise in designing scalable and secure data solutions using Azure services (e.g., Data Lake, Synapse Analytics, Databricks).
Security Practices: Proficiency in Azure security features (e.g., Active Directory, network security groups, encryption).
Python: Knowledge required; ability to ramp up quickly if not proficient.
Strong understanding of industry standards and production practices for data engineering.
Experience with batch processing, which is critical for supporting data-driven initiatives.
Additional Preferences
ITIL v3 or v4 certification is a plus.
Utility industry experience is preferred but not required.
Bachelor’s degree in Computer Science or equivalent work experience.
Job Type: Contract
Pay: $65.00 - $70.00 per hour
Expected hours: 40 per week
Benefits:
401(k)
Dental insurance
Health insurance
Schedule:
8-hour shift
Monday to Friday
Application Question(s):
Only green card holders and US citizens can apply for this job.
Education:
Bachelor's (Required)
Experience:
Azure: 5 years (Required)
ETL: 5 years (Required)
Azure Data Lake: 5 years (Required)
Ability to Commute:
Columbus, OH 43207 (Required)
Ability to Relocate:
Columbus, OH 43207: Relocate before starting work (Required)
Work Location: In person