Develop and deploy Apache Spark and Scala programs on Azure Databricks in a dynamic, challenging work environment
Help write analytics code, services, and components in Java, Apache Spark, and related technologies such as Scala and PySpark (Python)
Responsible for systems analysis, design, coding, unit testing, and other SDLC activities
Gather and understand requirements, analyze and convert functional requirements into concrete technical tasks, and provide reasonable effort estimates
Work proactively, independently, and with global teams to address project requirements, and raise issues and challenges with enough lead time to mitigate project delivery risks
Work as a DevOps team member to monitor and maintain operational processes
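As an illustration of the Spark-and-Scala-on-Databricks development described above, here is a minimal sketch of a batch aggregation job. The table names, column names, and job name are hypothetical placeholders, not details from this posting.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object DailyRevenueJob {
  def main(args: Array[String]): Unit = {
    // On Databricks a SparkSession already exists; getOrCreate reuses it.
    val spark = SparkSession.builder()
      .appName("DailyRevenueJob")
      .getOrCreate()

    // Hypothetical input: a table of order events.
    val orders = spark.read.table("sales.orders")

    // Aggregate revenue and order counts per day and region.
    val daily = orders
      .withColumn("order_date", to_date(col("order_ts")))
      .groupBy("order_date", "region")
      .agg(
        sum("amount").as("revenue"),
        count(lit(1)).as("order_count")
      )

    // Hypothetical output table, overwritten on each run.
    daily.write.mode("overwrite").saveAsTable("sales.daily_revenue")
  }
}
```

In a Databricks deployment, a job like this would typically be packaged as a JAR or notebook and scheduled through a Databricks job, with the DevOps-style monitoring mentioned above watching its runs.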
Required Qualifications:
10+ years of working experience in Python, PySpark, and Scala
8+ years of experience working with Microsoft SQL Server and NoSQL databases such as Cassandra
Hands-on working experience with Azure Databricks
Ability to understand an existing application codebase, perform impact analysis, and update code as required by business logic or for optimization
Exposure to DevOps methodology and to creating CI/CD deployment pipelines
Exposure to Agile methodology, specifically using tools such as Rally
Excellent analytical and communication skills, both verbal and written
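The SQL Server experience listed above commonly means loading relational data into Spark over JDBC. Below is a hedged sketch using Spark's standard JDBC data source; the host, database, table, and user are placeholders, and in practice credentials would come from a Databricks secret scope rather than being hard-coded.

```scala
import org.apache.spark.sql.SparkSession

object SqlServerExtract {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("SqlServerExtract")
      .getOrCreate()

    // Placeholder connection details for a hypothetical SQL Server instance.
    val customers = spark.read
      .format("jdbc")
      .option("url", "jdbc:sqlserver://example-host:1433;databaseName=crm")
      .option("dbtable", "dbo.customers")
      .option("user", "spark_reader")
      .option("password", "<from-secret-scope>")
      .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
      .load()

    // Inspect the inferred schema before downstream processing.
    customers.printSchema()
  }
}
```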