Designation / Job title : 261312-Developer Programmer
Primary or mandatory skills:
· Data warehouse solutioning
· Azure - Databricks
· Big Data Platform - Cloudera
· Teradata Tools and utilities
· Linux
· Hadoop/Spark
Good-to-have skills:
Data lake, Azure, Telecom Domain Knowledge
Detailed job description:
Minimum 6.5 years of experience in Enterprise Data Warehouse solutioning; exposure to a Unified Data Platform including Azure Databricks, Apache Airflow, and Microsoft SQL Server; big data technology stacks such as Cloudera, HBase, Hive, Impala, and Spark; and analytics tools such as Python, R, and Azure
In-depth knowledge of Teradata utilities and macros
Data lake, Azure
Strong SQL analytical skills
Knowledge of Unix systems
Good to have: knowledge of Control-M scheduling and Apache Airflow
Involved in business requirement gathering, requirements analysis, design, solution walkthroughs, and workshops; identify gaps in the solution and business requirements together with business and IT teams.
Create detailed technical design documents based on the requirements and the high-level solution design.
Basic knowledge of Spark using Scala or Python; optimize the performance of Spark applications on the big data platform and Databricks.
Strong analytical mindset and ability to work independently in a fast-paced and rapidly changing environment.
Work on and continuously improve the DevOps pipeline and tooling to actively manage continuous integration/continuous deployment (CI/CD) processes.
Good to have: experience with any ETL tool
Experience with Teradata, Informatica, Linux, Hadoop/Spark, and Control-M
Experience working in an Agile delivery model
Preparing implementation plans, reports, manuals, and other documentation on the status, operation, and maintenance of data warehousing applications.
Job Type: Full-time
Pay: $74,000.00 – $90,000.00 per year
Work Location: In person