Databricks, Python, PySpark, B4HANA, hands-on SQL experience, Lakehouse knowledge, CI/CD
Ingest data from a different internal source system via a Kafka connector (built by another team) into the bronze layer, clean the data, and implement data quality checks (among others: reconciliation, business rules).
Code business rules efficiently and effectively, using good coding principles so that other developers in the team can easily understand and build upon them.
Make data available to a consumption layer on a regular schedule without human intervention, according to business requirements and with 99% availability and trustworthiness.
Drive functional and technical discussions independently with stakeholders.
DevOps understanding
Should be flexible to work on both development and L2 support tasks
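To give a feel for the data quality task described above, here is a minimal, illustrative sketch in plain Python of a reconciliation check plus business-rule validation on ingested rows. All names (`check_reconciliation`, `RULES`, `apply_business_rules`) and the rules themselves are hypothetical, not from the posting; in the actual role this logic would typically live in PySpark jobs on Databricks.

```python
# Hypothetical sketch of bronze-layer data quality checks:
# 1) reconciliation (row counts match the source system)
# 2) business rules (rows failing any rule are quarantined)

def check_reconciliation(source_count: int, bronze_rows: list) -> bool:
    """Reconcile: every row reported by the source must have landed in bronze."""
    return len(bronze_rows) == source_count

# Example business rules; real rules would come from business requirements.
RULES = {
    "amount_non_negative": lambda r: r.get("amount", 0) >= 0,
    "currency_present": lambda r: bool(r.get("currency")),
}

def apply_business_rules(rows):
    """Split rows into clean and quarantined, tagging failed rule names."""
    clean, quarantined = [], []
    for row in rows:
        failed = [name for name, rule in RULES.items() if not rule(row)]
        (quarantined if failed else clean).append({**row, "failed_rules": failed})
    return clean, quarantined

rows = [
    {"amount": 10.0, "currency": "EUR"},
    {"amount": -5.0, "currency": "EUR"},   # violates amount_non_negative
    {"amount": 3.5, "currency": ""},       # violates currency_present
]
clean, quarantined = apply_business_rules(rows)
print(check_reconciliation(3, rows))  # → True: counts match
print(len(clean), len(quarantined))   # → 1 2
```

In a real pipeline, the quarantined rows would be written to a separate table with their failed rule names for investigation, while clean rows flow on toward the consumption layer.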