About us:
Intuitive is an innovation-led engineering company delivering business outcomes for hundreds of enterprises globally. With a reputation as a Tiger Team and Trusted Partner of enterprise technology leaders, we help solve the most complex Digital Transformation challenges across the following Intuitive Superpowers:
Modernization & Migration
- Application & Database Modernization
- Platform Engineering (IaC/EaC, DevSecOps & SRE)
- Cloud Native Engineering, Migration to Cloud, VMware Exit
- FinOps
Data & AI/ML
- Data (Cloud Native / Databricks / Snowflake)
- Machine Learning, AI/GenAI
Cybersecurity
- Infrastructure Security
- Application Security
- Data Security
- AI/Model Security
SDx & Digital Workspace (M365, G Suite)
- SDDC, SD-WAN, SDN, NetSec, Wireless/Mobility
- Email, Collaboration, Directory Services, Shared Files Services
Intuitive Services:
- Professional and Advisory Services
- Elastic Engineering Services
- Managed Services
- Talent Acquisition & Platform Resell Services
About the job:
Title: Data Platform Engineer
Start Date: Immediate
Position Type: Full Time
Location: Remote across India (must be willing to relocate to Abu Dhabi)
Role Overview:
We are seeking a highly skilled and motivated Data Platform Engineer to join the team. The role centres on building additional platform capabilities and on optimising and maintaining the robust data pipelines and platforms that empower data-driven decision-making across the ADIA organisation. The ideal candidate is passionate and motivated, with extensive experience in data platform engineering and ETL development and expert-level PySpark and Python skills. Strong communication skills are a must: candidates need to be articulate, precise, and concise.
Responsibilities:
- Collaborate with stakeholders during requirements clarification and sprint planning sessions to ensure alignment with business objectives.
- Design and implement technical solutions, including ETL pipelines, leveraging PySpark to extract, transform, and load data efficiently (a minimal sketch follows this list).
- Build solutions to integrate the two main data platforms (Palantir Foundry & Databricks).
- Integrate the data platforms with other systems, including incident & monitoring tools (e.g. ServiceNow), identity management, and data observability tools (e.g. Dynatrace).
- Optimize existing ETL processes for improved performance, scalability, and reliability.
- Develop and maintain unit and integration tests to ensure quality and resilience.
- Provide support to QA teammates during the acceptance process.
- Resolve production incidents as a third-line engineer, ensuring system stability and uptime.
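For illustration, here is a minimal PySpark sketch of the extract-transform-load pattern described above. The file path, column names, and table name are hypothetical placeholders, not references to any actual Intuitive or ADIA system, and the Delta write assumes a Databricks-style environment:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Illustrative only: the paths, columns, and table names below are hypothetical.
spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Extract: read raw CSV files from a (hypothetical) landing zone.
raw = spark.read.option("header", True).csv("/landing/trades/*.csv")

# Transform: normalise types and drop rows missing key fields.
clean = (
    raw.withColumn("trade_date", F.to_date("trade_date", "yyyy-MM-dd"))
       .withColumn("amount", F.col("amount").cast("double"))
       .dropna(subset=["trade_id", "trade_date", "amount"])
)
daily_totals = clean.groupBy("trade_date").agg(F.sum("amount").alias("total_amount"))

# Load: persist as a partitioned Delta table (requires Delta Lake, as on Databricks).
(daily_totals.write.mode("overwrite")
             .partitionBy("trade_date")
             .format("delta")
             .saveAsTable("analytics.daily_trade_totals"))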
Required Skills & Experience:
- Education: Bachelor’s degree in IT or a related field.
- Experience: Minimum 8 years in IT/Data-related roles.
Technical Expertise:
- Proficient in PySpark for distributed computing and Python for ETL development.
- Advanced SQL skills for writing and optimizing complex queries (e.g. Oracle, MS SQL).
- Familiarity with ETL tools, processes, and data warehousing platforms, particularly Databricks.
- Solid understanding of data modelling, including dimensional modelling, normalization, and schema design.
- Experienced with version control tools such as Git.
- Knowledge of monitoring tools (e.g., ServiceNow, Prometheus, Grafana) to track and optimize pipeline performance.
- Knowledge of scheduling tools (e.g. Stonebranch, Control-M, Airflow).
- Proficiency in data freshness & quality frameworks (e.g. Great Expectations, ideally Monte Carlo); a short sketch of this kind of check follows this list.
- Agile Methodologies: Comfortable with Agile practices, including sprint planning, stand-ups, and retrospectives.
- Collaboration Tools: Skilled in using Azure DevOps for team collaboration and project management.
- Problem-Solving: Strong debugging and troubleshooting abilities for complex data engineering issues.
- Communication: Exceptional written and verbal communication skills, with the ability to articulate technical concepts to non-technical stakeholders.
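To give a flavour of the data freshness & quality checks referenced above, here is a minimal hand-rolled PySpark sketch; frameworks such as Great Expectations or Monte Carlo formalise the same idea declaratively. The table and column names are hypothetical (they reuse the placeholder table from the ETL sketch above):

from datetime import date, timedelta

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Illustrative only: the table and columns are hypothetical.
spark = SparkSession.builder.appName("example-quality-checks").getOrCreate()
df = spark.table("analytics.daily_trade_totals")

# Completeness: key columns must not contain nulls.
null_count = df.filter(F.col("trade_date").isNull() | F.col("total_amount").isNull()).count()
assert null_count == 0, f"{null_count} rows failed the completeness check"

# Freshness: the latest partition should be at most one day old.
max_date = df.agg(F.max("trade_date")).collect()[0][0]
assert max_date is not None and max_date >= date.today() - timedelta(days=1), (
    f"data is stale: latest trade_date is {max_date}"
)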
Preferred Experience:
- Understanding of the investment data domain.