We’re looking for an AWS ETL Cloud Data Engineer to support our team in Singapore.
This is an exciting opportunity to use and further expand your AWS cloud skill set, joining the Income Data Team to build a hybrid data lake as part of a key IFRS project.
This role offers the opportunity to work on meaningful projects, collaborate with talented colleagues, and contribute to the success of a growing company. If you’re someone who takes initiative, values continuous learning, and thrives in a collaborative setting, we’d love to hear from you.
Key Responsibilities:
• Design, build, and operationalize large-scale enterprise data solutions and applications using one or more AWS data and analytics services in combination with third-party tools: Spark/Python on Glue, Redshift, S3, Athena, RDS for PostgreSQL, Airflow, Lambda, DMS, CodeCommit, CodePipeline, CodeBuild, etc.
• Design and build production ETL data pipelines from ingestion to consumption within a big data architecture, using DMS, DataSync, and Glue (see the PySpark sketch after this list).
• Understand the existing applications (including the on-premises Cloudera data lake) and infrastructure architecture.
• Analyze, re-architect, and re-platform on-premises data warehouses into data platforms on the AWS cloud using AWS-native or third-party services.
• Design and implement data engineering, ingestion, and curation functions on the AWS cloud using AWS-native services or custom programming.
• Perform detailed assessments of current-state data platforms and create an appropriate transition path to the AWS cloud.
• Collaborate with development, infrastructure, and data center teams to define continuous integration and continuous delivery (CI/CD) processes in accordance with industry standards.
• Work on the hybrid data lake.
• Work closely with multiple stakeholders to ensure high standards are maintained.
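To give a concrete flavor of the day-to-day work, below is a minimal sketch of the kind of Glue PySpark job described above: raw data landed in S3 (e.g. by DMS) is curated and published as partitioned Parquet for Athena. The bucket paths, the "policies" dataset, and the column names are illustrative assumptions only, not a description of our actual pipelines.

    # Minimal Glue PySpark sketch; all S3 paths, the "policies" dataset,
    # and column names below are hypothetical placeholders.
    import sys

    from awsglue.context import GlueContext
    from awsglue.job import Job
    from awsglue.utils import getResolvedOptions
    from pyspark.context import SparkContext
    from pyspark.sql import functions as F

    args = getResolvedOptions(sys.argv, ["JOB_NAME"])
    glue_context = GlueContext(SparkContext())
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Ingest: read raw CSV from the S3 landing zone into a DynamicFrame.
    raw = glue_context.create_dynamic_frame.from_options(
        connection_type="s3",
        connection_options={"paths": ["s3://example-raw-bucket/policies/"]},  # hypothetical path
        format="csv",
        format_options={"withHeader": True},
    )

    # Curate: cast types, drop rows missing a key, stamp a load date for partitioning.
    df = (
        raw.toDF()
        .withColumn("premium", F.col("premium").cast("decimal(18,2)"))
        .filter(F.col("policy_id").isNotNull())
        .withColumn("load_date", F.current_date())
    )

    # Publish: write Parquet to the curated zone, partitioned for Athena queries.
    (
        df.write.mode("append")
        .partitionBy("load_date")
        .parquet("s3://example-curated-bucket/policies/")  # hypothetical path
    )

    job.commit()

In practice a job like this would be deployed through the CI/CD tooling above and scheduled by Airflow, which is where the orchestration experience listed below comes in.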
Mandatory Skill Set:
• Bachelor's degree in Computer Science, Information Technology, or other relevant fields.
• 5+ years of work experience with ETL, data modelling, and data architecture to build data lakes; proficient in ETL optimization and in designing, coding, and tuning big data processes using PySpark.
• 3+ years of extensive experience working on the AWS platform using core services such as Athena, Glue (PySpark), Redshift, RDS for PostgreSQL, S3, and Airflow for orchestration (see the DAG sketch after this list).
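For orientation, here is a minimal sketch of the Airflow orchestration referenced above, chaining a Glue curation job into an Athena validation query. The DAG id, Glue job name, Athena database, query, and S3 output location are hypothetical, and the sketch assumes Airflow 2.4+ with the Amazon provider package installed.

    # Minimal Airflow DAG sketch; dag_id, job_name, database, and the
    # S3 output location are hypothetical placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.amazon.aws.operators.athena import AthenaOperator
    from airflow.providers.amazon.aws.operators.glue import GlueJobOperator

    with DAG(
        dag_id="hybrid_datalake_daily",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",  # "schedule" requires Airflow 2.4+
        catchup=False,
    ) as dag:
        # Run the Glue PySpark curation job (name is hypothetical).
        curate = GlueJobOperator(
            task_id="curate_policies",
            job_name="curate_policies",
        )

        # Validate the curated zone with a simple Athena row-count query.
        validate = AthenaOperator(
            task_id="validate_row_count",
            query="SELECT count(*) FROM policies WHERE load_date = current_date",
            database="curated",  # hypothetical Athena database
            output_location="s3://example-athena-results/",  # hypothetical bucket
        )

        curate >> validate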
Good-to-Have Skill Set:
• Fundamentals of the insurance domain
• Functional knowledge of IFRS 17
About CLPS RiDiK
RiDiK is a global technology solutions provider and a subsidiary of CLPS Incorporation (NASDAQ: CLPS), delivering cutting-edge end-to-end services across banking, wealth management, and e-commerce. With deep expertise in AI, cloud, big data, and blockchain, we support clients across Asia, North America, and the Middle East in driving digital transformation and achieving sustainable growth. Operating from regional hubs in 10 countries and backed by a global delivery network, we combine local insight with technical excellence to deliver real, measurable impact. Join RiDiK and be part of an innovative, fast-growing team shaping the future of technology across industries.