Oracle

Data Engineer

Bengaluru, KA, IN


Summary

About Oracle FSGIU - Finergy:

The Finergy division within Oracle FSGIU is dedicated to the Banking, Financial Services, and Insurance (BFSI) sector. We offer deep industry knowledge and expertise to address the complex financial needs of our clients. With proven methodologies that accelerate deployment and personalization tools that create loyal customers, Finergy has established itself as a leading provider of end-to-end banking solutions. Our single platform for a wide range of banking services enhances operational efficiency, and our expert consulting services ensure technology aligns with our clients' business goals.

Responsibilities:

  • Snowflake Data Modeling & Architecture: Design and implement scalable Snowflake data models, applying best practices such as the Data Vault methodology.
  • Real-Time Data Replication & Ingestion: Utilize Oracle GoldenGate for Big Data to manage real-time data streaming and optimize Snowpipe for automated data ingestion.
  • Cloud Integration & Management: Work with AWS services (S3, EC2, Lambda) to integrate and manage Snowflake-based solutions.
  • Data Sharing & Security: Implement Snowflake Secure Data Sharing and enforce security measures such as role-based access control (RBAC), data masking, and encryption (a sketch follows this list).
  • CI/CD Implementation: Develop and manage CI/CD pipelines for Snowflake deployment and data transformation processes.
  • Collaboration & Troubleshooting: Partner with cross-functional teams to address data-related challenges and optimize performance.
  • Documentation & Best Practices: Maintain detailed documentation for data architecture, ETL processes, and Snowflake configurations.
  • Performance Optimization: Continuously monitor and enhance the efficiency of Snowflake queries and data pipelines.
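
As a rough illustration of the Snowpipe ingestion and security items above, the sketch below uses the Snowflake Python connector to create an auto-ingest pipe, grant read access to a role, and attach a masking policy. All object names, the S3 stage, the roles, and the connection parameters are hypothetical placeholders, not details from this posting.

    # Minimal sketch, assuming a hypothetical RAW_DB database, sales_raw table,
    # sales_s3_stage external stage, and ANALYST_ROLE / PII_ADMIN roles.
    import snowflake.connector

    # Connection parameters are placeholders; in practice they come from a secrets store.
    conn = snowflake.connector.connect(
        account="<account_identifier>",
        user="<user>",
        password="<password>",
        warehouse="LOAD_WH",
        database="RAW_DB",
        schema="PUBLIC",
    )
    cur = conn.cursor()

    # Snowpipe: auto-ingest files landing in an external S3 stage into a raw table.
    cur.execute("""
        CREATE PIPE IF NOT EXISTS sales_raw_pipe
          AUTO_INGEST = TRUE
          AS COPY INTO sales_raw
             FROM @sales_s3_stage
             FILE_FORMAT = (TYPE = 'JSON')
    """)

    # Role-based access control: read-only access on the schema for an analyst role.
    cur.execute("GRANT USAGE ON SCHEMA RAW_DB.PUBLIC TO ROLE ANALYST_ROLE")
    cur.execute("GRANT SELECT ON ALL TABLES IN SCHEMA RAW_DB.PUBLIC TO ROLE ANALYST_ROLE")

    # Dynamic data masking: hide email addresses from all but a privileged role.
    cur.execute("""
        CREATE MASKING POLICY IF NOT EXISTS email_mask AS (val STRING) RETURNS STRING ->
          CASE WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val ELSE '***MASKED***' END
    """)
    cur.execute("ALTER TABLE sales_raw MODIFY COLUMN customer_email SET MASKING POLICY email_mask")

    cur.close()
    conn.close()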

Mandatory Skills:

  • 4 years of experience as a Data Engineer.
  • Strong expertise in Snowflake architecture, data modeling, and query optimization.
  • Proficiency in SQL for writing and optimizing complex queries (a short example follows this list).
  • Hands-on experience with Oracle GoldenGate for Big Data for real-time data replication.
  • Knowledge of Snowpipe for automated data ingestion.
  • Familiarity with AWS cloud services (S3, EC2, Lambda, IAM) and their integration with Snowflake.
  • Experience with CI/CD tools (e.g., Jenkins, GitLab) for automating workflows.
  • Working knowledge of the Data Vault methodology as applied in Snowflake.
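
As a small, non-authoritative example of the query-tuning work implied by the Snowflake and SQL bullets above, the sketch below clusters a hypothetical orders fact table on its most common filter column and inspects the plan of a typical query with EXPLAIN. Table, column, and warehouse names are assumptions for illustration only.

    # Minimal sketch, assuming a hypothetical orders table with order_date,
    # customer_id, and amount columns.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="<account_identifier>",
        user="<user>",
        password="<password>",
        warehouse="ANALYTICS_WH",
        database="ANALYTICS_DB",
        schema="PUBLIC",
    )
    cur = conn.cursor()

    # Cluster the fact table on the column most queries filter by, so Snowflake
    # can prune micro-partitions instead of scanning the whole table.
    cur.execute("ALTER TABLE orders CLUSTER BY (order_date)")

    # Inspect the plan for a typical date-range aggregation; a well-clustered
    # table should scan far fewer partitions than it contains.
    cur.execute("""
        EXPLAIN
        SELECT customer_id, SUM(amount) AS total_amount
        FROM orders
        WHERE order_date >= '2024-01-01'
        GROUP BY customer_id
    """)
    for row in cur.fetchall():
        print(row)

    cur.close()
    conn.close()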

Good to Have Skills:

  • Exposure to Databricks for data processing and analytics.
  • Knowledge of Python or Scala for data engineering tasks (a small Python example follows this list).
  • Familiarity with Terraform or CloudFormation for infrastructure as code (IaC).
  • Experience in data governance and compliance best practices.
  • Understanding of ML and AI integration with data pipelines.
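
As a brief illustration of the Python-based data engineering and AWS items above, the sketch below cleans a raw CSV extract with pandas and lands it as Parquet in an S3 prefix that a Snowpipe could watch. The bucket, prefix, file names, and columns are hypothetical, and the snippet assumes boto3 credentials and a Parquet engine (pyarrow) are available.

    # Minimal sketch, assuming hypothetical file names, columns, and S3 locations.
    import boto3
    import pandas as pd

    # Light transformation step: read a raw extract, normalize a column,
    # and stamp the batch with a load date.
    df = pd.read_csv("daily_transactions.csv")
    df["currency"] = df["currency"].str.upper()
    df["load_date"] = pd.Timestamp.now(tz="UTC").date().isoformat()

    # Write the cleaned batch as Parquet and land it in the S3 prefix that the
    # auto-ingest pipe is configured to watch.
    df.to_parquet("daily_transactions.parquet", index=False)
    s3 = boto3.client("s3")
    s3.upload_file(
        "daily_transactions.parquet",                   # local file
        "example-data-lake-bucket",                     # hypothetical bucket
        "snowpipe/landing/daily_transactions.parquet",  # hypothetical key
    )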
