Capgemini

Cloud Data Engineer (Snowflake)

Lower Silesian Voivodeship, PL


Summary

Location: Krakow / Gdansk / Poznan / Wroclaw / Warsaw



Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you’d like, where you’ll be supported and inspired by a collaborative community of colleagues around the world, and where you’ll be able to reimagine what’s possible. Join us and help the world’s leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

YOUR TEAM

Insights & Data delivers state-of-the-art Data solutions. Our expertise primarily lies in Cloud & Big Data engineering, where we develop robust systems capable of processing extensive and complex datasets, utilizing specialized Cloud Data services across platforms like AWS, Azure, and GCP. We oversee the entire Software Development Life Cycle (SDLC) of these solutions, which involves not only leveraging ETL and other data processing tools but also extensive programming in languages like Python, Scala, or Java, coupled with the adoption of DevOps tools and best practices. The processed data is then made accessible to downstream systems through APIs and outbound interfaces, or is visualized via comprehensive reports and dashboards. Additionally, within our AI Center of Excellence, we undertake Data Science and Machine Learning projects with a focus on cutting-edge areas such as Generative AI, Natural Language Processing (NLP), Anomaly Detection, and Computer Vision.

YOUR TASKS

  • design, develop, and maintain Snowflake data pipelines to support various business functions;
  • collaborate with cross-functional teams to understand data requirements and implement scalable solutions;
  • optimize data models and schemas for performance and efficiency;
  • ensure data integrity, quality, and security throughout the data lifecycle;
  • implement monitoring and alerting systems to proactively identify and address issues;
  • plan and execute migrations from on-premises data warehouses to Snowflake;
  • develop AI, ML, and Generative AI solutions;
  • stay up to date on Snowflake best practices and emerging technologies to drive continuous improvement.

YOUR PROFILE

  • at least 3 years of experience in Big Data or Cloud projects involving the processing and visualization of large and/or unstructured datasets, including at least 1 year of hands-on Snowflake experience;
  • understanding of Snowflake's pricing model and of cost optimization strategies for managing resources efficiently;
  • experience in designing and implementing data transformation pipelines natively in Snowflake or with Service Partners;
  • familiarity with Snowflake’s security model;
  • practical knowledge of at least one Public Cloud platform across the Storage, Compute (including Serverless), Networking, and DevOps areas, backed by commercial project experience;
  • at least basic knowledge of SQL and of one programming language: Python, Scala, Java, or bash;
  • very good command of English.


