Tarento Group

Senior Data Architect

San Francisco, CA, US

8 days ago

About The Role

We are looking for a skilled and motivated Snowflake Data Engineer to join our data engineering team. The ideal candidate will bring hands-on experience with Snowflake, Apache Iceberg, and dbt, and will play a key role in building and optimizing scalable, high-performance data pipelines. Exposure to modern ELT tools such as Fivetran is a strong plus.

This role is ideal for professionals who thrive in a fast-paced environment, are passionate about data, and are eager to contribute to the development of a robust data infrastructure.

Key Responsibilities

  • Design, develop, and maintain scalable and efficient data pipelines using Snowflake and dbt
  • Build and manage data models and transformations that support business intelligence and analytics needs
  • Collaborate with data analysts, data scientists, and stakeholders to gather requirements and deliver clean, well-modeled datasets
  • Optimize performance and ensure reliability and scalability of the data infrastructure
  • Implement best practices in data quality, testing, and monitoring
  • Participate in code reviews and contribute to continuous improvement of the data engineering process
  • Document data workflows, transformation logic, and system configurations

Must-Have Skills & Qualifications

  • 10–12 years of hands-on experience in data engineering or a similar role
  • Proficient in Snowflake architecture, development, and performance tuning
  • Strong experience with dbt (data build tool) for managing transformations and version-controlled data models
  • Strong SQL skills with the ability to write optimized, complex queries for large datasets
  • Experience working with structured and semi-structured data (e.g., JSON, Parquet)
  • Understanding of data warehousing concepts, dimensional modeling, and ELT workflows
  • Familiarity with cloud platforms (AWS, Azure, or GCP)

Good-to-Have Skills

  • Experience with Fivetran or similar ELT tools for automated data ingestion
  • Knowledge of orchestration tools like Airflow or dbt Cloud
  • Familiarity with Git-based workflows and CI/CD practices in data engineering
  • Exposure to data governance, security, and privacy frameworks
  • Scripting skills (e.g., Python or Bash) for automation and tooling
