CriticalRiver Inc.

Data Architect

Hyderabad, TS, IN

29 days ago

Summary

Role: Data Architect

Experience: 10 years

Work Location: Hyderabad


Key Responsibilities:

  • Design and implement scalable, high-performance data architectures using Snowflake.
  • Develop ETL/ELT pipelines using dbt (data build tool), Fivetran, and Airflow to ingest, transform, and process large volumes of data.
  • Strong experience in data modeling using dbt.
  • Optimize data pipelines for efficiency, reliability, and scalability.
  • Ensure data integrity, governance, and security across the data ecosystem.
  • Strong understanding of dimensional data modeling and data warehousing concepts.
  • Strong experience in SQL and Python.
  • Collaborate with data engineers, analysts, and business stakeholders to define data requirements and architecture.
  • Work with cloud platforms (AWS, GCP or Azure) to manage and scale data infrastructure.
  • Implement reverse ETL solutions using Hightouch for data activation and operational analytics.
  • Monitor and troubleshoot data pipelines and workflows to ensure smooth operations.
  • Drive best practices in data modeling, performance tuning, and cost optimization.
  • Stay updated with emerging technologies and trends in cloud-based data engineering.
  • Understanding of ERP (NetSuite) data is an added advantage.
  • Strong communication skills.
