Dicetek LLC

Data Architect

Bengaluru, KA, IN

Summary

Job Description - Data Architect

We are seeking a Data Architect to design data solutions that meet business requirements. The role focuses on designing data models, pipelines, and end-to-end data architecture, and on ensuring that data flows smoothly and is implemented effectively in line with architectural standards.

Responsibilities

  • Strong experience in designing data solutions using data platforms such as Cloudera or any data lake, Confluent/Apache Kafka, and Azure data services (ADF, Spark/Databricks, Cosmos DB), including data modelling with NoSQL databases such as MongoDB or any document-oriented NoSQL database.
  • Experience in designing data models and architectures optimized for high performance, scalability, and fast response times, particularly in NoSQL databases (Cosmos DB, MongoDB, or any document DB).
  • Implement data solutions that ensure data accuracy, integrity, consistency, scalability, and reliability.
  • Collaborate with stakeholders to understand data requirements and design effective data architecture.
  • Recommend the best tools and practices to build batch and streaming data pipelines.
  • Manage capacity planning and cost optimization for Azure services to ensure efficient and cost-effective use of cloud resources.
  • Define data governance policies to ensure data quality and security across data layers.
  • Provide direction, review, and mentorship to Data Engineers, assisting with pipeline design, optimization, and troubleshooting as needed to ensure alignment with data architecture standards.
  • Strong communication skills to collaborate across teams and translate business needs into data solutions.

Qualifications

  • 15+ years of IT experience with at least 7+ years in data architecture and 5+ years on Azure cloud platform.
  • Expertise in data platforms (Cloudera/any Data Lake, Confluent/Apache Kafka, Azure Data Services like ADF, Databricks, Cosmos DB).
  • Strong knowledge of NoSQL databases (Cosmos DB, MongoDB) and data modeling for performance and scalability.
  • Proven experience in designing and implementing batch and streaming data pipelines using tools like Databricks, Spark, or Kafka.
  • Skilled in creating scalable, reliable, and high-performance data architectures with robust data governance policies for quality and security.
  • Strong collaboration skills to work with stakeholders, mentor Data Engineers, and translate business needs into solutions.
  • Bachelor’s or Master’s degree in Computer Science, IT, or a related field.
