Requirement - Data Modeler - IBM DB2, Snowflake, Azure Fabric
Location - Bangalore
Experience - 6 – 9 years
Project Timing - 2PM – 11PM (Candidate should be flexible)
Work Mode - Hybrid
Role - B3/Lead Consultant
JD
We are seeking a highly skilled Senior Data Modeler with expertise
in Azure Fabric and experience working with IBM DB2 and
Snowflake as primary data sources. The ideal candidate will be
responsible for designing, developing, and optimizing data models
that support enterprise analytics, data warehousing, and reporting
needs. This role requires a deep understanding of data modeling
techniques, cloud-based data platforms, and data governance to
ensure efficient and scalable data solutions.
Key Responsibilities:
1. Data Modeling & Architecture:
Design and implement conceptual, logical, and physical data
models to support enterprise data solutions.
Develop highly scalable and efficient data models optimized
for Azure Fabric.
Ensure data integrity, consistency, and normalization across
different data sources, including IBM DB2 and Snowflake.
Define and enforce data modeling best practices for
transactional, operational, and analytical systems.
2. Data Integration & Transformation:
Collaborate with data engineers, architects, and BI
developers to integrate data from multiple sources.
Optimize ETL/ELT pipelines to efficiently transform data and
support analytical workloads.
Design and implement dimensional and relational data
models suitable for reporting and analytics.
3. Azure Fabric Implementation & Optimization:
Leverage Azure Fabric to design and manage data
lakes, data warehouses, and real-time data streaming
architectures.
Ensure efficient data partitioning, indexing, and performance
tuning for cloud-based data solutions.
Work with Azure Synapse, Azure Data Factory, and other
Microsoft cloud services to enable seamless data ingestion,
storage, and analytics.
4. Data Governance & Security:
Define and implement data governance policies, ensuring
compliance with industry standards.
Establish data lineage, metadata management, and data
cataloging best practices.
Work closely with security teams to ensure proper
encryption, access control, and compliance with regulatory
requirements.
5. Performance Tuning & Optimization:
Apply performance tuning, indexing, and partitioning
strategies to optimize query execution.
Optimize data models and query performance in Snowflake
and IBM DB2 environments.
Identify bottlenecks and recommend solutions to
enhance query performance and storage efficiency.
6. Collaboration & Stakeholder Management:
Work closely with business analysts, product owners, and
data scientists to translate business requirements into scalable
data models.
Provide technical leadership, mentorship, and training to junior
data modelers and data engineers.
Document data models, data flow diagrams, and technical
specifications for cross-functional teams.