Minfy

Principal Data Architect - Data Engineering

Hyderabad, TS, IN


Summary

Role & Responsibilities

We are seeking a highly skilled and experienced Architect to lead the design, implementation, and optimization of our Intelligent Data Platform. The ideal candidate will have a strong background in cloud-based data processing systems, data warehousing, and big data technologies, and will work closely with our data engineering team to ensure the platform is optimized for performance, scalability, and reliability.

  • Collaborate with stakeholders to understand business objectives and translate them into data architecture requirements.
  • Design and implement data models, data attribute maps, database schemas, and data integration strategies that comply with regulatory requirements and industry best practices.
  • Develop and maintain data governance policies and procedures to ensure the confidentiality, integrity, and availability of sensitive data.
  • Implement data security measures and access controls to protect against unauthorized access and mitigate potential risks.
  • Architect and optimize data storage and retrieval processes to meet the performance and scalability demands of banking applications.
  • Leverage AWS services such as Amazon Redshift, Amazon RDS, Amazon Aurora, and AWS Glue to build scalable and cost-effective data solutions.
  • Architect and design solutions to meet functional and non-functional requirements.
  • Lead the design, implementation, and optimization of the Intelligent Data Platform.
  • Develop and maintain a comprehensive understanding of the data pipelines and data architecture.
  • Develop and maintain documentation for our Intelligent Data Platform, including architecture diagrams, deployment guides, and operational procedures.
  • Provide guidance and support to our data engineering team.
  • Create and review architecture and solution design artifacts.
  • Evangelize re-use through the implementation of shared assets.
  • Enforce adherence to architectural standards/principles, global product-specific guidelines, usability design standards, etc.
  • Proactively guide engineering methodologies, standards, and leading practices.
  • Identify, communicate, and mitigate Risks, Assumptions, Issues, and Decisions throughout the full lifecycle.
  • Consider the art of the possible, compare architectural options based on feasibility and impact, and propose actionable plans.
  • Demonstrate strong analytical and technical problem-solving skills.
  • Ability to analyze and operate at various levels of abstraction.
  • Ability to balance what is strategically right with what is practically realistic.
  • Grow the Data Engineering business by helping customers identify opportunities to deliver improved business outcomes, and by designing and driving the implementation of those solutions.
  • Lead the team in defining best practices and repeatable methodologies in Cloud Data Engineering, including data storage, ETL, data integration and migration, data warehousing, and data governance.
  • Demonstrate technical experience in AWS Cloud Data Engineering services and solutions.
  • Contribute to sales and pre-sales activities, including proposals, pursuits, demonstrations, and proof-of-concept initiatives.
  • Evangelize the Data Engineering service offerings to both internal and external stakeholders.
  • Develop whitepapers, blogs, webinars, and other thought leadership material.
  • Develop Go-to-Market and Service Offering definitions for Data Engineering.
  • Expand the business within existing accounts and help clients by building and sustaining strategic executive relationships, acting as their trusted business technology advisor.
  • Position differentiated and custom solutions to clients based on market trends, clients' specific needs, and the supporting business cases.
  • Build new Data capabilities, solutions, assets, accelerators, and team competencies.

Mandatory Skills Description

  • Provide technical leadership and mentorship to junior members.
  • Proven experience as a Data Architect with a deep understanding of enterprise systems and processes.
  • Strong proficiency in SQL and database technologies, with experience in designing and optimizing data models for core applications.
  • Hands-on experience with AWS cloud services, particularly those relevant to data architecture such as Amazon Redshift, Amazon RDS, AWS Glue, and AWS Lambda.
  • Technology-agnostic approach to a multitude of data source systems and API gateways.

Nice-to-Have Skills

  • Familiarity with compliance requirements such as GDPR, PII protection frameworks, PCI DSS.
  • Excellent communication and interpersonal skills, with the ability to effectively collaborate with diverse teams and stakeholders.
  • AWS certifications (e.g., AWS Certified Solutions Architect, AWS Certified Data Analytics Specialty) are highly desirable.

Minimum Qualifications

Excellent technical architecture skills, enabling the creation of future-proof, complex global platform solutions.

  • Excellent interpersonal communication and organizational skills are required to operate as a leading member of global, distributed teams that deliver quality services and solutions.
  • Ability to rapidly gain knowledge of the organizational structure of the firm to facilitate work with groups outside of the immediate technical team.
  • Knowledge of and experience with the IT methodologies and life cycles that will be used.
  • Familiarity with solution implementation/management, service/operations management, etc.
  • Maintains close awareness of new and emerging technologies and their potential application for service offerings and products.
  • Bachelor’s degree (CS, CE, CIS, IS, MIS, or an engineering discipline) or equivalent work experience.
  • Experience in architecting and designing technical solutions for cloud-centric solutions based on industry standards using IaaS, PaaS, and SaaS capabilities.
  • Must have strong hands-on experience with cloud services such as Lambda and S3, as well as security, monitoring, and governance and compliance services.
  • Must have good knowledge of data engineering concepts and the related cloud services.
  • Must have good experience in Python and Spark.
  • Must have good experience in setting up development best practices.
  • Experience with claims-based authentication (SAML/OAuth/OIDC), MFA, RBAC, SSO, etc.
  • Knowledge of cloud security controls including tenant isolation, encryption at rest, encryption in transit, key management, vulnerability assessments, application firewalls, SIEM, etc.
  • Experience building and supporting mission-critical technology components with DR capabilities.
  • Experience with multi-tier system and service design and development for large enterprises
  • Extensive, real-world experience designing technology components for enterprise solutions and defining solution architectures and reference architectures with a focus on cloud technologies.
  • Exposure to infrastructure and application security technologies and approaches
  • Familiarity with requirements gathering techniques.

Preferred Qualifications

  • Must have experience in DevSecOps, working closely on Data Engineering-based projects.
  • Strong expertise in data platform components, including:
      • Delta Lake
      • db API 2.0
      • SQL Endpoint – Photon engine
      • Delta Sharing
      • Unity Catalog
      • Security management
      • Platform governance, auditing, and compliance
      • Data security
  • Proficiency in AWS services including but not limited to S3, EC2, IAM, VPC, EKS, Lambda, Glue, PrivateLink, KMS, CloudWatch, EMR, etc.
  • Must know how to enable geo-redundancy and DR capabilities on Databricks.
  • Proficient in designing and implementing:
      • Everything as code
      • Infrastructure as code
      • Configuration as code
      • Security configuration as code
  • Must have strong expertise in designing platforms with robust observability and monitoring standards.
  • Proficient in developing and setting best practices for various DevSecOps activities, including CI/CD.
  • Good to have REST API knowledge.
  • Good to have an understanding of cost distribution.
  • Good to have experience with migration projects to build a unified data platform.
  • Good to have knowledge of dbt.
  • Knowledge of full software development lifecycle methodologies, patterns, frameworks, libraries, and tools.
  • Knowledge of programming and scripting languages such as JavaScript, Bash, SQL, Python, etc.
  • Experience in distilling complex technical challenges to actionable decisions for stakeholders and guiding project teams by building consensus and mediating compromises when necessary.
  • Experience coordinating the intersection of complex system dependencies and interactions
  • Experience in solution delivery using common methodologies especially SAFe Agile but also Waterfall, Iterative, etc.
