Job Description – Enterprise Data Architect, Big Data
Role & Responsibilities:
Support technical pre-sales efforts and attend customer meetings as needed, acting as a trusted advisor to clients.
Guide clients in defining and executing the roadmap and vision for the enterprise data warehouse, big data, BI and analytics, and data management.
Recommend and drive standards for data quality across multiple datasets and best practices for data governance, building data trust across the client enterprise.
Lead cloud strategy development and modernization, defining business drivers, the cloud adoption roadmap, and business value.
Provide program-level support on client development projects in big data and data warehouse/data lake.
Participate in and oversee the design of cloud solutions, from conceptual and logical through physical design, to meet clients' business and technical requirements, leveraging architecture patterns and following established development processes.
Partner with client business and technology stakeholders to drive the future-state architecture for enterprise data, reporting, and analytics platforms and solutions.
Define architecture blueprints and advise clients on technology strategy, migration/modernization, and cloud adoption.
Lead data migration and modernization projects moving databases and data warehouses (Oracle or Teradata) from on-premises to AWS or GCP.
Qualifications:
· 15+ years of IT experience in big data, data warehousing, and data architecture
· Experience with RDBMS and data warehouse/data lake platforms such as Oracle, Teradata, and Snowflake, and with BI and ETL tools such as DataStage and Tableau
· Experience implementing, managing, and tuning data warehouses with complex ETL data pipelines
· Experience in data modeling, schema design, query tuning and optimization, and data migration and integration
· Prior project experience in data normalization, metadata management, and source-to-target data mapping documentation using industry-standard tools
· Prior experience executing data quality and data governance projects
· Detailed understanding of cloud infrastructure for building and maintaining large enterprise data platforms on clouds such as GCP (BigQuery, Bigtable, Dataflow, Dataproc, Dataprep, Pub/Sub, machine learning services)
· Minimum of 3 years of development and implementation of enterprise-level data solutions utilizing large data sets
· Good knowledge of data governance and compliance regulations such as GDPR, CCPA, and PCI DSS
Job Type: Full-time
Pay: $148,775.63 - $165,084.64 per year
Benefits:
401(k)
Dental insurance
Health insurance
Schedule:
8-hour shift
Monday to Friday
Experience:
Azure: 1 year (Preferred)
AWS: 1 year (Preferred)
APIs: 1 year (Preferred)
Ability to Commute:
Dallas, TX 75201 (Required)
Ability to Relocate:
Dallas, TX 75201: Relocate before starting work (Required)
Work Location: In person