Kforce

Data Architect

Charlotte, NC, US

Remote
Contract
2 days ago

Summary

Kforce has a client that is seeking a Data Architect in Charlotte, NC. This is a contract role with the potential to hire.

Overview: We are seeking a seasoned Technical Architect to spearhead our data modernization initiatives within the AWS ecosystem. This pivotal role demands a multifaceted professional adept in AWS architecture, data warehousing, and modern data management practices. The ideal candidate will possess a deep understanding of AWS best practices and data modeling, and have hands-on experience with tools like Apache Iceberg, S3, Python, and PySpark. This position offers the potential for full-time employment based on performance and organizational fit.

Key Responsibilities:
* Design and implement scalable, secure, and efficient data architectures on AWS, adhering to industry best practices
* Lead the modernization of data warehousing solutions, with a focus on integrating Apache Iceberg as the primary data lakehouse format
* Develop and optimize data pipelines using Python and PySpark to facilitate seamless data ingestion, transformation, and loading processes
* Architect and manage Master Data Management (MDM) systems, ensuring data consistency and integrity across OLTP systems
* Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions
* Stay abreast of emerging technologies and evaluate their applicability to our data architecture
* Provide technical leadership and mentorship to junior team members

Requirements:
* Bachelor's or Master's degree in Computer Science, Information Systems, or a related field
* Proven experience as a Technical Architect or in a similar senior technical role
* Extensive experience with AWS services, including but not limited to S3, Glue, EMR, and Redshift
* Hands-on experience with Apache Iceberg, or a strong willingness and ability to learn and implement it effectively
* Proficiency in Python and PySpark for data processing tasks
* Strong understanding of data warehousing concepts and experience with platforms like Redshift, Hive, or Snowflake
* Experience in designing and managing MDM systems and OLTP databases
* Excellent problem-solving skills and the ability to work independently
* Strong communication skills, both verbal and written

Preferred Qualifications:
* AWS Certified Solutions Architect or equivalent certification
* Experience with infrastructure-as-code tools like Terraform or CloudFormation
* Familiarity with CI/CD pipelines and DevOps practices
* Knowledge of data governance and security best practices
