IMPACT Technology Recruiting

Enterprise Architect (Kafka)

Salt Lake County, UT, US


Summary

Our client, with offices in Phoenix, AZ and Murray, UT, is currently hiring an Enterprise Architect (Kafka) on a contract-to-permanent basis.

Note:

  • Must be willing to relocate within 2-3 weeks of offer (Local candidates preferred)
  • Role is hybrid: 2 days onsite and 3 days remote
  • US Citizen, GC holder, TN, H4 or GC EAD (No Sponsorship available)
  • No 3rd parties/Corp to Corp


Job Description

We are looking for an experienced Enterprise Architect with deep expertise in event-driven architecture (EDA) and Apache Kafka to join our team. In this role, you will be responsible for designing, implementing, and optimizing our enterprise event-driven systems to enable real-time data processing and integration across the organization. You will lead the creation of scalable, fault-tolerant, and high-throughput data pipelines and advise on best practices for implementing Kafka and related technologies.


Required Skills & Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field, or equivalent career experience.
  • 8+ years of experience in enterprise architecture, with a focus on distributed systems and event-driven architectures.
  • Proven expertise in designing, implementing, and scaling Kafka-based solutions for real-time data processing.
  • Advanced knowledge of Apache Kafka, Kafka Streams, Kafka Connect, and related ecosystem components.
  • Proficiency in designing data streaming architectures, message-driven microservices, and asynchronous communication patterns.
  • Strong experience with DevOps practices, including Kafka cluster deployment, monitoring, and automation tools.
  • Familiarity with cloud platforms (e.g., AWS, GCP, Azure) and container orchestration (e.g., Kubernetes) for Kafka management.
  • Exceptional analytical skills, strong communication abilities, and experience working with diverse, cross-functional teams.

Preferred Qualifications:

  • Certifications in enterprise architecture (e.g., TOGAF) or Kafka (e.g., Confluent Certified Developer for Apache Kafka).
  • Experience with complementary streaming tools, such as Spark, Flink, or Pulsar.
  • Understanding of data governance frameworks and practices for secure data streaming.
