TekStream Solutions

Kafka Cloud Architect

Woodlawn, MD, US


Summary

This position requires onsite work in Woodlawn, MD, five days a week.


Day to Day Responsibilities:


  • Lead and organize a team of Kafka administrators and developers, assign tasks, and facilitate weekly Kafka Technical Review meetings with the team.
  • Work alongside the customer to determine expanded use of Kafka within the Agency.
  • Strategize within Leidos to set up opportunities to explore new technologies to use with Kafka.
  • Architect, design, code, and implement next-generation data streaming and event-based architecture / platform on Confluent Kafka.
  • Define the strategy for streaming data to the data warehouse and for integrating the event-based architecture with microservice-based applications.
  • Establish Kafka best practices and standards for implementing the Kafka platform based on identified use cases and required integration patterns.
  • Mentor existing team members by imparting expert knowledge to build a high-performing team around our event-driven architecture. Assist developers in choosing correct patterns, modeling events, and ensuring data integrity.
  • Provide software expertise in one or more of these areas: application integration, enterprise services, service-oriented architectures (SOA), security, business process management/business rules processing, data ingestion/data modeling.
  • Triage, investigate, and advise in a hands-on capacity to resolve platform issues, regardless of component.
  • Brief management, customers, the team, or vendors, in writing or orally, at the appropriate technical level for the audience. Share up-to-date insights on the latest Kafka-based solutions, formulate creative approaches to address business challenges, present and host workshops with senior leaders, and translate technical jargon into plain language and vice versa.
  • All other duties as assigned or directed.



Foundation for Success (Required Qualifications):


This experience is the foundation a candidate needs to be successful in this position:


  • Bachelor's degree in Computer Science, Mathematics, Engineering, or a related field with 12 years of relevant experience, OR a Master's degree with 10 years of relevant experience. Additional years of experience may be accepted in lieu of a degree.
  • 12+ years of experience with modern software development including systems/application analysis and design.
  • 7+ years of combined experience with Kafka (One or more of the following: Confluent Kafka, Apache Kafka, and/or Amazon MSK).
  • 2+ years of combined experience with designing, architecting, and deploying to AWS cloud platform.
  • 1+ years of leading a technical team.
  • Must be able to obtain and maintain a Public Trust security clearance.



Factors to Help You Shine (Required Qualifications):


These skills will help you succeed in this position:


  • Expert, hands-on production experience with Confluent Kafka, including capacity planning, installation, administration/platform management, and a deep understanding of Kafka architecture and internals.
  • Expert experience with Kafka clusters, security, disaster recovery, data pipelines, data replication, and/or performance optimization.
  • Kafka installation and partitioning on OpenShift or Kubernetes, topic management, and HA/SLA architecture.
  • Strong knowledge and application of microservice design principles and best practices: distributed systems, bounded contexts, service-to-service integration patterns, resiliency, security, networking, and/or load balancing in large mission critical infrastructure.
  • Expert experience with Kafka Connect, Kafka Streams, and ksqlDB, and the judgment to apply each effectively to different use cases.
  • Hands-on experience with scaling Kafka infrastructure including Broker, Connect, ZooKeeper, Schema Registry, and/or Control Center.
  • Hands-on experience in designing, writing, and operationalizing new Kafka Connectors.
  • Solid experience with data serialization using Avro and JSON, and with data compression techniques.
  • Experience with AWS services such as ECS, EKS, Flink, Amazon RDS for PostgreSQL, and/or S3.
  • Basic knowledge of relational databases (PostgreSQL, DB2, or Oracle), SQL, and ORM technologies (JPA2, Hibernate, and/or Spring JPA).


How to Stand Out from the Crowd (Desired Qualifications):



Showcase your knowledge of modern development using data streaming and event-based architecture through the following experience or skills:


  • AWS cloud certifications.
  • Delivery (CI/CD) best practices and use of DevOps to accelerate quality releases to Production.
  • PaaS using Red Hat OpenShift/Kubernetes and Docker containers.
  • Experience with configuration management tools (Ansible, CloudFormation / Terraform).
  • Solid experience with Spring Framework (Boot, Batch, Cloud, Security, and Data).
  • Solid knowledge with Java EE, Java generics, and concurrent programming.
  • Solid experience with automated unit testing, TDD, BDD, and associated technologies (Junit, Mockito, Cucumber, Selenium, and Karma/Jasmine).
  • Working knowledge of the open-source visualization platform Grafana and the open-source monitoring system Prometheus, and their use with Kafka.
