LHH

Data Engineer

United Kingdom


Job Description

Contract Data Engineers (x3) – Defence Sector – Bristol


Outside IR35 | DV Clearance Required


Location: Bristol (Onsite/Hybrid) | Contract Duration: 6-12 months+


We are recruiting three experienced Data Engineers for a leading defence client in Bristol.


This is a contract opportunity operating outside IR35, offering the chance to work in a high-security, mission-critical environment. You must already hold UK DV Clearance.


Key Responsibilities:


  • Design, develop, and manage complex, large-scale data pipelines in secure or regulated environments
  • Ingest, process, index, and visualise data using the Elastic Stack (Elasticsearch, Logstash, Kibana)
  • Build and maintain robust data flows with Apache NiFi
  • Implement best practices for handling sensitive data, including encryption, anonymisation, and access control
  • Monitor and troubleshoot real-time data pipelines to ensure high performance and reliability
  • Write efficient, secure scripts and code using Python, Bash, or similar languages
  • Collaborate with cross-functional teams to meet technical and operational requirements


Essential Skills and Experience:


  • 3+ years’ experience as a Data Engineer in secure, regulated, or mission-critical environments
  • Proven expertise with the Elastic Stack (Elasticsearch, Logstash, Kibana)
  • Solid experience with Apache NiFi
  • Strong understanding of data security, governance, and compliance requirements
  • Experience building real-time, large-scale data pipelines
  • Working knowledge of cloud platforms (AWS, Azure, or GCP), particularly in secure deployments
  • Experience using Infrastructure as Code tools
  • Excellent communication and stakeholder management skills
  • Detail-oriented with a strong focus on data accuracy, quality, and reliability


Desirable (Nice to Have):


  • Background in defence, government, or highly regulated sectors
  • Familiarity with Apache Kafka, Spark, or Hadoop
  • Experience with Docker and Kubernetes
  • Use of monitoring/alerting tools such as Prometheus, Grafana, or ELK
  • Understanding of machine learning algorithms and data science workflows
  • Proven ability to deliver end-to-end data solutions
  • Knowledge of Terraform, Ansible, or similar IaC tools


Interested?


If you're ready to work on cutting-edge data engineering projects that make a difference, get in touch to learn more.
