Data Engineering Team Lead

Job summary
Permanent contract
London
Salary: Not specified
No remote work
Experience: > 5 years
Skills & expertise
Java
Clojure
Scala
Terraform
Redis

Trainline


The position

Job description

As a Data Engineering Team Lead at Trainline you will... 🚄 

  • Be a committed lead and coach to an agile team of polyglot Data Platform Engineers, building a cutting-edge data lakehouse platform on AWS using Kafka and cloud-native streaming technologies, Iceberg, Trino, DataHub and Airflow. 
  • Be a brilliant people manager for the Data Engineering function, motivating and engaging the team to develop their skills and increase their impact. 
  • Deliver the platform that enables world-class data products, including the latest in LLM and other AI technologies. 
  • Lead the technical direction of the cross-functional team, making good choices on technologies and approach to get the biggest impact for the least risk. 
  • Foster an obsession with quality and engineering excellence through automated, repeatable processes using CI/CD, TDD and BDD. 
  • Drive incremental growth in DevOps maturity, building tools and practices that allow repeatable and efficient delivery of models to production, with strong governance in the workflow and continuous monitoring of models in production. 
  • Own the operation of the products built by your team and continuously improve operational performance. 
  • Lead and coach a self-organised Agile team and continuously improve agile maturity and delivery predictability. 

Preferred experience

We'd love to hear from you if you... 🔍 

  • Thrive in a diverse, open and collaborative environment. 
  • Are expert in JVM technologies (primarily Scala, ideally with a working knowledge of Clojure or Java) and familiar with other key languages (especially Python). 
  • Are expert in key data engineering platforms such as Kafka or other streaming technologies, data lakes (AWS S3, Iceberg, Parquet), analytics technologies (Trino, Spark), automation technologies (Airflow, MLflow) and data governance (DataHub). 
  • Have people management and technical leadership experience. 
  • Are passionate about agile software delivery with a track record of leading effective agile and lean software teams. 
  • Have a consistent background in software development in high-volume environments. 
  • Have a strong background in DevOps, deploying, managing and maintaining services using Docker, Terraform and AWS CLI tools to achieve infrastructure-as-code and automated deployments. 
  • Have an excellent working knowledge of AWS services (EMR, ECS, IAM, EC2, S3, DynamoDB, MSK). 
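
The infrastructure-as-code approach described above can be pictured with a minimal Terraform sketch. Everything here is a hypothetical illustration (bucket name, region and resource layout are placeholders, not Trainline's actual infrastructure):

```hcl
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"
    }
  }
}

provider "aws" {
  region = "eu-west-1" # assumed region, for illustration only
}

# An S3 bucket of the kind a data lake (Iceberg/Parquet files) might sit on.
resource "aws_s3_bucket" "data_lake" {
  bucket = "example-data-lake-bucket" # placeholder name
}

# Versioning helps protect lake data against accidental overwrites.
resource "aws_s3_bucket_versioning" "data_lake" {
  bucket = aws_s3_bucket.data_lake.id
  versioning_configuration {
    status = "Enabled"
  }
}
```

Declaring resources this way means the same `terraform apply` reproduces the environment anywhere, which is the point of the automated, repeatable deployments the role calls for.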

Our Technology Stack 💻 

  • Scala and Python 
  • Kafka, Spark, Kafka Streams, Kinesis, Akka and KSQL 
  • AWS, S3, Iceberg, Parquet, Glue and Spark/EMR for our Data Lake 
  • Elasticsearch, DynamoDB and Redis 
  • Starburst and Athena 
  • Airflow and MLflow 
