Key Skills Required
· Technical ability; has a high level of current technical competence in relevant technologies and is able to independently learn new technologies and techniques as our stack changes.
· Collaboration; works in close coordination with others to design and implement data pipelines that support machine learning models, analytical dashboards, and experimental frameworks.
· Clear communication; can communicate effectively in both written and verbal forms with technical and non-technical audiences alike.
· Complex problem-solving ability; structured, organised, process-driven and outcome-oriented, able to draw on past experience to inform future innovation.
· Passionate about data; enjoys being hands-on and learning about new technologies, particularly in the data field.
Technical Skills Required
· Hands-on software development experience with Python and experience with modern software development and release engineering practices (e.g. TDD, CI/CD).
· Experience with Apache Spark or other distributed data processing frameworks.
· Comfortable writing efficient SQL and debugging on cloud warehouses like Databricks SQL or Snowflake.
· Experience with cloud infrastructure like AWS or Azure.
· Experience with Linux, shell scripting and containerisation (e.g. Docker).
· Understanding of data modelling and data cataloguing principles.
· Understanding of data management principles (security and data privacy) and how they apply to data engineering processes and solutions (e.g. access management, handling of sensitive data under GDPR).
· Experience with CI/CD tools, in particular GitHub Actions.
· Hands-on IaC development experience with Terraform or CloudFormation.
Desirable Skills
· Hands-on development experience in the airline, e-commerce or retail industries.
· Experience in event-driven architecture, ingesting data in real time in a commercial production environment with Spark Streaming, Kafka, DLT or Beam.
· Experience implementing end-to-end monitoring, quality checks, lineage tracking and automated alerts to ensure reliable and trustworthy data across the platform.
· Experience building a data transformation framework with dbt.
· Familiarity with Databricks as a data and AI platform, or with the Lakehouse architecture.
What you’ll get in return
· Competitive base salary
· Up to 20% bonus
· 25 days holiday
· BAYE, SAYE & Performance share schemes
· 7% pension
· Life Insurance
· Work Away Scheme
· Flexible benefits package
· Excellent staff travel benefits
About easyJet
At easyJet our aim is to make low-cost travel easy – connecting people to what they value using Europe’s best airline network, great value fares, and friendly service.
It takes a real team effort to carry over 90 million passengers a year across 35 countries. Whether you’re working as part of our front-line operations or in our corporate functions, you’ll find people that are positive, inclusive, ready to take on a challenge, and that have your back. We call that our ‘Orange Spirit’, and we hope you’ll share that too.
Apply
Complete your application on our careers site.
We encourage individuality, empower our people to seize the initiative, and never stop learning. We see people first and foremost for their performance and potential, and we are committed to building a diverse and inclusive organisation that supports the needs of all. As such, we will make reasonable adjustments for our candidates from interview through to employment.