Data Engineer - AI & Automation
Location: Belfast (Hybrid)
Eligibility: Must have the right to work in the UK (no sponsorship available)
I am working with a fast-growing AI automation company that is expanding into Belfast and building a brand-new technical hub. They are now looking for Data Engineers to help build the data foundations that power intelligent, agent-driven software testing at scale.
You'll be joining a team where data is the backbone of everything: model training, real-time decision making, predictive analytics, and the next generation of AI-powered testing capabilities. If you enjoy architecting pipelines, tackling messy data, and building reliable systems that ML and engineering teams depend on, this is a chance to help shape a new Belfast function from the ground up.
Why join?
* Be part of the founding Data Engineering team in a brand-new Belfast hub
* Work on high-impact pipelines feeding cutting-edge AI and automation systems
* Build modern, cloud-native data architecture with freedom to influence tools and direction
* Strong collaboration with ML, software engineering, and platform teams
* £55k - £75k per annum
What you'll be doing:
* Design and build scalable, reliable data pipelines for ingestion, transformation, and feature delivery
* Develop and maintain cloud-native data infrastructure supporting AI-driven testing platforms
* Own ETL/ELT workflows, data models, and real-time processing components
* Partner with Data Scientists to productionise features and ensure high-quality training data
* Build systems for monitoring, observability, and data quality across multiple sources
* Work with engineering teams to integrate data workflows into core platform services
* Influence architecture and tool choices as we scale the new Belfast operation
* Contribute to best practices in data governance, documentation, and automation
What you'll bring:
* Strong experience building data pipelines using Python, SQL, or modern ETL frameworks
* Hands-on experience with cloud data platforms (GCP ideal, but AWS/Azure welcome)
* Solid understanding of data modelling, warehousing, and distributed processing
* Experience with workflow/orchestration tools (Airflow, Prefect, etc.)
* Ability to collaborate with ML teams and support model deployment workflows
* Comfortable working in a fast-moving environment with lots of autonomy
* Strong problem-solving mindset and attention to data quality, performance, and scalability
Interested?
If you'd like to join a growing AI team building genuinely impactful systems, reach out to Justin Donaldson for a confidential chat, or send your CV to learn more.
