Orbital is an industrial AI system that lives inside refineries and upstream assets, serving real-time insights to operators, technologists, and engineers. As a Forward Deployed Full Stack Engineer, you'll make Orbital usable on the ground: building the interfaces, APIs, and integration layers that bring AI outputs directly into operational workflows.
This isn't a typical web dev role. You'll work across back-end services, APIs, and industrial integrations, while also shaping front-end interfaces that can survive operator control rooms and engineering workflows. You'll be customer-facing: working directly with site teams, adapting features in real time, and making sure the system sticks in production.
You won't just productionise models: you'll install Orbital on customer sites, integrate with live historian and process data pipelines, and ensure the system runs inside customer IT/OT networks.
Location:
While you will be based in Europe (or be eligible to work there), this role will involve travel to other locations in India and the USA.
Core Responsibilities
Application Development
Build and maintain front-end dashboards and interfaces for refinery operators, technologists, and engineers.
Develop back-end APIs and services that integrate Orbital's AI outputs into customer systems.
Ensure applications are secure, reliable, and performant in both cloud and on-prem environments.
Microservices & Integration
Develop services as containerised microservices, orchestrated in Kubernetes/EKS.
Connect front-end and back-end layers with message brokers (Kafka, RabbitMQ) and API gateways.
Integrate with industrial data sources (historians, LIMS, OPC UA, IoT feeds).
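To make the integration pattern above concrete — services passing AI outputs through a broker before they reach a front-end — here is a minimal, broker-agnostic sketch in plain Python. The topic name, tag names, and payload shape are illustrative assumptions, not Orbital's actual schema; in production the in-memory broker would be Kafka or RabbitMQ.

```python
from collections import defaultdict
from typing import Any, Callable

class Broker:
    """Tiny in-memory stand-in for Kafka/RabbitMQ, to show the flow only."""
    def __init__(self) -> None:
        self._subs: dict[str, list[Callable[[dict[str, Any]], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict[str, Any]], None]) -> None:
        self._subs[topic].append(handler)

    def publish(self, topic: str, message: dict[str, Any]) -> None:
        for handler in self._subs[topic]:
            handler(message)

# Hypothetical dashboard feed; a real service would push this over an API/WebSocket.
dashboard_feed: list[dict[str, Any]] = []

def to_dashboard(msg: dict[str, Any]) -> None:
    # Reshape a raw inference message into what a front-end widget expects.
    dashboard_feed.append({
        "tag": msg["tag"],
        "value": round(msg["prediction"], 2),
        "alert": msg["prediction"] > msg.get("limit", float("inf")),
    })

broker = Broker()
broker.subscribe("orbital.inference", to_dashboard)  # hypothetical topic name
broker.publish("orbital.inference", {"tag": "FIC-101", "prediction": 87.456, "limit": 80.0})
```

The point of the pattern is decoupling: the inference service publishes without knowing which dashboards, alarms, or customer systems consume the result.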
Forward Deployment & Customer Adaptation
Deploy full-stack applications in customer on-premise networks.
Work with process engineers, IT/OT, and operations teams to customise UI/UX for their workflows.
Debug and iterate features live in the field, ensuring adoption and usability.
Software Engineering Best Practices
Write clean, modular, and testable code across front-end and back-end.
Set up CI/CD pipelines for fast iteration and deployment.
Collaborate closely with product owners and ML engineers to align UI with model capabilities.
Adapt UX to site-specific workflows (control room, process engineering, production tech teams).
Collaborate with ML Engineers to surface inference and root-cause analysis (RCA) results in usable, real-time dashboards.
Customer Facing
Deploy AI microservices in customer on-prem environments (often air-gapped or tightly firewalled) and in our cloud clusters.
Connect Orbital pipelines to customer historians, OPC UA servers, IoT feeds, and unstructured data sources.
Build data ingestion flows tailored to each site, ensuring schema, tagging, and drift handling are robust.
Work with customer IT/OT to manage network, security, and performance constraints.
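A per-site ingestion flow of the kind described above typically validates incoming historian records against an expected tag schema before data enters the pipeline. A minimal sketch, assuming made-up tag names and a deliberately simple drift rule (unknown tags are flagged, missing tags are reported):

```python
from dataclasses import dataclass, field

@dataclass
class SiteSchema:
    """Expected historian tags for one site; tag names here are hypothetical."""
    expected_tags: set[str]
    seen_unknown: set[str] = field(default_factory=set)

    def check(self, record: dict[str, float]) -> dict[str, float]:
        """Keep known tags, record unknown ones as schema drift."""
        clean: dict[str, float] = {}
        for tag, value in record.items():
            if tag in self.expected_tags:
                clean[tag] = value
            else:
                self.seen_unknown.add(tag)  # drift: tag not in the site schema
        return clean

schema = SiteSchema(expected_tags={"TI-204", "PI-310", "FIC-101"})
clean = schema.check({"TI-204": 351.2, "PI-310": 4.8, "XX-999": 1.0})
missing = schema.expected_tags - clean.keys()  # tags absent from this scan
```

Real deployments layer on more — units, timestamps, stale-value detection — but the site-specific schema as an explicit, checkable artifact is the core idea.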
Requirements
Strong proficiency in JavaScript/TypeScript (React, Node.js) and back-end frameworks (FastAPI, Express, Django).
Solid working knowledge of Python for scripting, APIs, and data integration.
Experience building containerised microservices (Docker, Kubernetes/EKS).
Familiarity with message brokers (Kafka, RabbitMQ).
Proficiency with Linux environments (deployment, debugging, performance tuning).
Hands-on experience with AWS (EKS, S3, IAM, CloudWatch, etc.).
Bonus: exposure to time-series/industrial data and operator-facing dashboards.
Comfort working in