The role
You will be part of the ML team at Mavenoid, shaping the next product features to help people around the world get better support for their hardware devices. The core of your work will be understanding users' questions and problems so the product can bridge the semantic gap between what users ask and the support they need.
The incoming data consists mostly of textual conversations, search queries and documents (more than 1M text conversations per month, plus a growing volume of voice). You will help process this data and assess new LLM and NLP models to build and improve the set of ML features in the products.
Tech stack
We work:
in Python
with NLP/ML libs, including langchain, langfuse, huggingface and pytorch (among others)
with major LLM providers (OpenAI, Anthropic, Google, Mistral) and hosted models
deploying with Docker on GCP cloud services
We are pragmatic about which tool to use for each problem, as long as it can be properly packaged for production. The sketch below illustrates the kind of glue code this implies.
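For illustration only, here is a minimal sketch assuming a langchain + OpenAI setup; the model name, prompt wording and example question are assumptions, not anything from the product.

from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

# Hypothetical prompt: pull the device and the problem out of a user's support question.
prompt = ChatPromptTemplate.from_messages([
    ("system", "Extract the device and the problem from the user's support question."),
    ("human", "{question}"),
])

# Any supported provider/model could be swapped in here; gpt-4o-mini is only an example.
llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)
chain = prompt | llm

if __name__ == "__main__":
    reply = chain.invoke({"question": "My robot vacuum won't reconnect to Wi-Fi after the firmware update."})
    print(reply.content)

In practice a chain like this would be traced (for example with langfuse) and shipped in a Docker image on GCP cloud services, as described above.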
Way of working
We are a small team -- by design -- and share responsibilities. We care about:
shipping to production and seeing usage
keeping up with ML developments
balancing speed and codebase quality
You will:
work fully remotely and meet IRL a few times a year
focus on specific features and own the process from scoping to production delivery
evaluate ideas and propose the right metrics to explore/implement/ship new things
contribute to ML models and features, but also to service architecture and the platform at scale