
Introducing LeemerLabs

Local European inference, custom models, and proof through shipped products.

LeemerLabs is the infrastructure arm of the Leemer Group: Ireland-hosted inference, custom model creation through LeemerFoundry, and the systems powering products like LeemerChat.

Repath 'Ray' Khan, Founder of LeemerLabs · April 17, 2026 · 6 min read

In one line

LeemerLabs is the infrastructure company behind the Leemer Group: an Ireland-hosted inference gateway, a custom model foundry, and the operating layer powering products like LeemerChat.

Ireland-hosted: Waterford + Dublin

Gateway: OpenAI-compatible

Hardware: Nvidia H200

Scope: Inference + Foundry

What LeemerLabs is

A lab, a gateway, and a model foundry.

LeemerLabs is the infrastructure arm of the Leemer Group. It exists because our own products outgrew generic AI APIs. We needed local European inference, exportable custom models, and a path from research to production that did not depend on opaque third-party defaults.

So we built the stack ourselves. LeemerLabs now covers three connected layers: a public inference gateway for hosted open-weight models, LeemerFoundry for custom model creation, and the infrastructure that powers products like LeemerChat and Critique.

The point is simple. We do not want companies in Europe renting intelligence from black-box APIs forever. We want them to own the model layer they build their business on.

Why now

Ownership has become cheaper than dependence.

Training costs have fallen hard. Open-weight models are now strong enough to justify serious product bets. And enterprises have grown far less comfortable shipping sensitive workflows through shared US inference paths marketed as compliant.

That changes the economics. For many high-volume use cases, a custom model is materially cheaper than repeated prompt engineering against closed APIs. It is also easier to govern, easier to tune, and far more defensible once the behaviour actually matches your domain.
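The ownership economics above can be made concrete with a simple break-even calculation. The prices and the one-time tuning cost below are purely illustrative assumptions, not LeemerLabs figures:

```python
# Illustrative break-even sketch. All numbers are hypothetical placeholders,
# not LeemerLabs or vendor pricing.
closed_api_cost_per_mtok = 10.00   # USD per million tokens via a closed API
hosted_cost_per_mtok = 1.50        # USD per million tokens on an owned model
one_time_tuning_cost = 25_000.00   # USD spent once on fine-tuning

def break_even_mtok(closed: float, hosted: float, tuning: float) -> float:
    """Millions of tokens after which the owned model is cheaper overall."""
    return tuning / (closed - hosted)

volume = break_even_mtok(closed_api_cost_per_mtok,
                         hosted_cost_per_mtok,
                         one_time_tuning_cost)
print(f"Break-even at ~{volume:,.0f}M tokens")
```

Past that volume, every additional token widens the gap in favour of the owned model; below it, the closed API is still cheaper.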

LeemerLabs is built around that shift. Not AI as a demo. AI as owned infrastructure.

The structure

Three surfaces, one system.

The public gateway is the clean entry point: OpenAI-compatible, EU-hosted, and built for teams that want a fast path into local inference. It is the practical layer.
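Because the gateway is OpenAI-compatible, any client that speaks the `/v1/chat/completions` protocol can target it. The sketch below builds such a request body; the gateway URL and model id are assumptions for illustration, not confirmed LeemerLabs endpoints:

```python
import json

# Assumed gateway URL for illustration only; substitute the real endpoint.
GATEWAY_URL = "https://api.leemerlabs.example/v1/chat/completions"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build the JSON body an OpenAI-compatible chat endpoint expects."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# "open-weight-model" is a placeholder model id.
body = build_chat_request("open-weight-model",
                          "Summarise GDPR data-residency requirements.")
print(json.dumps(body, indent=2))
```

In practice this usually means pointing an existing OpenAI SDK at the gateway's base URL and an API key, with no other code changes.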

LeemerFoundry is the deeper layer: dataset creation, fine-tuning, evaluation, deployment, and exportable weights. It is the ownership layer.

LeemerChat and our other products are the proof layer. They are not mockups for investors. They are the systems that force our infrastructure to survive real users, real load, and real edge cases.

Core products

01

LeemerChat

The flagship workspace. Frontier models, consensus routing, deep research, codebase chat, and long-horizon execution in one product.

02

LeemerFoundry

Custom model creation from data forge to deployment. Your data, your model, our GPUs.

03

Critique

GitHub-native pull request review backed by sandbox execution and inspectable output.

Related Posts

February 22, 2026

The Foundry Report: Why Fine-Tuned Models Are Still the Sharpest Weapon in Enterprise AI

Tinker is now generally available. Vision input, Kimi K2 Thinking, and LoRA Without Regret are reshaping what custom model training looks like in 2026. Here's why fine-tuning is more strategically important than ever — and how LeemerLabs Model Foundry is building the infrastructure to prove it.

November 22, 2025

Introducing LeemerLabs Model Foundry: Your Data. Your Model. Our GPUs.

We're launching Ireland's first custom LLM creation studio. Fine-tune frontier models up to 235B parameters using Tinker distributed training, powered by Thinking Machines Lab. Build domain-specific intelligence layers that you own and deploy anywhere.

March 2, 2026

Get Ready for Mission Control: The Next Evolution of Agentic Execution

Mission Control is our next-generation agentic research and execution platform. It represents a fundamental shift in how we interact with AI—moving away from rigid pipelines and chat interfaces, and stepping into the era of autonomous, goal-oriented swarms.

January 25, 2026

PDF AI Processing: From Static Documents to Solved Answers

Upload a PDF, get it back solved. No prompts required. Introducing our standalone PDF processing workflow with Mistral OCR, Multi-Agent Consensus, and Visual Overlay.
