Law firms sit on some of the most consequential data in the country. Privileged communications. Unfiled litigation strategy. Merger terms six weeks before announcement. Whistleblower identities. Witness statements in active criminal matters. Custody files. Restructuring timelines. Internal investigations that, if leaked, would end careers and break companies.
For most of that work, a well-designed SaaS platform is the right answer. Strong encryption, tenant isolation, UK-region hosting, a proper contract, and an auditable processing trail solve the problem convincingly. That is how the default deployment of LegalAI Space runs today, and that is how it will run for most mid-market firms that adopt us. Monthly subscription, no infrastructure to operate, fast to deploy.
But not every firm, and not every matter, fits inside a SaaS contract. Some risk committees will not approve any third-party AI cloud for live matter data, full stop. Some client engagement letters now contain clauses that rule it out. Some firms already run a mature Azure or AWS estate with a "no external processors" posture they are not prepared to break for an AI vendor.
For those firms, we now offer a self-hosted deployment of LegalAI Space as an option alongside SaaS. This post explains what it is, how it is built, when it makes sense, and what technical teams need to know before evaluating it.
Two ways to run the same product
SaaS
Monthly subscription. UK-region hosting. Per-firm tenancy. Encryption in transit and at rest. Audit logs queryable by the firm. This is how most customers run, and how most customers should start.
Self-hosted
Annual licence. The full platform deploys inside the firm's own cloud tenancy or data centre. Matter data stays behind the firm's network boundary. Supported end to end by us.
The product is identical on both sides. Same agents, same governance engine, same verification layer, same audit database. The difference is where the software runs and who holds the encryption keys.
When self-hosted is the right call
Self-hosted is not the right answer for most firms. It adds operational overhead, needs an IT function that is comfortable running it, and costs more to deploy than SaaS. It earns its place when one or more of the following is true:
- The risk committee will not approve any third-party AI cloud for live matter data.
- A client engagement letter requires that the data never leaves the firm's infrastructure.
- The firm already operates a tightly controlled Azure, AWS, or GCP tenancy with mature SecOps and prefers AI to live there.
- The firm has an established in-house IT function with the capability to deploy and maintain enterprise software on its own infrastructure.
- The firm handles regulated-sector work (financial services, defence, healthcare, reputation, whistleblowing) where a third-party processor triggers additional disclosures or contractual breaches.
- The firm wants to launch a named internal AI programme as a differentiator to clients.
If none of those apply, SaaS is almost always the better choice, and we will say so directly before quoting self-hosted.
What self-hosted actually is, explained for a non-technical reader
Self-hosted LegalAI Space is a set of containerised services that deploy inside the customer's own infrastructure. In practical terms, that usually means a dedicated Azure subscription, an isolated AWS account, or a private Kubernetes cluster inside the firm's data centre. The firm owns the compute, the storage, the network, and the keys. Matter data, prompts, and audit logs are written to databases inside that boundary and stay there.
The deployment has six components, all running inside the firm's environment:
- The agents. Research Agent, Contract Agent, Compliance Monitor, and Audit and Risk. Each runs as a service (a container) inside the firm's environment and can be scaled independently.
- The inference layer. The language models the agents depend on. Served either from a firm-owned GPU pool or through a private-link connection to a dedicated model endpoint, with no public-internet path. Inference is never shared across tenants.
- The governance engine. Our proprietary policy layer that decides what each agent may see, what it may do, and what it must log. Maps every action back to SRA principles and the firm's own acceptable-use policy.
- The verification layer. Every citation, clause reference, and agent-produced claim is checked against the source document before reaching the lawyer. Unverified claims are stopped before they become part of a live matter.
- The governance database. A tamper-evident audit store of every prompt, response, document touched, and decision made. Queryable on demand for regulators, clients, insurers, and internal risk reviews.
- The integration layer. Single sign-on through Azure AD or Okta. Document access through iManage, NetDocuments, or SharePoint. Data classification alignment with Microsoft Purview. Secrets in the firm's own vault. All configured to the existing stack, no parallel identity system.
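The "tamper-evident" claim for the governance database is worth unpacking. A common way to achieve it is a hash chain: each audit record stores a digest of the previous record, so editing any historical entry breaks every link after it. A minimal sketch of that pattern, with illustrative field names rather than the actual schema:

```python
import hashlib
import json

def append_record(log, record):
    """Append an audit record, chaining it to the previous entry's hash."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    payload = json.dumps(record, sort_keys=True)  # canonical serialisation
    entry_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
    log.append({"record": record, "prev": prev_hash, "hash": entry_hash})

def verify_chain(log):
    """Recompute every link; any edited record breaks the chain from that point on."""
    prev_hash = "0" * 64
    for entry in log:
        payload = json.dumps(entry["record"], sort_keys=True)
        expected = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
        if entry["prev"] != prev_hash or entry["hash"] != expected:
            return False
        prev_hash = entry["hash"]
    return True

log = []
append_record(log, {"actor": "research-agent", "action": "read", "doc": "matter-1042.pdf"})
append_record(log, {"actor": "fee-earner", "action": "approve", "doc": "matter-1042.pdf"})
assert verify_chain(log)

log[0]["record"]["doc"] = "something-else.pdf"  # retroactive edit
assert not verify_chain(log)                    # tampering is detected
```

Paired with write-once storage, or periodic anchoring of the latest hash somewhere the database administrator cannot reach, this makes silent edits detectable by anyone who can replay the chain.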
Everything ships as signed container images through a secure release channel. The firm's IT team applies them the way any enterprise software update is applied: staging first, production second, rollback path in place.
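In practice, "signed container images" means the pipeline refuses to stage anything whose contents do not match the release manifest. Real deployments verify public-key signatures with tooling such as cosign or Docker Content Trust; the sketch below simplifies that to a digest comparison to show the gating behaviour, and all names are illustrative:

```python
import hashlib

def verify_release(manifest, artifacts):
    """Check every artifact's SHA-256 digest against the release manifest.

    `manifest` maps artifact name -> expected hex digest (distributed through
    the signed release channel); `artifacts` maps name -> raw bytes as pulled.
    """
    failures = []
    for name, expected in manifest.items():
        actual = hashlib.sha256(artifacts[name]).hexdigest()
        if actual != expected:
            failures.append(name)
    return failures  # an empty list means the release is safe to stage

image = b"governance-engine image bytes"
manifest = {"governance-engine": hashlib.sha256(image).hexdigest()}
assert verify_release(manifest, {"governance-engine": image}) == []
assert verify_release(manifest, {"governance-engine": image + b"x"}) == ["governance-engine"]
```

Anything that fails the check never reaches staging, let alone production, which is what makes the staging-first, rollback-in-place flow trustworthy.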
Connected or air-gapped
Self-hosted supports two network postures, chosen by the firm's SecOps team:
Connected
The deployment runs inside the firm's tenancy, with outbound-only network access to our release channel for signed updates, opt-in metadata telemetry, and remote support on request. No matter data flows outward. This is how most self-hosted customers run.
Air-gapped
No outbound connection of any kind. Updates arrive as signed offline bundles, applied on a schedule the firm controls. Support is handled through an encrypted ticketing channel with no live system access. For firms with the highest data-sovereignty posture.
The product behaves the same in both modes. The difference is purely network.
What the firm keeps, and what we do
The firm holds every meaningful asset: the data, the encryption keys, the audit logs, the user accounts, the backups, the governance database, the network boundary. Nothing of substance sits with us.
Our responsibilities are software delivery and support. We ship signed, versioned releases on a published cadence. We provide runbooks, monitoring templates, and a named contact on call. In normal operation we do not see matter data, user activity, or document content at any point.
White-label and practice-niche customisation
Self-hosted deployments are white-labelled by default. The login screen, the agent responses, and the audit reports can all carry the firm's brand rather than ours. For firms launching a named internal AI programme, the engine is LegalAI Space; the name on every screen belongs to the firm.
For practice niches that mainstream tools have not thought about, such as specialist regulatory work, unusual jurisdictions, or boutique commercial structures, we offer bespoke agent development inside the same deployment, governed by the same verification and audit layer.
Updates without compromise
Self-hosted software has a reputation for meaning "stuck on an old version forever." That is not how this is designed. Signed releases arrive with human-readable notes, a staging test plan, and a rollback path. IT teams pin, defer, or promote to production on their own schedule. The platform stays current regardless of deployment mode.
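The pin, defer, and promote choices amount to a small policy decision on each incoming release. A hypothetical sketch of that decision, not the product's actual update mechanism:

```python
def release_action(current, incoming, pinned=None, deferred=()):
    """Decide what the IT team does with an incoming signed release."""
    if pinned is not None and incoming != pinned:
        return "hold"   # firm has pinned a specific version
    if incoming in deferred:
        return "hold"   # firm has chosen to defer this release
    if incoming == current:
        return "noop"   # already running it
    return "stage"      # test in staging, then promote; rollback path stays available

assert release_action("2.3.0", "2.4.0") == "stage"
assert release_action("2.3.0", "2.4.0", pinned="2.3.0") == "hold"
assert release_action("2.3.0", "2.4.0", deferred={"2.4.0"}) == "hold"
```

The point is that the firm, not the vendor, evaluates this policy: nothing changes in production until the IT team promotes it.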
The practical case for self-hosted
Privacy by architecture. The network boundary is the guarantee, not a contract clause. Processing happens inside the firm's perimeter because there is no path out of it.
Regulatory clarity. SRA principles, EU AI Act obligations, UK GDPR, and client engagement-letter disclosures become easier to answer when the full processing chain is inside a single, firm-owned boundary.
Partner acceptance. Partners who will not sign off on a third-party AI cloud will sign off on software that runs inside the firm's infrastructure. The internal political case becomes winnable.
Vendor independence. If our company disappeared tomorrow, a self-hosted deployment keeps operating. Software, data, audit trail, and governance database remain in the firm's hands. Continuity is a property of the architecture, not a promise in a contract.
Where this sits in the wider market
Self-hosting is not a common posture in legal AI. I spent time this week going through the 54 vendors I track as competitors or close adjacencies, and checked which of them publicly offer a self-hosted, on-premise, or customer-cloud deployment. The shape of the answer is worth sharing, because it tells you something about where the market actually is, rather than where marketing pages imply it is.
Eight vendors across the full landscape publish a clear self-hosted option. Most of those sit in contract or document infrastructure (Icertis, iManage RAVN, Zuva, Robin AI), or are AI-forward products built primarily for large law firms and in-house legal teams (Alexi, LegalFly, Callidus Legal AI, Paxton AI). A further six offer partial patterns, usually a legacy on-premise product on a sunset path towards cloud. Roughly two-thirds of the landscape is either explicitly cloud-only or silent on the question.
The more pointed finding, for UK firms weighing this decision, is inside the cohort that matters most: none of the UK-first, SRA-focused vendors currently publishes a self-hosted deployment. The same is true of the feature-overlap platforms a UK mid-market firm is most likely to be comparing us against, including Harvey, CoCounsel, Lexis+ AI, Legora, and Clio Duo. All of them run as managed cloud services.
So far as I can see from public material, LegalAI Space is among the first, and quite possibly the first, to offer a self-hosted deployment built specifically for UK mid-market firms working inside the SRA framework. I would be glad to be corrected if a competitor has shipped something equivalent that I have missed. What the evidence says today is that this kind of deployment is not yet on the shelf in this segment of the market.
A note from me
LegalAI Space runs as SaaS for most customers, and that will continue to be how we scale. The SaaS product is secure, fast to deploy, and the right fit for the majority of mid-market firms.
Self-hosted exists because I kept meeting compliance officers, COLPs, and IT directors who could not say yes to any SaaS AI tool, for reasons that were entirely legitimate. Rather than walk away from those firms, we built the deployment they needed. It is a premium option, supported end to end, and designed for the cases where nothing less will do.
If you lead compliance, risk, or technology at a UK mid-market firm and a self-hosted deployment matches how the firm thinks about data, I would welcome a conversation. If SaaS is the right answer for where you are today, I am happy to walk through that too.
Contact: daman@legalaispace.com
Website: legalaispace.com