
Thunderbolt Wants to Do for AI Clients What Thunderbird Did for Email

This self-hostable enterprise AI client lets you bring your own models and keep your data off third-party servers.

MZLA Technologies Corporation, the Mozilla Foundation subsidiary behind Thunderbird, has announced Thunderbolt, an open source, self-hostable AI client for organizations that want to run AI on their own infrastructure.

The project is funded through investment from Mozilla and is a standalone product, separate from Thunderbird, built by a different team within MZLA that's focused on enterprise AI products.

Offered under the Mozilla Public License 2.0, Thunderbolt provides an AI workspace where users can interact with AI through chat, search, and research, connect to enterprise data, and choose the models and tools that fit their needs.

It runs natively on Linux, Windows, macOS, iOS, and Android, with a web client also available.

A thing to note…

You should know that Thunderbolt ships with telemetry on by default.

According to the project's telemetry documentation on GitHub, it uses PostHog to collect usage data covering chat activity, model selections, settings changes, and location information.

This can be switched off in settings, and the project states no personally identifiable information (PII) is collected without explicit consent.

Who is it for?

The intended audience is organizations with strict data residency or compliance requirements. Think healthcare providers, legal firms, and financial institutions that cannot afford to have sensitive internal data flowing through third-party AI services.

As for its competition, Thunderbolt is a direct challenge to Microsoft Copilot, ChatGPT Enterprise, and Claude Enterprise. In the open source space, it sits alongside tools like Open WebUI and LibreChat, both of which offer self-hosted AI frontends.

Announcing Thunderbolt, Ryan Sipes, CEO of MZLA Technologies Corporation, said:

AI is too important to outsource. With Thunderbolt, we’re giving organizations a sovereign AI client that allows them to decide how AI fits into their workflows – on their infrastructure, with their data, and on their terms.

What can you expect?

Banner image: screenshots of Thunderbolt running on a laptop and a smartphone.

Thunderbolt connects to frontier models from Anthropic, OpenAI, and Mistral, handles local inference through Ollama, and accepts custom providers, with the workspace offering Chat and Search modes.
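For a sense of what "local inference through Ollama" means in practice, here is a minimal sketch (not Thunderbolt-specific code; the model name is just an example) of the request body any client sends to Ollama's local `/api/chat` endpoint:

```python
import json

def build_ollama_chat_request(model: str, prompt: str) -> dict:
    """Build a request body for Ollama's local /api/chat endpoint,
    which normally listens on http://localhost:11434."""
    return {
        "model": model,                                   # e.g. a locally pulled model
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,                                  # return one complete response
    }

# A client like Thunderbolt would POST this JSON to the local server;
# here we just print the payload it would send.
body = build_ollama_chat_request("llama3.2", "Summarize today's briefing.")
print(json.dumps(body, indent=2))
```

Because the server runs on your own machine, prompts and responses never leave your infrastructure, which is the core pitch of a self-hosted client.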

It can also handle scheduled work, pulling together briefings, tracking topics over time, or kicking off actions when set conditions are met.

deepset's Haystack integration ties the client into enterprise agent and RAG pipelines within the same architecture. Meanwhile, MCP (Model Context Protocol) support is in preview, and ACP (Agent Client Protocol) is in active development with an April 2026 target.

How to get it?

You can get started with Thunderbolt by visiting thunderbolt.io. Organizations interested in enterprise deployment, professional support, or custom development can get in touch with the team.

As for the source code, it lives on GitHub.

Other than that, the FAQ does mention that a Thunderbolt version for regular users is on the cards, but there's no release date for it yet.

About the author
Sourav Rudra

A nerd with a passion for open source software, custom PC builds, motorsports, and exploring the endless possibilities of this world.
