Chakra now supports Model Context Protocol (MCP)

Intro
Yesterday, Anthropic announced Claude 3.7 Sonnet, its latest state-of-the-art model. It's an impressive upgrade from Claude 3.5 Sonnet, launched 8 months ago, boasting better performance, larger context windows, and much stronger reasoning abilities.

However, it's not Anthropic's most important release of the last 6 months - that title belongs to the Model Context Protocol (MCP). MCP is more than a model iteration; it's an important shift in how consumers interact with their favorite pieces of software.
What is MCP?
Anthropic describes MCP as a USB-C port for AI applications. MCP is an open standard for bi-directional communication between AI models and data sources. It works out of the box with Claude 3.7 Sonnet, and it is an open protocol that other providers like ChatGPT and Grok can adopt.
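Under the hood, MCP messages are JSON-RPC 2.0: a client (like Claude Desktop) asks a server what tools it exposes, and the server answers with a tool catalog the model can call. A minimal sketch of that exchange is below - the `query_datasets` tool name and its schema are purely illustrative, not part of the spec.

```python
import json

# MCP messages are JSON-RPC 2.0. A client asks a server which tools
# it exposes via the standard "tools/list" method.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
}

# An illustrative server response: one tool the model may invoke.
# (The tool name and schema here are hypothetical examples.)
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "tools": [
            {
                "name": "query_datasets",
                "description": "Run a read-only SQL query against subscribed datasets",
                "inputSchema": {
                    "type": "object",
                    "properties": {"sql": {"type": "string"}},
                    "required": ["sql"],
                },
            }
        ]
    },
}

print(json.dumps(request))
```

The model never talks to your database directly - it only sees the tool descriptions and calls them through this message layer.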

To understand MCP, let's look at what Perplexity built on top of ChatGPT.

Perplexity built an impressive tool for conducting web search. They pass the results from that search to the LLM as context, giving the LLM access to current events and making the end output far more relevant for users.
MCP enables any application developer to now provide LLMs with access to mission critical data, within their existing chat experience (like Claude Desktop).
The Case for the Niche Chatbot Experience
Previously, if you were a software provider and wanted your data in a chat interface, you had to build a full chat experience: separate UI, separate domain, separate product. Users had to juggle N different experiences to get the data they needed.

While Perplexity has largely achieved this and hit escape velocity, it’s challenging for a typical software provider to accomplish this feat and change user preference.
Users like their existing chat experiences - Claude Desktop, Grok3 WebUI, ChatGPT mobile. Rather than change those user preferences, MCP allows software providers to integrate their data into existing experiences.
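For example, registering an MCP server with Claude Desktop is a small edit to its `claude_desktop_config.json` file. The sketch below shows the general shape; the server name and launch command are hypothetical placeholders, not our actual package.

```json
{
  "mcpServers": {
    "chakra": {
      "command": "npx",
      "args": ["-y", "@chakra/mcp-server"]
    }
  }
}
```

Once registered, the client launches the server locally and the model can discover and call its tools from inside the existing chat UI.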
The Case for a Consolidated Chat Experience
A consolidated experience is a single chat platform with many integrations. MCP launched in November of last year and already has an impressive catalog of providers.
You can check out an ever-growing list of new MCP integrations on OpenTools.

Now, rather than paging through tens of chat applications, your Claude Desktop has access to all of your organization's critical data.
Why MCP + Chakra?
Chakra is on a mission to organize the world’s structured data. We’re building an open data marketplace on top of our data warehousing platform to allow users to share and monetize mission critical datasets.
The world is changing, and the consumption of those datasets won't live in ad hoc BI dashboards or one-off experiences. They'll live in chat. They'll be real-time, high-quality, and serve customer needs far faster than before.

Today, when users ask questions to an LLM client, our MCP server spins up locally. This server then initializes the connection with our data orchestrator and enriches the LLM with metadata on the user's datasets and any subscribed data assets. The LLM can then inspect those assets to understand if they help answer the user's questions. If they can, the LLM will traverse the data, write SQL, retrieve the data, and now incorporate those results into its local context.
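The flow above can be sketched in miniature. In this illustrative example the warehouse is mocked with an in-memory SQLite database, and the dataset name, columns, and helper functions are all hypothetical - the point is the two steps: the server surfaces metadata, then executes the SQL the model writes.

```python
import sqlite3

# Mock "data estate": an in-memory SQLite table standing in for a
# subscribed dataset. Names and values are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE daily_prices (ticker TEXT, close REAL)")
conn.executemany(
    "INSERT INTO daily_prices VALUES (?, ?)",
    [("AAPL", 241.5), ("MSFT", 404.0)],
)

def list_dataset_metadata():
    """Step 1: the server enriches the model with metadata on the
    user's datasets, so the model can decide which ones help."""
    return [{"name": "daily_prices", "columns": ["ticker", "close"]}]

def run_sql(sql):
    """Step 2: the model writes SQL; the server executes it and
    returns rows for the model to fold into its context."""
    return conn.execute(sql).fetchall()

metadata = list_dataset_metadata()
rows = run_sql("SELECT ticker, close FROM daily_prices ORDER BY ticker")
print(metadata, rows)
```

In the real integration these handlers sit behind MCP tool calls, and the queries run against the user's warehouse rather than a local mock.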
Even cooler - the LLM can actually create new data assets in the user's data estate. We'll share some exciting demos of this in the next few weeks.
The future of software caters to developers (DX) and agents (AX) alike. MCP allows us to bring our data closer to both - no SQL required, batteries fully included.
Check it out!
Try out our MCP integration and give us feedback!
Start asking Claude about financial data, and feel free to suggest new datasets for us to add to our marketplace.
This is just the start - we're continuing to refine MCP performance and the experience will only improve as we work with enterprises to improve our open data marketplace.
Disclaimers
- MCP is extremely early. The experience in Claude Desktop is suboptimal - every time you use the server, you have to grant access explicitly. This is a design decision on Anthropic's part and is not yet configurable.
- Setup is rough around the edges. We have worked closely with the folks at OpenTools to make this as seamless as possible, but there is room for improvement. We are looking forward to an MCP GUI experience in the future, but for now, users must use the command-line.
- Today, the server runs on the user's local machine. Anthropic's roadmap includes a hosted server option, which we will support. This will make authentication, setup, and performance much better.