Phonemos User Guide

Supported AI systems and troubleshooting

Supported AI interfaces

Phonemos supports the following interfaces.

Embeddings API (/v1/embeddings)
For semantic search, Phonemos needs an LLM that implements the Embeddings API in the OpenAI request/response format. Most vendors adopt this format even if the underlying model differs.
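As a rough illustration, a request in the OpenAI embeddings format can be built as follows. The endpoint URL, model name, and API key below are placeholders, not values from Phonemos or any particular vendor:

```python
import json
import urllib.request

# Placeholder values -- substitute your LLM vendor's endpoint, model, and key.
BASE_URL = "https://api.example.com/v1"
API_KEY = "YOUR_API_KEY"

def build_embeddings_request(texts, model="example-embedding-model"):
    """Build a POST request in the OpenAI /v1/embeddings format."""
    payload = {"model": model, "input": texts}
    return urllib.request.Request(
        f"{BASE_URL}/embeddings",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )

# A conforming vendor answers with JSON of roughly this shape:
# {"object": "list",
#  "data": [{"object": "embedding", "index": 0, "embedding": [0.012, ...]}],
#  "model": "...",
#  "usage": {"prompt_tokens": 5, "total_tokens": 5}}
req = build_embeddings_request(["Where are the meeting notes stored?"])
print(req.full_url)  # https://api.example.com/v1/embeddings
```

Sending the request (for example with urllib.request.urlopen) returns one embedding vector per input string, which a semantic-search layer then typically compares by cosine similarity.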

Chat Completions API (/v1/chat/completions)
For chatbot-like scenarios, Phonemos needs an LLM that implements the Chat Completions API. Almost all vendors now use this API.
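As a sketch, a request in the OpenAI chat-completions format looks like this. The endpoint, model name, and key are again placeholders rather than Phonemos-specific values:

```python
import json
import urllib.request

# Placeholder values -- substitute your LLM vendor's endpoint, model, and key.
BASE_URL = "https://api.example.com/v1"
API_KEY = "YOUR_API_KEY"

def build_chat_request(messages, model="example-chat-model"):
    """Build a POST request in the OpenAI /v1/chat/completions format."""
    payload = {"model": model, "messages": messages}
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )

messages = [
    {"role": "system", "content": "Answer questions using the provided context."},
    {"role": "user", "content": "How do I share a page with my team?"},
]
req = build_chat_request(messages)
# A conforming vendor returns the answer text under
# choices[0]["message"]["content"] in the response JSON.
print(req.full_url)  # https://api.example.com/v1/chat/completions
```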

Model Context Protocol (MCP)
For enterprise LLM integration scenarios, Phonemos comes with native support for Anthropic’s Model Context Protocol (MCP). With this interface, you can make information in Phonemos available to your enterprise LLM.
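As an illustrative sketch only: many MCP clients are configured with a JSON file that lists each MCP server by name. The server name, URL, and token below are hypothetical placeholders; consult your Phonemos administration documentation for the actual endpoint and authentication details.

```json
{
  "mcpServers": {
    "phonemos": {
      "url": "https://phonemos.example.com/mcp",
      "headers": {
        "Authorization": "Bearer YOUR_PHONEMOS_TOKEN"
      }
    }
  }
}
```

With such an entry in place, the enterprise LLM client can discover the tools and resources exposed by the Phonemos MCP server and query them during a conversation.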

AI systems with known interface support

Please note that not all of the above features are supported in every subscription plan of each product. Refer to your AI system's product documentation to see which subscription plan you need.

Our Phonemos cloud offering uses Mistral AI, so we have verified ourselves that Phonemos works with Mistral AI. All other information is provided to the best of our knowledge, but strictly without any guarantees.

The following information is based on vendor documentation and user feedback and is provided for your convenience. Please note that we are unable to verify its correctness or freshness. If you have updates or experiences with Phonemos that you would like to share with other Phonemos customers, please notify us. Because we rely on third-party reports, we usually cannot say how this information was gathered, and it may be outdated.

AI systems with full API & MCP client support

AI system | Embeddings API | Chat Completions API | Model Context Protocol
Mistral AI (recommended) (1) | ✓ | ✓ | ✓
OpenAI (ChatGPT) (2) | ✓ | ✓ | ✓
Google Gemini (3) | ✓ | ✓ | ✓
Microsoft Copilot (4) | ✓ | ✓ | ✓
xAI (Grok) (5) | ✓ | ✓ | ✓
Llama (Meta/Stack) (6) | ✓ | ✓ | ✓
Amazon Bedrock (7) | ✓ | ✓ | ✓

AI systems with partial interface support

AI system | Embeddings API | Chat Completions API | Model Context Protocol
Anthropic (Claude 3.5/3.7) | 🚫 | ✓ | ✓
Cohere | ✓ | ✓ | 🚫
Groq | 🚫 | ✓ | ✓
DeepSeek | 🚫 | ✓ | ✓
Perplexity | 🚫 | ✓ | 🚫
Ollama (Local) | ✓ | ✓ | community (8)

Notes:

  1. Mistral AI: supported via Mistral Agent; our recommendation, as we work with Mistral AI ourselves and have verified this integration

  2. OpenAI: supported via “Developer Mode” connectors

  3. Google Gemini: supported via Gemini CLI & Vertex AI agents; gemini.google.com support not yet live

  4. Microsoft Copilot: supported via the Copilot Studio MCP wizard (see Microsoft’s guide)

  5. xAI (Grok): supported via Grok API & client tools

  6. Llama (Meta/Stack): supported via Llama Stack Client

  7. Amazon Bedrock: not an end user client, but good for building custom, scalable applications or “headless” agents

  8. Ollama: works in local use, but requires a middleman (like Open WebUI) to connect to Phonemos

Troubleshooting

The AI market is very dynamic and things change fast. Please note that we are unable to provide help with the configuration of every single AI system out there. We work with Mistral AI in our cloud environment.

The respective LLM vendors, or your software integrator of choice, are usually the most competent first line of support and can provide documentation and help for your specific environment.