AI / ML · Python

mcp-client-for-ollama

A text-based user interface client for interacting with MCP servers using Ollama.

Free
N/A rating (0 reviews) · 0 installs · 588 GitHub stars

MCP · Ollama · TUI · LLM · Python

The MCP Client for Ollama is a Python client for interacting with Model Context Protocol (MCP) servers using local large language models (LLMs) served by Ollama. Core features include an agent mode, multi-server support, model switching, streaming responses, and tool management for integrating MCP tools into a workflow.

Beyond the core functionality, the client offers human-in-the-loop confirmation, a thinking mode, and customizable model parameters. Users can configure MCP prompts, set custom system prompts, and save preferences between sessions for a personalized setup.

Compatible with

Claude Desktop · Cursor

Install

Claude Desktop

Add to your claude_desktop_config.json:

{
  "mcpServers": {
    "mcp-client-for-ollama": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-client-for-ollama"
      ]
    }
  }
}
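The listing also marks this entry as Cursor-compatible. Cursor reads the same `mcpServers` schema from its own config file — commonly `~/.cursor/mcp.json` for globally available servers (the path and schema reuse are assumptions here; confirm against Cursor's MCP documentation):

```json
{
  "mcpServers": {
    "mcp-client-for-ollama": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-client-for-ollama"
      ]
    }
  }
}
```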

Config File Location

Mac: ~/Library/Application Support/Claude/claude_desktop_config.json

Windows: %APPDATA%\Claude\claude_desktop_config.json

Linux: ~/.config/claude/claude_desktop_config.json

Some servers require additional setup - check the GitHub README for specific instructions.
