
uber-eats-mcp-server

This MCP server facilitates the integration of LLM applications with external tools using the Model Context Protocol.

Free
No rating (0 reviews) · 0 installs · 216 GitHub stars
Tags: MCP, Uber Eats, LLM

The Uber Eats MCP Server is a proof of concept showing how to build servers on top of the Uber Eats platform using the Model Context Protocol (MCP). MCP gives large language models (LLMs) a standard way to talk to external tools, so applications can invoke the server's capabilities without bespoke integration code. Following the setup instructions below gets the server running quickly, letting you experiment with LLM integrations in a controlled environment. With support for the Anthropic API and other LLM providers, the server is a flexible starting point for developers exploring LLM-driven applications.
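Under the hood, MCP messages are JSON-RPC 2.0. The sketch below shows the general shape of a `tools/call` request as defined by the MCP specification; the tool name `search_menu` and its arguments are hypothetical illustrations, not part of this server's documented API.

```python
import json

# Hypothetical MCP tool-call request (JSON-RPC 2.0 framing per the MCP spec).
# "search_menu" and its arguments are made-up examples for illustration only.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search_menu",
        "arguments": {"query": "pizza"},
    },
}

# Serialize for the wire, then decode as a receiving server would.
wire_message = json.dumps(request)
decoded = json.loads(wire_message)
print(decoded["method"])
```

Because the framing is plain JSON-RPC, any MCP-aware client (Claude Desktop, Cursor, or your own code) can drive the server the same way.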

Compatible with

Claude Desktop, Cursor

Install

Claude Desktop

Add to your claude_desktop_config.json:

{
  "mcpServers": {
    "uber-eats-mcp-server": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-uber-eats-mcp-server"
      ]
    }
  }
}
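Since the server works with the Anthropic API, you will likely need to supply an API key. Claude Desktop's config format accepts an `env` block per server for this; the variable name `ANTHROPIC_API_KEY` below is an assumption, so check the repository README for the exact name the server expects.

```json
{
  "mcpServers": {
    "uber-eats-mcp-server": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-uber-eats-mcp-server"
      ],
      "env": {
        "ANTHROPIC_API_KEY": "your-api-key-here"
      }
    }
  }
}
```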

Config File Location

Mac: ~/Library/Application Support/Claude/claude_desktop_config.json

Windows: %APPDATA%\Claude\claude_desktop_config.json

Linux: ~/.config/claude/claude_desktop_config.json

Some servers require additional setup; check the GitHub README for specific instructions.
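If you are editing the config programmatically, the per-OS locations above can be resolved like this. The helper below is an illustrative sketch, not part of the MCP server itself.

```python
import os
import sys
from pathlib import Path

def claude_config_path() -> Path:
    """Return the expected claude_desktop_config.json path for this OS.

    Follows the Mac/Windows/Linux locations listed above.
    """
    if sys.platform == "darwin":
        base = Path.home() / "Library" / "Application Support" / "Claude"
    elif sys.platform == "win32":
        base = Path(os.environ["APPDATA"]) / "Claude"
    else:
        base = Path.home() / ".config" / "claude"
    return base / "claude_desktop_config.json"

print(claude_config_path())
```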
