Implementing an LLM Agent with Tool Access Using MCP-Use


MCP-Use is an open-source library that lets you connect any LLM to any MCP server, giving your agents tool access such as web browsing, file operations, and more, all without relying on closed-source clients. In this tutorial, we'll use langchain-groq and MCP-Use's built-in conversation memory to build a simple chatbot that can interact with tools via MCP.

Installing uv package manager

We will first set up our environment, starting by installing the uv package manager. For Mac or Linux:

curl -LsSf https://astral.sh/uv/install.sh | sh

For Windows (PowerShell):

powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
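
Once uv is installed, you can optionally confirm that it is available on your PATH:

uv --version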

Creating a new directory and activating a virtual environment

We will then create a new project directory and initialize it with uv:

uv init mcp-use-demo
cd mcp-use-demo

We can now create and activate a virtual environment. For Mac or Linux:

uv venv
source .venv/bin/activate

For Windows:

uv venv
.venv\Scripts\activate

Installing Python dependencies

We will now install the required dependencies:

uv add mcp-use langchain-groq python-dotenv

Groq API Key

To use Groq's LLMs:

  1. Visit the Groq Console and generate an API key.
  2. Create a .env file in your project directory and add the following line:
GROQ_API_KEY=<YOUR_API_KEY>

 Replace <YOUR_API_KEY> with the key you just generated.
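
Optionally, before writing the main app, you can confirm that the key is picked up from .env. A minimal sketch using python-dotenv (one of the packages installed above; the file name check_env.py is just a hypothetical helper, not part of the app we build below):

# check_env.py - optional sanity check
from dotenv import load_dotenv
import os

load_dotenv()  # reads the .env file in the current directory
print("GROQ_API_KEY found:", bool(os.getenv("GROQ_API_KEY")))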

Brave Search API Key

This tutorial uses the Brave Search MCP Server.

  1. Get your Brave Search API key from: Brave Search API
  2. Create a file named mcp.json in the project root with the following content:
{ "mcpServers": { "brave-search": { "command": "npx", "args": [ "-y", "@modelcontextprotocol/server-brave-search" ], "env": { "BRAVE_API_KEY": "<YOUR_BRAVE_SEARCH_API>" } } } }

Replace <YOUR_BRAVE_SEARCH_API> with your actual Brave API key.

Node.js

Some MCP servers (including Brave Search) require npx, which comes with Node.js.

  • Download the latest version of Node.js from nodejs.org
  • Run the installer.
  • Leave all settings as default and complete the installation (a quick verification step follows below).
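
After installing Node.js, you can verify that node and npx are available:

node --version
npx --version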

Using different servers

If you'd like to use a different MCP server, simply swap the contents of mcp.json with the configuration for that server.
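
For example, a sketch of an mcp.json pointing at the reference filesystem server instead (assuming the @modelcontextprotocol/server-filesystem package; the directory path is a placeholder you would replace with your own) might look like this:

{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/allowed/dir"]
    }
  }
}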

Create an app.py file in the directory and add the following content:

Importing the libraries

from dotenv import load_dotenv
from langchain_groq import ChatGroq
from mcp_use import MCPAgent, MCPClient
import os
import sys
import warnings

warnings.filterwarnings("ignore", category=ResourceWarning)

This section loads environment variables and imports the required modules for LangChain, MCP-Use, and Groq. It also suppresses ResourceWarning for cleaner output.

Setting up the chatbot

async def run_chatbot():
    """Run a chat using MCPAgent's built-in conversation memory."""
    load_dotenv()
    os.environ["GROQ_API_KEY"] = os.getenv("GROQ_API_KEY")
    configFile = "mcp.json"
    print("Starting chatbot...")

    # Creating MCP client and LLM instance
    client = MCPClient.from_config_file(configFile)
    llm = ChatGroq(model="llama-3.1-8b-instant")

    # Creating an agent with memory enabled
    agent = MCPAgent(
        llm=llm,
        client=client,
        max_steps=15,
        memory_enabled=True,
        verbose=False
    )

This section loads the Groq API key from the .env file and initializes the MCP client using the configuration provided in mcp.json. It then sets up the LangChain Groq LLM and creates a memory-enabled agent to handle conversations.

Implementing the chatbot

    # Add this in the run_chatbot function
    print("\n-----Interactive MCP Chat----")
    print("Type 'exit' or 'quit' to end the conversation")
    print("Type 'clear' to clear conversation history")
    try:
        while True:
            user_input = input("\nYou: ")
            if user_input.lower() in ["exit", "quit"]:
                print("Ending conversation....")
                break
            if user_input.lower() == "clear":
                agent.clear_conversation_history()
                print("Conversation history cleared....")
                continue
            print("\nAssistant: ", end="", flush=True)
            try:
                response = await agent.run(user_input)
                print(response)
            except Exception as e:
                print(f"\nError: {e}")
    finally:
        if client and client.sessions:
            await client.close_all_sessions()

This section enables interactive chatting, allowing the user to enter queries and receive responses from the assistant. It also supports clearing the chat history when requested. The assistant's responses are displayed in real time, and the code ensures that all MCP sessions are closed cleanly when the conversation ends or is interrupted.

Running the app

if __name__ == "__main__":
    import asyncio
    try:
        asyncio.run(run_chatbot())
    except KeyboardInterrupt:
        print("Session interrupted. Goodbye!")
    finally:
        sys.stderr = open(os.devnull, "w")

This section runs the asynchronous chatbot loop, managing continuous interaction with the user. It also handles keyboard interrupts gracefully, ensuring the program exits without errors when the user terminates the session.

You can find the complete code here.

To run the app, run the following command:
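
Assuming the uv-managed project created earlier, this would typically be:

uv run app.py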


This will start the app, and you can interact with the chatbot and use the server for the session.

I am a Civil Engineering graduate (2022) from Jamia Millia Islamia, New Delhi, and I have a keen interest in Data Science, especially Neural Networks and their application in various areas.
