MCP Client
A TypeScript client library for interacting with Model Context Protocol (MCP) services, providing a production-ready interface for AI conversations with tool support.
Installation
npm install https://git.everydayseries.io/kroy665/client-mcp.git
Features
- TypeScript support out of the box
- Streaming chat completions
- Tool and function calling support
- Automatic conversation management
- Configurable connection to MCP servers
- Debug logging
- Request cancellation support
Quick Start
import { ClientMCP } from 'client-mcp';

// Create a new client instance
const client = new ClientMCP({
  apiKey: 'your-api-key',
  model: 'gemini-2.0-flash',
  debug: true
});

// Connect to the MCP server
await client.connectToServer('http://localhost:3003/mcp');

// Stream chat responses
for await (const chunk of client.chat("Hello, how are you?")) {
  console.log(chunk.choices[0]?.delta?.content);
}

// Don't forget to clean up
await client.disconnect();
API Reference
new ClientMCP(config: ClientMCPConfig)
Creates a new MCP client instance.
Parameters
- config (Object): Configuration object
  - apiKey (string): Your API key for authentication
  - model (string): The model to use (default: "gemini-2.0-flash")
  - baseUrl (string): Base URL for the API
  - timeout (number): Request timeout in milliseconds (default: 30000)
  - debug (boolean): Enable debug logging (default: false)
  - systemMessages (string): Custom system messages (optional)
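Putting the options together, a fully specified configuration might look like this (reading the key from an environment variable is illustrative, not required):

const client = new ClientMCP({
  apiKey: process.env.GEMINI_API_KEY!,  // your API key (placeholder)
  model: 'gemini-2.0-flash',            // the default model
  baseUrl: 'https://generativelanguage.googleapis.com/v1beta/openai/',
  timeout: 30000,                       // 30 s request timeout (default)
  debug: false,
  systemMessages: 'You are a helpful assistant.'
});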
Methods
connectToServer(serverPath: string | URL, sessionId?: string): Promise<void>
Establishes connection to MCP server.
- serverPath: URL or string path to the MCP server
- sessionId: Optional session ID for reconnection
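For example, a client can generate its own session ID up front so that it can reconnect to the same session later (the ID handling below is illustrative):

import { randomUUID } from 'crypto';

const sessionId = randomUUID();
await client.connectToServer('http://localhost:3003/mcp', sessionId);

// Later, reuse the same ID to reconnect to the existing session
await client.connectToServer('http://localhost:3003/mcp', sessionId);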
disconnect(): Promise<void>
Disconnects from the MCP server and cleans up resources.
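A common pattern is to pair connectToServer with disconnect in a try/finally block, so resources are released even if the conversation throws:

try {
  await client.connectToServer('http://localhost:3003/mcp');
  // ... chat with the client
} finally {
  await client.disconnect();
}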
chatCompletionStream(options: ChatCompletionStreamOptions): AsyncGenerator<ChatCompletionChunk>
Performs streaming chat completion.
const stream = client.chatCompletionStream({
  messages: [{ role: 'user', content: 'Hello!' }],
  reasoningEffort: 'high',
  tools: [...]
});

for await (const chunk of stream) {
  console.log(chunk.choices[0]?.delta?.content);
}
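Since the client targets an OpenAI-compatible endpoint (see the baseUrl in Examples), the tools array presumably follows the OpenAI function-calling schema; the getWeather definition below is a hypothetical example, not part of the library:

const tools = [
  {
    type: 'function' as const,
    function: {
      name: 'getWeather',  // hypothetical tool name
      description: 'Get the current weather for a city',
      parameters: {
        type: 'object',
        properties: {
          city: { type: 'string', description: 'City name' }
        },
        required: ['city']
      }
    }
  }
];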
chat(content: string, options?: ChatOptions): AsyncGenerator<ChatChunk>
Main chat interface with automatic tool handling and conversation management.
for await (const chunk of client.chat("What's the weather like?", {
  maxDepth: 3,
  autoSummarize: true
})) {
  process.stdout.write(chunk.choices[0]?.delta?.content || '');
}
cancelRequests(): void
Cancels all ongoing requests.
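One way to use this is as a client-side deadline. The 10-second timeout below is illustrative; how cancellation surfaces in the stream (an early end or a thrown error) depends on the client version:

// Cancel any in-flight requests after 10 seconds (illustrative deadline)
const timer = setTimeout(() => client.cancelRequests(), 10_000);

for await (const chunk of client.chat('Summarize this long document...')) {
  process.stdout.write(chunk.choices[0]?.delta?.content || '');
}
clearTimeout(timer);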
Examples
import { ClientMCP } from 'client-mcp';
import { v4 as uuidv4 } from 'uuid';
import * as readline from 'readline';

const serverId = uuidv4();

async function main() {
  const mcpClient = new ClientMCP({
    apiKey: process.env.GEMINI_API_KEY!,  // never hardcode API keys
    model: "gemini-2.0-flash",
    baseUrl: "https://generativelanguage.googleapis.com/v1beta/openai/",
    debug: true
  });

  try {
    // Build the server URL with connection parameters
    const serverPath = new URL("/mcp", "http://localhost:3003");
    serverPath.searchParams.set("email", "user@example.com");
    serverPath.searchParams.set("teamSlug", "mcp");
    serverPath.searchParams.set("apiKey", process.env.MCP_API_KEY!);

    await mcpClient.connectToServer(serverPath, serverId);
    await chatLoop(mcpClient);
  } finally {
    await mcpClient.disconnect();
    console.log("MCP Client Closed!");
    process.exit(0);
  }
}

// Read questions from stdin and stream answers until the user types "quit"
async function chatLoop(mcpClient: ClientMCP) {
  while (true) {
    const userMessage = await new Promise<string>((resolve) => {
      const rl = readline.createInterface({
        input: process.stdin,
        output: process.stdout
      });
      rl.question('Q: ', (message: string) => {
        rl.close();
        resolve(message);
      });
    });

    if (userMessage.trim().toLowerCase() === 'quit') {
      console.log('Goodbye!');
      return;
    }

    for await (const chunk of mcpClient.chat(userMessage)) {
      process.stdout.write(chunk.choices[0]?.delta?.content || '');
    }
    process.stdout.write('\n');
  }
}

main();
Development
- Clone the repository
- Install dependencies:
npm install
- Build the project:
npm run build
- Run tests:
npm test
License
MIT
Notes
- The client automatically manages conversation state and tool calls
- Supports both HTTP and WebSocket transports
- Includes comprehensive error handling and logging
- Safe for concurrent usage (multiple in-flight requests)
- Memory efficient with streaming support