readme updated

Koushik Roy 2025-06-16 16:36:54 +05:30
parent 445f29025f
commit 89ff981a0f

README.md (130 changed lines)

@@ -1,70 +1,108 @@
-# Gemini MCP Client
-A TypeScript client library for interacting with Gemini MCP (Multi-Cloud Platform) services.
+# MCP Client
+A TypeScript client library for interacting with Model Context Protocol (MCP) services, providing a production-ready interface for AI conversations with tool support.
## Installation
```bash
-npm install gemini-mcp
+npm install https://git.everydayseries.io/kroy665/client-mcp.git
```
## Features
-- TypeScript support out of the box
-- Promise-based API
-- Comprehensive type definitions
-- Easy integration with Node.js applications
+- Streaming chat completions
+- Tool and function calling support
+- Automatic conversation management
+- Configurable connection to MCP servers
+- Debug logging
+- Request cancellation support
-## Usage
+## Quick Start
```typescript
-import { ClientMCP } from 'gemini-mcp';
+import { ClientMCP } from 'mcp-client';
-// Create a new instance
+// Create a new client instance
const client = new ClientMCP({
  apiKey: 'your-api-key',
-  baseUrl: 'https://api.gemini-mcp.com/v1'
+  model: 'gemini-2.0-flash',
+  debug: true
});
-// Use the client
-async function getBlogs() {
-  try {
-    const blogs = await client.getBlogs();
-    console.log(blogs);
-  } catch (error) {
-    console.error('Error fetching blogs:', error);
-  }
-}
-getBlogs();
+// Connect to the MCP server
+await client.connectToServer('http://localhost:3003/mcp');
+// Stream chat responses
+for await (const chunk of client.chat("Hello, how are you?")) {
+  console.log(chunk.choices[0]?.delta?.content);
+}
+// Don't forget to clean up
+await client.disconnect();
```
## API Reference
### `new ClientMCP(config: ClientMCPConfig)`
-Creates a new Gemini MCP client instance.
+Creates a new MCP client instance.
#### Parameters
- `config` (Object): Configuration object
-- `apiKey` (string): Your Gemini MCP API key
-- `baseUrl` (string): Base URL for the API (default: 'https://api.gemini-mcp.com/v1')
+- `apiKey` (string): Your API key for authentication
+- `model` (string): The model to use (default: "gemini-2.0-flash")
+- `baseUrl` (string): Base URL for the API
+- `timeout` (number): Request timeout in milliseconds (default: 30000)
+- `debug` (boolean): Enable debug logging (default: false)
+- `systemMessages` (string): Custom system messages (optional)
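For reference, a minimal sketch of a client configured with every option listed above; the values are placeholders and only the option names come from this parameter list:

```typescript
import { ClientMCP } from 'mcp-client';

// All values below are illustrative placeholders.
const client = new ClientMCP({
  apiKey: process.env.MCP_API_KEY ?? '',          // your API key
  model: 'gemini-2.0-flash',                      // documented default
  baseUrl: 'http://localhost:3003',               // assumed server base URL
  timeout: 30000,                                 // documented default (30 s)
  debug: false,                                   // enable for verbose logging
  systemMessages: 'You are a helpful assistant.', // optional system prompt
});
```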
### Methods
-#### `getBlogs(): Promise<Blog[]>`
-Fetches all blogs for the authenticated user.
+#### `connectToServer(serverPath: string | URL, sessionId?: string): Promise<void>`
+Establishes a connection to the MCP server.
+- `serverPath`: URL or string path to the MCP server
+- `sessionId`: Optional session ID for reconnection
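For example, reconnecting to the same server with a previously captured session ID might look like the following sketch; the URL and `savedSessionId` are placeholders:

```typescript
// Resume an earlier session instead of starting a new one (illustrative only).
const savedSessionId = 'session-abc123';
await client.connectToServer('http://localhost:3003/mcp', savedSessionId);
```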
-#### `getBlog(id: string): Promise<Blog>`
-Fetches a specific blog by ID.
-#### `createBlog(blog: BlogCreate): Promise<Blog>`
-Creates a new blog.
+#### `disconnect(): Promise<void>`
+Disconnects from the MCP server and cleans up resources.
+#### `chatCompletionStream(options: ChatCompletionStreamOptions): AsyncGenerator<ChatCompletionChunk>`
+Performs streaming chat completion.
+```typescript
+const stream = client.chatCompletionStream({
+  messages: [{ role: 'user', content: 'Hello!' }],
+  reasoningEffort: 'high',
+  tools: [...]
+});
+for await (const chunk of stream) {
+  console.log(chunk.choices[0]?.delta?.content);
+}
+```
+#### `chat(content: string, options?: ChatOptions): AsyncGenerator<ChatChunk>`
+Main chat interface with automatic tool handling and conversation management.
+```typescript
+for await (const chunk of client.chat("What's the weather like?", {
+  maxDepth: 3,
+  autoSummarize: true
+})) {
+  process.stdout.write(chunk.choices[0]?.delta?.content || '');
+}
+```
+#### `cancelRequests(): void`
+Cancels all ongoing requests.
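One possible use is aborting a slow stream after a deadline; a rough sketch, with an arbitrary prompt and 10-second timeout:

```typescript
// Cancel all in-flight requests if the response takes longer than 10 seconds.
const timer = setTimeout(() => client.cancelRequests(), 10_000);
try {
  for await (const chunk of client.chat('Summarize this repository')) {
    process.stdout.write(chunk.choices[0]?.delta?.content ?? '');
  }
} finally {
  clearTimeout(timer);
}
```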
## Development
@@ -86,34 +124,10 @@ Creates a new blog.
MIT
## Notes
-TASK LIST
-npm version control
-add memory in it
-- function that can add data in memory
-- function that can get data from memory dynamically using AI
-- update existing data in memory
-add tools search capabilities
-- function that can search tools dynamically using AI
-- function that can add tools dynamically using AI
-- save the tool usage in memory
-auto update prompts
-- prompts will be updated if the user does not specify the result in the response.
-dynamically change the prompt when the user queries.
-- when the user queries, it will determine what prompts to use as the system prompt and other prompts
-add task listing capabilities in it
-- function that can list all the tasks and solve them
-Add branch out task capabilities in it (Advanced feature)
+- The client automatically manages conversation state and tool calls
+- Supports both HTTP and WebSocket transports
+- Includes comprehensive error handling and logging
+- Thread-safe for concurrent usage
+- Memory efficient with streaming support
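Putting these notes together, an end-to-end session with error handling and cleanup could be sketched as follows; the server URL, prompt, and option values are placeholders, and only the method names come from the API reference above:

```typescript
import { ClientMCP } from 'mcp-client';

async function main() {
  const client = new ClientMCP({ apiKey: process.env.MCP_API_KEY ?? '', debug: true });
  try {
    await client.connectToServer('http://localhost:3003/mcp');
    // Tool calls and conversation state are handled inside chat(), per the notes above.
    for await (const chunk of client.chat('List the tools you can use', { maxDepth: 3 })) {
      process.stdout.write(chunk.choices[0]?.delta?.content ?? '');
    }
  } catch (err) {
    console.error('MCP session failed:', err);
  } finally {
    await client.disconnect(); // always release the connection
  }
}

main();
```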