Documentation Index
Fetch the complete documentation index at: https://docs.agentfront.dev/llms.txt
Use this file to discover all available pages before exploring further.
The @frontmcp/react/ai entry point bridges FrontMCP tools to popular AI SDK formats. Tools are automatically formatted for the target platform and results are converted back.
useAITools
The core hook. It formats MCP tools for the specified platform.
import { useAITools } from '@frontmcp/react/ai';

function AIChat() {
  const { tools, callTool, loading, error } = useAITools('openai');
  // `tools` is formatted for the OpenAI SDK
  // `callTool(name, args)` calls the MCP tool and formats the result
}
Parameters
| Parameter | Type | Description |
|---|---|---|
| platform | `LLMPlatform` | One of `'openai'`, `'claude'`, `'vercel-ai'`, `'langchain'` |
| options.server | `string` | Target a named server |
Return Value
| Field | Type | Description |
|---|---|---|
| tools | `PlatformToolsMap[P] \| null` | Tools formatted for the target platform |
| callTool | `(name, args) => Promise<FormattedToolResult>` | Execute a tool and format the result |
| loading | `boolean` | True while formatting tools |
| error | `Error \| null` | Formatting or execution error |
useTools
A higher-level hook that adds batch tool-call processing via processPlatformToolCalls.
import { useTools } from '@frontmcp/react/ai';

function ChatAgent() {
  const { tools, processToolCalls, loading } = useTools('openai');

  async function handleResponse(toolCalls) {
    // Process a batch of tool calls from the AI response
    const results = await processToolCalls(toolCalls);
    return results;
  }
}
Return Value
| Field | Type | Description |
|---|---|---|
| tools | `PlatformToolsMap[P] \| null` | Formatted tools |
| processToolCalls | `(calls) => Promise<PlatformToolCallsOutput[P]>` | Batch process tool calls |
| loading | `boolean` | Loading state |
| error | `Error \| null` | Error state |
createToolCallHandler
A non-React utility for vanilla JavaScript. It creates a handler that executes tools and formats their results.
import { createToolCallHandler } from '@frontmcp/react/ai';
const handler = createToolCallHandler(server, 'openai');
const result = await handler.callTool('search', { query: 'react hooks' });
OpenAI
import { useTools } from '@frontmcp/react/ai';
import OpenAI from 'openai';
function OpenAIChat() {
  const { tools, processToolCalls } = useTools('openai');

  // ⚠️ dangerouslyAllowBrowser exposes your API key in client-side code.
  // In production, proxy requests through your backend instead.
  const openai = new OpenAI({ apiKey: '...', dangerouslyAllowBrowser: true });

  async function chat(message: string) {
    const response = await openai.chat.completions.create({
      model: 'gpt-4o',
      messages: [{ role: 'user', content: message }],
      tools: tools ?? undefined,
    });

    const toolCalls = response.choices[0].message.tool_calls;
    if (toolCalls) {
      const results = await processToolCalls(toolCalls);
      // Continue the conversation with tool results...
    }
  }
}
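To continue the conversation after executing tools, the OpenAI Chat Completions API expects the assistant turn that requested the calls, followed by one `role: 'tool'` message per result, keyed by `tool_call_id`. A minimal sketch of assembling that follow-up message list (the `ToolResult` shape here is an assumption for illustration; check what `processToolCalls('openai')` actually returns in your version):

```typescript
// Hypothetical shapes for illustration; the real types come from the
// OpenAI SDK and from processToolCalls('openai').
type ToolCall = { id: string; function: { name: string; arguments: string } };
type ToolResult = { tool_call_id: string; content: string };

// Build the messages for the follow-up completion request: the
// assistant turn that requested the tools, then one `tool` message
// per result, matched by tool_call_id.
function buildToolResultMessages(
  assistantMessage: { role: 'assistant'; tool_calls: ToolCall[] },
  results: ToolResult[],
) {
  return [
    assistantMessage,
    ...results.map((r) => ({
      role: 'tool' as const,
      tool_call_id: r.tool_call_id,
      content: r.content,
    })),
  ];
}
```

These messages are appended to the prior conversation and sent in a second `chat.completions.create` call so the model can incorporate the tool output.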
Vercel AI SDK
import { useTools } from '@frontmcp/react/ai';
import { useChat } from 'ai/react';
function VercelAIChat() {
  const { tools, processToolCalls } = useTools('vercel-ai');

  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: '/api/chat',
    // Pass tools to the chat configuration
  });
}
Claude (Anthropic)
import { useTools } from '@frontmcp/react/ai';
function ClaudeChat() {
  const { tools, processToolCalls } = useTools('claude');
  // `tools` is formatted as Claude tool definitions
  // Each tool has: { name, description, input_schema }
}
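On the response side, Claude returns tool calls as `tool_use` content blocks (per the Anthropic Messages API). A small sketch of filtering them out of a response before handing them to `processToolCalls`; the block shapes below mirror the Anthropic documentation, not a FrontMCP type:

```typescript
// Content-block shapes as documented by the Anthropic Messages API.
type TextBlock = { type: 'text'; text: string };
type ToolUseBlock = { type: 'tool_use'; id: string; name: string; input: unknown };
type ContentBlock = TextBlock | ToolUseBlock;

// Pull the tool_use blocks out of a Claude response's content array
// so they can be passed to processToolCalls('claude').
function extractToolUses(content: ContentBlock[]): ToolUseBlock[] {
  return content.filter((b): b is ToolUseBlock => b.type === 'tool_use');
}
```

The corresponding results would go back to Claude as `tool_result` blocks in a user message, matched by `tool_use_id`.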
Target tools from a specific server:
const { tools: analyticsTools } = useAITools('openai', { server: 'analytics' });
const { tools: mainTools } = useAITools('openai'); // defaults to primary server
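When combining hooks like this, tools from several servers can be merged before they are handed to the model. Assuming the `'openai'` platform formats tools as an array (worth verifying against `PlatformToolsMap`), a null-safe concat works even while either hook is still loading:

```typescript
// Null-safe merge of two formatted tool lists; either argument may
// still be null while its hook is loading.
function mergeTools<T>(a: T[] | null, b: T[] | null): T[] {
  return [...(a ?? []), ...(b ?? [])];
}

// e.g. const allTools = mergeTools(mainTools, analyticsTools);
```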