useChat
useChat(options?: object): UseChatResult
Defined in: src/expo/useChat.ts:146
A React hook for managing chat completions with authentication.
React Native version - This is a lightweight version that only supports API-based chat completions. Local chat and client-side tools are not available in React Native.
Parameters

| Parameter | Type | Description |
|---|---|---|
| `options?` | `object` | Optional configuration object. |
| | | Which API endpoint to use. Default: `"responses"`. |
| | | Optional base URL for the API requests. |
| `options.getToken` | | An async function that returns an authentication token. |
| | | Callback function called when a new data chunk is received. |
| `options.onError` | | Callback function called when an unexpected error is encountered. Note: this callback is NOT called for aborted requests (via `stop()`). |
| `options.onFinish` | | Callback function called when the chat completion finishes successfully. Receives the raw API response, in either Responses API or Completions API format. |
| | | Callback function called when a server-side tool (MCP) is invoked during streaming. Use this to show activity indicators like "Searching…" in the UI. |
| | | Callback function called when thinking/reasoning content is received. Called with delta chunks as the model "thinks" through a problem. |
| | | Callback function called when a tool call is requested by the LLM. Called for tools that don't have an executor or have `autoExecute=false`; the app should execute the tool and send the result back. |
| | | Controls adaptive output smoothing for streaming responses. Fast models can return text faster than is comfortable to read; smoothing buffers incoming chunks and releases them at a consistent, adaptive pace. |
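The smoothing behavior described above can be sketched as a buffer that releases text at a pace proportional to its backlog. This is an illustrative sketch only; the hook's actual algorithm and option shape are not shown in this reference:

```typescript
// Illustrative sketch of adaptive output smoothing: incoming chunks are
// buffered, and each tick releases a slice whose size grows with the
// backlog, so rendering stays readable without falling far behind.
class ChunkSmoother {
  private buffer = "";

  // Append a newly received streaming chunk to the backlog.
  push(chunk: string): void {
    this.buffer += chunk;
  }

  // Release the next slice. The base pace is `minChars` per tick; when
  // the backlog grows, release proportionally more to catch up smoothly.
  nextSlice(minChars = 3, catchUpRatio = 0.1): string {
    const n = Math.max(minChars, Math.ceil(this.buffer.length * catchUpRatio));
    const slice = this.buffer.slice(0, n);
    this.buffer = this.buffer.slice(n);
    return slice;
  }

  // Number of buffered characters not yet released.
  get pending(): number {
    return this.buffer.length;
  }
}
```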
Returns
UseChatResult
An object containing:
- `isLoading`: A boolean indicating whether a request is currently in progress
- `sendMessage`: An async function to send chat messages
- `stop`: A function to abort the current request
Example
```ts
const { isLoading, sendMessage, stop } = useChat({
  getToken: async () => await getAuthToken(),
  onFinish: (response) => console.log("Chat finished:", response),
  onError: (error) => console.error("Chat error:", error)
});

const handleSend = async () => {
  const result = await sendMessage({
    messages: [{ role: 'user', content: [{ type: 'text', text: 'Hello!' }] }],
    model: 'gpt-4o-mini'
  });
};
```
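For tools without an executor (or with `autoExecute=false`), the parameters above imply an app-side round trip: the app runs the tool itself and sends the result back. A minimal dispatcher might look like the sketch below; the `ToolCall` and result-message shapes here are assumptions for illustration, not the library's exact types:

```typescript
// Hypothetical shapes -- the actual types come from the library.
interface ToolCall {
  id: string;
  name: string;
  arguments: Record<string, unknown>;
}

type ToolExecutor = (args: Record<string, unknown>) => Promise<string>;

// Map tool names to app-side executors.
const executors: Record<string, ToolExecutor> = {
  get_time: async () => new Date().toISOString(),
};

// Run the requested tool and package the result as a message the app
// could pass back via sendMessage (the message shape is an assumption).
async function handleToolCall(call: ToolCall) {
  const executor = executors[call.name];
  const output = executor
    ? await executor(call.arguments)
    : `Unknown tool: ${call.name}`;
  return {
    role: "tool" as const,
    tool_call_id: call.id,
    content: [{ type: "text" as const, text: output }],
  };
}
```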