Prerequisites
Our blocks examples are built using the Vercel AI SDK, which provides powerful utilities for building AI-powered user interfaces. The examples use:
- useChat hook for client-side chat state management
- streamText for server-side streaming responses
- OpenAI's GPT-4o mini model through the @ai-sdk/openai provider
Client-Side Usage
import { useChat } from 'ai/react'

export function Chat() {
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: '/api/chat'
  })

  return (
    <div>
      {/* Your chat UI components (see the sketch below) */}
    </div>
  )
}
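As a rough sketch of what those chat UI components could look like (the markup below is only an illustration, not part of the shipped blocks), the hook's state can be wired to a message list and an input form:

import { useChat } from 'ai/react'

export function Chat() {
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: '/api/chat'
  })

  return (
    <div>
      {/* Render each message in the conversation */}
      {messages.map((message) => (
        <div key={message.id}>
          <strong>{message.role}:</strong> {message.content}
        </div>
      ))}

      {/* Submitting the form sends the current input to /api/chat */}
      <form onSubmit={handleSubmit}>
        <input
          value={input}
          onChange={handleInputChange}
          placeholder="Send a message..."
        />
      </form>
    </div>
  )
}

useChat keeps messages in sync as the streamed response arrives, so no extra state management is needed for the assistant's reply.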
Server-Side Implementation
import { openai } from "@ai-sdk/openai";
import { streamText } from "ai";

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: openai("gpt-4o-mini"),
    messages,
  });

  return result.toDataStreamResponse();
}
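In a Next.js App Router project this handler typically lives at app/api/chat/route.ts so its path matches the api: '/api/chat' option passed to useChat. The file location and the system prompt in the sketch below are assumptions for illustration, not requirements of the blocks; streamText simply accepts an optional system prompt alongside the messages:

// app/api/chat/route.ts (assumed location in a Next.js App Router project)
import { openai } from "@ai-sdk/openai";
import { streamText } from "ai";

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: openai("gpt-4o-mini"),
    // Optional: steer the assistant's behavior; this prompt is only an example.
    system: "You are a helpful assistant.",
    messages,
  });

  // Stream the response back in the format useChat expects.
  return result.toDataStreamResponse();
}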
Environment Setup
To use the OpenAI provider in our examples, you'll need an OpenAI API key. Create a .env file in your project root and add:
OPENAI_API_KEY=your_api_key_here
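The @ai-sdk/openai provider reads OPENAI_API_KEY from the environment by default, so no extra wiring is needed. If you prefer to configure the provider explicitly, a minimal sketch (assuming you want your own provider instance) looks like this:

import { createOpenAI } from "@ai-sdk/openai";

// Explicit provider configuration; when omitted, apiKey falls back to
// the OPENAI_API_KEY environment variable.
const openai = createOpenAI({
  apiKey: process.env.OPENAI_API_KEY,
});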
For more information about:
- useChat, see the Vercel AI SDK Chat Documentation
- streamText, see the Vercel AI SDK Text Streaming Documentation
- OpenAI provider setup, see the OpenAI Provider Documentation