OceanAI

Prompts

Prompts are instructions that you give a large language model (LLM) to tell it what to do. It's like when you ask someone for directions; the clearer your question, the better the directions you'll get.

Many LLM providers offer complex interfaces for specifying prompts. They involve different roles and message types. While these interfaces are powerful, they can be hard to use and understand.

In order to simplify prompting, the AI SDK supports text, message, and system prompts.


Text Prompts

Text prompts are strings. They are ideal for simple generation use cases, e.g. repeatedly generating content for variants of the same prompt text.

You can set text prompts using the prompt property made available by AI SDK functions like streamText or generateObject. You can structure the text in any way and inject variables, e.g. using a template literal.
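
Below is a minimal sketch of a text prompt with an injected variable. It assumes streamText is imported from the SDK's ai package and that yourModel is an already-configured model instance (model providers are covered in the next section); the trip variables are only for illustration.

```ts
import { streamText } from 'ai';

const destination = 'Lisbon';
const lengthOfStay = 3;

const result = streamText({
  // yourModel is a placeholder for any configured model instance
  model: yourModel,
  // the prompt is a plain string; a template literal injects the variables
  prompt:
    `I am planning a trip to ${destination} for ${lengthOfStay} days. ` +
    `Please suggest the best tourist activities for me to do.`,
});
```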

System Prompts

System prompts are the initial instructions you give a model to guide and constrain its behavior and responses. You can set system prompts using the system property. System prompts work with both the prompt and the messages properties.
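
Here is a sketch of a system prompt combined with a text prompt; the same system property can be used alongside messages instead. As before, yourModel and the destination variable are placeholders for illustration.

```ts
import { streamText } from 'ai';

const destination = 'Lisbon';

const result = streamText({
  model: yourModel, // placeholder for a configured model instance
  // the system prompt constrains how the model responds to every prompt
  system:
    'You help planning travel itineraries. ' +
    'Respond with a concise list of family-friendly suggestions.',
  prompt: `Suggest the best activities in ${destination}.`,
});
```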

Message Prompts

A message prompt is an array of user, assistant, and tool messages. Message prompts are great for chat interfaces and more complex, multi-modal prompts. You can use the messages property to set message prompts. Each message has a role and a content property. The content can either be text (for user and assistant messages) or an array of relevant parts (data) for that message type.
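
A sketch of a chat-style message prompt follows, again with a placeholder model; each message carries a role and a content property, and the content here is plain text.

```ts
import { streamText } from 'ai';

const result = streamText({
  model: yourModel, // placeholder for a configured model instance
  messages: [
    { role: 'user', content: 'Hi!' },
    { role: 'assistant', content: 'Hello, how can I help?' },
    { role: 'user', content: 'Where can I buy the best Currywurst in Berlin?' },
  ],
});
```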

In the next section, you will learn about the difference between model providers and models, and which ones are available in the AI SDK.