Overview
The Prompt Block is used to create a chat message, which is a string of text with an attached "type" indicating who sent the message (User, Assistant, System) and optionally an attached "name". The Prompt Block also provides the same interpolation capabilities as a Text Block, allowing you to dynamically insert values into the message. This block can also compute a token count for the generated chat message, which can be useful for things like switching the LLM used based on the size of a message. A useful pattern is to use the default message `{{input}}` to convert any text into a prompt message.
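The shape described above can be sketched in TypeScript. This is an illustrative model, not Rivet's internal schema: the field names (`type`, `message`, `name`) and the `interpolate` helper are assumptions for clarity, and the interpolation syntax is the same `{{variable}}` form used by a Text Block.

```typescript
// Hypothetical sketch of the chat message a Prompt Block produces.
// Field names are assumptions for illustration, not the exact internal schema.
type ChatMessageType = "system" | "user" | "assistant" | "function";

interface ChatMessage {
  type: ChatMessageType;
  message: string;
  name?: string; // optional attached "name"
}

// {{variable}} interpolation, as in a Text Block: each placeholder is
// replaced by the string value wired into the matching input port.
function interpolate(template: string, values: Record<string, string>): string {
  return template.replace(/\{\{(\w+)\}\}/g, (_, key) => values[key] ?? "");
}

const msg: ChatMessage = {
  type: "user",
  message: interpolate("Hello, {{name}}!", { name: "John Doe" }),
};
```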
- Inputs
- Outputs
- Editor Settings
Inputs
| Title | Data Type | Description | Default Value | Notes |
|---|---|---|---|---|
| Function Call | object | An optional input that can be used to attach a function call to the chat message. | (empty) | This input is only available if Enable Function Call is enabled. |
| Type | string | The type of the chat message. | (empty) | This input is only available if Use Type Input is enabled. The input will be coerced into a string if it is not a string. |
| Name | string | The name to attach to the chat message. | (empty) | This input is only available if Use Name Input is enabled. The input will be coerced into a string if it is not a string. |
| (custom names) | string | The values to be interpolated into the prompt text. The names of these inputs are dynamically generated based on the prompt text. | (empty) | The input will be coerced into a string if it is not a string. Each input creates a corresponding input port on the block. |
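The dynamically generated inputs in the last row can be modeled as follows. This is a minimal sketch of the behavior the table describes, assuming each unique `{{placeholder}}` in the prompt text becomes an input port and non-string values are coerced before interpolation; the function names here are hypothetical.

```typescript
// Derive input port names from the prompt text: each unique {{placeholder}}
// becomes one input on the block.
function extractInputNames(template: string): string[] {
  const names = new Set<string>();
  for (const match of template.matchAll(/\{\{(\w+)\}\}/g)) {
    names.add(match[1]);
  }
  return [...names];
}

// Coerce a non-string input into a string, as the Notes column describes.
// Using JSON.stringify here is an assumption about the coercion rule.
function coerceToString(value: unknown): string {
  return typeof value === "string" ? value : JSON.stringify(value);
}
```

For example, a prompt text of `Hi {{name}}, you are {{age}}` would yield two input ports, `name` and `age`.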
Example 1: Generate a chat message with interpolation
- Create a Prompt Block.
- Set the Type to `user`.
- Set the Prompt Text to `Hello, {{name}}!`.
- Create a Text Block and set the text to `John Doe`.
- Connect the Text Block to the `name` input of the Prompt Block.
- Run the flow. The Output of the Prompt Block should be a chat message with the type `user` and the message `Hello, John Doe!`.
Example 2: Convert an LLM response into an Assistant message
- Create a Prompt Block. Leave the content as the default `{{input}}`. Set the Type to `assistant`.
- Create a Chat Block and connect its Output to the `input` of the Prompt Block.
- Give the LLM a prompt text and run the flow. You should see the LLM response as an Assistant message in the Prompt Block.
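The pattern in Example 2 can be sketched in one line: because the default `{{input}}` template passes the incoming text through unchanged, the block effectively wraps the LLM response in an assistant-typed message. The function name below is hypothetical.

```typescript
// Sketch of the {{input}} pass-through pattern: wrap an LLM response
// string as an assistant chat message without modifying the text.
function toAssistantMessage(llmResponse: string): { type: "assistant"; message: string } {
  return { type: "assistant", message: llmResponse };
}
```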
Error Handling
The Prompt Block will error if the Prompt Text is not provided or if the Type is not one of the allowed types (`system`, `user`, `assistant`, `function`).
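The two error conditions above can be sketched as a validation step. This is an illustrative assumption of how the checks might look; the function name and error messages are hypothetical, not Rivet's actual errors.

```typescript
// The four message types the block accepts.
const ALLOWED_TYPES = ["system", "user", "assistant", "function"] as const;

// Reject a missing prompt text or an unknown message type,
// mirroring the two error conditions described above.
function validatePrompt(promptText: string | undefined, type: string): void {
  if (!promptText) {
    throw new Error("Prompt Text is not provided");
  }
  if (!(ALLOWED_TYPES as readonly string[]).includes(type)) {
    throw new Error(`Type must be one of: ${ALLOWED_TYPES.join(", ")}`);
  }
}
```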

