Overview

The Prompt Block creates a chat message: a string of text with an attached “type” indicating who sent the message (System, User, Assistant, or Function) and, optionally, an attached “name”. The Prompt Block provides the same interpolation capabilities as a Text Block, allowing you to dynamically insert values into the message. It can also compute a token count for the generated chat message, which is useful for things like switching the LLM used based on the size of a message. A useful pattern is to keep the default message {{input}}, which converts any incoming text into a prompt message.
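As a rough sketch of what the block produces, the output can be pictured as a small tagged record. The field and type names below are illustrative assumptions for this sketch, not Rivet's actual internal types:

```typescript
// Illustrative shape of the chat message a Prompt Block emits.
interface ChatMessage {
  type: "system" | "user" | "assistant" | "function"; // who sent the message
  message: string;                                     // prompt text after interpolation
  name?: string;                                       // optional attached name
}

// With the default prompt text "{{input}}", any incoming text becomes
// the entire message body.
function wrapAsMessage(input: string, type: ChatMessage["type"]): ChatMessage {
  return { type, message: "{{input}}".replace("{{input}}", input) };
}
```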

Inputs

| Title | Data Type | Description | Default Value | Notes |
| --- | --- | --- | --- | --- |
| Function Call | object | An optional input that can be used to attach a function call to the chat message. | (empty) | This input is only available if Enable Function Call is enabled. |
| Type | string | The type of the chat message. | (empty) | This input is only available if Use Type Input is enabled. The input will be coerced into a string if it is not a string. |
| Name | string | The name to attach to the chat message. | (empty) | This input is only available if Use Name Input is enabled. The input will be coerced into a string if it is not a string. |
| (custom names) | string | The values to be interpolated into the prompt text. The names of these inputs are dynamically generated based on the prompt text. | (empty) | The input will be coerced into a string if it is not a string. Each input creates a corresponding input port on the block. |

Example 1: Generate a chat message with interpolation

  1. Create a Prompt Block.
  2. Set the Type to user.
  3. Set the Prompt Text to Hello, {{name}}!.
  4. Create a Text Block and set the text to John Doe.
  5. Connect the Text Block to the name input of the Prompt Block.
  6. Run the flow. The Output of the Prompt Block should be a chat message with the type user and the message Hello, John Doe!.
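The interpolation in the steps above can be sketched in a few lines of TypeScript. The regex and function name here are illustrative, not Rivet's actual implementation:

```typescript
// Replace each {{placeholder}} in the prompt text with its supplied value.
function interpolate(promptText: string, values: Record<string, string>): string {
  return promptText.replace(/\{\{(\w+)\}\}/g, (_, key) => values[key] ?? "");
}

const message = interpolate("Hello, {{name}}!", { name: "John Doe" });
// message is "Hello, John Doe!", which the block attaches to a chat
// message with type "user"
```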

Example 2: Convert an LLM response into an Assistant message

  1. Create a Prompt Block. Leave the content as the default {{input}}. Set the Type to assistant.
  2. Create a Chat Block and connect its Output to the input of the Prompt Block.
  3. Give the Chat Block a prompt and run the flow. You should see the LLM's response as an Assistant message in the Prompt Block's output.
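The pattern in Example 2 amounts to wrapping the raw LLM response text as a new Assistant message, for instance so it can be appended to a conversation history. This is a hedged sketch; `ChatMessage` and its fields are illustrative names, not Rivet's data types:

```typescript
type ChatMessage = { type: string; message: string };

// Wrap a raw LLM response as an assistant message, as the Prompt Block
// does with the default "{{input}}" prompt text and Type set to assistant.
function toAssistantMessage(llmResponse: string): ChatMessage {
  return { type: "assistant", message: llmResponse };
}

// Typical use: append the wrapped response to a running conversation.
const history: ChatMessage[] = [{ type: "user", message: "Hello!" }];
history.push(toAssistantMessage("Hi, how can I help?"));
```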

Error Handling

The Prompt Block will error if the Prompt Text is not provided or if the Type is not one of the allowed types (system, user, assistant, function).
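The validation described above can be sketched as follows. The function name and error messages are illustrative, not the block's actual errors:

```typescript
const ALLOWED_TYPES = ["system", "user", "assistant", "function"];

// Throw if the inputs would cause the Prompt Block to error.
function validatePromptBlock(promptText: string | undefined, type: string): void {
  if (!promptText) {
    throw new Error("Prompt Text is not provided");
  }
  if (!ALLOWED_TYPES.includes(type)) {
    throw new Error(`Type must be one of: ${ALLOWED_TYPES.join(", ")}`);
  }
}
```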

See Also