Create Node

POST /workflows/v1/nodes
curl --request POST \
  --url https://api-qa.interactly.ai/workflows/v1/nodes \
  --header 'Authorization: Bearer <token>' \
  --header 'Content-Type: application/json' \
  --data '
{
  "logical_id": "<string>",
  "name": "<string>",
  "primary_category": "System",
  "secondary_category": "LLM",
  "description": "<string>",
  "is_start": false,
  "global_node_config": {
    "is_global": false,
    "condition": {
      "condition_freeform": "<string>",
      "condition_expression": "<string>",
      "args_schema": {},
      "static_messages_config": {
        "static_messages": [
          "<string>"
        ],
        "static_messages_selection_mode": "random"
      }
    }
  },
  "disabled": false,
  "workflow_id": "5eb7cf5a86d9755df3a6c593",
  "miscellaneous": {},
  "main_response_config": {
    "prompt": "<string>"
  },
  "llms_config": {
    "logical_id": "llm_2f4fa4bd-0369-41b3-9d84-e6ff03a7fed2",
    "provider": "default_provider",
    "streaming": false,
    "model_kwargs": {},
    "do_not_split_sentences": false,
    "type": "global_default_llm"
  },
  "tools_config": {
    "tools": [
      {
        "logical_id": "<string>",
        "tool_id": "<string>",
        "name": "<string>",
        "description": "<string>",
        "category": "<string>",
        "signature": "<string>",
        "args_schema": {},
        "static_messages_config": {
          "static_messages": [
            "<string>"
          ],
          "static_messages_selection_mode": "random"
        },
        "result_runtime_variable_name": "tool_result",
        "type": "inline_python",
        "code": "<string>"
      }
    ]
  },
  "self_loop": false,
  "wait_for_user_message": true,
  "max_consecutive_tool_calls": 1,
  "default_error_message": "I am sorry, there seems to be an issue. Could you please repeat?",
  "type": "worker_llm",
  "structured_output_schema": {},
  "backchannel_response_config": {
    "prompt": "<string>"
  }
}
'
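The same request can be issued programmatically. Below is a minimal Python sketch using only the standard library; the token is a placeholder, and the body shows only a few of the fields documented under Body (the node name and prompt are illustrative values, not API defaults):

```python
import json
import urllib.request

API_URL = "https://api-qa.interactly.ai/workflows/v1/nodes"

def build_create_node_request(token: str, body: dict) -> urllib.request.Request:
    """Build the POST request for the Create Node endpoint."""
    return urllib.request.Request(
        API_URL,
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

body = {
    "name": "greeting_node",  # hypothetical node name
    "workflow_id": "5eb7cf5a86d9755df3a6c593",
    "type": "worker_llm",
    "main_response_config": {"prompt": "Greet the caller politely."},
}
req = build_create_node_request("<token>", body)
# To send: urllib.request.urlopen(req)
```

Sending the request (via `urllib.request.urlopen(req)`) returns the created node JSON shown in the response example below.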
Example response:

{
  "node": {
    "team_id": "<string>",
    "created_by": "<string>",
    "updated_by": "<string>",
    "created_at": "2023-11-07T05:31:56Z",
    "updated_at": "2023-11-07T05:31:56Z",
    "_id": "5eb7cf5a86d9755df3a6c593",
    "node_config": {
      "logical_id": "<string>",
      "name": "<string>",
      "primary_category": "System",
      "secondary_category": "LLM",
      "description": "<string>",
      "is_start": false,
      "global_node_config": {
        "is_global": false,
        "condition": {
          "condition_freeform": "<string>",
          "condition_expression": "<string>",
          "args_schema": {},
          "static_messages_config": {
            "static_messages": [
              "<string>"
            ],
            "static_messages_selection_mode": "random"
          }
        }
      },
      "disabled": false,
      "workflow_id": "<string>",
      "miscellaneous": {},
      "main_response_config": {
        "prompt": "<string>"
      },
      "llms_config": {
        "logical_id": "llm_2f4fa4bd-0369-41b3-9d84-e6ff03a7fed2",
        "provider": "default_provider",
        "streaming": false,
        "model_kwargs": {},
        "do_not_split_sentences": false,
        "type": "global_default_llm"
      },
      "tools_config": {
        "tools": [
          {
            "logical_id": "<string>",
            "tool_id": "<string>",
            "name": "<string>",
            "description": "<string>",
            "category": "<string>",
            "signature": "<string>",
            "args_schema": {},
            "static_messages_config": {
              "static_messages": [
                "<string>"
              ],
              "static_messages_selection_mode": "random"
            },
            "result_runtime_variable_name": "tool_result",
            "type": "inline_python",
            "code": "<string>"
          }
        ]
      },
      "self_loop": false,
      "wait_for_user_message": true,
      "max_consecutive_tool_calls": 1,
      "default_error_message": "I am sorry, there seems to be an issue. Could you please repeat?",
      "type": "worker_llm",
      "structured_output_schema": {},
      "backchannel_response_config": {
        "prompt": "<string>"
      }
    }
  }
}

Authorizations

Authorization
string
header
required

Retrieve your API key from the API Keys section of the Dashboard.

Body

application/json
logical_id
string | null

Unique identifier for the node

name
string | null

Name of the node

primary_category
string | null
default:System

Primary category of the node

secondary_category
string | null
default:LLM

Secondary category of the node

description
string | null

Description of the node

is_start
boolean
default:false

Whether this node is the starting node of the workflow

global_node_config
GlobalNodeConfig · object

Configuration for when this node is a global node

disabled
boolean
default:false

If true, this node will be disabled and will not execute its function. Useful for testing workflows without executing node logic.

workflow_id
string | null

The ID of the workflow this node belongs to

Required string length: 24
Example:

"5eb7cf5a86d9755df3a6c593"
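The 24-character length requirement and the example value suggest a MongoDB-style ObjectId (24 hexadecimal characters); that is an assumption from the example, not something the API states. A quick client-side sanity check before sending:

```python
import re

def looks_like_workflow_id(value: str) -> bool:
    # Assumes the 24-character ID is hex, as in the example value.
    return re.fullmatch(r"[0-9a-fA-F]{24}", value) is not None
```

For instance, `looks_like_workflow_id("5eb7cf5a86d9755df3a6c593")` returns `True`, while a shorter or non-hex string returns `False`.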

miscellaneous
Miscellaneous Config · object

Miscellaneous config data that can be used by the node

main_response_config
PromptConfig · object

Main response configuration. Contains either an LLM system prompt or exact static messages

llms_config
Azure OpenAI · object

LLM or a group of LLMs to be used in this node

tools_config
Tools Configuration · object

List of tools available for this node

self_loop
boolean
default:false

Whether this node will execute again if not transitioned to another node

wait_for_user_message
boolean
default:true

Whether the node should wait for a user message before processing

max_consecutive_tool_calls
integer
default:1

Maximum number of consecutive tool calls allowed in a single node execution

default_error_message
string | null
default:I am sorry, there seems to be an issue. Could you please repeat?

Default error message to be returned if the LLM invocation fails

type
string
default:worker_llm

Type of the node. Must be 'worker_llm'

Allowed value: "worker_llm"
structured_output_schema
Structured Output Schema · object

Schema for the structured output of the worker node. Example:

{
  "name": "SearchQuery",
  "description": "A search query with justification",
  "input_schema": {
    "title": "AnswerWithJustification",
    "type": "object",
    "properties": {
      "search_query": {
        "title": "Search Query",
        "type": "string",
        "description": "The field where search query is stored"
      },
      "justification": {
        "title": "Justification",
        "type": "string",
        "description": "The field where justification string is stored"
      }
    },
    "required": ["search_query", "justification"]
  }
}
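The input_schema in the example follows JSON Schema conventions, so a structured output can at minimum be checked against its required list. A small sketch of that check (the helper name is ours, not part of the API):

```python
def missing_required_fields(schema: dict, output: dict) -> list:
    """Return the required input_schema fields absent from a structured output."""
    required = schema.get("input_schema", {}).get("required", [])
    return [field for field in required if field not in output]

# Schema mirroring the documented example (properties abbreviated).
search_schema = {
    "name": "SearchQuery",
    "input_schema": {
        "type": "object",
        "properties": {
            "search_query": {"type": "string"},
            "justification": {"type": "string"},
        },
        "required": ["search_query", "justification"],
    },
}
```

An output containing both `search_query` and `justification` yields an empty list; an output missing `justification` yields `["justification"]`.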

backchannel_response_config
PromptConfig · object

Backchannel response configuration. Contains either an LLM system prompt or exact static messages

Response

Successful Response

Response model for a single node. Contains a NodesModel object.

node
NodesModel · object
required

Single node object