
AI Tools

Tool nodes provide capabilities that AI Agents can invoke during reasoning. Each tool defines a schema that the LLM understands and uses to decide when and how to call the tool.

How Tools Work

  1. Connect Tool nodes to the AI Agent’s input-tools handle (diamond shape)
  2. Agent discovers connected tools and their schemas
  3. LLM decides when to call tools based on the user’s request
  4. Tool executes and returns results to the agent
  5. Agent continues reasoning with the tool’s output
[Calculator Tool] --+
                    +--(tools handle)--> [AI Agent] --> [Output]
[Web Search Tool] --+
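The loop above can be sketched in a few lines of Python. This is a minimal illustration, not the product's actual implementation; `run_agent`, the `llm` callable, and the message shapes are all hypothetical.

```python
# Minimal sketch of the agent/tool loop described above (all names hypothetical).
def run_agent(llm, tools, user_message):
    """tools maps tool name -> (schema, handler)."""
    messages = [{"role": "user", "content": user_message}]
    while True:
        # Step 2-3: the LLM sees the tool schemas and decides whether to call one.
        reply = llm(messages, schemas=[schema for schema, _ in tools.values()])
        if reply.get("tool_call") is None:
            return reply["content"]  # no tool needed: final answer
        name = reply["tool_call"]["name"]
        args = reply["tool_call"]["arguments"]
        _, handler = tools[name]
        result = handler(**args)  # step 4: tool executes
        # Step 5: feed the tool's output back so the agent can continue reasoning.
        messages.append({"role": "tool", "name": name, "content": result})
```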

Available Tools

| Tool | Description | Use Case |
| --- | --- | --- |
| Calculator | Math operations | Calculations, arithmetic |
| Current Time | Date/time with timezone | Scheduling, time-based logic |
| Web Search | Search the internet | Current information, research |
| Android Toolkit | Android device control | Device automation |
| AI Agent | Delegate to child agent | Complex sub-tasks (fire-and-forget) |
| Specialized Agents | Domain-specific delegation | Android, coding, web, task, social |

Calculator Tool

Performs mathematical operations including basic arithmetic and advanced functions.

Operations

| Operation | Description | Example |
| --- | --- | --- |
| add | Addition | 5 + 3 = 8 |
| subtract | Subtraction | 10 - 4 = 6 |
| multiply | Multiplication | 6 * 7 = 42 |
| divide | Division | 20 / 4 = 5 |
| power | Exponentiation | 2^8 = 256 |
| sqrt | Square root | sqrt(16) = 4 |
| mod | Modulo | 17 % 5 = 2 |
| abs | Absolute value | abs(-5) = 5 |

Parameters

- toolName (string, default: "calculator"): Tool identifier shown to the LLM
- toolDescription (string): Description of the tool's capabilities for the LLM

Schema (LLM View)

{
  "operation": "add | subtract | multiply | divide | power | sqrt | mod | abs",
  "a": "number (first operand)",
  "b": "number (second operand, optional for sqrt/abs)"
}

Example Interaction

User: "What's 15% of 230?"
Agent: [Calls calculator with multiply(230, 0.15)]
       "15% of 230 is 34.5"

Current Time Tool

Gets the current date and time with timezone support.

Parameters

- toolName (string, default: "currentTime"): Tool identifier shown to the LLM
- toolDescription (string): Description of the tool's capabilities
- defaultTimezone (string, default: "UTC"): Default timezone (e.g., "America/New_York", "Europe/London", "Asia/Tokyo")

Schema (LLM View)

{
  "timezone": "string (optional, defaults to UTC)"
}

Output

{
  "datetime": "2025-01-30T14:30:00-05:00",
  "date": "2025-01-30",
  "time": "14:30:00",
  "timezone": "America/New_York",
  "day_of_week": "Thursday",
  "unix_timestamp": 1738265400
}
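The output shape above can be produced with the standard library's `zoneinfo`. This is a sketch of the handler, assuming the field names shown; it is not the actual backend code.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Sketch of a currentTime handler producing the output shape shown above.
def current_time(timezone: str = "UTC") -> dict:
    now = datetime.now(ZoneInfo(timezone))
    return {
        "datetime": now.isoformat(timespec="seconds"),  # e.g. 2025-01-30T14:30:00-05:00
        "date": now.strftime("%Y-%m-%d"),
        "time": now.strftime("%H:%M:%S"),
        "timezone": timezone,
        "day_of_week": now.strftime("%A"),
        "unix_timestamp": int(now.timestamp()),
    }
```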

Example Interaction

User: "What time is it in Tokyo?"
Agent: [Calls currentTime with timezone="Asia/Tokyo"]
       "It's currently 4:30 AM on Friday in Tokyo."

Web Search Tool

Searches the internet using DuckDuckGo (free) or Serper API (requires API key).

Parameters

- toolName (string, default: "webSearch"): Tool identifier shown to the LLM
- toolDescription (string): Description of the tool's capabilities
- provider (select, default: "duckduckgo"): Search provider: "duckduckgo" (free) or "serper" (API key required)
- maxResults (number, default: 5): Maximum number of search results to return

Schema (LLM View)

{
  "query": "string (search query)",
  "max_results": "number (optional, 1-10)"
}

Output

{
  "results": [
    {
      "title": "Result Title",
      "url": "https://example.com/page",
      "snippet": "Brief description of the page..."
    }
  ],
  "query": "original search query"
}
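Whatever the provider returns, the node normalizes hits into the output shape above. A sketch of that normalization step, assuming raw hits carry `title`/`href`/`body` fields (the field names used by the duckduckgo-search library's text results; treat them as an assumption here):

```python
# Sketch: normalize raw provider hits into the tool's output shape.
def normalize_results(query: str, raw_hits: list[dict], max_results: int = 5) -> dict:
    results = [
        {
            "title": hit.get("title", ""),
            "url": hit.get("href", ""),
            "snippet": hit.get("body", ""),
        }
        for hit in raw_hits[:max_results]  # honor maxResults
    ]
    return {"results": results, "query": query}
```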

Example Interaction

User: "What are the latest developments in quantum computing?"
Agent: [Calls webSearch with query="quantum computing news 2025"]
       "Here are the latest developments in quantum computing:
        1. IBM announced a new 1000-qubit processor...
        2. Google achieved quantum error correction milestone..."

Provider Comparison

| Feature | DuckDuckGo | Serper |
| --- | --- | --- |
| Cost | Free | Paid (API key) |
| Rate Limits | Moderate | High |
| Result Quality | Good | Excellent |
| Setup | None | API key required |

Android Toolkit

Gateway tool that aggregates Android service nodes for device control.

Architecture

The Android Toolkit follows the Sub-Node pattern from n8n and the Toolkit pattern from LangChain:
[Battery Monitor] --+
[WiFi Automation] --+--(main input)--> [Android Toolkit] --(tools handle)--> [AI Agent]
[Location] ---------+
Connected Android service nodes define what the toolkit can do. The LLM sees a single android_device tool with capabilities based on connected services.

Parameters

- toolName (string, default: "android_device"): Tool identifier shown to the LLM
- toolDescription (string): Description of the toolkit's capabilities

Schema (Dynamic)

The schema is built dynamically based on connected Android service nodes:
{
  "service_id": "battery | wifi_automation | location | ...",
  "action": "status | enable | disable | get | set | ...",
  "parameters": {
    // Action-specific parameters
  }
}
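Dynamic schema assembly can be sketched as follows. The service registry and function names are hypothetical; the real toolkit reads capabilities from the connected nodes themselves.

```python
# Hypothetical registry of service -> supported actions.
SERVICE_ACTIONS = {
    "battery": ["status"],
    "wifi_automation": ["status", "enable", "disable", "scan"],
    "location": ["get"],
}

# Sketch: build the LLM-facing schema from the services actually connected.
def build_toolkit_schema(connected_services: list[str]) -> dict:
    services = [s for s in connected_services if s in SERVICE_ACTIONS]
    actions = sorted({a for s in services for a in SERVICE_ACTIONS[s]})
    return {
        "service_id": " | ".join(services),
        "action": " | ".join(actions),
        "parameters": {},  # action-specific parameters filled in per service
    }
```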

Connecting Services

  1. Add Android service nodes (Battery Monitor, WiFi Automation, etc.)
  2. Connect them to the Android Toolkit’s main input
  3. Connect the Toolkit to the AI Agent’s tools handle
Only connected services are available to the agent.

Tool Schema Editor

The Android Toolkit includes a schema editor for customizing how the LLM sees each service:
  1. Select the Android Toolkit node
  2. Open the Tool Schema Editor
  3. Select a connected service
  4. Customize the description, fields, and types
  5. Save changes

Example Interaction

User: "Check the battery level and turn on WiFi if it's above 50%"
Agent: [Calls android_device with service_id="battery", action="status"]
       Battery is at 72%
       [Calls android_device with service_id="wifi_automation", action="enable"]
       "Battery is at 72%, which is above 50%. I've enabled WiFi."

Available Services

When corresponding Android service nodes are connected:
| Service | Actions | Description |
| --- | --- | --- |
| battery | status | Battery level, charging state, health |
| wifi_automation | status, enable, disable, scan | WiFi control |
| bluetooth_automation | status, enable, disable, paired | Bluetooth control |
| location | get | GPS coordinates |
| app_launcher | launch | Launch apps by package name |
| app_list | list | Installed applications |
| audio_automation | get, set, mute, unmute | Volume control |
| camera_control | info, capture | Camera operations |
| motion_detection | get | Accelerometer/gyroscope data |
| environmental_sensors | get | Temperature, humidity, light |

Tool Execution Flow

When the AI Agent calls a tool:
  1. Status broadcast - executing_tool status sent to frontend
  2. Tool node highlighted - Shows cyan border and pulse animation
  3. Handler executed - Backend runs the tool’s handler function
  4. Result returned - Output sent back to the agent
  5. Agent continues - Incorporates result into response
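The flow above amounts to wrapping the handler call in status events. A sketch, where `broadcast` and the event payloads are assumptions rather than the backend's actual API:

```python
# Sketch: run a tool handler with status broadcasts around it.
def execute_tool(node_id, handler, args, broadcast):
    broadcast({"type": "executing_tool", "node_id": node_id})  # steps 1-2: frontend highlights node
    try:
        result = handler(**args)                               # step 3: run the handler
    finally:
        broadcast({"type": "tool_done", "node_id": node_id})   # highlight clears even on error
    return result                                              # step 4: result back to the agent
```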

Tool Output in Variables

Tool outputs are available as template variables:
{{calculatorTool.result}}
{{webSearchTool.results}}
{{androidTool.output}}
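Resolving `{{node.field}}` references amounts to a lookup into a per-node output store. A minimal sketch (the store layout and function name are hypothetical):

```python
import re

# Sketch: substitute {{node.field}} references from a dict of tool outputs.
def render_template(template: str, outputs: dict) -> str:
    def lookup(match):
        node, field = match.group(1).split(".", 1)
        # Unknown references are left untouched rather than erased.
        return str(outputs.get(node, {}).get(field, match.group(0)))
    return re.sub(r"\{\{\s*([\w.]+)\s*\}\}", lookup, template)
```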

Agent Delegation Tool

AI Agents and specialized agents can be connected as tools to parent agents, enabling hierarchical task delegation.

How It Works

  1. Connect any agent to a parent agent’s input-tools handle
  2. Parent calls delegate_to_<agent_type>(task, context)
  3. Child spawns as background task (fire-and-forget)
  4. Parent continues immediately without waiting

Schema

{
  "task": "string (required) - The task to delegate",
  "context": "string (optional) - Additional context"
}
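Fire-and-forget delegation maps naturally onto spawning a background task. A sketch using `asyncio`; `delegate` and `run_child_agent` are illustrative names, not the product's API:

```python
import asyncio

# Sketch: parent spawns the child agent and returns without awaiting it.
async def delegate(run_child_agent, task: str, context: str = "") -> str:
    asyncio.create_task(run_child_agent(task, context))  # child runs in the background
    return f"Delegated: {task}"                          # parent continues immediately
```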

Example

[AI Agent] <--tools-- [Coding Agent] <--tools-- [Python Executor]
Parent delegates: delegate_to_coding_agent(task="Write a function to parse JSON"). The child agent executes independently with its own connected tools.
Use agent delegation for complex sub-tasks that don’t need immediate results.

Creating Custom Tools

To add custom tools, see the AI Tool Node Guide in the repository. Key steps:
  1. Define node in client/src/nodeDefinitions/toolNodes.ts
  2. Add schema in server/services/ai.py
  3. Add handler in server/services/handlers/tools.py
  4. Update TOOL_NODE_TYPES array
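The steps above boil down to a schema the LLM sees plus a handler the backend runs. A hypothetical custom tool, shown only to illustrate the shape (the schema layout and registry are assumptions, not the repository's actual API):

```python
# Hypothetical custom tool: a word counter.
# Step 2 analog: the schema the LLM would see.
WORD_COUNT_SCHEMA = {
    "name": "wordCount",
    "description": "Count the words in a piece of text.",
    "parameters": {"text": "string (text to analyze)"},
}

# Step 3 analog: the handler the backend would run when the agent calls the tool.
def handle_word_count(text: str) -> dict:
    words = text.split()
    return {
        "word_count": len(words),
        "longest": max(words, key=len) if words else "",
    }
```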

Tips

- Connect only the tools the agent needs; too many tools can confuse the LLM.
- Provide a clear toolDescription to help the LLM understand when to use each tool.
- Use Web Search for current information not in the model's training data.
- The Android Toolkit exposes only the services you connect, so add only what you need.
- Tool nodes execute when the AI Agent calls them, not when the workflow runs. They're passive until invoked.