Agent Layer Node Setup Guide
Follow this guide to download the node bootstrap script, gather your credentials, and start your Agent Layer node locally.
Keep your manual.config.json private. It contains credentials required to run the node.
At a glance
Step 1
Download setup.sh
Run the bootstrap script from the folder where you downloaded it.
Step 2
Fill credentials
Provide wallet, RPC, API key, and model details when prompted.
Step 3
Start the server
The script launches bun dev and keeps the node running on port 8080.
Prerequisites
Before running the setup script, install these dependencies:
- Git — Download Git
- Bun v1.3.6 — Install with:
curl -fsSL https://bun.sh/install | bash -s "bun-v1.3.6"
Verify with:
bun --version
- Ollama — Download Ollama for local model execution
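Before running the bootstrap, it can help to confirm all three tools are on your PATH. The following pre-flight check is a sketch of mine, not part of setup.sh:

```shell
# Pre-flight check (illustrative): confirm the prerequisites above are installed.
have() { command -v "$1" >/dev/null 2>&1; }

for tool in git bun ollama; do
  if have "$tool"; then
    echo "$tool: found"
  else
    echo "$tool: missing (install it before continuing)"
  fi
done
```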
1. Download and run setup.sh
- Download setup.sh from agentlayerai.vercel.app
- Open a terminal in the folder where you downloaded it
- Make it executable and run it:
chmod +x setup.sh
./setup.sh
The script will:
- Clone the node_server into a local Agent_Layer/ directory
- Write default.config.json automatically
- Prompt you for the values that belong in manual.config.json
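The clone-and-write sequence above can be sketched roughly as follows. This is an illustration only: setup.sh does all of this for you, the repository URL is baked into the script, and the contents of default.config.json are an assumption (only the port is documented here):

```shell
# Rough sketch of what setup.sh does; the config body is a placeholder.
mkdir -p Agent_Layer
# git clone <node_server repository> Agent_Layer   # actual URL lives in setup.sh
cat > Agent_Layer/default.config.json <<'EOF'
{
  "PORT": 8080
}
EOF
echo "wrote Agent_Layer/default.config.json"
```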
2. Gather your credentials
Wallet
PRIVATE_KEY and PUBLIC_ADDRESS
Use a wallet on Base Mainnet, such as MetaMask. Your public address is the 0x... address shown in the wallet; your private key can be exported from the wallet's security settings.
Never share your private key. Keep manual.config.json out of version control.
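If your Agent_Layer/ directory ever becomes a git repository, one simple way to keep the credentials file out of version control is a .gitignore entry:

```shell
# Ignore the credentials file so it is never committed (illustrative).
echo "manual.config.json" >> .gitignore
```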
RPC
BASE_RPC_URL
Create a Base Mainnet RPC endpoint in Alchemy and copy the HTTPS URL into this field.
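You can sanity-check the RPC URL before pasting it into the config. The sketch below builds a standard JSON-RPC request body (the helper function name is mine); the commented-out curl call would send it to your endpoint. Base Mainnet's chain id is 8453 (hex 0x2105), so a healthy endpoint should return that value for eth_chainId:

```shell
# Build a minimal JSON-RPC payload for a given method (illustrative helper).
rpc_payload() {
  printf '{"jsonrpc":"2.0","id":1,"method":"%s","params":[]}' "$1"
}

# curl -s -X POST -H 'Content-Type: application/json' \
#   -d "$(rpc_payload eth_chainId)" "$BASE_RPC_URL"
echo "$(rpc_payload eth_chainId)"
```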
Endpoint
NODE_QUERY_URL
This value is fixed and should remain set to the Agent Layer query endpoint below.
API key
NODE_API_KEY
Sign in to the site, connect your Base wallet, and copy the displayed API key into the config prompt.
MODEL - Ollama model name
Set MODEL to any Ollama model you want to run locally, based on the hardware available on your machine. Pull the model first, then use the same model name in the config.
ollama pull <your-model-name>
For model selection, installation help, and usage details, refer to the Ollama documentation.
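Putting the pieces together, a filled-in manual.config.json might look like the sketch below. Every value is a placeholder, the Alchemy URL shows the usual Base Mainnet format, "llama3" stands in for whatever Ollama model you pulled, and the exact file layout may differ from what the script writes:

```json
{
  "PRIVATE_KEY": "0xYOUR_PRIVATE_KEY",
  "PUBLIC_ADDRESS": "0xYourBaseWalletAddress",
  "BASE_RPC_URL": "https://base-mainnet.g.alchemy.com/v2/YOUR_ALCHEMY_KEY",
  "NODE_QUERY_URL": "<the fixed Agent Layer query endpoint>",
  "NODE_API_KEY": "YOUR_NODE_API_KEY",
  "MODEL": "llama3"
}
```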
3. Server starts automatically
Once all values are entered, the script will start the node server for you:
bun install
bun dev
The server should come online on port 8080. Keep that terminal open; closing it stops the node.
To restart it later:
cd Agent_Layer
bun install
bun dev
What gets created
After the script finishes, your Agent_Layer/ directory will look like this:
Agent_Layer/
├── default.config.json
├── manual.config.json
└── ... node_server files
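A quick sanity check after setup, based on the file names in the tree above (the helper function is mine, not part of the tooling):

```shell
# Verify the files from the tree above exist (illustrative helper).
check_setup() {
  dir="${1:-Agent_Layer}"
  for f in default.config.json manual.config.json; do
    [ -f "$dir/$f" ] || { echo "missing: $dir/$f"; return 1; }
  done
  echo "setup looks complete"
}
```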
Troubleshooting
Permission denied
Run chmod +x setup.sh and try again.
Git not found
Install Git from the official download page and reopen your terminal.
Bun missing after install
Restart the terminal or reload your shell environment.
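Bun's installer places the binary in ~/.bun/bin by default. If a new terminal still cannot find bun, add that directory to your PATH for the current session (and to your shell profile to make it permanent):

```shell
# Add Bun's default install location to PATH for the current shell session.
export BUN_INSTALL="$HOME/.bun"
export PATH="$BUN_INSTALL/bin:$PATH"
```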
Ollama model not running
Make sure Ollama is running in the background, then start your node again.
ollama serve
Support
For help, visit agentlayerai.vercel.app or open an issue on the repository.