How to Run n8n Locally: Complete Setup Guide (2026)
Self-host n8n automation platform on your own machine
n8n is an open source workflow automation platform that lets you connect APIs, services, and data sources using a visual node-based editor. Think of it as a self-hosted alternative to Zapier or Make.com, with the key advantage that you own your data and have no workflow execution limits.
Running n8n locally gives you full control, zero cloud costs, and the ability to connect to services on your local network. This guide covers every setup method: Docker (recommended), npm, and Docker Compose for production-ready deployments.
Why Self-Host n8n?
| Feature | n8n Cloud | n8n Self-Hosted |
|---|---|---|
| Monthly cost | $24-$299/mo | Free |
| Workflow executions | Limited by plan | Unlimited |
| Data privacy | n8n servers | Your machine |
| Custom nodes | Limited | Unlimited |
| Network access | Internet only | Local + Internet |
| Maintenance | Managed | You |
| Updates | Automatic | Manual |
Prerequisites
- Docker Desktop (recommended method) or Node.js 18+
- 2GB RAM minimum (4GB recommended)
- 1GB disk space for n8n plus additional space for workflow data
- A modern browser (Chrome, Firefox, Edge)
Method 1: Docker (Recommended)
Docker is the easiest and most reliable way to run n8n locally.
Step 1: Install Docker
If you do not have Docker installed:
- macOS: Download Docker Desktop for Mac
- Windows: Download Docker Desktop for Windows
- Linux: Install Docker Engine:
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh
Verify Docker is running:
docker --version
Step 2: Run n8n with Docker
Start n8n with a single command:
docker run -d \
  --name n8n \
  -p 5678:5678 \
  -v n8n_data:/home/node/.n8n \
  n8nio/n8n
This command:
- Runs n8n in detached mode (-d)
- Maps port 5678 in the container to port 5678 on your machine
- Creates a persistent volume named n8n_data so your workflows survive container restarts
Step 3: Access n8n
Open your browser and navigate to:
http://localhost:5678
You will be prompted to create an owner account. This is your admin account for the local instance.
Stopping and Starting
# Stop n8n
docker stop n8n
# Start n8n
docker start n8n
# View logs
docker logs n8n
# Remove the container (data is preserved in the volume)
docker rm n8n
# Remove data volume (WARNING: deletes all workflows)
docker volume rm n8n_data
Method 2: npm (Without Docker)
If you prefer not to use Docker:
Step 1: Install n8n globally
npm install -g n8n
Step 2: Start n8n
n8n start
n8n will start on http://localhost:5678 by default.
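n8n reads its configuration from environment variables, so you can change settings at launch without editing files. A quick sketch (N8N_PORT is n8n's standard port variable; the --tunnel flag is intended only for local webhook testing, not production):

```shell
# Start n8n on a different port
N8N_PORT=8080 n8n start

# Development only: open a temporary tunnel so external webhooks can reach you
n8n start --tunnel
```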
Running in the Background
Use a process manager like pm2:
npm install -g pm2
pm2 start n8n
pm2 save
pm2 startup # Enable auto-start on boot
Method 3: Docker Compose (Production-Ready)
For a more robust setup with PostgreSQL (instead of SQLite) and automatic restarts:
Create a docker-compose.yml file:
version: "3.8"
services:
  n8n:
    image: n8nio/n8n
    restart: always
    ports:
      - "5678:5678"
    environment:
      - N8N_BASIC_AUTH_ACTIVE=true
      - N8N_BASIC_AUTH_USER=admin
      - N8N_BASIC_AUTH_PASSWORD=your-secure-password
      - DB_TYPE=postgresdb
      - DB_POSTGRESDB_HOST=postgres
      - DB_POSTGRESDB_PORT=5432
      - DB_POSTGRESDB_DATABASE=n8n
      - DB_POSTGRESDB_USER=n8n
      - DB_POSTGRESDB_PASSWORD=n8n-db-password
      - N8N_ENCRYPTION_KEY=your-encryption-key-here
      - GENERIC_TIMEZONE=America/New_York
      - TZ=America/New_York
    volumes:
      - n8n_data:/home/node/.n8n
    depends_on:
      - postgres
  postgres:
    image: postgres:16
    restart: always
    environment:
      - POSTGRES_USER=n8n
      - POSTGRES_PASSWORD=n8n-db-password
      - POSTGRES_DB=n8n
    volumes:
      - postgres_data:/var/lib/postgresql/data
volumes:
  n8n_data:
  postgres_data:
Start the stack:
docker compose up -d
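Once the stack is up, it is worth confirming that both containers are healthy before opening the editor. A quick check using standard Docker Compose commands:

```shell
# List service status; both n8n and postgres should show as running
docker compose ps

# Follow n8n's logs; startup is complete once the editor URL is printed
docker compose logs -f n8n

# Stop the stack (named volumes persist unless you add -v)
docker compose down
```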
Essential Environment Variables
| Variable | Default | Description |
|---|---|---|
| N8N_PORT | 5678 | Port n8n listens on |
| N8N_PROTOCOL | http | Protocol (http or https) |
| N8N_HOST | localhost | Hostname |
| N8N_BASIC_AUTH_ACTIVE | false | Enable basic auth |
| N8N_BASIC_AUTH_USER | - | Basic auth username |
| N8N_BASIC_AUTH_PASSWORD | - | Basic auth password |
| N8N_ENCRYPTION_KEY | auto-generated | Key for encrypting credentials |
| DB_TYPE | sqlite | Database type (sqlite or postgresdb) |
| GENERIC_TIMEZONE | UTC | Timezone for cron triggers |
| N8N_METRICS | false | Enable Prometheus metrics |
| EXECUTIONS_DATA_PRUNE | true | Auto-delete old executions |
| EXECUTIONS_DATA_MAX_AGE | 336 | Hours to keep execution data |
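With the Docker method, any of these variables can be passed via -e flags. A sketch that sets a timezone, shortens execution retention, and enables metrics (the specific values are illustrative; substitute your own):

```shell
docker run -d \
  --name n8n \
  -p 5678:5678 \
  -e GENERIC_TIMEZONE="Europe/Berlin" \
  -e TZ="Europe/Berlin" \
  -e EXECUTIONS_DATA_MAX_AGE=168 \
  -e N8N_METRICS=true \
  -v n8n_data:/home/node/.n8n \
  n8nio/n8n
```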
Creating Your First Workflow
Once n8n is running, let us build a simple workflow:
Example: Webhook to Slack Notification
- Click Add workflow in the n8n dashboard
- Click the + button to add nodes
- Add a Webhook node (trigger):
  - Set Method to POST
  - Copy the webhook URL
- Add a Slack node:
  - Connect your Slack workspace
  - Set the channel
  - Map the webhook body to the message
- Connect the Webhook node to the Slack node
- Click Execute Workflow to test
- Click Activate to make it live
Test with curl:
curl -X POST http://localhost:5678/webhook/your-webhook-id \
-H "Content-Type: application/json" \
-d '{"message": "Hello from n8n!", "priority": "high"}'
Example: Scheduled API Data Fetch
A workflow that fetches data from an API every hour and saves it to a file:
- Add a Schedule Trigger node:
  - Set to run every 60 minutes
- Add an HTTP Request node:
  - Method: GET
  - URL: https://api.example.com/data
  - Add authentication headers if needed
- Add a Write Binary File node:
  - File name: data-{{ $now.format('yyyy-MM-dd-HH') }}.json
- Connect: Schedule -> HTTP Request -> Write File
- Activate the workflow
Connecting to AI APIs
n8n has built-in nodes for popular AI services. Here is how to connect to OpenAI:
- Add an OpenAI node
- Create credentials:
  - API Key: your OpenAI API key
- Configure:
  - Resource: Chat
  - Model: gpt-4o
  - Prompt: your prompt template
For services without built-in nodes, use the HTTP Request node:
{
  "method": "POST",
  "url": "https://api.hypereal.com/v1/generate",
  "headers": {
    "Authorization": "Bearer your-api-key",
    "Content-Type": "application/json"
  },
  "body": {
    "model": "flux",
    "prompt": "{{ $json.prompt }}",
    "width": 1024,
    "height": 1024
  }
}
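Before wiring this into a workflow, you can verify the request shape from the command line. A curl equivalent of the node configuration above (endpoint and fields mirror the JSON; replace your-api-key with a real key, and note the static prompt stands in for the {{ $json.prompt }} expression):

```shell
curl -X POST https://api.hypereal.com/v1/generate \
  -H "Authorization: Bearer your-api-key" \
  -H "Content-Type: application/json" \
  -d '{"model": "flux", "prompt": "a lighthouse at dusk", "width": 1024, "height": 1024}'
```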
Exposing n8n to the Internet
To receive webhooks from external services, you need to expose your local n8n instance.
Option 1: ngrok (Quick and Easy)
ngrok http 5678
This gives you a public URL like https://abc123.ngrok.io that tunnels to your local n8n.
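When n8n sits behind a tunnel, it also needs to know its public address so webhook nodes register the tunnel URL instead of localhost. n8n reads this from the WEBHOOK_URL environment variable. A sketch assuming the Docker method and the example ngrok URL above (use whatever URL your tunnel actually prints):

```shell
docker run -d \
  --name n8n \
  -p 5678:5678 \
  -e WEBHOOK_URL="https://abc123.ngrok.io/" \
  -v n8n_data:/home/node/.n8n \
  n8nio/n8n
```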
Option 2: Cloudflare Tunnel (Free and Persistent)
cloudflared tunnel --url http://localhost:5678
Option 3: Reverse Proxy with Caddy
Install Caddy and create a Caddyfile:
n8n.yourdomain.com {
reverse_proxy localhost:5678
}
caddy run
Backup and Restore
Export All Workflows
# Export via CLI
docker exec -it n8n n8n export:workflow --all --output=/home/node/.n8n/backups/
# Or copy from the volume
docker cp n8n:/home/node/.n8n/backups/ ./n8n-backups/
Import Workflows
docker exec -it n8n n8n import:workflow --input=/home/node/.n8n/backups/
Database Backup (PostgreSQL)
docker exec postgres pg_dump -U n8n n8n > n8n-backup.sql
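To restore that dump later, feed it back through psql. A sketch assuming the same container name, user, and database as above:

```shell
# Restore the SQL dump into the n8n database
cat n8n-backup.sql | docker exec -i postgres psql -U n8n -d n8n
```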
Performance Tuning
For heavy workloads, adjust these settings:
# Increase Node.js memory
docker run -d \
  --name n8n \
  -p 5678:5678 \
  -e NODE_OPTIONS="--max-old-space-size=4096" \
  -e EXECUTIONS_PROCESS=main \
  -e N8N_CONCURRENCY_PRODUCTION_LIMIT=20 \
  -v n8n_data:/home/node/.n8n \
  n8nio/n8n
| Setting | Default | Recommended | Description |
|---|---|---|---|
| NODE_OPTIONS | 512MB | 2048-4096MB | Node.js heap size |
| N8N_CONCURRENCY_PRODUCTION_LIMIT | 5 | 10-20 | Max concurrent executions |
| EXECUTIONS_DATA_PRUNE | true | true | Remove old execution data |
| EXECUTIONS_DATA_MAX_AGE | 336h | 168h | Reduce retention period |
Troubleshooting
n8n will not start (port in use):
# Check what is using port 5678
lsof -i :5678
# Use a different port
docker run -p 5679:5678 n8nio/n8n
Workflows not persisting after restart:
Ensure you are using a Docker volume (-v n8n_data:/home/node/.n8n). Without the volume, data is lost when the container is removed.
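You can confirm the volume is actually attached with standard Docker commands (a quick diagnostic sketch):

```shell
# Does the named volume exist?
docker volume ls --filter name=n8n_data

# Is it mounted into the running container at the expected path?
docker inspect n8n --format '{{ range .Mounts }}{{ .Name }} -> {{ .Destination }}{{ "\n" }}{{ end }}'
```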
Cannot connect to local services:
If n8n is in Docker and needs to connect to a service on your host machine, use host.docker.internal instead of localhost:
http://host.docker.internal:3000/api/data
Webhook URLs not accessible: Local webhook URLs only work on your machine. To receive external webhooks, use ngrok, Cloudflare Tunnel, or set up a reverse proxy.
Memory issues with large workflows:
Increase the Node.js heap size with NODE_OPTIONS="--max-old-space-size=4096" and reduce execution data retention.
Updating n8n
Docker
docker pull n8nio/n8n
docker stop n8n
docker rm n8n
docker run -d --name n8n -p 5678:5678 -v n8n_data:/home/node/.n8n n8nio/n8n
npm
npm update -g n8n
Conclusion
Running n8n locally gives you a powerful, free automation platform with no execution limits. Docker makes setup trivial, and Docker Compose with PostgreSQL provides a production-ready deployment. Whether you are automating data pipelines, connecting APIs, or building complex multi-step workflows, n8n delivers the flexibility of code with the accessibility of a visual builder.
If your automation workflows involve AI media generation -- creating images, generating videos, or producing audio -- Hypereal AI provides a simple API you can call from any n8n HTTP Request node. Generate AI media at scale with pay-as-you-go pricing and fast inference times.