Docker Deployment Guide
The complete Docker setup guide for Open Notebook - from beginner to advanced configurations.
This guide covers everything you need to deploy Open Notebook using Docker, from a simple single-provider setup to advanced multi-provider configurations with local models.
📋 What You'll Get
Open Notebook is a powerful AI-powered research and note-taking tool that:
- Offers a modern Next.js/React interface for a smooth user experience
- Helps you organize research across multiple notebooks
- Lets you chat with your documents using AI
- Supports 16+ AI providers (OpenAI, Anthropic, Google, Ollama, and more)
- Creates AI-generated podcasts from your content
- Works with PDFs, web links, videos, audio files, and more
📦 Docker Image Registries
Open Notebook images are available from two registries:
- GitHub Container Registry (GHCR): ghcr.io/lfnovo/open-notebook - Hosted on GitHub, no Docker Hub account needed
- Docker Hub: lfnovo/open_notebook - Traditional Docker registry
Both registries contain identical images. Choose based on your preference:
- Use GHCR if you prefer GitHub-native workflows or Docker Hub is blocked
- Use Docker Hub if you're already using it or prefer the traditional registry
All examples in this guide use Docker Hub (lfnovo/open_notebook), but you can replace it with ghcr.io/lfnovo/open-notebook anywhere.
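Both names point at the same builds, so a quick way to sanity-check registry access is to pull the single-container tag used in the Quick Start below from whichever registry you prefer:
# Docker Hub
docker pull lfnovo/open_notebook:v1-latest-single
# GitHub Container Registry (same image, different name)
docker pull ghcr.io/lfnovo/open-notebook:v1-latest-single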
🚀 Quick Start (5 Minutes)
Step 1: Install Docker
Windows
- Download Docker Desktop from docker.com
- Run the installer and follow the setup wizard
- Restart your computer when prompted
- Launch Docker Desktop
macOS
- Download Docker Desktop from docker.com
- Choose Intel or Apple Silicon based on your Mac
- Drag Docker to Applications folder
- Open Docker from Applications
Linux (Ubuntu/Debian)
sudo apt update
sudo apt install docker.io docker-compose
sudo systemctl start docker
sudo systemctl enable docker
sudo usermod -aG docker $USER
Log out and log back in after installation.
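Before moving on, it's worth confirming Docker and Compose respond. Note that the apt docker-compose package provides the legacy docker-compose command; if docker compose (with a space) isn't found on your system, install the Compose v2 plugin or substitute docker-compose in the commands throughout this guide.
# Confirm the installation and that the daemon is running
docker --version
docker compose version
docker run --rm hello-world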
Step 2: Get Your OpenAI API Key
OpenAI provides everything you need to get started:
- Text generation for chat and notes
- Embeddings for search functionality
- Text-to-speech for podcast generation
- Speech-to-text for audio transcription
- Go to platform.openai.com
- Create an account or sign in
- Navigate to API Keys in the sidebar
- Click "Create new secret key"
- Name your key (e.g., "Open Notebook")
- Copy the key (starts with "sk-")
- Save it safely - you won't see it again!
Important: Add at least $5 in credits to your OpenAI account before using the API.
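If you'd like to confirm the key works before deploying, one quick check is to list the models it can access (assumes outbound HTTPS access; replace the placeholder with your real key):
# A JSON list of models means the key is valid; a 401 error means it isn't
curl -s https://api.openai.com/v1/models -H "Authorization: Bearer sk-your-actual-key-here"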
Step 3: Deploy Open Notebook
- Create a project directory:
  mkdir open-notebook
  cd open-notebook
- Create docker-compose.yml:
  services:
    open_notebook:
      image: lfnovo/open_notebook:v1-latest-single
      ports:
        - "8502:8502" # Frontend
        - "5055:5055" # API
      environment:
        - OPENAI_API_KEY=your_openai_key_here
      volumes:
        - ./notebook_data:/app/data
        - ./surreal_data:/mydata
      restart: always
- Create a docker.env file (optional but recommended):
  # Required: Your OpenAI API key
  OPENAI_API_KEY=sk-your-actual-key-here
  # Optional: Security for public deployments
  OPEN_NOTEBOOK_PASSWORD=your_secure_password
  # Database settings (auto-configured)
  SURREAL_URL=ws://localhost:8000/rpc
  SURREAL_USER=root
  SURREAL_PASSWORD=root
  SURREAL_NAMESPACE=open_notebook
  SURREAL_DATABASE=production
- Start Open Notebook:
  docker compose up -d
- Access the application:
  - Next.js UI: http://localhost:8502 - Modern, responsive interface
  - API Documentation: http://localhost:5055/docs - Full REST API access
  - You should see the Open Notebook interface!
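If the page doesn't load, a quick command-line check looks like this (the /health path is the same endpoint used by the healthcheck example later in this guide, so treat it as an assumption if your image differs):
# The container should show as running
docker compose ps
# The API should answer on port 5055
curl -f http://localhost:5055/health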
Alternative: Using GHCR
To use GitHub Container Registry instead, simply replace the image name:
services:
  open_notebook:
    image: ghcr.io/lfnovo/open-notebook:v1-latest-single
    # ... rest of configuration stays the same
Step 4: Configure Your Models
Before creating your first notebook, configure your AI models:
- Click "⚙️ Settings" in the sidebar
- Click "🤖 Models" tab
- Configure these recommended models:
  - Language Model: gpt-5-mini (cost-effective)
  - Embedding Model: text-embedding-3-small (required for search)
  - Text-to-Speech: gpt-4o-mini-tts (for podcast generation)
  - Speech-to-Text: whisper-1 (for audio transcription)
- Click "Save" after configuring all models
Step 5: Create Your First Notebook
- Click "Create New Notebook"
- Give it a name (e.g., "My Research")
- Add a description
- Click "Create"
- Add your first source (web link, PDF, or text)
- Start chatting with your content!
🔧 Advanced Configuration
Multi-Container Setup
For production deployments or development, use the multi-container setup:
services:
  surrealdb:
    image: surrealdb/surrealdb:v1-latest
    ports:
      - "8000:8000"
    command: start --log trace --user root --pass root memory
    restart: always
  open_notebook:
    image: lfnovo/open_notebook:v1-latest
    # Or use: ghcr.io/lfnovo/open-notebook:v1-latest
    ports:
      - "8502:8502" # Next.js Frontend
      - "5055:5055" # REST API
    env_file:
      - ./docker.env
    volumes:
      - ./notebook_data:/app/data
    depends_on:
      - surrealdb
    restart: always
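Once the stack is up, a quick sanity check is to confirm SurrealDB started and is reachable from the app container (ping here is the same connectivity test the Troubleshooting section suggests, and assumes it exists in the image):
docker compose up -d
docker compose logs --tail=20 surrealdb
docker compose exec open_notebook ping -c 1 surrealdb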
Environment Configuration
Create a comprehensive docker.env file:
# Required: Database connection
SURREAL_URL=ws://surrealdb:8000/rpc
SURREAL_USER=root
SURREAL_PASSWORD=root
SURREAL_NAMESPACE=open_notebook
SURREAL_DATABASE=production
# Required: At least one AI provider
OPENAI_API_KEY=sk-your-openai-key
# Optional: Additional AI providers
ANTHROPIC_API_KEY=sk-ant-your-anthropic-key
GOOGLE_API_KEY=your-google-key
GROQ_API_KEY=gsk_your-groq-key
# Optional: Security
OPEN_NOTEBOOK_PASSWORD=your_secure_password
# Optional: Advanced features
ELEVENLABS_API_KEY=your-elevenlabs-key
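Because docker.env holds API keys, it's a good idea to restrict its permissions and confirm the values actually reach the container (this assumes the image ships a shell with env and grep):
chmod 600 docker.env
docker compose exec open_notebook env | grep -E 'OPENAI|SURREAL_URL'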
🌟 Advanced Provider Setup
OpenRouter (100+ Models)
OpenRouter gives you access to virtually every AI model through a single API:
- Get your API key at openrouter.ai
- Add to your docker.env:
  OPENROUTER_API_KEY=sk-or-your-openrouter-key
- Restart the container:
  docker compose restart
- Configure models in the Models settings
Recommended OpenRouter models:
- anthropic/claude-3-haiku - Fast and cost-effective
- google/gemini-pro - Good reasoning capabilities
- meta-llama/llama-3-8b-instruct - Open source option
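To sanity-check the key before restarting, you can query OpenRouter's public model list (based on OpenRouter's documented /api/v1/models endpoint; treat the exact path as an assumption and check their docs if it returns an error):
curl -s https://openrouter.ai/api/v1/models -H "Authorization: Bearer sk-or-your-openrouter-key" | head -c 500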
Ollama (Local Models)
Run AI models locally for complete privacy:
- Install Ollama on your host machine from ollama.ai
- Start Ollama:
  ollama serve
- Download models:
  ollama pull llama2 # 7B model (~4GB)
  ollama pull mistral # 7B model (~4GB)
  ollama pull llama2:13b # 13B model (~8GB)
- Find your IP address:
  - Windows: ipconfig (look for IPv4 Address)
  - macOS/Linux: ifconfig or ip addr show
- Configure Open Notebook:
  OLLAMA_API_BASE=http://192.168.1.100:11434
  Replace 192.168.1.100 with your actual IP.
- Restart and configure models in the Models settings
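Before restarting Open Notebook, confirm Ollama is reachable over the network. By default Ollama listens only on 127.0.0.1, so a container usually can't reach it until you bind it to all interfaces:
# From the Docker host (replace the IP with yours); should return a JSON list of pulled models
curl http://192.168.1.100:11434/api/tags
# If it only answers on localhost, restart Ollama bound to all interfaces
OLLAMA_HOST=0.0.0.0 ollama serve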
Other Providers
Anthropic (Direct):
ANTHROPIC_API_KEY=sk-ant-your-key
Google Gemini:
GOOGLE_API_KEY=AIzaSy-your-key
Groq (Fast Inference):
GROQ_API_KEY=gsk_your-key
🔒 Security & Production
Password Protection
For public deployments, always set a password:
OPEN_NOTEBOOK_PASSWORD=your_secure_password
This protects both the web interface and API endpoints.
Production Best Practices
- Use HTTPS: Deploy behind a reverse proxy with SSL
- Regular Updates: Keep containers updated
- Monitor Resources: Set up resource limits
- Backup Data: Regular backups of volumes
- Network Security: Configure firewall rules
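As a minimal sketch of the firewall point above (assuming a ufw-based host with a reverse proxy terminating TLS): expose only SSH and the proxy ports, and rely on the localhost-only port bindings in the example below to keep 8502 and 5055 off the public interface.
sudo ufw allow OpenSSH
sudo ufw allow 80/tcp
sudo ufw allow 443/tcp
sudo ufw enable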
Example Production Setup
services:
  surrealdb:
    image: surrealdb/surrealdb:v1-latest
    ports:
      - "127.0.0.1:8000:8000" # Bind to localhost only
    command: start --log warn --user root --pass root file:///mydata/database.db
    volumes:
      - ./surreal_data:/mydata
    restart: always
    deploy:
      resources:
        limits:
          memory: 1G
          cpus: "0.5"
  open_notebook:
    image: lfnovo/open_notebook:v1-latest
    ports:
      - "127.0.0.1:8502:8502"
      - "127.0.0.1:5055:5055"
    env_file:
      - ./docker.env
    volumes:
      - ./notebook_data:/app/data
    depends_on:
      - surrealdb
    restart: always
    deploy:
      resources:
        limits:
          memory: 2G
          cpus: "1.0"
🛠️ Management & Maintenance
Container Management
# Start services
docker compose up -d
# Stop services
docker compose down
# View logs
docker compose logs -f
# Restart specific service
docker compose restart open_notebook
# Update to latest version
docker compose pull
docker compose up -d
Data Management
# Backup data
tar -czf backup-$(date +%Y%m%d).tar.gz notebook_data surreal_data
# Restore data
tar -xzf backup-20240101.tar.gz
# Clean up old containers
docker system prune -a
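To automate the backup command above, a simple option is a nightly cron job. The path below is a placeholder for wherever your compose project lives, % must be escaped in crontab entries, and stopping the stack first gives a more consistent SurrealDB snapshot:
# Run "crontab -e" and add a line like this (all on one line):
# 0 3 * * * cd /path/to/open-notebook && tar -czf backup-$(date +\%Y\%m\%d).tar.gz notebook_data surreal_data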
Monitoring
# Check resource usage
docker stats
# Check service health
docker compose ps
# View detailed logs
docker compose logs --tail=100 -f open_notebook
📊 Performance Optimization
Resource Allocation
Minimum requirements:
- 2GB RAM
- 2 CPU cores
- 10GB storage
Recommended for production:
- 4GB+ RAM
- 4+ CPU cores
- 50GB+ storage
Model Selection Tips
For cost optimization:
- Use OpenRouter for pay-per-use access to premium models
- Use Ollama (free, local) for simple tasks
- Monitor usage at provider dashboards
For performance:
- Use Groq for fast inference
- Use local models for privacy
- Use OpenAI for reliability
🔍 Troubleshooting
Common Issues
Port conflicts:
# Check what's using port 8502
lsof -i :8502
# Use a different host port: change "8502:8502" to "8503:8502"
# in docker-compose.yml, then recreate the container
docker compose up -d
API key errors:
- Verify keys are set correctly in docker.env
- Check you have credits with your AI provider
- Ensure there are no extra spaces in the key
Database connection issues:
- Check SurrealDB container is running
- Verify database files are writable
- Try restarting containers
Out of memory errors:
- Increase Docker memory allocation
- Use smaller models
- Monitor resource usage
Getting Help
- Check logs: docker compose logs -f
- Verify environment: docker compose config
- Test connectivity: docker compose exec open_notebook ping surrealdb
- Join Discord: discord.gg/37XJPXfz2w
- GitHub Issues: github.com/lfnovo/open-notebook/issues
🎯 Next Steps
After successful deployment:
- Create your first notebook - Start with a simple research project
- Explore features - Try podcasts, transformations, and search
- Optimize models - Experiment with different providers
- Join the community - Share your experience and get help
📚 Complete Configuration Reference
All Environment Variables
# Database Configuration
SURREAL_URL=ws://surrealdb:8000/rpc
SURREAL_USER=root
SURREAL_PASSWORD=root
SURREAL_NAMESPACE=open_notebook
SURREAL_DATABASE=production
# Required: At least one AI provider
OPENAI_API_KEY=sk-your-openai-key
# Optional: Additional AI providers
ANTHROPIC_API_KEY=sk-ant-your-anthropic-key
GOOGLE_API_KEY=your-google-key
GROQ_API_KEY=gsk_your-groq-key
OPENROUTER_API_KEY=sk-or-your-openrouter-key
OLLAMA_API_BASE=http://192.168.1.100:11434
# Optional: Advanced TTS
ELEVENLABS_API_KEY=your-elevenlabs-key
# Optional: Security
OPEN_NOTEBOOK_PASSWORD=your_secure_password
# Optional: Advanced settings
LOG_LEVEL=INFO
MAX_UPLOAD_SIZE=100MB
ENABLE_ANALYTICS=false
Complete Docker Compose
version: '3.8'
services:
  surrealdb:
    image: surrealdb/surrealdb:v1-latest
    ports:
      - "8000:8000"
    command: start --log warn --user root --pass root file:///mydata/database.db
    volumes:
      - ./surreal_data:/mydata
    restart: always
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8000/health"]
      interval: 30s
      timeout: 10s
      retries: 3
  open_notebook:
    image: lfnovo/open_notebook:v1-latest
    ports:
      - "8502:8502" # Next.js Frontend
      - "5055:5055" # REST API
    env_file:
      - ./docker.env
    volumes:
      - ./notebook_data:/app/data
    depends_on:
      surrealdb:
        condition: service_healthy
    restart: always
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:5055/health"]
      interval: 30s
      timeout: 10s
      retries: 3
Ready to get started? Follow the Quick Start section above and you'll be up and running in 5 minutes!