Open Notebook Installation Guide
This comprehensive guide will help you install and configure Open Notebook, an open-source, privacy-focused alternative to Google's Notebook LM. Whether you're a beginner or advanced user, this guide covers all installation methods and configuration options.
Table of Contents
- Quick Start
- System Requirements
- Installation Methods
- Service Architecture
- Environment Configuration
- Manual Installation
- Docker Installation
- AI Model Configuration
- Verification and Testing
- Security Configuration
- Troubleshooting
Quick Start
For users who want to get started immediately:
Docker (Recommended for Beginners)
# Create project directory
mkdir open-notebook && cd open-notebook
# Download configuration files
curl -O https://raw.githubusercontent.com/lfnovo/open-notebook/main/docker-compose.yml
curl -O https://raw.githubusercontent.com/lfnovo/open-notebook/main/.env.example
# Rename and configure environment
mv .env.example docker.env
# Edit docker.env with your API keys
# Start Open Notebook
docker compose up -d
From Source (Developers)
# Clone and setup
git clone https://github.com/lfnovo/open-notebook
cd open-notebook
cp .env.example .env
# Edit .env with your API keys
# Install dependencies and start
uv sync
make start-all
Access Open Notebook at http://localhost:8502
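To confirm everything came up, here is a quick sanity check against the default ports (the same health endpoints used in Verification and Testing below):
# API backend health
curl http://localhost:5055/health
# Frontend health
curl http://localhost:8502/healthz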
System Requirements
Hardware Requirements
- CPU: 2+ cores recommended (4+ cores for better performance)
- RAM: Minimum 4GB (8GB+ recommended)
- Storage: 10GB+ available space
- Network: Stable internet connection for AI model access
Operating System Support
- macOS: 10.15 (Catalina) or later
- Linux: Ubuntu 18.04+, Debian 9+, CentOS 7+, Fedora 30+
- Windows: Windows 10 or later (WSL2 recommended)
Software Prerequisites
- Python: 3.9 or later (for source installation)
- Docker: Latest version (for Docker installation)
- uv: Python package manager (for source installation)
Installation Methods
Open Notebook supports multiple installation methods. Choose the one that best fits your needs:
| Method | Best For | Difficulty | Pros | Cons |
|---|---|---|---|---|
| Docker Single-Container | Beginners, simple deployments | Easy | One-click setup, isolated environment | Less control, harder to debug |
| Docker Multi-Container | Production deployments | Medium | Scalable, professional setup | More complex configuration |
| Source Installation | Developers, customization | Advanced | Full control, easy debugging | Requires Python knowledge |
Service Architecture
Open Notebook consists of four main services that work together:
1. SurrealDB Database (Port 8000)
- Purpose: Stores notebooks, sources, notes, and metadata
- Technology: SurrealDB - a modern, multi-model database
- Configuration: Runs in Docker container with persistent storage
2. FastAPI Backend (Port 5055)
- Purpose: REST API for all application functionality
- Features: Interactive API documentation, authentication, data validation
- Endpoints: /api/notebooks, /api/sources, /api/notes, /api/chat
3. Background Worker
- Purpose: Processes long-running tasks asynchronously
- Tasks: Podcast generation, content transformations, embeddings
- Technology: Surreal Commands worker system
4. React Frontend (Port 8502)
- Purpose: Web-based user interface
- Features: Notebooks, chat, sources, notes, search
- Technology: Next.js framework
Service Communication Flow
User Browser → React Frontend → FastAPI Backend → SurrealDB Database
                                       ↓
                      Background Worker ← Job Queue
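If you want to confirm each service is listening where the diagram says it should, one quick check is the following (a minimal sketch; assumes the default ports and that lsof is installed):
# Show which processes hold the database, API, and frontend ports
lsof -i :8000 -i :5055 -i :8502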
Environment Configuration
Open Notebook uses environment variables for configuration. Create a .env file (or docker.env for Docker) based on the template below:
Core Configuration
# Security (Optional - for public deployments)
OPEN_NOTEBOOK_PASSWORD=your_secure_password_here
# Database Configuration
SURREAL_URL="ws://localhost:8000/rpc"
SURREAL_USER="root"
SURREAL_PASSWORD="root"
SURREAL_NAMESPACE="open_notebook"
SURREAL_DATABASE="production"
AI Provider Configuration
OpenAI (Recommended for beginners)
# Provides: Language models, embeddings, TTS, STT
OPENAI_API_KEY=sk-your-openai-key-here
Anthropic (Claude models)
# Provides: High-quality language models
ANTHROPIC_API_KEY=sk-ant-your-anthropic-key-here
Google (Gemini)
# Provides: Large context models, embeddings, TTS
GEMINI_API_KEY=your-gemini-key-here
Vertex AI (Google Cloud)
# Provides: Enterprise-grade AI models
VERTEX_PROJECT=your-google-cloud-project-name
GOOGLE_APPLICATION_CREDENTIALS=./google-credentials.json
VERTEX_LOCATION=us-east5
Additional Providers
# DeepSeek - Cost-effective models
DEEPSEEK_API_KEY=your-deepseek-key-here
# Mistral - European AI provider
MISTRAL_API_KEY=your-mistral-key-here
# Groq - Fast inference
GROQ_API_KEY=your-groq-key-here
# xAI (Grok) - Cutting-edge models
XAI_API_KEY=your-xai-key-here
# ElevenLabs - High-quality voice synthesis
ELEVENLABS_API_KEY=your-elevenlabs-key-here
# Ollama - Local AI models
OLLAMA_API_BASE="http://localhost:11434"
# OpenRouter - Access to multiple models
OPENROUTER_BASE_URL="https://openrouter.ai/api/v1"
OPENROUTER_API_KEY=your-openrouter-key-here
# Azure OpenAI
AZURE_OPENAI_API_KEY=your-azure-key-here
AZURE_OPENAI_ENDPOINT=https://your-endpoint.openai.azure.com/
AZURE_OPENAI_API_VERSION="2024-12-01-preview"
AZURE_OPENAI_DEPLOYMENT_NAME=your-deployment-name
# OpenAI Compatible (LM Studio, etc.)
OPENAI_COMPATIBLE_BASE_URL=http://localhost:1234/v1
# Optional - only if your endpoint requires authentication
OPENAI_COMPATIBLE_API_KEY=your-key-here
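If you point OLLAMA_API_BASE at a local Ollama instance (configured above), you can confirm it is reachable and see which models it has pulled before starting Open Notebook (assumes Ollama's default port):
# List locally available Ollama models
curl http://localhost:11434/api/tags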
Optional Services
# Firecrawl - Enhanced web scraping
FIRECRAWL_API_KEY=your-firecrawl-key-here
# Jina - Advanced embeddings
JINA_API_KEY=your-jina-key-here
# Voyage AI - Specialized embeddings
VOYAGE_API_KEY=your-voyage-key-here
# LangSmith - Debugging and monitoring
LANGCHAIN_TRACING_V2=true
LANGCHAIN_ENDPOINT="https://api.smith.langchain.com"
LANGCHAIN_API_KEY=your-langsmith-key-here
LANGCHAIN_PROJECT="Open Notebook"
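Before starting the app, it helps to confirm that at least one provider key made it into your environment file and that no placeholder values remain (a minimal sketch; use docker.env instead of .env for Docker setups):
# Show any provider keys still set to placeholder values
grep -n 'your-.*-key-here' .env
# List which API keys appear to be configured (names only, values not printed)
grep -E '_API_KEY=' .env | grep -v 'your-' | cut -d= -f1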
Manual Installation
Prerequisites Installation
macOS
# Install Homebrew if not already installed
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
# Install system dependencies
brew install libmagic
# Install uv (Python package manager)
brew install uv
# Install Docker Desktop
brew install --cask docker
Ubuntu/Debian
# Update package list
sudo apt update
# Install system dependencies
sudo apt install -y libmagic-dev python3-dev build-essential
# Install uv
curl -LsSf https://astral.sh/uv/install.sh | sh
# Install Docker
sudo apt install -y docker.io docker-compose-plugin
sudo systemctl start docker
sudo systemctl enable docker
sudo usermod -aG docker $USER
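The group change only takes effect in new sessions. To use Docker without sudo right away and verify the daemon works (standard Docker behavior, not specific to Open Notebook):
# Pick up the new group in the current shell, then run a test container
newgrp docker
docker run hello-world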
CentOS/RHEL/Fedora
# Install system dependencies
sudo dnf install -y file-devel python3-devel gcc
# Install uv
curl -LsSf https://astral.sh/uv/install.sh | sh
# Install Docker
sudo dnf install -y docker docker-compose
sudo systemctl start docker
sudo systemctl enable docker
sudo usermod -aG docker $USER
Source Installation Steps
- Clone the Repository
git clone https://github.com/lfnovo/open-notebook.git
cd open-notebook
- Configure Environment
# Copy environment template
cp .env.example .env
# Edit environment file with your API keys
nano .env # or use your preferred editor
- Install Python Dependencies
# Install all required packages
uv sync
# Install additional system-specific packages
uv pip install python-magic
- Initialize Database
# Start SurrealDB
make database
# Wait for database to be ready (about 10 seconds)
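# Optional: instead of a fixed wait, poll the health endpoint until it answers
# (a minimal sketch, assuming SurrealDB's default port 8000)
until curl -sf http://localhost:8000/health > /dev/null; do sleep 1; done
echo "SurrealDB is ready"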
- Start All Services
# Start all services at once
make start-all
This will start:
- SurrealDB database on port 8000
- FastAPI backend on port 5055
- Background worker for processing
- React frontend on port 8502
Alternative: Start Services Individually
For development or debugging, you can start each service separately:
# Terminal 1: Database
make database
# Terminal 2: API Backend
make api
# Terminal 3: Background Worker
make worker
# Terminal 4: React frontend
make run
Docker Installation
Single-Container Deployment (Recommended for Beginners)
Perfect for personal use or platforms like PikaPods:
- Create Project Directory
mkdir open-notebook
cd open-notebook
- Create Docker Compose File
# Create docker-compose.yml
cat > docker-compose.yml << 'EOF'
services:
open_notebook:
image: lfnovo/open_notebook:v1-latest-single
ports:
- "8502:8502"
- "5055:5055"
env_file:
- ./docker.env
pull_policy: always
volumes:
- ./notebook_data:/app/data
- ./surreal_single_data:/mydata
restart: always
EOF
- Create Environment File
# Create docker.env with your API keys
cat > docker.env << 'EOF'
# REQUIRED: At least one AI provider
OPENAI_API_KEY=your-openai-key-here
# Database settings (don't change)
SURREAL_ADDRESS=localhost
SURREAL_PORT=8000
SURREAL_USER=root
SURREAL_PASS=root
SURREAL_NAMESPACE=open_notebook
SURREAL_DATABASE=production
# Optional: Password protection
# OPEN_NOTEBOOK_PASSWORD=your_secure_password
EOF
- Start Open Notebook
docker compose up -d
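To watch the container come up and confirm it started cleanly (the service name matches the compose file above):
# Follow startup logs; press Ctrl+C to stop following
docker compose logs -f open_notebook
# Confirm the container is running
docker compose ps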
Multi-Container Deployment (Production)
For scalable production deployments:
- Download Configuration
# Download the main docker-compose.yml
curl -O https://raw.githubusercontent.com/lfnovo/open-notebook/main/docker-compose.yml
# Copy environment template
curl -o docker.env https://raw.githubusercontent.com/lfnovo/open-notebook/main/.env.example
- Configure Environment
# Edit docker.env with your API keys
nano docker.env
- Start Services
# Start with multi-container profile
docker compose --profile multi up -d
Docker Service Management
# Check service status
docker compose ps
# View logs
docker compose logs -f
# Stop services
docker compose down
# Update to latest version
docker compose pull
docker compose up -d
# Restart specific service
docker compose restart open_notebook
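Because all persistent state lives in the mounted volume directories, a simple backup is just an archive of those folders taken while the stack is stopped (a minimal sketch using the single-container volume paths from above; adjust the paths for your setup):
# Stop services, archive data, then restart
docker compose down
tar czf open-notebook-backup-$(date +%Y%m%d).tar.gz notebook_data surreal_single_data
docker compose up -d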
AI Model Configuration
After installation, configure your AI models for optimal performance:
1. Access Model Settings
- Navigate to Settings → Models in the web interface
- Or visit http://localhost:8502 and click the settings icon
2. Configure Model Categories
Language Models (Chat & Generation)
Budget-Friendly Options:
- gpt-5-mini (OpenAI) - Great value for most tasks
- deepseek-chat (DeepSeek) - Excellent quality-to-price ratio
- gemini-2.0-flash (Google) - Large context window
Premium Options:
- gpt-4o (OpenAI) - Excellent tool calling
- claude-3.5-sonnet (Anthropic) - High-quality reasoning
- grok-3 (xAI) - Cutting-edge intelligence
Embedding Models (Search & Similarity)
Recommended:
- text-embedding-3-small (OpenAI) - $0.02 per 1M tokens
- text-embedding-004 (Google) - Generous free tier
- mistral-embed (Mistral) - European alternative
Text-to-Speech (Podcast Generation)
High Quality:
- eleven_turbo_v2_5 (ElevenLabs) - Best voice quality
- gpt-4o-mini-tts (OpenAI) - Good quality, reliable
Budget Options:
- gemini-2.5-flash-preview-tts (Google) - $10 per 1M tokens
Speech-to-Text (Audio Transcription)
Recommended:
- whisper-1 (OpenAI) - Industry standard
- scribe_v1 (ElevenLabs) - High-quality transcription
3. Provider-Specific Setup
OpenAI Setup
- Visit https://platform.openai.com/
- Create account and navigate to API Keys
- Click "Create new secret key"
- Add at least $5 in billing credits
- Copy the key to your .env file
Anthropic Setup
- Visit https://console.anthropic.com/
- Create account and navigate to API Keys
- Generate new key
- Add to environment variables
Google (Gemini) Setup
- Visit https://makersuite.google.com/app/apikey
- Create new API key
- Add to environment variables
4. Model Recommendations by Use Case
Personal Research
# Language: gpt-5-mini (OpenAI)
# Embedding: text-embedding-3-small (OpenAI)
# TTS: gpt-4o-mini-tts (OpenAI)
# STT: whisper-1 (OpenAI)
Professional Use
# Language: claude-3.5-sonnet (Anthropic)
# Embedding: text-embedding-004 (Google)
# TTS: eleven_turbo_v2_5 (ElevenLabs)
# STT: whisper-1 (OpenAI)
Budget-Conscious
# Language: deepseek-chat (DeepSeek)
# Embedding: text-embedding-004 (Google)
# TTS: gemini-2.5-flash-preview-tts (Google)
# STT: whisper-1 (OpenAI)
Verification and Testing
1. Service Health Checks
Check All Services
# For source installation
make status
# For Docker
docker compose ps
Individual Service Tests
# Test database connection
curl http://localhost:8000/health
# Test API backend
curl http://localhost:5055/health
# Test React frontend
curl http://localhost:8502/healthz
2. Create Test Notebook
- Access Web Interface
  - Open http://localhost:8502
  - You should see the Open Notebook home page
- Create First Notebook
  - Click "Create New Notebook"
  - Name: "Test Notebook"
  - Description: "Testing installation"
  - Click "Create"
- Add Test Source
  - Click "Add Source"
  - Select "Text" tab
  - Paste: "This is a test document for Open Notebook installation."
  - Click "Add Source"
- Test Chat Function
  - Go to Chat tab
  - Ask: "What is this document about?"
  - You should receive a response about the test document
3. Feature Testing
Test Search Functionality
- Add multiple sources to your notebook
- Use the search bar to find specific content
- Verify both full-text and semantic search work
Test Transformations
- Select a source
- Click "Transform" → "Summarize"
- Verify transformation completes successfully
Test Podcast Generation
- Add substantial content to your notebook
- Navigate to "Podcast" tab
- Click "Generate Podcast"
- Wait for background processing to complete
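Podcast generation runs on the background worker, so if nothing seems to happen you can watch the job being picked up (Docker deployments; for source installs, check the terminal running make worker):
# Follow background worker logs while the podcast generates
docker compose logs -f worker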
Security Configuration
Password Protection
For public deployments, enable password protection:
# Add to your .env or docker.env file
OPEN_NOTEBOOK_PASSWORD=your_secure_password_here
Features:
- React frontend: Password prompt on first access
- REST API: Requires an Authorization: Bearer your_password header
- Local Usage: Optional (can be left empty)
API Security
When using the REST API programmatically:
# Example API call with password
curl -H "Authorization: Bearer your_password" \
http://localhost:5055/api/notebooks
Network Security
For production deployments:
- Use HTTPS: Configure reverse proxy (nginx, Cloudflare)
- Firewall Rules: Restrict access to necessary ports only
- VPN Access: Consider VPN for private networks
- Regular Updates: Keep Docker images updated
# Update Docker images
docker compose pull
docker compose up -d
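As a concrete example of the firewall bullet above, on Ubuntu with ufw you might expose only the application ports and keep the database port closed to the outside (a sketch; assumes ufw and the default ports, adapt to your firewall and network layout):
# Allow the web UI and API, keep the database port closed externally
sudo ufw allow 8502/tcp
sudo ufw allow 5055/tcp
sudo ufw deny 8000/tcp
sudo ufw enable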
Troubleshooting
Common Installation Issues
Port Already in Use
# Error: Port 8502 is already in use
# Solution: Find and stop conflicting process
lsof -i :8502
kill -9 <PID>
# Or run the frontend on a different port (assumes the Next.js dev server; -p sets the port)
cd frontend && npm run dev -- -p 8503
Permission Denied (Docker)
# Error: Permission denied accessing Docker
# Solution: Add user to docker group
sudo usermod -aG docker $USER
# Log out and log back in
Python/uv Installation Issues
# Error: uv command not found
# Solution: Install uv package manager
curl -LsSf https://astral.sh/uv/install.sh | sh
source ~/.bashrc
# Error: Python version conflict
# Solution: Use uv's Python management
uv python install 3.11
uv python pin 3.11
libmagic Installation Issues
# macOS: Install via Homebrew
brew install libmagic
# Ubuntu/Debian: Install dev package
sudo apt install libmagic-dev
# CentOS/RHEL/Fedora: Install dev package
sudo dnf install file-devel
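After installing the system library, you can confirm the Python binding used by Open Notebook can load it (assumes python-magic, which the uv pip install step above adds; run from the project directory):
# Should print a file-type description rather than an ImportError
uv run python -c "import magic; print(magic.from_file('/etc/hosts'))"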
API and Database Issues
Database Connection Failed
# Check if SurrealDB is running
docker compose ps surrealdb
# Check database logs
docker compose logs surrealdb
# Restart database
docker compose restart surrealdb
API Backend Not Responding
# Check API logs
docker compose logs api
# For source installation
# Check if API process is running
pgrep -f "run_api.py"
# Restart API
make api
Worker Not Processing Jobs
# Check worker status
pgrep -f "surreal-commands-worker"
# Restart worker
make worker-restart
# Check worker logs
docker compose logs worker
AI Provider Issues
OpenAI API Key Errors
# Error: Invalid API key
# Solution: Verify key format and billing
# 1. Check key starts with "sk-"
# 2. Verify billing credits in OpenAI dashboard
# 3. Check API key permissions
Model Not Available
# Error: Model not found
# Solution: Check model availability
# 1. Verify model name in provider documentation
# 2. Check API key permissions
# 3. Try alternative model
Rate Limiting Issues
# Error: Rate limit exceeded
# Solution: Implement backoff strategy
# 1. Reduce concurrent requests
# 2. Upgrade provider plan
# 3. Use multiple providers
Performance Issues
Slow Response Times
# Check system resources
top
docker stats
# Optimize database
# Consider increasing Docker memory limits
# Use faster storage (SSD)
Memory Issues
# Error: Out of memory
# Solution: Increase Docker memory
# 1. Docker Desktop → Settings → Resources
# 2. Increase memory limit to 4GB+
# 3. Consider model optimization
Data and Storage Issues
Persistent Data Loss
# Ensure volumes are properly mounted
docker compose config
# Check volume permissions
ls -la ./notebook_data
ls -la ./surreal_data
# Fix permissions if needed
sudo chown -R $USER:$USER ./notebook_data
sudo chown -R $USER:$USER ./surreal_data
Getting Help
Community Support
- Discord: https://discord.gg/37XJPXfz2w
- GitHub Issues: https://github.com/lfnovo/open-notebook/issues
- Installation Assistant: https://chatgpt.com/g/g-68776e2765b48191bd1bae3f30212631-open-notebook-installation-assistant
Bug Reports
When reporting issues, include:
- Installation method (Docker/source)
- Operating system and version
- Error messages and logs
- Steps to reproduce
- Environment configuration (without API keys)
Log Collection
# Collect all logs
docker compose logs > open-notebook-logs.txt
# For source installation
make status > status.txt
Next Steps
After successful installation:
- Read the User Guide: Learn about features and workflows
- Check Model Providers: Explore different AI providers for your needs
- Configure Transformations: Set up custom content processing
- Explore API: Use the REST API for integrations
- Join Community: Connect with other users for tips and support
Advanced Configuration
For advanced users:
- Custom Prompts: Customize AI behavior with Jinja templates
- API Integration: Build custom applications using the REST API
- Multi-User Setup: Configure for team usage
- Backup Strategy: Set up automated backups
Performance Optimization
- Model Selection: Choose optimal models for your use case
- Caching: Configure appropriate cache settings
- Resource Limits: Tune Docker resource allocation
- Monitoring: Set up logging and monitoring
Welcome to Open Notebook! 🚀