mirror of https://github.com/ollama/ollama.git

docs: improve linux install documentation (#6683)

Includes small improvements to document layout and code blocks

This commit is contained in:
parent da915345d1
commit 108fb6c1d1

docs/linux.md | 103
@@ -1,43 +1,57 @@
-# Ollama on Linux
+# Linux

## Install

-Install Ollama running this one-liner:
+To install Ollama, run the following command:

->
-
-```bash
+```shell
curl -fsSL https://ollama.com/install.sh | sh
```

-## AMD Radeon GPU support
-
-While AMD has contributed the `amdgpu` driver upstream to the official linux
-kernel source, the version is older and may not support all ROCm features. We
-recommend you install the latest driver from
-https://www.amd.com/en/support/linux-drivers for best support of your Radeon
-GPU.
-
## Manual install

-### Download `ollama`
+Download and extract the package:

-Download and extract the Linux package:
-
-```bash
-curl -fsSL https://ollama.com/download/ollama-linux-amd64.tgz | sudo tar zx -C /usr
+```shell
+curl -L https://ollama.com/download/ollama-linux-amd64.tgz -o ollama-linux-amd64.tgz
+sudo tar -C /usr -xzf ollama-linux-amd64.tgz
```

-If you have an AMD GPU, also download and extract the ROCm package into the same location
-```bash
-curl -fsSL https://ollama.com/download/ollama-linux-amd64-rocm.tgz | sudo tar zx -C /usr
+Start Ollama:
+
+```shell
+ollama serve
```

+In another terminal, verify that Ollama is running:
+
+```shell
+ollama -v
+```
+
+### AMD GPU install
+
+If you have an AMD GPU, also download and extract the additional ROCm package:
+
+```shell
+curl -L https://ollama.com/download/ollama-linux-amd64-rocm.tgz -o ollama-linux-amd64-rocm.tgz
+sudo tar -C /usr -xzf ollama-linux-amd64-rocm.tgz
+```
+
+### ARM64 install
+
+Download and extract the ARM64-specific package:
+
+```shell
+curl -L https://ollama.com/download/ollama-linux-arm64.tgz -o ollama-linux-arm64.tgz
+sudo tar -C /usr -xzf ollama-linux-arm64.tgz
+```
+
### Adding Ollama as a startup service (recommended)

Create a user and group for Ollama:

-```bash
+```shell
sudo useradd -r -s /bin/false -U -m -d /usr/share/ollama ollama
sudo usermod -a -G ollama $(whoami)
```
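The step that actually writes the service file sits in the unchanged lines between this hunk and the next one, whose header picks up at `WantedBy=default.target`. For orientation only, a minimal unit written via a shell heredoc might look roughly like the sketch below; the exact file contents are not part of this diff and are assumed here.

```shell
# Assumed sketch of /etc/systemd/system/ollama.service -- not taken from this diff.
sudo tee /etc/systemd/system/ollama.service >/dev/null <<'EOF'
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3

[Install]
WantedBy=default.target
EOF
```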
@@ -63,47 +77,54 @@ WantedBy=default.target

Then start the service:

-```bash
+```shell
sudo systemctl daemon-reload
sudo systemctl enable ollama
```

-### Install CUDA drivers (optional – for Nvidia GPUs)
+### Install CUDA drivers (optional)

[Download and install](https://developer.nvidia.com/cuda-downloads) CUDA.

Verify that the drivers are installed by running the following command, which should print details about your GPU:

-```bash
+```shell
nvidia-smi
```

-### Install ROCm (optional - for Radeon GPUs)
-[Download and Install](https://rocm.docs.amd.com/projects/install-on-linux/en/latest/tutorial/quick-start.html)
+### Install AMD ROCm drivers (optional)

-Make sure to install ROCm v6
+[Download and Install](https://rocm.docs.amd.com/projects/install-on-linux/en/latest/tutorial/quick-start.html) ROCm v6.

### Start Ollama

Start Ollama and verify it is running:

-```bash
+```shell
sudo systemctl start ollama
sudo systemctl status ollama
```

-## Update
+> [!NOTE]
+> While AMD has contributed the `amdgpu` driver upstream to the official linux
+> kernel source, the version is older and may not support all ROCm features. We
+> recommend you install the latest driver from
+> https://www.amd.com/en/support/linux-drivers for best support of your Radeon
+> GPU.

-Update ollama by running the install script again:
+## Updating

-```bash
+Update Ollama by running the install script again:
+
+```shell
curl -fsSL https://ollama.com/install.sh | sh
```

-Or by downloading the ollama binary:
+Or by re-downloading Ollama:

-```bash
-curl -fsSL https://ollama.com/download/ollama-linux-amd64.tgz | sudo tar zx -C /usr
+```shell
+curl -L https://ollama.com/download/ollama-linux-amd64.tgz -o ollama-linux-amd64.tgz
+sudo tar -C /usr -xzf ollama-linux-amd64.tgz
```

## Installing specific versions
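Once the service has been started with `systemctl start ollama` as shown in the hunk above, the server can also be probed over its local HTTP API. The address below is Ollama's usual default (127.0.0.1:11434) and is an assumption here, not something introduced by this commit.

```shell
# Quick liveness check against the local API (assumes the default listen address).
curl -s http://127.0.0.1:11434/api/version
```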
@@ -112,15 +133,15 @@ Use `OLLAMA_VERSION` environment variable with the install script to install a s

For example:

-```
-curl -fsSL https://ollama.com/install.sh | OLLAMA_VERSION=0.1.32 sh
+```shell
+curl -fsSL https://ollama.com/install.sh | OLLAMA_VERSION=0.3.9 sh
```

## Viewing logs

To view logs of Ollama running as a startup service, run:

-```bash
+```shell
journalctl -e -u ollama
```
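The logging hunk above uses `journalctl -e -u ollama` to jump to the end of the log. To watch the log live instead, for example while reproducing a startup problem, journalctl's standard follow flag works as well; this is not part of the diff.

```shell
# Stream new log lines from the ollama unit as they arrive (Ctrl-C to stop).
journalctl -u ollama -f
```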
@@ -128,7 +149,7 @@ journalctl -e -u ollama

Remove the ollama service:

-```bash
+```shell
sudo systemctl stop ollama
sudo systemctl disable ollama
sudo rm /etc/systemd/system/ollama.service
@@ -136,13 +157,13 @@ sudo rm /etc/systemd/system/ollama.service

Remove the ollama binary from your bin directory (either `/usr/local/bin`, `/usr/bin`, or `/bin`):

-```bash
+```shell
sudo rm $(which ollama)
```

Remove the downloaded models and Ollama service user and group:

-```bash
+```shell
sudo rm -r /usr/share/ollama
sudo userdel ollama
sudo groupdel ollama
```