Create, run, and share self-contained large language models (LLMs). Ollama bundles a model’s weights, configuration, prompts, and more into self-contained packages that run anywhere.
> Note: certain models that can be run with Ollama are intended for research and/or non-commercial use only.
> Note: Ollama is in early preview. Please report any issues you find.
## Features
- Download and run popular large language models
- Switch between multiple models on the fly
- Hardware acceleration where available (Metal, CUDA)
- Fast inference server written in Go, powered by [llama.cpp](https://github.com/ggerganov/llama.cpp)
- REST API to use with your application (Python and TypeScript SDKs coming soon); see the example request below this list
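Until the SDKs land, the REST API can be called directly. A minimal sketch with `curl`, assuming the server is running locally on its default `localhost:11434` address and exposes a `/api/generate` route:

```
# assumes the Ollama server is already running locally
curl http://localhost:11434/api/generate -d '{
  "model": "orca",
  "prompt": "Why is the sky blue?"
}'
```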
## Install
- [Download](https://ollama.ai/download) for macOS with Apple Silicon (Intel coming soon)
- Download for Windows and Linux (coming soon)
You can also build the [binary from source](#building).
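As a rough sketch of the source build (see [building](#building) for the authoritative steps), assuming a Go toolchain and a local checkout of the repository:

```
# from the root of the repository
go build .

# start the server, then run a model with the freshly built binary
./ollama serve &
./ollama run orca
```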
## Quickstart
Run a fast and simple model.
```
ollama run orca
```

Switch to another model on the fly:

```
ollama run llama2
>>> hi
Hello! How can I help you today?
```
## Example models
### 💬 Chat

Have a conversation.

```
ollama run vicuna "Why is the sky blue?"
```

### 🗺️ Instructions

Get a helping hand.

```
ollama run orca "Write an email to my boss."
```

### 🔎 Ask questions about documents

Send the contents of a document and ask questions about it.

```
ollama run nous-hermes "$(cat input.txt)", please summarize this story
```

### 📖 Storytelling

Venture into the unknown.

```
ollama run nous-hermes "Once upon a time"
```

### Creating a model

Create a `Modelfile`:

```
FROM llama2
PROMPT """
You are Mario from Super Mario Bros. Answer as Mario, the assistant, only.

User: {{ .Prompt }}
Mario:
"""
```

Next, create and run the model:

```
ollama create mario -f ./Modelfile
ollama run mario
>>> hi
Hello! It's your friend Mario.
```

## Model library

Ollama includes a library of open-source, pre-trained models. More models are coming soon.
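As a sketch of how a library model is used, assuming `orca` is one of the available names: pull the weights ahead of time, then run the model by name.

```
# download the model weights without starting a session
ollama pull orca

# chat with the downloaded model
ollama run orca
```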