---
title: Introduction
---

Ollama's API allows you to run and interact with models programmatically.

## Get started

If you're just getting started, follow the [quickstart](/quickstart) documentation to get up and running with Ollama's API.
## Base URL
After installation, Ollama's API is served by default at:

```
http://localhost:11434/api
```

For running cloud models on **ollama.com**, the same API is available with the following base URL:

```
https://ollama.com/api
```

## Example request
Once Ollama is running, its API is automatically available and can be accessed via `curl`:

```shell
curl http://localhost:11434/api/generate -d '{
  "model": "gemma3",
  "prompt": "Why is the sky blue?"
}'
```
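
The same request can be sketched with Python's standard library. Only the base URL changes between a local server and **ollama.com**; the request shape is identical. Note that `build_generate_request` is a hypothetical helper for illustration, not part of any Ollama library.

```python
import json
import urllib.request


def build_generate_request(base_url, model, prompt):
    # Hypothetical helper: builds a POST request for the /generate endpoint.
    payload = json.dumps({"model": model, "prompt": prompt}).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# Local server and cloud differ only in the base URL:
req = build_generate_request(
    "http://localhost:11434/api", "gemma3", "Why is the sky blue?"
)
print(req.full_url)  # http://localhost:11434/api/generate

# Sending the request requires a running Ollama server:
# with urllib.request.urlopen(req) as resp:
#     for line in resp:
#         print(json.loads(line))
```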

## Libraries
Ollama has official libraries for Python and JavaScript:
- [Python](https://github.com/ollama/ollama-python)
- [JavaScript](https://github.com/ollama/ollama-js)
Several community-maintained libraries are available for Ollama. For a full list, see the [Ollama GitHub repository](https://github.com/ollama/ollama?tab=readme-ov-file#libraries-1).
## Versioning

Ollama's API isn't strictly versioned, but it is expected to remain stable and backwards compatible. Deprecations are rare and will be announced in the [release notes](https://github.com/ollama/ollama/releases).