Unverified Commit f61f3402 authored by Matt Williams, committed by GitHub

FAQ: answer a few faq questions (#1128)



* faq: does ollama share my prompts
Signed-off-by: Matt Williams <m@technovangelist.com>

* faq: ollama and openai
Signed-off-by: Matt Williams <m@technovangelist.com>

* faq: vscode plugins
Signed-off-by: Matt Williams <m@technovangelist.com>

* faq: send a doc to Ollama
Signed-off-by: Matt Williams <m@technovangelist.com>

* extra spacing
Signed-off-by: Matt Williams <m@technovangelist.com>

* Update faq.md

* Update faq.md

---------
Signed-off-by: Matt Williams <m@technovangelist.com>
Co-authored-by: Michael <mchiang0610@users.noreply.github.com>
parent 686f85d6
# FAQ
- [How can I view the logs?](#how-can-i-view-the-logs)
- [How can I expose Ollama on my network?](#how-can-i-expose-ollama-on-my-network)
- [How can I allow additional web origins to access Ollama?](#how-can-i-allow-additional-web-origins-to-access-ollama)
- [Where are models stored?](#where-are-models-stored)
## How can I view the logs?
On macOS:
@@ -74,8 +79,6 @@ systemctl restart ollama
- macOS: Raw model data is stored under `~/.ollama/models`.
- Linux: Raw model data is stored under `/usr/share/ollama/.ollama/models`
Below the models directory you will find a structure similar to the following:
```shell
@@ -96,3 +99,11 @@ The manifest lists all the layers used in this model. You will see a `media type
### How can I change where Ollama stores models?
To modify where models are stored, you can use the `OLLAMA_MODELS` environment variable. Note that on Linux this means defining `OLLAMA_MODELS` in a drop-in `/etc/systemd/system/ollama.service.d` service file, reloading systemd, and restarting the ollama service.
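For the Linux case, a minimal sketch of the drop-in approach might look like the following; the file name `override.conf` and the `/data/ollama/models` path are illustrative placeholders, not values taken from the docs:

```shell
# Create a drop-in directory and an override file for the ollama service.
# (override.conf and /data/ollama/models are example names/paths.)
sudo mkdir -p /etc/systemd/system/ollama.service.d
cat <<'EOF' | sudo tee /etc/systemd/system/ollama.service.d/override.conf
[Service]
Environment="OLLAMA_MODELS=/data/ollama/models"
EOF

# Reload systemd so it picks up the drop-in, then restart the service.
sudo systemctl daemon-reload
sudo systemctl restart ollama
```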
## Does Ollama send my prompts and answers back to Ollama.ai to use in any way?
No. Anything you do with Ollama, such as generating a response from the model, stays with you. We don't collect any data about how you use the model. You are always in control of your own data.
## How can I use Ollama in VSCode to help me code?
There is already a large collection of plugins available for VSCode, as well as other editors, that leverage Ollama. You can see the list of plugins at the bottom of the main repository README.