Unverified commit 1b272d5b authored by Patrick Devine, committed by GitHub

change `github.com/jmorganca/ollama` to `github.com/ollama/ollama` (#3347)

parent 29715dbc
@@ -113,7 +113,7 @@ FROM llama2
```
A list of available base models:
-<https://github.com/jmorganca/ollama#model-library>
+<https://github.com/ollama/ollama#model-library>
#### Build from a `bin` file
......
# OpenAI compatibility
-> **Note:** OpenAI compatibility is experimental and is subject to major adjustments including breaking changes. For fully-featured access to the Ollama API, see the Ollama [Python library](https://github.com/ollama/ollama-python), [JavaScript library](https://github.com/ollama/ollama-js) and [REST API](https://github.com/jmorganca/ollama/blob/main/docs/api.md).
+> **Note:** OpenAI compatibility is experimental and is subject to major adjustments including breaking changes. For fully-featured access to the Ollama API, see the Ollama [Python library](https://github.com/ollama/ollama-python), [JavaScript library](https://github.com/ollama/ollama-js) and [REST API](https://github.com/ollama/ollama/blob/main/docs/api.md).
Ollama provides experimental compatibility with parts of the [OpenAI API](https://platform.openai.com/docs/api-reference) to help connect existing applications to Ollama.
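As a rough illustration only (not part of this commit or of the docs it touches), a client can reach that compatibility layer with plain HTTP. The Go sketch below posts an OpenAI-style chat completions request to Ollama's `/v1/chat/completions` endpoint; the default address `http://localhost:11434`, the `llama2` model name, and the prompt are assumptions.

```go
// Minimal sketch: call Ollama's OpenAI-compatible chat completions endpoint.
// Assumes Ollama is running on its default address and "llama2" is pulled.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

func main() {
	reqBody, _ := json.Marshal(map[string]any{
		"model": "llama2",
		"messages": []map[string]string{
			{"role": "user", "content": "Say hello in one short sentence."},
		},
	})

	resp, err := http.Post("http://localhost:11434/v1/chat/completions",
		"application/json", bytes.NewReader(reqBody))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// Decode only the fields we need from the OpenAI-style response.
	var out struct {
		Choices []struct {
			Message struct {
				Content string `json:"content"`
			} `json:"message"`
		} `json:"choices"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		panic(err)
	}
	if len(out.Choices) > 0 {
		fmt.Println(out.Choices[0].Message.Content)
	}
}
```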
......
# PrivateGPT with Llama 2 uncensored
-https://github.com/jmorganca/ollama/assets/3325447/20cf8ec6-ff25-42c6-bdd8-9be594e3ce1b
+https://github.com/ollama/ollama/assets/3325447/20cf8ec6-ff25-42c6-bdd8-9be594e3ce1b
> Note: this example is a slightly modified version of PrivateGPT using models such as Llama 2 Uncensored. All credit for PrivateGPT goes to its creator, Iván Martínez; you can find his GitHub repo [here](https://github.com/imartinez/privateGPT).
......
@@ -28,7 +28,7 @@ You are Mario from Super Mario Bros, acting as an assistant.
What if you want to change its behaviour?
- Try changing the prompt
-- Try changing the parameters [Docs](https://github.com/jmorganca/ollama/blob/main/docs/modelfile.md)
+- Try changing the parameters [Docs](https://github.com/ollama/ollama/blob/main/docs/modelfile.md)
- Try changing the model (e.g. `FROM wizard-vicuna` uses the uncensored Wizard Vicuna model)
Once the changes are made,
......
# JSON Output Example
-![llmjson 2023-11-10 15_31_31](https://github.com/jmorganca/ollama/assets/633681/e599d986-9b4a-4118-81a4-4cfe7e22da25)
+![llmjson 2023-11-10 15_31_31](https://github.com/ollama/ollama/assets/633681/e599d986-9b4a-4118-81a4-4cfe7e22da25)
There are two Python scripts in this example: `randomaddresses.py` generates random addresses from different countries, and `predefinedschema.py` sets a template for the model to fill in.
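For comparison only (this sketch is not part of the example or of this commit), here is the same idea as `predefinedschema.py` in Go: request JSON output from Ollama's `/api/generate` endpoint and describe the desired fields in the prompt. The address, model name, and field names are illustrative assumptions.

```go
// Rough Go analogue of predefinedschema.py: ask Ollama's /api/generate
// endpoint for JSON output and describe the desired fields in the prompt.
// The address, model name, and field names here are illustrative assumptions.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

func main() {
	prompt := `Generate one random address as a JSON object with the fields
"street", "city", and "country". Respond with JSON only.`

	reqBody, _ := json.Marshal(map[string]any{
		"model":  "llama2",
		"prompt": prompt,
		"format": "json", // constrain the model to emit valid JSON
		"stream": false,
	})

	resp, err := http.Post("http://localhost:11434/api/generate",
		"application/json", bytes.NewReader(reqBody))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// With stream=false the model output arrives in a single "response" field.
	var out struct {
		Response string `json:"response"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		panic(err)
	}
	fmt.Println(out.Response)
}
```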
......
# Log Analysis example
-![loganalyzer 2023-11-10 08_53_29](https://github.com/jmorganca/ollama/assets/633681/ad30f1fc-321f-4953-8914-e30e24db9921)
+![loganalyzer 2023-11-10 08_53_29](https://github.com/ollama/ollama/assets/633681/ad30f1fc-321f-4953-8914-e30e24db9921)
This example shows one possible way to create a log file analyzer. It uses the model **mattw/loganalyzer** which is based on **codebooga**, a 34b parameter model.
......
# Function calling
-![function calling 2023-11-16 16_12_58](https://github.com/jmorganca/ollama/assets/633681/a0acc247-9746-45ab-b325-b65dfbbee4fb)
+![function calling 2023-11-16 16_12_58](https://github.com/ollama/ollama/assets/633681/a0acc247-9746-45ab-b325-b65dfbbee4fb)
One of the features added to some models is 'function calling'. The name is a bit confusing: it does not mean the model can call functions itself. It means the model's output is formatted as JSON that follows a preconfigured schema and uses the expected types, so your own code can take that output and call functions with it. Using the JSON format in Ollama, you can use any model for function calling.
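To make the "your code calls the functions" half concrete, here is a small self-contained Go sketch (not from this commit or the example): it parses model output that follows a simple assumed schema and dispatches to an ordinary Go function. The JSON string stands in for real model output, and `getWeather` is hypothetical.

```go
// Sketch of the application side of "function calling": parse the model's
// JSON output and dispatch to an ordinary Go function.
package main

import (
	"encoding/json"
	"fmt"
)

type functionCall struct {
	Function  string            `json:"function"`
	Arguments map[string]string `json:"arguments"`
}

// getWeather is a stand-in for a real lookup the application would perform.
func getWeather(city string) string {
	return "It is sunny in " + city
}

func main() {
	// modelOutput stands in for JSON a model returned under format=json.
	modelOutput := `{"function": "get_weather", "arguments": {"city": "Toronto"}}`

	var call functionCall
	if err := json.Unmarshal([]byte(modelOutput), &call); err != nil {
		panic(err)
	}

	// The model never executes anything; our code maps its JSON onto functions.
	switch call.Function {
	case "get_weather":
		fmt.Println(getWeather(call.Arguments["city"]))
	default:
		fmt.Println("unknown function:", call.Function)
	}
}
```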
......
@@ -8,7 +8,7 @@ import (
"testing"
"time"
-"github.com/jmorganca/ollama/api"
+"github.com/ollama/ollama/api"
)
func TestOrcaMiniBlueSky(t *testing.T) {
......
@@ -9,7 +9,7 @@ import (
"testing"
"time"
-"github.com/jmorganca/ollama/api"
+"github.com/ollama/ollama/api"
"github.com/stretchr/testify/require"
)
......
@@ -9,7 +9,7 @@ import (
"testing"
"time"
-"github.com/jmorganca/ollama/api"
+"github.com/ollama/ollama/api"
)
// TODO - this would ideally be in the llm package, but that would require some refactoring of interfaces in the server
......
@@ -21,8 +21,8 @@ import (
"testing"
"time"
-"github.com/jmorganca/ollama/api"
-"github.com/jmorganca/ollama/app/lifecycle"
+"github.com/ollama/ollama/api"
+"github.com/ollama/ollama/app/lifecycle"
"github.com/stretchr/testify/assert"
)
......
@@ -33,8 +33,8 @@ import (
"time"
"unsafe"
-"github.com/jmorganca/ollama/api"
-"github.com/jmorganca/ollama/gpu"
+"github.com/ollama/ollama/api"
+"github.com/ollama/ollama/gpu"
)
type dynExtServer struct {
......
@@ -15,7 +15,7 @@ import (
"github.com/pdevine/tensor/native"
"github.com/x448/float16"
-"github.com/jmorganca/ollama/format"
+"github.com/ollama/ollama/format"
)
type ContainerGGUF struct {
......
@@ -5,7 +5,7 @@ import (
"fmt"
"time"
-"github.com/jmorganca/ollama/api"
+"github.com/ollama/ollama/api"
)
const jsonGrammar = `
......
@@ -8,8 +8,8 @@ import (
"runtime"
"slices"
-"github.com/jmorganca/ollama/api"
-"github.com/jmorganca/ollama/gpu"
+"github.com/ollama/ollama/api"
+"github.com/ollama/ollama/gpu"
)
type LLM interface {
......
@@ -16,7 +16,7 @@ import (
"golang.org/x/exp/slices"
"golang.org/x/sync/errgroup"
-"github.com/jmorganca/ollama/gpu"
+"github.com/ollama/ollama/gpu"
)
// Library names may contain an optional variant separated by '_'
......
@@ -3,7 +3,7 @@ package llm
import (
"testing"
-"github.com/jmorganca/ollama/gpu"
+"github.com/ollama/ollama/gpu"
"github.com/stretchr/testify/assert"
)
......
@@ -3,7 +3,7 @@ package main
import (
"context"
-"github.com/jmorganca/ollama/cmd"
+"github.com/ollama/ollama/cmd"
"github.com/spf13/cobra"
)
......
@@ -11,7 +11,7 @@ import (
"time"
"github.com/gin-gonic/gin"
-"github.com/jmorganca/ollama/api"
+"github.com/ollama/ollama/api"
)
type Error struct {
......