OpenDAS / ollama · Commits

Commit 2e339c2b, authored Jul 18, 2023 by Jeffrey Morgan

flatten `examples`

parent 38f0c54c
Showing 5 changed files with 0 additions and 47 deletions.
examples/midjourneyprompter    +0 -0
examples/python/README.md      +0 -15
examples/python/main.py        +0 -32
examples/recipemaker           +0 -0
examples/tweetwriter           +0 -0
examples/modelfiles/midjourneyprompter → examples/midjourneyprompter
File moved
examples/python/README.md
deleted 100644 → 0
# Python

This is a simple example of calling the Ollama API from a Python app.

First, download a model:
```
curl -L https://huggingface.co/TheBloke/orca_mini_3B-GGML/resolve/main/orca-mini-3b.ggmlv3.q4_1.bin -o orca.bin
```
Then run it using the example script. You'll need to have Ollama running on your machine.
```
python3 main.py orca.bin
```
examples/python/main.py
deleted 100644 → 0
```
import http.client
import json
import os
import sys

if len(sys.argv) < 2:
    print("Usage: python main.py <model file>")
    sys.exit(1)

conn = http.client.HTTPConnection('localhost', 11434)

headers = {'Content-Type': 'application/json'}

# generate text from the model
conn.request("POST", "/api/generate", json.dumps({
    'model': os.path.join(os.getcwd(), sys.argv[1]),
    'prompt': 'write me a short story',
    'stream': True
}), headers)

response = conn.getresponse()

def parse_generate(data):
    for event in data.decode('utf-8').split("\n"):
        if not event:
            continue
        yield event

if response.status == 200:
    for chunk in response:
        for event in parse_generate(chunk):
            print(json.loads(event)['response'], end="", flush=True)
```
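A note on the streaming loop: `parse_generate` splits each decoded chunk on newlines, which assumes every chunk ends exactly at an event boundary; a JSON event that straddles two chunks would fail to parse. A minimal buffering variant is sketched below (the `iter_ndjson` helper name and the simulated chunks are my own illustration, not part of the commit):

```python
import json

def iter_ndjson(chunks):
    """Yield parsed JSON objects from an iterable of byte chunks,
    buffering partial lines that straddle chunk boundaries."""
    buf = b""
    for chunk in chunks:
        buf += chunk
        while b"\n" in buf:
            line, buf = buf.split(b"\n", 1)
            if line.strip():
                yield json.loads(line)
    if buf.strip():  # trailing object without a final newline
        yield json.loads(buf)

# Simulated response chunks: one JSON line is split across two chunks.
chunks = [b'{"response": "Once"}\n{"respo', b'nse": " upon"}\n']
print("".join(e["response"] for e in iter_ndjson(chunks)))  # → Once upon
```

The same generator could be fed directly from the `response` object above, since iterating an `http.client` response yields raw byte chunks.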
examples/modelfiles/recipemaker → examples/recipemaker
File moved
examples/modelfiles/tweetwriter → examples/tweetwriter
File moved