OpenDAS / ollama · Commit 280da445 (unverified)

Merge pull request #2988 from dhiltgen/rocm_docs

Refined ROCm troubleshooting docs

Authored by Daniel Hiltgen on Mar 08, 2024; committed by GitHub on Mar 08, 2024
Parents: 0cebc79c, 69f02278
Showing 1 changed file with 30 additions and 24 deletions: docs/troubleshooting.md (+30 / -24)
docs/troubleshooting.md @ 280da445

@@ -70,30 +70,36 @@ cat /proc/cpuinfo| grep flags | head -1

## AMD Radeon GPU Support

Ollama leverages the AMD ROCm library, which does not support all AMD GPUs. In
some cases you can force the system to try to use a similar LLVM target that is
close. For example, the Radeon RX 5400 is `gfx1034` (also known as 10.3.4);
however, ROCm does not currently support this target. The closest supported
target is `gfx1030`. You can use the environment variable
`HSA_OVERRIDE_GFX_VERSION` with `x.y.z` syntax. So, for example, to force the
system to run on the RX 5400, you would set `HSA_OVERRIDE_GFX_VERSION="10.3.0"`
as an environment variable for the server. If you have an unsupported AMD GPU,
you can experiment using the list of supported types below.
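For illustration, here is a minimal sketch of two common ways to set this variable on Linux. It assumes the RX 5400 example above (mapping `gfx1034` to `gfx1030`) and, for the second variant, that Ollama was installed with the official Linux install script, which sets up a systemd unit named `ollama.service`; adjust the value and unit name for your setup.

```shell
# Option 1: run the server directly with the override in its environment
HSA_OVERRIDE_GFX_VERSION="10.3.0" ollama serve

# Option 2: persist the override for the systemd-managed server
# (assumes the ollama.service unit created by the Linux installer)
sudo systemctl edit ollama.service
# then add the following to the drop-in file that opens:
#   [Service]
#   Environment="HSA_OVERRIDE_GFX_VERSION=10.3.0"
sudo systemctl restart ollama.service
```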
At this time, the known supported GPU types are the following LLVM targets.
This table shows some example GPUs that map to these LLVM targets:

| **LLVM Target** | **An Example GPU**    |
|-----------------|-----------------------|
| gfx900          | Radeon RX Vega 56     |
| gfx906          | Radeon Instinct MI50  |
| gfx908          | Radeon Instinct MI100 |
| gfx90a          | Radeon Instinct MI210 |
| gfx940          | Radeon Instinct MI300 |
| gfx941          |                       |
| gfx942          |                       |
| gfx1030         | Radeon PRO V620       |
| gfx1100         | Radeon PRO W7900      |
| gfx1101         | Radeon PRO W7700      |
| gfx1102         | Radeon RX 7600        |
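If you are unsure which LLVM target your card reports, ROCm's `rocminfo` utility will print it. A minimal sketch, assuming ROCm is installed in its default `/opt/rocm` location (the exact output layout can vary between ROCm releases):

```shell
# List the gfx target(s) of the GPUs visible to ROCm
/opt/rocm/bin/rocminfo | grep -i "gfx"
# A GPU agent typically shows a line such as:
#   Name:                    gfx1034
```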
AMD is working on enhancing ROCm v6 to broaden support for more GPU families in
a future release, which should extend coverage to additional GPUs.

Reach out on [Discord](https://discord.gg/ollama) or file an
[issue](https://github.com/ollama/ollama/issues) for additional help.
## Installing older versions on Linux

...