Unverified commit 280da445 authored by Daniel Hiltgen, committed by GitHub

Merge pull request #2988 from dhiltgen/rocm_docs

Refined ROCm troubleshooting docs
parents 0cebc79c 69f02278
@@ -70,30 +70,36 @@ cat /proc/cpuinfo| grep flags | head -1
## AMD Radeon GPU Support

Ollama leverages the AMD ROCm library, which does not support all AMD GPUs. In
some cases you can force the system to try to use a similar LLVM target that is
close. For example, the Radeon RX 5400 is `gfx1034` (also known as 10.3.4);
however, ROCm does not currently support this target. The closest supported
target is `gfx1030`. You can use the environment variable
`HSA_OVERRIDE_GFX_VERSION` with `x.y.z` syntax. For example, to force the
system to run on the RX 5400, you would set
`HSA_OVERRIDE_GFX_VERSION="10.3.0"` as an environment variable for the server.
If you have an unsupported AMD GPU, you can experiment using the list of
supported types below.
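As a sketch of that mapping, a small hypothetical shell helper (not part of Ollama) can translate a decimal gfx target into the `x.y.z` string the variable expects:

```shell
#!/usr/bin/env bash
# Hypothetical helper: convert a decimal LLVM gfx target such as gfx1034
# into the x.y.z form used by HSA_OVERRIDE_GFX_VERSION.
# Note: this only handles all-decimal targets (gfx90a, for example, would not
# convert correctly).
gfx_to_version() {
  local t="${1#gfx}"               # e.g. "1034"
  local patch="${t: -1}"           # last digit     -> z
  local minor="${t: -2:1}"         # middle digit   -> y
  local major="${t:0:${#t}-2}"     # leading digits -> x
  echo "${major}.${minor}.${patch}"
}

gfx_to_version gfx1034    # prints 10.3.4

# Then point ROCm at the closest supported target before starting the server:
export HSA_OVERRIDE_GFX_VERSION="10.3.0"
```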
At this time, the known supported GPU types are the following LLVM targets.
This table shows some example GPUs that map to these LLVM targets:

| **LLVM Target** | **An Example GPU**    |
|-----------------|-----------------------|
| gfx900          | Radeon RX Vega 56     |
| gfx906          | Radeon Instinct MI50  |
| gfx908          | Radeon Instinct MI100 |
| gfx90a          | Radeon Instinct MI210 |
| gfx940          | Radeon Instinct MI300 |
| gfx941          |                       |
| gfx942          |                       |
| gfx1030         | Radeon PRO V620       |
| gfx1100         | Radeon PRO W7900      |
| gfx1101         | Radeon PRO W7700      |
| gfx1102         | Radeon RX 7600        |
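One way to check a detected target (for example, the one reported by the ROCm tools) against this list is a simple membership test; this is a sketch only, and `is_supported` is not an Ollama command:

```shell
#!/usr/bin/env bash
# Sketch: check a gfx target against the known-supported list above.
SUPPORTED="gfx900 gfx906 gfx908 gfx90a gfx940 gfx941 gfx942 gfx1030 gfx1100 gfx1101 gfx1102"

is_supported() {
  case " $SUPPORTED " in
    *" $1 "*) echo yes ;;
    *)        echo no  ;;
  esac
}

is_supported gfx1030    # prints yes
is_supported gfx1034    # prints no -> consider HSA_OVERRIDE_GFX_VERSION
```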
AMD is working on enhancing ROCm v6 to broaden coverage of GPU families in a
future release, which should expand support to more GPUs.
Reach out on [Discord](https://discord.gg/ollama) or file an
[issue](https://github.com/ollama/ollama/issues) for additional help.
## Installing older versions on Linux