Hello,
I have Ollama configured, and it works:
```nix
{
  services.ollama.enable = true;
  services.ollama.acceleration = "rocm";
  services.ollama.package = pkgs.unstable.ollama;
}
```
When generating text, CPU usage climbs to 100%, but the GPU stays at low usage.
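To confirm the GPU really is idle during generation, I watch it with `rocm-smi`. A minimal sketch of how I expose it, assuming the `pkgs.rocmPackages.rocm-smi` attribute (this is not part of my actual configuration above):

```nix
{ pkgs, ... }:
{
  # Hypothetical addition for debugging only: rocm-smi reports per-GPU
  # utilisation, so it can be run in a loop while Ollama is generating.
  environment.systemPackages = [ pkgs.rocmPackages.rocm-smi ];
}
```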
When looking at the log, it always contains `no compatible amdgpu devices detected`. Full `journalctl` output:
```
nov. 11 15:17:21 framework-16 systemd[1]: Started Server for local large language models.
nov. 11 15:17:21 framework-16 ollama[849149]: 2024/11/11 15:17:21 routes.go:1153: INFO server config env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_DEBUG:false OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://127.0.0.1:11434 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:5m0s OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/var/lib/ollama/models OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:0 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://*] OLLAMA_SCHED_SPREAD:false OLLAMA_TMPDIR: ROCR_VISIBLE_DEVICES: http_proxy: https_proxy: no_proxy:]"
nov. 11 15:17:21 framework-16 ollama[849149]: time=2024-11-11T15:17:21.224+01:00 level=INFO source=images.go:753 msg="total blobs: 265"
nov. 11 15:17:21 framework-16 ollama[849149]: time=2024-11-11T15:17:21.239+01:00 level=INFO source=images.go:760 msg="total unused blobs removed: 0"
nov. 11 15:17:21 framework-16 ollama[849149]: time=2024-11-11T15:17:21.241+01:00 level=INFO source=routes.go:1200 msg="Listening on 127.0.0.1:11434 (version 0.3.12)"
nov. 11 15:17:21 framework-16 ollama[849149]: time=2024-11-11T15:17:21.242+01:00 level=INFO source=common.go:135 msg="extracting embedded files" dir=/tmp/ollama443964246/runners
nov. 11 15:17:23 framework-16 ollama[849149]: time=2024-11-11T15:17:23.188+01:00 level=INFO source=common.go:49 msg="Dynamic LLM libraries" runners="[cpu_avx cpu_avx2 rocm cpu]"
nov. 11 15:17:23 framework-16 ollama[849149]: time=2024-11-11T15:17:23.188+01:00 level=INFO source=gpu.go:199 msg="looking for compatible GPUs"
nov. 11 15:17:23 framework-16 ollama[849149]: time=2024-11-11T15:17:23.189+01:00 level=WARN source=gpu.go:669 msg="unable to locate gpu dependency libraries"
nov. 11 15:17:23 framework-16 ollama[849149]: time=2024-11-11T15:17:23.189+01:00 level=WARN source=gpu.go:669 msg="unable to locate gpu dependency libraries"
nov. 11 15:17:23 framework-16 ollama[849149]: time=2024-11-11T15:17:23.189+01:00 level=WARN source=gpu.go:669 msg="unable to locate gpu dependency libraries"
nov. 11 15:17:23 framework-16 ollama[849149]: time=2024-11-11T15:17:23.189+01:00 level=WARN source=gpu.go:669 msg="unable to locate gpu dependency libraries"
nov. 11 15:17:23 framework-16 ollama[849149]: time=2024-11-11T15:17:23.189+01:00 level=WARN source=amd_linux.go:60 msg="ollama recommends running the https://www.amd.com/en/support/linux-drivers" error="amdgpu version file missing: /sys/module/amdgpu/version stat /sys/module/amdgpu/version: no such file or directory"
nov. 11 15:17:23 framework-16 ollama[849149]: time=2024-11-11T15:17:23.190+01:00 level=WARN source=amd_linux.go:341 msg="amdgpu is not supported" gpu=0 gpu_type=gfx1102 library=/nix/store/61gkm5h3p829msla66xigl8wm93vynqf-rocm-path/lib supported_types=[]
nov. 11 15:17:23 framework-16 ollama[849149]: time=2024-11-11T15:17:23.190+01:00 level=WARN source=amd_linux.go:343 msg="See https://github.com/ollama/ollama/blob/main/docs/gpu.md#overrides for HSA_OVERRIDE_GFX_VERSION usage"
nov. 11 15:17:23 framework-16 ollama[849149]: time=2024-11-11T15:17:23.190+01:00 level=INFO source=amd_linux.go:275 msg="unsupported Radeon iGPU detected skipping" id=1 total="512.0 MiB"
nov. 11 15:17:23 framework-16 ollama[849149]: time=2024-11-11T15:17:23.190+01:00 level=INFO source=amd_linux.go:361 msg="no compatible amdgpu devices detected"
nov. 11 15:17:23 framework-16 ollama[849149]: time=2024-11-11T15:17:23.190+01:00 level=INFO source=gpu.go:347 msg="no compatible GPUs were discovered"
nov. 11 15:17:23 framework-16 ollama[849149]: time=2024-11-11T15:17:23.190+01:00 level=INFO source=types.go:107 msg="inference compute" id=0 library=cpu variant=avx2 compute="" driver=0.0 name="" total="30.7 GiB" available="15.7 GiB"
```
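As I read it, the decisive lines are the `gpu_type=gfx1102` / `supported_types=[]` pair: the dGPU is detected, but the bundled ROCm runner does not list its gfx target as supported. To double-check which target the card actually reports, `rocminfo` can be used; a minimal sketch, again assuming the `pkgs.rocmPackages.rocminfo` attribute (not part of my real configuration):

```nix
{ pkgs, ... }:
{
  # Hypothetical debugging helper: rocminfo lists each HSA agent with its
  # gfx name (e.g. "Name: gfx1102"), which should match the gpu_type value
  # Ollama logs above.
  environment.systemPackages = [ pkgs.rocmPackages.rocminfo ];
}
```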
I tried to add `HSA_OVERRIDE_GFX_VERSION` to the configuration, as the documentation mentions:

```nix
{
  services.ollama.environmentVariables = {
    HSA_OVERRIDE_GFX_VERSION = "11.0.2";
    HSA_OVERRIDE_GFX_VERSION_1 = "11.0.2";
    HSA_OVERRIDE_GFX_VERSION_2 = "11.0.2";
    HSA_OVERRIDE_GFX_VERSION_3 = "11.0.3";
    ROCR_VISIBLE_DEVICES = "GPU-XX";
  };
}
```
But it still ends with `no compatible amdgpu devices detected`.
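For reference, here is the variant I plan to try next. It is only a sketch resting on three assumptions: that gfx1102 falls back cleanly onto the gfx1100 kernels (hence `11.0.0`, an override commonly suggested for RDNA3 cards), that the per-device variables are zero-indexed (so device 0, the dGPU in the log, needs `HSA_OVERRIDE_GFX_VERSION_0`), and that `ROCR_VISIBLE_DEVICES = "GPU-XX"` above is just the placeholder ID from the docs and should be dropped:

```nix
{
  services.ollama.environmentVariables = {
    # Assumption: map the gfx1102 dGPU onto the gfx1100 (11.0.0) kernels.
    HSA_OVERRIDE_GFX_VERSION = "11.0.0";
    # Assumption: per-device overrides are zero-indexed; device 0 is the dGPU.
    HSA_OVERRIDE_GFX_VERSION_0 = "11.0.0";
    # ROCR_VISIBLE_DEVICES deliberately left unset: "GPU-XX" looks like a
    # documentation placeholder, not a real device ID, and an invalid ID
    # would hide every GPU from ROCm.
  };
}
```

Newer nixpkgs revisions also appear to expose a `services.ollama.rocmOverrideGfx` option that sets this variable for you, but I have not confirmed it exists in the 24.05 module.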
Question

How can I use the AMD GPU with Ollama on a Framework 16?
Additional information

```
$ ollama --version
ollama version is 0.3.12
```

nix-info:

- system: `"x86_64-linux"`
- host os: `Linux 6.6.59, NixOS, 24.05 (Uakari), 24.05.20241107.83fb6c0`
- multi-user?: `yes`
- sandbox: `yes`
- version: `nix-env (Nix) 2.24.8`
- nixpkgs: `/nix/store/l2v78vdd33hvyyy33w9zih2ss60k2yms-source`