[bug]: cuda_utils.so: undefined symbol: cuModuleGetFunction #8544

@VortexAcherontic

Description

Is there an existing issue for this problem?

  • I have searched the existing issues

Install method

Invoke's Launcher

Operating system

Linux

GPU vendor

Nvidia (CUDA)

GPU model

RTX 3080

GPU VRAM

10GB

Version number

v6.5.1

Browser

No response

System Information

Can't do this; the launcher does not have this button for me. Collected manually instead:

cat /etc/os-release 
NAME="openSUSE Tumbleweed"
# VERSION="20250901"
ID="opensuse-tumbleweed"
ID_LIKE="opensuse suse"
VERSION_ID="20250901"
PRETTY_NAME="openSUSE Tumbleweed"
ANSI_COLOR="0;32"
# CPE 2.3 format, boo#1217921
CPE_NAME="cpe:2.3:o:opensuse:tumbleweed:20250901:*:*:*:*:*:*:*"
#CPE 2.2 format
#CPE_NAME="cpe:/o:opensuse:tumbleweed:20250901"
BUG_REPORT_URL="https://bugzilla.opensuse.org"
SUPPORT_URL="https://bugs.opensuse.org"
HOME_URL="https://www.opensuse.org"
DOCUMENTATION_URL="https://en.opensuse.org/Portal:Tumbleweed"
LOGO="distributor-logo-Tumbleweed"
uname -r
6.16.3-1-default
nvidia-smi
+-----------------------------------------------------------------------------------------+
| NVIDIA-SMI 580.76.05              Driver Version: 580.76.05      CUDA Version: 13.0     |
+-----------------------------------------+------------------------+----------------------+

What happened

Since updating past InvokeAI 5.9.1 (the last working version), I can no longer use it. After launching, it returns:

Started Invoke process with PID 49381

Traceback (most recent call last):
  File "/home/vortexacherontic/distrobox_nvme/invoke_home/invokeai/.venv/lib/python3.12/site-packages/diffusers/utils/import_utils.py", line 820, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/vortexacherontic/distrobox/tumbleweed/.local/share/uv/python/cpython-3.12.9-linux-x86_64-gnu/lib/python3.12/importlib/__init__.py", line 90, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib._bootstrap>", line 1387, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1360, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1331, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 935, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 999, in exec_module
  File "<frozen importlib._bootstrap>", line 488, in _call_with_frames_removed
  File "/home/vortexacherontic/distrobox_nvme/invoke_home/invokeai/.venv/lib/python3.12/site-packages/diffusers/models/modeling_utils.py", line 40, in <module>
    from ..quantizers import DiffusersAutoQuantizer, DiffusersQuantizer
  File "/home/vortexacherontic/distrobox_nvme/invoke_home/invokeai/.venv/lib/python3.12/site-packages/diffusers/quantizers/__init__.py", line 15, in <module>
    from .auto import DiffusersAutoQuantizer
  File "/home/vortexacherontic/distrobox_nvme/invoke_home/invokeai/.venv/lib/python3.12/site-packages/diffusers/quantizers/auto.py", line 22, in <module>
    from .bitsandbytes import BnB4BitDiffusersQuantizer, BnB8BitDiffusersQuantizer
  File "/home/vortexacherontic/distrobox_nvme/invoke_home/invokeai/.venv/lib/python3.12/site-packages/diffusers/quantizers/bitsandbytes/__init__.py", line 2, in <module>
    from .utils import dequantize_and_replace, dequantize_bnb_weight, replace_with_bnb_linear
  File "/home/vortexacherontic/distrobox_nvme/invoke_home/invokeai/.venv/lib/python3.12/site-packages/diffusers/quantizers/bitsandbytes/utils.py", line 32, in <module>
    import bitsandbytes as bnb
  File "/home/vortexacherontic/distrobox_nvme/invoke_home/invokeai/.venv/lib/python3.12/site-packages/bitsandbytes/__init__.py", line 20, in <module>
    from .nn import modules
  File "/home/vortexacherontic/distrobox_nvme/invoke_home/invokeai/.venv/lib/python3.12/site-packages/bitsandbytes/nn/__init__.py", line 21, in <module>
    from .triton_based_modules import (
  File "/home/vortexacherontic/distrobox_nvme/invoke_home/invokeai/.venv/lib/python3.12/site-packages/bitsandbytes/nn/triton_based_modules.py", line 6, in <module>
    from bitsandbytes.triton.dequantize_rowwise import dequantize_rowwise
  File "/home/vortexacherontic/distrobox_nvme/invoke_home/invokeai/.venv/lib/python3.12/site-packages/bitsandbytes/triton/dequantize_rowwise.py", line 18, in <module>
    @triton.autotune(
     ^^^^^^^^^^^^^^^^
  File "/home/vortexacherontic/distrobox_nvme/invoke_home/invokeai/.venv/lib/python3.12/site-packages/triton/runtime/autotuner.py", line 378, in decorator
    return Autotuner(fn, fn.arg_names, configs, key, reset_to_zero, restore_value, pre_hook=pre_hook,
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/vortexacherontic/distrobox_nvme/invoke_home/invokeai/.venv/lib/python3.12/site-packages/triton/runtime/autotuner.py", line 130, in __init__
    self.do_bench = driver.active.get_benchmarker()
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/vortexacherontic/distrobox_nvme/invoke_home/invokeai/.venv/lib/python3.12/site-packages/triton/runtime/driver.py", line 23, in __getattr__
    self._initialize_obj()
  File "/home/vortexacherontic/distrobox_nvme/invoke_home/invokeai/.venv/lib/python3.12/site-packages/triton/runtime/driver.py", line 20, in _initialize_obj
    self._obj = self._init_fn()
                ^^^^^^^^^^^^^^^
  File "/home/vortexacherontic/distrobox_nvme/invoke_home/invokeai/.venv/lib/python3.12/site-packages/triton/runtime/driver.py", line 9, in _create_driver
    return actives[0]()
           ^^^^^^^^^^^^
  File "/home/vortexacherontic/distrobox_nvme/invoke_home/invokeai/.venv/lib/python3.12/site-packages/triton/backends/nvidia/driver.py", line 535, in __init__
    self.utils = CudaUtils()  # TODO: make static
                 ^^^^^^^^^^^
  File "/home/vortexacherontic/distrobox_nvme/invoke_home/invokeai/.venv/lib/python3.12/site-packages/triton/backends/nvidia/driver.py", line 89, in __init__
    mod = compile_module_from_src(Path(os.path.join(dirname, "driver.c")).read_text(), "cuda_utils")
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/vortexacherontic/distrobox_nvme/invoke_home/invokeai/.venv/lib/python3.12/site-packages/triton/backends/nvidia/driver.py", line 71, in compile_module_from_src
    mod = importlib.util.module_from_spec(spec)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ImportError: /home/vortexacherontic/distrobox/tumbleweed/.triton/cache/QLAEYTJR4KV5WSBGJKRUAKVP475DE47NW7P4XMI2RFXBOIE5TZ4Q/cuda_utils.so: undefined symbol: cuModuleGetFunction

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/vortexacherontic/distrobox/tumbleweed/invokeai/.venv/bin/invokeai-web", line 12, in <module>
    sys.exit(run_app())
             ^^^^^^^^^
  File "/home/vortexacherontic/distrobox_nvme/invoke_home/invokeai/.venv/lib/python3.12/site-packages/invokeai/app/run_app.py", line 35, in run_app
    from invokeai.app.invocations.baseinvocation import InvocationRegistry
  File "/home/vortexacherontic/distrobox_nvme/invoke_home/invokeai/.venv/lib/python3.12/site-packages/invokeai/app/invocations/baseinvocation.py", line 41, in <module>
    from invokeai.app.services.shared.invocation_context import InvocationContext
  File "/home/vortexacherontic/distrobox_nvme/invoke_home/invokeai/.venv/lib/python3.12/site-packages/invokeai/app/services/shared/invocation_context.py", line 18, in <module>
    from invokeai.app.services.model_records.model_records_base import UnknownModelException
  File "/home/vortexacherontic/distrobox_nvme/invoke_home/invokeai/.venv/lib/python3.12/site-packages/invokeai/app/services/model_records/__init__.py", line 3, in <module>
    from .model_records_base import (  # noqa F401
  File "/home/vortexacherontic/distrobox_nvme/invoke_home/invokeai/.venv/lib/python3.12/site-packages/invokeai/app/services/model_records/model_records_base.py", line 15, in <module>
    from invokeai.backend.model_manager.config import (
  File "/home/vortexacherontic/distrobox_nvme/invoke_home/invokeai/.venv/lib/python3.12/site-packages/invokeai/backend/model_manager/__init__.py", line 3, in <module>
    from invokeai.backend.model_manager.config import (
  File "/home/vortexacherontic/distrobox_nvme/invoke_home/invokeai/.venv/lib/python3.12/site-packages/invokeai/backend/model_manager/config.py", line 39, in <module>
    from invokeai.backend.model_manager.model_on_disk import ModelOnDisk
  File "/home/vortexacherontic/distrobox_nvme/invoke_home/invokeai/.venv/lib/python3.12/site-packages/invokeai/backend/model_manager/model_on_disk.py", line 11, in <module>
    from invokeai.backend.model_manager.taxonomy import ModelRepoVariant
  File "/home/vortexacherontic/distrobox_nvme/invoke_home/invokeai/.venv/lib/python3.12/site-packages/invokeai/backend/model_manager/taxonomy.py", line 7, in <module>
    from diffusers import ModelMixin
  File "<frozen importlib._bootstrap>", line 1412, in _handle_fromlist
  File "/home/vortexacherontic/distrobox_nvme/invoke_home/invokeai/.venv/lib/python3.12/site-packages/diffusers/utils/import_utils.py", line 811, in __getattr__
    value = getattr(module, name)
            ^^^^^^^^^^^^^^^^^^^^^
  File "/home/vortexacherontic/distrobox_nvme/invoke_home/invokeai/.venv/lib/python3.12/site-packages/diffusers/utils/import_utils.py", line 810, in __getattr__
    module = self._get_module(self._class_to_module[name])
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/vortexacherontic/distrobox_nvme/invoke_home/invokeai/.venv/lib/python3.12/site-packages/diffusers/utils/import_utils.py", line 822, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import diffusers.models.modeling_utils because of the following error (look up to see its traceback):
/home/vortexacherontic/distrobox/tumbleweed/.triton/cache/QLAEYTJR4KV5WSBGJKRUAKVP475DE47NW7P4XMI2RFXBOIE5TZ4Q/cuda_utils.so: undefined symbol: cuModuleGetFunction
Invoke process was terminated with signal 0, exit code 1
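
In case it helps with triage: cuModuleGetFunction is exported by the NVIDIA driver library (libcuda.so.1), so the obvious things to check inside the container are whether that library is visible at all and whether the Triton build is stale. A quick sketch (paths copied from the traceback above; I have not attached the output here):

# check that the driver library exporting cuModuleGetFunction is visible to the dynamic linker
ldconfig -p | grep libcuda
# check that it can be dlopen'd from the venv's Python
/home/vortexacherontic/distrobox_nvme/invoke_home/invokeai/.venv/bin/python -c "import ctypes; ctypes.CDLL('libcuda.so.1')"
# force Triton to rebuild cuda_utils.so instead of reusing a stale cached build
rm -rf /home/vortexacherontic/distrobox/tumbleweed/.triton/cache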

What you expected to happen

InvokeAI to launch.

How to reproduce the problem

  • Install InvokeAI
  • Run it via the launcher (equivalent direct command sketched below)
  • Crash
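
Running the venv entry point directly (the same one the launcher starts, path taken from the traceback above) should give the identical crash without the launcher involved:

/home/vortexacherontic/distrobox/tumbleweed/invokeai/.venv/bin/invokeai-web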

Additional context

I run InvokeAI inside a Tumbleweed distrobox container that shares the host's entire NVIDIA driver (created with the --nvidia flag). This was never a problem up to and including version 5.9.1.
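
For reference, the container was created roughly like this (a sketch from memory; the image reference and --home path are illustrative, the relevant part is the --nvidia flag):

distrobox create --name tumbleweed \
  --image registry.opensuse.org/opensuse/tumbleweed:latest \
  --home /home/vortexacherontic/distrobox/tumbleweed \
  --nvidia
distrobox enter tumbleweed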

  • I have used the repair function of the launcher
  • I updated the Launcher to the latest version (1.7.1)
  • I fully uninstalled InvokeAI and reinstalled it (i.e. nuking the entire distrobox container and its user directory)

I waited to report this to see whether a future release would fix it, but that did not happen, so I am finally opening this issue.

Discord username

No response
