Compilation Error: 'OrtLoraAdapter' Does Not Name a Type in ONNX Runtime Nightly Build for aarch64 Linux with CUDA #1031

Bahar-BM opened this issue Nov 4, 2024 · 2 comments

Bahar-BM commented Nov 4, 2024

Describe the bug
Hello, I'm attempting to build the nightly version of onnxruntime-genai for aarch64 Linux with CUDA enabled, but I encounter the following error:

In file included from /mnt/ssd_drive/repos/onnxruntime-genai/src/generators.h:35,
                 from /mnt/ssd_drive/repos/onnxruntime-genai/src/cuda/beam_search_scorer_cuda.cpp:1:
/mnt/ssd_drive/repos/onnxruntime-genai/src/models/onnxruntime_api.h:519:45: error: ‘OrtLoraAdapter’ does not name a type
  519 |   OrtRunOptions& AddActiveLoraAdapter(const OrtLoraAdapter& adapter);  ///< Wraps OrtApi::RunOptionsSetActiveLoraAdapter

To Reproduce
Steps to reproduce the behavior:
Build the project using build.py with the following command:

python3 build.py --config Release --ort_home examples/c --use_cuda --cuda_home=/mnt/ssd_drive/cuda_lib/cuda-12.3

P.S. I have previously built ONNX Runtime 1.19.2 for aarch64 Linux with CUDA enabled from source code and placed the required include files and libraries in the examples/c directory.
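A rough sketch of that staging step is below (a sketch only: the onnxruntime checkout and build paths are hypothetical, not taken from this report, and it assumes a shared-library CUDA build of 1.19.2):

# Assumed staging of a locally built onnxruntime 1.19.2 (CUDA) into examples/c.
# ORT_SRC and ORT_BUILD are hypothetical paths for the onnxruntime checkout and
# its build output; adjust them to the actual locations.
ORT_SRC=/path/to/onnxruntime
ORT_BUILD=$ORT_SRC/build/Linux/Release
mkdir -p examples/c/include examples/c/lib
cp "$ORT_SRC"/include/onnxruntime/core/session/*.h examples/c/include/
cp "$ORT_BUILD"/libonnxruntime*.so* examples/c/lib/   # also matches libonnxruntime_providers_cuda.so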

Desktop (please complete the following information):

  • OS: aarch64 Ubuntu
  • Version: 20.04
baijumeswani (Contributor) commented

To build onnxruntime-genai from source on main, you need the latest onnxruntime headers. You can get the 1.20 headers from here: https://github.com/microsoft/onnxruntime/releases/download/v1.20.0/onnxruntime-linux-aarch64-1.20.0.tgz.

You can then put the headers inside examples/c/include and try building again with the same command. Alternatively, you could skip the --ort_home argument, and onnxruntime-genai will try to download the necessary dependencies by itself.
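
For reference, a minimal sketch of that suggestion (the repository and CUDA paths are copied from the log and command above; the release tarball layout is an assumption based on how onnxruntime release archives are normally packaged):

# Fetch the onnxruntime 1.20.0 aarch64 release and copy its headers into
# examples/c/include, then rebuild with the same command as before.
cd /mnt/ssd_drive/repos/onnxruntime-genai   # repo path as it appears in the error log
wget https://github.com/microsoft/onnxruntime/releases/download/v1.20.0/onnxruntime-linux-aarch64-1.20.0.tgz
tar -xzf onnxruntime-linux-aarch64-1.20.0.tgz
cp -r onnxruntime-linux-aarch64-1.20.0/include/* examples/c/include/
python3 build.py --config Release --ort_home examples/c --use_cuda --cuda_home=/mnt/ssd_drive/cuda_lib/cuda-12.3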

Bahar-BM (Author) commented Nov 4, 2024

Thank you for the quick response, that's good to know! You might want to update the README in examples/c, as the sample there uses ONNX Runtime 1.19.2 and instructs users to build onnxruntime-genai from source.

By the way, do you have any plans to release onnxruntime-genai artifacts for aarch64 Linux with CUDA enabled?
