
bug: Could not load engine llama-cpp on Win10 #1879

Open
2 of 7 tasks
TC117 opened this issue Jan 21, 2025 · 0 comments
Labels
P1: important Important feature / fix type: bug Something isn't working

Comments


TC117 commented Jan 21, 2025

Cortex version

1.0.8

Describe the issue and expected behaviour

  • Install Cortex
  • Run cortex run tinyllama

(Screenshot attached)

logs.zip

Could not load engines
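For context, the failing flow can be sketched as a minimal shell session. The first command is taken from the report; the diagnostic steps below it are an assumption based on the standard Cortex CLI engine subcommands and are not part of the original report:

```shell
# Reported flow (Cortex 1.0.8 on Windows 10):
cortex run tinyllama               # fails with "Could not load engine llama-cpp"

# Possible diagnostic steps (assumption: these subcommands exist in 1.0.8):
cortex engines list                # check whether llama-cpp is listed as installed
cortex engines install llama-cpp   # (re)install the llama-cpp engine binaries
```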

Steps to Reproduce

No response

Screenshots / Logs

No response

What is your OS?

  • Windows
  • Mac Silicon
  • Mac Intel
  • Linux / Ubuntu

What engine are you running?

  • cortex.llamacpp (default)
  • cortex.tensorrt-llm (Nvidia GPUs)
  • cortex.onnx (NPUs, DirectML)

Hardware Specs eg OS version, GPU

Windows 10 - VM 108

@TC117 TC117 added the type: bug Something isn't working label Jan 21, 2025
@github-project-automation github-project-automation bot moved this to Investigating in Menlo Jan 21, 2025
@TC117 TC117 added P1: important Important feature / fix category: engine management Related to engine abstraction and removed category: engine management Related to engine abstraction labels Jan 22, 2025
Projects
Status: Investigating

1 participant