Ollama branch name generation doesn't work #5862
Hi! I believe you might be running into an issue related to our app permissions. Tauri has strong sandboxing around what the frontend can access. Could you try …
Sorry for the trouble! I'll spin up ollama and try it out this afternoon.
I've had a go at running it, and it seems to work in development mode 😬. I'm currently on holiday, and debugging differences between development and production builds does not sound super enjoyable, so I'm going to put this on my to-do list for the new year. Sorry for any inconvenience.
+1 Experiencing the same error
+1 same error :)
error + 1 |
Auto-closed when I merged the likely fix, so re-opening until it has been verified. |
I'll have a go now 👍 |
By "now" I mean, in 30 minutes time when a nightly build has finished chugging away |
I made a mistake in the previous PR: #5885
Can you try pointing GitButler at …
Argh. Let me properly debug this later today. |
Hello! Are there any updates on the issue? I've been experimenting with the dev build to figure out what's going wrong. I came across this thread, which might be helpful. According to the comments, the bug seems to be somehow related to the explicit specification of the port. I managed to work around this bug by following these steps:
```nginx
server {
    listen 80;
    server_name ollama.local;

    location / {
        proxy_pass http://127.0.0.1:11434;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```
It works now—well, about half the time. Obviously, this is not a solution, but perhaps it could help you solve the problem. However, I occasionally encounter errors like this:
It seems to me that the Ollama response isn't always correct and doesn't fully comply with the OLLAMA_CHAT_MESSAGE_FORMAT_SCHEMA, as there is no … Is there anything you can do to address this? Maybe the prompt could be adjusted to force the model to return plain text instead of JSON?

Update: I've also managed to reproduce the error "The string did not match the expected pattern". This happens because of the following call, as well as inconsistencies in the Ollama output.
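A minimal sketch of the kind of check that could surface this mismatch, assuming the schema expects a `message` object with a string `content` field. The field names here are assumptions based on Ollama's chat API shape, not taken from GitButler's actual OLLAMA_CHAT_MESSAGE_FORMAT_SCHEMA:

```python
# Hypothetical validator mirroring the shape a client might expect
# from Ollama's /api/chat response. Field names are assumptions.
def is_valid_chat_response(resp: dict) -> bool:
    """Return True only if the response carries a message with string content."""
    message = resp.get("message")
    if not isinstance(message, dict):
        return False
    return isinstance(message.get("content"), str)

# A well-formed response passes...
ok = is_valid_chat_response(
    {"message": {"role": "assistant", "content": "fix-login-bug"}}
)

# ...while a response missing message.content fails, which could
# plausibly surface as "The string did not match the expected pattern".
bad = is_valid_chat_response({"model": "llama3", "done": True})
```

If the model occasionally emits malformed or differently shaped output, a check like this would reject roughly those responses, which matches the "works about half the time" behavior described above.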
Version
Version 0.14.4 (20241213.093904)
Operating System
macOS
Distribution Method
dmg (Apple Silicon)
Describe the issue
Hitting "Generate branch name" with the Ollama backend configured (default settings – endpoint `http://127.0.0.1:11434`, model `llama3`) results in the error "Failed to generate branch name: The string did not match the expected pattern".

I do not really understand the network requests I'm seeing in devtools, but there's a request to `plugin:http|fetch_send` with a strange-looking response.

I've checked that Ollama is actually installed and running:
How to reproduce
Configure GitButler to use Ollama, and try to generate a branch name.
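For reference, a minimal chat request of the kind involved can be built as follows. The prompt text is a placeholder, not GitButler's actual branch-name prompt, and setting `stream` to false asks Ollama for a single JSON object instead of a stream of chunks:

```python
import json

# Hypothetical request body for Ollama's /api/chat endpoint; the prompt
# is a placeholder, not GitButler's real branch-naming prompt.
payload = json.dumps({
    "model": "llama3",
    "messages": [
        {
            "role": "user",
            "content": "Suggest a short git branch name for: fix login bug",
        }
    ],
    # Request one complete JSON object rather than NDJSON chunks.
    "stream": False,
})

# This payload could be POSTed to http://127.0.0.1:11434/api/chat with
# any HTTP client; the response should then contain message.content.
```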
Expected behavior
Ollama should be able to generate a branch name, as the OpenAI backend does.
Relevant log output
No response