Ollama branch name generation doesn't work #5862

Open
sersorrel opened this issue Dec 20, 2024 · 17 comments · Fixed by #5883

Labels
bug Something isn't working

Comments

@sersorrel

Version

Version 0.14.4 (20241213.093904)

Operating System

macOS

Distribution Method

dmg (Apple Silicon)

Describe the issue

Hitting "Generate branch name" with the Ollama backend configured (default settings – endpoint http://127.0.0.1:11434, model llama3) results in the error "Failed to generate branch name: The string did not match the expected pattern".

I do not really understand the network requests I'm seeing in devtools, but there's a request to plugin:http|fetch_send with a strange-looking response:

{
    "status": 403,
    "statusText": "Forbidden",
    "headers": [
        [
            "date",
            "Fri, 20 Dec 2024 11:31:26 GMT"
        ],
        [
            "content-length",
            "0"
        ]
    ],
    "url": "http://127.0.0.1:11434/api/chat",
    "rid": 3377205724
}

I've checked that Ollama is actually installed and running:

$ http POST http://127.0.0.1:11434/api/chat model=llama3
HTTP/1.1 200 OK
Content-Length: 137
Content-Type: application/json; charset=utf-8
Date: Fri, 20 Dec 2024 11:31:59 GMT

{
    "created_at": "2024-12-20T11:31:59.41741Z",
    "done": true,
    "done_reason": "load",
    "message": {
        "content": "",
        "role": "assistant"
    },
    "model": "llama3"
}

How to reproduce

Configure GitButler to use Ollama, and try to generate a branch name.

Expected behavior

Ollama should be able to generate a branch name (as the OpenAI backend does, for example).

Relevant log output

No response

@sersorrel added the bug label Dec 20, 2024
@Caleb-T-Owens
Contributor

Hi! I believe you might be running into an issue related to our app permissions. Tauri has strong sandboxing around what the frontend can access.

Could you try 0.0.0.0 or localhost? I believe we have those whitelisted.
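
For reference, the Tauri HTTP plugin only lets the frontend call URLs that appear in the app's capability scope, so the allowed endpoints live in a JSON capability file. A scope entry would look roughly like this (a sketch of the Tauri v2 format, not the actual contents of GitButler's capabilities file):

{
    "identifier": "main",
    "windows": ["main"],
    "permissions": [
        {
            "identifier": "http:default",
            "allow": [
                { "url": "http://localhost:*" },
                { "url": "http://127.0.0.1:*" }
            ]
        }
    ]
}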

@sersorrel
Author

localhost gives the same error; 0.0.0.0 produces a different error: "url not allowed on the configured scope: http://0.0.0.0:11434/api/chat"

@Caleb-T-Owens
Contributor

Sorry for the trouble! I'll spin up ollama and try it out this afternoon.

@Caleb-T-Owens
Contributor

I've had a go running it, and it seems to work in development mode 😬. I'm currently on holiday and debugging differences between development and production builds does not sound super enjoyable, so I'm going to put this onto my to-do list for the new year.

Sorry for any inconvenience.

@ivanglushko

+1 Experiencing the same error

@ikitozen

ikitozen commented Jan 1, 2025

+1 same error :)

@mietl

mietl commented Jan 2, 2025

error + 1

@mtsgrd
Contributor

mtsgrd commented Jan 2, 2025

Auto-closed when I merged the likely fix, so re-opening until it has been verified.

@mtsgrd reopened this Jan 2, 2025
@Caleb-T-Owens
Contributor

I'll have a go now 👍

@Caleb-T-Owens
Contributor

By "now" I mean, in 30 minutes time when a nightly build has finished chugging away

@Caleb-T-Owens
Contributor

[Screenshot 2025-01-02 at 15 55 07]

I'm getting these strange "string did not match the expected pattern" errors 🤔

@mtsgrd
Contributor

mtsgrd commented Jan 2, 2025

I made a mistake in the previous PR: #5885

@ivanglushko

Didn't work for me

[Screenshot 2025-01-05 at 18 30 35]

@mtsgrd
Contributor

mtsgrd commented Jan 5, 2025

Can you try pointing gitbutler at localhost or 127.0.0.1?

@ivanglushko

> Can you try pointing gitbutler at localhost or 127.0.0.1?

[Screenshot 2025-01-08 at 16 59 28]

@mtsgrd
Contributor

mtsgrd commented Jan 8, 2025

Argh. Let me properly debug this later today.

@erryox

erryox commented Jan 16, 2025

Hello! Are there any updates on the issue?

I've been experimenting with the dev build to figure out what's going wrong. I came across this thread, which might be helpful. According to the comments, the bug seems to be somehow related to the explicit specification of the port.

I managed to work around this bug by following these steps:

  1. Add 127.0.0.1 ollama.local to /etc/hosts.
  2. Configure an Nginx server like this:

     server {
         listen 80;
         server_name ollama.local;

         location / {
             proxy_pass http://127.0.0.1:11434;
             proxy_set_header Host $host;
             proxy_set_header X-Real-IP $remote_addr;
         }
     }

  3. Add http://ollama.local to crates/gitbutler-tauri/capabilities/main.json.
  4. Update the Ollama endpoint from http://127.0.0.1:11434 to http://ollama.local.
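
Once the proxy is in place, a quick sanity check (using httpie, as in the earlier test against 127.0.0.1:11434) is to hit Ollama through the new hostname and confirm it responds the same way:

$ http POST http://ollama.local/api/chat model=llama3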

It works now—well, about half the time. Obviously, this is not a solution, but perhaps it could help you solve the problem.

However, I occasionally encounter errors like this:

Invalid response: { "type": "object", "properties": { "result": "Update go.mod with cloud provider version" }, "required": [ "result" ], "additionalProperties": false }

It seems to me that the Ollama response isn’t always correct and doesn’t fully comply with the OLLAMA_CHAT_MESSAGE_FORMAT_SCHEMA, as there is no result field at the top level. The JSON returned by the LLM appears to be strange and doesn’t entirely make sense. This behavior seems expected, given that LLMs tend to hallucinate, especially smaller models like llama3.2 in my case.

Is there anything you can do to address this? Maybe the prompt could be adjusted to force the model to return plain text instead of JSON?
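
One thing that might help (assuming a recent Ollama release with structured-output support) is to pass the expected schema in the request's format field, which constrains the model to emit an object with a top-level result string instead of free-form JSON. A rough sketch of such a request (the prompt text here is only illustrative):

$ curl http://127.0.0.1:11434/api/chat -d '{
    "model": "llama3",
    "stream": false,
    "messages": [
      { "role": "user", "content": "Suggest a short git branch name for this diff" }
    ],
    "format": {
      "type": "object",
      "properties": { "result": { "type": "string" } },
      "required": ["result"]
    }
  }'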

update: I’ve also managed to reproduce the error: “The string did not match the expected pattern”. This happens because of the following call, as well as inconsistencies in the Ollama output.
