When LLM gives you "{ }"
from chevy9294@monero.town to programming@programming.dev on 07 Dec 20:08
https://monero.town/post/5194300

Today I found the weirdest bug of my life. I was making a Signal chatbot in Rust using Ollama. I finished a basic demo and tried it out. For any message, all I would get back was { }, {}, {} or { }.

Do you know how hard it is to debug something like this???

What was the problem? Not my program. An Ollama bug combined with an ollama-rs bug (a Rust library for Ollama). And both bugs are not even bugs if you don’t combine them.

Ollama released a new feature yesterday called “Structured outputs”: you can specify the format of the output using the format field in the JSON request. The format field already existed for something, but I don’t know for what. In ollama-rs you can set the format to json or leave it empty; by default it’s empty. So where is the bug?
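For context, a structured-outputs request looks roughly like this (model name and prompt are illustrative; per Ollama’s API docs the format value can be "json" or a JSON schema):

```json
{
  "model": "llama3",
  "prompt": "List three fruits",
  "format": "json"
}
```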

There is a difference between “format”: null and not specifying the format at all. Ollama-rs will set format to null if you don’t specify it, and Ollama will interpret null as a valid format. What happens? THE LLM WILL ACTUALLY GIVE YOU OUTPUT IN THE FORMAT null - { }!
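In serde terms (ollama-rs serializes its requests with serde), an Option field serializes None as null unless it’s annotated with #[serde(skip_serializing_if = "Option::is_none")]. A minimal std-only sketch of the two request bodies a client could send (model name and field values are illustrative, not the library’s actual code):

```rust
// Sketch: how an unset `format` option can end up in the request body.
// `skip_none = false` mimics the buggy serialization ("format": null);
// `skip_none = true` mimics the harmless one (key omitted entirely).
fn request_body(format: Option<&str>, skip_none: bool) -> String {
    match format {
        Some(f) => format!(r#"{{"model":"llama3","format":"{f}"}}"#),
        None if skip_none => r#"{"model":"llama3"}"#.to_string(),
        None => r#"{"model":"llama3","format":null}"#.to_string(),
    }
}

fn main() {
    // The buggy combination: the key is present and set to null,
    // and Ollama treated null as a real format to follow.
    assert_eq!(
        request_body(None, false),
        r#"{"model":"llama3","format":null}"#
    );
    // Omitting the key entirely avoids the problem.
    assert_eq!(request_body(None, true), r#"{"model":"llama3"}"#);
}
```

Both bodies look equivalent to a serializer, but mean different things to a server that distinguishes “field absent” from “field present with value null”.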

#programming


HeckGazer@programming.dev on 07 Dec 21:05

And both bugs are not even bugs if you don’t combine them.

There is a difference between “format”: null and not specifying the format at all.

Hmmmmm, that sure does sound like a bug

dgriffith@aussie.zone on 08 Dec 06:25

Ollama released a new feature yesterday

Ollama 0.5.1, yesterday: “Fixed issue where Ollama’s API would generate JSON output when specifying “format”: null”

Ollama-rs 0.2.1: released 08/09/24.

Gee I wonder why it doesn’t play nicely with the latest Ollama API which uses new/updated behaviour for an option 🤔