Rated 4 out of 5 stars

Great addon with a lot of good tools. Thank you for your work. It works fine with OpenAI, but I can't get it to work with a local Ollama LLM.

I tried http://127.0.0.1:11434 and also https://host.docker.internal:11434 after moving Ollama to Docker. I always get
"Ollama API request failed: TypeError: NetworkError when attempting to fetch resource." when I try to fetch the model list. Anyone with an idea?

Have you set up CORS as described here?
https://micz.it/thunderbird-addon-thunderai/ollama-cors-information/
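As a minimal sketch of what that guide covers: Ollama restricts cross-origin requests via the OLLAMA_ORIGINS environment variable, and Thunderbird add-ons call it from a moz-extension:// origin, so that origin must be allowed. Assuming you run Ollama in Docker, something like this (container name and origin pattern are illustrative; check the linked guide for the exact setup):

```shell
# Run Ollama in Docker with CORS opened for Thunderbird add-ons.
# OLLAMA_ORIGINS lists the origins Ollama will accept requests from;
# "moz-extension://*" covers Thunderbird/Firefox extension origins.
docker run -d \
  --name ollama \
  -e OLLAMA_ORIGINS="moz-extension://*" \
  -p 11434:11434 \
  ollama/ollama
```

If Ollama runs natively instead, exporting OLLAMA_ORIGINS in the environment before starting `ollama serve` has the same effect.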

Please open an issue so we can interact more effectively: https://github.com/micz/ThunderAI/issues