Running LLMs with Ollama on SFOS

Here is an interface to Ollama I've been working on: https://codeberg.org/glitchapp/chatHud/releases You need the LÖVE runtime to run it.
As explained in that release, the touch keyboard is not yet functional, so you need a keyboard connected to the phone. By default it sends queries to the local server.
On the desktop the program works as expected; on the phone it does not, perhaps because something prevents the query from reaching the Ollama server.
I'll leave it here in case anyone wants to play with it and test it.
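If you want to rule out basic connectivity first, here is a minimal sketch (mine, not part of chatHUD) that just pings the server's root endpoint from the phone; the URL is an assumption, adjust it to your setup.

```python
# Quick reachability check (hypothetical helper, not part of chatHUD).
# Run this on the phone to see whether the Ollama server answers at all.
# Standard library only; 11434 is Ollama's default port.
from urllib.request import urlopen
from urllib.error import URLError

OLLAMA_URL = "http://127.0.0.1:11434"  # assumption: or http://<desktop-ip>:11434

try:
    with urlopen(OLLAMA_URL, timeout=5) as resp:
        # Ollama's root endpoint replies with "Ollama is running".
        print(resp.read().decode())
except URLError as err:
    print("Could not reach the server:", err)
```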

I’ve released a new version of my interface to Ollama, called “chatHUD”.

I’m sharing this version here for feedback and testing purposes. chatHUD has internet access and connects to noobhub, a chat server. However, it only works with Ollama on desktop devices. I suspect the issue is that Ollama listens only on localhost by default and needs to be told to listen on all network interfaces (starting the server with the OLLAMA_HOST=0.0.0.0 environment variable set should do that).
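To test that theory from the desktop itself, something like the following rough sketch can help: if the loopback address accepts connections but the machine's LAN address does not, the server is bound to 127.0.0.1 only. The LAN IP here is an assumption; substitute your own.

```python
# Rough diagnostic sketch: checks whether Ollama answers on loopback only
# or also on the LAN address.
import socket

LAN_IP = "192.168.1.10"   # assumption: substitute your desktop's LAN address
PORT = 11434              # Ollama's default port

for host in ("127.0.0.1", LAN_IP):
    try:
        with socket.create_connection((host, PORT), timeout=3):
            print(f"{host}:{PORT} accepts connections")
    except OSError as err:
        print(f"{host}:{PORT} unreachable: {err}")
```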

To set up the Ollama server connection manually:

  1. Type "set api url to" followed by your Ollama API URL (e.g., http://[yourOllamaIp]/api/generate).

  2. Type "ollama" or "talk to ollama" followed by your message.
     The Ollama response should then be received (see the sketch after this list for what that request looks like on the wire).
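For reference, this is roughly the request that ends up going to Ollama's /api/generate endpoint, shown as a standalone Python sketch rather than the actual chatHUD code; the model name is an assumption, swap in whatever you have pulled.

```python
# Sketch of the underlying /api/generate call (model name is an assumption).
import json
from urllib.request import Request, urlopen

API_URL = "http://127.0.0.1:11434/api/generate"  # your "set api url to" value

payload = json.dumps({
    "model": "llama3",      # assumption: use a model you have pulled
    "prompt": "Hello from chatHUD",
    "stream": False,        # ask for a single JSON reply instead of a stream
}).encode()

req = Request(API_URL, data=payload, headers={"Content-Type": "application/json"})
with urlopen(req, timeout=60) as resp:
    # With stream=False the reply is one JSON object; the text is in "response".
    print(json.load(resp)["response"])
```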

chatHUD also features auto-completion: type the first few letters of a command (e.g., "oll") and press Tab to complete it.
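The completion logic amounts to simple prefix matching; a rough sketch of the idea (not the actual chatHUD code):

```python
# Rough sketch of prefix-based tab completion (not the actual chatHUD code).
COMMANDS = ["ollama", "talk to ollama", "set api url to"]

def complete(prefix):
    """Return the first known command starting with prefix, or None."""
    matches = [cmd for cmd in COMMANDS if cmd.startswith(prefix)]
    return matches[0] if matches else None

print(complete("oll"))  # -> "ollama"
```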