Because it can do sweet FA with SailfishOS? Or Android or Apple for that matter.
Firstly, their AI is OpenAI's. Even if you were willing to hand over all your personal data, it wouldn't know what to do with it. OpenAI isn't at all open, so you'd be better off running Llama on your own server. You'd then have to train it to access your phone remotely.
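If you do fancy the self-hosted route, the inference side is the easy bit. Here's a rough sketch using the llama-cpp-python bindings; the GGUF file path and the prompt are placeholders for whatever model you've downloaded yourself, not anything Sailfish ships:

```python
# Minimal local Llama inference sketch (llama-cpp-python).
# The model file is an assumption: any quantised GGUF you have on disk.
from llama_cpp import Llama

llm = Llama(model_path="./llama-3-8b-instruct.Q4_K_M.gguf", n_ctx=2048)

out = llm(
    "Summarise today's calendar entries:",  # hypothetical prompt; the model
    max_tokens=128,                          # knows nothing about your phone
)                                            # unless you feed it that data
print(out["choices"][0]["text"])
```

Note that this only answers prompts; it still can't touch your phone.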
Now, LLMs aren’t people; they’re neural networks trained on text. Training such a thing is easier said than done.
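To give a flavour of what "training" actually means here, this is roughly the shape of a LoRA fine-tune with Hugging Face transformers + peft. The model name and data file are stand-ins I've made up; the hard part is collecting useful training text in the first place, not the script:

```python
# Rough LoRA fine-tuning sketch, assuming a small open causal LM and a plain
# text file of examples. Names below are placeholders, not recommendations.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          Trainer, TrainingArguments)
from peft import LoraConfig, get_peft_model

model_name = "meta-llama/Llama-2-7b-hf"  # assumption: any small causal LM
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# LoRA keeps the fine-tune cheap: only small adapter matrices get trained.
model = get_peft_model(model, LoraConfig(r=8, lora_alpha=16,
                                         task_type="CAUSAL_LM"))

# Hypothetical training data: one example per line.
dataset = load_dataset("text",
                       data_files={"train": "my_phone_commands.txt"})["train"]

def tokenize(batch):
    out = tokenizer(batch["text"], truncation=True, max_length=512)
    out["labels"] = out["input_ids"].copy()  # causal LM: predict its own input
    return out

dataset = dataset.map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out",
                           per_device_train_batch_size=1,
                           num_train_epochs=1),
    train_dataset=dataset,
)
trainer.train()
```

Even then, all you've taught it is to imitate your text, not to operate a phone.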
Ideally, you’d put the thing on custom low-power hardware with an Nvidia laptop GPU. This will of course cost you around $200-300.
It’s doable but not by me and probably not by you either.
