Local LLMs build tools that do exactly what the user wants, the way they want it, which is the best UX.
this becomes AI literacy
LLMs already nicely bridge the gap from "I want this" to "here's a local page that does it".
Examples of tools I have built that require almost no tech knowledge:
* push a button on my phone to take a screenshot on my Mac (when I watch videos)
* help me exercise, gamify it for me
* "help me track how time spent online impacts what I do in real life": a tool that rewards me with points and nudges me towards things that make me DO things online
* I want to improve my writing: give me exercises and build additional tools (leading to an "append-only" digital keyboard I use to exercise)
Local AI can already create these tools, and no external company is ever going to beat me, the user: instead of getting features I don't want, or that almost do what I want, or that do something that advantages the company, these tools just do what I want.
Repositories of tools-as-ideas created by others are quite often just an index.html and... that's all? Manage data in localStorage, end of it?
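The whole "backend" of such a single-file tool can be a load/update/save cycle on localStorage. A minimal sketch (the points-tracker idea is hypothetical, and the in-memory fallback is only there so the snippet also runs outside a browser):

```javascript
// Use localStorage when available (browser), else a tiny in-memory stand-in.
const store = typeof localStorage !== "undefined"
  ? localStorage
  : (() => {
      const m = new Map();
      return {
        getItem: (k) => (m.has(k) ? m.get(k) : null),
        setItem: (k, v) => m.set(k, String(v)),
      };
    })();

// addPoints is a made-up example function: load state, update it, save it back.
function addPoints(activity, points) {
  const data = JSON.parse(store.getItem("points") || "{}");
  data[activity] = (data[activity] || 0) + points;
  store.setItem("points", JSON.stringify(data));
  return data[activity];
}

addPoints("writing-exercise", 5);
console.log(addPoints("writing-exercise", 3)); // 8
```

That's the whole data layer: no server, no accounts, no build step, and the data never leaves the device.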
Online inference is still needed for processing large data (audio/video/images). For now? We don't know; history suggests we'll have the capability to do that locally "soon". Or maybe not :)
The main issue is "online for collaboration". Not the same user across different devices; that is easy. MeteorJS-style approaches (keeping local copies of parts of databases and reconciling them with a remote/origin) seem to be an interesting possibility at small scale, since once you have the right primitives in place you can go horizontal everywhere.
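One such primitive, sketched very loosely (this is not Meteor's actual protocol, just the reconcile-with-origin idea under a last-write-wins assumption, with made-up record names): tag each record with a timestamp, and when merging a local copy with the remote, keep the newest write per key.

```javascript
// Last-write-wins merge of two key -> {value, ts} maps.
// Assumes comparable timestamps; real sync engines need more care (clocks,
// deletions, conflicts), but the shape of the primitive is this simple.
function reconcile(local, remote) {
  const merged = { ...remote };
  for (const [key, rec] of Object.entries(local)) {
    // Keep the local record only if it's new or newer than the remote one.
    if (!(key in merged) || rec.ts > merged[key].ts) merged[key] = rec;
  }
  return merged;
}

const local  = { a: { value: "edited offline", ts: 2 }, b: { value: "new", ts: 3 } };
const remote = { a: { value: "stale", ts: 1 }, c: { value: "theirs", ts: 1 } };
const merged = reconcile(local, remote);
// merged.a is the offline edit, merged.b survives, merged.c comes from remote
```

Run the same function on both ends and the copies converge; that's the "go horizontal" part.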
I can’t wait to run my models locally. The sooner I can do my shit without some American mega corp gulping down all my data, the better.