A local model that can do better than Siri or Alexa as a personal or home assistant is, in my eyes, very useful. Being able to run on a phone or watch or glasses translates, for me, into low-powered AI, and not necessarily into wanting my phone, watch, or glasses to run things for me.

My Siri use has narrowed down to just setting timers. And even then, it still makes my phone call people in the middle of the night. Siri is pretty dumb and does not do what I want it to. I’d rather be able to customize an assistant to myself.

I am also thinking of automation in my day to day workflow for work.

reply
OK... but what would you have all this "automation" actually do? What is Siri failing to do that you want it to do? How would customizing an assistant (for whatever definition) help?
reply
Throwing a few things out: HN has changed over the years, but people make stuff for the sake of making stuff. There don't need to be product use cases. The tone of the comment goes against the spirit of HN, which is likely the reason for the downvotes.

That aside: a very small model that takes text and outputs structured JSON according to a spec is nice. It lets you turn natural language into a user action. For example, command palettes could benefit from this.
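A minimal sketch of the idea, with hypothetical action names: the small model is prompted to emit JSON matching a fixed spec, and the app validates the output before dispatching anything. The `ACTIONS` table and example output are illustrative, not from any real model.

```python
import json

# Hypothetical command-palette spec: each action maps arg names to types.
ACTIONS = {
    "open_file": {"path": str},
    "set_theme": {"name": str},
}

def parse_action(model_output: str):
    """Validate the model's raw text against the action spec; None if invalid."""
    try:
        data = json.loads(model_output)
    except json.JSONDecodeError:
        return None
    spec = ACTIONS.get(data.get("action"))
    args = data.get("args", {})
    # Reject unknown actions, missing/extra args, and wrong types.
    if spec is None or set(args) != set(spec):
        return None
    if not all(isinstance(args[k], t) for k, t in spec.items()):
        return None
    return data["action"], args

# e.g. the model turned "switch to dark mode" into:
raw = '{"action": "set_theme", "args": {"name": "dark"}}'
print(parse_action(raw))  # ('set_theme', {'name': 'dark'})
```

The point of the validation step is that the model stays untrusted: a malformed or hallucinated action falls through to `None` instead of reaching the dispatcher.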

If you can do a tiny bit of planning (todo) and chain actions, it seems reasonable that you could traverse a rich state space to achieve some goal on behalf of a user.
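Chaining those validated actions might look something like this. Everything here is a stand-in: `model` fakes a planner that would really be a small local LLM, and the state/actions are invented for illustration.

```python
# Hypothetical sketch of chaining structured actions toward a goal.
def model(state):
    # Stand-in planner: emit one action per step until the goal is met.
    if "draft" not in state:
        return {"action": "create_draft"}
    if not state["draft"]["sent"]:
        return {"action": "send"}
    return {"action": "done"}

def run(state, max_steps=5):
    for _ in range(max_steps):  # bound the loop so a confused model can't spin forever
        step = model(state)
        if step["action"] == "done":
            return state
        if step["action"] == "create_draft":
            state["draft"] = {"sent": False}
        elif step["action"] == "send":
            state["draft"]["sent"] = True
    return state

print(run({}))  # {'draft': {'sent': True}}
```

The step bound and the whitelist of handled actions are doing the same job as the JSON validation above: the model proposes, the harness disposes.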

Games could use something like it for free-form dialog while still enforcing predefined narrative graphs, etc.

I'm sure you could come up with more. It's a fuzzy function.

reply
> people make stuff to make stuff. There don't need to be product use cases.

OK. Great! So it doesn't need to be a commercial product. But does it do something (anything?) interesting? I'm interested in your games example; I'd love to see it done in real life. IIUC, game AIs are actually much more constrained and predictable for playability reasons. If you let them go all free-form, a plurality of players have a "WTF??!?" experience, which is super Not Good.

reply
It doesn't have to do anything interesting; it's completely fascinating all on its own. If you understand anything about the math and science behind LLMs, you'll understand that this is an achievement worthy of sharing with a community like HN.

That being said, small models like these have plenty of use cases. They allow for extra "slack" to be introduced into a programmatic workflow in a compute constrained environment. Something like this could help enable the "ever present" phone assistant, without scraping all your personal data and sending it off to Google/OpenAI/etc. Imagine if keywords in a chat would then trigger searches on your local data to bring up relevant notes/emails/documents into a cache, and then this cache directly powers your autocomplete (or just a sidebar that pops up with the most relevant information). Having flexible function calling in that loop is key for fault tolerance and adaptability to new content and contexts.
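A toy sketch of that keyword-triggered cache, with everything stubbed: the keyword extractor stands in for a small local model, and the documents and paths are made up. Nothing leaves the device; the "search" is just a scan over local text.

```python
# Hypothetical local corpus a real assistant would index on-device.
DOCS = {
    "notes/trip.md": "flight to Lisbon, hotel booking confirmation",
    "mail/boss.eml": "quarterly report deadline moved to Friday",
}

def extract_keywords(message):
    # Stand-in for a tiny model picking salient terms; here, just long words.
    return {w.strip(".,?!").lower() for w in message.split() if len(w) > 5}

def refresh_cache(message, cache):
    """Pull documents matching the chat's keywords into the sidebar cache."""
    kws = extract_keywords(message)
    for path, text in DOCS.items():
        if any(k in text.lower() for k in kws):
            cache[path] = text
    return cache

cache = refresh_cache("When is the quarterly deadline again?", {})
print(sorted(cache))  # ['mail/boss.eml']
```

A real version would swap the keyword stub for the model and the dict scan for a proper local index, but the shape of the loop (chat → keywords → local search → cache → UI) is the same.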

It's cool. Enjoy it.

reply
> Something like this could help enable the "ever present" phone assistant, without scraping all your personal data and sending it off to Google/OpenAI/etc

OK so show me what that's for. Show me something useful you can do with that ability.

> Imagine if keywords in a chat would then trigger searches on your local data to bring up relevant notes/emails/documents into a cache, and then this cache directly powers your autocomplete (or just a sidebar that pops up with the most relevant information).

I'm really trying, but... idgi? I truly cannot imagine how this would improve my life in any way...

> It's cool. Enjoy it.

No. It sounds like a useless complication on my watch. I don't fucking care if it can tell me the phase of the moon. I can look up at the sky and see the moon and know what phase it is.

EDIT: You say:

> If you understand anything about the math and science behind LLMs, you'll understand that this is an achievement worthy of sharing to a community like HN.

OK. So educate me. Tell me what I'm missing.

reply
deleted
reply
Think of "phone use", for instance: what Siri is supposed to be.
reply
I mean... Siri basically works? When I'm driving I say "Hey Siri, find me a gas station along my route", and it does. Or I say "Hey Siri, call Joe Bob mobile" and it does. Or I say "Hey Siri, play me a podcast". This is kind of a solved problem already? When I'm driving this is literally as complicated of a distraction as I want; I'm not going to be dictating emails or texts. When I'm not driving, the touchscreen keyboard (as shitty an interface as that is) is 100x better than voiced natural language commands.
reply
It does just barely work now after they spent billions, and they may still fall back to cloud LLMs for a significant number of things. This is a way that everyone can get that on the actual Apple Watch or local phone for any application they build.
reply
I get that, but I still can't imagine what it might be for. TBH I don't have a smart watch, because I can't think of anything I'd want one for--my mechanical watch keeps time to within a few seconds per month and the lume lasts all night. I don't know what making it "smarter" would do for me, it does an A+ job of being a watch. What are the things that "everyone" can build with this that actually matter? Like, what is the differentiator?

EDIT: To be clear, the monoculture of phone operating systems sucks. If this somehow enables more entrants into that space then I'm all for it. However, I don't see this in particular being the deciding factor... For example, the reason I don't run a 3rd party operating system on my phone isn't because it's lacking Siri or "OK Google" (if these things went away tomorrow I'd barely notice), it's because it would be a pain in the ass to make it be a phone.

reply