Private AI assistants will be a big thing. You don't want to send all the private data they have access to off to a cloud AI API provider. You shouldn't, anyway.
If you’re working on something sensitive, you may not want to share it with OpenAI or Anthropic.

You can run open-source models like Kimi K or Qwen locally. Apple's recent Xcode 26.3 update adds support for local models.
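For what it's worth, here's a minimal sketch of querying a locally hosted model through Ollama's HTTP API (the endpoint is Ollama's default; the model name `qwen2.5` is just an example — use whatever you've pulled):

```python
import json
import urllib.request

# Ollama's default local endpoint (nothing leaves your machine).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> bytes:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def ask(model: str, prompt: str) -> str:
    """Send a prompt to the local model and return its text response."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Requires a running Ollama server and a pulled model, e.g. `ollama pull qwen2.5`
    print(ask("qwen2.5", "Summarize this file without leaving my machine."))
```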

Local LLMs. Lots of people buy Macs for their unified memory, which obviates the need to buy a much more expensive GPU to get the same amount of VRAM.
marketing.
Image Playground