Apparently the Overcast guy built a Beowulf cluster of Mac minis to use Apple's transcription service.

https://www.linkedin.com/posts/nathangathright_marco-arment-...

reply
The ATP episode where he talked about this was incredibly fascinating. Marco is such a role model to me - he has a complete immunity to fads and trends, and just does things the way he wants to. He adopts the 'new and cool' things only when they have a real benefit.
reply
I largely subscribe to the "use boring tech" ethos... but PHP? Come on, man.

and yet... successful people have used it to build really successful things: Facebook, Tumblr (I think), the things Marco's been involved with.

I just dunno. Outside of Meta, should we really be pushing PHP with all its flaws? Or is it no longer so flawed, and I need to update my priors?

reply
There’s something major to be said for going to war with the tools you have.

And living with decisions made 15 years ago may be much more successful than trying to change horses mid-stream.

reply
Yeah, heard him talk about that. 48 or so 16 GB M4 Mac minis. Insane. The Beowulf lives.
reply
For small tasks this seems perfect. However, it being limited to English, from what I can tell, is quite a downside for me.
reply
It can work in other languages?

  % apfel --model-info
  apfel v0.6.25 — model info
  ├ model:      apple-foundationmodel
  ├ on-device:  true (always)
  ├ available:  yes
  ├ context:    4096 tokens
  ├ languages:  zh, en, nl, zh, es, es, ja, en, pt, da, fr, it, nb, vi, tr, en, de, fr, es, pt, ko, sv, zh
  └ framework:  FoundationModels (macOS 26+)
Just use the language you want when prompting it, like other LLMs?

  % apfel "Gib mir ein Rezept für Currywurst."
  Natürlich! Hier ist ein einfaches Rezept für Currywurst:
  
  ### Zutaten:
  - **Für die Würste:**
    - 500 g Bratwürste (z. B. Frankfurter Würste)
(note: the prompt asks, in German, for a Currywurst recipe; I clipped most of the reply, since I assume most of us here don't actually need an LLM-generated recipe)
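For anyone who'd rather skip the CLI wrapper: the same on-device model is reachable directly from Swift. A minimal sketch, assuming the `LanguageModelSession` API shape Apple announced with FoundationModels for macOS 26 — not tested, and it only runs on hardware/OS versions where the framework is available:

```swift
import FoundationModels

// Entry point to the on-device Apple foundation model (macOS 26+).
let session = LanguageModelSession()

// Prompt in whichever supported language you like; the model replies in kind.
// German: "Give me a recipe for Currywurst."
let response = try await session.respond(to: "Gib mir ein Rezept für Currywurst.")
print(response.content)
```

Since inference is entirely on-device, nothing leaves the machine — which is presumably why a rack of Mac minis works as a "cluster" at all.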
reply
The vision models and OCR are SUPER
reply