People would say, "Why use this when it's harder to read and only saves N ms?" He'd reply that you'd care about those ms when you had to read a database from 500 remote servers (I'm paraphrasing. He probably had a much better example.)
Turns out, he wrote a book that I later purchased. It appears to have been taken over by a different author, but the first release was all him and I bought it immediately when I recognized the name / unix.com handle. Though it was over my head when I first bought it, I later learned enough to love it. I hope he's on HN and knows that someone loved his posts / book.
https://www.amazon.com/Pro-Bash-Programming-Scripting-Expert...
Also, performance improvements on heavily used systems unlock:
Cost savings
Stability
Higher reliability
Higher throughput
Fewer incidents
Lower scale-out requirements
For example, doing dangerous things might be faster (no bounds checks, weaker consistency guarantees, etc.), but that clearly tends to be a reliability regression.
And how does performance improve reliability? Well, a more performant service is harder to overwhelm with a flood of requests.
Of course this is a very artificial and almost nonsensical example, but that is how you optimize bounds checks away - you just make it impossible for the bounds to be exceeded through means other than explicitly checking.
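A minimal sketch of that idea (my own illustration, not from the thread): if a table's size is a power of two, masking the index guarantees it is always in range, so the out-of-bounds case becomes impossible by construction and the compiler can elide the runtime check entirely.

```rust
// Hypothetical example: a 256-entry lookup table. Because 256 is a
// power of two, `idx & (SIZE - 1)` is always < SIZE, so indexing can
// never go out of bounds and no panic path is needed.
const SIZE: usize = 256;

fn lookup(table: &[u8; SIZE], idx: usize) -> u8 {
    // The mask makes the bound unexceedable without an explicit check.
    table[idx & (SIZE - 1)]
}

fn main() {
    let mut table = [0u8; SIZE];
    for (i, slot) in table.iter_mut().enumerate() {
        *slot = i as u8; // table[i] == i mod 256
    }
    // Even a huge index wraps safely into range: 1_000_003 mod 256 == 67.
    assert_eq!(lookup(&table, 1_000_003), 67);
    println!("ok");
}
```

The trade-off is that a wild index silently wraps instead of failing loudly, which is exactly the kind of "dangerous but fast" choice discussed above.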
That's crazy to think about. My JSON files can be measured in bytes. :-D
https://developer.nvidia.com/blog/accelerating-json-processi...
So why not compare that case directly? We'd also want to see the performance of the assumed overheads, i.e., how it scales.
Either way, I really doubt there will ever be a significant number of people who'd choose jq for that.