Fifteen years ago, if an application started spinning or mail stopped coming in, you could open up Console.app with reasonable confidence that the app in question would have logged an easy-to-spot error diagnostic. This was how the plague of mysterious DNS resolution issues got tied to the half-baked discoveryd so quickly.
Now, those 600 processes and 2000 threads are blasting thousands of log entries per second, with dozens of errors happening in unrecognizable daemons doing thrice-delegated work.
It seems like a perfect example of Jevons paradox (or Andy and Bill's law): unified logging makes logging rich and essentially free, so everyone throws it in everywhere, willy-nilly. It's so noisy in there that I'm not sure who the logs are for anymore: they're useless to the user of the computer, and even as a developer it seems impossible to debug things by passively watching the logs unless you already know the precise filter predicate.
In fact, they must realize it's hopeless, because the new Console doesn't even give you a mechanism to read past logs (I have to download eclecticlight's Ulbow for that).
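To be fair, the data is sitting there in the archive if you already know the incantation, which is sort of the point. Something along these lines works with the built-in log command (the subsystem here is just an example; substitute whatever you're chasing):

    # errors from the last hour, narrowed to one subsystem
    log show --last 1h --predicate 'messageType == error AND subsystem == "com.apple.sharing"'

    # or watch the firehose live, roughly what Console.app is doing
    log stream --predicate 'messageType == error'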
This is the kind of thing that makes me want to grab Craig Federighi by the scruff and rub his nose in it. Every event scrolling by here is something an engineer thought was a bad enough scenario to log at Error level. There should be zero of these on a standard customer install. How many of these are legitimate bugs? Do they even know? (Hahaha, of course they don't.)
Something about the invisibility of background daemons makes them like flypaper for really stupid, face-palm level bugs. Because approximately zero customers look at the console errors and the crash files, they’re just sort of invisible and tolerated. Nobody seems to give a damn at Apple any more.
They could start by attacking those common errors first, so that a typical Mac system has no regular errors or faults showing up. Then, once you've gotten rid of all that noise, you could start looking at the errors that show up only on weirdly configured end-user systems.
But as long as every system produces tens of thousands of errors and faults every day, it's clear that nobody cares about fixing any of that.
grumble
Spotlight, aside from failing to find applications, also pollutes the search results with random files it found on the filesystem, some shortcuts to search the web, and whatnot. Also, when I first started using a Mac, it repeatedly got into a state of not displaying any results whatsoever. Fixing that each time required running some arcane commands in the terminal: something people associate with Linux, but ironically I think Linux now requires less of that than the Mac does.
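(For anyone who hits the same thing, the "arcane commands" are usually some variation on mdutil to nuke and rebuild the index; roughly:)

    # check whether indexing is enabled on the boot volume
    mdutil -s /

    # erase the index and force Spotlight to rebuild it
    sudo mdutil -E /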
But in Tahoe they removed the Applications view, so my solution is gone now.
All in all, with Apple destroying macOS a little more with each release, crippling DTrace with SIP, Liquid Glass, performance monitoring that's poor compared to what I get from tools like perf on Linux or Intel VTune on Windows, and Metal slowly becoming the only GPU programming option, I think I'm going to switch back to Linux.
> I quickly found out that Apple Instruments doesn’t support fetching more than 10 counters, sometimes 8, and sometimes less. I was constantly getting errors like '<SOME_COUNTER>' conflicts with a previously added event. The maximum that I could get is 10 counters. So, the first takeaway was that there is a limit to how many counters I can fetch, and another is that counters are, in some way, incompatible with each other. Why and how they’re incompatible is a good question.
Also: https://hmijailblog.blogspot.com/2015/09/using-intels-perfor...
Your second example: is the complaint that Instruments doesn't have flamegraph visualization? That was true a decade ago when it was written, and it is not true today. Or is it that Instruments' trace file format isn't documented?
Why do I like Instruments and think it is better? Because the people who designed it optimized it for solving real performance problems. There are a bunch of "templates" focused on questions like "why is my thing so slow, what is it doing", "why am I using too much memory", and "what network traffic is coming out of this app". These are real, specific problems, while perf will tell you things like "oh, this instruction has a 12% cache miss rate because it got scheduled off the core 2ms ago". Which is something Instruments can also tell you, but the point is that this is totally the wrong interface to present for performance work, since just handing people data is barely useful.
What people do instead with perf is keep around like 17 scripts, 12 of which were written by Brendan Gregg, to load the info into something that can be half useful to them. This is to save you time if you don't know how the Linux kernel works. Part of the reason flamegraphs and Perfetto are so popular is that everyone is so desperate to pull the info out and get something, anything, that's not the perf UI, that they settle for what they can get. Instruments has exceptionally good UI for its tools, clearly designed by people who solve real performance problems; perf is a raw data dump from the kernel with some lipstick on it.
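For the unfamiliar, the usual dance is roughly this (stackcollapse-perf.pl and flamegraph.pl come from Brendan Gregg's FlameGraph repo, not from perf itself, and ./myapp is a placeholder):

    # sample ./myapp at 99 Hz with call stacks
    perf record -F 99 -g -- ./myapp

    # fold the raw samples into an interactive SVG flame graph
    perf script | ./stackcollapse-perf.pl | ./flamegraph.pl > flame.svg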
Mind you, I trust the data that perf is dumping because the tool is rock-solid. Instruments is not like that: it's buggy, sometimes undocumented (to be fair, perf's documentation is not great either, but at least it is open source), slow, and it crashes a lot. This majorly sucks. But even with all that, I solve a lot more problems clicking around the Instruments UI and cursing at it than I do with perf. And while they are slow to fix things, they are directionally moving towards cleaning up bugs and allowing data export, so the problems you brought up (which are very valid) are either solved or on their way to being solved.
The implication that perf is not buggy is frankly laughable. Perhaps one major difference is that perf assumes you know how the OS works and what the various syscalls are doing.
You just proved again that it's not optimized for reality, because that knowledge can't be assumed: the pool of people trying to solve real performance problems is much wider than the pool of people with that knowledge.
Only a system reinstall plus manually deleting all the index files fixed it. Meanwhile it was eating 20-30 GB of disk space. There are tons of reports of this on the Apple forums.
Even then, it feels a lot slower in macOS 26 than it did before, and you often get the rug-pull effect of your results changing a millisecond before you press the Enter key. I would pay good money to go back to Snow Leopard.
That being said, macOS was definitely snappier back on Catalina, which was the first version I had, so I can't vouch for Snow Leopard. Each update after Catalina felt gradually worse, and from what I heard, Tahoe feels like the last nail in the coffin.
I hope the UX team will deliver a more polished, expressive and minimal design next time.
It is completely useless on network mounts, however, where I resort to find/grep/rg.
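i.e., something along these lines (the paths and patterns are just examples):

    # filename search on a mount Spotlight won't index
    find /Volumes/share -iname '*report*'

    # content search with ripgrep
    rg -i 'quarterly totals' /Volumes/share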
Firstly, performance issues, like wtf is going on with search. Then there seems to be a need to constantly futz with the UXes of stable, established apps every annual OS update, for the sake of change: moving buttons, adding clicks to workflows, etc.
My most recent enraging find was the date picker in the Reminders app. When editing a reminder, there is an up/down arrow interface to the side of the date, but if you click them, they change the MONTH. Who decided that makes any sense? In what world is bumping a reminder by a month the most common change? It's actually worse than useless; it's actively net negative.
I just got my first ARM Mac to replace my work Win machine (what has MS done to Windows!?!? :'()
Used to be I could type "display" and I'd get right to the display settings in Settings. Now it shows thousands of useless links to who knows what. Instead I have to type "settings" and then, within Settings, type "display".
Still better than the Windows shit show.
Honestly, a well-set-up Linux machine has a better user experience than anything else on the market today.
We probably have to preface that with "for older people". IMO Linux has changed less, UX-wise, than either Windows or macOS in recent years.
For several decades, I have used hundreds of different computers, from IBM mainframes, DEC minicomputers, and early PCs with the Intel 8080 or Motorola MC6800, up to the latest machines with AMD Zen 5 or Intel Arrow Lake. I have used a variety of operating systems and user interfaces.
During the first decades, there was a continuous and obvious improvement in user interfaces, so I never had any hesitation about switching to a new program with a completely different user interface for the same application, even every year or every few months, whenever such a change resulted in better results and productivity.
Nevertheless, an optimum seems to have been reached around 20 years ago, and since then, more often than not, I see only worse interfaces that make it harder to do what was simpler previously, so there is no incentive to "upgrade".
Therefore I do indeed customize my GUIs on Linux into a mode that resembles older Windows or macOS much more than their recent versions, and which prioritizes instant responses and minimum distractions over the coolest look.
On the rare occasions when I find a program that does something in a better way than what I am using, I still switch to it immediately, no matter how different it may be from what I am familiar with, so conservatism has nothing to do with preferring the older GUIs.
This is a consequence of having "UI designers" paid on salary instead of on individual contracts that expire when the specific fix is complete. In order to preserve their continuing salary, the UI designers have to keep making changes for change's sake (so that the accounting dept. does not begin asking: "why are we paying salary for all these UI designers if they are not creating any output?"). Combine reaching an optimum 20 years ago with the fact that the UI designers must make changes for the sake of change, and the result is changes that are sub-optimal.
“I've come up with a set of rules that describe our reactions to technologies:
1. Anything that is in the world when you’re born is normal and ordinary and is just a natural part of the way the world works.
2. Anything that's invented between when you’re fifteen and thirty-five is new and exciting and revolutionary and you can probably get a career in it.
3. Anything invented after you're thirty-five is against the natural order of things.”
Douglas Adams
I just installed Plasma with EndeavourOS and use it. I used Cinnamon before that. They don't require much effort.
And yet on Windows 11, hit the Win key, type "display", and it immediately shows display settings as the first result.
People are really unable to differentiate between "I am having issues" and "things are universally or even widely broken".
I've been using Spotlight since it was introduced, for... everything. In Tahoe it has been absolutely terrible. Unusable. Always indexing. Never showing me applications, which is the main thing I use it for (yes, it is configured to show applications!). They broke something.