If you do want to use an average, you'd at least need to trim 10% off both the top and the bottom before calculating it, but even then it's still going to be super untrustworthy.
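Roughly what I mean, as a quick Python sketch (the cut fraction and the ten-person income list are just made-up numbers for illustration, not anything rigorous):

```python
import numpy as np

def trimmed_mean(values, cut=0.10):
    """Mean after dropping the lowest and highest `cut` fraction of observations."""
    v = np.sort(np.asarray(values, dtype=float))
    k = int(len(v) * cut)                 # how many observations to drop from each end
    return v[k:len(v) - k].mean()

incomes = [500, 30_000, 45_000, 52_000, 60_000,
           75_000, 80_000, 95_000, 120_000, 5_000_000]
print(np.mean(incomes))        # 555750.0 -- dragged way up by the one huge income
print(trimmed_mean(incomes))   # 69625.0  -- drops the $500 and $5M outliers first
```

(scipy.stats.trim_mean does the same thing if you'd rather not roll your own.)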
Not sure what to take away from your comment; I'm still unsure what kind of metric you're pitching and why it'd be a valuable thing to track.
It's focused on the very poorest, who are not the mode. (Income distribution is approximately lognormal; see https://www.researchgate.net/figure/The-lognormal-distributi...).
Say you have 10 people: one making $800/year, eight making $80k/year, and one evil billionaire making $800 million. Their times to earn $1 are respectively 10 hours, 0.1 hours, and essentially zero. If you take the arithmetic mean of those you get 1.08 hours, and it's dominated by the single poor person. If you double that person's income to $1,600, they're at 5 hours to earn $1, and the overall average nearly halves to 0.58. Meanwhile you can cut every middle-class person's income to $40k and not much changes: the average time to earn $1 would be (5 + 8(0.2) + 0)/10 = 0.66.
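If it helps, here's the same arithmetic as a quick Python sketch (assuming roughly 8,000 hours in a year, which is what makes $800/year come out to about 10 hours per dollar):

```python
HOURS_PER_YEAR = 8_000  # rough assumption: $800/year -> 10 hours to earn $1

def mean_hours_per_dollar(incomes):
    return sum(HOURS_PER_YEAR / i for i in incomes) / len(incomes)

base     = [800]   + [80_000] * 8 + [800_000_000]
doubled  = [1_600] + [80_000] * 8 + [800_000_000]   # poorest person's income doubled
squeezed = [1_600] + [40_000] * 8 + [800_000_000]   # middle class cut to $40k

print(round(mean_hours_per_dollar(base), 2))      # ~1.08
print(round(mean_hours_per_dollar(doubled), 2))   # ~0.58
print(round(mean_hours_per_dollar(squeezed), 2))  # ~0.66
```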
It captures the income distribution much better than average income.
Not really, and certainly not better than median income, which is what people typically use. It tries to measure exactly how little income the very poor make, which is not normally what people mean when they talk about inequality or poverty, and it's also hard to measure at the accuracy you need when small changes produce huge swings in the result. In particular, I don't believe he's correctly accounted for government benefits; hardly anyone in the US is consuming less than $8,000/year.
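To put a number on that sensitivity, a rough sketch reusing the ten-person example above (same hypothetical 8,000-hour year): re-measuring the poorest person at $8k instead of $800 leaves the median untouched but cuts the "mean hours to earn $1" by a factor of six.

```python
import statistics

HOURS_PER_YEAR = 8_000  # same rough assumption as above

def mean_hours_per_dollar(incomes):
    return sum(HOURS_PER_YEAR / i for i in incomes) / len(incomes)

measured = [800]   + [80_000] * 8 + [800_000_000]   # poorest person's cash income only
with_aid = [8_000] + [80_000] * 8 + [800_000_000]   # same person counting benefits/consumption

print(statistics.median(measured), statistics.median(with_aid))   # 80000 vs 80000: unchanged
print(round(mean_hours_per_dollar(measured), 2),                  # ~1.08
      round(mean_hours_per_dollar(with_aid), 2))                  # ~0.18: huge swing
```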