
Aurornis (today at 6:46 PM)

The new measure:

> As of 2025, the time needed to earn $1 is 63 minutes in the US.

Confused, I clicked one of the links and tried to understand. Found this:

> The time to get $1 refers to a day of life for anyone at any age and in any circumstance, not just the hours worked by someone with a job.

Clicking another link took me to the abstract at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4785458 but that didn't answer any questions either.

I can't find anything of real substance in this, other than someone trying to redefine a lot of terms in confusing ways.

$1 every 63 minutes would be $8343/year. I cannot think of any way to reconcile that with the US average household income or any other related figure.
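A quick sanity check of that figure (my own arithmetic, assuming the metric counts every minute of the year rather than only hours worked, per the "day of life" definition quoted above):

    # Minutes in a (non-leap) year
    minutes_per_year = 365 * 24 * 60           # 525,600

    # If earning $1 takes 63 minutes of life, the implied annual figure is
    implied_annual = minutes_per_year / 63     # ~8,342.9

    print(f"${implied_annual:,.0f} per year")  # -> $8,343 per year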


Replies

akamaka (today at 7:03 PM)

I did the same math. The closest guess I have is that it is derived from the poverty line for a family of four, $32,150 (which divided by four is $8,037.50).
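A short check of both figures side by side (back-of-the-envelope only; the paper doesn't say this is how the number is derived):

    poverty_line_family_of_four = 32_150                 # threshold quoted above
    per_person_share = poverty_line_family_of_four / 4   # 8,037.50

    implied_annual = 365 * 24 * 60 / 63                  # ~8,343, from "$1 per 63 minutes"

    print(per_person_share, round(implied_annual))       # 8037.5 8343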

tovej (today at 7:08 PM)

That's because it is the average of the "time to earn $1" per individual.

So let's say you're Elon Musk, and the time it takes you to earn $1 is negligible enough that we can say t_Elon = 0.

Now say you are well below the poverty line and earn $6,000/year. With 525,600 minutes in a year, this means t_Poor ≈ 88 minutes.

If we average 80 people at t_Poor and 20 at t_Elon, we get 0.8 x 88 mins ≈ 70 mins, even though the average income in this case would be roughly 0.2 x income_Elon, something like $7 billion/year.

I hope this shows why you can't just take the inverse to get the average income. That would only be true if everyone earned exactly the same income.
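Here's a small sketch of that arithmetic (the 80/20 split and the $6,000 income are from the example above; the $35 billion/year figure for income_Elon is a made-up number chosen so the averages come out roughly as stated):

    MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

    def time_to_earn_dollar(annual_income):
        """Minutes of life needed to earn $1 at a given annual income."""
        return MINUTES_PER_YEAR / annual_income

    # Hypothetical population: 80 people at $6,000/year, 20 at $35 billion/year
    incomes = [6_000] * 80 + [35_000_000_000] * 20
    times = [time_to_earn_dollar(i) for i in incomes]

    avg_time = sum(times) / len(times)        # ~70 minutes, dominated by the low earners
    avg_income = sum(incomes) / len(incomes)  # ~$7 billion, dominated by the high earners

    # Inverting the average time does not recover the average income
    print(round(avg_time), round(MINUTES_PER_YEAR / avg_time))  # 70 7500
    print(round(avg_income))                                    # 7000004800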

Why is this a better metric?

The average income is biased towards big earners, while this metric is more centered around the mode of the distribution (poor people).

It captures the income distribution much better than average income.
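A toy simulation to illustrate that claim (the right-skewed log-normal income distribution here is made up, not real data):

    import math
    import random
    import statistics

    MINUTES_PER_YEAR = 365 * 24 * 60

    random.seed(0)
    # Right-skewed toy distribution: median ~$40k with a long upper tail
    incomes = [40_000 * math.exp(random.gauss(0, 1.0)) for _ in range(100_000)]

    mean_income = statistics.mean(incomes)      # pulled up by the tail, ~$66k
    median_income = statistics.median(incomes)  # ~$40k

    # Average each person's minutes-per-dollar, then convert back to an income
    avg_minutes = statistics.mean(MINUTES_PER_YEAR / x for x in incomes)
    implied_income = MINUTES_PER_YEAR / avg_minutes  # harmonic mean, ~$24k

    print(round(mean_income), round(median_income), round(implied_income))

The income you get back from the averaged time sits below the median, so the metric weights the lower earners much more heavily than the plain average does.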
