I’m as frustrated as anybody else with how the economy is going in the US. But we should be skeptical about a new metric with an intuitive name that seems to confirm exactly what we all suspect but is sort of complex to interpret/measure, right?
In particular, it seems weird that only the US had a massive change during COVID.
Also seems a little odd that Germany was always better than the US, even in the 90’s when things were pretty good here.
Putting it together, we need to have COVID all the time here, so we can match the economic development of Germany immediately post-reunification.
Interesting. I hope this catches on -- it's tough to visualize poverty in concrete terms, but assessing how long it takes a population to earn a given unit of purchasing power on average is a clever idea. It's sobering to realize that it would take a friend across the sea over 100 hours to assemble the funds for a $100 bill that I wouldn't look at twice...
Although I wish this sparked a conversation on how we can do better instead of national dick measuring contests. Those don't help.
Average != median. This measure seems to be so high because there are so many low paid workers in the US due to low minimum wage.
Median workers in the US have some of the highest hourly wages at PPP in the rich world and they have been increasing, but they are pretty similar to those in Germany. The big difference in annual pay at PPP is down to hours worked.
For 2022, average annual hours worked per worker in the US was 1,790, while in Germany it was 1,340 [1]. Meanwhile, average hourly wages at PPP were $34.9 in the US vs $34.6 in Germany [2].
[1] https://ourworldindata.org/grapher/annual-working-hours-per-...
[2] https://ourworldindata.org/grapher/average-hourly-earnings
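A quick back-of-the-envelope check using the figures quoted above (a sketch; the OWID series may define hours and wages slightly differently):

```python
# Rough annual pay at PPP = average hours worked x average hourly wage at PPP,
# using the 2022 figures quoted above (wages in international dollars).
us_hours, us_wage = 1790, 34.9
de_hours, de_wage = 1340, 34.6

us_annual = us_hours * us_wage   # ~62,471 int'l dollars
de_annual = de_hours * de_wage   # ~46,364 int'l dollars

print(f"US:      {us_annual:,.0f}")
print(f"Germany: {de_annual:,.0f}")
print(f"Ratio from hours alone: {us_annual / de_annual:.2f}x")
```

With near-identical hourly wages, the ~35% gap in annual pay is explained almost entirely by hours worked.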
This sounds like so much Eurocope: https://www.noahpinion.blog/p/eurocope
> The $1 is measured in international dollars. This means it buys the same amount of goods and services in any country as a US dollar does in the United States. It is often used alongside purchasing power parity (PPP) data. The “time” refers to a day of life for anyone, at any age and in any circumstance — not just the hours worked by someone with a job.
So IIUC this "average poverty" (measured in time per international dollar) includes people living off social welfare? Otherwise, if it only included the working population, wouldn't we have
average poverty ≝ (average yearly income* of the working population / 1yr)⁻¹
and so it should be inversely proportional to the average yearly income* metric mentioned in the article?

*) Adjusted for purchasing power, i.e. measured in international dollars.
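To make the proposed inverse relationship concrete, a minimal sketch (the income figure is hypothetical, purely for illustration):

```python
# If "average poverty" were simply the inverse of average working income,
# the time per international dollar would fall out directly.
MINUTES_PER_YEAR = 365 * 24 * 60          # 525,600

avg_income = 50_000                        # hypothetical, int'l dollars/year
minutes_per_dollar = MINUTES_PER_YEAR / avg_income
print(f"{minutes_per_dollar:.1f} min per $1")   # ~10.5 min
```

A figure like this is far below the article's 63 minutes, which is consistent with the quoted definition counting every person-day (anyone, at any age), not just workers' income.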
This metric makes a lot of intuitive sense and reflects the consumer sentiment I hear from neighbors. "Working more for less" isn't a new complaint, but something that measures that is interesting.
I would be very interested to find out how those stats relate to things like the Gini coefficient or old pre-GDP economic measures of raw production.
I would guess this is because places like Germany have incredibly low annual working hours.[1] The bottom of the list is populated entirely by European countries.
[1] https://en.wikipedia.org/wiki/List_of_countries_by_average_a...
It feels counterintuitive to me that US "average poverty" dropped more than 50 percent during COVID, while the European figures stayed absolutely untouched.
Sterck's article from The Conversation referenced in this article: https://theconversation.com/measuring-poverty-on-a-spectrum-...
The article doesn't explain how the math works. If the minimum wage is $15 an hour, or even $10, how do they arrive at $1 per 63 minutes???
He finds that “average poverty is substantially higher in the US, even though average incomes are higher than in most Western European countries”.
That seems like a complicated way to "talk about median income without talking about median income". By the end, they do describe the basic situation: the US has greater total wealth and total income, but that wealth and income are so unequally distributed that more people are poor.
OK, the idea is interesting, but the numbers seem completely bogus. In what world does it take the average American 63 minutes to earn $1, even one "international" dollar?
I get the "international" part - purchasing power. The number still seems way off, though.
In a time when minimum wage is $7/hr, how is the average American earning $1/hr?
Can anyone make that number make any sense?
This seems like a quite nice way to measure poverty.
I believe this is the paper https://ora.ox.ac.uk/objects/uuid%3A501e8eb8-3ce7-4ac0-9d09-...
It seems biased to ignore things like growth in housing prices and the stock market, where we have seen some massive gains in recent years. If it's easy to invest in property, companies, bonds, treasuries, or whatever to make a dollar, that should count.
The declining standard of living in the USA has become painfully obvious. I think we're past solutions. The question is whether it will go the way of Italy or the way of Yugoslavia.
This metric rates very high on my private Compensation-Obscuring Paltry Earnings (COPE) index.
Is the measure they are using inflation-adjusted over time? If not, this shows an enormous loss in purchasing capacity over time for the average person, which is certainly how it's felt over the past decades as inflation has outrun wages for most people.
The new measure:
> As of 2025, the time needed to earn $1 is 63 minutes in the US.
Confused, I clicked one of the links and tried to understand. Found this:
> The time to get $1 refers to a day of life for anyone at any age and in any circumstance, not just the hours worked by someone with a job.
Clicking another link took me to the abstract at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4785458 but that didn't answer any questions either.
I can't find anything of real substance in this, other than someone trying to redefine a lot of terms in confusing ways.
$1 every 63 minutes would be $8343/year. I cannot think of any way to reconcile that with the US average household income or any other related figure.
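The annualization is straightforward to check (a sketch, assuming nothing beyond a 365-day year):

```python
# Annualize "$1 every 63 minutes" over every minute of the year.
MINUTES_PER_YEAR = 365 * 24 * 60     # 525,600
minutes_per_dollar = 63

implied_annual = MINUTES_PER_YEAR / minutes_per_dollar
print(f"${implied_annual:,.0f}/year")   # ~$8,343
```

So the $8,343/year figure does follow from the headline number; the puzzle is what population and income concept make 63 minutes per dollar the average.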