Rex Kerr
2 min read · Jul 23, 2022


Do you have a source that supports this claim? I haven't seen one.

Consider Amazon, for instance. They're the poster child for exploitative low-wage work these days, right?

But most of their growth and half their profit come from AWS--almost entirely highly skilled work done by people who are compensated decently (because they are in demand by other companies).

Also, what do you consider low-wage? Average income for PhD holders is $100k/year. Of course you can get a PhD and go drive for Uber...but most people don't.

The human cost of having lots of people with low income is undeniable. But unlike when Marx wrote Capital, it is no longer obvious that low-income work is actually the engine of productivity and profits. Companies, sociopathically serving shareholder interest above all else as is their mandate, will of course exploit everyone for profit if they can. But a lot of low-wage work can be automated right now; the reason it isn't is that wages are low enough that it's not yet worth it.

As an example of which jobs are vulnerable and which are secure: https://www.cityam.com/exclusive-the-jobs-sectors-and-countries-most-at-risk-of-automation-and-robotics/

Almost all of the highly vulnerable jobs are low-wage. Almost all of the secure ones are highly skilled.

So unless you have a solid source for this, I just think this premise is wrong now (and has been for a while, I'd guess).

Although the human cost is just as real, the solutions are totally different if the reality is that a lot of low-wage work is just grandfathered in because investing in automation has a large up-front cost.

(We see that right now with China--a lot of their advantage over U.S. manufacturers is not in their lower labor costs--though they do have lower labor costs--but in their newer tooling and more extensive automation.)
