Get the Ludd Out

I am all for advancement; progress means growth and a move forward, and we should always move forward, never back. Funny how that does not apply to the political and social landscape when it comes to racism, sexism and classism. We are a class-centric nation, and when you are in the center, a move away is not always a good thing.

The tech sector prides itself on being the world's darling, and the mounting IPOs and the political elite's obsession with seeing it as a saving grace in an economic minefield are equal parts tragedy and comedy. The sector boasts of "saving the world," and by that I think they mean one billion-dollar IPO at a time. The world, not so much.

There is a never-ending hypocrisy and fraud associated with the tech sector, whose claims of meritocracy are limited to the few faces of color and ethnicity "recruited" via H-1B visas and the other immigrant shilling that defines immigration reform for the Valley by the San Francisco Bay.

But what it also means is a move away from job creation toward wealth creation: the slow disintegration of jobs and skills that are needed less and less as more toys, apps and other means are created to eliminate the need for people. From Google Glass to driverless cars, who needs people, right, Barbra?

The article below is about this very issue, and my personal favorite is the medical app that can predict postpartum depression in women from their Twitter feeds. I rolled on the floor with that one. Wow, the tech sector's idiocy and degradation of people smacks of both hubris and arrogance. Maybe that is why many are anti-abortion: they are afraid that if their parents had known they would breed such antisocial, angry, revenge-seeking children, they might have ended it all before it began. Breeding, it's what humans do; the tech sector fucks robots, apparently. Harsh? Yes.

Tech Leaps, Job Losses and Rising Inequality

APRIL 15, 2014
Eduardo Porter

It’s hard to overstate the excitement of tech people about what is on the verge of happening to the practice of medicine.

Eric Horvitz, co-director of Microsoft Research’s main lab in Redmond, Wash., told me about a system that could predict a pregnant woman’s odds of suffering postpartum depression with uncanny accuracy by looking at her posts on Twitter, measuring signs like how many times she used words like “I” and “me.”
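To make the mechanics less magical, here is a minimal sketch of the kind of signal such a system might compute; the pronoun set, the function name and the sample tweets are my own illustrative assumptions, not Microsoft's actual method, and a real predictor would feed many such features into a trained model.

```python
import re

# First-person singular pronouns: a crude proxy for the self-focused
# language the article says the predictor measured.
FIRST_PERSON = {"i", "me", "my", "mine", "myself"}

def first_person_rate(tweets):
    """Fraction of all words that are first-person singular pronouns."""
    words = [w for t in tweets for w in re.findall(r"[a-z']+", t.lower())]
    if not words:
        return 0.0
    return sum(w in FIRST_PERSON for w in words) / len(words)

sample = [
    "I keep thinking it's just me and my own fault",
    "Lovely walk in the park this morning",
]
print(first_person_rate(sample))  # one feature among many for a classifier
```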

Ramesh Rao of the California Institute for Telecommunications and Information Technology at the University of California, San Diego, described how doctors using video and audio to remotely assess victims of stroke made the correct call 98 percent of the time.

This is just the beginning. “The real innovative things have yet to be activated,” Mr. Rao said. “Whatever happens will be disruptive.”

That’s not the half of it.

A few years ago, this kind of technological development would be treated like unadulterated good news: an opportunity to improve the nation’s health and standard of living while perhaps even reducing health care costs and achieving a leap in productivity that would cement the United States’ pre-eminent position on the frontier of technology.

[Chart caption: The share of income taken by workers has been shrinking around the world, as labor faces more competition and new labor-saving technologies.]

But a growing pessimism has crept into our understanding of the impact of such innovations. It's an old fear, widely held since the time of Ned Ludd, who according to legend destroyed two mechanical knitting machines in 18th-century England and gave his name to the Luddite movement, humankind's first organized protest against technological change.

In its current incarnation, though, the fear is actually very new. It strikes against bedrock propositions developed over more than half a century of economic scholarship. It can be articulated succinctly: What if technology has become a substitute for labor, rather than its complement?

As J. Bradford DeLong, a professor of economics at the University of California, Berkeley, wrote recently, throughout most of human history every new machine that took the job once performed by a person’s hands and muscles increased the demand for complementary human skills — like those performed by eyes, ears or brains.

But, Mr. DeLong pointed out, no law of nature ensures this will always be the case. Some jobs — nannies, say, or waiting tables — may always require lots of people. But as information technology creeps into occupations that have historically relied mostly on brainpower, it threatens to leave many fewer good jobs for people to do.

These sorts of ideas still strike most mainstream economists as heretical, an uncalled-for departure from a canon that states that capital — from land and lathes to computers and cyclotrons — is complementary to labor.

It was a canon written by economists like Robert Solow, who won the Nobel in economic science for his work on how labor, capital and technological progress contribute to economic growth. He proposed more than 50 years ago that the share of an economy’s rewards accruing to labor and capital would be roughly stable over the long term.
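For readers who want the canon in symbols, the textbook illustration of Solow-era stable shares is the Cobb-Douglas production function; this sketch is my gloss, not anything from the article:

```latex
% Cobb-Douglas output with capital K, labor L and technology A:
\[
  Y = A K^{\alpha} L^{1-\alpha}, \qquad 0 < \alpha < 1 .
\]
% With competitive factor markets labor earns its marginal product,
\[
  w = \frac{\partial Y}{\partial L} = (1-\alpha)\,\frac{Y}{L},
  \qquad\text{so}\qquad
  \frac{wL}{Y} = 1-\alpha .
\]
% Labor's share is pinned at the constant 1 - alpha no matter how
% K, L or A evolve: the "roughly stable shares" proposition.
```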

But evidence is emerging that this long-held tenet is no longer valid. In the United States, the share of national income that goes to workers — in wages and benefits — has been falling for almost half a century.

Today it’s at its lowest level since the 1950s while the returns to capital have soared. Corporate profits take the largest share of national income since the government started measuring the statistic in the 1920s.

In a recent interview, Professor Solow stressed that his proposition of relatively stable labor and capital shares assumed “an economy in a steady state with no systematic structural changes occurring.”

That assumption doesn’t seem to hold anymore. “Over the last few decades something structural might be happening to the economy that seems to want to increase the capital share,” he said.

Professor Solow suggests that technology is probably not the only cause of labor’s declining share. He cites “everyday reasons,” including the erosion of the minimum wage, the decimation of trade unions and anti-labor legislation.

But technology clearly plays a role. “We will know better in 10 or 15 years,” Professor Solow said. “But if I had to interpret the data now, I would guess that as the economy becomes more capital intensive, capital’s share of income will rise.”

This shift is happening globally. In a recent article in the Quarterly Journal of Economics, Loukas Karabarbounis and Brent Neiman from the University of Chicago’s Booth School of Business found that the share of income going to workers has been declining around the world.

As the cost of capital investments has fallen relative to the cost of labor, businesses have rushed to replace workers with technology.

“From the mid-1970s onwards, there is evidence that capital and labor are more substitutable” than what standard economic models would suggest, Professor Neiman told me. “This is happening all over the place. It is a major global trend.”
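A compact way to see why the degree of substitutability matters (my notation, a standard CES exercise rather than the paper's own derivation):

```latex
% CES production with elasticity of substitution sigma:
\[
  Y = \Bigl( \alpha K^{\frac{\sigma-1}{\sigma}}
        + (1-\alpha)\, L^{\frac{\sigma-1}{\sigma}} \Bigr)^{\frac{\sigma}{\sigma-1}} .
\]
% Under competitive factor pricing the ratio of capital's share to
% labor's share of income is
\[
  \frac{s_K}{s_L} = \frac{\alpha}{1-\alpha}
      \left(\frac{K}{L}\right)^{\frac{\sigma-1}{\sigma}} .
\]
% If sigma > 1, cheaper capital goods push firms to raise K/L and
% capital's share climbs at labor's expense; sigma = 1 is the
% Cobb-Douglas knife-edge where shares never move.
```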

The implication is potentially dire: The vast disparities in the distribution of income that have been widening inexorably since the 1980s will widen further.

This is hardly a consensus reading of the record. “It is hard to make a very definite prediction about how the capital-income share will evolve over the next 10 years,” Daron Acemoglu, a colleague of Mr. Solow’s at M.I.T., told me. “Future technology could maybe increase the contribution of labor.”

Tyler Cowen, a professor of economics at George Mason University, argues that the very definitions of labor and capital are arbitrary. Instead, he looks around the world to find the relatively scarce factors of production and finds two: natural resources, which are dwindling, and good ideas, which can reach larger markets than ever before.

If you possess one of those, then you will reap most of the rewards of growth. If you don’t, you will not.

Conventional wisdom in economics has long held that technological change affects income inequality by increasing the rewards to skill — through a dynamic called “skill-biased technical change.” Losers are workers whose job can be replaced by machines (textile workers, for example). Those whose skills are enhanced by machines (think Wall Street traders using ultrafast computers) win.

It is becoming increasingly apparent, however, that this is not the whole story and that the skills-heavy narrative of inequality is not as straightforward as economists once believed. The persistent decline in the labor share of income suggests another dynamic. Call it “capital-biased technical change” — which encourages replacing decently paid workers with a machine, regardless of their skill.

For instance, research by the Canadian economists Paul Beaudry, David Green and Benjamin Sand finds that demand for highly skilled workers in the United States peaked around 2000 and then fell, even as their supply continued to grow. This pushed the highly educated down the ladder of skills in search of jobs, pushing less-educated workers further down.

This dynamic opens a new avenue for inequality to widen: the rise in the rewards to inherited wealth, a topic explored in depth in Thomas Piketty’s expansive new book, “Capital in the Twenty-First Century.”

So what about the long-term prospect of good jobs in medicine? Policy makers hold fast to the hope that a growing health care industry will support the American middle-class worker of the future. But technology could easily disrupt this promise too.

“Health care jobs may be safe now,” said Gordon Hanson, a professor of economics at the University of California, San Diego, “but our sense of what’s safe has been consistently belied by the impact of our technological progress.”

Or as Mr. Rao put it, diagnosing depression from Twitter posts “doesn’t require any medical training.”

The only safe route into the future seems to be to already have a lot of money.

Sympathy for the Devil

I wrote about Luddites in an earlier blog post, and with the announcement that more Americans are enrolling in, and in turn completing, college, that should be a good thing, right? Well, in better days it might have been, but in reality many degrees are already obsolete, useless and expensive relative to what the economy offers by way of compensation for the effort and expense.

The world-saving sector claims (as do others) that Americans currently lack the skill set for the jobs available. They are, however, quite vague as to what those skill sets are; unless they mean working for lower wages and fewer benefits without complaint, in which case it is quite clear.

I do agree that we have a serious problem with communication skills: deficits in basic writing and reading, and in turn in cognitively connecting to those disciplines, are rising. But consider that it was a 29-year-old high school dropout who shook the world to its core with his PRISM revelations, and that, ironically, only a week or two earlier another high school dropout was acclaimed for selling his utterly overhyped dot-com/bomb/app or whatever to Yahoo. Ah yes, the tech industry is awash in tales of famous dropouts who made it, Gates, Jobs and Zuckerberg, yet it claims that without that same degree Americans are useless for the jobs it outsources, insources or techs out.

Paul Krugman's column below discusses the hypocrisy, and in turn the bullshit, of the idea that a degree is what will save us from ourselves.

Sympathy for the Luddites

By PAUL KRUGMAN

Published: June 13, 2013

In 1786, the cloth workers of Leeds, a wool-industry center in northern England, issued a protest against the growing use of “scribbling” machines, which were taking over a task formerly performed by skilled labor. “How are those men, thus thrown out of employ to provide for their families?” asked the petitioners. “And what are they to put their children apprentice to?”

Those weren’t foolish questions. Mechanization eventually — that is, after a couple of generations — led to a broad rise in British living standards. But it’s far from clear whether typical workers reaped any benefits during the early stages of the Industrial Revolution; many workers were clearly hurt. And often the workers hurt most were those who had, with effort, acquired valuable skills — only to find those skills suddenly devalued.

So are we living in another such era? And, if we are, what are we going to do about it?

Until recently, the conventional wisdom about the effects of technology on workers was, in a way, comforting. Clearly, many workers weren’t sharing fully — or, in many cases, at all — in the benefits of rising productivity; instead, the bulk of the gains were going to a minority of the work force. But this, the story went, was because modern technology was raising the demand for highly educated workers while reducing the demand for less educated workers. And the solution was more education.

Now, there were always problems with this story. Notably, while it could account for a rising gap in wages between those with college degrees and those without, it couldn’t explain why a small group — the famous “one percent” — was experiencing much bigger gains than highly educated workers in general. Still, there may have been something to this story a decade ago.

Today, however, a much darker picture of the effects of technology on labor is emerging. In this picture, highly educated workers are as likely as less educated workers to find themselves displaced and devalued, and pushing for more education may create as many problems as it solves.

I’ve noted before that the nature of rising inequality in America changed around 2000. Until then, it was all about worker versus worker; the distribution of income between labor and capital — between wages and profits, if you like — had been stable for decades. Since then, however, labor’s share of the pie has fallen sharply. As it turns out, this is not a uniquely American phenomenon. A new report from the International Labor Organization points out that the same thing has been happening in many other countries, which is what you’d expect to see if global technological trends were turning against workers.

And some of those turns may well be sudden. The McKinsey Global Institute recently released a report on a dozen major new technologies that it considers likely to be “disruptive,” upsetting existing market and social arrangements. Even a quick scan of the report’s list suggests that some of the victims of disruption will be workers who are currently considered highly skilled, and who invested a lot of time and money in acquiring those skills. For example, the report suggests that we’re going to be seeing a lot of “automation of knowledge work,” with software doing things that used to require college graduates. Advanced robotics could further diminish employment in manufacturing, but it could also replace some medical professionals.

So should workers simply be prepared to acquire new skills? The woolworkers of 18th-century Leeds addressed this issue back in 1786: “Who will maintain our families, whilst we undertake the arduous task” of learning a new trade? Also, they asked, what will happen if the new trade, in turn, gets devalued by further technological advance?

And the modern counterparts of those woolworkers might well ask further, what will happen to us if, like so many students, we go deep into debt to acquire the skills we’re told we need, only to learn that the economy no longer wants those skills?

Education, then, is no longer the answer to rising inequality, if it ever was (which I doubt).

So what is the answer? If the picture I’ve drawn is at all right, the only way we could have anything resembling a middle-class society — a society in which ordinary citizens have a reasonable assurance of maintaining a decent life as long as they work hard and play by the rules — would be by having a strong social safety net, one that guarantees not just health care but a minimum income, too. And with an ever-rising share of income going to capital rather than labor, that safety net would have to be paid for to an important extent via taxes on profits and/or investment income.

I can already hear conservatives shouting about the evils of “redistribution.” But what, exactly, would they propose instead?