I’ve been meaning to write a post about the future of work. Several young people have expressed concern that increasing automation, especially from software and robots, will mean that there will not be jobs for millions of Americans in the future. I’ve pointed out that 95% of the US workforce in the late 1700s worked in agriculture, that the figure had dropped to 50% by 1900, and that today (2015) it stands at roughly 1.2%. And yet people have managed to find new kinds of jobs.
But then I read this opinion piece in the WSJ yesterday, and it was so excellent that I’m sharing it here.
What is unique about today’s digital revolution is the suspicion, fanned by progressives, that for the first time technology threatens to make obsolete not only some jobs—as assembly-line robotics has, for instance—but human labor itself. The fear is that we are entering an era when sophisticated software will leave only the most intelligent with high-paying jobs.
Political activist Jeremy Rifkin made this case in his 1995 book, “The End of Work.” The world, he argued, “is fast polarizing into two potentially irreconcilable forces: on the one side an information elite that controls and manages the high-tech global economy; and on the other, the growing numbers of displaced workers, who have few prospects and little hope for meaningful employment in an automated world.”
In the winter 2012 issue of the socialist journal Jacobin, editor Peter Frase laid out what has become a familiar refrain: American capitalism is under challenge not by an oppressed proletariat, but by an underused one, seeking both sustenance and dignity in an increasingly automated workplace.
So strongly is today’s left committed to this idea of thwarted human potential that a recent guide to politically incorrect language, distributed by the University of California administration, advised against phrases such as “America is the land of opportunity.” The newest sin among progressives is even the suggestion that the average person has a chance to be self-supporting.
No wonder that in her attempt to placate the progressive wing of the Democratic Party, Hillary Clinton focused her economic speech last month at Manhattan’s New School on a polarized economy “benefiting high-skilled workers, but displacing and downgrading blue-collar jobs and other mid-level jobs that used to provide solid incomes for millions of Americans.” Among her solutions: artificially raising the wages of poorly educated workers and putting the recoveries from more-aggressive government lawsuits against corporations “into a separate trust fund to benefit the public.”
Unfortunately for the left, not everyone who studies technological disruption is convinced that digital innovation threatens the job market—or that American society is splitting between a few super-intelligent overlords and a permanent underclass.
At MIT’s Center for Digital Business, for instance, Erik Brynjolfsson has long argued that computers, while great at pattern recognition and repetitive tasks, are “lousy general problem solvers” and, for all their speed, have “little creative ability.” While they may someday chauffeur us to the office in driverless cars, they “are lost when asked to work even a little outside a predefined domain.”
In the future, becoming a policeman, plumber or doctor will not require beating out a computer for the job, Mr. Brynjolfsson writes in “Race Against the Machine,” his 2011 book written with Andrew McAfee. It will involve knowing how to use technology to maximize one’s performance. Success won’t demand an intellectual capacity beyond the current average, but it will require better preparation than most K-12 schools provide. “Unfortunately,” he adds, “our educational progress has stalled.”
That poor schooling, and not some intrinsic human limitation, is the real barrier to full employment seems to be borne out by what economists call the “skills gap.” More than nine million Americans are currently looking for work, yet 5.4 million job openings sit unfilled, according to the Bureau of Labor Statistics. The largest increases in openings have been in health care and in professional and business services.
In a recent study by the large U.S. online job site CareerBuilder, more than half the employers surveyed had positions for which they could not find qualified candidates: 71% had trouble finding information-technology specialists, 70% engineers, 66% managers, 56% health-care and other specialists, and 52% financial operations personnel. Nearly half of small and medium-size employers say they can find few or no “qualified applicants” for recent vacancies, according to the latest survey by the National Federation of Independent Business.
With the Labor Department conceding that help-wanted postings have “remained at a historically high level,” this is the time not to rail against technology but to use it to make education more effective: gearing coursework to the learning styles of individual students, identifying and remedying disabilities early on, and providing online access to the best classes in the world.
Creating the kind of educational system that will preserve the value of human labor far into the future will not be easy, especially when progressivism’s strongest allies, the teachers unions, are reluctant to pitch in. But if Aldous Huxley, George Orwell and other great science-fiction writers have taught us anything, it is that groups identified as “obsolete” can quickly be deemed “inferior,” and that a paternalistic government can become its beneficiaries’ worst nightmare.
Mr. Andrews was executive director of the Yankee Institute for Public Policy from 1999 to 2009. He is the author of “To Thine Own Self Be True: The Relationship Between Spiritual Values and Emotional Health” (Doubleday, 1989).