Will the IoT Lead to Mass Layoffs or Massive Productivity?
The Internet of Things could be a double-edged sword for employment, but there is reason for optimism for those who embrace the technology.
July 7, 2016
2016 has been a bloody year for tech layoffs at big companies such as IBM, Intel, Microsoft, and VMware. A recently leaked memo from IBM shed light on that firm’s plans to cut workers in the Netherlands: “The demand of the market for new skills and capabilities, is changing fast due to new technologies and changing business models.” The memo continues: “Our customers have a need for new insights, knowledge and capabilities, making the existing expertise redundant. That is why the optimization of our workforce is a permanent and ongoing part of our business model.”
For IBM, technologies such as its IoT-savvy cognitive computing platform Watson will lead the way forward, while its legacy businesses become less important.
The same general principle applies to employees. Earlier this year, IBM had an undisclosed wave of layoffs, which could ultimately affect more than 14,000 workers, according to an estimate quoted in The Wall Street Journal. Meanwhile, the company has nearly 8,000 job openings in areas such as its Watson division.
Other big companies are taking similar steps. Earlier this year, Intel cut 12,000 workers. Intel CEO Brian Krzanich reasoned that it was imperative for the company to move beyond its dependence on the PC business and reinvest the savings in its cloud computing and Internet of Things business units.
How the IoT Might Transform Employment
On the one hand, the Internet of Things could lead to huge gains in productivity. According to GE’s estimates, the Industrial Internet will add more to the global economy by 2030 than any major economy except the United States and China. If the industrial revolution sparked by the IoT follows the trajectory of previous industrial revolutions, average income and living standards are poised to rise in industrially advanced economies. GE further estimates that productivity driven by the Industrial Internet could boost average U.S. income by 20% to 30% by 2030. The Industrial Internet could also deliver enormous productivity benefits in agriculture, infrastructure, manufacturing, and other fields.
On the other hand, the IoT and related technologies like cloud computing could make millions of traditional jobs redundant. A 2013 Oxford study titled “The Future of Employment: How Susceptible Are Jobs to Computerisation?” concluded that 47% of U.S. jobs are at risk of being replaced by machines. McKinsey & Company reckons that 45% of current work activities could be automated with existing technology. Workers in sales, service, transportation and material moving, production, and office and administrative support could be among the hardest hit, but even professionals such as doctors and lawyers could get squeezed by technology. In the transportation segment alone, self-driving cars, trucks, buses, trains, and ships could have a huge impact.
Autonomous delivery robots are already being tested in Europe. The automotive industry could also struggle to adjust to a transportation model in which people give up private vehicle ownership in favor of shared driverless transportation.
The potential of technology to outperform humans at particular tasks was hinted at in the late 1990s, when Kevin Ashton came up with the name “Internet of Things.” Then selling cosmetics in the United Kingdom, Ashton wondered why no one seemed to be able to track why a particular shade of lipstick kept disappearing from store shelves, and thought that a chip with an RFID antenna could do a better job of tracking what was going on. In 2009, Ashton explained to RFID Journal: “people have limited time, attention and accuracy—all of which means they are not very good at capturing data about things in the real world.”
While sensors and computers may be adept at collecting and analyzing data, it is likely premature to conclude that the Internet of Things will cause widespread unemployment. Researchers such as Erik Brynjolfsson and Andrew McAfee, however, have argued that advances in technology help explain the tepid employment growth in developed nations over the past decade or two.
It is certainly not hard to see technology’s role in the dwindling number of agricultural laborers, for instance. Roughly a century ago, agricultural workers made up close to 5% of the total workforce in the United Kingdom, reports Katie Allen in The Guardian. Now, the figure is a fraction of a percent. But the same article argues that technology has driven substantial increases in the number of jobs in knowledge-intensive sectors and, after examining 140 years’ worth of data, concludes that technology has created more jobs than it has destroyed.
Last year, The Guardian asked whether the IoT and automation technologies would cause massive unemployment. It is much more likely, however, that these technologies will cause gradual, albeit significant, shifts in employment over time. As Harvard professor Shoshana Zuboff has quipped, “everything that can be automated will be automated.” But even if companies wanted to automate everything possible all at once, they would not be able to. In the meantime, the employment market seems to be getting more competitive.