In the first quarter of the 19th century, the Luddites, textile workers gravely concerned about the degradation of their jobs and working conditions in the midst of the Industrial Revolution, carried out a series of violent revolts in England. These disgruntled workers smashed and burned the weaving machines used to make fabric and clothing. While their grievances were mostly about unfair labor practices, news reports at the time focused on the Luddites' alleged hatred and mistrust of the industrial machines doing away with their jobs.
The Luddite revolts have come to symbolize today's fear in many employee circles about machines, robots and artificial intelligence disrupting the workplace by eliminating or diminishing many occupations. From a historical point of view, the Luddites were proven wrong; mass production and manufacturing eventually lowered consumer prices and increased demand, which ultimately resulted in more jobs and safer working conditions.
It would be tempting to dismiss today's fears about the potential of automation, robotics and artificial intelligence to eliminate jobs as Luddite sentiment or technophobic fantasy, but some of the concerns are valid. Consider the deep disaffection among workers in the American manufacturing sector, many of whom helped elect Donald Trump as President of the United States on the strength of his campaign promises to bring high-paying factory jobs back to depressed industrial towns. A similar anxiety is taking hold among skilled workers and professionals who feel that artificial intelligence threatens their job security.
The debate surrounding the incorporation of advanced technology such as artificial intelligence into the workplace is marked by two main opposing factions. On one hand, there are those who hold a fatalistic view of Luddite proportions. On the other, there are those who take a hopeful view in which technology brings greater efficiency without massive job losses.
The Pew Research Center conducted a survey of 1,896 computer science and technology experts about how robotics and artificial intelligence will shape our everyday lives by the year 2025. With regard to the economic impact on the future of jobs, the experts surveyed were almost evenly split between those who think that worker displacement will be extensive (48%) and those who believe that the same innovation that continues to drive AI development will prevail in creating jobs and expanding industries.
Although the Luddites were eventually proven wrong, their concerns seemed valid for years after many of them were executed for their violent reaction to the Industrial Revolution. In the British city of Manchester, a major industrial textile center, 19th-century engineer Richard Roberts was approached by wealthy mill owners exasperated by workers who, emboldened by the Luddites, staged labor stoppages and demanded higher wages and improved working conditions. Roberts created an advanced yarn-spinning machine that came to be called the Iron Man because of its efficient mechanism that mimicked human actions. The Iron Man and the more advanced textile machines that followed seemed to confirm the Luddites' fatalistic view, and that view is somewhat echoed by a January 2017 study published by the respected consultancy McKinsey & Company, which determined that 30% of the tasks, duties and responsibilities across 60% of modern occupations can be automated or computerized.
AI software could result in the global loss of five million jobs by the year 2020; this gloomy forecast was recently issued at the World Economic Forum. As far back as 2013, Oxford University researchers suggested that nearly half of all American jobs could be threatened by the advance of automation across industries. It is important to note that the Oxford study underscores that jobs will not entirely disappear. Many positions will be redefined, but that does little to assuage the fears of workers at a time when companies' plans to integrate more automation into their plants are already eliminating some positions.
Manufacturing often comes to mind in the fatalistic view because it is a blue collar sector associated with predictable and repetitive tasks. Nonetheless, tax preparation professionals should also be concerned about working in an industry where 99 percent of tasks can be automated, particularly at a time when market leader H&R Block is using Watson, the famous AI supercomputer developed by IBM.
It is of no consolation to workers when respected scientists such as Moshe Ya'akov Vardi of Rice University, who seems to be in the fatalistic camp, explain that society needs to confront questions such as: what will humans do when all the work is performed by AI and robots? Vardi is not alone in this questioning. American tech entrepreneur Elon Musk, founder of ventures such as Tesla and SpaceX, is concerned that AI development is getting too far ahead of itself, and he is actively investing in AI ventures that would allow individuals to connect to neural networks and benefit from experiences similar to machine learning. Musk is of the opinion that humans should not get a raw deal from AI and automation; he believes they deserve a chance to develop the skills to "compete with the machines."
A Better Tomorrow: The Hopeful View
Moshe Vardi and Elon Musk certainly make valid points that do not subscribe to Luddite technophobia, but their viewpoints are at odds with those who think that AI and other advanced technologies will not take away our jobs. These hopefuls, for their part, point to evidence of automated contraptions gone wrong.
From tech giant Amazon purchasing supermarket chain Whole Foods to Panasonic testing robotic grocery clerks in Japan, checkout automation these days is looking more and more like the fictional Carl's Jr. automated kiosks in the 2006 futuristic comedy film "Idiocracy." In that movie, the malfunctioning kiosks provoked vandalism by aggrieved customers. In 2017, researchers from Queensland University in Australia determined that self-checkout stations are giving loss prevention departments a headache because of an effect known as the deviancy threshold.
In the aforementioned study, researchers determined that individuals withdrawing funds at ATMs are far more likely to pocket an erroneously dispensed $20 bill than they are to take it from a human bank clerk who makes a mistake. Speaking of ATMs, former Google CEO Eric Schmidt is fond of citing a study published by the American Enterprise Institute on the employment levels of American bank tellers since ATMs were introduced about five decades ago. Whereas ATMs were once thought of as inventions that would put bank clerks out of work, the reality from 1988 to 2004 was that 43% more physical banking branches were opened and staffed, resulting in a higher number of employees hired.
Back to the automated point-of-sale and checkout stations: Sociology Professor Stacy Torres from Albany University penned an emotional New York Times article praising the pleasant, 10-year relationship she had with Esther, a friendly Manhattan cashier who worked at a bakery. Professor Torres wrote warmly about Esther's humanity: the way she remembered coffee orders, her earnest glee at seeing pictures of customers' grandchildren, and her attentive way of helping elderly customers. Amazon is already working on a completely automated neighborhood market, so it is not unreasonable to think that CEO Jeff Bezos may approve an "Esther the Automated Register" project at Whole Foods, but we already know its shortcomings. The last thing a supermarket shopper wants to hear is a loud, shaming electronic voice or an admonishing beep when a pack of chewing gum does not scan correctly. These are the moments when you need the real Esther, not the automated version, which could end up vandalized by angry humans just like the Idiocracy fast-food kiosks.
Those who are hopeful about the continuing integration of AI into the workplace argue that the goal has never been to remove employees. Instead, the challenge for society is to embed the technology alongside workers in a more sensible and seamless manner.
Advanced AI systems are getting better and better at playing board games, filing tax returns, writing sports reports, and even helping recruiters score candidates. As much as modern-day Luddites may want to gloat about their fatalism being justified, the hopefuls are also right to think that massive unemployment will probably not materialize.
In June 2017, Jobscience conducted a global survey of recruiters to find out what they thought was impacting the search for talent. Those surveyed indicated that they are interested in AI and the ability to streamline recruiting efforts with the help of computers. They also felt the need to hire older, skilled workers, which means less emphasis will be placed on short-term contractors. It also means that those who have diversified their careers, used their talents to learn new skills, and obtained new degrees and certifications in their fields are the workers of the future. On the other hand, low-skilled individuals whose education levels are not optimal will have a hard time adjusting to, or finding a job in, the new tech economy.
Proper training of new employees, as well as retraining of displaced workers, remains essential for societies to think about. Skill-based training needs to address the current gaps in the labor market, and this is not going to be easy. It is unreasonable to think that truck drivers displaced by Otto, Uber's project to develop self-driving technology for long-haul road transportation, will easily obtain certifications in Java programming. In addition to identifying the skills demanded by the new labor market, feasible training programs must be developed through partnerships between government, educational institutions, and the private sector.
As the Luddite revolts of the 19th century proved, major changes are not easily assimilated by society. AI and automation are, in the end, technological issues; to think of these advances as dystopian harbingers of doom is not realistic, but neither is the assumption that humans are so adaptable that they will always figure something out.
Ever since the Information Age was heralded by the rapid advancement of computer networking, technological innovation has been welcomed with great fanfare. Revisiting Luddite history, we can surmise that tech innovation can also be painful, particularly when it seems as though it may take away our jobs and careers, two aspects of life that give it purpose.
We know that technology can create wealth and employment; we have seen it happen several times over the last five decades. We just need to review once again why the Luddites were wrong and how they were proven wrong, but we need to do so in a manner suited to the Brave New World we have created.