Automation
Kristian Willmott, Head of Marketing

Socially Responsible Automation

The first industrial revolution was propelled by great innovations in automation and machinery. Amid the humming of the spinning jenny, the hissing of Watt's steam engine, the tapping of Morse code and the light of the incandescent lamp, a revolution was forged through automation, ingenuity and an unleashing of the entrepreneurial spirit. However, this revolution also brought the death of industries, the unemployment of artisans, and an insecurity that has been cultivated in the minds of workers ever since.

This very insecurity drove the Luddite movement to scourge the industrialising North of England between 1811 and 1816, burning machinery and destroying property. As we embark on the next industrial revolution, driven by greater automation, digitalisation and hyper-connectivity, industry needs to consider the societal costs of the next leap forward. By encouraging socially responsible automation (SRA), industry can safeguard the human role in the labour market and build the political support needed for greater integration of artificial intelligence (AI) and machine learning (ML) into society.

The Luddite uprisings of the first industrial revolution, the communist revolutions of the early 20th century and the industrial strife of the 1970s sent shock waves through the pillars of capitalism. If we fail to account for the societal costs of the next industrial revolution, we risk similar upheavals.


Successive industrial revolutions ushered in new machinery and, later, industrial robotics, automating much of manufacturing and farming and increasingly displacing human labour in the process. However, unlike the low-skilled labour replaced in previous leaps forward, automation will now be geared towards more complex tasks, infiltrating the knowledge economy and producing cognitive agents. This disruptive power could upset the medium- and high-skilled labour markets, with profound impacts on inequality and employment rates. It is therefore essential to consider SRA, a human-centred template for future automation. For this to happen, the technology community will need to take a more humanistic approach to designing and developing automated products, alongside its traditional focus on economic value. This would require developers to understand the "societal costs" of their products and build "human values" into AI (Sampath and Khargonekar, 2018).

Additionally, SRA would allow for more predictable and stable jobs, giving people confidence that they won't be replaced by automation and enabling greater cooperation between human capital and AI. A blended approach can already be witnessed in the warehouses of Amazon, Ocado and Alibaba, where robots do the systematic and menial jobs of transporting and packing goods. This allows human workers to focus on flexible and accountable work, while firms cut operational costs and speed up fulfilment. Another good example is Toyota, which has a proud history of automation but does not 'primarily target labour to reduce production expenses' (Sampath and Khargonekar, 2018); instead, it uses automation to increase product quality and improve the efficiency of assembly.


Ultimately, the question boils down to how we continue automating the economy, with all its efficiencies and productive value, while protecting the livelihoods of millions of people. There is a balance to be struck between cut-throat automation and innovation-hampering labour protections. History can be our guide: too much automation will spark the fear that set the Luddite rebellions ablaze; too much regulation will recreate the innovation deficit that plagued the USSR and contributed to its collapse in 1991.


To conclude, socially responsible automation will be essential to ease the transition to the new economy and limit the potentially harmful societal costs of automation. If technology developers, regulators and industrialists all take a socially responsible outlook on automation, economic values and human values can be aligned, but there is a balance to be found. What is your position? Should we allow automation to follow its natural course, or should we regulate which roles are automated to ensure human value is protected?