Businesses put AI innovation on hold to avoid public backlash

A major analysis of barriers to AI adoption highlights bias and a lack of explainability among key risk factors

Several significant barriers to building AI embedded with ethical principles are contributing to swelling public distrust in automated and data-driven technologies and software, according to the Centre for Data Ethics and Innovation (CDEI).

A number of UK sectors may be unwilling to experiment and engage in AI-based innovation for fear of sustaining reputational damage, a panel of more than 120 experts concluded in the arm's-length government organisation's AI Barometer.

There are considerable opportunities in successfully adopting ethically embedded AI, ranging from a fairer justice system to more efficient decarbonisation and more effective public health research and disease tracking. However, these ‘harder to achieve’ opportunities are unlikely to be realised without concerted government support and a clear national policy, according to CDEI chair Roger Taylor.


“These opportunities have a number of common characteristics,” Taylor said. “They require coordinated action across organisations or ecosystems; they involve the use of very large-scale complex data about people; and they affect decisions that have an immediate and significant impact on people’s lives.

“The second overarching conclusion is that there are a number of common barriers to achieving these ‘harder to achieve’ benefits. Some relate to the workforce – the skills and diversity of those working on these problems. Some involve our state of knowledge, about, for example, what the public will accept as ethical. Others relate to the data governance and regulatory structures we currently have in place.”


As part of the barometer, the panel assessed the likelihood and severity of 19 common risks across five sectors: criminal justice, financial services, health and social care, digital and social media, and energy and utilities.

Bias leading to discrimination and a lack of explainability were both deemed severe risks in four of the five sectors. Cyber attacks, the failure of consent mechanisms and a lack of transparency were also judged severe risks in most industries, and moderate risks in the others.


Conversely, the panellists rated the loss of trust in institutions as a severe or moderate risk, but considered the loss of trust in AI itself, along with low accuracy, to be a generally low-to-moderate risk across the five sectors.

There are several barriers that exist to addressing these risks, the report continued, ranging from regulatory confusion to market disincentives.


Regulatory confusion may arise, for example, with new technologies such as facial recognition, where questions of ethics and application can fall between the remits of disparate regulators. Market disincentives, meanwhile, might manifest as social media companies fearing a loss of profits if they take action to mitigate disinformation.

The CDEI picked out three barriers in particular that are acutely contributing to a swelling sense of public distrust: low data quality and availability, a lack of co-ordinated policy and practice, and a lack of transparency around AI and data use.


The use of poor data in training algorithms can lead to faulty or biased systems, the report outlined, while the concentration of market power over data and an unwillingness to share it further stymie innovation.

The guidance, training and approaches used across the development and deployment of AI and data-driven systems are also highly localised and disparate. Regulatory approaches may vary between sectors, which can lead to confusion among those both deploying and overseeing the technology.

This comes in tandem with a lack of transparency, with neither the private nor the public sector always being open about how they use AI, or how they are regulated. This prevents the scrutiny and accountability that would otherwise support ethical innovation.

If these barriers are not addressed, they will feed a chronic loss of trust, which the report deems a bigger brake on innovation than several of the barriers combined. Consumers would then be unlikely to use new technologies or to share the data needed to build them, which would not only undermine businesses' ability to build functional and useful AI products, but also deter them from innovating for fear of meeting opposition.


The CDEI plans to promote the findings of its 152-page AI Barometer to policymakers and other decision-makers across the industry, in regulation and in research. The report will also be further expanded over the next year with the panel examining additional sectors to gather a broader understanding of the barriers to implementing ethical AI.

The body is also launching a new programme of work that aims to address many of the institutional barriers as they arise in various settings, ranging from policing to social media platforms. The CDEI plans to work with private sector and public sector partners to ensure the recommendations are taken seriously and implemented.
