AI is now powerful enough to automate the back office



Artificial intelligence (AI), the ability of machines to learn and make decisions by analysing information, is enabling businesses to make significant improvements to the way they work.

Organisations and customers alike are excited about the cooler things AI can do, such as self-driving cars. More often, though, thoughts of AI conjure up images of robots, or automated functions, doing the work that used to be completed by humans. So how exactly is AI optimising the more mundane back-office functions?

The companies seeing the greatest benefits from automation in the back office are those that have focused on the same core areas: accounting, IT, and HR. From replacing manual data entry and invoicing in the finance department to form-filling in HR, the organisations seeing the strongest performance have either already introduced automation or are in the process of implementing it.

According to Automation Anywhere, digitally minded CIOs know that, in order to enhance their customer-facing experiences, their back-office functions also need to be effective and integrated.

Automation is a clear catalyst for boosting digital transformation, with Gartner predicting that by 2025, “70% of organisations will implement… automation to deliver flexibility and efficiency”. And to reinforce that anticipated growth, Statista sees the automation market reaching nearly £218 million (some $265 million) by 2025, having reached approximately £144 million ($175 million) in 2020.

Despite the leaps and bounds by which automation is expanding, some office departments are slow, or unlikely, to adopt it; in particular, those that manage sensitive processes requiring a degree of emotional intelligence. HR and legal are two departments likely to lag behind in adoption, with legal also facing the challenge of legacy attitudes. The recruitment side of HR, however, stands to benefit from automation as businesses adopt CV-screening tools, psychometric tests, and other digital processes in their hiring.

Part of the challenge is knowing where to start. There are many factors to consider, financial and technological among them, and there can be resistance from employees worried that automation could affect their roles. Organisations must weigh up whether removing the human element is sufficiently worthwhile.


Lost in translation

The first question organisations must ask themselves often centres on clarifying the terminology itself, according to the president of financial automation firm Beanworks, Karim Ben-Jaafar. “What are you looking for, machine learning or AI? Do you know the difference? Most companies don't. They think machine learning and AI are just synonyms – but they're not.”


For his fintech company, machine learning is an entirely passive – and far more expensive – tool, while AI allows processes to be taught to the system. He believes machine learning is falsely seen as a perfect solution that requires minimal effort to implement. In fact, choosing it for tasks it’s not fit for is prohibitively expensive and akin, he says, to being an early adopter of laser eye surgery (LASIK). “That's monstrously expensive right now,” he says. “That's like getting LASIK when it first came out at $100,000 an eye. The good news is the price is going down to the point where you're going to look at it like it's $500.”

For Jay DeWalt, chief operating officer of Arria NLG, meanwhile, the three key terms that frame his conversations about AI are natural language understanding (NLU), natural language processing (NLP) and natural language generation (NLG). He uses the analogy of children learning to speak as a way to frame the technology his company uses in a wide array of industries and departments. First, they learn how to understand commands from their parents, before learning the sentiments of those commands, and then beginning to understand meaning. It isn’t until later, though, that they articulate those words in a meaningful and insightful way.
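To make the data-to-text idea behind NLG concrete, here is a minimal, invented sketch in Python. It is not Arria NLG’s technology or API; the function name, field names, and sales figures are made up purely to show the basic shape of the technique: structured data in, a readable sentence out.

```python
# Toy illustration of natural language generation (NLG): turning structured
# data into a readable sentence. This is a hand-rolled sketch, not Arria NLG's
# product or API; the figures below are invented.

def generate_summary(region: str, current: float, previous: float) -> str:
    """Produce a one-sentence narrative from two data points."""
    change = current - previous
    pct = (change / previous) * 100 if previous else 0.0
    direction = "rose" if change > 0 else "fell" if change < 0 else "held steady"
    return (
        f"Sales in {region} {direction} to £{current:,.0f} "
        f"({pct:+.1f}% against the prior period)."
    )

if __name__ == "__main__":
    # Example: a back-office report line generated from raw numbers.
    print(generate_summary("the North East", 1_240_000, 1_105_000))
```

A production data-to-text system layers far more linguistic and statistical machinery on top of this, but the underlying flow, from data to narrative, is the same.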

Despite criticisms, DeWalt is betting on AI, and NLG specifically, as a world-changing tool. “Someday, I'm going to just talk to my machines and they're going to talk back to me and give me the information I'm seeking. I think NLG is a revolutionary technology that's going to change the world and how we interact with systems, how we get information from our data.”

Automation can be a runaway train

In the case of large-scale enterprises like PricewaterhouseCoopers (PwC), it’s not just about what AI can do for their clients but how AI can help them scale their own business. Suneet Dua, products and technology chief growth officer at PwC US, says the process that led to the 7,000 automations now at work within PwC started in earnest four years ago, and accelerated due to COVID-19. He adds that the biggest hurdle in any business is getting executives to understand that these automations can reduce routine tasks and efficiently redeploy the workforce.

“I use this analogy where [the] automation train is going at like 100mph, human skills are going at like 10mph, and those two trains need to eventually converge. The problem that's holding back human skills is executives at the respective companies. They're not investing in human skills to upskill the future.”


While the most commonly cited indicator of AI’s value in automating the back office is the number of hours saved, Dua says the measure at its core should be an improved environment for employees. He adds that training and education around the use of AI should cover how automating simple tasks allows employees to focus on more fulfilling, skills-based work, even if the six-million-hour reduction in work due to automation is profit-positive.

“What happens then, is when you hire a tax person, or a finance person, or an HR person, you hire them for the top of the tech stack skills,” Dua says. “You don’t hire them to do rudimentary, mundane tasks.”

Augmenting, not eliminating, the human touch

Brian Green, director of technology ethics at the Markkula Center for Applied Ethics at Santa Clara University, says a major concern in implementing such projects centres on the data being fed into these systems, and in particular on the biases created by humans. “The main issues that are hitting right now have to do with bias and fairness, and whether the AI is just automating human biases and prejudices, which is certainly something that we've seen,” he explains.

Green adds that while the data may not seem biased initially, AI tools tend to locate proxy variables that can heavily skew the data towards areas that are traditionally affluent or otherwise homogenous. This is in addition to the human-based concerns, like what workers are supposed to do when their job ceases to exist. “The goal of that kind of software's pretty much to eliminate people. And so we really are faced with the question of how we approach these sorts of problems and try to fix them.”
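Green’s point about proxy variables can be illustrated with a small, fabricated sketch in Python. Nothing here comes from a real hiring or lending system; the postcodes, approval rates, and scoring rule are invented to show how a model that never sees a protected attribute can still inherit a historical bias through a correlated field.

```python
# Invented illustration of a "proxy variable": the sensitive attribute is never
# used directly, but a correlated field (postcode) lets a naive scoring rule
# reproduce the historical bias anyway. All data here is fabricated.

import random

random.seed(0)

# Fabricated historical decisions: applicants from postcode "A" (an affluent
# area in this toy world) were approved far more often than those from "B".
history = [("A", 1) if random.random() < 0.8 else ("A", 0) for _ in range(500)]
history += [("B", 1) if random.random() < 0.3 else ("B", 0) for _ in range(500)]

def approval_rate(postcode: str) -> float:
    """A 'model' trained only on past approval rates per postcode: no protected
    attribute in sight, yet the historical skew is baked in via the proxy."""
    outcomes = [label for code, label in history if code == postcode]
    return sum(outcomes) / len(outcomes)

for postcode in ("A", "B"):
    print(f"Postcode {postcode}: learned approval rate {approval_rate(postcode):.0%}")
```

Because the toy model simply replays the approval rates in its training data, applicants from the disadvantaged postcode remain disadvantaged even though no sensitive attribute was ever supplied, which is the pattern Green describes.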

While the prospect of removing humans from the workforce is a concern, industry figures tell IT Pro they’re looking to augment jobs rather than eliminate them. For Jerry Levine, chief evangelist and co-general counsel at ContractPodAi, a company working on automating legal contracts, removing the people is a non-starter.


“We’re an augmentation tool, not a replacement for the human being,” he explains. “It's not possible right now [to completely remove humans], I don't know if it'll ever really be possible, and, to me, I want to work with humans as a lawyer.”

He likens a lawyer’s work to that of an engineer or developer. “When I talk with a lot of engineers and technical [staff], and especially developers, I always point out that just as your job as a developer is to write code that tells computers how to operate, the job of a lawyer is almost to write code that tells human beings, organisations, entities, how to interact and how to operate in their relationship.”

Meanwhile, PwC’s AI cognition tool maintains a list of company holidays and can organise an out-of-office reply for employees who ask. The tool, Dua says, tells employees where their coworkers are sitting in the office on any given day and suggests possible times for lunch. Still, there are aspects of the business Dua has no interest in automating, namely ethical or disciplinary matters. Issues normally handled by HR teams, in particular, are highly delicate situations. “To me,” he says, “those are the ones that need to have some sort of high touch environment to solve for them.”

This article was first published on 07/06/2022 and has since been updated.

John Loeppky is a British-Canadian disabled freelance writer based in Regina, Saskatchewan. His work has appeared in the CBC, FiveThirtyEight, Defector, and a multitude of other outlets. John most often writes about disability, sport, media, technology, and art. His goal in life is to have an entertaining obituary to read.