Are algorithms overlooking the power of instinct?
Is there a point where a human being is better placed to make a decision than a computer?
If you're a regular customer of Amazon, then chances are that it knows you pretty well. Taking the lessons taught by the Tesco Clubcard (one of the most effective data-gathering operations of the last decade or two), Amazon's algorithms gather details of what you buy, what you look at, what you're tempted by and what you leave behind. Then it aggressively - yet quietly - targets you with offers and ideas, all in the noble quest of getting you to spend some extra money.
It's not alone. Netflix, too, likes to think it knows you: the depth of coding behind the service is such that it tailors very specific recommendations - and even categories - to what it thinks you like. It doesn't actually know what you like, of course, but it's damned if it's not going to make a decent guess.
Those guesses have their moments, of course. It's little revelation that algorithms tend to be rather successful, and they dominate much of the digital information we receive. Once written, they also tend to be a lot cheaper to maintain than human beings. Thus, your Facebook feed is determined by code reacting to what it thinks you're interested in (or, more to the point, who it thinks you're interested in). Advertising on sites such as this one often wraps around whatever information ad providers have gleaned. All around you in the digital world, a computer - often with little further direction from a human - is choosing what content is presented to you, and in what order. That spans ecommerce sites, right through to news and social media outlets.
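At its simplest, that kind of feed ranking boils down to scoring each piece of content against inferred interest signals and sorting by the score. The sketch below is purely illustrative - the signal names and weights are invented for this example, and bear no relation to any real platform's undisclosed ranking code:

```python
# Hypothetical sketch of feed ranking: score posts by inferred interest
# signals, then sort. All signals and weights here are invented for
# illustration; real platforms do not publish their ranking logic.

def score_post(post: dict) -> float:
    """Combine inferred interest signals (each 0..1) into one ranking score."""
    return (
        3.0 * post["affinity"]      # how often you interact with this author
        + 2.0 * post["engagement"]  # likes/comments the post has attracted
        + 1.0 * post["recency"]     # newer posts score higher
    )

posts = [
    {"id": "a", "affinity": 0.9, "engagement": 0.2, "recency": 0.5},
    {"id": "b", "affinity": 0.1, "engagement": 0.9, "recency": 0.9},
    {"id": "c", "affinity": 0.5, "engagement": 0.5, "recency": 0.1},
]

# Highest score first: this ordering is the "feed" the user sees
feed = sorted(posts, key=score_post, reverse=True)
print([p["id"] for p in feed])  # ['a', 'b', 'c']
```

The point of the sketch is that once those weights are fixed, no human looks at the output again - which is precisely the trade-off the rest of this piece is about.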
Enter the human
Yet in any business boardroom, a good number of the people who have got to the top, or done something leftfield, have taken a risk. They've often used their instinct and gone where the data hasn't told them to go. There's an old ice hockey adage about skating to where the puck is going rather than where it has been, and human beings tend to be far more effective than computers at doing that.
The buzzword of the moment, in business circles, remains 'disruption', and disruptive companies - Uber, Apple and Airbnb are all good examples - have not been afraid to keep human thinking at the heart of their ideas. It's not quite the back-of-a-beer-mat business plan, but there's something inherently simple - a core idea - at the heart of much of their success. Why not rent out your spare room to holidaymakers? Why not order a cab on your smartphone? Why do you need to buy music physically - wouldn't it be easier just to download it?
The pros and cons
In the case of Apple, Steve Jobs was obviously the core driving force behind realising many of the company's ideas (albeit not necessarily conceiving them). In Walter Isaacson's exhaustive biography of the late Apple co-founder (the book that in turn inspired the Steve Jobs film, starring Michael Fassbender), one particular anecdote concerns the early prototype design for the Apple Store. When an employee pointed out to Jobs that Apple had got it wrong, and that the store needed to be arranged around what people do rather than around products, Isaacson describes Jobs as immediately snappy and angry. Within an hour, though, the whole plan for the Apple Stores had been turned around, to the model we see today. No computer came up with that.
Conversely, data has significant uses. Harvard Business Review argued back in 2014 that working through data was likely to get you a better employee than recruiting on instinct. It argued that human beings are "very good at specifying what's needed for a position and eliciting information from candidates", yet "they're very bad at weighting results". Harvard's conclusion was that "a simple equation outperforms human decisions by at least 25%", and this held across entry-level positions through to management candidates.
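The "simple equation" HBR refers to is essentially a fixed weighted sum: humans gather the candidate information, but a formula - not gut feel - combines it. The sketch below illustrates the idea only; the weights and score categories are invented for this example, not taken from the study:

```python
# Hypothetical illustration of the "simple equation" approach to hiring:
# a fixed weighted sum of candidate scores replaces ad-hoc human weighting.
# Weights and categories are invented for illustration, not from HBR's study.

def candidate_score(scores: dict, weights: dict) -> float:
    """Weighted sum of assessment scores, each on a 0-10 scale."""
    return sum(weights[k] * scores[k] for k in weights)

# Fixed weights, agreed in advance and applied identically to everyone
weights = {"experience": 0.3, "skills_test": 0.5, "interview": 0.2}

alice = {"experience": 7, "skills_test": 9, "interview": 5}
bob   = {"experience": 9, "skills_test": 5, "interview": 9}

print(candidate_score(alice, weights))  # 7.6
print(candidate_score(bob, weights))    # 7.0
```

The interesting part is what the formula removes: an interviewer impressed by Bob's charm (his strong interview score) can no longer let that outweigh Alice's stronger skills test, because the weighting was fixed before either candidate walked in.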
Meet in the middle?
What this points to, of course, is a fairly logical conclusion: that there's a balance to be struck. The availability of data is a good thing, and the right human approach is to weigh up what's there properly and make an assessment based on it.
That requires time, though, and time is a commodity the digital world never has enough of. Facebook, after all, came in for intense criticism over its decision to remove the human being from the process that chooses what news stories appear on its front page. Instead, it's an algorithm-driven process, one that Facebook doesn't explain. Inevitably, some odd and inappropriate story choices have slipped through the net. Facebook, though, would likely argue that it's won more than it's lost.
Have we, though? Algorithms appeal to organisations, not least those that continually cut costs, and most end users arguably barely spot the difference. But surely, if everyone relies on an algorithm, the end result is a homogeneous hodge-podge of the familiar? Plus, algorithms give the human beings in the process something to blame, and that shouldn't be overlooked.
Instinct and genuine risk-taking, however, are often the difference between middle-of-the-road and success. And barely a day goes by without something contextually chosen by a computer starkly contrasting with the content it sits next to. The moral of the story is that companies cut out the human at their peril; clearly, it's finding the balance that's key.