The other side of AI


Think of AI and you'll almost invariably be thinking of a cloud-based compute system, where queries are captured at source and sent to the cloud for processing, after which the result is sent back to the source. But there's a different side to AI that's starting to gain more exposure: localised, embedded AI.

There's been a lot of talk over the past year about neural processors, especially in the smartphone arena, and we should only expect the buzz around these embedded AI platforms to increase. So what are we expecting these embedded AI systems to do, and how will they differ from the far more powerful cloud-based systems?

We sat down with David Harold, VP of marketing and communications at Imagination Technologies, at CES to talk embedded AI and the impact it will have over the coming years. Imagination Technologies recently announced a new automotive technology system, PowerVR Automotive, and it's safe to say embedded AI will be finding its way into more and more cars.

With countless automotive and technology companies all racing towards the autonomous driving ideal, the interim step of Level 3 autonomous driving - where an onboard system can take complete control of the vehicle, while ensuring the driver remains vigilant and engaged - is high on the agenda for 2019. But cars move quickly, and situations can change even more quickly, so the compute required for instantaneous decision making can't be based in the cloud, where a drop in connectivity could spell disaster.

That's why the automotive sector is one area that will undoubtedly see extensive embedded AI development over the coming years - if a car is going to be making potentially life-or-death decisions, you need that compute power in the car. As Harold pointed out, the localised self-awareness of an autonomous vehicle can't rely on connectivity; the system inside the vehicle must process all the local sensor data and make those decisions itself. It's something that Danny Shapiro - senior director of automotive at Nvidia - echoed when we spoke to him, too. While there has been a lot of talk about how 5G will help enable the autonomous car revolution, it won't change the fact that the critical compute systems need to be embedded in the car.

Of course, there's already a huge amount of technology in cars before you even start to think about autonomous driving, and none of it is going away. Systems like music, navigation, communication, even passenger video playback will need to be controlled by that same in-car technology platform, so you don't want compute power that the autonomous driving system needs being soaked up by your Spotify stream or the movie your kids are watching in the back.

The answer is virtualisation. Just as with enterprise-level hardware, an in-car embedded system can use virtualisation to sandbox each application, ensuring that one doesn't affect another while also keeping critical systems completely protected from any malicious code or apps that end users may introduce. There needs to be a level of prioritisation, too, ensuring that those critical systems always have enough compute power and resources to operate at optimal levels. Essentially, the AI or neural systems that we'll see embedded in cars will be just as complex and versatile as anything in a data centre, but obviously far more compact.
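To make that prioritisation idea concrete, here's a minimal sketch of how a hypervisor-style scheduler might reserve guaranteed compute shares for safety-critical domains while infotainment splits whatever is left over. The domain names, shares and allocate() function are illustrative assumptions, not any vendor's actual API.

```python
from dataclasses import dataclass

@dataclass
class Domain:
    name: str
    critical: bool           # safety-critical domains are never starved
    guaranteed_share: float  # fraction of GPU/NPU time reserved for them

def allocate(domains: list[Domain]) -> dict[str, float]:
    """Reserve shares for critical domains first; split the rest evenly."""
    reserved = sum(d.guaranteed_share for d in domains if d.critical)
    if reserved > 1.0:
        raise ValueError("critical domains over-subscribe the hardware")
    shares = {d.name: d.guaranteed_share for d in domains if d.critical}
    others = [d for d in domains if not d.critical]
    for d in others:
        shares[d.name] = (1.0 - reserved) / len(others)
    return shares

if __name__ == "__main__":
    print(allocate([
        Domain("adas_perception", critical=True, guaranteed_share=0.6),
        Domain("navigation", critical=False, guaranteed_share=0.0),
        Domain("rear_seat_video", critical=False, guaranteed_share=0.0),
    ]))
    # {'adas_perception': 0.6, 'navigation': 0.2, 'rear_seat_video': 0.2}
```

However the movie in the back seat behaves, the perception domain keeps its reserved share - which is exactly the isolation the virtualised design is meant to deliver.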

But it's not just automotive where embedded AI will have a significant impact; another area where embedded and cloud-based AI will work in unison is security and surveillance. CCTV cameras coupled with facial recognition systems have become an integral part of local, national and international security efforts. It's a prime example of how physical technology - the camera - works with integrated technology - an embedded AI processor - and cloud-based resources - security services databases - to achieve a single goal: identifying potential threats.

That process could be accelerated if that facial recognition was done locally, but you obviously can't embed a huge database locally. However, Harold suggested that you wouldn't necessarily have to.

If you were searching for specific people, and your intelligence told you that there's a high probability of them being in a specific area, you could have their details stored within the camera. The embedded system would basically ignore anyone who doesn't match the four or five faces it has stored locally, making the process much faster and giving security forces a better chance of acting upon the intel.
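In code terms, the on-camera watchlist Harold described could look something like the sketch below: the camera holds embeddings for a handful of target faces and discards everything that doesn't match. The embedding source, the WATCHLIST structure and the 0.7 threshold are all assumptions for illustration, standing in for whatever face-recognition model the device actually runs.

```python
import numpy as np

# Tiny on-device database: name -> stored face embedding for the
# four or five people intelligence suggests may be in the area.
WATCHLIST: dict[str, np.ndarray] = {}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def check_face(face_embedding: np.ndarray, threshold: float = 0.7):
    """Return a watchlist name if the face matches, else None (ignore)."""
    for name, stored in WATCHLIST.items():
        if cosine_similarity(face_embedding, stored) >= threshold:
            return name   # flag for security forces to act on
    return None           # not on the local watchlist: discard immediately
```

Because the comparison set is tiny, every face in every frame can be checked in microseconds on the camera itself, with no round trip to a remote database.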

This scenario highlights a third level of AI that's likely to gain momentum: AI embedded at the edge. The concept of edge computing has been around for a while, and again it's all about improving performance and latency by bringing compute closer to the physical location of the user. By placing edge servers at the cell tower, you could deliver more power than the embedded AI system offers, along with more storage for larger databases, while still providing significantly improved latency and performance compared with bouncing all the way to a cloud data centre.
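A simple way to picture the resulting hierarchy is a tiered lookup: try the on-device system first, fall back to the edge server at the tower, and only go to the cloud when neither can answer. The caches and return values below are illustrative stand-ins, not real services.

```python
LOCAL_CACHE = {"face:42": "match"}                     # tiny on-device database
EDGE_CACHE = {"face:42": "match", "face:7": "match"}   # larger, at the cell tower

def query_cloud(key: str) -> str:
    return "resolved by data centre"   # full database, highest latency

def answer(key: str) -> str:
    if key in LOCAL_CACHE:             # microseconds, no network hop
        return LOCAL_CACHE[key]
    if key in EDGE_CACHE:              # one hop to the cell tower
        return EDGE_CACHE[key]
    return query_cloud(key)            # full round trip to the cloud
```

Each tier trades capability for latency, which is precisely why the three levels complement rather than replace one another.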

There's absolutely no doubt that AI is going to have an impact on everything we do, from personalising our consumer experiences, through searching for and predicting threats, to analysing market and geopolitical data to make effective business decisions, but it won't be a 'one size fits all' solution. AI will be a hybrid, marrying localised embedded technology, edge-based compute and enterprise-level cloud platforms, all the while delivering a seamless experience to users. In fact, that's already starting to happen today.