Analysis

Can Google Stadia finally bring success to cloud gaming?

It’s the first attempt to get game streaming right in years, but ISPs could be its undoing

Google Stadia handset

It's not usually part of our remit, but despite being a gaming-geared announcement, there's something about the new Google Stadia that interests us: specifically, its infrastructure, which raises questions we can't quite answer yet.

Codenamed Project Stream before its launch, Google's game streaming service is far from the first to grace the market, but it might be the most well-timed attempt of them all. When it launches tomorrow, it will aim to bring console-quality games directly to virtually any screen with no need for a physical console.

Games will be controlled by a new Stadia controller which connects directly to the platform via Wi-Fi. Not the screen, not the Chromecast – directly to the game client residing in the cloud.

It's important to note this isn't the first time cloud streaming of this kind has been attempted. OnLive and PlayStation Now, to name but two, promised a great deal but delivered very little – and the former crashed and burned entirely.


OnLive ran into money trouble: the cost of running its infrastructure far exceeded its income. The service reportedly cost millions of dollars a month to operate, yet at launch, and for several weeks after, the company took in only single-digit daily revenue because of its "try before you buy" policy on games.

PlayStation Now, meanwhile, is still alive and kicking, but the premium service hasn't been adopted nearly as widely as first expected, probably due to a combination of its reportedly heavy input lag and the poor variety of supported games.

With that in mind, Stadia must overcome some significant challenges to breathe new life into the concept. With edge computing, it theoretically has an advantage over OnLive, and, being Google, it has already secured a decent selection of launch titles to help ensure day-one success.

What caught our eye is the cloud and edge computing aspects of the service's infrastructure and streaming strategy. When OnLive attempted cloud gaming, the supporting infrastructure of the day was unfit for purpose, and that was the main obstacle to building a system that actually worked. Its streaming speed and quality were passable as a proof of concept and just about playable with some games, but launching eight years after OnLive's failure, Stadia must do much better.

So many questions

Google has said it will be using a combination of its highly advanced data centres and edge infrastructure to deliver gaming at low latencies, something that, especially in the online multiplayer space, is of vital importance.

Phil Harrison, VP and GM at Google, leading the Stadia project, said that the measurable latency issues seen in Project Stream are "solved and mitigated".

"There are some investments in the datacentre that will create a much higher experience for more people, and there are some fundamental advances in compression algorithms," he told Eurogamer.

"One important thing to keep in mind is that we are building our infrastructure at the edge. It's not just in our central, massive datacentres. We're building infrastructure as close to the end user, the gamer, as possible - so that helps mitigate some of the historical challenges".

The interaction between the datacentre and the edge remains unclear – specifically, to what extent each will handle the processing and transmission of game data. What's been said so far seems somewhat confusing and at times contradictory: Harrison, for example, spoke of microsecond ping times for gamers but 20ms edge-to-datacentre speeds.
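To put those figures in context, here's a rough back-of-envelope latency budget for a cloud-streamed game. The 20ms edge-to-datacentre figure is the one Harrison quoted; every other number is our own illustrative assumption, not anything Google has published:

```python
# Back-of-envelope input-to-photon latency budget for cloud gaming.
# Only the edge-to-datacentre figure comes from Harrison's comments;
# the rest are illustrative assumptions.

FRAME_RATE = 60
frame_time_ms = 1000 / FRAME_RATE  # ~16.7 ms per rendered frame at 60fps

budget_ms = {
    "controller/Wi-Fi to edge": 5,        # assumed last-mile hop
    "edge to datacentre": 20,             # figure quoted to Eurogamer
    "render + encode (one frame)": frame_time_ms,
    "datacentre back to edge": 20,
    "edge to screen + decode": 10,        # assumed
}

total_ms = sum(budget_ms.values())
print(f"frame time at {FRAME_RATE}fps: {frame_time_ms:.1f} ms")
for stage, ms in budget_ms.items():
    print(f"  {stage}: {ms:.1f} ms")
print(f"estimated input-to-photon: {total_ms:.1f} ms")
```

Even on these optimistic assumptions the round trip alone eats several frames' worth of time, which is why the split of work between edge and datacentre matters so much.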

Google's huge number of datacentres, according to Harrison, will be pivotal in delivering the Stadia experience the tech giant has imagined; he said they offer the theoretically unlimited compute capacity a cloud-based streaming service needs to thrive. Game development has historically been constrained by hardware, and by players' reluctance to upgrade for years at a time, or until the next console cycle begins. In the datacentre, CPU and GPU resources can scale to whatever a game demands.


Chris Gardner, senior analyst at Forrester, is optimistic about the capability of Google's infrastructure. "The network configuration is somewhat of a mystery, but clearly Google nailed this because benchmarks have shown perceived input latency to be extremely fast," he tells Cloud Pro. "Google has experience with network optimisation (all the way down to designing its own protocols) so the performance is not a stretch."

Take the specified hardware announced by Google and put it into one of Google's many datacentres and you arguably have a recipe for success, he adds.

Trouble in the network

However, the promises around network speeds proved a point of contention for us. Harrison told the BBC that 4K gaming can be achieved on download speeds of 25Mbps; for reference, the average UK household gets just 18.5Mbps from its internet connection, and far less in rural areas. The VP said Google expects network speeds to improve, but that was an expectation, not a promise.
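A quick sanity check on those figures, using the numbers quoted above plus the simplifying assumption that the stream runs at its full rate for the whole session:

```python
# What a sustained 25Mbps 4K stream implies, using the bandwidth
# figures quoted in the article. Assumes the stream runs flat-out,
# which is a simplification.

STADIA_4K_MBPS = 25      # Harrison's quoted requirement for 4K
UK_AVG_MBPS = 18.5       # average UK household download speed

shortfall = UK_AVG_MBPS - STADIA_4K_MBPS
print(f"headroom at average UK speed: {shortfall:+.1f} Mbps")  # negative = shortfall

# Data consumed by one hour of play at the full 4K rate:
# Mbit/s * seconds -> Mbit, / 8 -> MB, / 1000 -> GB
gb_per_hour = STADIA_4K_MBPS * 3600 / 8 / 1000
print(f"data used per hour at 4K: {gb_per_hour:.2f} GB")
```

The average UK connection falls 6.5Mbps short of the 4K requirement, and an hour of play at the full rate would chew through roughly 11GB – a problem for anyone on a capped plan even before throttling enters the picture.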

Although Google seems confident that its back-end equipment is up to the task, it's likely to face the problem of internet service provider (ISP) throttling – world-class servers or not. Harrison has confirmed that Google already has relationships across the wider industry, but the company could run into the same problems Netflix faced during its expansion, when ISPs throttled its traffic and the streaming firm ended up paying them directly to keep speeds up.

It's a very real possibility that ISPs will throttle bandwidth as Stadia's popularity grows and network demands increase. "[Netflix] had to negotiate with the major players to ensure the customer experience wasn't dreadful," says Gardner. "I expect the same experience for game streaming providers, although much more so because now it's not just a bandwidth negotiation, it's latency as well".

On the topic of latency, Gardner cites this as his biggest concern about the whole project. "What I expect to see is streaming to be initially successful with casual games, platformers and roleplaying games," he says. "However, multiplayer games demand low latency and low input lag to stay competitive and enjoyable. This is my biggest concern," he adds. "Shooters, MOBAs and other types of super competitive games - I honestly don't expect gamers to tolerate the latency."

Competition is just around the corner

There are only three companies in the world right now positioned well enough to feasibly deliver a cloud-based product like this: Google is one; AWS and Microsoft are the others. And we wouldn't expect any of them to pump so much time and money into something the world isn't ready for yet.

Google's main competitor in this area is Microsoft, which is working on Project xCloud, its own game streaming service, currently in beta. The company behind the Xbox is certainly lagging behind Google, as its product is still in development, but it arguably has the best case for making game streaming work. Reports from those selected to test the xCloud beta also seem unanimously positive.

Couple Microsoft's prowess in the cloud market with its nearly two decades in the console sector and you have impressive backing for what could be a better product than Google's. It's possible Microsoft will let Google launch Stadia, learn from its inevitable mistakes, and then blow it out of the water with a far superior service.

Regardless of how all of this plays out, it's difficult to get excited about something that has failed in so many previous attempts. With so little information about the project disclosed – the kind we need to make educated guesses about its viability as a service – we can't help but look on with scepticism.


Main image credit: Marco Verch
