CNI: employees, not hackers, are the real risk


There's been no shortage of security advice for businesses recently. Earlier this month, GCHQ and the Department for Business, Innovation and Skills issued guidance for corporations on how to protect themselves from cybercrime.

And this week, companies running parts of the critical national infrastructure, or CNI, have come in for scrutiny, as has the behaviour of their staff, in a new guidance document produced by the Centre for the Protection of National Infrastructure (CPNI) and PA Consulting, a management consulting firm.

Governments around the world have started to worry that critical national infrastructure, including power, water and transport, has become a target for both cybercriminals and for foreign governments keen to disrupt a potential adversary, or an economic rival.

The idea that a government, or groups associated with one, might attack another nation's CNI has led organisations such as NATO to put cyber warfare higher up their agendas.

Security groups have also started to discuss the idea of a Geneva Convention for cybersecurity: governments might agree, for example, that hospitals should never be attacked with a virus or a distributed denial of service attack.

But the definition of national infrastructure is broadening as governments and security agencies realise how much different parts of the economy depend on each other.

Electricity or water might be obvious critical infrastructure, as are transport and healthcare. Equally, though, fuel is critical and so are fuel deliveries; we all need to eat, but without banks or even cash machines, people cannot buy food. The result is that more companies' systems are critical, on a national level, than their IT managers might initially think.

According to Bill Windle, one of the report's co-authors and a security specialist at PA Consulting, this is illustrated by studies in the US suggesting that big cities would start to lose vital services just a day and a half after a power outage, as equipment for pumping water and sewage stops working.

A cyber attack, though, is not the only way critical infrastructure might fail. Sometimes, as Windle points out, problems are caused not so much by bad people as by good people cutting corners or making honest mistakes. There is also the danger, he says, that some employees will engage in "counterproductive behaviour" if they think no-one is watching.

The result is a document called HoMER, for holistic management of employee risk. The guidance spans everything from the accidental or foolish, such as sharing passwords, to the deliberate, such as fraud, theft, or installing malware on an employer's systems. But the guidance is not just about controlling employees' actions: it also encourages staff to think more about information security and to challenge any behaviour they spot that could be unsafe.

"In IT security, people are always the weak link," says Windle. "If you look at Stuxnet, that was an advanced technical attack, but it was also designed to spread via USB. There will always be attempts to exploit social engineering or human actors."

It may be one more document to add to the reading list, but as the guidance suggests, security is as much about creating trust between employer and employee as it is about building ever higher walls.

Stephen Pritchard is a contributing editor at IT Pro