Crossing the cloud chasm


It’s a funny old time in the history of cloud computing, isn’t it?

On the one hand we have any number of research institutes and analyst houses issuing surveys to report that 90-something percent of companies will adopt cloud in the next eighteen (or whatever) months. Yet this week also saw Rackspace issue its Cloud Reality Check report, which "evidenced" the fact that the majority (59 percent in this case) of UK firms surveyed are still chained to physical servers.

One would have thought, then, that the majority of our efforts should be focused on addressing the security, reliability and ROI concerns that are reportedly still leading to misgivings and reservations among firms thinking of moving to hosted environments.

Yet a good proportion of the cloud computing industry is, of course, already well advanced in implementing virtualised applications and data storage.

It’s almost like a cloud computing chasm.

While many firms have yet to cross the chasm, get inside the tornado and reach escape velocity, as Geoffrey Moore would put it, other firms are dealing with real application and data management issues, using cloud-based tools for tasks ranging from testing and integration to debugging, “transformation” and, following the current trend, mobile services.

These are application tools for the cloud that work in the cloud.

It seems like a far cry from, “Oh, I’m not sure, what happens if we upload our stock management system to the cloud data centre again?”

Over on the post-implementation side of the chasm we find workflow monitoring tools that analyse real-time data flows. We can look at application and data integration issues and tune system performance to eliminate bottlenecks based on a view of performance metrics.
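To make that concrete, here is a minimal sketch in Python of the general idea, purely illustrative and not tied to any particular product: time each stage of a simple workflow and compare per-stage throughput, so that the slowest stage, the bottleneck, stands out.

import time
from collections import defaultdict

stage_times = defaultdict(float)
stage_records = defaultdict(int)

def timed_stage(name, func, records):
    # Run one workflow stage over a batch of records and log how long it took.
    start = time.perf_counter()
    result = [func(r) for r in records]
    stage_times[name] += time.perf_counter() - start
    stage_records[name] += len(records)
    return result

# Hypothetical three-stage workflow: parse -> enrich -> filter.
raw = [{"id": i, "value": i * 2} for i in range(10000)]
parsed = timed_stage("parse", lambda r: dict(r), raw)
enriched = timed_stage("enrich", lambda r: {**r, "big": r["value"] > 100}, parsed)
kept = timed_stage("filter", lambda r: r if r["big"] else None, enriched)

# Report per-stage timings and throughput; the slowest stage is the bottleneck.
for name in stage_times:
    rate = stage_records[name] / stage_times[name]
    print(f"{name}: {stage_times[name]:.4f}s total, roughly {rate:,.0f} records/s")

Real monitoring tools do this continuously and across distributed systems, of course, but the principle is the same: measure where the time goes before you start tuning.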

In this part of the cloud zone where engineers are actually getting their hands dirty with virtual grease, we can envisage an engine room typified by legacy enterprise service bus (ESB) models, bespoke applications, on-premise packaged apps, cloud services and perhaps Hadoop.

Put simply, those that do get to the cloud often find that data is everywhere. This is the view of Ash Jhaveri, VP of product management at cloud integration company SnapLogic.

"SnapLogic’s customers understand that to remain competitive their applications need to share data. The challenge created by myriad apps sharing data in diverse formats is gaining an understanding of where data is flowing in real-time within the system. With an understanding of data flow, customers can quickly identify and remove bottlenecks," he said.

The latest release of the SnapLogic cloud integration platform adds data flow analysis covering CPU utilisation, wait times and data throughput, along with enhanced debugging “in the cloud”, i.e. users can follow data as it flows through an integration workflow. “Individual records of data are now viewable before and after they pass through filters and joins, allowing immediate debugging,” said the company in a press statement.
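For readers wondering what “viewable before and after filters and joins” means in practice, here is a toy Python sketch of the concept. It does not reflect SnapLogic’s actual API in any way; it simply captures each record before and after a filter step and a join step so a problem record can be inspected at the exact point it changed.

def traced(step_name, func, records, trace):
    # Apply one step to each record, keeping a before/after snapshot for debugging.
    out = []
    for record in records:
        result = func(record)
        trace.append({"step": step_name, "before": record, "after": result})
        if result is not None:
            out.append(result)
    return out

trace = []
orders = [{"order_id": 1, "customer": "A", "total": 250},
          {"order_id": 2, "customer": "B", "total": 40}]
customers = {"A": {"region": "UK"}, "B": {"region": "DE"}}

# Filter: keep only orders above a threshold, then join each order to its customer.
large = traced("filter_total_over_100", lambda o: o if o["total"] > 100 else None, orders, trace)
joined = traced("join_customer", lambda o: {**o, **customers[o["customer"]]}, large, trace)

for entry in trace:
    print(entry["step"], "|", entry["before"], "->", entry["after"])

When a record goes missing or arrives mangled at the far end of a workflow, this kind of step-by-step record trace is what turns guesswork into immediate debugging.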

So are most of us driving lumbering family saloons up to the edge of the cloud chasm? Are the real cloud adopters driving Lamborghinis and Ferraris deep into cloud territory with sophisticated engineers on hand for pit stop checks every mile?

Either way, please fasten your seatbelt.