The Internet of Things (IoT) is too big for traditional data processing solutions. Edge analytics may be the key to exploiting the oceans of data it generates. Identifying critical data sets and processing them close to source will be critical to the success of new services, and for reasons of speed and compliance much of this analysis is likely to take place in edge data centers.

This blogpost was written by Patrick van der Wilt, Commercial Director, EvoSwitch

Here Comes the Flood

According to Gartner, the IoT will comprise 26 billion devices by 2020. Organisations in virtually every industry are using these devices to drive higher levels of efficiency, reduce costs, generate new revenue, and understand customers at a more granular level. However, not all of these organisations are prepared for the deluge of data the devices will bring. The huge volume of data streaming from the IoT could easily saturate data center networks, storage and processing capacity.

Enter edge analytics. An increasingly popular way of addressing these challenges is to put automated, intelligent analytics at the edge – near where the data is generated – to reduce data volumes and network overhead. Data that falls within normal parameters would be ignored or routed to lower-cost storage for archival and regulatory reasons, while data that falls outside the norm could trigger an alert and be sent to a primary data platform for further analysis.
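The routing logic described above can be sketched in a few lines. This is a minimal illustration, not a real product: the threshold values, record shapes and destination names ("cold_storage", "primary_platform") are all hypothetical.

```python
# Minimal sketch of edge-analytics filtering: readings within normal
# parameters go to cheap archival storage; anomalies are forwarded
# upstream for further analysis. All names and thresholds are illustrative.

NORMAL_RANGE = (10.0, 80.0)  # hypothetical acceptable sensor range

def route_reading(value):
    """Decide where a single sensor reading should be sent."""
    low, high = NORMAL_RANGE
    if low <= value <= high:
        # Within normal parameters: archive cheaply, don't forward.
        return "cold_storage"
    # Outside the norm: forward to the primary data platform.
    return "primary_platform"

readings = [22.5, 95.1, 45.0, 7.3]
routes = [route_reading(v) for v in readings]
# Only the anomalous readings travel upstream, shrinking network traffic.
anomalies = [v for v, r in zip(readings, routes) if r == "primary_platform"]
```

In this toy run only two of the four readings would leave the edge site, which is the whole point: the bulk of "normal" telemetry never consumes upstream bandwidth or primary-platform capacity.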

Islands of Data

Some of this compute, storage and analysis could take place in the cloud – for instance via specialist machine-data analytics firms such as Splunk or Sumo Logic. But for many organisations the data will need to be processed closer to source (and faster) in specialist edge data centers such as EvoSwitch facilities. This applies in particular where companies want to avoid the compliance headaches of manipulating private customer data in the cloud.

The Prize on the Horizon

Enterprises that do not yet have a clear Big Data strategy need to get a move on. Recent research indicates that the majority of Fortune 1000 firms now have at least one big data project in production – twice as many as in 2013 – and over half are creating new senior data-specific roles, in particular that of Chief Data Officer.

At the service provider end the prize is even bigger. Cisco claims that in what it calls the 'Internet of Everything' there is $4.6 trillion of 'value at stake'. Whether or not you would go that far, the prize for the winning data processing solution will be huge, and all the leading players are forging ahead with their offerings: SAP is evolving its HANA database solution; Cisco has bought Cologne-based edge-analytics specialist ParStream; Dell continues to work with Intel on its IoT Labs and IoT Gateway servers; and IBM and HP are both investing heavily.

The greatest prize will undoubtedly be in the interoperation of IoT networks: where one dataset meets another and they generate something new and valuable. Today there are plenty of networks of data, but they don't talk to each other. Edge analytics – whether it takes place in edge data centers or in the cloud – will be the key to realizing this value.

Further Reading

  • Information Week: Edge Analytics An Antidote To IoT Data Deluge
  • Information Week: Big Data Goes Mainstream: What Now?