Big data from the Internet of Things may create big challenges for data centres: report
- 18 March, 2014 14:58
Devices that use the Internet of Things (IoT) will generate big data that needs to be processed and analysed in real time, putting more pressure on data centre providers, according to a new Gartner report.
The Impact of the Internet of Things on Data Centres forecasts that 26 billion IoT units will be installed by 2020 and that IoT service suppliers will generate US$300 billion in revenue.
However, Gartner US distinguished analyst Joe Skorupa said this growth in IoT units will pose new challenges for data centre technology providers because of the volume and structure of IoT data.
“Existing data centre wide area networks [WAN] links are sized for the moderate-bandwidth requirements generated by human interactions with applications. IoT promises to dramatically change these patterns by transferring massive amounts of small message sensor data to the data centre for processing, dramatically increasing inbound data centre bandwidth requirements,” he said in a statement.
Another challenge is user privacy. According to Skorupa, there will be a “vast amount” of data from IoT devices providing information about what the user has accessed.
“The big data created as a result of devices will increase security complexity. If this [data] is not secured, it can give rise to breaches of privacy,” he said.
“The recent trend to centralise applications in order to reduce costs and increase security is incompatible with the IoT. Organisations will be forced to aggregate data in multiple distributed mini data centres where [data] processing can occur. Relevant data will then be forwarded to a central site for additional processing.”
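The distributed-aggregation pattern Skorupa describes can be sketched in code: a mini data centre summarises raw sensor messages locally and forwards only the relevant readings to the central site, sharply reducing inbound bandwidth there. The field names, threshold, and summary shape below are illustrative assumptions, not details from the Gartner report.

```python
from statistics import mean

def aggregate_at_edge(readings, threshold=75.0):
    """Summarise raw sensor readings at a mini data centre.

    Returns a compact local summary plus only the out-of-range
    readings ("relevant data") to forward to the central site.
    The threshold and record fields are hypothetical.
    """
    summary = {
        "count": len(readings),
        "mean_value": mean(r["value"] for r in readings),
    }
    forward = [r for r in readings if r["value"] > threshold]
    return summary, forward

# Three raw messages arrive at the edge; only one exceeds the threshold,
# so only one record travels on to the central site.
readings = [
    {"sensor": "s1", "value": 21.5},
    {"sensor": "s2", "value": 80.2},  # anomalous reading
    {"sensor": "s3", "value": 19.9},
]
summary, forward = aggregate_at_edge(readings)
print(summary["count"], len(forward))  # prints "3 1"
```

Under this sketch, the central site receives one record and a small summary instead of every raw message, which is the bandwidth trade-off the report highlights.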
To manage these problems, Gartner UK research director Fabrizio Biscotti suggested that data centre providers implement capacity management platforms.
These could include a data centre infrastructure management (DCIM) system which aligns IT and operational technology standards.
“This will provide the production facility to process IoT data points based on the priorities and business needs. Throughput models derived from statistical capacity management platforms or infrastructure capacity toolkits will include business applications and associated data streams,” he said.
“Those scenarios will impact design and architecture changes by moving towards virtualization, as well as cloud services. This will boost on-demand [data centre] capacity to deliver business continuity.”
Follow Hamish Barwick on Twitter: @HamishBarwick