IoT analytics guide: Understanding Internet of Things data
A comprehensive collection of articles, videos and more, hand-picked by our editors
What is the Internet of Things, exactly? It is an ambiguous term, but it is fast becoming a tangible technology that can be applied in data centers to collect information on just about anything that IT wants to control.
The Internet of Things (IoT) is essentially a system of machines or objects outfitted with data-collecting technologies so that those objects can communicate with one another. The machine-to-machine (M2M) data that is generated has a wide range of uses, but is commonly seen as a way to determine the health and status of things -- inanimate or living.
IT administrators can use the IoT for anything in their physical environment that they want information about. In fact, they already do.
In one case, IoT is being used to stymie deforestation in the Amazon rainforest. A Brazilian location-services company called Cargo Tracck places M2M sensors from security company Gemalto in trees in protected areas. When a tree is cut or moved, law enforcement receives a message with its GPS location, allowing authorities to track down the illegally removed tree.
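The alert in a deployment like this boils down to a small structured message carrying a sensor ID and coordinates. As a hypothetical sketch -- the field names and values below are invented for illustration, not Cargo Tracck's or Gemalto's actual format -- such a payload might look like:

```python
import json

def build_alert(sensor_id, lat, lon):
    """Build a hypothetical tamper-alert payload for a tree sensor.

    Field names are illustrative only; real M2M deployments define
    their own message formats.
    """
    return json.dumps({
        "sensor_id": sensor_id,
        "event": "movement_detected",
        "gps": {"lat": lat, "lon": lon},
    })

# An invented sensor ID and coordinates in the Amazon basin:
alert = build_alert("tree-0042", -3.4653, -62.2159)
```

The receiving side (law enforcement, in this example) only needs to parse the message and map the coordinates.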
One analyst explained the IoT using the iPhone as an analogy: disconnected third-party applications hosted in the cloud can be tied together, and users can access all sorts of data from the device, according to Sam Lucero, senior principal analyst for M2M and the Internet of Things at IHS Electronics & Media in Tempe, Ariz.
How the Internet of Things works
While some consider IoT to be M2M communication over a closed network, that model is really just an intranet of things, Lucero said.
With an intranet of things, apps are deployed for a specific purpose and don't interact outside of that network. In the true IoT, applications are still deployed for specific reasons, but the data collected from the machines and objects being monitored is made available to third-party applications. The expectation is that a true IoT will provide more value than what can be derived from secluded islands of information, Lucero said.
For the IoT to work in data centers, platforms from competing vendors need to be able to communicate with one another. This requires standard APIs that all vendors and equipment can plug into, for both the systems interfaces as well as various devices, said Mike Sapien, a principal analyst with Ovum.
IBM proposed in February that its IoT protocol, called Message Queuing Telemetry Transport (MQTT), be used as the open standard. This would help multiple vendors participate in the IoT.
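MQTT is a lightweight publish/subscribe protocol: devices publish messages to hierarchical topics, and subscribers register topic filters that may contain wildcards. The sketch below does not implement the wire protocol itself -- it only illustrates, in simplified form, the topic-matching rule MQTT brokers apply, where `+` matches exactly one topic level and `#` matches all remaining levels:

```python
def topic_matches(topic_filter, topic):
    """Simplified MQTT-style topic matching.

    '+' matches exactly one level; '#' matches any remaining levels
    and must appear as the last element of the filter. (Some spec
    edge cases, such as '#' also matching its parent level, are
    omitted here for brevity.)
    """
    f_parts = topic_filter.split("/")
    t_parts = topic.split("/")
    for i, fp in enumerate(f_parts):
        if fp == "#":
            return True          # multi-level wildcard: match the rest
        if i >= len(t_parts):
            return False         # topic ran out of levels
        if fp != "+" and fp != t_parts[i]:
            return False         # literal level mismatch
    return len(f_parts) == len(t_parts)
```

Under this model, a subscriber to a filter like `building1/+/temperature` would receive readings from every room-level sensor without knowing each device in advance -- which is why a shared, open standard matters for multi-vendor communication.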
“[System integrators] like HP, IBM and others are starting to open up their systems to be less restrictive, just as telecom operators are allowing different networks -- not just their own -- to be part of the IoT ecosystem,” Sapien said. “But this has taken many years to happen.”
Meanwhile, a number of platforms serve as the plumbing to connect systems from different vendors so that they can communicate and be managed. One such platform is Xively Cloud Services, which is LogMeIn Inc.’s public IoT Platform as a Service. It allows IT to design, prototype and put into production any Internet-connected device.
For example, companies that have to monitor energy use might use closed, vendor-specific systems. They can use something like Xively as a secondary system to monitor heating and cooling and control energy use across multiple locations.
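At the aggregation layer, a secondary monitoring system of this kind amounts to collecting readings from many sites and summarizing them. A minimal sketch, with invented site names and units (this is not Xively's API, just the general pattern):

```python
from collections import defaultdict

def total_energy_by_site(readings):
    """Sum energy readings (kWh) per site from an iterable of
    (site, kwh) tuples, as a multi-location monitor might."""
    totals = defaultdict(float)
    for site, kwh in readings:
        totals[site] += kwh
    return dict(totals)

# Hypothetical readings from two locations:
readings = [("boston", 12.5), ("austin", 9.0), ("boston", 3.5)]
totals = total_energy_by_site(readings)
```

The value of layering a platform over closed vendor systems is exactly this: one place to compare and control usage across locations that otherwise don't talk to each other.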
Over the long term, one consequence of the Internet of Things for the enterprise data center could be a large volume of incoming data that requires significant infrastructure upgrades, particularly for data processing and storage, Lucero said.