The evolution of data center storage architecture
The Internet of Things is more than just an explosion of sensors.
The future of the Internet of Things (IoT) can be inferred from how storage technologies are advancing. One thing is certain: Data generation and data flow will grow dramatically.
IoT data is considered unstructured, but in reality there are many streams of data, each of which has its own structure. Much of the data is stored, used once and then discarded or archived.
What complicates matters is that much of this data must be used in near real time. For example, when you shop, your location in the store can be detected, and the window for delivering a personalized advertisement is just one or two seconds.
Most IoT data is "digested" on entry to the data center -- face recognition turned into store location, for example. The raw data is kept for a while, depending on what it is, so it is streamed off to a disk farm. The output of digestion is a new, more valuable data stream. This, along with other streams, is sent to powerful analytics engines that use big data techniques to generate inferences.
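The digest-on-ingest flow described above can be sketched in a few lines. This is a minimal illustration, not a real ingest system; the event fields, queue names and `digest` logic are all assumptions made for the example.

```python
# Hypothetical sketch of an IoT "digestion" pipeline: raw events are
# reduced to a compact, higher-value stream for analytics, while the
# raw data is queued for archival on a disk farm.
from collections import deque

archive_queue = deque()    # raw data, kept for a while on bulk disk
analytics_queue = deque()  # digested stream, headed for analytics engines

def digest(raw_event):
    """Turn a raw sensor event into a compact, higher-value record
    (e.g., face-recognition output reduced to a store location)."""
    return {"shopper_id": raw_event["shopper_id"],
            "location": raw_event["aisle"],
            "ts": raw_event["ts"]}

def ingest(raw_event):
    archive_queue.append(raw_event)             # retain the raw data
    analytics_queue.append(digest(raw_event))   # forward the digested stream

ingest({"shopper_id": 42, "aisle": "7B", "ts": 1700000000, "frame": b"..."})
```

The point of the split is that the bulky raw stream and the small digested stream have very different storage and latency needs, so they take different paths through the data center.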
Big data engines are generally in-memory database engines with very large dynamic RAM, many cores and PCI Express solid-state drives (SSDs). Data streams directly into memory or stages in through the SSDs. Graphics processing units, which can dramatically speed parallel operations such as data searching, assist Hadoop and other distributed processing approaches. Used data stages out to SSD and then tiers to large bulk storage arrays.
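The memory-to-SSD-to-bulk staging described above is a classic tiering pattern. The sketch below is an assumption-laden toy model (the tier sizes and eviction rule are invented for illustration), not any vendor's actual policy:

```python
# Minimal sketch of hot/warm/cold tiering: new data lands in DRAM,
# the coldest blocks stage out to SSD, and from there tier down to
# bulk storage arrays. Capacities are tiny purely for demonstration.
MEMORY_LIMIT = 2  # blocks that fit in DRAM (illustrative)
SSD_LIMIT = 2     # blocks that fit on SSD (illustrative)

memory, ssd, bulk = [], [], []

def store(block):
    memory.insert(0, block)             # new data streams into memory
    while len(memory) > MEMORY_LIMIT:   # coldest data stages out to SSD
        ssd.insert(0, memory.pop())
    while len(ssd) > SSD_LIMIT:         # then tiers to bulk storage
        bulk.insert(0, ssd.pop())

for b in ["a", "b", "c", "d", "e"]:
    store(b)
# The most recently used blocks stay in memory; older ones tier down.
```

The same demotion logic scales from this toy to real systems: the fast tier holds the working set, and age (or access frequency) drives data toward cheaper, slower media.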
The data center will have clouds of ingestors and analytics engines. Bulk storage will be scale-out object stores, with powerful compression/deduplication and 10 TB-class drives. Network performance will be high, with 10 GbE de rigueur and a move to 25/50 GbE underway. With aggregate bandwidth needs as much as 10 times current levels, backbones will move to 100 GbE as that technology goes mainstream.
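The deduplication those object stores rely on is usually content-addressed: objects are split into chunks, and identical chunks are stored once and referenced by their hash. A minimal sketch, assuming a toy chunk size and invented function names:

```python
# Illustrative content-addressed deduplication, the core trick behind
# the compression/dedupe in scale-out object stores. Chunk size and
# the put/get API are assumptions for this example only.
import hashlib

chunk_store = {}  # hash -> chunk bytes; each unique chunk stored once

def put_object(data, chunk_size=4):
    """Split an object into chunks; store each unique chunk once and
    return the list of chunk references that reconstitutes it."""
    refs = []
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        chunk_store.setdefault(digest, chunk)  # dedupe identical chunks
        refs.append(digest)
    return refs

def get_object(refs):
    return b"".join(chunk_store[r] for r in refs)

refs = put_object(b"aaaaaaaabbbb")  # the repeated "aaaa" chunk dedupes
```

With repetitive IoT streams, many chunks recur, so the store holds far fewer bytes than the logical data it serves, which is why dedupe-heavy object stores pair well with 10-TB-class drives.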
In the future of IoT, outbound devices will proliferate, from advertising displays to smart price tickets and intelligent road signs.
About the author:
Jim O'Reilly is a consultant focused on storage and cloud computing. He previously held top positions at Germane Systems -- creating ruggedized servers and storage for the U.S. submarine fleet -- as well as SGI/Rackable and Verari, startups Scalant and CDS, and PC Brand, Metalithic, Memorex-Telex and NCR.