It was a different world in 1997, pre-Google. While many people were online, most had 28.8Kbps to 33.6Kbps dial-up internet connections, with some devil-may-cares riding the tempest with a new 56Kbps modem. There were just over a million websites in 1997, compared to roughly 1.2 billion in 2017.
You would think that with all that has happened in technology since the late 1990s, we would be at a kind of exhaustion point when it comes to innovation. Not so. Seventeen years into the new millennium, tech and innovation are showing no signs of slowing down; in fact, many would say (and provide statistical evidence to that effect) that the pace is intensifying. The Internet of Things (IoT) has arrived, and in short order has evolved into a pulsating hub that will eventually connect billions of people, places, and things.
Truly, data has become central to everything, and it is not overly dramatic to say it is the key to the future for most enterprises. Future-facing organizations know two things: first, that we’ve entered Era Z (global IP traffic is set to nearly triple between 2016 and 2021, from 1.2 zettabytes (ZB) to 3.3 ZB, with 63 per cent of that traffic coming from wireless and mobile devices); and second, that data center managers must come up with new strategies — and settle on new approaches and services — for handling the coming tidal wave of data or be submerged and swept away.
Taming the monster
The IoT involves billions of user devices (a number that grows daily) and enterprises, generating billions of packets of data. As devices learn more about users, they will spit out even more data. This is not a monster that is going to stop growing. The amount of data ping-ponging around an ever-expanding IoT is going to reach mind-boggling proportions, and suddenly, organizations from the smallest e-commerce websites to the largest multinationals face an enormous question: “How do we capture and effectively use the ‘stuff’ we pull out of the online ether without spending obscene amounts of capital?”
Narrowing it down
While it may once have been in vogue, the centralized model of capturing and processing all data in one place is wholly incompatible with the IoT. Companies today are faced with several data center options, including public cloud, private and hybrid cloud, and leasing space in a multi-tenant colocation data center (retail or wholesale). Most businesses, regardless of sector, will utilize some form of hybrid.
Doing the needful
While the cloud may be ideal for certain lower-performance, lower-customization SaaS applications, as well as systems that are used infrequently, organizations looking to scale their mission-critical infrastructure may find that the greatest cost benefits and best overall data security lie in leasing space in a colocation data center. That the colocation data center market is expected to generate over $50 billion by 2020, registering a CAGR of 12.4 per cent from 2015 to 2020, is compelling evidence that organizations around the world are recognizing en masse the challenges and requirements of the IoT.
To read the DFT white paper “From Cloud to Colo: The Current Canadian Data Centre Landscape,” visit the DFT website.
Unmatched data center experience
Operating 12 data centers in three major U.S. markets and Canada, totaling 3.5 million gross square feet and 302 megawatts of available critical load, DuPont Fabros Technology, Inc. (DFT) powers, cools, and protects the servers and IT assets of customers who outsource their mission- and business-critical applications. The combination of DFT’s robust data center designs and our superior operating methods, carried out by an exceptional team, provides an unmatched data center experience.
Are you interested in learning more about data centers? Contact DFT today.