Even data centers appear to be going through a digital transformation. The trend of centralizing data in fewer and fewer mega data centers is changing as companies look to a more distributed form of cloud computing, in which data is stored closer to the edge.
Thierry Coupaye, Head of Cloud Platforms Research at Orange Labs, recently published the slides from his presentation at the IEEE International Conference on Cloud Networking, “Towards the extinction of mega data centers”, in which he discussed the changing role of the data center. His thinking has implications for technology providers trying to stay in line with local data protection laws.
Two examples of those laws:
- In Russia, there’s a rush to build local data centers following a recent law change requiring that personal data on Russian citizens be stored on servers inside the country.
- In Europe, the Court of Justice of the EU recently invalidated the “Safe Harbor” agreement that allowed US companies to export European personal data to the US, because US law does not comply with European data protection law.
In both cases, big firms like Microsoft, Salesforce or Google are now looking to extend their data center footprint in Europe to accommodate these regulatory demands.
Coupaye’s vision offers a solution to this need: some data may be stored in centralized data centers, while other information (such as the customer data protected by EU and Russian law) may be stored elsewhere, including on devices at the network edge. He calls this fog computing, in which highly virtualized platforms provide compute, storage and networking services between end-user devices and traditional cloud data centers.
This model means that in some cases data never needs to leave the device, increasing availability, reducing bandwidth demands and improving customer privacy and security. This is a little like Apple’s model for user privacy, in which end-to-end encryption and on-device processing mean user data need not leave the device in readable form, while services are still made available.
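To make the idea concrete, here is a minimal sketch of how a jurisdiction-aware placement policy might decide between device, edge and central storage. It is purely illustrative and not taken from Coupaye’s presentation; the tier names, thresholds and rules are assumptions.

```python
# Illustrative sketch only: a jurisdiction-aware placement policy for a
# fog-style architecture. Tier names and rules are assumptions, not Orange Labs' design.
from dataclasses import dataclass


@dataclass
class DataItem:
    owner_jurisdiction: str   # e.g. "RU", "DE", "US"
    is_personal: bool         # subject to data protection law?
    size_mb: float


def choose_tier(item: DataItem, local_site_jurisdiction: str) -> str:
    """Return where to store the item: on the device, at an edge site, or centrally."""
    # Personal data stays in its owner's jurisdiction: keep it on the device
    # unless a local edge site in the same jurisdiction is available.
    if item.is_personal:
        if item.owner_jurisdiction == local_site_jurisdiction:
            return "edge-data-center"
        return "on-device"
    # Large, non-sensitive data goes to a central mega data center.
    if item.size_mb > 100:
        return "mega-data-center"
    # Small, non-sensitive data can sit at the edge to save bandwidth.
    return "edge-data-center"


if __name__ == "__main__":
    print(choose_tier(DataItem("RU", True, 2.0), "DE"))     # -> on-device
    print(choose_tier(DataItem("DE", True, 2.0), "DE"))     # -> edge-data-center
    print(choose_tier(DataItem("US", False, 500.0), "DE"))  # -> mega-data-center
```

The point of the sketch is simply that placement becomes a policy decision made per data item, rather than a single architectural choice made once for the whole estate.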
His presentation traces the evolution of cloud services.
First-generation services were proprietary and highly dependent on central data centers. Today, cloud services draw on multiple storage providers, use a range of virtualization technologies, are secure and resilient, and rely on both traditional data centers and edge-based, geo-distributed cloud computing services.
Fully virtualized server environments represent the direction of travel; according to a recent survey, around 20 percent of European businesses believe they already have fully software-defined data centers.
There are many good reasons to get behind this way of thinking. Constraints such as bandwidth, space, data allocation, security and even data nationality mean not everything needs, or is allowed, to be stored on central servers. That’s even before considering the data deluge as 20 billion objects become connected worldwide by 2020. In this environment, data has different priorities and needs.
He still sees huge importance in mega data centers, as you might expect given that these are expected to proliferate at a CAGR of 4.62 percent through 2016, but he also recognizes that not every piece of data needs to travel to a central repository.
In this context it makes sense for technology firms (and others) to store sensitive data at the network edge, in a user’s own jurisdiction. “Centralized public clouds are in fact generally distributed over multiple (mega) data centers for availability reasons,” says Coupaye in his presentation.
Traditional industry logic treated the problem as a binary choice between centralized and distributed storage, but in the new model both have a place. Networks are becoming more intelligent, which enables them to converge with a distributed vision of cloud computing. Virtual CDN, Cloud RAN and Mobile Edge Computing (MEC) are just three examples of the new cloud models this growing network intelligence enables.
The other key takeaway from Coupaye’s model is the inherent flexibility of the systems he describes. That makes a great deal of sense given the roller-coaster nature of digital transformation: all the analysis tells us we need to question everything, and that no business process is immune from the effects of this technological tidal wave. Agility and flexibility aren’t just a mantra for the mobile workforce; they should be a prayer across every aspect of the modern enterprise. Why should storage be exempt from this demand?
Orange Labs is developing its own approach, a kind of distributed cloud computing in which storage is provided by a range of actors, from traditional mega data centers to tiny (nano or pico) data repositories hosted on user devices and connected things. The model also includes storing data on the network itself.
Coupaye describes a ubiquitous cloud platform that leverages a continuum of data centers, from mega data centers through mini/micro data centers (at network points of presence) down to nano/pico data centers, with Orange acting as a geo-distributed cloud platform operator. This should enable new classes of M2M application, from social services and smart cities to connected cars, industrial IoT, gaming, health and more.
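One way to picture that continuum is as a set of tiers ordered by proximity to the user, with an application landing at the closest tier that can meet its needs. The sketch below is an illustration only, not Orange’s implementation; the latency and capacity figures are assumptions.

```python
# Illustrative only: the data-center continuum as tiers ordered by proximity.
# Tier names mirror the presentation; latency and capacity figures are assumptions.
from typing import NamedTuple, Optional


class Tier(NamedTuple):
    name: str
    typical_latency_ms: float   # assumed round-trip latency to this tier
    spare_capacity_gb: float    # assumed spare storage at this tier


CONTINUUM = [
    Tier("nano/pico (device, connected thing)", 1, 1),
    Tier("mini/micro (network PoP)", 10, 500),
    Tier("mega data center", 50, 1_000_000),
]


def pick_tier(required_latency_ms: float, required_gb: float) -> Optional[Tier]:
    """Return the closest tier that satisfies both latency and capacity needs."""
    for tier in CONTINUUM:  # ordered from closest to furthest
        if (tier.typical_latency_ms <= required_latency_ms
                and tier.spare_capacity_gb >= required_gb):
            return tier
    return None


# A connected-car application needing 15 ms responses and 2 GB of storage
# would land at the network PoP tier in this toy model.
print(pick_tier(15, 2))
```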
Read Thierry Coupaye’s presentation on future data centers and take a look inside some of the work at Orange Labs here.
Jon Evans is a highly experienced technology journalist and editor. He has been writing for a living since 1994. These days you might read his regular AppleHolic blog and opinion columns at Computerworld. Jon is also technology editor for men’s interest magazine Calibre Quarterly, and news editor for MacFormat magazine, the UK’s biggest Mac title. He’s really interested in the impact of technology on the creative spark at the heart of the human experience. In 2010 he won an American Society of Business Publication Editors (Azbee) Award for his work at Computerworld.