Enterprises are more distributed than ever; in fact, a better term today might be dispersed. Large enterprises operate across multiple sites, regions, and countries, while smaller ones are reorganizing in response to the pandemic and the changing scenarios in which they operate.
As a result, data is now created everywhere: on mobile devices, at remote industrial sites, in R&D in one country and manufacturing in another, and so on. More often than not, people and applications need to access this data from locations far from where it originated, and they need it quickly to make decisions in real time.
Flexibility and speed are everything, and traditional approaches don't work. Making copies of data makes no sense from either a cost or a security perspective. Concentrating data in a single location (on-premises? in the cloud?) is an option, but then you face both the cost and the difficulty of accessing data from remote, sometimes poorly connected locations.
Think Global, Act Local
Users want data to look local, even when they access it from the other side of the planet. And they want a seamless user experience, one that won't change their processes or how they operate their applications. At the same time, enterprises are moving more applications to the cloud, and they want data to be close to those applications as well. Here it is more a matter of efficiency, response time, and, of course, cost.
In the end, everybody wants global access with local performance. But security and compliance have emerged as primary concerns in every conversation, and they bring up the global/local dichotomy again: everyone needs globally consistent security on one hand and compliance with local regulations on the other.
Unfortunately, neither data consolidation nor remote access solves the problem on its own. Data gravity makes moving data complicated and slow, while remote access brings additional latency with it.
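To put that latency in perspective, here is a quick back-of-the-envelope sketch in Python (the round-trip times and operation count are illustrative assumptions, not measurements): a chatty file protocol that issues its operations sequentially turns a transoceanic round trip into seconds of waiting.

```python
# Back-of-the-envelope effect of WAN latency on a chatty file protocol.
# The RTTs and the operation count are illustrative assumptions.
LAN_RTT_S = 0.0005    # ~0.5 ms round trip on a local network
WAN_RTT_S = 0.080     # ~80 ms round trip across an ocean
SEQUENTIAL_OPS = 500  # e.g., metadata lookups to open a directory tree

for label, rtt in (("LAN", LAN_RTT_S), ("WAN", WAN_RTT_S)):
    total = SEQUENTIAL_OPS * rtt
    print(f"{label}: {SEQUENTIAL_OPS} ops x {rtt * 1000:.1f} ms RTT = {total:.2f} s")
# LAN: 500 ops x 0.5 ms RTT = 0.25 s
# WAN: 500 ops x 80.0 ms RTT = 40.00 s
```

The same workload that feels instantaneous on a LAN becomes unusable over the WAN, which is exactly why neither naive consolidation nor plain remote access is enough.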
Defying Data Gravity
You can’t defy data gravity, but with the right approach, you can deceive it. Last year I wrote a report about cloud file systems, which I plan to update with a focus on these aspects of data gravity and how a global data layer can be built so it is accessible from everywhere.
Approaches may differ both in terms of technology and scope. You can have solutions like Nasuni or Panzura that leverage object storage in the back end and cache appliances in the front end. This type of architecture is highly scalable, offers a better TCO than traditional solutions, and delivers performance aligned with local needs. Data is consolidated in an object storage back end, more often than not in the public cloud, and all the storage becomes OpEx-friendly. I touched on some of the advantages of object storage in one of my latest posts, here.
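To give an idea of how that front-end/back-end split works, here is a minimal read-through cache sketch. This is not Nasuni's or Panzura's actual code; the bucket name, the cache path, and the use of boto3 against an S3-compatible back end are assumptions of mine.

```python
# Minimal read-through cache over an S3-compatible object store.
# Illustrative only: bucket, paths, and client setup are assumptions.
import os
import boto3

CACHE_DIR = "/var/cache/edge"    # local cache on the edge appliance
BUCKET = "global-file-backend"   # hypothetical consolidated back end
s3 = boto3.client("s3")

def read_file(key: str) -> bytes:
    """Serve from the local cache; fetch from the object store on a miss."""
    local_path = os.path.join(CACHE_DIR, key.replace("/", "_"))
    if os.path.exists(local_path):                 # cache hit: local speed
        with open(local_path, "rb") as f:
            return f.read()
    obj = s3.get_object(Bucket=BUCKET, Key=key)    # cache miss: one WAN fetch
    data = obj["Body"].read()
    os.makedirs(CACHE_DIR, exist_ok=True)
    with open(local_path, "wb") as f:              # keep a copy for next time
        f.write(data)
    return data
```

Hot data is served at local speed from the cache, while the object store remains the single consolidated copy, which is where the TCO advantage comes from.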
You can learn about Nasuni’s approach from a presentation at the recent Storage Field Day 21 event.
One alternative approach, from Hammerspace, eschews physical consolidation in favor of a virtualization layer that makes storage assets accessible from anywhere. In this case, the impact on the existing infrastructure is limited, while additional features enable the creation of virtual views of the data. There's other cool stuff here that can really change your perspective on data management. You can check out the Hammerspace presentation at the Storage Field Day 21 event to learn how, for example, caching mechanisms allow users to solve performance issues in use cases such as big data analytics.
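To make the virtualization idea concrete, here is a tiny sketch of a metadata catalog that presents one global namespace over data that stays where it already lives. Again, all names and back ends are hypothetical illustrations of the general pattern, not Hammerspace's implementation.

```python
# Sketch of a global namespace that maps logical paths to physical locations.
# Back ends, paths, and names are hypothetical illustrations.
from dataclasses import dataclass

@dataclass
class Placement:
    backend: str  # e.g., "nfs://nas1.milan" or "s3://bucket-us-east"
    path: str     # where the data lives on that back end

class VirtualNamespace:
    def __init__(self) -> None:
        self._catalog: dict[str, Placement] = {}

    def register(self, logical_path: str, placement: Placement) -> None:
        """Expose an existing storage asset under the global namespace."""
        self._catalog[logical_path] = placement

    def resolve(self, logical_path: str) -> Placement:
        """Clients see one namespace; the data is never physically moved."""
        return self._catalog[logical_path]

ns = VirtualNamespace()
ns.register("/projects/report.docx",
            Placement("nfs://nas1.milan", "/vol1/projects/report.docx"))
print(ns.resolve("/projects/report.docx"))
```

Because only metadata moves, the existing infrastructure is left in place, and virtual views become a matter of querying the catalog rather than copying files.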
Closing the Circle
Demand for modern data storage services is going up. Many organizations' hybrid cloud projects are delayed by storage infrastructure challenges, and at the same time they must address urgent needs around access for small remote offices and work-from-home scenarios.
Solutions in this space are now solid and mature, and they have already proven their positive impact on overall infrastructure TCO. The examples described here are joined by others that I will analyze in my upcoming reports. This is all good news, but the diversity of approaches can affect how each solution performs in specific use cases, so the onus is on organizations to evaluate carefully.