Predictions for 2016: Trends that will transform the storage industry

By Jérôme Lecat, CEO, Scality, www.scality.com.

Thursday, 7th January 2016
In late 2014 I made some predictions about two trends that would shake up the storage industry over the next year and eventually phase out enterprise SAN and NAS storage as we know it. These were:
 
SaaS Storage: More and more small and medium-sized organisations would delegate not only software but also storage to SaaS providers such as Microsoft Office 365, Salesforce and Box.com, and so would buy little traditional storage in the future. SaaS providers would continue to forgo the EMC/NetApp SAN/NAS storage model in favour of the hyperscale, software-defined, hardware-independent, shared-nothing architecture model of Facebook, Amazon and Google.
 
The result? Traditional storage industry players such as EMC and NetApp could wave goodbye to about $20 billion in revenue.
 
Performance and Scalable Storage: Larger enterprises and other data-intensive organisations were reaching a fork in the storage road: on one path, running their own software-defined disk storage for scale; on the other, silicon, in the form of flash arrays, hybrid storage and converged storage, for performance-hungry, latency-sensitive applications such as streaming video or financial modelling.
 
The result? The traditional storage players would see their market slashed by start-ups in these two emerging categories.
 
One year later these trends are taking shape, as the revenues and profitability of legacy enterprise storage giants EMC, NetApp and IBM continue to take big hits. In the second quarter of 2015, EMC's revenue declined 4 percent year over year, NetApp's 19.6 percent and IBM's a whopping 28.9 percent, according to IDC. This year, two other emerging trends promise to accelerate the process:
 
Deep Learning: Big Data analytics has been around for a few years, digging through huge volumes of information from enterprise databases, files and emails, as well as social media and consumer purchases, to uncover hidden trends, large and small, that can be exploited for competitive advantage, predictive maintenance and other purposes. These and other strategies save organisations millions, enhance revenues and improve services, and consequently the general quality of life for end users.
 
Deep Learning takes off from Big Data, incorporating the artificial intelligence and neural network movements of yore to yield systems that harness information, multi-layered algorithms and software to actually mimic human learning. For example, these systems can teach themselves to understand spoken commands, sort through photos, recognise objects and faces, discover potential new drugs or carry out a host of other breakthrough functions on their own, automatically. Today you can even find robots that can walk and learn any number of new things by example, after example after example.
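 
To make the "learning by example" idea concrete, here is a minimal, hypothetical Python sketch: a tiny two-layer neural network that teaches itself the XOR function from nothing but input/output pairs. The layer sizes, learning rate and iteration count are illustrative assumptions, not figures from this article.

```python
import numpy as np

# Illustrative sketch only: a tiny two-layer network taught XOR purely
# from labelled examples. All hyperparameters are arbitrary choices.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # examples
y = np.array([[0], [1], [1], [0]], dtype=float)              # labels

W1 = rng.normal(size=(2, 8))   # input -> hidden weights
b1 = np.zeros((1, 8))
W2 = rng.normal(size=(8, 1))   # hidden -> output weights
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for step in range(5000):
    # Forward pass: compute the network's current guesses.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: nudge every weight to shrink the error,
    # example after example after example.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= 0.5 * h.T @ d_out
    b2 -= 0.5 * d_out.sum(axis=0, keepdims=True)
    W1 -= 0.5 * X.T @ d_h
    b1 -= 0.5 * d_h.sum(axis=0, keepdims=True)

print(out.round(2).ravel())  # approaches [0, 1, 1, 0]
```

No rule for XOR is ever written into the code; the mapping emerges from repeated exposure to examples, which is the essence of the learning systems described above, if on a vastly smaller scale.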
 
The Emerging Internet of Things (IoT): IoT promises to transform the Web from a medium of human and information interaction to interaction among appliances, machines, components, systems and humans, with humongous volumes of information produced daily. Just a single airline flight can produce a terabyte or more of IoT information from scores of airplane component sensors. When you imagine one hundred thousand airplanes in flight producing a total of 100 petabytes of data daily, the volume of information to be mined for trends, maintenance issues and other revelations, both large and small, is almost unimaginable.
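 
The arithmetic behind those figures is easy to check. A minimal Python sketch, taking the article's one-terabyte-per-flight figure as given (everything else is plain arithmetic):

```python
# Back-of-the-envelope check of the airline IoT numbers above.
TB = 10**12  # bytes in a terabyte (decimal units)
PB = 10**15  # bytes in a petabyte

flights_per_day = 100_000
data_per_flight = 1 * TB  # sensor data per flight, per the article

daily_total = flights_per_day * data_per_flight
print(daily_total / PB, "PB per day")         # 100.0 PB per day
print(daily_total * 365 / PB, "PB per year")  # 36500.0 PB, roughly 36.5 exabytes
```

Aviation alone, at that rate, would generate tens of exabytes a year, and aviation is only one of the industries instrumenting itself.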
 
Deep learning, Big Data and IoT will save lives by preventing airplane and automobile crashes, diagnosing and treating patients, discovering powerful new drugs and so on.
 
In fact, in more and more industries, the value of information and software has begun to outgrow the value of the traditional products manufacturers and retailers have built and sold for years. Many new types of businesses have been, and will be, created just to harness this information, while sharing information across industries and industry players in manufacturing, distribution and other sectors will lead to powerful new innovations.
 
In such an information-hungry environment, in which every aspect of corporate existence and of our own lives is data-driven, and where all data, past, present and future, is a potential gold mine for business and other insight, traditional limited-scale storage models are doomed in favour of hyperscale and performance. Hyperscale is necessary to keep years' worth of Big Data, IoT and Deep-Learning-related information stored nearby and continuously available for quick analysis. Performance is the goal when real-time response and low latency are required.
 
This level of scalability can only come from software-defined storage, just as compute and networking have moved to intelligent, scale-out, software-defined architectures.
 
In a traditional IT storage environment, scaling up inevitably leads to complexity, instability and performance issues. However, the hyperscale, software-defined, shared-nothing model is well on the way to achieving five forms of scalability (a sketch of the shared-nothing mechanics follows the list):
1. Technical scalability, in which systems become more stable and reliable as they grow.
2. Operational scalability, in which systems become simpler, not harder, to manage.
3. Performance scalability, in which performance grows as the system grows.
4. Cost scalability, in which unit cost falls as the system scales.
5. Time scalability: today the time frame for storing and accessing information is a lifetime or more, rather than a few weeks, months or years. I've met people who want their Facebook posts preserved after they've passed.
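 
As a rough illustration of how a shared-nothing design can deliver the first four properties, here is a minimal, hypothetical Python sketch of consistent hashing, one common technique (not necessarily the one any particular vendor uses) for spreading objects across storage nodes with no central index, so that adding nodes adds capacity and throughput while relocating only a small share of the data. The node names and virtual-node count are illustrative assumptions:

```python
import bisect
import hashlib

class HashRing:
    """Consistent-hash ring mapping object keys to storage nodes."""

    def __init__(self, nodes, vnodes=64):
        # Each node is placed at many pseudo-random points ("virtual
        # nodes") on the ring so load spreads evenly.
        self._ring = []  # sorted list of (hash_value, node) points
        for node in nodes:
            for i in range(vnodes):
                self._ring.append((self._hash(f"{node}:{i}"), node))
        self._ring.sort()

    @staticmethod
    def _hash(key):
        return int(hashlib.md5(key.encode()).hexdigest(), 16)

    def node_for(self, key):
        # Walk clockwise to the first ring point at or past the key's
        # hash; wrap around at the end of the ring.
        h = self._hash(key)
        i = bisect.bisect(self._ring, (h,))
        return self._ring[i % len(self._ring)][1]

# Every node can answer "who owns this object?" independently, with
# no shared metadata server: the shared-nothing property. Adding a
# node shifts only ~1/N of the keys, so capacity, throughput and cost
# all scale out together.
ring = HashRing([f"node{i}" for i in range(4)])
print(ring.node_for("flight-4711/sensor-frame-000042"))
```

Because object placement is computed rather than looked up, there is no central bottleneck to destabilise the system as it grows, which is exactly the behaviour points 1 to 3 describe.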
 
This is the architecture Google, Amazon and Facebook have already achieved, and it is what software-defined storage now offers the host of organisations looking for a hyperscale storage solution.