All Over In a Flash – The Growth of Flash Arrays

“The market grew significantly faster than we expected,” said Eric Burgener of IDC when talking about the flash array market in 2014 – not every technology “trend” gets that kind of analyst reaction. The numbers seem to justify the comment: $11.3 billion in 2014 revenues across All Flash Arrays (AFAs) and Hybrid Flash Arrays (HFAs), according to IDC. It seems that every storage vendor has a flash play now – Violin Memory started things off, and the biggies like Dell, EMC and NetApp as well as the “startups” like Nimble, Pure Storage and SolidFire are all solidly in there with AFA and HFA offerings.

Performance in specific applications drove early flash adoption, but the corner seems to have been turned of late. More and more vendors are bundling features into their systems with enterprises firmly in their sights – snapshots, better security and replication are all available now. Then there is the improving cost per gig – still not in the range of disks, but steadily going down nonetheless. It is no surprise that flash adoption is on the rise.

There is a fair amount of activity underway to make the value proposition even sweeter – last month Dell announced flash products targeted at the enterprise, based on new Triple Level Cell (TLC) 3D NAND technology that, the company claims, offers the highest density and lowest cost out there. The claim was that, even before applying data reduction, the cost per gig could come down to as little as $1.66. That should make some HDD users consider shifting loyalties to SSD!
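Data reduction is what makes that raw number interesting: deduplication and compression divide the effective cost per logical gigabyte by the reduction ratio. A rough sketch of the arithmetic, where the $1.66/GB figure comes from the Dell claim above but the reduction ratios are hypothetical examples, not vendor data:

```python
# Illustrative cost-per-GB arithmetic for flash with data reduction.
# The $1.66/GB raw figure is from the Dell claim quoted above; the
# reduction ratios below are hypothetical examples for illustration.

RAW_FLASH_COST_PER_GB = 1.66  # claimed cost before data reduction

def effective_cost(raw_cost_per_gb: float, reduction_ratio: float) -> float:
    """Cost per logical GB after dedupe/compression at the given ratio."""
    return raw_cost_per_gb / reduction_ratio

for ratio in (2.0, 4.0, 6.0):
    cost = effective_cost(RAW_FLASH_COST_PER_GB, ratio)
    print(f"{ratio:.0f}:1 reduction -> ${cost:.2f}/GB")
```

Even a modest 4:1 ratio on mixed enterprise data would bring the effective cost well under $0.50/GB, which is why data-reduction efficiency figures feature so heavily in AFA marketing.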

Then there is the capacity question. Again, there are all sorts of upward movements taking place here in growing the size of the array. Look at HP 3PAR as a case in point. HP’s enthusiasm is based, among other things, on research it has quoted showing that nearly 80% of enterprises believe they will be using flash “more or a lot more” in the near future. A sign of its commitment to the flash market is the HP 3PAR 20800, which claims to be the biggest flash storage array in the world, with the capability to store up to 15 PB of data. That could find a home in most data centers, I would think.

So is it all roses for flash then? Perhaps the choice is not quite that easy. For starters, there is the problem most new technologies face at the start: the risk aversion of the largest enterprise customers. Enterprises are slow to change, especially in the case of storage, with multiple special-purpose systems already embedded in their data centers and, in many ways, in their business operations. Making the change is not easy for them – the larger the enterprise, the better the eventual value proposition of moving to SSD may be, but the change is also the hardest for them to commit to. A real enterprise Catch-22. This has been among the most important reasons why, despite the Apples and Facebooks of the world, the enterprise in general is only now getting onto the flash bandwagon with enthusiasm.

The other concern with flash is longevity, given the finite number of write operations it can absorb. No one wants their data to disappear because a flash device failed due to “wear and tear.”
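The wear concern can at least be quantified. A common back-of-envelope estimate divides a drive’s rated endurance (TBW, terabytes written over its life) by the daily write load; the figures below are hypothetical examples, not any vendor’s specifications:

```python
# Rough SSD lifetime estimate from rated endurance (TBW) and daily writes.
# All numbers here are hypothetical examples used for illustration only.

def lifetime_years(tbw_rating_tb: float, daily_writes_tb: float) -> float:
    """Years until the rated terabytes-written figure is exhausted."""
    return tbw_rating_tb / daily_writes_tb / 365.0

# e.g. a drive rated for 3,500 TBW absorbing 2 TB of writes per day
print(f"{lifetime_years(3500, 2):.1f} years")  # roughly 4.8 years
```

Numbers like these are why enterprise arrays lean on wear leveling, over-provisioning and data reduction to stretch write endurance, and why vendors now back flash with multi-year endurance warranties.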

Until things settle one way or the other, administration and storage management can also be a beast for the data center manager. Picking which kind of data and which application will be better served by which kind of storage becomes a tough task when the data center has to juggle performance, security, cost and several other factors at the same time.

There is a lot happening with flash, though, so despite these concerns, expect it to keep growing and acquiring more followers. Let me give the last word to Mark Peters of Enterprise Strategy Group (ESG): “Instead the questions are now about the type, speed and extent of adoption. Simply put, it is not if flash will be adopted, but where and how soon.” I agree with that news flash!

To know more email: marketing@calsoftinc.com

Anupam Bhide | Calsoft Inc.
