Intel innovates for the 'data-centric' era

Intel has shared its strategy for the future of data-centric computing, as well as an expansive view of Intel’s total addressable market (TAM) and new details about its product roadmap. Central to the strategy is a keen understanding of both the biggest challenges and the opportunities its customers are facing today. By Navin Shenoy, executive VP and GM of Intel’s Data Center Group.

  • Friday, 10th August 2018, by Phil Alsop

As part of my role leading Intel’s data-centric businesses, I meet with customers and partners from all over the globe. While they come from many different industries and face unique business challenges, they have one thing in common: the need to get more value out of enormous amounts of data.

I find it astounding that 90 percent of the world’s data was generated in the past two years. And analysts forecast that by 2025 the world’s data will grow tenfold, reaching 163 zettabytes. But we have a long way to go in harnessing the power of this data. A safe guess is that only about 1 percent of it is utilized, processed and acted upon. Imagine what could happen if we were able to effectively leverage more of this data at scale.

The intersection of data and transportation is a perfect example of this in action. The life-saving potential of autonomous driving is profound: many lives globally could be saved as a result of fewer accidents. Achieving this, however, requires a combination of technologies working in concert, including computer vision, edge computing, mapping, the cloud and artificial intelligence (AI).

This, in turn, requires a significant shift in the way we as an industry view computing and data-centric technology. We need to look at data holistically, including how we move data faster, store more of it and process everything from the cloud to the edge.

Implications for Infrastructure

This end-to-end approach is core to Intel’s strategy, and when we look at the market through this lens of helping customers move, store and process data, the opportunity is enormous. In fact, we’ve revised our TAM estimate for our data-centric businesses upward, from $160 billion by 2021 to $200 billion by 2022. This is the biggest opportunity in the history of the company.

As part of my keynote today, I outlined the investments we’re making across a broad portfolio to maximize this opportunity.

Move Faster

With the explosion of data comes the need to move data faster, especially within hyperscale data centers. Connectivity and the network have become the bottleneck to utilizing high-performance computing effectively. Innovations such as Intel’s silicon photonics are designed to break through that bottleneck, using our unique ability to integrate the laser in silicon to ultimately deliver the lowest cost and power per bit and the highest bandwidth.

In addition, my colleague Alexis Bjorlin announced today that we are further expanding our connectivity portfolio with a new and innovative SmartNIC product line, code-named Cascade Glacier, which is based on Intel® Arria® 10 FPGAs and enables optimized performance for Intel Xeon processor-based systems. Customers are sampling it today, and Cascade Glacier will be available in the first quarter of 2019.

Store More

For many applications running in today’s data centers, it’s not just about moving data, it’s also about storing data in the most economical way. To that end, we have challenged ourselves to completely transform the memory and storage hierarchy in the data center.

We recently unveiled more details about Intel® Optane™ DC persistent memory, a completely new class of memory and storage innovation that enables a large persistent memory tier between DRAM and SSDs while remaining fast and affordable. And today, we shared new performance metrics showing that Intel Optane DC persistent memory-based systems can deliver up to 8 times the performance on certain analytics queries compared with configurations that rely on DRAM alone.
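
To make this persistent memory tier concrete, here is a minimal sketch (not Intel sample code, and not part of this announcement) of how an application might use the open-source PMDK libpmem library to map a byte-addressable region of persistent memory and make writes durable. The /mnt/pmem/example path and the 64 MiB size are hypothetical placeholders.

```c
/* Minimal illustrative sketch using PMDK's libpmem (build with -lpmem).
 * Assumes persistent memory exposed through a DAX-mounted filesystem. */
#include <libpmem.h>
#include <stdio.h>
#include <string.h>

#define POOL_SIZE (64UL * 1024 * 1024)   /* hypothetical 64 MiB region */

int main(void) {
    size_t mapped_len;
    int is_pmem;

    /* Map a file on the DAX filesystem directly into the address space. */
    char *addr = pmem_map_file("/mnt/pmem/example", POOL_SIZE,
                               PMEM_FILE_CREATE, 0666,
                               &mapped_len, &is_pmem);
    if (addr == NULL) {
        perror("pmem_map_file");
        return 1;
    }

    /* Write with ordinary stores -- no block I/O in the data path. */
    const char *msg = "written straight into the persistent tier";
    strcpy(addr, msg);

    /* Make the write durable: cache flush on real pmem, msync otherwise. */
    if (is_pmem)
        pmem_persist(addr, strlen(msg) + 1);
    else
        pmem_msync(addr, strlen(msg) + 1);

    pmem_unmap(addr, mapped_len);
    return 0;
}
```

The point of the sketch is simply that persistence sits directly on the memory bus: durability becomes a cache-flush operation rather than a trip through the storage stack.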

Customers like Google, CERN, Huawei, SAP and Tencent already see this as a game-changer. And today, we’ve started to ship the first units of Intel Optane DC persistent memory, and I personally delivered the first unit to Bart Sano, Google’s vice president of Platforms. Broad availability is planned for 2019, with the next generation of Intel Xeon processors.

In addition, at the Flash Memory Summit, we will unveil new Intel® QLC 3D NAND-based products, and demonstrate how companies like Tencent use this to unleash the value of their data.

Process Everything

A lot has changed since we introduced the first Intel Xeon processor 20 years ago, but the appetite for computing performance is greater than ever. Since launching the Intel Xeon Scalable platform last July, we’ve seen demand continue to rise, and I’m pleased to say that we shipped more than 2 million units in the second quarter of 2018. Even better, in the first four weeks of the third quarter, we’ve shipped another 1 million units.

Our investments in optimizing Intel Xeon processors and Intel FPGAs for artificial intelligence are also paying off. In 2017, more than $1 billion in revenue came from customers running AI on Intel Xeon processors in the data center. And we continue to improve AI training and inference performance. In total, our AI performance has improved well over 200 times since 2014.

Equally exciting to me is what is to come. Today, we disclosed the next-generation roadmap for the Intel Xeon platform:

  • Cascade Lake is a future Intel Xeon Scalable processor based on 14nm technology that will introduce Intel Optane DC persistent memory support and a set of new AI features called Intel DL Boost. This embedded AI accelerator is expected to speed deep learning inference workloads, delivering image recognition up to 11 times faster than the current-generation Intel Xeon Scalable processors did at their July 2017 launch. Cascade Lake is targeted to begin shipping late this year.
  • Cooper Lake is a future Intel Xeon Scalable processor, also based on 14nm technology. Cooper Lake will introduce a new generation platform with significant performance improvements, new I/O features, new Intel® DL Boost capabilities (Bfloat16, sketched briefly after this list) that improve AI/deep learning training performance, and additional Intel Optane DC persistent memory innovations. Cooper Lake is targeted for 2019 shipments.
  • Ice Lake is a future Intel Xeon Scalable processor based on 10nm technology that shares a common platform with Cooper Lake and is planned as a fast follow-on targeted for 2020 shipments.
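
For readers unfamiliar with the Bfloat16 format mentioned in the Cooper Lake item: bfloat16 keeps float32’s sign bit and full 8-bit exponent but truncates the mantissa to 7 bits, preserving float32’s dynamic range at half the memory and bandwidth cost, which is why it is attractive for deep learning training. The C sketch below is purely illustrative of the format itself (it is not Intel code and it ignores NaN edge cases): a float32 is converted to bfloat16 by keeping its top 16 bits with round-to-nearest-even.

```c
/* Illustrative bfloat16 <-> float32 conversion; NaN handling omitted. */
#include <stdint.h>
#include <string.h>
#include <stdio.h>

static uint16_t float_to_bfloat16(float f) {
    uint32_t bits;
    memcpy(&bits, &f, sizeof bits);        /* reinterpret the float32 bits */
    bits += 0x7FFF + ((bits >> 16) & 1);   /* round to nearest, ties to even */
    return (uint16_t)(bits >> 16);         /* sign + 8-bit exponent + 7-bit mantissa */
}

static float bfloat16_to_float(uint16_t b) {
    uint32_t bits = (uint32_t)b << 16;     /* dropped mantissa bits become zero */
    float f;
    memcpy(&f, &bits, sizeof f);
    return f;
}

int main(void) {
    float x = 3.14159f;
    uint16_t b = float_to_bfloat16(x);
    printf("%.5f -> 0x%04X -> %.5f\n", x, b, bfloat16_to_float(b));
    return 0;
}
```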

In addition to investing in the right technologies, we are also offering optimized solutions, from hardware to software, to help our customers stay ahead of their growing infrastructure demands. As an example, we introduced three new Intel Select Solutions today, focused on AI, blockchain and SAP HANA*, which aim to simplify deployment and speed time-to-value for our ecosystem partners and customers.

The Opportunity Ahead

In summary, we’ve entered a new era of data-centric computing. The proliferation of the cloud beyond hyperscale and into the network and out to the edge, the impending transition to 5G, and the growth of AI and analytics have driven a profound shift in the market, creating massive amounts of largely untapped data.

Add to that the growth in processing power and breakthroughs in connectivity, storage, memory and algorithms, and we end up with a completely new way of thinking about infrastructure. I’m excited about the huge and fast-growing data-centric opportunity ($200 billion by 2022) that we see ahead.

To help our customers move, store and process massive amounts of data, we have actionable plans to win in the highest-growth areas, and we have an unparalleled portfolio to fuel our growth, including performance-leading products and a broad ecosystem that spans the entire data-centric market.

When people ask what I love about working at Intel, the answer is simple. We are inventing – and scaling – the technologies and solutions that will usher in a new era of computing and help solve some of society’s greatest problems.