Kubernetes and containers - powering tomorrow’s applications

Containers and Kubernetes are the driving force behind the industry's reinvention of how we build and run applications, fuelling enterprise IT efficiency. By James Petter, VP International at Pure Storage.

  • Friday, 15th October 2021

A container is a standard unit of software that packages up code and all its dependencies so that an application runs quickly and reliably from one computing environment to another. Containers make it easier to roll out cloud-based applications because they contain everything needed to run them in manageable packages. In September 2020 we announced the acquisition of Portworx®, the industry's leading Kubernetes data services platform, for approximately $370 million, so it's safe to say we recognise the significance of the technology. Let's take a look at how we got here.

Importance of data centricity

Data is at the heart of tomorrow's businesses. Leading digital organisations are using a new "cloud-native" technology stack to process this data into value and insight. Cloud-native applications are specifically designed, from day one, to operate in a cloud-like manner, whether in the public cloud or on-prem. They can be deployed and fixed faster, and can be moved across different environments easily. Cloud-native applications are typically made up of microservices (more on these later) and are packaged in containers. This new stack includes a new set of applications - apps that analyse streaming data in real time, apps that index massive quantities of data for search, and apps that train machine learning algorithms on increasingly large data sets - and this cloud-native revolution is undoubtedly being powered by a combination of containers and Kubernetes.

Containers make it efficient to run disaggregated applications at high degrees of scale and fluidity with minimal overhead, and Kubernetes creates the machine-driven orchestration that can juggle all these application fragments and assemble them into a composite application as necessary.
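As an illustration of what that machine-driven orchestration looks like in practice, here is a minimal sketch using the official Kubernetes Python client to declare a Deployment of three replicas of one application fragment. The image name, labels and namespace are hypothetical, and a reachable cluster with a valid kubeconfig is assumed; the point is simply that you declare the desired state and Kubernetes keeps it true.

```python
from kubernetes import client, config

# Assumes a kubeconfig pointing at a test cluster; names below are hypothetical.
config.load_kube_config()
apps = client.AppsV1Api()

# Declare the desired state: three replicas of a single application fragment.
deployment_manifest = {
    "apiVersion": "apps/v1",
    "kind": "Deployment",
    "metadata": {"name": "catalogue", "labels": {"app": "catalogue"}},
    "spec": {
        "replicas": 3,
        "selector": {"matchLabels": {"app": "catalogue"}},
        "template": {
            "metadata": {"labels": {"app": "catalogue"}},
            "spec": {
                "containers": [{
                    "name": "catalogue",
                    "image": "example.registry.local/catalogue:1.0",  # hypothetical image
                    "ports": [{"containerPort": 8080}],
                }]
            },
        },
    },
}

# Kubernetes now takes over: it schedules the three copies and replaces them
# if a node fails or a container crashes.
apps.create_namespaced_deployment(namespace="default", body=deployment_manifest)
```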

Container adoption speaks for itself

Adoption rates of this new cloud-native stack have been staggering. According to 451 Research, 95% of new apps are developed in containers. Enterprises are evolving their cloud strategies to be multi-cloud, and containers are key to this too. Gartner reports that 81% of enterprises are already multi-cloud, working with more than two cloud providers. Gartner also predicts that 85% of all global businesses will use containers in production by 2025 - a huge rise from just 35% in 2019.

It’s still an early market with huge growth potential so it’s inherently hard to forecast, but IDC predicts that the commercial market for container infrastructure software alone will top $1.5B by 2022, and enterprises are paying attention.

Microservices and containers - a perfect match

Put simply, microservices are the individual functions within an application, and they form the basis of a new architectural approach to building applications. Microservices enable IT teams to more easily build and run the applications their users want and need to stay ahead of competitors. Many of the largest consumer and enterprise applications today run on microservices, proving that this is not just a trend for small organisations but one embraced by the largest and most complex too. Indeed, the larger the organisation, the more there is to gain from adopting microservices, because teams are often spread out with limited direct communication.

When was the last time you got a maintenance notification from your favourite streaming service letting you know you won't be able to access it? It doesn't happen. There's never a good time to update these services, because someone is always binge-watching a new show. The principle of microservices is that you break an application into smaller pieces that communicate via APIs, where each piece can be updated independently of the others. As a result, if a streaming service needs to update its password-reset functionality, it doesn't need to kick millions of users offline: password reset is a separate microservice that can be updated on its own. The result is happy developers and happy users.
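To make the idea concrete, here is a minimal, hypothetical sketch (not any real streaming service's code) of a password-reset function carved out as its own microservice. It exposes a small HTTP API that other services call over the network, so it can be updated and redeployed without touching playback, billing or anything else; only the Python standard library is used.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class PasswordResetHandler(BaseHTTPRequestHandler):
    """A tiny stand-alone microservice exposing one API endpoint."""

    def do_POST(self):
        if self.path != "/v1/password-resets":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        request = json.loads(self.rfile.read(length) or b"{}")
        # A real service would publish an event or send an email here;
        # this sketch simply acknowledges the request.
        body = json.dumps({"status": "reset-link-sent",
                           "email": request.get("email")}).encode()
        self.send_response(202)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Other services call this API over the network; none of them need to be
    # redeployed when this one is updated and rolled out again.
    HTTPServer(("0.0.0.0", 8080), PasswordResetHandler).serve_forever()
```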

Microservices are here to stay and will underpin the applications of tomorrow. So in what kind of environment should you run them? Containers are the perfect building block for microservices: they present a lightweight, consistent environment that can follow the application from the developer's desktop, to testing, to final deployment. In addition, containers can run on physical or virtual machines, and they start up in seconds or even milliseconds - far faster than VMs.

Packaging applications with their dependencies

Traditionally, software packages have included all the code needed to run the application on a particular operating system, like Windows or Linux. However, you need more than just application code to run an application - you also need supporting software, such as libraries. For instance, an application for looking up stock prices might use a library to convert company names to ticker symbols and vice versa. This functionality is generic rather than value-added, but it's still important: it allows a user to type "Apple" and get the stock "AAPL." That library is an example of a dependency, and without IT knowing it, any application might have hundreds of these dependencies.
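A small, hypothetical sketch of that example: the application code below only works because the name-to-ticker lookup is available alongside it. In a real application the lookup would come from an external library declared as a dependency; here a tiny in-file table stands in for it so the sketch runs on its own.

```python
# Stand-in for an external name-to-ticker library (a dependency the
# application cannot run without). The table is illustrative only.
NAME_TO_TICKER = {
    "apple": "AAPL",
    "microsoft": "MSFT",
    "pure storage": "PSTG",
}

def to_ticker(company_name: str) -> str:
    """Convert a human-friendly company name to its ticker symbol."""
    try:
        return NAME_TO_TICKER[company_name.strip().lower()]
    except KeyError:
        raise ValueError(f"Unknown company: {company_name!r}")

def lookup_price(company_name: str) -> str:
    """Application code: it only functions because the lookup dependency exists."""
    ticker = to_ticker(company_name)
    # Fetching a live quote would pull in further dependencies (an HTTP
    # client, a market-data SDK); omitted in this sketch.
    return f"Price request queued for {ticker}"

if __name__ == "__main__":
    print(lookup_price("Apple"))  # -> Price request queued for AAPL
```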

One of the main reasons that containers became so popular is that they provided a mechanism and format to package application code—with its dependencies—in a way that made it easy to run an application in different environments. This solved a big problem for developers who were constantly fighting environment-compatibility issues between their development laptops, testing environments, and production. By using containers to package their applications, they could “code once and run anywhere,” dramatically speeding up application delivery.

Not all container services are created equal

In terms of challenges, the first generation of cloud-native applications was designed to be stateless - the containers did application work but didn't need to store any persistent data in associated volumes. As container usage evolves, developers are increasingly building stateful apps inside containers - apps that need to store data in a volume that persists beyond the life of any individual container. This is where the world of storage becomes challenging. The flexibility and openness of containers turn into hurdles and bottlenecks at the storage layer, and simple storage capabilities that we've taken for granted for years in the traditional application stack (high availability, disaster recovery, backup, encryption) become challenges in the container world. Worse, each application often devises its own storage strategy, making it impossible to drive standards and data compliance across an organisation.

This is why, as a best practice, we recommend choosing a solution that delivers the Kubernetes-native data services that both cloud-native and traditional apps require (since those traditional apps aren't going away anytime soon). This means delivering block, file, and object storage services, in multiple performance classes, provisioned on-demand as Kubernetes requires. It means providing instant data access, protection against all types of failure, the ability to mobilise data between clouds and even to and from the edge, and robust security no matter where an application travels. If organisations do this, they will see for themselves why Kubernetes has become the not-so-secret special sauce for modern businesses.
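As a sketch of what "provisioned on-demand as Kubernetes requires" looks like from the developer's side, the example below uses the official Kubernetes Python client to request a persistent volume claim against a named storage class. The claim name, namespace and the "fast-block" class are hypothetical, and a cluster with a dynamic provisioner behind that class (for example, a Kubernetes-native data services layer) is assumed.

```python
from kubernetes import client, config

# Assumes a kubeconfig pointing at a cluster whose "fast-block" storage class
# is backed by a dynamic provisioner; all names here are hypothetical.
config.load_kube_config()
v1 = client.CoreV1Api()

pvc_manifest = {
    "apiVersion": "v1",
    "kind": "PersistentVolumeClaim",
    "metadata": {"name": "orders-db-data"},
    "spec": {
        "accessModes": ["ReadWriteOnce"],
        "storageClassName": "fast-block",
        "resources": {"requests": {"storage": "20Gi"}},
    },
}

# Kubernetes asks the provisioner for a 20Gi volume on demand; the stateful
# container then mounts the claim and its data outlives any single container.
v1.create_namespaced_persistent_volume_claim(namespace="default", body=pvc_manifest)
```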