Kubernetes: solving industry use cases
Case studies of Nokia, Spotify, and The New York Times.
Kubernetes was originally developed at Google and released as open source in 2014. It automates the deployment, scaling, and management of containerized applications, grouping the containers that make up an application into logical units for easy management and discovery.
Why use Kubernetes:
Service discovery and load balancing
Kubernetes can expose a container using a DNS name or its own IP address. If traffic to a container is high, Kubernetes is able to load balance and distribute the network traffic so that the deployment is stable.
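As a rough sketch of what this looks like in practice (using the official Kubernetes Python client, with hypothetical names, labels, and ports, and assuming a working kubeconfig), the following creates a Service that gives every pod labelled app: web one stable in-cluster DNS name and load-balances traffic across them:

```python
from kubernetes import client, config

config.load_kube_config()  # use the current kubeconfig context
core = client.CoreV1Api()

# A Service gives the pods selected by `app: web` one stable virtual IP and
# DNS name (web.default.svc.cluster.local) and spreads traffic across them.
service = {
    "apiVersion": "v1",
    "kind": "Service",
    "metadata": {"name": "web"},
    "spec": {
        "selector": {"app": "web"},            # match pods carrying this label
        "ports": [{"port": 80, "targetPort": 8080}],
    },
}
core.create_namespaced_service(namespace="default", body=service)
```

Other workloads in the cluster can then reach these pods through the Service name, no matter how many replicas exist or which nodes they land on.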
Storage orchestration
Kubernetes allows you to automatically mount a storage system of your choice, such as local storages, public cloud providers, and more.
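For example, a workload can ask for storage with a PersistentVolumeClaim and let whatever backend the cluster is configured with satisfy it. A minimal sketch with the Python client; the claim name, size, and namespace are placeholders:

```python
from kubernetes import client, config

config.load_kube_config()
core = client.CoreV1Api()

# A PersistentVolumeClaim asks the cluster for storage; the matching volume
# is provisioned by whatever backend the cluster is set up with (local disk,
# a cloud provider's block storage, NFS, ...).
pvc = {
    "apiVersion": "v1",
    "kind": "PersistentVolumeClaim",
    "metadata": {"name": "app-data"},
    "spec": {
        "accessModes": ["ReadWriteOnce"],
        "resources": {"requests": {"storage": "1Gi"}},
        # "storageClassName": "standard",  # optional: pick a specific backend
    },
}
core.create_namespaced_persistent_volume_claim(namespace="default", body=pvc)
```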
Automated rollouts and rollbacks
You can describe the desired state for your deployed containers using Kubernetes, and it can change the actual state to the desired state at a controlled rate. For example, you can automate Kubernetes to create new containers for your deployment, remove existing containers and adopt all their resources to the new container.
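Concretely, updating the desired state is just a change to the Deployment's pod template; the controller then performs the rolling update for you. A hedged sketch with the Python client, assuming a Deployment named web already exists in the default namespace:

```python
from kubernetes import client, config

config.load_kube_config()
apps = client.AppsV1Api()

# Changing the pod template (here, the container image) describes a new
# desired state; the Deployment controller rolls it out at a controlled rate,
# replacing old pods only as new ones become ready.
patch = {
    "spec": {
        "template": {
            "spec": {
                "containers": [{"name": "web", "image": "nginx:1.26"}]
            }
        }
    }
}
apps.patch_namespaced_deployment(name="web", namespace="default", body=patch)

# Rolling back is the same idea in reverse, e.g.:
#   kubectl rollout undo deployment/web
```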
Automatic bin packing
You provide Kubernetes with a cluster of nodes that it can use to run containerized tasks. You tell Kubernetes how much CPU and memory (RAM) each container needs. Kubernetes can fit containers onto your nodes to make the best use of your resources.
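The CPU and memory you declare per container are expressed as requests (what the scheduler packs against) and limits (the most the container may consume). A minimal sketch with the Python client; the pod name and image are placeholders:

```python
from kubernetes import client, config

config.load_kube_config()
core = client.CoreV1Api()

# The scheduler uses the declared requests to pick a node with enough free
# CPU and memory; the limits cap what the container can actually use.
pod = {
    "apiVersion": "v1",
    "kind": "Pod",
    "metadata": {"name": "worker", "labels": {"app": "worker"}},
    "spec": {
        "containers": [{
            "name": "worker",
            "image": "busybox:1.36",
            "command": ["sh", "-c", "sleep 3600"],
            "resources": {
                "requests": {"cpu": "250m", "memory": "128Mi"},
                "limits": {"cpu": "500m", "memory": "256Mi"},
            },
        }]
    },
}
core.create_namespaced_pod(namespace="default", body=pod)
```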
Self-healing
Kubernetes restarts containers that fail, replaces containers, kills containers that don’t respond to your user-defined health check, and doesn’t advertise them to clients until they are ready to serve.
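Those health checks are expressed as liveness and readiness probes on the container. A small sketch with the Python client, using nginx purely as an illustrative image:

```python
from kubernetes import client, config

config.load_kube_config()
core = client.CoreV1Api()

# The liveness probe tells Kubernetes when to restart the container; the
# readiness probe keeps it out of Service endpoints until it can serve.
pod = {
    "apiVersion": "v1",
    "kind": "Pod",
    "metadata": {"name": "web", "labels": {"app": "web"}},
    "spec": {
        "containers": [{
            "name": "web",
            "image": "nginx:1.25",
            "ports": [{"containerPort": 80}],
            "livenessProbe": {
                "httpGet": {"path": "/", "port": 80},
                "initialDelaySeconds": 5,
                "periodSeconds": 10,
            },
            "readinessProbe": {
                "httpGet": {"path": "/", "port": 80},
                "periodSeconds": 5,
            },
        }]
    },
}
core.create_namespaced_pod(namespace="default", body=pod)
```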
Secret and configuration management
Kubernetes lets you store and manage sensitive information, such as passwords, OAuth tokens, and SSH keys. You can deploy and update secrets and application configuration without rebuilding your container images, and without exposing secrets in your stack configuration.
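A typical pattern is to store the credential as a Secret and inject it into the container as an environment variable, so the image itself never embeds the value. A sketch with the Python client; the secret name, key, and image are placeholders:

```python
from kubernetes import client, config

config.load_kube_config()
core = client.CoreV1Api()

# Store the credential once as a Secret...
secret = {
    "apiVersion": "v1",
    "kind": "Secret",
    "metadata": {"name": "db-credentials"},
    "stringData": {"password": "s3cr3t"},  # placeholder value
}
core.create_namespaced_secret(namespace="default", body=secret)

# ...and inject it into a container as an environment variable, so the value
# can be rotated without rebuilding the image.
pod = {
    "apiVersion": "v1",
    "kind": "Pod",
    "metadata": {"name": "api"},
    "spec": {
        "containers": [{
            "name": "api",
            "image": "my-api:1.0",  # hypothetical image name
            "env": [{
                "name": "DB_PASSWORD",
                "valueFrom": {
                    "secretKeyRef": {"name": "db-credentials", "key": "password"}
                },
            }],
        }]
    },
}
core.create_namespaced_pod(namespace="default", body=pod)
```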
Use cases solved by Kubernetes:
Case study: Nokia
“When people are picking up their phones and making a call on Nokia networks, they are creating containers in the background with Kubernetes.”
— Gergely Csatari, Senior Open Source Engineer, Nokia
Nokia was the first name in mobile phones when they were becoming ubiquitous in the late 1990s and early 2000s. But by 2014, the company had sold off its mobile device division and was focusing its core business not on the handhelds used for calls, but on the networks.
Challenge
“As a telecom vendor, we have to deliver our software to several telecom operators and put the software into their infrastructure, and each of the operators has a slightly different infrastructure. There are operators running on bare metal, on virtual machines, on VMware Cloud, and on OpenStack Cloud. We want to run the same product on all of these different infrastructures without changing the product itself,”
says Gergely Csatari, Senior Open Source Engineer.
Solution
The company decided that moving to cloud native technologies would allow teams to have infrastructure-agnostic behavior in their products.
“The simplicity of the label-based scheduling of Kubernetes was a sign that showed us this architecture will scale, will be stable, and will be good for our purposes,”
says Csatari. The first Kubernetes-based product, the Nokia Telephony Application Server, went live in early 2018.
Case Study: Spotify
An Early Adopter of Containers, Spotify Is Migrating from Homegrown Orchestration to Kubernetes.
Launched in 2008, the audio-streaming platform has grown to over 200 million monthly active users across the world.
“Our goal is to empower creators and enable a really immersive listening experience for all of the consumers that we have today — and hopefully the consumers we’ll have in the future,” says Jai Chakrabarti, Director of Engineering, Infrastructure and Operations.
An early adopter of microservices and Docker, Spotify had containerized microservices running across its fleet of VMs with a homegrown container orchestration system called Helios. By late 2017, it became clear that “having a small team working on the features was just not as efficient as adopting something that was supported by a much bigger community.”
Case study: The New York Times
Challenge
When the company decided a few years ago to move out of its data centers, its first deployments on the public cloud were smaller, less critical applications managed on virtual machines.
“We started building more and more tools, and at some point we realized that we were doing a disservice by treating Amazon as another data center.”
— Deep Kapadia, Executive Director, Engineering, The New York Times
He was tapped to lead a Delivery Engineering Team that would “design for the abstractions that cloud providers offer us.”
Solution
The team decided to use Google Cloud Platform and its managed Kubernetes offering, Google Kubernetes Engine (GKE).
Feel free to contact me on LinkedIn.