Remember the huge container ship that ran aground in the Suez Canal and blocked vessels at both ends of the international waterway for a week in March? That’s what managing software containers might feel like if not for Kubernetes.
It’s fair to say that without open-source Kubernetes and its capabilities for automatically deploying, upgrading, and provisioning apps and services across clouds at scale, we could still be stuck in the old world of monolithic software rather than the modular, flexible paradigm now underpinning digital innovation globally.
Eighty-three percent of 1,324 IT pros who responded to a survey by the Cloud Native Computing Foundation last year said they use Kubernetes to manage their container lifecycle. Yet despite its soaring popularity, the technology isn’t perfect. It can be complex to learn, operate, and maintain.
To reduce the barriers to entry that developers, DevOps engineers, and others have faced with Kubernetes (also known as K8s), Canonical introduced MicroK8s in late 2018.
Canonical’s Kubernetes pedigree was already strong by that point. The company had released a fully supported enterprise K8s distribution two years earlier, and its Ubuntu operating system was (and remains) the reference platform for Kubernetes on all major public clouds: Amazon’s EKS, Google’s GKE, and Microsoft’s AKS.
By then, it had become clear that usability challenges were a fly in the Kubernetes ointment, and Canonical wanted to offer a simpler way to consume K8s services and tools. From the beginning, MicroK8s was designed as a lightweight, pure-upstream, production-ready Kubernetes distribution that lets developers get a fully featured, conformant, and secure K8s system running in under a minute.
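As a rough illustration of that “under a minute” claim, the publicly documented quickstart boils down to a handful of commands. The sketch below assumes Ubuntu (or any Linux with snapd); exact add-on names and options depend on the MicroK8s release.

```bash
# Minimal MicroK8s quickstart sketch (Ubuntu or any snapd-enabled Linux)
sudo snap install microk8s --classic   # install the single-package distribution
microk8s status --wait-ready           # block until the node reports it is ready
microk8s enable dns                    # turn on a commonly used add-on (CoreDNS)
microk8s kubectl get nodes             # upstream kubectl is bundled with MicroK8s
```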
Two and a half years later, MicroK8s (pronounced “micro-kates”) remains faithful to its original purpose while also maturing into a more full-bodied solution that helps developers be more productive and reduces the notorious operational headaches of configuring, monitoring, and managing Kubernetes clusters.
MicroK8s is now being used in a variety of scenarios, from single-node installation on a developer’s workstation to supporting compute-intensive artificial intelligence (AI) and machine learning (ML) workloads. Its low-resource footprint and support for both ARM and Intel architectures make it well suited to the edge, the Internet of Things (IoT), and appliances, and it is a popular choice on Raspberry Pis.
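To give a flavour of how those scenarios map onto the tool, MicroK8s ships optional add-ons and clustering commands. The sketch below uses add-on names from Canonical’s public documentation; availability varies by MicroK8s release and hardware.

```bash
# The same snap installation adapts to different scenarios via add-ons
microk8s enable gpu        # NVIDIA GPU support for AI/ML workloads
microk8s enable registry   # built-in image registry for local development
microk8s add-node          # prints a join command for building a multi-node cluster
```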
Source: JAXenter