The pace of technology change tends to leave tried-and-true processes in the dust, and so it goes with software development. Running software in containers is catching on, and a primary reason is that code developed in one environment may not run the same way when it is deployed in another. That leads to errors, which take time to fix, and in this era of rapid iteration and release, no one can afford that. So containerization ecosystems like Docker and Kubernetes keep gaining in popularity: the 2018 Container Adoption Benchmark Survey found that nearly half (47%) of surveyed IT leaders plan to deploy containers in production, while another 12% say they already have.
The advantages of containers
By changing the way software is delivered, containers are making developers’ lives easier. They hold a great deal of promise, particularly for increasing developer speed and efficiency across hybrid infrastructures. A container typically bundles an application together with the libraries and environment it needs, so that everything runs as one coherent system. Developers build these systems as container images, test them, and make sure they are acceptable. The images are then deployed to larger environments, where the container platform instantiates identical replicas from each image, ensuring the same software runs everywhere. That is the core appeal of containerization: repeatable deployments of identical software.
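To make the image-based workflow concrete, here is a minimal sketch of what such an image definition can look like. The base image, port and file names are illustrative assumptions, not taken from the article.

```dockerfile
# Minimal sketch of a container image definition (Dockerfile).
# The base image, port and file names are illustrative assumptions.

# Small base image instead of a full operating system install
FROM node:10-alpine
WORKDIR /app

# Install dependencies first so this layer is cached between builds
COPY package*.json ./
RUN npm install --production

# Add the application source and declare how the container runs
COPY . .
EXPOSE 3000
CMD ["node", "server.js"]
```

Building this once (for example with `docker build -t my-app:1.0 .`) produces an image from which the container platform can instantiate identical replicas in every environment.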
SEE ALSO: What is Kubernetes and how does it relate to Docker?
Because they don’t include full operating system images, containers require fewer system resources than bare-metal or traditional virtual machine (VM) environments. With VMs, developers may need to buy more hardware because they reach capacity more quickly. Workloads can certainly be placed in VMs, but containers are better positioned to succeed as cloud computing moves from simple setups to complex, distributed architectures.
Containers make software delivery simpler and more predictable because they provide a consistent deployment environment that can be used at every stage of the delivery pipeline. Applications running in containers can be deployed easily to multiple container platforms and cloud providers. Whether you are building, testing, or running software in production, the same environment hosts it. Containers can also help enterprises modernize legacy applications and create new cloud-native applications that are both scalable and agile.
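One way this consistency shows up in practice is promoting a single image, unchanged, through every pipeline stage. The sketch below assumes a hypothetical registry, image name and tag.

```bash
# Sketch of promoting one and the same image through the whole pipeline.
# The registry, image name and tag are illustrative assumptions.
docker build -t registry.example.com/my-app:1.4.2 .

# Run the test suite against the exact artifact that will ship
docker run --rm registry.example.com/my-app:1.4.2 npm test

# Publish it; staging and production then pull and run this identical image
docker push registry.example.com/my-app:1.4.2
docker run -d -p 80:3000 registry.example.com/my-app:1.4.2
```

Because build, test and production all use the same tagged image, "it worked on my machine" differences are largely taken out of the equation.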
Pluses and minuses
Containers are extremely useful, but they have their limitations. They do eliminate some concerns about how differences between your development environment and your production environment will affect your application. But containers aren’t immune to the kinds of bugs and errors that plague traditional software development. The fact that flaws, outages and security incidents still occur is proof that testing tools don’t catch 100% of issues.
There are at least 30 vulnerabilities in each of the top 10 most popular Docker images, according to a recent report by Snyk. On top of that, if a container is built from an older version of an application, there’s a high likelihood it contains vulnerabilities. That means your organization is still at risk of system outages and downtime that can cause significant economic and reputational damage.
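One practical mitigation is to scan images for known vulnerabilities before they ship. A minimal sketch, assuming the Snyk CLI is installed and authenticated, and using illustrative image names:

```bash
# Sketch: scanning images for known vulnerabilities before deploying them.
# Assumes the Snyk CLI is installed and authenticated; image names are illustrative.
snyk container test node:10
snyk container test registry.example.com/my-app:1.4.2 --file=Dockerfile

# Pinning base image versions and rebuilding regularly on patched bases
# keeps older, vulnerable application versions out of your images.
```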
The Ponemon Institute’s Cost of a Data Breach Study 2018 found that an hour of disruption can cost a small company $8,000, a medium company $74,000, and a large enterprise roughly $700,000. Maintaining consistent service with mixed-and-matched software has long been a challenge in IT, and that is exactly what containers solve. The flip side is that if someone creates an exploit that works against one container, identical software is now running everywhere, and the exploit will work against all of those containers.
Avoiding pitfalls
Instead of merely putting applications in containers and never looking back, developers need a new approach to testing to ward off these potential problems. Quality assurance (QA) teams need to test containerized apps under all of the circumstances that might be present in production, because containers can behave differently depending on variables ranging from system hardware to unexpected network traffic. By testing under production conditions, bugs are caught before they go live, and threats are isolated before they have an impact.
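A simple way to approximate production circumstances locally is to run the release image with the same resource limits production enforces and then drive realistic traffic at it. The image name and limits below are illustrative assumptions.

```bash
# Sketch: exercising a containerized app under production-like constraints.
# The image name and the resource limits are illustrative assumptions.
# --memory and --cpus mimic the limits a production node might enforce.
docker run -d --name app-under-test \
  --memory=256m --cpus=0.5 \
  -p 8080:3000 registry.example.com/my-app:1.4.2

# Drive realistic traffic and failure scenarios against http://localhost:8080
# before promoting the very same image tag to production.
```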
SEE ALSO: Enhance your Docker usage: Launch build containers with Floki
Proceed with caution
Containers are a tremendous solution to the problems created by the multiple environments developers must deal with, and they make software development and testing easier. However, they have disadvantages that need to be addressed, so they must be handled with care. Bugs and errors can still occur, and a vulnerability in one container means a vulnerability in every copy of it. Testing under production conditions will catch unwanted anomalies before damage is done. Containers will help you release trouble-free software that benefits your customers, but only if you use them wisely.