A buzzword among coders
"Microservices" is a software-development buzzword that has come to prominence over the last few years. The architectural style rose to fame as a way to help developers who were struggling to manage large-scale, monolithic applications. While much ink has been spilled over microservices, there is no generally accepted definition of the term. This article explores the meaning, benefits and challenges of microservices.
Micro vs. macro applications
The basic idea behind microservices is to break down a monolithic application into a suite of modular services. Before microservices, even a minor modification to a monolithic application required rebuilding and redeploying the entire application. Microservices are easier to manage than a monolithic architecture because the system is divided into smaller, loosely coupled parts that work together.
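The contrast can be sketched in a few lines of code. This is a minimal, hypothetical illustration (the class and method names are invented for this example, not taken from any real system): in the monolith, every feature lives in one deployable unit, while in the modular version each service owns a single business capability behind a narrow interface.

```python
# Hypothetical sketch: monolith vs. microservice-style decomposition.

class MonolithicShop:
    """One deployable unit: changing any feature means redeploying it all."""

    def add_to_wishlist(self, user, item):
        return {"user": user, "wishlist": [item]}

    def authorize_card(self, card_number):
        # Toy check standing in for real payment logic.
        return {"authorized": card_number.startswith("4")}


# Microservice-style: each capability is its own small unit that can be
# developed, deployed and scaled independently of the others.
class WishlistService:
    def add(self, user, item):
        return {"user": user, "wishlist": [item]}


class PaymentService:
    def authorize(self, card_number):
        return {"authorized": card_number.startswith("4")}
```

In the decomposed version, a change to payment logic touches only PaymentService; the wishlist code is untouched and need not be redeployed.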
Take Amazon.com as an example. The website comprises a variety of services, from wish lists to credit card authentication, all of which work together. Unlike a traditional layered architecture, this arrangement is not organized around software layers; rather, each module supports a particular business capability. The modules communicate with other services through well-defined interfaces, which tend to be stateless. Other companies, such as Netflix, have adopted the same architectural model.
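What a stateless, well-defined interface looks like can be shown with a short sketch. This is an assumed, simplified contract (the function name and message fields are hypothetical, not any real service's API): the handler keeps no session state, so everything it needs arrives in the request, and any replica of the service can answer any call.

```python
import json

# Hypothetical stateless interface for a wishlist service.
# Assumed contract: {"user": str, "action": "add", "item": str}
def handle_wishlist_request(request_json: str) -> str:
    request = json.loads(request_json)
    # No server-side session: the request carries all required context.
    if request.get("action") != "add":
        return json.dumps({"error": "unsupported action"})
    return json.dumps({"user": request["user"], "added": request["item"]})
```

Because the handler holds no state between calls, the service behind this interface can be scaled out simply by running more copies of it.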
As previously noted, one of the main benefits of the approach is the ability to make updates or repairs to a system without having to rebuild an entire application. Moreover, since the system is divided into small parts focused on particular business capabilities, development teams can concentrate on their specific service. The architecture also allows each service to be deployed independently, providing more flexibility.
Every technology has drawbacks, microservices included. The number of potential failure points increases with microservices, since there are more services interacting with each other. As new services are added, it can become increasingly difficult to maintain, configure and monitor them all. Consequently, the cost of monitoring applications in production tends to be higher with this architecture. Furthermore, the larger number of services presents hackers and cyber criminals with a broader attack surface.