Microservices are the latest evolution of service-oriented architecture: an application is built out of many independent pieces all working together, often deployed in containers.
Benefits include speed to market, lower costs, and greater flexibility -- but they also come with their own set of security and management challenges.
Lattice Engines, which offers sales and marketing analytics, is planning ahead with a deliberate rollout schedule.
"We're taking our time to build this out," said Walt Williams, the company's head of security. "We're not going to allow just anyone to spin up a new service or process. This is going to be something that is very tightly controlled here at Lattice. There will be some services-on-demand capabilities, but we're not going to see a spontaneous peak of demand that far exceeds our plans, and we're restricting the use of automation to create new services."
The company is currently moving its production environment to the Amazon cloud, using the microservices approach.
"We don't want to do this using the traditional approach -- deploy your servers, maintain your servers, put your application on the server, and maintain the application," he said.
These are going to be customer-facing applications, so security is critical.
Using microservices and Docker containers allows the company to quickly and flexibly deploy applications, and the container architecture means that the services themselves will be structurally isolated from the operating system.
"So we can make updates to them independent of any changes to the operating system in which the container resides," he said.
That offers both operational and security benefits.
"This way, if there's a vulnerability in the operating system, we're isolated from that," he said. "It prevents escalation of compromise. We're exploiting the modular nature of the environment to enhance security."
However, there are also management challenges and worries about access control and patching schedules.
To address these issues right from the start, Lattice uses Chef, a configuration management tool, to manage its containers.
"So, all the Docker containers we are deploying will be centrally managed," he said.
That leaves a gap when it comes to patch management, he said.
There, Lattice uses tools from Tenable Network Security, one of a small number of vendors starting to offer container security and management products.
"Frankly, we look at these containers as a way to ensure patch management," said Williams. "That's one of the areas where the Tenable product is helping us -- we can do vulnerability scanning in the container itself before it is pushed out to production."
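The pre-production gate Williams describes can be sketched in code. The following is a minimal, illustrative example only -- the report format, CVE identifiers, and image name are hypothetical and do not reflect Tenable's actual output -- showing how a build pipeline might refuse to push an image whose scan findings exceed a severity threshold.

```python
# Illustrative CI gate: block promotion of a container image when its
# vulnerability scan reports findings above an allowed severity.
# The report structure below is a made-up example, not a real scanner's format.

SEVERITY_RANK = {"low": 1, "medium": 2, "high": 3, "critical": 4}

def gate_image(scan_report: dict, max_severity: str = "medium") -> bool:
    """Return True if the image may be pushed to production."""
    limit = SEVERITY_RANK[max_severity]
    blocking = [
        f for f in scan_report.get("findings", [])
        if SEVERITY_RANK.get(f["severity"], 0) > limit
    ]
    for finding in blocking:
        print(f"BLOCKED by {finding['id']} ({finding['severity']})")
    return not blocking

report = {
    "image": "registry.example.com/app:1.4.2",  # hypothetical image name
    "findings": [
        {"id": "CVE-2017-0001", "severity": "high"},  # hypothetical CVEs
        {"id": "CVE-2017-0002", "severity": "low"},
    ],
}
print(gate_image(report))  # -> False: the high-severity finding blocks the push
```

In practice the report would come from the scanner itself, and the gate would run as a step in the build pipeline, before the registry push.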
Communication and libraries
It used to be that companies could secure the communications of their applications because they were running on a fixed number of physical or virtual servers.
"Now, it's harder to identify where those microservices reside," said Dave Burton, vice president of marketing at security vendor GuardiCore.
Plus, all of the microservices need to communicate with one another, and there are more of them than ever before.
"This dramatically increases the attack surfaces that attackers can exploit," he said.
To set policies for those communications, security teams need to understand where the microservices are, and how they communicate with one another.
Vendors, including GuardiCore, are starting to provide tools that help identify communication endpoints in cloud environments, he said.
"And we think it needs to go even further, and map the communications between individual processes to get a more granular view of how microservices communicate with one another," he said. "You need to auto-discover there, and have it be automatically updated. The second thing that you need to do -- and this is a trend we're seeing in the marketplace -- is to move to more granular security policies inside the data centers and clouds."
This approach is called microsegmentation, and it allows companies to set policies governing traffic between individual processes.
"In the event that one of those services gets compromised as a footprint for launching an attack, they'll be able to contain the attack," he said.
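At its core, microsegmentation replaces one flat network with an explicit allow-list of service-to-service flows. The sketch below is illustrative only -- the service names and policy table are invented -- but it shows why a compromised front end cannot be used to reach the database directly.

```python
# Illustrative microsegmentation policy: traffic is permitted only between
# explicitly whitelisted (source, destination) service pairs.
# Service names here are hypothetical.

ALLOWED_FLOWS = {
    ("web-frontend", "orders-api"),
    ("orders-api", "orders-db"),
    ("orders-api", "payments-api"),
}

def allow(src: str, dst: str) -> bool:
    """Permit a connection only if the (src, dst) pair is whitelisted."""
    return (src, dst) in ALLOWED_FLOWS

# A compromised frontend cannot jump straight to the database:
print(allow("web-frontend", "orders-db"))   # -> False
print(allow("web-frontend", "orders-api"))  # -> True
```

Real enforcement happens at the network or host layer, but the policy model is the same: deny by default, and contain an attacker to the flows the compromised service legitimately needs.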
In addition, those individual messages all need to be properly authenticated and encrypted, said Owen Garrett, head of product at NGINX, which makes a popular open source web server and load balancer.
He suggested that enterprises look carefully at how they deploy their web application firewalls, encryption, and network segmentation.
Ensuring the security of communications was a big issue for Alkami Technology, which develops and hosts online banking software.
The financial industry is heavily regulated, and vendors need to be sure that they comply with industry best practices, said McElroy. "Is all of our data encrypted at rest? Is it encrypted with best practices in transit?"
To make sure that the security is in place, Alkami uses NGINX as its middleware tier.
"It lets us make sure that we can propagate the best practices for secure controls, rather than having to depend on each developer to do it for each particular microservice," McElroy said. "We want developers to develop more quickly, but we don't want to give up anything from a security design standpoint."
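The pattern McElroy describes -- enforcing controls once at the middleware tier rather than in every microservice -- might look like the following fragment. This is an illustrative sketch, not Alkami's actual configuration; the hostname, paths, and upstream name are hypothetical.

```nginx
# Illustrative NGINX fragment: TLS termination and security headers are
# applied once at the proxy tier, so the microservices behind it don't
# each have to reimplement them. All names here are hypothetical.
server {
    listen 443 ssl;
    server_name api.example.com;

    ssl_certificate     /etc/nginx/tls/cert.pem;
    ssl_certificate_key /etc/nginx/tls/key.pem;
    ssl_protocols       TLSv1.2;

    add_header Strict-Transport-Security "max-age=31536000" always;

    location /orders/ {
        proxy_pass https://orders-service:8443/;   # upstream microservice
    }
}
```

Because the policy lives in one place, tightening it -- say, adding a header or dropping an old TLS version -- takes one change instead of one per service.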
To check for third-party vulnerabilities, Alkami uses Veracode, he added.
"I see tremendous value in that," he said. "If we didn't have something to take care of some of the basics, then we would be spending a lot of time chasing ghosts."
Short life cycles require automation
In addition to helping speed up development, automated security tools also help companies deal with another microservices-related challenge: the short life cycles of containers.
Containers are a very popular way to deploy microservices, and because of how easy they are to spin up, any particular service might be up and running for just a couple of days -- or even just a few minutes, said Gavin Millard, technical director at Tenable Network Security.
"It allows organizations to be hyper-scalable when required," he said.
The ease and speed of deploying the containers mean that security is often forgotten.
"Quite often, the CISOs don't even know that Docker is running within their infrastructure," he said. "With a lack of visibility, you get a lack of understanding of the vulnerabilities and misconfigurations that exist in those systems."
For example, he recently scanned popular Linux-based containers, and found 80 vulnerabilities right out of the box. Those need to be patched before the container is actually put into production.
When they're not, a company's attack surface could expand dramatically overnight.
"We acquired a container security company, giving us the ability to look into those container images," Millard said.
FlawCheck, which Tenable bought last fall, scans container images for vulnerabilities, malware and other risks, and also provides continuous monitoring to ensure that containers stay up-to-date after they've been deployed.
"Even if you download a vulnerable image, you can take the appropriate remediation step and get rid of those vulnerabilities before it's pushed out into the production environment," he said.
This process has to be automated. Traditional physical servers would typically last three to five years, and companies could manage them manually.
"Fifteen years ago, I knew exactly how many servers I had," Millard said. "I even named them after famous race horses."
The average lifespan of virtual servers is two to three weeks, he added.
Containers, however, have an average lifespan of just 9.25 hours, according to software performance analytics company New Relic. And the biggest growth last year was in containers that have a lifespan of less than a minute.
Last summer, Docker CEO Ben Golub reported that more than 4 billion containers had been deployed, with more than 460,000 Dockerized applications, a growth of 3,100 percent over the past two years.
And companies that were already running containers saw their usage increase 192 percent between 2015 and 2016, according to New Relic. Even more dramatically, the maximum number of containers within a single company rose from 1,596 in 2015 to 135,630 in 2016. The average number of active containers at a single company is now 28,000.
According to a report released earlier this month by RightScale, 40 percent of enterprises are already using Docker containers, and another 30 percent are planning to do so.
"It's massively exploding," said Jay Leek, managing director at cybersecurity consultancy ClearSky Cyber Security. "Most CISOs have no idea of the magnitude of the problem they're facing."
One reason that adoption of containers is growing so quickly, and often without the oversight of security, is that developers are using them to quickly create applications.
And it's not just the security organization that's bypassed, Leek said. Sometimes the CIOs are out of the loop, as well.
Then when the applications are finished, the company has a dilemma -- do they put the containers into production, or do they throw out all the work?
"Then there's a big scramble to secure and support the new environment," he said. "This is a big issue right now that every organization is facing -- whether they realize it or not. It may be happening outside the purview of your team, until it slams in your face."
And it's not just developers rushing to meet deadlines.
"People in finance or in business with development skills, it's so easy for them to spin up a container to do things quickly," he said. "So it's even broadened how you think about traditional development."
Meanwhile, the number of vendors offering management and security tools can be counted on two hands -- with fingers left over, he said.
"We're at the starting line, we really haven't started the race here," he said. "But the benefits are so significant, and so important to businesses being able to innovate faster, that we have no choice about whether we accept it."