Of all the ideas to surface in the 20-year history of cloud computing, few have proved as compelling as the hybrid cloud. Organizations understand on-premises data centers, and they understand how computing power can be rented through public clouds or accessed through dedicated private clouds.

What they increasingly need is a way of using all three at once, including deploying multiple public clouds simultaneously, to avoid provider lock-in, allow portability of workloads, improve scalability, cut costs, and meet compliance standards.

In a hybrid cloud, physical location – whose hardware is actually running an application – becomes less meaningful. A private cloud can be run on a cloud provider just as a cloud service can be run on-premises. What matters is the software infrastructure used to connect and manage multiple, diverse clouds as a single logical infrastructure.

This used to be done through middleware and a big VPN pipe for secure transfer, but a growing number of enterprises have moved on from that old-world approach and embraced cloud native architecture. Instead of connecting clouds to one another, they are unified through a single software layer comprising a common base OS, a common set of container images, and an orchestration system such as Kubernetes to automate and manage the underlying resources.

One vendor that has nailed its future to the hybrid cloud, cloud native approach is Red Hat, which, not surprisingly, argues it should be built on open standards. With the foundational security provided by Red Hat Enterprise Linux (RHEL), the layered products that run on top, such as Red Hat OpenShift, benefit by inheriting RHEL’s security technologies.

Red Hat has packaged and delivered trusted Linux content for years and now delivers that same trusted content packaged as Linux containers through the Red Hat Universal Base Image. This allows enterprises to build a security-focused hybrid cloud, manage and control it with integrated security, and build, deploy, and run security-focused applications on top of it using DevSecOps practices.
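As a sketch of what building on that trusted content can look like, here is a minimal, hypothetical Containerfile based on the Universal Base Image. The application file, package choices, and user ID are illustrative, not a prescribed layout:

```dockerfile
# Illustrative Containerfile: build on Red Hat's freely redistributable
# Universal Base Image rather than an unvetted image from a public registry.
FROM registry.access.redhat.com/ubi9/ubi-minimal

# Install only what the app needs; microdnf is ubi-minimal's package manager
RUN microdnf install -y python3 && microdnf clean all

# Run as a non-root user – a basic container hardening practice
USER 1001

COPY app.py /opt/app/app.py
CMD ["python3", "/opt/app/app.py"]
```

Because the base layers come from a vendor that patches and rebuilds them, the layered products and scanners above can reason about exactly what is inside the image.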

Growing control of the application lifecycle by developers

From the DevOps shop floor, cloud native development looks democratic, flexible, and fast; from the security team’s perspective, it can look like shadow IT on a grand and risky scale.

When we spoke to Lucy Huh Kerner, Red Hat’s Director of Security Global Strategy and Evangelism, she said “Before, developers used the machines their IT teams allocated to them. Now, developers have more and more control over the application lifecycle and can download a container image from the internet, spin up instances on AWS, and there’s no longer the need to wait potentially weeks to be given a machine from the IT team.”

“Many times, security teams are not brought in early in the evaluation process of new technologies, including cloud technologies. Developers want the path of least resistance to develop applications quickly in a more agile way without necessarily always involving the security teams, who are viewed as blockers in many cases.”

And this assumes that the security team is even aware that developers might be doing this behind their backs. Much of this loss of control can be invisible until a misconfiguration strikes or data is left unprotected by mistake. According to Kerner, common mistakes include embedding secrets inside apps and failing to lock down credentials.

The underlying issue is a mismatch between a traditional development approach, where security is bolted on at the end of the application development lifecycle, and one where security is addressed early in the lifecycle using agile development practices and tools – all on top of a security-focused hybrid cloud. The former is the definition of reactive security: fixing a problem when you see it, not before it occurs.
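Shifting security left can start with checks this simple. The following is a minimal, illustrative sketch of a pre-commit-style scan for hardcoded secrets – not any particular product’s scanner, and deliberately far from exhaustive:

```python
import re

# Patterns that commonly indicate hardcoded secrets (illustrative, not exhaustive)
SECRET_PATTERNS = [
    re.compile(r"AKIA[0-9A-Z]{16}"),  # AWS access key ID format
    re.compile(r"(?i)(password|secret|token)\s*=\s*['\"][^'\"]+['\"]"),
]

def find_secrets(source: str) -> list[str]:
    """Return the lines of `source` that appear to embed a credential."""
    return [
        line for line in source.splitlines()
        if any(p.search(line) for p in SECRET_PATTERNS)
    ]

code = 'db_password = "hunter2"\nprint("hello")\n'
print(find_secrets(code))  # the hardcoded password line is flagged
```

Running a check like this in a pre-commit hook or CI step catches the mistake minutes after it is made, rather than months later in a penetration test.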

“Security is no longer just the responsibility of security teams,” says Kerner. “Because developers have more and more control of the application lifecycle, they now have access to a lot of security tooling they may not have had access to before, such as container security tooling.”

Different development teams may also use different types of infrastructure for their apps, including AWS, Azure, or containers and Kubernetes – either directly or via a container platform such as Red Hat OpenShift. The result is that organizations find it harder to control their infrastructure, and may find it harder to meet compliance standards.

Creating a security-focused ‘software factory’ with DevSecOps

According to Kerner, putting the Sec into DevOps is a key long-term solution. Put security into DevOps and security stops being something only the security team worries about; it becomes part of every developer’s and IT operations team’s job. Security is integrated early – shifted left – in the application lifecycle.

The point about DevSecOps is that it makes the issue of security explicit, something Kerner argues needs to be consciously adopted at the C-level. Without a commitment at the CISO or CIO level to impose rules on collaboration and standardization, cultural change will likely never happen, and development, IT operations, and security teams will remain siloed.

For example, “Someone has to say, ‘These are the hardened images we will allow in our organization, and these are the security gates that will be part of our application lifecycle. If security gate checks are not met, we will break the build and alert the relevant teams accordingly.’”

What this looks like in practice is using only approved cloud providers, an agreed system of access control, an agreed set of hardened container images, and an agreed framework for the application lifecycle, used by all development teams for consistency around security gates, security tools, and processes.

This agreement has to be established between the application development, IT operations, and security teams. “Without that, everybody just does their own thing and the silos won’t go away,” observes Kerner, who likens a more ideal process to building a factory production line for software with security gates built in.
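Such a security gate can be surprisingly small. Here is a minimal sketch in Python, assuming a hypothetical approved-image list and scan-result format; a real pipeline would wire this logic into a CI step or admission controller rather than a standalone script:

```python
# Minimal sketch of a CI security gate: fail the build if an image is not on
# the organization's approved list, or a scan reports severe vulnerabilities.
# The registry, image names, and scan format below are hypothetical.
APPROVED_IMAGES = {
    "registry.example.com/ubi9/ubi-minimal",
    "registry.example.com/ubi9/python-311",
}
MAX_ALLOWED_CVSS = 6.9  # block anything scored High (7.0) or above

def gate(image: str, findings: list[dict]) -> tuple[bool, list[str]]:
    """Return (passed, reasons). `findings` are scan results as {id, cvss}."""
    reasons = []
    if image not in APPROVED_IMAGES:
        reasons.append(f"image {image} is not on the approved list")
    for f in findings:
        if f["cvss"] > MAX_ALLOWED_CVSS:
            reasons.append(f"{f['id']} (CVSS {f['cvss']}) exceeds threshold")
    return (not reasons, reasons)

passed, reasons = gate(
    "registry.example.com/ubi9/ubi-minimal",
    [{"id": "CVE-2024-0001", "cvss": 9.8}],
)
print(passed, reasons)
```

When `passed` is false, the pipeline breaks the build and alerts the relevant teams, exactly as the agreed policy prescribes.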

This production line analogy also captures the role of automation: everything in the development and IT operations pipeline is codified. For example, using technologies such as Kubernetes Operators or Red Hat Ansible Automation Platform, you can automate scanning and remediation against security controls for compliance. “At the end of the pipeline, not only do you have scan results, you can also automatically generate compliance reports to hand to the auditors.”
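The end-of-pipeline reporting Kerner describes can be sketched in a few lines. The control names and report format here are purely illustrative, standing in for whatever a real scanner emits:

```python
import json
from datetime import date

def compliance_report(scan_results: list[dict]) -> dict:
    """Summarize automated control scans as an auditor-friendly report."""
    failed = [r for r in scan_results if not r["passed"]]
    return {
        "generated": str(date.today()),
        "total_controls": len(scan_results),
        "passed": len(scan_results) - len(failed),
        "failed_controls": [r["control"] for r in failed],
        "compliant": not failed,
    }

# Hypothetical results from an automated scan at the end of the pipeline
results = [
    {"control": "sshd_disable_root_login", "passed": True},
    {"control": "no_plaintext_secrets", "passed": False},
]
print(json.dumps(compliance_report(results), indent=2))
```

Because the pipeline runs on every change, the report is always current – there is no scramble to assemble evidence when the auditors arrive.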

Doing DevSecOps at scale with containers and Kubernetes

Cloud technologies – including containers, Kubernetes, and public cloud services – can help you implement DevSecOps at scale. Using a security-focused, enterprise-grade, container-based application platform such as Red Hat OpenShift empowers developers and operators to achieve their project goals, while also improving operational efficiency and infrastructure utilization.

Red Hat OpenShift is built on a core of standard and portable Linux containers that deliver built-in security features. Red Hat Enterprise Linux CoreOS (RHCOS) is a container-optimized operating system for running containers at scale with strong isolation, and it provides the foundational security layer that Red Hat OpenShift runs on top of. Red Hat OpenShift can also run on many different cloud environments, including AWS, Microsoft Azure, Google Cloud, and more, Kerner says.

“Red Hat also provides the tools for developers to build security-focused applications, whether they are traditional applications or cloud-native, containerized applications. We also have technologies for automation and management. Using Red Hat Ansible Automation Platform, you can implement IT automation at scale, in a consistent and repeatable way, across development, IT operations, and security teams.”

“We are constantly monitoring all the open source packages that make up our products so we can act quickly whenever a security vulnerability is detected. This contributes to the software supply chain of our products. We are also investing in future technologies that could help with software supply chain security”, says Kerner.

A good example of this in action is the sigstore project, a Linux Foundation initiative originally developed by Red Hat and backed by Google, intended to improve software supply chain integrity and verification through digital signing of source code. Sigstore allows you to verify the provenance of software, including open source packages; its mission is to make it easy for developers to sign releases and for users to verify them. Many digital signing solutions are expensive, Red Hat argues, don’t fit the model of open source’s innovation engine, and raise the question of who holds the private key.

It has been compared to the way services such as Let’s Encrypt made TLS certificates free and mainstream rather than something accessible only to specialists with budgets. Sigstore provides free certificates and tooling to automate and verify signatures of source code.
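In practice, signing and verification with sigstore’s cosign tool looks something like the following sketch. It assumes cosign is installed and uses a hypothetical image name; sigstore’s “keyless” mode, which issues short-lived certificates tied to an OIDC identity, can replace the long-lived key pair entirely:

```shell
# Generate a signing key pair (writes cosign.key and cosign.pub)
cosign generate-key-pair

# Sign a container image; the signature is stored alongside it in the registry
cosign sign --key cosign.key registry.example.com/myapp:1.0

# Anyone with the public key can verify the image's provenance
cosign verify --key cosign.pub registry.example.com/myapp:1.0
```

Verification can then be enforced automatically, for example as one of the security gates in the pipeline described earlier.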

Big challenges remain, not the least of which is infusing the culture of DevSecOps into an enterprise environment where speed, cost, and siloed security are still the name of the game. That culture could take years to develop – time the industry arguably can’t afford. But in the end, security won’t win the war with just more and better tools, or more and better security and DevOps staff. The overhaul promised by DevSecOps was always going to be a culture war as much as a tooling one.


Sponsored by Red Hat.