Enterprises stand at a crossroads. The industry is moving towards cloud infrastructure, cloud-native applications and containerized workloads. That’s the future. However, most enterprises built their business on legacy applications. And these legacy workloads make up the majority of business critical, revenue-generating applications. Migrating to the cloud and cloud-native applications can’t be taken lightly. Enterprises can’t turn their back on legacy applications.
Applications are the fabric of what makes a modern organization function, innovate and deliver new or existing products. It’s also true that the modern business is looking for faster: faster time to market, faster time to value, faster innovation, you name it, faster everything. Considering that applications are so critical to your business, and that containers help you move “faster,” it stands to reason that an appropriate security strategy to protect those applications and their associated data is critical.
In the technology space there has always been a struggle between building a hardcore skill and being flexible enough to adapt as technology changes. My first experience with that was COBOL (yes, I’ve been around for a while!), and since then I’ve seen it play out in client-server, virtualization, storage, phone switches, and much more.
Since the dawn of the client-server age we’ve been working on ways to treat the rack-mounted server the same way we used to treat a mainframe, with some caveats. Like a mainframe, we’re trying to make the server useful for more than one application, so that we’re more efficient with costs and resources. The caveat versus a mainframe is that if you were running a mainframe from Tandem, you weren’t going to be sharing applications between it and an IBM.
Last week at DockerCon in Seattle, I spoke with John Furrier and Brian Gracely of theCUBE and discussed how Apcera and Docker have paved the way to “containerize,” secure and scale both traditional and greenfield applications in production.
You might be wondering, “what is Apcera doing that Mark found so interesting?” Those of you who have been participating in the industry with me, following my Twitter feed or reading my blogs will know that I’ve had a fascination with the strategic business value of cloud for many years now. Some of you who were paying closer attention might have noticed that I was a very early adopter of the idea that enterprises would end up needing to manage a heterogeneous application delivery environment (aka hybrid). In fact, I’ve been thinking hybrid since 2009.
As you may have read, Apcera will be integrating with Kubernetes to provide a trusted fabric for containers. This is a BIG DEAL for us and a big bet on the future of cloud computing. It comes after I joined the Governing Board of the Cloud Native Computing Foundation - the new custodian of Kubernetes - to become more personally involved in the project.
Here’s the backstory.
Over the past 15 years I’ve seen the power of cloud computing radically transform the way we innovate. Speed, agility and scale have captivated many business leaders and technologists. Organizations need to innovate at a faster pace, and cloud computing technologies have long been considered the catalyst. But while some organizations were quick to adopt cloud infrastructure, many large enterprises have been hesitant to take the leap.
As organizations maintain diverse workloads across multi-cloud environments, IT and Ops teams are often challenged with balancing the speed of innovation against the need for security and control, from development through deployment and production.
Concerns intensify when teams also consider that sensitive business data now lives inside Docker containers.
The Apcera platform bridges this development-to-production gap with a policy-aware platform that enables secure workloads and containers to run smoothly in enterprise production environments.