
We are tremendously excited to be part of the Knative (pronounced kay-nay-tiv) project recently announced at Google Next. Red Hat's strategy around serverless will embrace and benefit from the ideas Knative supports.

Red Hat has always focused on portability and consistency across hybrid and multi-cloud environments, as needed by many of our customers. We recently introduced Red Hat OpenShift Cloud Functions to further satisfy those needs in a changing technological environment, enabling an enterprise-grade serverless platform with OpenShift.

Knative provides fundamental building blocks for serverless workloads in Kubernetes, empowering the creation of modern, container-based and cloud-native applications which can be deployed anywhere on Kubernetes.

“Red Hat’s focus on the hybrid cloud is rooted in customer choice and we carry that same perspective to our work to make hybrid serverless a reality. Customers should be able to create functions that are triggered by events in a hybrid cloud environment that spans multiple public clouds as well as the datacenter. By joining the Knative community, we are combining our deep enterprise Kubernetes and open source expertise to help create a common building block for serverless on top of Kubernetes across the hybrid cloud.” (Chris Wright, Red Hat Vice President and Chief Technology Officer)

Let’s take this opportunity to draw a distinction between “functions-as-a-service” and “serverless”:

Functions-as-a-Service (FaaS) is an event-driven computing execution model in which code runs in stateless containers; the functions manage server-side logic and keep state through the use of external services. Serverless is the architectural pattern that describes applications combining FaaS with those hosted (managed) services. MartinFowler.com has a great article that provides more details and the origin of the terms.
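The stateless model can be sketched in a few lines (all names here are hypothetical, for illustration only): the function itself remembers nothing between invocations, so any state it needs goes to a backing service, stubbed below as an in-memory dictionary.

```python
# Minimal sketch of a stateless, event-driven function (hypothetical names).
# State lives in a backing service -- stubbed here as a dict -- never in the
# function itself, so any instance can handle any invocation.

store = {}  # stands in for a managed service such as a database or cache

def handle_event(event):
    """Process one event; the function keeps no state between calls."""
    count = store.get(event["user"], 0) + 1
    store[event["user"]] = count
    return {"user": event["user"], "visits": count}
```

Because the function is stateless, the platform is free to scale it to zero, restart it, or run many copies in parallel; the backing service is the single source of truth.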

By providing the same building blocks across multiple vendors, Knative addresses portability and interoperability across different serverless platforms. This can be tremendously valuable: parties building FaaS platforms can focus on delivering business value rather than building the fundamentals themselves, while drawing on the talent and resources of a wider community.

Knative implements primitives for function and application development through a set of CRDs (Custom Resource Definitions) and associated controllers in Kubernetes, which provide a declarative specification of what a developer wants instead of procedural instructions on how to attain it. Just as prior efforts to extend Kubernetes drove the development and refinement of the extension mechanisms Knative relies on today, we believe that Knative development will drive improvements in Kubernetes and Istio.
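As a sketch of that declarative model, deploying a serverless workload with Knative Serving amounts to creating a single custom resource; the controllers then handle revisions, routing, and scale-to-zero. The resource shape below follows the Knative Serving API, but the API version has evolved across releases and the names and image are hypothetical:

```yaml
# Minimal Knative Service (illustrative sketch -- check the CRD version
# installed in your cluster; field names have changed across releases).
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: hello                              # hypothetical service name
spec:
  template:
    spec:
      containers:
        - image: example.com/hello:latest  # hypothetical container image
          env:
            - name: TARGET
              value: "world"
```

The developer states only the desired end state; the controllers reconcile the cluster toward it, which is the same pattern Kubernetes itself uses for Deployments and other built-in resources.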

Organizations are looking to adopt FaaS and serverless solutions, but there are also many workloads that can be brought to Kubernetes and OpenShift today. Combining Operators, Knative, and OpenShift enables stateful, stateless, and serverless workloads to all run on a single multi-cloud container platform with automated operations. We plan to build OpenShift Cloud Functions on Knative, and Red Hat has been working in collaboration with Google and other partners, bringing its enterprise Kubernetes and community expertise to the project.

Knative and OpenShift will be cornerstones of OpenShift Cloud Functions, and our goal is to provide the best operational capabilities and developer experience for serverless application developers, following Red Hat’s commitment to community-first, open source development. Developers can benefit from a single platform for hosting their microservices, legacy, and serverless applications. For Red Hat customers that want to move more of their application workloads to Kubernetes and also want a flexible consumption model, Knative and OpenShift Cloud Functions can help deliver that. We like Knative, and we think our customers will too.

This post was written in collaboration with Paul Morie and Steve Speicher.  


About the author

William Markito Oliveira is an energetic and passionate product leader with expertise in software engineering and distributed systems. He leads a group of product managers working on innovative and emerging technologies.

He has worked in global and distributed organizations, building high-performance teams and successful products. He also has experience establishing internal and external partnerships that deliver high-value outcomes. Oliveira is engaged in open source through fostering communities and governance, and he looks forward to embracing new technologies and paradigms with a focus on distributed systems. He has spoken at multiple software conferences and co-authored a few books in the field.

Currently, Oliveira is focused on the intersection between serverless, cloud computing, and Kubernetes and applying AI/ML concepts whenever possible.

