We are tremendously excited to be part of the Knative (pronounced kay-nay-tiv) project recently announced at Google Next. Red Hat's strategy around serverless will embrace and benefit from the ideas Knative supports.

Red Hat has always focused on portability and consistency across hybrid and multi-cloud environments, as needed by many of our customers. We recently introduced Red Hat OpenShift Cloud Functions to further satisfy those needs in a changing technological environment, enabling an enterprise-grade serverless platform with OpenShift.

Knative provides fundamental building blocks for serverless workloads on Kubernetes, empowering the creation of modern, container-based, cloud-native applications that can be deployed anywhere Kubernetes runs.

“Red Hat’s focus on the hybrid cloud is rooted in customer choice and we carry that same perspective to our work to make hybrid serverless a reality. Customers should be able to create functions that are triggered by events in a hybrid cloud environment that spans multiple public clouds as well as the datacenter. By joining the Knative community, we are combining our deep enterprise Kubernetes and open source expertise to help create a common building block for serverless on top of Kubernetes across the hybrid cloud.” (Chris Wright, Red Hat Vice President and Chief Technology Officer)

Let’s take this opportunity to draw a distinction between “functions-as-a-service” and “serverless”:

Functions-as-a-Service (FaaS) is an event-driven execution model in which functions run in stateless containers and manage server-side logic and state through external services. Serverless is the architectural pattern that describes applications combining FaaS with those hosted (managed) services. MartinFowler.com has a great article that provides more details and the origin of the terms.
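
To make that distinction concrete, here is a minimal sketch of the model in Python: a stateless handler that keeps nothing between invocations and delegates persistence to a hosted service. The event shape, the Redis endpoint, and the handler signature are illustrative assumptions, not any particular FaaS platform's API.

```python
# Hypothetical sketch of a stateless FaaS-style handler. The event shape and
# the Redis endpoint are assumptions for illustration, not a platform's API.
import json

import redis

# State lives in a hosted (managed) service, not in the function's container.
store = redis.Redis(host="redis.example.com", port=6379, decode_responses=True)

def handle(event):
    """Invoked once per event; the container running it can scale to zero."""
    order = json.loads(event["body"])
    # Persist to the external service so the function itself stays stateless.
    store.hset(f"order:{order['id']}", mapping=order)
    return {"statusCode": 200, "body": json.dumps({"stored": order["id"]})}
```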

By providing the same building blocks across multiple vendors, Knative addresses portability and interoperability across different serverless platforms. This can be tremendously valuable: parties building FaaS platforms can focus on delivering business value rather than building the fundamentals themselves, while leveraging the talent and resources of a wider community.

Knative implements primitives for function and application development through a set of Custom Resource Definitions (CRDs) and associated controllers in Kubernetes, which provide a declarative specification of what a developer wants instead of procedural instructions on how to attain it. Just as prior efforts to extend Kubernetes drove the development and refinement of the mechanisms Knative relies on today, we believe that Knative development will drive improvements in Kubernetes and Istio.
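
As a rough illustration of that declarative model, the sketch below uses the Python kubernetes client to submit a Knative Serving Service as a custom resource; the Knative controllers then reconcile revisions, routes, and autoscaling from that description. It assumes the serving.knative.dev/v1 schema, the public helloworld-go sample image, and a cluster with Knative installed; treat it as a sketch rather than the project's canonical example.

```python
# Sketch of Knative's declarative model, assuming the serving.knative.dev/v1
# CRD schema and the public helloworld-go sample image.
from kubernetes import client, config

# Desired state only: image and configuration. The Knative controllers derive
# revisions, routes, and scaling behavior from this specification.
knative_service = {
    "apiVersion": "serving.knative.dev/v1",
    "kind": "Service",
    "metadata": {"name": "hello", "namespace": "default"},
    "spec": {
        "template": {
            "spec": {
                "containers": [{
                    "image": "gcr.io/knative-samples/helloworld-go",
                    "env": [{"name": "TARGET", "value": "World"}],
                }]
            }
        }
    },
}

config.load_kube_config()  # assumes kubectl access to a cluster with Knative
client.CustomObjectsApi().create_namespaced_custom_object(
    group="serving.knative.dev",
    version="v1",
    namespace="default",
    plural="services",
    body=knative_service,
)
```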

Organizations are looking to adopt FaaS and serverless solutions, but there are also many workloads that can be brought to Kubernetes and OpenShift today. Combining Operators, Knative, and OpenShift enables stateful, stateless, and serverless workloads to all run on a single multi-cloud container platform with automated operations. We plan to build OpenShift Cloud Functions on Knative, and Red Hat has been working in collaboration with Google and other partners, bringing its enterprise Kubernetes and community expertise to the project.

Knative and OpenShift will be cornerstones of OpenShift Cloud Functions, and our goal is to provide the best operational capabilities and developer experience for serverless application developers, following Red Hat’s commitment to community-first, open source development. Developers benefit from being able to use a single platform to host their microservices, legacy, and serverless applications. For Red Hat customers that want to move more of their application workloads to Kubernetes and also want a flexible consumption model, Knative and OpenShift Cloud Functions can help deliver that. We like Knative, and we think our customers will too.

This post was written in collaboration with Paul Morie and Steve Speicher.  


About the author

William Markito Oliveira is an energetic and passionate product leader with expertise in software engineering and distributed systems. He leads a group of product managers working on innovative and emerging technologies.

He has worked in global and distributed organizations, building high-performance teams and successful products, and has experience establishing internal and external partnerships that deliver high-value outcomes. Oliveira is engaged in open source through fostering communities and governance, and he looks forward to embracing new technologies and paradigms with a focus on distributed systems. He has spoken at multiple software conferences and co-authored several books in the field.

Currently, Oliveira is focused on the intersection between serverless, cloud computing, and Kubernetes and applying AI/ML concepts whenever possible.

