Luckily for us serverless enthusiasts, October brought lots of interesting meetup videos about serverless. Check out our favourites below.
The hype around serverless architecture has been building for the last three years. It is a result of the rising popularity of cloud computing, where providers like Google, Microsoft and Amazon have raised the level of abstraction for deploying software. With Function as a Service (FaaS), a provider needs just one thing: your code, in bare-bones fashion, with no binaries required.
The main criticism levelled at the cloud providers is the lack of portability and the fear of “lock-in” with FaaS solutions. A function written for AWS Lambda looks nothing like one written for Azure Functions, and Google Cloud Functions is different again. Some languages are not available at your provider of choice, raising the threshold for getting started even higher. And if you haven’t moved to the cloud at all, running any kind of serverless solution has proven hard.
OpenFaaS is an open source serverless platform that leverages container technology to make sure your functions can run anywhere: your laptop, your on-premises hardware, your cluster in the cloud, your spare Raspberry Pis, or a mix of all of them. OpenFaaS can also manage microservices that haven’t yet been split up into functions.
This talk will give a brief introduction to the key features of OpenFaaS and show how your company can leverage serverless and FaaS, even if you haven’t moved to the cloud yet.
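To make the “functions anywhere” idea concrete, here is a minimal sketch of what an OpenFaaS function can look like. This assumes the classic python3 template style, where the template supplies the HTTP wrapper and you implement only `handle`; the greeting logic itself is our own illustrative example:

```python
# handler.py -- a minimal OpenFaaS-style function sketch.
# Assumption: the classic python3 template, where `req` is the request body
# as a string and the return value becomes the HTTP response body.
def handle(req):
    """Greet whoever is named in the request body, defaulting to 'world'."""
    name = req.strip() or "world"
    return "Hello, {}!".format(name)
```

Because the function is plain code with no server boilerplate, the same handler can be packaged into a container and run on a laptop, a Raspberry Pi, or a cloud cluster.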
Imagine one hot and humid summer day. Wouldn’t it be great if your air-con at home had already been cooling the air, so that you feel refreshed on arrival? It would be perfect if I could turn on the air-con just 10 minutes before getting home! What if I connected my air-con to a Raspberry Pi running a serverless application, triggered by a public cloud application? It sounds so sweet! But… life is not that easy. I’d like to share the hard lessons I learned while setting up this home automation system using a Raspberry Pi, Azure Functions, Power Automate and Power Apps.
Serverless is a hot topic right now, and something that a lot of developers are keen to try.
A lot of the focus has been on proprietary implementations that run only on a single provider’s cloud.
Indeed, if you attend a serverless event, you may come away with the impression that it’s a world of proprietary walled gardens from the major cloud providers. In this talk, I’ll show how you can develop serverless functions on your laptop with an open source platform and run them wherever you like.
In this talk Ewan Slater will cover:
- The case for open source serverless frameworks in general
- The Fn Project (fnproject.io) in particular
- Implementing serverless Shakespeare on a laptop
Serverless applications are usually made up of functions interacting with fully managed services, so you can develop applications without having to think about servers. This lets us build applications that scale quickly and reliably in response to incoming requests, often in the form of events that go well beyond API requests and scheduled cron-style rules. In the event-driven model, components communicate through events, which helps you adopt some of the best practices for distributed systems by default. In this talk, we’ll explore what events are, the different types of events available to your serverless applications, where they come from, and how to use them to build applications that provide more value to your customers. All of this with plenty of architectural pattern examples.
This talk introduces Knative and Cloud Run and shows how they can be used to run modern serverless workloads. Knative is a reference API and implementation: a Kubernetes-based platform to build, deploy, and manage modern serverless workloads. It provides a set of middleware components essential for building modern, source-centric, container-based applications, and it codifies best practices shared by successful real-world Kubernetes-based frameworks. Cloud Run is a managed compute platform built on the Knative specification that automatically scales your stateless containers. Cloud Run is serverless: it abstracts away all infrastructure management. Because it is compatible with Knative, you can choose to run your containers either fully managed with Cloud Run or in your Google Kubernetes Engine cluster with Cloud Run on GKE.
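As a rough sketch of what a Cloud Run-ready workload looks like, here is a stateless HTTP service in plain Python. The only platform contract assumed here is that the container listens on the port given in the `PORT` environment variable, which matches Cloud Run’s convention, defaulting to 8080; the greeting logic is purely illustrative:

```python
# A minimal sketch of a stateless, container-ready HTTP service.
# Assumption: the runtime injects the listening port via the PORT environment
# variable (Cloud Run's convention); we default to 8080 when it is unset.
import os
from http.server import BaseHTTPRequestHandler, HTTPServer

def greeting(name: str) -> str:
    """Business logic kept separate so it can be tested without a server."""
    return "Hello, {}!".format(name)

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = greeting("Cloud Run").encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    port = int(os.environ.get("PORT", "8080"))
    # Serve until the platform scales the container down.
    HTTPServer(("", port), Handler).serve_forever()
```

Packaged into a container image, the same code could run fully managed on Cloud Run, on a GKE cluster via Cloud Run on GKE, or locally, which is exactly the portability the Knative specification aims for.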