How to publish and use AWS Lambda Layers with the Serverless Framework

AWS re:Invent is in full swing, with AWS announcing a slew of new features. Most notably, we’re pretty excited about AWS Lambda’s support for Layers.

Layers let you include additional files or data for your functions. This could be binaries such as FFmpeg or ImageMagick, or difficult-to-package dependencies such as NumPy for Python. A layer’s contents are made available to your function alongside its deployment package when the function is published. In a way, layers are comparable to EC2 AMIs, but for functions.

The killer feature of Lambda’s Layers is that they can be shared between Lambda functions, accounts, and even publicly!

There are two aspects to using Lambda Layers:

  1. Publishing a layer that can be used by other functions
  2. Using a layer in your function when you publish a new function version

We’re excited to say that, as of version 1.34.0, the Serverless Framework has day-one support for both publishing Lambda Layers and using them with your functions!
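
As a rough sketch of what that looks like (the service name, layer name, path, and ARN below are placeholders, not taken from the announcement), a serverless.yml that publishes one layer and attaches it to a function might look roughly like this:

# serverless.yml
service: layers-demo

provider:
  name: aws
  runtime: nodejs8.10

layers:
  ffmpeg:
    path: layers/ffmpeg            # directory whose contents get published as the layer

functions:
  encode:
    handler: handler.encode
    layers:
      # reference the layer defined above via its generated CloudFormation resource
      # (assumed here to be FfmpegLambdaLayer)...
      - { Ref: FfmpegLambdaLayer }
      # ...or attach an existing layer, even one shared from another account, by ARN
      - arn:aws:lambda:us-east-1:123456789012:layer:shared-libs:1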

See how you can publish and use Lambda Layers with the Serverless Framework below…

https://serverless.com/blog/publish-aws-lambda-layers-serverless-framework/

AWS Lambda Programming Language Comparison


Now that AWS Lambda has added PowerShell to its growing list of supported languages, let’s take a moment to compare and contrast the different languages available to us.

In this post, we’ll take a look at these languages from a number of angles:

  • Cold start performance: how the function performs when a new execution environment has to be spun up
  • Warm performance: performance after the initial cold start
  • Cost: does it cost you more to run functions in one language over another? If so, why?
  • Ecosystem: libraries, deployment tooling, etc.
  • Platform support: is the language supported by other function-as-a-service (FaaS) platforms?

We will also talk about specialized use cases such as Machine Learning (ML), and pay attention to the special needs of the enterprise. Finally, we’ll round off the discussion by looking at a few languages that are not officially supported but that you can use with Lambda via shims.

I should stress that the goal of this post is to consider the relative strengths and weaknesses of each language within the specific context of AWS Lambda. This is not a general purpose language comparison!

https://blog.epsagon.com/aws-lambda-programming-language-comparison

How to create a REST API with pre-written Serverless Components

You might have already heard about our new project, Serverless Components. Our goal was to encapsulate common functionality into so-called “components”, which could then be easily re-used, extended and shared with other developers and other serverless applications.

In this post, I’m going to show you how to compose a fully-fledged, REST API-powered application, all by using several pre-built components from the component registry.

https://serverless.com/blog/how-create-rest-api-serverless-components/

Use Amazon DynamoDB Accelerator (DAX) from AWS Lambda to increase performance while reducing costs

Using Amazon DynamoDB Accelerator (DAX) from AWS Lambda has several benefits for serverless applications that also use Amazon DynamoDB. DAX can improve the response time of your application by dramatically reducing read latency, as compared to using DynamoDB. Using DAX can also lower the cost of DynamoDB by reducing the amount of provisioned read throughput needed for read-heavy applications. For serverless applications, DAX provides an additional benefit: Lower latency results in shorter Lambda execution times, which means lower costs.

Connecting to a DAX cluster from Lambda functions requires some special configuration. In this post, I show an example URL-shortening application based on the AWS Serverless Application Model (AWS SAM). The application uses Amazon API Gateway, Lambda, DynamoDB, DAX, and AWS CloudFormation to demonstrate how to access DAX from Lambda.
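
To make that concrete, here is a rough Node.js sketch of the pattern (the environment variable names and key shape are placeholders, and exact client options can vary between amazon-dax-client versions). The two Lambda-specific points it illustrates: the function has to run inside the DAX cluster’s VPC, and the client is created outside the handler so warm invocations reuse the connection.

// Placeholders throughout; the cluster endpoint and table name come from environment variables.
const AWS = require('aws-sdk');
const AmazonDaxClient = require('amazon-dax-client');

// Created once per container, outside the handler, so warm invocations reuse the DAX connection.
const dax = new AmazonDaxClient({
  endpoints: [process.env.DAX_ENDPOINT],   // the cluster's configuration endpoint (host:port)
  region: process.env.AWS_REGION,
});
const docClient = new AWS.DynamoDB.DocumentClient({ service: dax });

exports.handler = async (event) => {
  // Reads are served from the DAX cache when possible, falling back to DynamoDB on a miss.
  const result = await docClient.get({
    TableName: process.env.TABLE_NAME,
    Key: { id: event.id },
  }).promise();
  return result.Item;
};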

https://aws.amazon.com/pt/blogs/database/how-to-increase-performance-while-reducing-costs-by-using-amazon-dynamodb-accelerator-dax-and-aws-lambda/

Notes on structured concurrency, or: Go statement considered harmful

Every concurrency API needs a way to run code concurrently. Here are some examples of what that looks like using different APIs:

go myfunc(); // Golang
pthread_create(&thread_id, NULL, &myfunc); /* C with POSIX threads */
spawn(modulename, myfuncname, []) % Erlang
threading.Thread(target=myfunc).start() # Python with threads
asyncio.create_task(myfunc()) # Python with asyncio

There are lots of variations in the notation and terminology, but the semantics are the same: these all arrange for myfunc to start running concurrently with the rest of the program, and then return immediately so that the parent can do other things.

Another option is to use callbacks:

QObject::connect(&emitter, SIGNAL(event()), &receiver, SLOT(myfunc())) // C++ with Qt
g_signal_connect(emitter, "event", myfunc, NULL) /* C with GObject */
document.getElementById("myid").onclick = myfunc; // Javascript
promise.then(myfunc, errorhandler) // Javascript with Promises
deferred.addCallback(myfunc) # Python with Twisted
future.add_done_callback(myfunc) # Python with asyncio

Again, the notation varies, but these all accomplish the same thing: they arrange that from now on, if and when a certain event occurs, then myfunc will run. Then once they’ve set that up, they immediately return so the caller can do other things. (Sometimes callbacks get dressed up with fancy helpers like promise combinators, or Twisted-style protocols/transports, but the core idea is the same.)

And… that’s it. Take any real-world, general-purpose concurrency API, and you’ll probably find that it falls into one or the other of those buckets (or sometimes both, like asyncio).

https://vorpus.org/blog/notes-on-structured-concurrency-or-go-statement-considered-harmful/

Node.js APIs on AWS — the pros and cons of Express versus Serverless

Recently I have been playing around with Serverless + AWS Lambda and I have to say, I have been awestruck.

Over the past few years I have almost exclusively used Express and AWS EC2 (and more recently Docker) to build JavaScript REST APIs.

This piece outlines the pros and cons of Express and Serverless, and explains why it made sense for our team at Pilcro to switch over. It is aimed at tech teams looking to deploy and manage Node.js APIs on AWS (or similar).
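
To make the comparison concrete before you dive in, here is an illustrative side-by-side (not taken from the article) of the same trivial endpoint in both styles:

// Express on EC2 or in a Docker container: you provision, patch, and scale the server yourself.
const express = require('express');
const app = express();
app.get('/hello', (req, res) => res.json({ message: 'hello' }));
app.listen(3000);

// Serverless Framework + Lambda: the handler below gets wired to an API Gateway route
// in serverless.yml, and AWS runs and scales it per request.
module.exports.hello = async () => ({
  statusCode: 200,
  body: JSON.stringify({ message: 'hello' }),
});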

https://medium.freecodecamp.org/node-js-apis-on-aws-the-pros-and-cons-of-express-versus-serverless-a370ab7eadd7

How to escape async/await hell

async/await freed us from callback hell, but people have started abusing it — leading to the birth of async/await hell.

In this article, I will try to explain what async/await hell is, and I’ll also share some tips to escape it.

What is async/await hell?

While working with asynchronous JavaScript, people often write multiple statements one after the other and slap an await before each function call. This hurts performance, because often one statement doesn’t depend on the previous one, yet you still have to wait for it to complete.
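
A small illustrative sketch of the difference (the two helpers are hypothetical stand-ins for real async calls):

// Two independent async calls, simulated with timers.
const getPizzas = () => new Promise(resolve => setTimeout(() => resolve(['margherita']), 100));
const getDrinks = () => new Promise(resolve => setTimeout(() => resolve(['cola']), 100));

async function sequential() {
  // Each await blocks the next call, so this takes roughly the sum of both delays
  // even though the two requests don't depend on each other.
  const pizzas = await getPizzas();
  const drinks = await getDrinks();
  return [pizzas, drinks];
}

async function concurrent() {
  // Kick off both calls and await them together: total time is roughly the slower of the two.
  return Promise.all([getPizzas(), getDrinks()]);
}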

https://medium.freecodecamp.org/avoiding-the-async-await-hell-c77a0fb71c4c