## AWS Lambda Event Sources

---

![Stability: Stable](https://img.shields.io/badge/stability-Stable-success.svg?style=for-the-badge)

---

An event source mapping is an AWS Lambda resource that reads from an event source and invokes a Lambda function.
You can use event source mappings to process items from a stream or queue in services that don't invoke Lambda
functions directly. Lambda provides event source mappings for the following services. Read more about Lambda
event sources [here](https://docs.aws.amazon.com/lambda/latest/dg/invocation-eventsourcemapping.html).

This module includes classes that allow using various AWS services as event
sources for AWS Lambda via the high-level `lambda.addEventSource(source)` API.

NOTE: In most cases, it is also possible to use the resource APIs to invoke an
AWS Lambda function. This library provides a uniform API for all Lambda event
sources regardless of the underlying mechanism they use.

The following code sets up a Lambda function with an SQS queue event source:

```ts
const fn = new lambda.Function(this, 'MyFunction', { /* ... */ });

const queue = new sqs.Queue(this, 'MyQueue');
const eventSource = new SqsEventSource(queue);
fn.addEventSource(eventSource);

const eventSourceId = eventSource.eventSourceId;
```

The `eventSourceId` property contains the event source ID. This will be a
[token](https://docs.aws.amazon.com/cdk/latest/guide/tokens.html) that resolves to the final value at the time of
deployment.

### SQS

Amazon Simple Queue Service (Amazon SQS) allows you to build asynchronous
workflows. For more information about Amazon SQS, see the [Amazon Simple Queue
Service Developer Guide](https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/welcome.html).
You can configure AWS Lambda to poll for these messages as they arrive
and then pass the event to a Lambda function invocation. To view a sample event,
see [Amazon SQS Event](https://docs.aws.amazon.com/lambda/latest/dg/eventsources.html#eventsources-sqs).

To set up Amazon Simple Queue Service as an event source for AWS Lambda, you
first create or update an Amazon SQS queue and select custom values for the
queue parameters. The following parameters will impact Amazon SQS's polling
behavior:

* __visibilityTimeout__: May impact the period between retries.
* __receiveMessageWaitTime__: Will determine [long
  poll](https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-long-polling.html)
  duration. The default value is 20 seconds.

```ts
import sqs = require('@aws-cdk/aws-sqs');
import { SqsEventSource } from '@aws-cdk/aws-lambda-event-sources';
import { Duration } from '@aws-cdk/core';

const queue = new sqs.Queue(this, 'MyQueue', {
  visibilityTimeout: Duration.seconds(30),      // default
  receiveMessageWaitTime: Duration.seconds(20), // default
});

fn.addEventSource(new SqsEventSource(queue, {
  batchSize: 10 // default
}));
```

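On the function side, the handler receives the messages in batches. A minimal sketch of batch processing, using a trimmed-down assumption of the real SQS event shape (the actual event also carries attributes, message attributes, and receipt handles):

```typescript
// Trimmed-down shape of the SQS event a function receives (assumption; the
// real event carries more fields, such as attributes and receiptHandle).
interface SqsRecord {
  messageId: string;
  body: string;
}

interface SqsEvent {
  Records: SqsRecord[];
}

// Process each record in the batch. If the handler throws, the entire batch
// becomes visible again after the queue's visibility timeout and is retried.
function handleSqsBatch(event: SqsEvent): string[] {
  return event.Records.map(record => {
    // Hypothetical processing: parse a JSON body and pick out a field.
    const payload = JSON.parse(record.body);
    return `${record.messageId}: ${payload.action}`;
  });
}
```

Because a thrown error fails the whole batch, handlers are typically written to be idempotent.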
### S3

You can write Lambda functions to process S3 bucket events, such as the
object-created or object-deleted events. For example, when a user uploads a
photo to a bucket, you might want Amazon S3 to invoke your Lambda function so
that it reads the image and creates a thumbnail for the photo.

You can use the bucket notification configuration feature in Amazon S3 to
configure the event source mapping, identifying the bucket events that you want
Amazon S3 to publish and which Lambda function to invoke.

```ts
import s3 = require('@aws-cdk/aws-s3');
import { S3EventSource } from '@aws-cdk/aws-lambda-event-sources';

const bucket = new s3.Bucket(this, 'MyBucket');

fn.addEventSource(new S3EventSource(bucket, {
  events: [ s3.EventType.OBJECT_CREATED, s3.EventType.OBJECT_REMOVED ],
  filters: [ { prefix: 'subdir/' } ] // optional
}));
```

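Each notification record identifies the bucket and object key. Note that keys arrive URL-encoded, with `+` standing in for spaces, so they need decoding before being passed to the S3 API. A sketch against a trimmed-down assumption of the real event shape:

```typescript
// Trimmed-down shape of an S3 notification record (assumption; the real
// event also carries eventName, object size, and more).
interface S3Event {
  Records: { s3: { bucket: { name: string }; object: { key: string } } }[];
}

// Return "bucket/key" pairs with the key decoded. S3 URL-encodes keys in
// notifications and uses '+' for spaces, so decode before using the key.
function objectPaths(event: S3Event): string[] {
  return event.Records.map(r => {
    const key = decodeURIComponent(r.s3.object.key.replace(/\+/g, ' '));
    return `${r.s3.bucket.name}/${key}`;
  });
}
```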
### SNS

You can write Lambda functions to process Amazon Simple Notification Service
notifications. When a message is published to an Amazon SNS topic, the service
can invoke your Lambda function by passing the message payload as a parameter.
Your Lambda function code can then process the event, for example, publish the
message to other Amazon SNS topics, or send the message to other AWS services.

This also enables you to trigger a Lambda function in response to Amazon
CloudWatch alarms and other AWS services that use Amazon SNS.

For an example event, see [Appendix: Message and JSON
Formats](https://docs.aws.amazon.com/sns/latest/dg/json-formats.html) and
[Amazon SNS Sample
Event](https://docs.aws.amazon.com/lambda/latest/dg/eventsources.html#eventsources-sns).
For an example use case, see [Using AWS Lambda with Amazon SNS from Different
Accounts](https://docs.aws.amazon.com/lambda/latest/dg/with-sns.html).

```ts
import sns = require('@aws-cdk/aws-sns');
import { SnsEventSource } from '@aws-cdk/aws-lambda-event-sources';

const topic = new sns.Topic(this, 'MyTopic');

fn.addEventSource(new SnsEventSource(topic));
```

When a user calls the SNS Publish API on a topic that your Lambda function is
subscribed to, Amazon SNS will call Lambda to invoke your function
asynchronously. Lambda will then return a delivery status. If there was an error
calling Lambda, Amazon SNS will retry invoking the Lambda function up to three
times. After three tries, if Amazon SNS still could not successfully invoke the
Lambda function, then Amazon SNS will send a delivery status failure message to
CloudWatch.

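The message payload itself arrives as a string under the record's `Sns.Message` field; if the publisher sent JSON, the handler parses it itself. A sketch against a trimmed-down assumption of the real event shape:

```typescript
// Trimmed-down shape of an SNS event record (assumption; the real event also
// carries MessageId, TopicArn, Timestamp, MessageAttributes, etc.).
interface SnsEvent {
  Records: { Sns: { Subject?: string; Message: string } }[];
}

// Extract and parse each published message. SNS always delivers the payload
// as a plain string, so JSON bodies must be parsed by the handler.
function snsMessages(event: SnsEvent): unknown[] {
  return event.Records.map(r => JSON.parse(r.Sns.Message));
}
```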
### DynamoDB Streams

You can write Lambda functions to process change events from a DynamoDB Table. An event is emitted to a DynamoDB stream (if configured) whenever a write (Put, Delete, Update)
operation is performed against the table. See [Using AWS Lambda with Amazon DynamoDB](https://docs.aws.amazon.com/lambda/latest/dg/with-ddb.html) for more information about configuring Lambda function event sources with DynamoDB.

To process events with a Lambda function, first create or update a DynamoDB table and enable a `stream` specification. Then, create a `DynamoEventSource`
and add it to your Lambda function. The following parameters will impact Amazon DynamoDB's polling behavior:

* __batchSize__: Determines how many records are buffered before invoking your Lambda function - could impact your function's memory usage (if too high) and ability to keep up with incoming data velocity (if too low).
* __bisectBatchOnError__: If a batch encounters an error, this will cause the batch to be split in two and have each new smaller batch retried, allowing the records in error to be isolated.
* __maxBatchingWindow__: The maximum amount of time to gather records before invoking the Lambda function. This increases the likelihood of a full batch at the cost of delayed processing.
* __maxRecordAge__: The maximum age of a record that will be sent to the function for processing. Records that exceed the max age will be treated as failures.
* __onFailure__: In the event a record fails after all retries or if the record age has exceeded the configured value, the record will be sent to the SQS queue or SNS topic that is specified here.
* __parallelizationFactor__: The number of batches to concurrently process on each shard.
* __retryAttempts__: The maximum number of times a record should be retried in the event of failure.
* __startingPosition__: Will determine where to begin consumption, either at the most recent ('LATEST') record or the oldest record ('TRIM_HORIZON'). 'TRIM_HORIZON' will ensure you process all available data, while 'LATEST' will ignore all records that arrived prior to attaching the event source.

```ts
import dynamodb = require('@aws-cdk/aws-dynamodb');
import lambda = require('@aws-cdk/aws-lambda');
import sqs = require('@aws-cdk/aws-sqs');
import { DynamoEventSource, SqsDlq } from '@aws-cdk/aws-lambda-event-sources';

const table = new dynamodb.Table(this, 'Table', {
  partitionKey: { name: 'id', type: dynamodb.AttributeType.STRING },
  stream: dynamodb.StreamViewType.NEW_IMAGE // make sure stream is configured
});

const deadLetterQueue = new sqs.Queue(this, 'deadLetterQueue');

const fn = new lambda.Function(this, 'MyFunction', { /* ... */ });
fn.addEventSource(new DynamoEventSource(table, {
  startingPosition: lambda.StartingPosition.TRIM_HORIZON,
  batchSize: 5,
  bisectBatchOnError: true,
  onFailure: new SqsDlq(deadLetterQueue),
  retryAttempts: 10
}));
```

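With `StreamViewType.NEW_IMAGE`, each stream record carries the item as it looks after the write, in DynamoDB's attribute-value encoding (`{ S: '...' }`, `{ N: '...' }`, and so on). A sketch that unwraps new images from a trimmed-down assumption of the real record shape, covering only string and number attributes:

```typescript
// Trimmed-down DynamoDB attribute value (assumption; the real encoding also
// has B, BOOL, L, M, etc.).
type AttributeValue = { S?: string; N?: string };

// Trimmed-down stream record shape (assumption; real records also carry
// eventID, eventName, Keys, and sequence numbers).
interface DynamoStreamEvent {
  Records: { dynamodb: { NewImage?: Record<string, AttributeValue> } }[];
}

// Unwrap each NEW_IMAGE into a plain object. Note DynamoDB delivers numbers
// as strings ({ N: '42' }), so they must be converted explicitly.
function newImages(event: DynamoStreamEvent): Record<string, string | number>[] {
  return event.Records
    .filter(r => r.dynamodb.NewImage !== undefined)
    .map(r => {
      const out: Record<string, string | number> = {};
      for (const [name, value] of Object.entries(r.dynamodb.NewImage!)) {
        if (value.S !== undefined) out[name] = value.S;
        else if (value.N !== undefined) out[name] = Number(value.N);
      }
      return out;
    });
}
```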
### Kinesis

You can write Lambda functions to process streaming data in Amazon Kinesis Streams. For more information about Amazon Kinesis, see [Amazon Kinesis
Service](https://aws.amazon.com/kinesis/data-streams/). To learn more about configuring Lambda function event sources with Kinesis and view a sample event,
see [Amazon Kinesis Event](https://docs.aws.amazon.com/lambda/latest/dg/with-kinesis.html).

To set up Amazon Kinesis as an event source for AWS Lambda, you
first create or update an Amazon Kinesis stream and select custom values for the
event source parameters. The following parameters will impact Amazon Kinesis's polling
behavior:

* __batchSize__: Determines how many records are buffered before invoking your Lambda function - could impact your function's memory usage (if too high) and ability to keep up with incoming data velocity (if too low).
* __bisectBatchOnError__: If a batch encounters an error, this will cause the batch to be split in two and have each new smaller batch retried, allowing the records in error to be isolated.
* __maxBatchingWindow__: The maximum amount of time to gather records before invoking the Lambda function. This increases the likelihood of a full batch at the cost of possibly delaying processing.
* __maxRecordAge__: The maximum age of a record that will be sent to the function for processing. Records that exceed the max age will be treated as failures.
* __onFailure__: In the event a record fails and consumes all retries, the record will be sent to the SQS queue or SNS topic that is specified here.
* __parallelizationFactor__: The number of batches to concurrently process on each shard.
* __retryAttempts__: The maximum number of times a record should be retried in the event of failure.
* __startingPosition__: Will determine where to begin consumption, either at the most recent ('LATEST') record or the oldest record ('TRIM_HORIZON'). 'TRIM_HORIZON' will ensure you process all available data, while 'LATEST' will ignore all records that arrived prior to attaching the event source.

```ts
import lambda = require('@aws-cdk/aws-lambda');
import kinesis = require('@aws-cdk/aws-kinesis');
import { KinesisEventSource } from '@aws-cdk/aws-lambda-event-sources';

const stream = new kinesis.Stream(this, 'MyStream');

myFunction.addEventSource(new KinesisEventSource(stream, {
  batchSize: 100, // default
  startingPosition: lambda.StartingPosition.TRIM_HORIZON
}));
```

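Kinesis delivers each record's payload base64-encoded under the record's `kinesis.data` field, so handlers decode it before use. A minimal sketch against a trimmed-down assumption of the real record shape, assuming the Node.js runtime for `Buffer`:

```typescript
// Trimmed-down shape of a Kinesis event record (assumption; real records
// also carry partitionKey, sequenceNumber, approximateArrivalTimestamp, etc.).
interface KinesisEvent {
  Records: { kinesis: { data: string } }[];
}

// Decode each record's base64-encoded payload into a UTF-8 string.
function kinesisPayloads(event: KinesisEvent): string[] {
  return event.Records.map(r =>
    Buffer.from(r.kinesis.data, 'base64').toString('utf8')
  );
}
```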
## Roadmap

Eventually, this module will support all the event sources described under
[Supported Event
Sources](https://docs.aws.amazon.com/lambda/latest/dg/invoking-lambda-function.html)
in the AWS Lambda Developer Guide.