## AWS Lambda Event Sources
<!--BEGIN STABILITY BANNER-->

---

![Stability: Stable](https://img.shields.io/badge/stability-Stable-success.svg?style=for-the-badge)

---
<!--END STABILITY BANNER-->

An event source mapping is an AWS Lambda resource that reads from an event source and invokes a Lambda function.
You can use event source mappings to process items from a stream or queue in services that don't invoke Lambda
functions directly. Lambda provides event source mappings for the following services. Read more about Lambda
event sources [here](https://docs.aws.amazon.com/lambda/latest/dg/invocation-eventsourcemapping.html).

This module includes classes that allow using various AWS services as event
sources for AWS Lambda via the high-level `lambda.addEventSource(source)` API.

NOTE: In most cases, it is also possible to use the resource APIs to invoke an
AWS Lambda function. This library provides a uniform API for all Lambda event
sources regardless of the underlying mechanism they use.

The following code sets up a Lambda function with an SQS queue event source:
```ts
const fn = new lambda.Function(this, 'MyFunction', { /* ... */ });

const queue = new sqs.Queue(this, 'MyQueue');
const eventSource = new SqsEventSource(queue);
fn.addEventSource(eventSource);

const eventSourceId = eventSource.eventSourceId;
```

The `eventSourceId` property contains the event source id. This will be a
[token](https://docs.aws.amazon.com/cdk/latest/guide/tokens.html) that will resolve to the final value at the time of
deployment.
### SQS

Amazon Simple Queue Service (Amazon SQS) allows you to build asynchronous
workflows. For more information about Amazon SQS, see [Amazon Simple Queue
Service](https://aws.amazon.com/sqs/). You can configure AWS Lambda to poll for messages as they arrive
in the queue and then pass the event to a Lambda function invocation. To view a sample event,
see [Amazon SQS Event](https://docs.aws.amazon.com/lambda/latest/dg/eventsources.html#eventsources-sqs).

To set up Amazon Simple Queue Service as an event source for AWS Lambda, you
first create or update an Amazon SQS queue and select custom values for the
queue parameters. The following parameters will impact Amazon SQS's polling
behavior:

* __visibilityTimeout__: May impact the period between retries.
* __receiveMessageWaitTime__: Will determine [long
  poll](https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-long-polling.html)
  duration. The default value is 20 seconds.

```ts
import sqs = require('@aws-cdk/aws-sqs');
import { SqsEventSource } from '@aws-cdk/aws-lambda-event-sources';
import { Duration } from '@aws-cdk/core';

const queue = new sqs.Queue(this, 'MyQueue', {
  visibilityTimeout: Duration.seconds(30),      // default
  receiveMessageWaitTime: Duration.seconds(20), // default
});

fn.addEventSource(new SqsEventSource(queue, {
  batchSize: 10, // default
}));
```
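On the consuming side, each invocation receives a batch of SQS records whose payload is in the `body` field. A minimal handler sketch against the documented event shape (the interfaces below are a hand-written subset, not official types):

```ts
interface SqsRecord {
  messageId: string;
  body: string;
}

interface SqsEvent {
  Records: SqsRecord[];
}

// Pulls the message bodies out of a batch. Kept pure so it is easy to unit test.
export function extractBodies(event: SqsEvent): string[] {
  return event.Records.map(record => record.body);
}

export const handler = async (event: SqsEvent): Promise<void> => {
  for (const body of extractBodies(event)) {
    // Replace with your own processing logic.
    console.log('processing message:', body);
  }
};
```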

### S3

You can write Lambda functions to process S3 bucket events, such as the
object-created or object-deleted events. For example, when a user uploads a
photo to a bucket, you might want Amazon S3 to invoke your Lambda function so
that it reads the image and creates a thumbnail for the photo.

You can use the bucket notification configuration feature in Amazon S3 to
configure the event source mapping, identifying the bucket events that you want
Amazon S3 to publish and which Lambda function to invoke.

```ts
import s3 = require('@aws-cdk/aws-s3');
import { S3EventSource } from '@aws-cdk/aws-lambda-event-sources';

const bucket = new s3.Bucket(this, 'MyBucket');

fn.addEventSource(new S3EventSource(bucket, {
  events: [ s3.EventType.OBJECT_CREATED, s3.EventType.OBJECT_REMOVED ],
  filters: [ { prefix: 'subdir/' } ], // optional
}));
```
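The notification delivered to the function identifies the bucket and key of each affected object; note that keys arrive URL-encoded (spaces become `+`). A handler-side sketch with a hand-written subset of the event shape:

```ts
interface S3EventRecord {
  eventName: string;
  s3: {
    bucket: { name: string };
    object: { key: string };
  };
}

interface S3Event {
  Records: S3EventRecord[];
}

// Returns the bucket/key pairs touched by a notification, decoding the
// URL-encoded object keys before returning them.
export function affectedObjects(event: S3Event): Array<{ bucket: string; key: string }> {
  return event.Records.map(r => ({
    bucket: r.s3.bucket.name,
    key: decodeURIComponent(r.s3.object.key.replace(/\+/g, ' ')),
  }));
}
```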

### SNS

You can write Lambda functions to process Amazon Simple Notification Service
notifications. When a message is published to an Amazon SNS topic, the service
can invoke your Lambda function by passing the message payload as a parameter.
Your Lambda function code can then process the event, for example by publishing the
message to other Amazon SNS topics or sending the message to other AWS services.

This also enables you to trigger a Lambda function in response to Amazon
CloudWatch alarms and other AWS services that use Amazon SNS.

For an example event, see [Appendix: Message and JSON
Formats](https://docs.aws.amazon.com/sns/latest/dg/json-formats.html) and
[Amazon SNS Sample
Event](https://docs.aws.amazon.com/lambda/latest/dg/eventsources.html#eventsources-sns).
For an example use case, see [Using AWS Lambda with Amazon SNS from Different
Accounts](https://docs.aws.amazon.com/lambda/latest/dg/with-sns.html).

```ts
import sns = require('@aws-cdk/aws-sns');
import { SnsEventSource } from '@aws-cdk/aws-lambda-event-sources';

const topic = new sns.Topic(this, 'MyTopic');

fn.addEventSource(new SnsEventSource(topic));
```

When a user calls the SNS Publish API on a topic that your Lambda function is
subscribed to, Amazon SNS will call Lambda to invoke your function
asynchronously. Lambda will then return a delivery status. If there was an error
calling Lambda, Amazon SNS will retry invoking the Lambda function up to three
times. After three tries, if Amazon SNS still could not successfully invoke the
Lambda function, then Amazon SNS will send a delivery status failure message to
CloudWatch.
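In the function itself, each SNS record carries the published payload as a string in `Sns.Message`. A sketch (the interface is a hand-written subset of the event shape, and treating the payload as JSON is an assumption about the publisher):

```ts
interface SnsEvent {
  Records: Array<{ Sns: { Subject?: string; Message: string } }>;
}

// Parses each record's payload as JSON. For plain-text publishers, use
// record.Sns.Message directly instead.
export function parseMessages(event: SnsEvent): unknown[] {
  return event.Records.map(record => JSON.parse(record.Sns.Message));
}
```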

### DynamoDB Streams

You can write Lambda functions to process change events from a DynamoDB Table. An event is emitted to a DynamoDB stream (if configured) whenever a write (Put, Delete, Update)
operation is performed against the table. See [Using AWS Lambda with Amazon DynamoDB](https://docs.aws.amazon.com/lambda/latest/dg/with-ddb.html) for more information about configuring Lambda function event sources with DynamoDB.

To process events with a Lambda function, first create or update a DynamoDB table and enable a `stream` specification. Then, create a `DynamoEventSource`
and add it to your Lambda function. The following parameters will impact Amazon DynamoDB's polling behavior:

* __batchSize__: Determines how many records are buffered before invoking your Lambda function - could impact your function's memory usage (if too high) and ability to keep up with incoming data velocity (if too low).
* __bisectBatchOnError__: If a batch encounters an error, this will cause the batch to be split in two and have each new smaller batch retried, allowing the records in error to be isolated.
* __maxBatchingWindow__: The maximum amount of time to gather records before invoking the Lambda function. This increases the likelihood of a full batch at the cost of delayed processing.
* __maxRecordAge__: The maximum age of a record that will be sent to the function for processing. Records that exceed the max age will be treated as failures.
* __onFailure__: In the event a record fails after all retries, or if the record age has exceeded the configured value, the record will be sent to the SQS queue or SNS topic that is specified here.
* __parallelizationFactor__: The number of batches to concurrently process on each shard.
* __retryAttempts__: The maximum number of times a record should be retried in the event of failure.
* __startingPosition__: Will determine where to begin consumption, either at the most recent (`LATEST`) record or the oldest record (`TRIM_HORIZON`). `TRIM_HORIZON` will ensure you process all available data, while `LATEST` will ignore all records that arrived prior to attaching the event source.

```ts
import dynamodb = require('@aws-cdk/aws-dynamodb');
import lambda = require('@aws-cdk/aws-lambda');
import sqs = require('@aws-cdk/aws-sqs');
import { DynamoEventSource, SqsDlq } from '@aws-cdk/aws-lambda-event-sources';

const table = new dynamodb.Table(this, 'MyTable', {
  partitionKey: { name: 'id', type: dynamodb.AttributeType.STRING },
  stream: dynamodb.StreamViewType.NEW_IMAGE, // make sure stream is configured
});

const deadLetterQueue = new sqs.Queue(this, 'deadLetterQueue');

const fn = new lambda.Function(this, 'MyFunction', { /* ... */ });
fn.addEventSource(new DynamoEventSource(table, {
  startingPosition: lambda.StartingPosition.TRIM_HORIZON,
  batchSize: 5,
  bisectBatchOnError: true,
  onFailure: new SqsDlq(deadLetterQueue),
  retryAttempts: 10,
}));
```
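Each stream record carries the change type in `eventName` and, with the `NEW_IMAGE` view used above, the item state after the write in `dynamodb.NewImage` (DynamoDB attribute-value format). A handler-side sketch with a hand-written subset of the record shape:

```ts
type AttributeValue = { S?: string; N?: string };

interface DynamoStreamRecord {
  eventName: 'INSERT' | 'MODIFY' | 'REMOVE';
  dynamodb?: { NewImage?: Record<string, AttributeValue> };
}

interface DynamoStreamEvent {
  Records: DynamoStreamRecord[];
}

// Collects the new item images from insert events; under the NEW_IMAGE view,
// REMOVE records carry no NewImage.
export function insertedImages(event: DynamoStreamEvent): Array<Record<string, AttributeValue>> {
  return event.Records
    .filter(r => r.eventName === 'INSERT' && r.dynamodb && r.dynamodb.NewImage)
    .map(r => r.dynamodb!.NewImage!);
}
```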

### Kinesis

You can write Lambda functions to process streaming data in Amazon Kinesis Data Streams. For more information about Amazon Kinesis, see [Amazon Kinesis
Data Streams](https://aws.amazon.com/kinesis/data-streams/). To learn more about configuring Lambda function event sources with Kinesis and view a sample event,
see [Amazon Kinesis Event](https://docs.aws.amazon.com/lambda/latest/dg/with-kinesis.html).

To set up Amazon Kinesis as an event source for AWS Lambda, you
first create or update an Amazon Kinesis stream and select custom values for the
event source parameters. The following parameters will impact Amazon Kinesis's polling
behavior:

* __batchSize__: Determines how many records are buffered before invoking your Lambda function - could impact your function's memory usage (if too high) and ability to keep up with incoming data velocity (if too low).
* __bisectBatchOnError__: If a batch encounters an error, this will cause the batch to be split in two and have each new smaller batch retried, allowing the records in error to be isolated.
* __maxBatchingWindow__: The maximum amount of time to gather records before invoking the Lambda function. This increases the likelihood of a full batch at the cost of possibly delaying processing.
* __maxRecordAge__: The maximum age of a record that will be sent to the function for processing. Records that exceed the max age will be treated as failures.
* __onFailure__: In the event a record fails and consumes all retries, the record will be sent to the SQS queue or SNS topic that is specified here.
* __parallelizationFactor__: The number of batches to concurrently process on each shard.
* __retryAttempts__: The maximum number of times a record should be retried in the event of failure.
* __startingPosition__: Will determine where to begin consumption, either at the most recent (`LATEST`) record or the oldest record (`TRIM_HORIZON`). `TRIM_HORIZON` will ensure you process all available data, while `LATEST` will ignore all records that arrived prior to attaching the event source.

```ts
import lambda = require('@aws-cdk/aws-lambda');
import kinesis = require('@aws-cdk/aws-kinesis');
import { KinesisEventSource } from '@aws-cdk/aws-lambda-event-sources';

const stream = new kinesis.Stream(this, 'MyStream');

myFunction.addEventSource(new KinesisEventSource(stream, {
  batchSize: 100, // default
  startingPosition: lambda.StartingPosition.TRIM_HORIZON,
}));
```
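Inside the function, each Kinesis record's payload arrives base64-encoded in `kinesis.data`. A decoding sketch (the interface is a hand-written subset of the event shape):

```ts
interface KinesisEvent {
  Records: Array<{ kinesis: { partitionKey: string; data: string } }>;
}

// Decodes each record's base64-encoded payload to a UTF-8 string.
export function decodePayloads(event: KinesisEvent): string[] {
  return event.Records.map(r => Buffer.from(r.kinesis.data, 'base64').toString('utf8'));
}
```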

## Roadmap

Eventually, this module will support all the event sources described under
[Supported Event
Sources](https://docs.aws.amazon.com/lambda/latest/dg/invoking-lambda-function.html)
in the AWS Lambda Developer Guide.