Implementing DynamoDB triggers (streams) using CloudFormation
10 January 2018
In serverless architectures, as much of the implementation as possible should be event-driven. One way to achieve this is to use triggers wherever possible.
DynamoDB comes in very handy here, since it supports triggers through DynamoDB Streams. On the other end of a stream there is usually a Lambda function which processes the changed information asynchronously.
So I tried building that pattern and realized that it is not that straightforward to implement in CloudFormation.
Here is a visual overview of what I am building:
The first part of the CloudFormation template is the definition of the Lambda function which will receive the DynamoDB event stream.
This basically just implements an echo of all incoming information.
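A minimal sketch of such a function resource could look like the following. The resource names (`EchoFunction`, `EchoFunctionRole`) are my own placeholders, not taken from the original template; the inline handler simply prints the incoming event:

```yaml
EchoFunction:
  Type: AWS::Lambda::Function
  Properties:
    Handler: index.handler
    Runtime: python3.6
    Role: !GetAtt EchoFunctionRole.Arn
    Code:
      ZipFile: |
        import json

        def handler(event, context):
            # Echo every incoming stream record to CloudWatch Logs
            print(json.dumps(event))
```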
Now the role attached to this function needs the policy to read from the event stream.
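A sketch of such a role, assuming the table resource is called `MyTable`: the four `dynamodb:*` stream actions below are the ones a Lambda poller needs to read from a DynamoDB stream, and the managed `AWSLambdaBasicExecutionRole` policy covers CloudWatch logging:

```yaml
EchoFunctionRole:
  Type: AWS::IAM::Role
  Properties:
    AssumeRolePolicyDocument:
      Version: "2012-10-17"
      Statement:
        - Effect: Allow
          Principal:
            Service: lambda.amazonaws.com
          Action: sts:AssumeRole
    ManagedPolicyArns:
      - arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole
    Policies:
      - PolicyName: StreamReadPolicy
        PolicyDocument:
          Version: "2012-10-17"
          Statement:
            - Effect: Allow
              Action:
                - dynamodb:DescribeStream
                - dynamodb:GetRecords
                - dynamodb:GetShardIterator
                - dynamodb:ListStreams
              Resource: !GetAtt MyTable.StreamArn
```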
After setting up the receiving part, I needed to define the DynamoDB table. The only significant property here is the StreamSpecification: it enables the trigger and configures the trigger payload. In my case, I'm only interested in the new document. It is also possible to pass both the new and the old document around (see here).
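A table definition along these lines could look as follows (again, `MyTable` and the `id` key are placeholders of mine). `NEW_IMAGE` delivers only the new document; `NEW_AND_OLD_IMAGES` would deliver both:

```yaml
MyTable:
  Type: AWS::DynamoDB::Table
  Properties:
    AttributeDefinitions:
      - AttributeName: id
        AttributeType: S
    KeySchema:
      - AttributeName: id
        KeyType: HASH
    ProvisionedThroughput:
      ReadCapacityUnits: 1
      WriteCapacityUnits: 1
    StreamSpecification:
      StreamViewType: NEW_IMAGE
```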
Now comes the tricky part. To actually connect the Lambda function with the trigger, I had to introduce an "AWS::Lambda::EventSourceMapping" resource. This is the glue which connects both ends.
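Sticking with the placeholder names from above, the mapping could be sketched like this. It wires the table's stream ARN to the function; `StartingPosition` is required and controls whether existing records (`TRIM_HORIZON`) or only new ones (`LATEST`) are read:

```yaml
StreamMapping:
  Type: AWS::Lambda::EventSourceMapping
  Properties:
    EventSourceArn: !GetAtt MyTable.StreamArn
    FunctionName: !Ref EchoFunction
    StartingPosition: LATEST
    BatchSize: 100
```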
All of this combined results in a DynamoDB table which triggers a Lambda function on every change event. Filtering the event stream is only possible within the Lambda implementation.