AWS Data Streaming Pipeline


Inspired by AWS's recent Big Data blog post.

Here is a reference implementation of an AWS data streaming pipeline using Amazon DynamoDB, Kinesis Data Streams, Kinesis Data Firehose, Amazon S3, and Amazon Athena.

Amazon Kinesis Data Streams for DynamoDB lets you send data from DynamoDB tables directly to Kinesis data streams, without using Lambda or writing custom code.

Additionally, you can leverage this feature for use cases that require longer data retention on the stream and fan-out to multiple, concurrent stream readers.
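As a minimal sketch of turning the feature on, the AWS CLI can create the stream and enable it as a DynamoDB streaming destination. The table name `Orders`, stream name `orders-stream`, region, and account ID below are placeholders; substitute your own values.

```
# Create the Kinesis data stream that will receive the DynamoDB changes
aws kinesis create-stream --stream-name orders-stream --shard-count 1

# Point the DynamoDB table at the stream (placeholder ARN - use your
# account ID and region)
aws dynamodb enable-kinesis-streaming-destination \
  --table-name Orders \
  --stream-arn arn:aws:kinesis:us-east-1:123456789012:stream/orders-stream
```

Once enabled, every item-level change on the table is replicated to the stream, where Firehose (or any other consumer) can pick it up.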

Use AWS CloudFormation to provision:

  • A Kinesis data stream to replicate data from the DynamoDB table

  • An AWS Identity and Access Management (IAM) role

  • A destination S3 bucket to store and analyze data

  • A Kinesis Data Firehose delivery stream

Alternatively, you can set up the resources above individually and assign the appropriate IAM permissions.
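The four resources above can be sketched in a single CloudFormation template. This is a trimmed illustration, not the full template from the AWS post: the logical names (`OrdersStream`, `DestinationBucket`, `FirehoseRole`, `DeliveryStream`) are assumptions, and production templates would add encryption, buffering hints, and tighter IAM scoping.

```
AWSTemplateFormatVersion: "2010-09-09"
Description: Sketch of a DynamoDB -> Kinesis -> Firehose -> S3 pipeline

Resources:
  # Kinesis data stream that receives replicated DynamoDB changes
  OrdersStream:
    Type: AWS::Kinesis::Stream
    Properties:
      ShardCount: 1

  # Destination bucket for delivered records (queryable later with Athena)
  DestinationBucket:
    Type: AWS::S3::Bucket

  # IAM role Firehose assumes to read the stream and write to S3
  FirehoseRole:
    Type: AWS::IAM::Role
    Properties:
      AssumeRolePolicyDocument:
        Version: "2012-10-17"
        Statement:
          - Effect: Allow
            Principal: { Service: firehose.amazonaws.com }
            Action: sts:AssumeRole
      Policies:
        - PolicyName: firehose-access
          PolicyDocument:
            Version: "2012-10-17"
            Statement:
              - Effect: Allow
                Action:
                  - kinesis:DescribeStream
                  - kinesis:GetShardIterator
                  - kinesis:GetRecords
                  - kinesis:ListShards
                Resource: !GetAtt OrdersStream.Arn
              - Effect: Allow
                Action:
                  - s3:PutObject
                  - s3:GetBucketLocation
                  - s3:AbortMultipartUpload
                Resource:
                  - !GetAtt DestinationBucket.Arn
                  - !Sub "${DestinationBucket.Arn}/*"

  # Firehose delivery stream: Kinesis stream as source, S3 as destination
  DeliveryStream:
    Type: AWS::KinesisFirehose::DeliveryStream
    Properties:
      DeliveryStreamType: KinesisStreamAsSource
      KinesisStreamSourceConfiguration:
        KinesisStreamARN: !GetAtt OrdersStream.Arn
        RoleARN: !GetAtt FirehoseRole.Arn
      S3DestinationConfiguration:
        BucketARN: !GetAtt DestinationBucket.Arn
        RoleARN: !GetAtt FirehoseRole.Arn
```

Deploying this stack (for example with `aws cloudformation deploy`) gives you the stream, role, bucket, and delivery stream in one operation; the DynamoDB table itself is enabled as a source separately.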

About Author: Riyaz Ul Haque

Senior Software Engineer | AWS Community Builder
