AWS SAA-C03 #33

A company runs an online marketplace web application on AWS. The application serves hundreds of thousands of users during peak hours. The company needs a scalable, near-real-time solution to share the details of millions of financial transactions with several other internal applications. Transactions also need to be processed to remove sensitive data before being stored in a document database for low-latency retrieval.

What should a solutions architect recommend to meet these requirements?

A. Store the transactions data into Amazon DynamoDB. Set up a rule in DynamoDB to remove sensitive data from every transaction upon write. Use DynamoDB Streams to share the transactions data with other applications.

B. Stream the transactions data into Amazon Kinesis Data Firehose to store data in Amazon DynamoDB and Amazon S3. Use AWS Lambda integration with Kinesis Data Firehose to remove sensitive data. Other applications can consume the data stored in Amazon S3.

C. Stream the transactions data into Amazon Kinesis Data Streams. Use AWS Lambda integration to remove sensitive data from every transaction and then store the transactions data in Amazon DynamoDB. Other applications can consume the transactions data off the Kinesis data stream.

D. Store the batched transactions data in Amazon S3 as files. Use AWS Lambda to process every file and remove sensitive data before updating the files in Amazon S3. The Lambda function then stores the data in Amazon DynamoDB. Other applications can consume transaction files stored in Amazon S3.


The best option is C: stream the transactions data into Amazon Kinesis Data Streams.

Amazon Kinesis Data Streams can ingest millions of transactions and make them available to consumers in near real time, which is exactly what this scenario requires. A Lambda function integrated with the stream processes each transaction, removes the sensitive data, and writes the sanitized record to Amazon DynamoDB, whose single-digit-millisecond reads satisfy the low-latency retrieval requirement. Because a Kinesis data stream supports multiple consumers, the other internal applications can read the transaction data directly off the same stream.
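As a rough illustration, here is a minimal sketch of the sanitizing Lambda function. It assumes transactions arrive on the stream as JSON; the table name (`transactions`) and the sensitive field names are placeholders invented for this example:

```python
import base64
import json
from decimal import Decimal

import boto3

# Placeholder names for this sketch -- real values would come from
# the application's configuration.
TABLE_NAME = "transactions"
SENSITIVE_FIELDS = {"card_number", "cvv", "billing_address"}

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table(TABLE_NAME)


def handler(event, context):
    """Invoked by a Kinesis Data Streams event source mapping."""
    for record in event["Records"]:
        # Kinesis base64-encodes each record's payload.
        payload = base64.b64decode(record["kinesis"]["data"])
        # parse_float=Decimal because the boto3 DynamoDB resource
        # API rejects Python floats.
        transaction = json.loads(payload, parse_float=Decimal)

        # Strip sensitive attributes before the record is persisted.
        for field in SENSITIVE_FIELDS:
            transaction.pop(field, None)

        # Store the sanitized transaction for low-latency retrieval.
        table.put_item(Item=transaction)
```

Note that sanitization happens only on the storage path; the stream itself still carries the original records, so the near-real-time sharing with other applications is not delayed by this step.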

Options A, B, and D each have a disqualifying flaw:

  • Option A: DynamoDB has no built-in rule mechanism that removes sensitive data from items on write; that transformation logic must run outside the table.
  • Option B: Kinesis Data Firehose does not support DynamoDB as a delivery destination, and having other applications read objects from Amazon S3 is not a near-real-time sharing mechanism.
  • Option D: Batching transactions into S3 files and reprocessing them with Lambda is inherently batch-oriented, so it cannot meet the near-real-time requirement.

Therefore, option C is the most suitable solution for this scenario.
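For completeness, a rough sketch of how one of the other internal applications could poll the same stream directly with boto3 follows; the stream name is hypothetical, and a production consumer would typically use an event source mapping or the Kinesis Client Library (with enhanced fan-out if needed) rather than raw shard iterators:

```python
import json

import boto3

kinesis = boto3.client("kinesis")
STREAM_NAME = "transactions-stream"  # hypothetical stream name

# Read from the first shard only; real consumers rely on the KCL or an
# event source mapping to handle resharding and checkpointing.
shard_id = kinesis.describe_stream(StreamName=STREAM_NAME)[
    "StreamDescription"]["Shards"][0]["ShardId"]

iterator = kinesis.get_shard_iterator(
    StreamName=STREAM_NAME,
    ShardId=shard_id,
    ShardIteratorType="LATEST",  # start from new records
)["ShardIterator"]

response = kinesis.get_records(ShardIterator=iterator, Limit=100)
for record in response["Records"]:
    # boto3 returns Data as raw bytes, already base64-decoded.
    transaction = json.loads(record["Data"])
    print(transaction)
```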
