AWS SAA-C03 #33

A company runs an online marketplace web application on AWS. The application serves hundreds of thousands of users during peak hours. The company needs a scalable, near-real-time solution to share the details of millions of financial transactions with several other internal applications. Transactions also need to be processed to remove sensitive data before being stored in a document database for low-latency retrieval.

What should a solutions architect recommend to meet these requirements?

A. Store the transactions data into Amazon DynamoDB. Set up a rule in DynamoDB to remove sensitive data from every transaction upon write. Use DynamoDB Streams to share the transactions data with other applications.

B. Stream the transactions data into Amazon Kinesis Data Firehose to store data in Amazon DynamoDB and Amazon S3. Use AWS Lambda integration with Kinesis Data Firehose to remove sensitive data. Other applications can consume the data stored in Amazon S3.

C. Stream the transactions data into Amazon Kinesis Data Streams. Use AWS Lambda integration to remove sensitive data from every transaction and then store the transactions data in Amazon DynamoDB. Other applications can consume the transactions data off the Kinesis data stream.

D. Store the batched transactions data in Amazon S3 as files. Use AWS Lambda to process every file and remove sensitive data before updating the files in Amazon S3. The Lambda function then stores the data in Amazon DynamoDB. Other applications can consume transaction files stored in Amazon S3.


The best option is C: stream the transactions data into Amazon Kinesis Data Streams.

Amazon Kinesis Data Streams can ingest millions of transactions at scale and make them available to consumers in near real time, which is exactly what this scenario requires. An AWS Lambda function triggered by the stream can remove sensitive data from each transaction before writing the sanitized record to Amazon DynamoDB, which provides the low-latency retrieval the company needs. Because a Kinesis data stream supports multiple concurrent consumers, the other internal applications can read the transaction data directly off the stream without waiting for it to land in a data store.
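As an illustration only, a minimal Lambda handler for this pattern might look like the sketch below. It assumes a hypothetical DynamoDB table named transactions and hypothetical sensitive field names such as card_number and cvv; both would need to match the real transaction schema.

```python
import base64
import json
from decimal import Decimal

import boto3

# Hypothetical table name and sensitive field names -- adjust to your schema.
TABLE_NAME = "transactions"
SENSITIVE_FIELDS = {"card_number", "cvv", "ssn"}

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table(TABLE_NAME)


def handler(event, context):
    """Invoked by a Kinesis Data Streams event source mapping.

    Each record's payload arrives base64-encoded; decode it, drop the
    sensitive fields, and persist the sanitized transaction to DynamoDB.
    """
    for record in event["Records"]:
        raw = base64.b64decode(record["kinesis"]["data"])
        # parse_float=Decimal because the DynamoDB resource API rejects Python floats.
        payload = json.loads(raw, parse_float=Decimal)
        sanitized = {k: v for k, v in payload.items() if k not in SENSITIVE_FIELDS}
        table.put_item(Item=sanitized)
```

In this arrangement the event source mapping handles batching, checkpointing, and retries, so the function only needs to express the redaction and the write to DynamoDB.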

Options A, B, and D have certain limitations:

  • Option A: DynamoDB has no built-in rule mechanism that removes or transforms sensitive data on write; that logic would have to run somewhere else before the item is stored.
  • Option B: Kinesis Data Firehose does not deliver directly to DynamoDB (its destinations are services such as Amazon S3, Amazon Redshift, and Amazon OpenSearch Service), and having the other applications read objects from S3 does not give them the transaction details in near real time.
  • Option D: Batching transactions into S3 files and processing each file with Lambda is a batch workflow, so it cannot meet the near-real-time processing and sharing requirements.

Therefore, option C is the most suitable solution for this scenario.
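For completeness, the other internal applications could read the same records straight from the stream. The sketch below uses the low-level boto3 Kinesis API with a hypothetical stream name (transactions-stream) and reads only the first shard; production consumers would more commonly use the Kinesis Client Library, enhanced fan-out, or their own Lambda event source mappings.

```python
import json

import boto3

STREAM_NAME = "transactions-stream"  # hypothetical stream name
kinesis = boto3.client("kinesis")

# Pick the first shard and start from the oldest available record.
shards = kinesis.describe_stream(StreamName=STREAM_NAME)["StreamDescription"]["Shards"]
iterator = kinesis.get_shard_iterator(
    StreamName=STREAM_NAME,
    ShardId=shards[0]["ShardId"],
    ShardIteratorType="TRIM_HORIZON",
)["ShardIterator"]

# Fetch a batch of records and print the transaction payloads.
response = kinesis.get_records(ShardIterator=iterator, Limit=100)
for record in response["Records"]:
    print(json.loads(record["Data"]))
```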
