AWS SAA-C03 #33

A company runs an online marketplace web application on AWS. The application serves hundreds of thousands of users during peak hours. The company needs a scalable, near-real-time solution to share the details of millions of financial transactions with several other internal applications. Transactions also need to be processed to remove sensitive data before being stored in a document database for low-latency retrieval.

What should a solutions architect recommend to meet these requirements?

A. Store the transactions data into Amazon DynamoDB. Set up a rule in DynamoDB to remove sensitive data from every transaction upon write. Use DynamoDB Streams to share the transactions data with other applications.

B. Stream the transactions data into Amazon Kinesis Data Firehose to store data in Amazon DynamoDB and Amazon S3. Use AWS Lambda integration with Kinesis Data Firehose to remove sensitive data. Other applications can consume the data stored in Amazon S3.

C. Stream the transactions data into Amazon Kinesis Data Streams. Use AWS Lambda integration to remove sensitive data from every transaction and then store the transactions data in Amazon DynamoDB. Other applications can consume the transactions data off the Kinesis data stream.

D. Store the batched transactions data in Amazon S3 as files. Use AWS Lambda to process every file and remove sensitive data before updating the files in Amazon S3. The Lambda function then stores the data in Amazon DynamoDB. Other applications can consume transaction files stored in Amazon S3.


The best option is C: stream the transaction data into Amazon Kinesis Data Streams.

Amazon Kinesis Data Streams can ingest millions of transactions and deliver them to consumers in near real time, which is exactly what this scenario demands. An AWS Lambda function attached to the stream can process each transaction, strip out the sensitive data, and store the sanitized record in Amazon DynamoDB, whose single-digit-millisecond reads satisfy the low-latency retrieval requirement. Because a Kinesis data stream supports multiple consumers, the other internal applications can read the same transactions directly off the stream.
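As a concrete illustration, here is a minimal sketch of the Lambda consumer in this pipeline. The table name, the sensitive field names, and the record schema are hypothetical placeholders, not values taken from the question:

```python
import base64
import json

import boto3

dynamodb = boto3.resource("dynamodb")
# Hypothetical table name for illustration.
table = dynamodb.Table("transactions")

# Hypothetical field names; the real sensitive attributes depend on the schema.
SENSITIVE_FIELDS = {"card_number", "cvv", "ssn"}


def handler(event, context):
    """Invoked via a Kinesis Data Streams event source mapping."""
    for record in event["Records"]:
        # Kinesis delivers each record's payload base64-encoded.
        payload = base64.b64decode(record["kinesis"]["data"])
        transaction = json.loads(payload)

        # Remove sensitive attributes before persisting.
        sanitized = {k: v for k, v in transaction.items()
                     if k not in SENSITIVE_FIELDS}

        # Store the sanitized transaction for low-latency retrieval.
        table.put_item(Item=sanitized)
```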

Options A, B, and D each fall short:

  • Option A: DynamoDB has no built-in rule mechanism that removes or transforms attributes on write; sanitization must happen before the item is stored.
  • Option B: Kinesis Data Firehose does not support DynamoDB as a delivery destination, and having other applications consume the data from Amazon S3 is neither near-real-time nor low-latency.
  • Option D: Batching transactions into S3 files and processing them with Lambda is a batch workflow, so it cannot meet the near-real-time sharing requirement.

Therefore, option C is the most suitable solution for this scenario.
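On the producer side, the marketplace application would put each transaction onto the stream. A minimal sketch, assuming a hypothetical stream name and a transaction_id field used as the partition key:

```python
import json

import boto3

kinesis = boto3.client("kinesis")


def publish_transaction(transaction: dict) -> None:
    # "transactions-stream" is a hypothetical stream name.
    kinesis.put_record(
        StreamName="transactions-stream",
        Data=json.dumps(transaction).encode("utf-8"),
        # Partitioning by transaction ID spreads records across shards.
        PartitionKey=str(transaction["transaction_id"]),
    )
```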
