AWS SAA-C03 #33

A company runs an online marketplace web application on AWS. The application serves hundreds of thousands of users during peak hours. The company needs a scalable, near-real-time solution to share the details of millions of financial transactions with several other internal applications. Transactions also need to be processed to remove sensitive data before being stored in a document database for low-latency retrieval.

What should a solutions architect recommend to meet these requirements?

A. Store the transactions data into Amazon DynamoDB. Set up a rule in DynamoDB to remove sensitive data from every transaction upon write. Use DynamoDB Streams to share the transactions data with other applications.

B. Stream the transactions data into Amazon Kinesis Data Firehose to store data in Amazon DynamoDB and Amazon S3. Use AWS Lambda integration with Kinesis Data Firehose to remove sensitive data. Other applications can consume the data stored in Amazon S3.

C. Stream the transactions data into Amazon Kinesis Data Streams. Use AWS Lambda integration to remove sensitive data from every transaction and then store the transactions data in Amazon DynamoDB. Other applications can consume the transactions data off the Kinesis data stream.

D. Store the batched transactions data in Amazon S3 as files. Use AWS Lambda to process every file and remove sensitive data before updating the files in Amazon S3. The Lambda function then stores the data in Amazon DynamoDB. Other applications can consume transaction files stored in Amazon S3.


The best option is C: stream the transactions data into Amazon Kinesis Data Streams.

Amazon Kinesis Data Streams can absorb millions of transactions and fan them out to multiple consumers in near real time, which is exactly what this scenario requires. A Lambda function integrated with the stream can strip sensitive data from each transaction before writing it to Amazon DynamoDB, whose single-digit-millisecond reads satisfy the low-latency retrieval requirement. Meanwhile, the other internal applications can consume the transactions directly off the Kinesis data stream, so every application sees the latest data.
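As a rough illustration of the Lambda step, the sketch below redacts sensitive fields from each Kinesis record before the item would be written to DynamoDB. The field names, the `transactions` table, and the exact event layout are assumptions for illustration, not details given in the question.

```python
import base64
import json

# Hypothetical sensitive fields -- the real list depends on the
# marketplace's transaction schema.
SENSITIVE_FIELDS = {"card_number", "cvv", "ssn"}

def redact(transaction: dict) -> dict:
    """Return a copy of the transaction without sensitive fields."""
    return {k: v for k, v in transaction.items() if k not in SENSITIVE_FIELDS}

def handler(event, context):
    """Process a batch of Kinesis records (payloads arrive base64-encoded).

    Returns the redacted items for illustration; a real handler would
    instead write each item to DynamoDB, e.g. with
    boto3.resource("dynamodb").Table("transactions").put_item(Item=item).
    """
    items = []
    for record in event["Records"]:
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        items.append(redact(payload))
    return items
```

In a real deployment the function would be wired to the stream through a Lambda event source mapping, which delivers records in the batched shape shown above.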

Options A, B, and D have certain limitations:

  • Option A: DynamoDB has no built-in rule mechanism to transform or remove data on write; redaction would have to happen before the item is stored.
  • Option B: Kinesis Data Firehose does not support Amazon DynamoDB as a delivery destination, and having other applications read batched objects from S3 provides neither near-real-time sharing nor low-latency retrieval.
  • Option D: Batching files into S3 and reprocessing them with Lambda is a batch workflow, not near-real-time processing.

Therefore, option C is the most suitable solution for this scenario.
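For the consuming applications, a minimal polling sketch is shown below. The stream name and payload layout are assumptions; note that with boto3 each record's `Data` field arrives as raw bytes (the SDK handles the base64 layer), so payloads parse directly as JSON.

```python
import json

def parse_records(get_records_response: dict) -> list:
    """Decode transaction payloads from a Kinesis GetRecords response.

    Assumes each record's "Data" is UTF-8 JSON bytes, as boto3 returns it.
    """
    return [json.loads(r["Data"]) for r in get_records_response["Records"]]

# Sketch of the polling loop a consumer might run (requires boto3 and AWS
# credentials; the stream name "transactions" is an assumption):
#
#   kinesis = boto3.client("kinesis")
#   shard_it = kinesis.get_shard_iterator(
#       StreamName="transactions", ShardId="shardId-000000000000",
#       ShardIteratorType="LATEST")["ShardIterator"]
#   while True:
#       resp = kinesis.get_records(ShardIterator=shard_it, Limit=100)
#       for txn in parse_records(resp):
#           ...  # hand the transaction to the application
#       shard_it = resp["NextShardIterator"]
```

In practice, consumers at this scale would more likely use a Lambda event source mapping or the Kinesis Client Library rather than hand-rolled shard iteration, but the data flow is the same.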
