AWS SAA-C03 #33

A company runs an online marketplace web application on AWS. The application serves hundreds of thousands of users during peak hours. The company needs a scalable, near-real-time solution to share the details of millions of financial transactions with several other internal applications. Transactions also need to be processed to remove sensitive data before being stored in a document database for low-latency retrieval.

What should a solutions architect recommend to meet these requirements?

A. Store the transactions data into Amazon DynamoDB. Set up a rule in DynamoDB to remove sensitive data from every transaction upon write. Use DynamoDB Streams to share the transactions data with other applications.

B. Stream the transactions data into Amazon Kinesis Data Firehose to store data in Amazon DynamoDB and Amazon S3. Use AWS Lambda integration with Kinesis Data Firehose to remove sensitive data. Other applications can consume the data stored in Amazon S3.

C. Stream the transactions data into Amazon Kinesis Data Streams. Use AWS Lambda integration to remove sensitive data from every transaction and then store the transactions data in Amazon DynamoDB. Other applications can consume the transactions data off the Kinesis data stream.

D. Store the batched transactions data in Amazon S3 as files. Use AWS Lambda to process every file and remove sensitive data before updating the files in Amazon S3. The Lambda function then stores the data in Amazon DynamoDB. Other applications can consume transaction files stored in Amazon S3.


The best option is C: stream the transactions data into Amazon Kinesis Data Streams.

Amazon Kinesis Data Streams can ingest millions of transactions and make them available for near-real-time processing, which is exactly what this scenario demands. An AWS Lambda function integrated with the stream can remove sensitive data from each transaction before storing it in Amazon DynamoDB, whose low-latency reads satisfy the document-database retrieval requirement. Because a Kinesis data stream supports multiple consumers, the other internal applications can read the transaction data directly off the stream and always see the latest records (a minimal sketch of the processing Lambda follows).
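As an illustration only, here is a minimal sketch of the Lambda processor in option C. The table name, the set of sensitive fields, and the event shape shown are assumptions for this example, not details given in the question.

```python
# Sketch of a Kinesis-triggered Lambda that strips sensitive fields and writes to DynamoDB.
import base64
import json
import os

import boto3

# Assumed table name and sensitive field names, for illustration only.
TABLE_NAME = os.environ.get("TABLE_NAME", "transactions")
SENSITIVE_FIELDS = {"card_number", "cvv", "ssn"}

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table(TABLE_NAME)


def handler(event, context):
    """Invoked by Kinesis Data Streams with a batch of records."""
    with table.batch_writer() as batch:
        for record in event["Records"]:
            # Kinesis delivers each payload base64-encoded.
            payload = base64.b64decode(record["kinesis"]["data"])
            transaction = json.loads(payload)

            # Drop sensitive attributes before persisting.
            sanitized = {k: v for k, v in transaction.items() if k not in SENSITIVE_FIELDS}

            batch.put_item(Item=sanitized)
```

Kinesis invokes the function with batches of records, so using the DynamoDB batch writer keeps the writes efficient at high transaction volumes.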

Options A, B, and D each fall short:

  • Option A: DynamoDB has no built-in rule mechanism for removing sensitive data on write, and DynamoDB Streams emit changes only after an item has been stored, so the sensitive data would already be persisted.
  • Option B: Kinesis Data Firehose does not support Amazon DynamoDB as a delivery destination, and sharing the data through objects in Amazon S3 is not near real time.
  • Option D: batching transactions into S3 files and processing them with Lambda is a batch workflow, not near-real-time processing.

Therefore, option C is the most suitable solution for this scenario.
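For completeness, here is a minimal sketch of how another internal application could consume the same stream. The stream name is an assumption, and only the first shard is read to keep the sketch short; a production consumer would more likely use the Kinesis Client Library or an enhanced fan-out consumer.

```python
# Sketch of a low-level polling consumer for the shared Kinesis data stream.
import json
import time

import boto3

kinesis = boto3.client("kinesis")

STREAM_NAME = "transactions"  # assumed stream name, for illustration only

# Read from the first shard only; real consumers iterate over all shards.
shard_id = kinesis.describe_stream(StreamName=STREAM_NAME)["StreamDescription"]["Shards"][0]["ShardId"]
iterator = kinesis.get_shard_iterator(
    StreamName=STREAM_NAME,
    ShardId=shard_id,
    ShardIteratorType="LATEST",
)["ShardIterator"]

while iterator:
    result = kinesis.get_records(ShardIterator=iterator, Limit=100)
    for record in result["Records"]:
        # boto3 returns the record payload as raw bytes.
        transaction = json.loads(record["Data"])
        print(transaction)
    iterator = result["NextShardIterator"]
    time.sleep(1)  # simple throttle to stay under per-shard read limits
```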
