AWS SAA-C03 #33

A company runs an online marketplace web application on AWS. The application serves hundreds of thousands of users during peak hours. The company needs a scalable, near-real-time solution to share the details of millions of financial transactions with several other internal applications. Transactions also need to be processed to remove sensitive data before being stored in a document database for low-latency retrieval.

What should a solutions architect recommend to meet these requirements?

A. Store the transactions data into Amazon DynamoDB. Set up a rule in DynamoDB to remove sensitive data from every transaction upon write. Use DynamoDB Streams to share the transactions data with other applications.

B. Stream the transactions data into Amazon Kinesis Data Firehose to store data in Amazon DynamoDB and Amazon S3. Use AWS Lambda integration with Kinesis Data Firehose to remove sensitive data. Other applications can consume the data stored in Amazon S3.

C. Stream the transactions data into Amazon Kinesis Data Streams. Use AWS Lambda integration to remove sensitive data from every transaction and then store the transactions data in Amazon DynamoDB. Other applications can consume the transactions data off the Kinesis data stream.

D. Store the batched transactions data in Amazon S3 as files. Use AWS Lambda to process every file and remove sensitive data before updating the files in Amazon S3. The Lambda function then stores the data in Amazon DynamoDB. Other applications can consume transaction files stored in Amazon S3.


The best option is C: stream the transactions data into Amazon Kinesis Data Streams.

Amazon Kinesis Data Streams can ingest millions of transactions and make them available in near real time, which is the core requirement here. An AWS Lambda function integrated with the stream processes each transaction to remove sensitive data before writing it to Amazon DynamoDB, and DynamoDB's single-digit-millisecond reads satisfy the low-latency retrieval requirement. Because a Kinesis data stream supports multiple independent consumers, the other internal applications can read the same stream directly and always see the latest transactions.
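The processing step in option C can be sketched as a Kinesis-triggered Lambda function. This is a minimal illustration, not the official solution: the field names in `SENSITIVE_FIELDS` and the table name `transactions` are assumptions, and the real list of sensitive fields depends on the transaction schema.

```python
import base64
import json

# Hypothetical set of sensitive fields for illustration only.
SENSITIVE_FIELDS = {"card_number", "cvv", "ssn"}

def sanitize(transaction: dict) -> dict:
    """Return a copy of the transaction with sensitive fields removed."""
    return {k: v for k, v in transaction.items() if k not in SENSITIVE_FIELDS}

def lambda_handler(event, context):
    """Kinesis-triggered handler: decode each record, strip sensitive
    fields, and store the result in DynamoDB."""
    import boto3  # imported here so sanitize() is testable without AWS
    table = boto3.resource("dynamodb").Table("transactions")  # assumed table name
    for record in event["Records"]:
        # Kinesis delivers the payload base64-encoded in the event.
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        table.put_item(Item=sanitize(payload))
```

Note that only this Lambda function sees the raw records for the DynamoDB path; the other applications consume the stream through their own consumers, so whether they receive raw or sanitized data depends on where in the pipeline they attach.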

Options A, B, and D have certain limitations:

  • Option A: DynamoDB has no built-in rule mechanism to remove sensitive data from an item upon write.
  • Option B: Kinesis Data Firehose does not deliver to DynamoDB, and applications consuming the data from S3 would not get near-real-time, low-latency access.
  • Option D: Batch-processing files in S3 with Lambda would not provide near-real-time data sharing.

Therefore, option C is the most suitable solution for this scenario.
