Universal Containers (UC) requires two years of customer-related cases to be available in Salesforce for operational reporting. Any cases older than two years and up to seven years old need to be available on demand to the service agents. UC creates 5 million cases per year. Which two data archiving strategies should a data architect recommend? Choose 2 options:
Correct Answer: C,D
The best data archiving strategies for UC are to use big objects and Heroku with external objects. Big objects can store billions of records on the Salesforce platform in their own storage allocation, so archived cases do not count against standard data storage; they can be loaded with Apex or the Bulk API and queried with SOQL against the object's index. Heroku is a cloud platform that can host an external database and integrate with Salesforce through external objects, which give service agents on-demand access to the archived data via standard Salesforce APIs and user interfaces. Once the older cases have been copied out, using the Bulk API to hard delete them from Salesforce frees storage space and keeps operational reporting fast.
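To make the archive step concrete, here is a minimal Apex sketch, assuming a custom big object named Case_Archive__b with Case_Id__c, Subject__c, and Closed_Date__c fields (all hypothetical names, not part of the original question). It copies cases closed more than two years ago into the big object; the hard delete of the copied cases would then be run as a separate Bulk API job, as described above.

    // Minimal sketch: copy old closed Cases into a (hypothetical) big object.
    global class CaseArchiveBatch implements Database.Batchable<SObject> {

        global Database.QueryLocator start(Database.BatchableContext bc) {
            // Closed cases whose close date is more than two years old
            Datetime cutoff = Datetime.now().addYears(-2);
            return Database.getQueryLocator([
                SELECT Id, Subject, ClosedDate
                FROM Case
                WHERE IsClosed = true AND ClosedDate < :cutoff
            ]);
        }

        global void execute(Database.BatchableContext bc, List<Case> scope) {
            List<Case_Archive__b> archives = new List<Case_Archive__b>();
            for (Case c : scope) {
                archives.add(new Case_Archive__b(
                    Case_Id__c     = c.Id,
                    Subject__c     = c.Subject,
                    Closed_Date__c = c.ClosedDate
                ));
            }
            // insertImmediate writes big-object records outside the normal
            // transaction; they are not rolled back if later work fails.
            Database.insertImmediate(archives);
        }

        global void finish(Database.BatchableContext bc) {
            // The copied Case records are then removed with a Bulk API
            // hard-delete job (requires the "Bulk API Hard Delete" permission),
            // which frees standard data storage instead of leaving the records
            // in the Recycle Bin.
        }
    }

A batch like this could be scheduled to run nightly; hard deleting only after the copy succeeds avoids losing cases if the big-object write fails.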
Recent Comments (The most recent comments are at the top.)
haitran - Jan 23, 2025
I think B & C
haitran - Jan 06, 2025
B is definitely right because it's a recommended solution.
A cannot be right because over time, you'll hit a data limit.
C cannot be right because Heroku has a 20 million record limitation, and 5 years x 5 million cases is 25 million, which is over the limit.
D - big objects have enough capacity to handle the data volume.