NTO needs to extract 50 million records from a custom object every day from its Salesforce org. NTO is facing query timeout issues while extracting these records. What should a data architect recommend in order to get around the timeout issue?
Correct Answer: C
The best way to extract 50 million records from a custom object every day from a Salesforce org without hitting query timeouts is to use an ETL tool. ETL stands for extract, transform, and load, and refers to the process of moving data from one system to another. An ETL tool is a software application that can connect to various data sources, perform data transformations, and load the data into a target destination. ETL tools can handle large volumes of data efficiently and reliably, and they often provide features such as scheduling, monitoring, error handling, and logging. Creating a custom auto number field plus a formula field and using them to chunk records during extraction is a possible workaround, but it requires adding fields and writing more complex queries (a sketch of this approach follows below). The REST API can extract data because it automatically chunks records by 200, but it has limitations, such as a maximum of 50 million records per query job. Asking Salesforce support to increase the query timeout value is not feasible because query timeout values are not configurable.
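For illustration only, the chunking workaround mentioned above could look roughly like the following sketch, which slices the extract into bounded SOQL queries against the standard REST query endpoint using the requests library. The object name Custom_Object__c, the numeric field Record_Number__c (a formula field derived from an auto number), the instance URL, the access token, and the chunk size are all assumptions, not values from the question.

```python
# Hypothetical sketch of chunked extraction via the Salesforce REST query endpoint.
# Assumptions: Custom_Object__c is the custom object, Record_Number__c is a numeric
# formula field based on an auto number, and a valid OAuth access token is available.
import requests

INSTANCE_URL = "https://yourInstance.my.salesforce.com"  # assumption
ACCESS_TOKEN = "00D..."                                   # assumption
API_VERSION = "v58.0"
CHUNK_SIZE = 250_000  # records per chunk; tune so each query finishes well under the timeout


def extract_chunk(start, end):
    """Query one bounded slice of the custom object and return its records."""
    soql = (
        "SELECT Id, Name FROM Custom_Object__c "
        f"WHERE Record_Number__c >= {start} AND Record_Number__c < {end}"
    )
    url = f"{INSTANCE_URL}/services/data/{API_VERSION}/query"
    headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}
    params = {"q": soql}
    records = []
    while True:
        resp = requests.get(url, headers=headers, params=params)
        resp.raise_for_status()
        payload = resp.json()
        records.extend(payload["records"])
        if payload.get("done"):
            return records
        # Follow nextRecordsUrl to page through the remaining results of this chunk.
        url = f"{INSTANCE_URL}{payload['nextRecordsUrl']}"
        params = None


if __name__ == "__main__":
    total = 50_000_000
    for start in range(0, total, CHUNK_SIZE):
        chunk = extract_chunk(start, start + CHUNK_SIZE)
        # Hand each chunk off to the downstream transform/load step here.
        print(f"chunk {start}-{start + CHUNK_SIZE}: {len(chunk)} records")
```

This sketch shows why the workaround is more effort than using an ETL tool: the chunk boundaries, pagination, retries, and downstream loading all have to be built and maintained by hand, which is exactly what a dedicated ETL tool provides out of the box.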