Valid DP-200 dumps shared by ExamDiscuss.com to help you pass the DP-200 exam! ExamDiscuss.com now offers the newest DP-200 exam dumps; the exam questions have been updated and the answers corrected. Get the newest ExamDiscuss.com DP-200 dumps with the Test Engine here:
You are monitoring the Data Factory pipeline that runs from Cosmos DB to SQL Database for Race Central. You discover that the job takes 45 minutes to run. What should you do to improve the performance of the job?
Correct Answer: B
Explanation

Performance tuning tips and optimization features: in some cases, when you run a copy activity in Azure Data Factory, you see a "Performance tuning tips" message at the top of the copy activity monitoring view. The message identifies the bottleneck for the given copy run and guides you on what to change to boost copy throughput. The performance tuning tips currently provide suggestions such as:
* Use PolyBase when you copy data into Azure SQL Data Warehouse.
* Increase Azure Cosmos DB Request Units or Azure SQL Database DTUs (Database Throughput Units) when the resource on the data store side is the bottleneck.
* Remove the unnecessary staged copy.

References: https://docs.microsoft.com/en-us/azure/data-factory/copy-activity-performance

Topic 4, ADatum Corporation Case Study

Overview
ADatum Corporation is a retailer that sells products through two sales channels: retail stores and a website.

Existing Environment
ADatum has one database server that has Microsoft SQL Server 2016 installed. The server hosts three mission-critical databases named SALESDB, DOCDB, and REPORTINGDB.
SALESDB collects data from the stores and the website.
DOCDB stores documents that connect to the sales data in SALESDB. The documents are stored in two different JSON formats based on the sales channel.
REPORTINGDB stores reporting data and contains several columnstore indexes. A daily process creates reporting data in REPORTINGDB from the data in SALESDB. The process is implemented as a SQL Server Integration Services (SSIS) package that runs a stored procedure from SALESDB.

Requirements

Planned Changes
ADatum plans to move the current data infrastructure to Azure. The new infrastructure has the following requirements:
* Migrate SALESDB and REPORTINGDB to an Azure SQL database.
* Migrate DOCDB to Azure Cosmos DB.
* The sales data, including the documents in JSON format, must be gathered as it arrives and analyzed online by using Azure Stream Analytics. The analytic process will perform aggregations that must be done continuously, without gaps, and without overlapping.
* As they arrive, all the sales documents in JSON format must be transformed into one consistent format.
* Azure Data Factory will replace the SSIS process of copying the data from SALESDB to REPORTINGDB.

Technical Requirements
The new Azure data infrastructure must meet the following technical requirements:
* Data in SALESDB must be encrypted by using Transparent Data Encryption (TDE). The encryption must use your own key.
* SALESDB must be restorable to any given minute within the past three weeks.
* Real-time processing must be monitored to ensure that workloads are sized properly based on actual usage patterns.
* Missing indexes must be created automatically for REPORTINGDB.
* Disk IO, CPU, and memory usage must be monitored for SALESDB.
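The case study requires transforming sales documents that arrive in two different JSON formats into one consistent format. A minimal Python sketch of that kind of normalization follows; the source field names (storeOrderId, webOrderRef, cart) and the target schema are illustrative assumptions, not taken from the case study:

```python
def normalize_document(doc: dict) -> dict:
    """Map either of two assumed source JSON shapes onto one consistent schema."""
    if "storeOrderId" in doc:
        # Assumed retail-store document shape (hypothetical)
        return {
            "order_id": doc["storeOrderId"],
            "channel": "store",
            "amount": float(doc["totalAmount"]),
        }
    # Assumed website document shape (hypothetical)
    return {
        "order_id": doc["webOrderRef"],
        "channel": "web",
        "amount": float(doc["cart"]["total"]),
    }

store_doc = {"storeOrderId": "S-1", "totalAmount": "19.99"}
web_doc = {"webOrderRef": "W-7", "cart": {"total": 42.0}}
print(normalize_document(store_doc))  # {'order_id': 'S-1', 'channel': 'store', 'amount': 19.99}
print(normalize_document(web_doc))    # {'order_id': 'W-7', 'channel': 'web', 'amount': 42.0}
```

In the actual architecture this mapping would run in the ingestion path (for example, before the documents reach Azure Stream Analytics), so that downstream aggregations see a single schema.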
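The requirement that aggregations run continuously, without gaps, and without overlapping describes a tumbling window in Azure Stream Analytics. A rough Python sketch of the tumbling-window idea, using fixed-size, non-overlapping buckets (the sample events and the 10-second window size are assumptions for illustration):

```python
from collections import defaultdict

def tumbling_window_sums(events, window_seconds):
    """Group (timestamp, value) events into fixed, non-overlapping windows
    and sum the values per window; every event lands in exactly one window."""
    sums = defaultdict(float)
    for ts, value in events:
        window_start = (ts // window_seconds) * window_seconds
        sums[window_start] += value
    return dict(sums)

events = [(0, 1.0), (3, 2.0), (10, 5.0), (19, 1.0), (20, 4.0)]
print(tumbling_window_sums(events, 10))
# {0: 3.0, 10: 6.0, 20: 4.0} -- windows [0,10), [10,20), [20,30)
```

In Stream Analytics itself this behavior comes from the built-in `TumblingWindow` function in the query language; the sketch above only shows why tumbling windows satisfy the "no gaps, no overlap" wording.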