Which configuration parameter directly affects the size of a Spark partition upon ingestion of data into Spark?
Correct Answer: A (spark.sql.files.maxPartitionBytes)
This is the correct answer because spark.sql.files.maxPartitionBytes is the configuration parameter that directly affects the size of a Spark partition upon ingestion of data into Spark. It sets the maximum number of bytes to pack into a single partition when reading files from file-based sources such as Parquet, JSON, and ORC. The default value is 128 MB, so each input partition will be roughly 128 MB in size unless the source consists of many small files or a single file that cannot be split further. References: Databricks Certified Data Engineer Professional, "Spark Configuration" section; Databricks Documentation, "Available Properties - spark.sql.files.maxPartitionBytes" section.
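For context, the sketch below shows how this parameter might be tuned before a read and how to observe its effect on partition count; the 32 MB value and the Parquet path are illustrative assumptions, not part of the exam question.

# Minimal PySpark sketch (session setup and path are illustrative assumptions).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("partition-size-demo").getOrCreate()

# Lower the maximum bytes packed into one input partition from the 128 MB
# default to 32 MB; smaller values produce more, smaller partitions on read.
spark.conf.set("spark.sql.files.maxPartitionBytes", str(32 * 1024 * 1024))

# Hypothetical Parquet path, used only for illustration.
df = spark.read.parquet("/mnt/raw/events")

# Inspect how many partitions the ingested DataFrame was split into.
print(df.rdd.getNumPartitions())

With the smaller setting, the same input files are read into more partitions, which can improve parallelism when individual files are large.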