Snowflake COF-C02 SnowPro Core Certification Exam Online Training
The questions for COF-C02 were last updated on Dec 06, 2023.
- Exam Code: COF-C02
- Exam Name: SnowPro Core Certification Exam
- Certification Provider: Snowflake
- Latest update: Dec 06, 2023
How would you determine the size of the virtual warehouse used for a task?
- A . Since the root task may execute concurrently (i.e., multiple instances), it is recommended to leave some margin in the execution window to avoid missed instances of execution
- B . Querying (SELECT) the stream content would help determine the warehouse size. For example, if the query returns a large volume of stream content, use a larger warehouse size
- C . If using a stored procedure to execute multiple SQL statements, it’s best to test-run the stored procedure separately first in order to size the compute resources
- D . Since task infrastructure is based on running the task body on schedule, it’s recommended to configure the virtual warehouse for automatic concurrency handling using Multi-cluster warehouse (MCW) to match the task schedule
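For context on what is being sized, a task in Snowflake can run either on a user-managed virtual warehouse or on serverless (Snowflake-managed) compute. A minimal sketch of both forms, where the task, warehouse, and procedure names (`my_task`, `my_wh`, `my_proc`) are hypothetical:

```sql
-- Task on a user-managed warehouse: you pick the warehouse size up front
CREATE OR REPLACE TASK my_task
  WAREHOUSE = my_wh
  SCHEDULE = '5 MINUTE'
AS
  CALL my_proc();

-- Serverless alternative: Snowflake manages and resizes the compute,
-- starting from the hinted initial size
CREATE OR REPLACE TASK my_task
  USER_TASK_MANAGED_INITIAL_WAREHOUSE_SIZE = 'XSMALL'
  SCHEDULE = '5 MINUTE'
AS
  CALL my_proc();
```

In the user-managed form, test-running the task body (e.g., the stored procedure) on warehouses of different sizes is a practical way to choose the `WAREHOUSE` setting.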
True or False: Loading data into Snowflake requires that source data files be no larger than 16MB.
- A . True
- B . False
By default, COPY INTO <location> statements separate table data into a set of output files to take advantage of parallel operations. The maximum size for each file is set using the MAX_FILE_SIZE copy option. The default value is 16777216 (16 MB), but it can be increased to accommodate larger files. The maximum supported file size is 5 GB for Amazon S3, Google Cloud Storage, or Microsoft Azure stages. To unload data to a single output file (at the potential cost of decreased performance), specify the SINGLE = TRUE copy option in your statement. You can optionally specify a name for the file in the path.
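The unload options described above might look like the following sketch; the stage and table names (`@my_stage`, `my_table`) are hypothetical:

```sql
-- Parallel unload: multiple output files, each up to ~100 MB
COPY INTO @my_stage/unload/
  FROM my_table
  FILE_FORMAT = (TYPE = CSV)
  MAX_FILE_SIZE = 104857600;  -- 100 MB; default is 16777216 (16 MB)

-- Single-file unload with an explicit file name
-- (potentially slower, since parallelism is lost)
COPY INTO @my_stage/unload/data.csv
  FROM my_table
  FILE_FORMAT = (TYPE = CSV)
  SINGLE = TRUE;
```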
What are two ways to create and manage Data Shares in Snowflake? (Choose two.)
- A . Via the Snowflake Web Interface (UI)
- B . Via the data_share=true parameter
- C . Via SQL commands
- D . Via Virtual Warehouses
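As an illustration of the SQL route, a hedged sketch of creating and populating a share; all object and account names (`my_share`, `my_db`, `consumer_acct`) are hypothetical:

```sql
-- Create an empty share, then grant it access to specific objects
CREATE SHARE my_share;
GRANT USAGE ON DATABASE my_db TO SHARE my_share;
GRANT USAGE ON SCHEMA my_db.public TO SHARE my_share;
GRANT SELECT ON TABLE my_db.public.my_table TO SHARE my_share;

-- Make the share visible to a consumer account
ALTER SHARE my_share ADD ACCOUNTS = consumer_acct;
```

The same operations can be performed through the Snowflake web interface, which is the other supported method of creating and managing Data Shares.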