snowpro_core_dumps

The document contains a series of questions and answers related to Snowflake's features, architecture, and usage, including clustering keys, virtual warehouse scaling policies, and data management practices. Key topics include the types of tables, mechanisms for reducing storage costs, and the handling of data shares and network policies. It also addresses the implications of Snowflake's design for workloads and data loading processes.


Snowflake provides a mechanism for its customers to override its natural clustering algorithms.

This method is:


A. Micro-partitions
B. Clustering keys
C. Key partitions
D. Clustered partitions

Correct Answer: B
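For reference, a clustering key is declared with CLUSTER BY, either at table creation or afterwards. A minimal sketch (table and column names are illustrative):

```sql
-- Define a clustering key on an existing table
ALTER TABLE sales CLUSTER BY (sale_date, region);

-- Or declare it at creation time
CREATE TABLE sales (
    sale_date DATE,
    region    VARCHAR,
    amount    NUMBER(12,2)
) CLUSTER BY (sale_date, region);

-- Remove the clustering key
ALTER TABLE sales DROP CLUSTERING KEY;
```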

Which of the following are valid Snowflake Virtual Warehouse Scaling Policies? (Choose two.)
A. Custom
B. Economy
C. Optimized
D. Standard

Correct Answer: BD

True or False: A single database can exist in more than one Snowflake account.
A. True
B. False

Correct Answer: B

Which of the following roles is recommended to be used to create and manage users and roles?
A. SYSADMIN
B. SECURITYADMIN
C. PUBLIC
D. ACCOUNTADMIN

Correct Answer: B

Question #5 Topic 1
True or False: Bulk unloading of data from Snowflake supports the use of a SELECT statement.

A. True
B. False

Correct Answer: A

Select the different types of Internal Stages: (Choose three.)


A. Named Stage Most Voted
B. User Stage Most Voted
C. Table Stage Most Voted
D. Schema Stage

Correct Answer: ABC

True or False: A customer using SnowSQL / native connectors will be unable to also use the Snowflake
Web Interface (UI) unless access to the UI is explicitly granted by support.
A. True
B. False

Correct Answer: B

Question #8 Topic 1
Account-level storage usage can be monitored via:
A. The Snowflake Web Interface (UI) in the Databases section
B. The Snowflake Web Interface (UI) in the Account -> Billing & Usage section
C. The Information Schema -> ACCOUNT_USAGE_HISTORY View
D. The Account Usage Schema -> ACCOUNT_USAGE_METRICS View

Correct Answer: B

Credit Consumption by the Compute Layer (Virtual Warehouses) is based on: (Choose two.)
A. Number of users
B. Warehouse size
C. Amount of data processed
D. # of clusters for the Warehouse Most Voted

Correct Answer: BD

Which statement best describes `clustering`?


A. Clustering represents the way data is grouped together and stored within Snowflake's micro-partitions
B. The database administrator must define the clustering methodology for each Snowflake table
C. The clustering key must be included on the COPY command when loading data into Snowflake
D. Clustering can be disabled within a Snowflake account

Correct Answer: A

True or False: The COPY command must specify a File Format in order to execute.
A. True
B. False Most Voted

Correct Answer: B

Which of the following commands sets the Virtual Warehouse for a session?
A. COPY WAREHOUSE FROM <<config file>>;
B. SET WAREHOUSE = <<warehouse name>>;
C. USE WAREHOUSE <<warehouse name>>;
D. USE VIRTUAL_WAREHOUSE <<warehouse name>>;

Correct Answer: C
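The session warehouse is set with USE WAREHOUSE. A quick sketch (the warehouse name is illustrative):

```sql
-- Set the Virtual Warehouse for the current session
USE WAREHOUSE my_wh;

-- Verify which warehouse is active
SELECT CURRENT_WAREHOUSE();
```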

Which of the following objects can be cloned? (Choose four.)


A. Tables Most Voted
B. Named File Formats Most Voted
C. Schemas Most Voted
D. Shares
E. Databases Most Voted
F. Users

Correct Answer: ABCE

Which object allows you to limit the number of credits consumed within a Snowflake account?
A. Account Usage Tracking
B. Resource Monitor
C. Warehouse Limit Parameter
D. Credit Consumption Tracker

Correct Answer: B
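A Resource Monitor caps credit consumption with a quota and threshold-based triggers. A sketch, with illustrative names and values:

```sql
-- Create a monitor with a monthly credit quota
CREATE RESOURCE MONITOR monthly_limit
  WITH CREDIT_QUOTA = 100
  FREQUENCY = MONTHLY
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS ON 90 PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND;

-- Attach it to a warehouse
ALTER WAREHOUSE my_wh SET RESOURCE_MONITOR = monthly_limit;
```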
Question #15 Topic 1
Snowflake is designed for which type of workloads? (Choose two.)
A. OLAP (Analytics) workloads Most Voted
B. OLTP (Transactional) workloads
C. Concurrent workloads Most Voted
D. On-premise workloads

Correct Answer: AC

Question #16 Topic 1


What are the three layers that make up Snowflake's architecture? (Choose three.)
A. Compute Most Voted
B. Tri-Secret Secure
C. Storage Most Voted
D. Cloud Services Most Voted

Correct Answer: ACD

Question #17 Topic 1


Why would a customer size a Virtual Warehouse from an X-Small to a Medium?
A. To accommodate more queries
B. To accommodate more users
C. To accommodate fluctuations in workload
D. To accommodate a more complex workload Most Voted

Correct Answer: D

Question #18 Topic 1


True or False: Reader Accounts incur no additional Compute costs to the Data Provider since they are
simply reading the shared data without making changes.
A. True
B. False Most Voted

Correct Answer: B

Question #19 Topic 1


Which of the following connectors allow Multi-Factor Authentication (MFA) authorization when
connecting? (Choose all that apply.)
A. JDBC Most Voted
B. SnowSQL Most Voted
C. Snowflake Web Interface (UI) Most Voted
D. ODBC Most Voted
E. Python Most Voted

Correct Answer: ABCDE

Question #20 Topic 1


True or False: Snowflake charges a premium for storing semi-structured data.
A. True
B. False Most Voted

Correct Answer: B
Which of the following statements describes a benefit of Snowflake's separation of compute and storage?
(Choose all that apply.)
A. Growth of storage and compute are tightly coupled together
B. Storage expands without the requirement to add more compute Most Voted
C. Compute can be scaled up or down without the requirement to add more storage Most Voted
D. Multiple compute clusters can access stored data without contention Most Voted

Correct Answer: BCD

Question #22 Topic 1


True or False: It is possible to unload structured data to semi-structured formats such as JSON and
Parquet.
A. True Most Voted
B. False

Correct Answer: A

Question #23 Topic 1


In which layer of its architecture does Snowflake store its metadata statistics?
A. Storage Layer
B. Compute Layer
C. Database Layer
D. Cloud Services Layer Most Voted

Correct Answer: D

Question #24 Topic 1


True or False: Data in fail-safe can be deleted by a user or the Snowflake team before it expires.
A. True
B. False Most Voted

Correct Answer: B

Question #25 Topic 1


True or False: Snowflake's data warehouse was built from the ground up for the cloud in lieu of using an
existing database or a platform, like Hadoop, as a base.
A. True Most Voted
B. False Most Voted

Correct Answer: A

Question #26 Topic 1


Which of the following statements are true of Virtual Warehouses? (Choose all that apply.)
A. Customers can change the size of the Warehouse after creation Most Voted
B. A Warehouse can be resized while running Most Voted
C. A Warehouse can be configured to suspend after a period of inactivity
D. A Warehouse can be configured to auto-resume when new queries are submitted

Correct Answer: ABCD

Question #27 Topic 1


The PUT command: (Choose two.)
A. Automatically creates a File Format object
B. Automatically uses the last Stage created
C. Automatically compresses files using Gzip Most Voted
D. Automatically encrypts files Most Voted
Correct Answer: CD

Question #28 Topic 1


Which type of table corresponds to a single Snowflake session?
A. Temporary Most Voted
B. Transient
C. Provisional
D. Permanent

Correct Answer: A

Question #29 Topic 1


Which interfaces can be used to create and/or manage Virtual Warehouses?
A. The Snowflake Web Interface (UI)
B. SQL commands
C. Data integration tools
D. All of the above Most Voted

Correct Answer: D

Question #30 Topic 1


When a Pipe is recreated using the CREATE OR REPLACE PIPE command:
A. The Pipe load history is reset to empty Most Voted
B. The REFRESH parameter is set to TRUE
C. Previously loaded files will be ignored
D. All of the above

Correct Answer: A
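A pipe is recreated as below; the replacement pipe starts with an empty load history (object names are illustrative):

```sql
-- Recreate a pipe; its load history is reset to empty
CREATE OR REPLACE PIPE my_pipe
  AUTO_INGEST = TRUE
  AS COPY INTO my_table
     FROM @my_stage
     FILE_FORMAT = (TYPE = 'CSV');
```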

What is the minimum Snowflake edition that customers planning on storing protected information in
Snowflake should consider for regulatory compliance?

A. Standard
B. Premier
C. Enterprise
D. Business Critical Edition Most Voted

Correct Answer: D

Question #32 Topic 1


Select the three types of tables that exist within Snowflake. (Choose three.)
A. Temporary Most Voted
B. Transient Most Voted
C. Provisional
D. Permanent Most Voted

Correct Answer: ABD

Question #33 Topic 1


True or False: Snowpipe via REST API can only reference External Stages as source.
A. True
B. False Most Voted

Correct Answer: B

Question #34 Topic 1


True or False: A third-party tool that supports standard JDBC or ODBC but has no Snowflake-specific
driver will be unable to connect to Snowflake.
A. True Most Voted
B. False Most Voted

Correct Answer: B

Question #35 Topic 1


True or False: It is possible to load data into Snowflake without creating a named File Format object.
A. True Most Voted
B. False

Correct Answer: A
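Format options can be given inline on the COPY command, so no named File Format object is required. A sketch (stage, table, and option values are illustrative):

```sql
-- Inline format options instead of a named File Format object
COPY INTO my_table
  FROM @my_stage/data/
  FILE_FORMAT = (TYPE = 'CSV' FIELD_DELIMITER = ',' SKIP_HEADER = 1);
```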

Question #36 Topic 1


True or False: A table in Snowflake can only be queried using the Virtual Warehouse that was used to
load the data.

A. True
B. False Most Voted

Correct Answer: B

Question #37 Topic 1


Which of the following statements are true of Snowflake data loading? (Choose three.)
A. VARIANT "null" values are not the same as SQL NULL values Most Voted
B. It is recommended to do frequent, single row DMLs
C. It is recommended to validate the data before loading into the Snowflake target table Most Voted
D. It is recommended to use staging tables to manage MERGE statements Most Voted

Correct Answer: ACD

Question #38 Topic 1


Which statements are true of micro-partitions? (Choose two.)
A. They are approximately 16MB in size Most Voted
B. They are stored compressed only if COMPRESS=TRUE on Table
C. They are immutable Most Voted
D. They are only encrypted in the Enterprise edition and above

Correct Answer: AC

Question #39 Topic 1


True or False: Query ID's are unique across all Snowflake deployments and can be used in
communication with Snowflake Support to help troubleshoot issues.
A. True Most Voted
B. False Most Voted

Correct Answer: A

Question #40 Topic 1


A deterministic query is run at 8am, takes 5 minutes, and the results are cached. Which of the following
statements are true? (Choose two.)
A. The exact query will ALWAYS return the precomputed result set for the RESULT_CACHE_ACTIVE
= time period
B. The same exact query will return the precomputed results if the underlying data hasn't changed and
the results were last accessed within previous 24 hour period Most Voted
C. The same exact query will return the precomputed results even if the underlying data has changed as
long as the results were last accessed within the previous 24 hour period
D. The "24 hour" timer on the precomputed results gets renewed every time the exact query is executed Most Voted
Correct Answer: BD

Increasing the maximum number of clusters in a Multi-Cluster Warehouse is an example of:


A. Scaling rhythmically
B. Scaling max
C. Scaling out Most Voted
D. Scaling up

Correct Answer: C

Question #42 Topic 1


Which statement best describes Snowflake tables?
A. Snowflake tables are logical representations of underlying physical data Most Voted
B. Snowflake tables are the physical instantiation of data loaded into Snowflake
C. Snowflake tables require that clustering keys be defined to perform optimally
D. Snowflake tables are owned by a user

Correct Answer: A

Question #43 Topic 1


Which item in the Data Warehouse migration process does not apply in Snowflake?
A. Migrate Users
B. Migrate Schemas
C. Migrate Indexes Most Voted
D. Build the Data Pipeline

Correct Answer: C

Question #44 Topic 1


Snowflake provides two mechanisms to reduce data storage costs for short-lived tables. These
mechanisms are: (Choose two.)
A. Temporary Tables Most Voted
B. Transient Tables Most Voted
C. Provisional Tables
D. Permanent Tables

Correct Answer: AB
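The two short-lived table types are created with the TEMPORARY and TRANSIENT keywords. A sketch (table and column names are illustrative):

```sql
-- Temporary: exists only for the session; no Fail-safe
CREATE TEMPORARY TABLE scratch_results (id NUMBER, v VARCHAR);

-- Transient: persists across sessions, but no Fail-safe and limited Time Travel
CREATE TRANSIENT TABLE staging_events (id NUMBER, payload VARIANT);
```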

Question #45 Topic 1


What is the maximum compressed row size in Snowflake?
A. 8KB
B. 16MB
C. 50MB
D. 4000GB

Correct Answer: B

Question #46 Topic 1


Which of the following are main sections of the top navigation of the Snowflake Web Interface (UI)?
(Choose three.)
A. Databases Most Voted
B. Tables
C. Warehouses Most Voted
D. Worksheets Most Voted

Correct Answer: ACD

Question #47 Topic 1


What is the recommended Snowflake data type to store semi-structured data like JSON?
A. VARCHAR
B. RAW
C. LOB
D. VARIANT Most Voted

Correct Answer: D

Question #48 Topic 1


Which of the following statements are true of Snowflake releases: (Choose two.)
A. They happen approximately weekly Most Voted
B. They roll up and release approximately monthly, but customers can request early release application
C. During a release, new customer requests/queries/connections transparently move over to the newer
version Most Voted
D. A customer is assigned a 30 minute window (that can be moved anytime within a week) during which
the system will be unavailable and customer is upgraded

Correct Answer: AC

Question #49 Topic 1


Which of the following are common use cases for zero-copy cloning? (Choose three.)
A. Quick provisioning of Dev and Test/QA environments Most Voted
B. Data backups Most Voted
C. Point in time snapshots Most Voted
D. Performance optimization

Correct Answer: ABC
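Zero-copy clones are created with the CLONE keyword, optionally combined with Time Travel for point-in-time snapshots. A sketch (object names and the timestamp are illustrative):

```sql
-- Quickly provision a Dev environment from production
CREATE DATABASE dev_db CLONE prod_db;

-- Point-in-time snapshot of a table using Time Travel
CREATE TABLE orders_snapshot CLONE orders
  AT (TIMESTAMP => '2023-01-01 00:00:00'::TIMESTAMP_LTZ);
```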

Question #50 Topic 1


If a Small Warehouse is made up of 2 servers/cluster, how many servers/cluster make up a Medium
Warehouse?
A. 4 Most Voted
B. 16
C. 32
D. 128

Correct Answer: A

True or False: When a data share is established between a Data Provider and a Data Consumer, the Data
Consumer can extend that data share to other Data
Consumers.
A. True
B. False

Correct Answer: B

Question #52 Topic 1


Which is true of Snowflake network policies? A Snowflake network policy: (Choose two.)
A. Is available to all Snowflake Editions Most Voted
B. Is only available to customers with Business Critical Edition
C. Restricts or enables access to specific IP addresses Most Voted
D. Is activated using an "ALTER DATABASE" command

Correct Answer: AC
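A network policy is created with IP lists and then activated at the account (or user) level; ALTER DATABASE plays no part. A sketch (names and IP ranges are illustrative):

```sql
-- Create a policy restricting access to specific IP addresses
CREATE NETWORK POLICY office_only
  ALLOWED_IP_LIST = ('192.168.1.0/24')
  BLOCKED_IP_LIST = ('192.168.1.99');

-- Activate it at the account level
ALTER ACCOUNT SET NETWORK_POLICY = office_only;
```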

Question #53 Topic 1


True or False: Snowflake charges additional fees to Data Providers for each Share they create.
A. True
B. False Most Voted

Correct Answer: B

Question #54 Topic 1


Query results are stored in the Result Cache for how long after they are last accessed, assuming no data
changes have occurred?
A. 1 Hour
B. 3 Hours
C. 12 hours
D. 24 hours

Correct Answer: D

Question #55 Topic 1


A role is created and owns 2 tables. This role is then dropped. Who will now own the two tables?
A. The tables are now orphaned
B. The user that deleted the role
C. SYSADMIN
D. The assumed role that dropped the role Most Voted

Correct Answer: D

Question #56 Topic 1


Which of the following connectors are available in the Downloads section of the Snowflake Web Interface
(UI)? (Choose two.)
A. SnowSQL Most Voted
B. ODBC Most Voted
C. R
D. HIVE

Correct Answer: AB

Question #57 Topic 1


Which of the following DML commands isn't supported by Snowflake?
A. UPSERT Most Voted
B. MERGE
C. UPDATE
D. TRUNCATE TABLE

Correct Answer: A

Question #58 Topic 1


Which of the following statements is true of zero-copy cloning?
A. Zero-copy clones increase storage costs as cloning the table requires storing its data twice
B. All zero-copy clone objects inherit the privileges of their original objects
C. Zero-copy cloning is licensed as an additional Snowflake feature
D. At the instance/instant a clone is created, all micro-partitions in the original table and the clone are
fully shared Most Voted

Correct Answer: D

Question #59 Topic 1


True or False: When a user creates a role, they are initially assigned ownership of the role and they
maintain ownership until it is transferred to another user.
A. True
B. False Most Voted

Correct Answer: B
Question #60 Topic 1
The Query History in the Snowflake Web Interface (UI) is kept for approximately:
A. 60 minutes
B. 24 hours
C. 14 days Most Voted
D. 30 days
E. 1 year

Correct Answer: C

To run a Multi-Cluster Warehouse in auto-scale mode, a user would:


A. Configure the Maximum Clusters setting to "Auto-Scale"
B. Set the Warehouse type to "Auto"
C. Set the Minimum Clusters and Maximum Clusters settings to the same value
D. Set the Minimum Clusters and Maximum Clusters settings to different values Most Voted

Correct Answer: D
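Auto-scale mode follows from giving the warehouse different minimum and maximum cluster counts. A sketch (name, size, and counts are illustrative):

```sql
-- MIN != MAX puts the multi-cluster warehouse in auto-scale mode
CREATE WAREHOUSE scaling_wh
  WAREHOUSE_SIZE = 'MEDIUM'
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 4
  SCALING_POLICY = 'STANDARD';
```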

Question #62 Topic 1


Which of the following terms best describes Snowflake's database architecture?
A. Columnar shared nothing
B. Shared disk
C. Multi-cluster, shared data Most Voted
D. Cloud-native shared memory

Correct Answer: C

Question #63 Topic 1


Which of the following are options when creating a Virtual Warehouse? (Choose two.)
A. Auto-drop
B. Auto-resize
C. Auto-resume Most Voted
D. Auto-suspend Most Voted

Correct Answer: CD

Question #64 Topic 1


A Virtual Warehouse's auto-suspend and auto-resume settings apply to:
A. The primary cluster in the Virtual Warehouse
B. The entire Virtual Warehouse Most Voted
C. The database the Virtual Warehouse resides in
D. The queries currently being run by the Virtual Warehouse

Correct Answer: B

Question #65 Topic 1


Fail-safe is unavailable on which table types? (Choose two.)
A. Temporary Most Voted
B. Transient Most Voted
C. Provisional
D. Permanent

Correct Answer: AB

Question #66 Topic 1


Which of the following objects is not covered by Time Travel?
A. Tables
B. Schemas
C. Databases
D. Stages Most Voted

Correct Answer: D

Question #67 Topic 1


True or False: Micro-partition metadata enables some operations to be completed without requiring
Compute.
A. True
B. False

Correct Answer: A

Question #68 Topic 1


Which of the following commands are not blocking operations? (Choose two.)
A. UPDATE
B. INSERT
C. MERGE
D. COPY

Correct Answer: BD

Question #69 Topic 1


Which of the following is true of Snowpipe via REST API? (Choose two.)
A. You can only use it on Internal Stages
B. All COPY INTO options are available during pipe creation
C. Snowflake automatically manages the compute required to execute the Pipe's COPY INTO commands
Most Voted
D. Snowpipe keeps track of which files it has loaded Most Voted

Correct Answer: CD

Question #70 Topic 1


Snowflake recommends, as a minimum, that all users with the following role(s) should be enrolled in
Multi-Factor Authentication (MFA):
A. SECURITYADMIN, ACCOUNTADMIN, PUBLIC, SYSADMIN
B. SECURITYADMIN, ACCOUNTADMIN, SYSADMIN
C. SECURITYADMIN, ACCOUNTADMIN
D. ACCOUNTADMIN Most Voted

Correct Answer: D

When can a Virtual Warehouse start running queries?


A. 12am-5am
B. Only during administrator defined time slots
C. When its provisioning is complete
D. After replication

Correct Answer: C

Question #72 Topic 1


True or False: Users are able to see the result sets of queries executed by other users that share their same
role.
A. True
B. False Most Voted

Correct Answer: B
Question #73 Topic 1
True or False: The user has to specify which cluster a query will run on in a multi-cluster Warehouse.
A. True
B. False Most Voted

Correct Answer: B

Question #74 Topic 1


True or False: Pipes can be suspended and resumed.
A. True Most Voted
B. False

Correct Answer: A

Question #75 Topic 1


Which of the following languages can be used to implement Snowflake User Defined Functions (UDFs)?
(Choose two.)
A. Java
B. Javascript
C. SQL
D. Python

Correct Answer: BC
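Minimal sketches of a SQL UDF and a JavaScript UDF (names are illustrative; note that JavaScript UDFs see their arguments in upper case):

```sql
-- SQL UDF
CREATE FUNCTION area_of_circle(radius FLOAT)
  RETURNS FLOAT
  AS 'PI() * radius * radius';

-- JavaScript UDF (argument is referenced as N, in upper case)
CREATE FUNCTION double_it(n FLOAT)
  RETURNS FLOAT
  LANGUAGE JAVASCRIPT
  AS 'return N * 2;';
```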

Question #76 Topic 1


When should you consider disabling auto-suspend for a Virtual Warehouse? (Choose two.)
A. When users will be using compute at different times throughout a 24/7 period
B. When managing a steady workload Most Voted
C. When the compute must be available with no delay or lag time Most Voted
D. When you do not want to have to manually turn on the Warehouse each time a user needs it

Correct Answer: BC

Question #77 Topic 1


Which of the following are valid approaches to loading data into a Snowflake table? (Choose all that
apply.)
A. Bulk copy from an External Stage Most Voted
B. Continuous load using Snowpipe REST API Most Voted
C. The Snowflake Web Interface (UI) data loading wizard Most Voted
D. Bulk copy from an Internal Stage Most Voted

Correct Answer: ABCD

Question #78 Topic 1


If auto-suspend is enabled for a Virtual Warehouse, the Warehouse is automatically suspended when:
A. All Snowflakes sessions using the Warehouse are terminated.
B. The last query using the Warehouse completes.
C. There are no users logged into Snowflake.
D. The Warehouse is inactive for a specified period of time. Most Voted

Correct Answer: D

Question #79 Topic 1


True or False: Multi-Factor Authentication (MFA) in Snowflake is only supported in conjunction with
Single Sign-On (SSO).
A. True
B. False Most Voted

Correct Answer: B
Question #80 Topic 1
The number of queries that a Virtual Warehouse can concurrently process is determined by (Choose
two.):
A. The complexity of each query Most Voted
B. The CONCURRENT_QUERY_LIMIT parameter set on the Snowflake account
C. The size of the data required for each query Most Voted
D. The tool that is executing the query

Correct Answer: AC

Which of the following statements are true of VALIDATION_MODE in Snowflake? (Choose two.)
A. The VALIDATION_MODE option is used when creating an Internal Stage
B. VALIDATION_MODE=RETURN_ALL_ERRORS is a parameter of the COPY command Most Voted
C. The VALIDATION_MODE option will validate data to be loaded by the COPY statement while
completing the load and will return the rows that could not be loaded without error
D. The VALIDATION_MODE option will validate data to be loaded by the COPY statement without
completing the load and will return possible errors Most Voted

Correct Answer: BD
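VALIDATION_MODE makes COPY report problems without actually loading any rows. A sketch (stage and table names are illustrative):

```sql
-- Validate the staged files without loading; returns the errors found
COPY INTO my_table
  FROM @my_stage
  VALIDATION_MODE = RETURN_ALL_ERRORS;

-- Or preview only the first N rows that would be loaded
COPY INTO my_table
  FROM @my_stage
  VALIDATION_MODE = RETURN_10_ROWS;
```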

Question #82 Topic 1


What privileges are required to create a task?
A. The GLOBAL privilege CREATE TASK is required to create a new task.
B. Tasks are created at the Application level and can only be created by the Account Admin role.
C. Many Snowflake DDLs are metadata operations only, and CREATE TASK DDL can be executed
without virtual warehouse requirement or task specific grants.
D. The role must have access to the target schema and the CREATE TASK privilege on the schema itself. Most Voted

Correct Answer: D

Question #83 Topic 1


What are the three things customers want most from their enterprise data warehouse solution? (Choose three.)
A. On-premise availability
B. Simplicity Most Voted
C. Open source based
D. Concurrency Most Voted
E. Performance Most Voted

Correct Answer: BDE

Question #84 Topic 1


True or False: Some queries can be answered through the metadata cache and do not require an active
Virtual Warehouse.
A. True Most Voted
B. False

Correct Answer: A

Question #85 Topic 1


When scaling out by adding clusters to a multi-cluster warehouse, you are primarily scaling for
improved:
A. Concurrency
B. Performance
Correct Answer: A

Question #86 Topic 1


What is the minimum Snowflake edition that provides data sharing?
A. Standard
B. Premier
C. Enterprise
D. Business Critical Edition

Correct Answer: A

Question #87 Topic 1


True or False: Each worksheet in the Snowflake Web Interface (UI) can be associated with different
roles, databases, schemas, and Virtual Warehouses.
A. True Most Voted
B. False

Correct Answer: A

Question #88 Topic 1


True or False: You can query the files in an External Stage directly without having to load the data into a
table.
A. True Most Voted
B. False

Correct Answer: A

Question #89 Topic 1


The FLATTEN function is used to query which type of data in Snowflake?
A. Structured data
B. Semi-structured data Most Voted
C. Both of the above
D. None of the above

Correct Answer: B
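FLATTEN, usually via LATERAL, turns the elements of a VARIANT array into rows. A sketch (table, column, and path names are illustrative):

```sql
-- Explode a JSON array held in a VARIANT column into one row per element
SELECT t.id,
       f.value:name::VARCHAR AS item_name
FROM raw_events t,
     LATERAL FLATTEN(input => t.payload:items) f;
```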

Question #90 Topic 1


True or False: An active warehouse is required to run a COPY INTO statement.
A. True Most Voted
B. False

Correct Answer: A

True or False: AWS Private Link provides a secure connection from the Customer's on-premise data center to Snowflake.
A. True
B. False Most Voted

Correct Answer: B

Question #92 Topic 1


True or False: Snowflake's Global Services Layer gathers and maintains statistics on all columns in all
micro-partitions.
A. True Most Voted
B. False

Correct Answer: A
Question #93 Topic 1
True or False: It is best practice to define a clustering key on every table.
A. True
B. False

Correct Answer: B

Question #94 Topic 1


Which of the following statements is true of Snowflake?
A. It was built specifically for the cloud
B. It was built as an on-premises solution and then ported to the cloud
C. It was designed as a hybrid database to allow customers to store data either on premises or in the
cloud
D. It was built for Hadoop architecture
E. It's based on an Oracle Architecture

Correct Answer: A

Question #95 Topic 1


What is the minimum Snowflake edition that provides multi-cluster warehouses and up to 90 days of
Time Travel?
A. Standard
B. Premier
C. Enterprise
D. Business Critical Edition

Correct Answer: C

Question #96 Topic 1


How many shares can be consumed by a single Data Consumer?
A. 1
B. 10
C. 100, but can be increased by contacting support
D. Unlimited

Correct Answer: D

Question #97 Topic 1


What is the lowest Snowflake edition that offers Time Travel up to 90 days?
A. Standard Edition
B. Premier Edition
C. Enterprise Edition
D. Business Critical Edition

Correct Answer: C

Question #98 Topic 1


Which of the following statements are true about Schemas in Snowflake? (Choose two.)
A. A Schema may contain one or more Databases
B. A Database may contain one or more Schemas Most Voted
C. A Schema is a logical grouping of Database Objects Most Voted
D. Each Schema is contained within a Warehouse

Correct Answer: BC

Question #99 Topic 1


True or False: You can resize a Virtual Warehouse while queries are running.
A. True Most Voted
B. False
Correct Answer: A

Question #100 Topic 1


What is the most granular object that the Time Travel retention period can be defined on?
A. Account
B. Database
C. Schema
D. Table Most Voted

Correct Answer: D
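The Time Travel retention period can be set on an individual table, the most granular level. A sketch (table name and value are illustrative):

```sql
-- Set Time Travel retention for one table
-- (values above 1 day require Enterprise edition or higher)
ALTER TABLE orders SET DATA_RETENTION_TIME_IN_DAYS = 30;
```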

Which of the following statements is true of Snowflake micro-partitioning?


A. Micro-partitioning has been known to introduce data skew
B. Micro-partitioning: requires a partitioning schema to be defined up front
C. Micro-partitioning is transparently completed using the ordering that occurs when the data is
inserted/loaded Most Voted
D. Micro-partitioning can be disabled within a Snowflake account

Correct Answer: C

Question #102 Topic 1


True or False: Snowflake bills for a minimum of five minutes each time a Virtual Warehouse is started.

A. True
B. False Most Voted

Correct Answer: B

Question #103 Topic 1


When scaling up Virtual Warehouses by increasing Virtual Warehouse t-shirt size, you are primarily
scaling for improved:
A. Concurrency
B. Performance Most Voted

Correct Answer: B

Question #104 Topic 1


As a best practice, clustering keys should only be defined on tables of which minimum size?
A. Multi-Kilobyte (KB) Range
B. Multi-Megabyte (MB) Range
C. Multi-Gigabyte (GB) Range
D. Multi-Terabyte (TB) Range Most Voted

Correct Answer: D

Question #105 Topic 1


How are Snowpipe charges calculated?
A. Per-second/per Warehouse size
B. Per-second/per-core granularity Most Voted
C. Number of Pipes in account
D. Total storage bucket size

Correct Answer: B

Question #106 Topic 1


True or False: A Snowflake account is charged for data stored in both Internal and External Stages.
A. True
B. False Most Voted
Correct Answer: B

Question #107 Topic 1


True or False: When active, a Pipe requires a dedicated Virtual Warehouse to execute.
A. True
B. False Most Voted

Correct Answer: B

Question #108 Topic 1


True or False: Snowflake supports federated authentication in all editions.
A. True Most Voted
B. False
Correct Answer: A

True or False: When a new Snowflake object is created, it is automatically owned by the user who created
it.
A. True
B. False Most Voted

Correct Answer: B

True or False: A Virtual Warehouse consumes Snowflake credits even when inactive.
A. True
B. False

Correct Answer: B

True or False: During data unloading, only JSON and CSV files can be compressed.

• A. True
• B. False Most Voted

Correct Answer: B 🗳️

Which of the following are options when creating a Virtual Warehouse? (Choose two.)

• A. Auto-suspend
• B. Auto-resume
• C. Local SSD size
• D. User count

Correct Answer: AB 🗳️

Which formats are supported for unloading data from Snowflake? (Choose two.)

• A. Delimited (CSV, TSV, etc.) Most Voted


• B. Avro
• C. JSON Most Voted
• D. ORC

Correct Answer: AC 🗳️

True or False: Data Providers can share data with only the Data Consumer.

• A. True
• B. False Most Voted

Correct Answer: B 🗳️

The fail-safe retention period is how many days?

• A. 1 day
• B. 7 days Most Voted
• C. 45 days
• D. 90 days

Correct Answer: B 🗳️

True or False: Once created, a micro-partition will never be changed.

• A. True Most Voted


• B. False

Correct Answer: A 🗳️

What services does Snowflake automatically provide for customers that they may have been
responsible for with their on-premise system? (Choose all that apply.)

• A. Installing and configuring hardware Most Voted


• B. Patching software Most Voted
• C. Physical security
• D. Maintaining metadata and statistics Most Voted

Correct Answer: ABD 🗳️

Which of the following statements would be used to export/unload data from Snowflake?

• A. COPY INTO @stage


• B. EXPORT TO @stage
• C. INSERT INTO @stage
• D. EXPORT_TO_STAGE(stage => @stage, select => 'select * from t1');

Correct Answer: A 🗳️
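Unloading uses COPY INTO with a stage location as the target. A sketch (stage path and query are illustrative):

```sql
-- Export query results to a stage, compressed
COPY INTO @my_stage/unload/
  FROM (SELECT * FROM t1)
  FILE_FORMAT = (TYPE = 'CSV' COMPRESSION = 'GZIP');
```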

True or False: A 4X-Large Warehouse may, at times, take longer to provision than a X-Small
Warehouse.

• A. True Most Voted


• B. False

Correct Answer: A 🗳️

How would you determine the size of the virtual warehouse used for a task?

• A. Root task may be executed concurrently (i.e. multiple instances), it is recommended to leave some margins in the execution window to avoid missing instances of execution
• B. Querying (SELECT) the size of the stream content would help determine the warehouse size. For example, if querying large stream content, use a larger warehouse size
• C. If using the stored procedure to execute multiple SQL statements, it's best to test run the stored procedure separately to size the compute resource first Most Voted
• D. Since task infrastructure is based on running the task body on schedule, it's recommended to configure the virtual warehouse for automatic concurrency handling using Multi-cluster warehouse (MCW) to match the task schedule

Correct Answer: C 🗳️
The Information Schema and Account Usage Share provide storage information for which of
the following objects? (Choose three.)

• A. Users
• B. Tables Most Voted
• C. Databases Most Voted
• D. Internal Stages Most Voted

Correct Answer: BCD 🗳️

What is the default File Format used in the COPY command if one is not specified?

• A. CSV
• B. JSON
• C. Parquet
• D. XML
Correct Answer: A 🗳️

True or False: Reader Accounts are able to extract data from shared data objects for use
outside of Snowflake.

• A. True Most Voted


• B. False Most Voted

Correct Answer: A 🗳️

True or False: You can define multiple columns within a clustering key on a table.

• A. True Most Voted


• B. False

Correct Answer: A 🗳️

True or False: Snowflake enforces unique, primary key, and foreign key constraints during DML
operations.

• A. True
• B. False Most Voted

Correct Answer: B 🗳️

True or False: Loading data into Snowflake requires that source data files be no larger than
16MB.

• A. True
• B. False Most Voted

Correct Answer: B 🗳️

True or False: A Virtual Warehouse can be resized while suspended.

• A. True Most Voted


• B. False

Correct Answer: A 🗳️

True or False: When you create a custom role, it is a best practice to immediately grant that role
to ACCOUNTADMIN.
• A. True
• B. False Most Voted

Correct Answer: B 🗳️

Which of the following accurately represents how a table fits into Snowflake's logical container
hierarchy?

• A. Account -> Schema -> Database -> Table


• B. Account -> Database -> Schema -> Table
• C. Database -> Schema -> Table -> Account
• D. Database -> Table -> Schema -> Account

Correct Answer: B 🗳️

True or False: All Snowflake table types include fail-safe storage.

• A. True
• B. False Most Voted

Correct Answer: B 🗳️
What are two ways to create and manage Data Shares in Snowflake? (Choose two.)

• A. Via the Snowflake Web Interface (UI) Most Voted


• B. Via the DATA_SHARE=TRUE parameter
• C. Via SQL commands Most Voted
• D. Via Virtual Warehouses

Correct Answer: AC 🗳️

True or False: Fail-safe can be disabled within a Snowflake account.

• A. True
• B. False Most Voted

Correct Answer: B 🗳️

True or False: It is possible for a user to run a query against the query result cache without
requiring an active Warehouse.

• A. True Most Voted


• B. False
Correct Answer: A 🗳️

True or False: When Snowflake is configured to use Single Sign-On (SSO), Snowflake receives
the usernames and credentials from the SSO service and loads them into the customer's
Snowflake account.

• A. True
• B. False Most Voted

Correct Answer: B 🗳️

Which of the following are best practices for loading data into Snowflake? (Choose three.)

• A. Aim to produce data files that are between 100 MB and 250 MB in size,
compressed. Most Voted
• B. Load data from files in a cloud storage service in a different region or cloud platform
from the service or region containing the Snowflake account, to save on cost.
• C. Enclose fields that contain delimiter characters in single or double quotes. Most Voted
• D. Split large files into a greater number of smaller files to distribute the load among
the compute resources in an active warehouse. Most Voted
• E. When planning which warehouse(s) to use for data loading, start with the largest
warehouse possible.
• F. Partition the staged data into large folders with random paths, allowing Snowflake to
determine the best way to load each file.

Correct Answer: ACD 🗳️

Which Snowflake feature is used for both querying and restoring data?

• A. Cluster keys
• B. Time Travel Most Voted
• C. Fail-safe
• D. Cloning

Correct Answer: B 🗳️

What do the terms scale up and scale out refer to in Snowflake? (Choose two.)

• A. Scaling out adds clusters of the same size to a virtual warehouse to handle more
concurrent queries. Most Voted
• B. Scaling out adds clusters of varying sizes to a virtual warehouse.
• C. Scaling out adds additional database servers to an existing running cluster to handle
more concurrent queries.
• D. Snowflake recommends using both scaling up and scaling out to handle more
concurrent queries.
• E. Scaling up resizes a virtual warehouse so it can handle more complex
workloads. Most Voted
• F. Scaling up adds additional database servers to an existing running cluster to handle
larger workloads.

Correct Answer: AE 🗳️

What is the minimum Snowflake edition that has column-level security enabled?

• A. Standard
• B. Enterprise
• C. Business Critical
• D. Virtual Private Snowflake

Correct Answer: B 🗳️

What parameter controls if the Virtual Warehouse starts immediately after the CREATE
WAREHOUSE statement?

• A. INITIALLY_SUSPENDED = TRUE/FALSE
• B. START_AFTER_CREATE = TRUE/FALSE Most Voted
• C. START_TIME = 60 // (seconds from now)
• D. START_TIME = CURRENT_DATE()

Correct Answer: A 🗳️
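
A sketch of the parameter in context (the warehouse name is hypothetical). With INITIALLY_SUSPENDED = TRUE the warehouse is created in a suspended state; FALSE, or omitting the parameter, starts it immediately:

```sql
CREATE WAREHOUSE demo_wh
  WAREHOUSE_SIZE = 'XSMALL'
  INITIALLY_SUSPENDED = TRUE;  -- created suspended rather than started right away
```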

When cloning a database, what is cloned with the database? (Choose two.)

• A. Privileges on the database


• B. Existing child objects within the database Most Voted
• C. Future child objects within the database
• D. Privileges on the schemas within the database Most Voted
• E. Only schemas and tables within the database

Correct Answer: BD 🗳️
Which of the following describes the Snowflake Cloud Services layer?

• A. Coordinates activities in the Snowflake account Most Voted


• B. Executes queries submitted by the Snowflake account users
• C. Manages quotas on the Snowflake account storage
• D. Manages the virtual warehouse cache to speed up queries
Correct Answer: A 🗳️

What is the maximum total Continuous Data Protection (CDP) charges incurred for a temporary
table?

• A. 30 days
• B. 7 days
• C. 48 hours
• D. 24 hours Most Voted

Correct Answer: D 🗳️

When reviewing a query profile, what is a symptom that a query is too large to fit into the
memory?

• A. A single join node uses more than 50% of the query time
• B. Partitions scanned is equal to partitions total
• C. An AggregateOperator node is present
• D. The query is spilling to remote storage Most Voted

Correct Answer: D 🗳️

What type of query benefits the MOST from search optimization?

• A. A query that uses only disjunction (i.e., OR) predicates


• B. A query that includes analytical expressions
• C. A query that uses equality predicates or predicates that use IN
• D. A query that filters on semi-structured data types

Correct Answer: C 🗳️

What transformations are supported in a CREATE PIPE … AS COPY INTO … FROM (…) statement?
(Choose two.)

• A. Data can be filtered by an optional WHERE clause.


• B. Incoming data can be joined with other tables.
• C. Columns can be reordered. Most Voted
• D. Columns can be omitted. Most Voted
• E. Row level access can be defined.

Correct Answer: CD 🗳️

Which of the following are characteristics of Snowflake virtual warehouses? (Choose two.)
• A. Auto-resume applies only to the last warehouse that was started in a multi-cluster
warehouse.
• B. The ability to auto-suspend a warehouse is only available in the Enterprise edition or
above.
• C. SnowSQL supports both a configuration file and a command line option for
specifying a default warehouse. Most Voted
• D. A user cannot specify a default warehouse when using the ODBC driver.
• E. The default virtual warehouse size can be changed at any time.

Correct Answer: CE 🗳️

Which command should be used to load data from a file, located in an external stage, into a
table in Snowflake?

• A. PUT
• B. INSERT
• C. GET
• D. COPY

Correct Answer: D 🗳️
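
A minimal COPY sketch loading from an external stage (the stage, path, and table names are hypothetical):

```sql
-- Load staged files from cloud storage into a Snowflake table
COPY INTO my_table
  FROM @my_ext_stage/path/
  FILE_FORMAT = (TYPE = 'CSV' FIELD_OPTIONALLY_ENCLOSED_BY = '"');
```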

The Snowflake Cloud Data Platform is described as having which of the following
architectures?

• A. Shared-disk
• B. Shared-nothing
• C. Multi-cluster shared data Most Voted
• D. Serverless query engine

Correct Answer: C 🗳️

Which of the following is a data tokenization integration partner?

• A. Protegrity Most Voted


• B. Tableau
• C. DBeaver
• D. SAP

Correct Answer: A 🗳️

What versions of Snowflake should be used to manage compliance with Personal Identifiable
Information (PII) requirements? (Choose two.)

• A. Custom Edition
• B. Virtual Private Snowflake Most Voted
• C. Business Critical Edition Most Voted
• D. Standard Edition
• E. Enterprise Edition

Correct Answer: BC 🗳️
What are supported file formats for unloading data from Snowflake? (Choose three.)

• A. XML
• B. JSON Most Voted
• C. Parquet Most Voted
• D. ORC
• E. AVRO
• F. CSV Most Voted

Correct Answer: BCF 🗳️

The Snowflake cloud services layer is responsible for which tasks? (Choose two.)

• A. Local disk caching


• B. Authentication and access control Most Voted
• C. Metadata management Most Voted
• D. Query processing
• E. Database storage

Correct Answer: BC 🗳️

What is a key feature of Snowflake architecture?

• A. Software updates are automatically applied on a quarterly basis.
• B. Zero-copy cloning creates a mirror copy of a database that updates with the original.
• C. Snowflake eliminates resource contention with its virtual warehouse
implementation. Most Voted
• D. Multi-cluster warehouses allow users to run a query that spans across multiple
clusters.
• E. Snowflake automatically sorts DATE columns during ingest, for fast retrieval by date.

Correct Answer: C 🗳️

When publishing a Snowflake Data Marketplace listing into a remote region what should be
taken into consideration? (Choose two.)

• A. There is no need to have a Snowflake account in the target region, a share will be
created for each user.
• B. The listing is replicated into all selected regions automatically, the data is not. Most Voted
• C. The user must have the ORGADMIN role available in at least one account to link
accounts for replication.
• D. Shares attached to listings in remote regions can be viewed from any account in an
organization.
• E. For a standard listing the user can wait until the first customer requests the data
before replicating it to the target region. Most Voted

Correct Answer: BE 🗳️

When loading data into Snowflake via Snowpipe what is the compressed file size
recommendation?

• A. 10-50 MB
• B. 100-250 MB
• C. 300-500 MB
• D. 1000-1500 MB

Correct Answer: B 🗳️

Which Snowflake feature allows a user to substitute a randomly generated identifier for
sensitive data, in order to prevent unauthorized users access to the data, before loading it into
Snowflake?

• A. External Tokenization Most Voted


• B. External Tables
• C. Materialized Views
• D. User-Defined Table Functions (UDTF)

Correct Answer: A 🗳️

Which of the following are examples of operations that require a Virtual Warehouse to
complete, assuming no queries have been executed previously? (Choose three.)

• A. MIN(<< column value >>)


• B. COPY
• C. SUM(<< column value >>)
• D. UPDATE

Correct Answer: BCD 🗳️

What is the SNOWFLAKE.ACCOUNT_USAGE view that contains information about which objects
were read by queries within the last 365 days (1 year)?
• A. VIEWS_HISTORY
• B. OBJECT_HISTORY
• C. ACCESS_HISTORY Most Voted
• D. LOGIN_HISTORY

Correct Answer: C 🗳️

Which feature is only available in the Enterprise or higher editions of Snowflake?

• A. Column-level security
• B. SOC 2 Type II certification
• C. Multi-factor Authentication (MFA)
• D. Object-level access control

Correct Answer: A 🗳️

Will data cached in a warehouse be lost when the warehouse is resized?

• A. Possibly, if the warehouse is resized to a smaller size and the cache no longer
fits. Most Voted
• B. Yes, because the compute resource is replaced in its entirety with a new compute
resource.
• C. No, because the size of the cache is independent from the warehouse size.
• D. Yes, because the new compute resource will no longer have access to the cache
encryption key.

Correct Answer: A 🗳️
Which semi-structured file formats are supported when unloading data from a table? (Choose
two.)

• A. ORC
• B. XML
• C. Avro
• D. Parquet Most Voted
• E. JSON Most Voted

Correct Answer: DE 🗳️

A running virtual warehouse is suspended.


What is the MINIMUM amount of time that the warehouse will incur charges for when it is
restarted?

• A. 1 second
• B. 60 seconds
• C. 5 minutes
• D. 60 minutes
Correct Answer: B 🗳️

What are the responsibilities of Snowflake's Cloud Service layer? (Choose three.)

• A. Authentication Most Voted


• B. Resource management Most Voted
• C. Virtual warehouse caching
• D. Query parsing and optimization Most Voted
• E. Query execution
• F. Physical storage of micro-partitions

Correct Answer: ABD 🗳️

How long is the Fail-safe period for temporary and transient tables?

• A. There is no Fail-safe period for these tables.


• B. 1 day
• C. 7 days
• D. 31 days
• E. 90 days

Correct Answer: A 🗳️

Which command should be used to download files from a Snowflake stage to a local folder on
a client's machine?

• A. PUT
• B. GET Most Voted
• C. COPY
• D. SELECT

Correct Answer: B 🗳️

How does Snowflake Fail-safe protect data in a permanent table?

• A. Fail-safe makes data available up to 1 day, recoverable by user operations.


• B. Fail-safe makes data available for 7 days, recoverable by user operations.
• C. Fail-safe makes data available for 7 days, recoverable only by Snowflake
Support. Most Voted
• D. Fail-safe makes data available up to 1 day, recoverable only by Snowflake Support.

Correct Answer: C 🗳️
A virtual warehouse is created using the following command:

Create warehouse my_WH with
warehouse_size = MEDIUM
min_cluster_count = 1
max_cluster_count = 1
auto_suspend = 60
auto_resume = true;
The image below is a graphical representation of the warehouse utilization across two days.

What action should be taken to address this situation?

• A. Increase the warehouse size from Medium to 2XL.


• B. Increase the value for the parameter MAX_CONCURRENCY_LEVEL.
• C. Configure the warehouse to a multi-cluster warehouse. Most Voted
• D. Lower the value of the parameter STATEMENT_QUEUED_TIMEOUT_IN_SECONDS.

Correct Answer: C 🗳️

Which minimum Snowflake edition allows for a dedicated metadata store?

• A. Standard
• B. Enterprise
• C. Business Critical
• D. Virtual Private Snowflake Most Voted

Correct Answer: D 🗳️

Network policies can be set at which Snowflake levels? (Choose two.)

• A. Role
• B. Schema
• C. User Most Voted
• D. Database
• E. Account Most Voted
• F. Tables
Correct Answer: CE 🗳️

What are the correct parameters for time travel and fail-safe in the Snowflake Enterprise
Edition?

• A. Default Time Travel Retention is set to 0 days. Maximum Time Travel Retention is 30
days. Fail Safe retention time is 1 day.
• B. Default Time Travel Retention is set to 1 day. Maximum Time Travel Retention is 365
days. Fail Safe retention time is 7 days.
• C. Default Time Travel Retention is set to 0 days. Maximum Time Travel Retention is 90
days. Fail Safe retention time is 7 days.
• D. Default Time Travel Retention is set to 1 day. Maximum Time Travel Retention is 90
days. Fail Safe retention time is 7 days. Most Voted
• E. Default Time Travel Retention is set to 7 days. Maximum Time Travel Retention is 1
day. Fail Safe retention time is 90 days.
• F. Default Time Travel Retention is set to 90 days. Maximum Time Travel Retention is 7
days. Fail Safe retention time is 356 days.

Correct Answer: D 🗳️
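
The Enterprise defaults above can be illustrated with a short sketch (the table name is hypothetical):

```sql
-- Query a table as it was 30 minutes ago (must be within the retention period)
SELECT * FROM my_table AT (OFFSET => -60 * 30);

-- On Enterprise Edition, retention can be raised up to the 90-day maximum
ALTER TABLE my_table SET DATA_RETENTION_TIME_IN_DAYS = 90;
```
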
Which of the following objects are contained within a schema? (Choose two.)

• A. Role
• B. Stream Most Voted
• C. Warehouse
• D. External table Most Voted
• E. User
• F. Share

Correct Answer: BD 🗳️

Which of the following statements describe features of Snowflake data caching? (Choose two.)

• A. When a virtual warehouse is suspended, the data cache is saved on the remote
storage layer.
• B. When the data cache is full, the least-recently used data will be cleared to make
room. Most Voted
• C. A user can only access their own queries from the query result cache.
• D. A user must set USE_METADATA_CACHE to TRUE to use the metadata cache in
queries.
• E. The RESULT_SCAN table function can access and filter the contents of the query
result cache. Most Voted

Correct Answer: BE 🗳️
A table needs to be loaded. The input data is in JSON format and is a concatenation of multiple
JSON documents. The file size is 3 GB. A warehouse size S is being used. The following COPY
INTO command was executed:
COPY INTO SAMPLE FROM @~/SAMPLE.JSON (TYPE=JSON)
The load failed with this error:
Max LOB size (16777216) exceeded, actual size of parsed column is 17894470.
How can this issue be resolved?

• A. Compress the file and load the compressed file.


• B. Split the file into multiple files in the recommended size range (100 MB - 250 MB).
• C. Use a larger-sized warehouse.
• D. Set STRIP_OUTER_ARRAY=TRUE in the COPY INTO command. Most Voted

Correct Answer: D 🗳️

What is a feature of a stored procedure in Snowflake?

• A. They can be created as secure and hide the underlying metadata from all users.
• B. They can access tables from a single database.
• C. They can only contain a single SQL statement.
• D. They can be created to run with a caller's rights or an owner's rights. Most Voted

Correct Answer: D 🗳️

Which columns are part of the result set of the Snowflake LATERAL FLATTEN command?
(Choose two.)

• A. CONTENT
• B. PATH Most Voted
• C. BYTE_SIZE
• D. INDEX Most Voted
• E. DATATYPE

Correct Answer: BD 🗳️
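
For reference, FLATTEN returns the columns SEQ, KEY, PATH, INDEX, VALUE, and THIS. A small sketch using a few of them (the table and column names are hypothetical):

```sql
-- Expand an array held in a VARIANT column into one row per element
SELECT f.path, f.index, f.value
FROM raw_events r,
     LATERAL FLATTEN(input => r.payload:items) f;
```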

What is the minimum Snowflake edition required to create a materialized view?

• A. Standard Edition
• B. Enterprise Edition Most Voted
• C. Business Critical Edition
• D. Virtual Private Snowflake Edition
Correct Answer: B 🗳️

Which Snowflake function will interpret an input string as a JSON document, and produce a
VARIANT value?

• A. parse_json() Most Voted


• B. json_extract_path_text()
• C. object_construct()
• D. flatten

Correct Answer: A 🗳️
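
A one-line sketch of the function in question:

```sql
-- PARSE_JSON interprets a string as a JSON document and returns a VARIANT
SELECT PARSE_JSON('{"city": "Berlin", "zip": "10115"}'):city::STRING AS city;
```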

How are serverless features billed?

• A. Per second multiplied by an automatic sizing for the job Most Voted
• B. Per minute multiplied by an automatic sizing for the job, with a minimum of one
minute
• C. Per second multiplied by the size, as determined by the
SERVERLESS_FEATURES_SIZE account parameter
• D. Serverless features are not billed, unless the total cost for the month exceeds 10% of
the warehouse credits, on the account

Correct Answer: A 🗳️

Which Snowflake architectural layer is responsible for a query execution plan?

• A. Compute
• B. Data storage
• C. Cloud services Most Voted
• D. Cloud provider

Correct Answer: C 🗳️

When unloading to a stage, which of the following is a recommended practice or approach?

• A. Set SINGLE = TRUE for larger files.


• B. Use OBJECT_CONSTRUCT(*) when using Parquet.
• C. Avoid the use of the CAST function.
• D. Define an individual file format. Most Voted

Correct Answer: D 🗳️
Which SQL commands, when committed, will consume a stream and advance the stream
offset? (Choose two.)

• A. UPDATE TABLE FROM STREAM


• B. SELECT FROM STREAM
• C. INSERT INTO TABLE SELECT FROM STREAM
• D. ALTER TABLE AS SELECT FROM STREAM
• E. BEGIN COMMIT

Correct Answer: AC 🗳️
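
Only committed DML that reads from the stream advances its offset; a plain SELECT leaves the offset untouched. A sketch (all object names are hypothetical):

```sql
-- Consuming the stream: committed DML advances the offset
INSERT INTO target_table
  SELECT id, value
  FROM my_stream
  WHERE METADATA$ACTION = 'INSERT';
```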

Which methods can be used to delete staged files from a Snowflake stage? (Choose two.)

• A. Use the DROP command after the load completes.


• B. Specify the TEMPORARY option when creating the file format.
• C. Specify the PURGE copy option in the COPY INTO command.
• D. Use the REMOVE command after the load completes.
• E. Use the DELETE LOAD HISTORY command after the load completes.

Correct Answer: CD 🗳️

On which of the following cloud platforms can a Snowflake account be hosted?


(Choose three.)

o A. Amazon Web Services


o B. Private Virtual Cloud
o C. Oracle Cloud
o D. Microsoft Azure Cloud
o E. Google Cloud Platform
o F. Alibaba Cloud

Correct Answer: ADE 🗳️

What Snowflake role must be granted for a user to create and manage accounts?

o A. ACCOUNTADMIN
o B. ORGADMIN
o C. SECURITYADMIN
o D. SYSADMIN

Correct Answer: B 🗳️

Assume there is a table consisting of five micro-partitions with values ranging from A to
Z.

Which diagram indicates a well-clustered table?


o A.

o B.

o C.

o D.

Correct Answer: A 🗳️

What feature can be used to reorganize a very large table on one or more columns?

o A. Micro-partitions
o B. Clustering keys
o C. Key partitions
o D. Clustered partitions

Correct Answer: B 🗳️

What is an advantage of using an explain plan instead of the query profiler to evaluate
the performance of a query?

o A. The explain plan output is available graphically.


o B. An explain plan can be used to conduct performance analysis without
executing a query.
o C. An explain plan will handle queries with temporary tables and the query
profiler will not.
o D. An explain plan's output will display automatic data skew optimization
information.

Correct Answer: B 🗳️

Which data types are supported by Snowflake when using semi-structured data?
(Choose two.)

o A. VARIANT
o B. VARRAY
o C. STRUCT
o D. ARRAY
o E. QUEUE

Correct Answer: AD 🗳️

Why does Snowflake recommend file sizes of 100-250 MB compressed when loading
data?

o A. Optimizes the virtual warehouse size and multi-cluster setting to economy mode
o B. Allows a user to import the files in a sequential order
o C. Increases the latency staging and accuracy when loading the data
o D. Allows optimization of parallel operations

Correct Answer: D 🗳️

Which of the following features are available with the Snowflake Enterprise edition?
(Choose two.)

o A. Database replication and failover


o B. Automated index management
o C. Customer managed keys (Tri-secret secure)
o D. Extended time travel
o E. Native support for geospatial data

Correct Answer: AD 🗳️
What is the default file size when unloading data from Snowflake using the COPY command?

• A. 5 MB
• B. 8 GB
• C. 16 MB Most Voted
• D. 32 MB

Correct Answer: C 🗳️

What features that are part of the Continuous Data Protection (CDP) feature set in Snowflake
do not require additional configuration? (Choose two.)

• A. Data masking policies
• B. Row level access policies
• C. Data encryption
• D. Time Travel
• E. External tokenization

Correct Answer: CD 🗳️
Which Snowflake layer is always leveraged when accessing a query from the result cache?

• A. Metadata
• B. Data Storage
• C. Compute
• D. Cloud Services

Correct Answer: D 🗳️

Which connectors are available in the downloads section of the Snowflake web interface (UI)?
(Choose two.)

• A. SnowSQL Most Voted


• B. JDBC Most Voted
• C. ODBC Most Voted
• D. HIVE
• E. Scala

Correct Answer: CD 🗳️

A Snowflake Administrator needs to ensure that sensitive corporate data in Snowflake tables is
not visible to end users, but is partially visible to functional managers.

How can this requirement be met?

• A. Use data encryption.


• B. Use dynamic data masking. Most Voted
• C. Use secure materialized views.
• D. Revoke all roles for functional managers and end users.

Correct Answer: B 🗳️

Users are responsible for data storage costs until what occurs?

• A. Data expires from Time Travel


• B. Data expires from Fail-safe Most Voted
• C. Data is deleted from a table
• D. Data is truncated from a table

Correct Answer: B 🗳️

A user has an application that writes a new file to a cloud storage location every 5 minutes.

What would be the MOST efficient way to get the files into Snowflake?
• A. Create a task that runs a COPY INTO operation from an external stage every 5
minutes.
• B. Create a task that PUTS the files in an internal stage and automate the data loading
wizard.
• C. Create a task that runs a GET operation to intermittently check for new files.
• D. Set up cloud provider notifications on the file location and use Snowpipe with auto-
ingest. Most Voted

Correct Answer: D 🗳️
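
A sketch of the auto-ingest setup (the pipe, stage, and table names are hypothetical; the matching cloud-side event notification must also be configured):

```sql
-- Cloud storage event notifications trigger Snowpipe loads automatically
CREATE PIPE my_pipe AUTO_INGEST = TRUE AS
  COPY INTO landing_table
  FROM @my_ext_stage
  FILE_FORMAT = (TYPE = 'JSON');
```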

What affects whether the query results cache can be used?

• A. If the query contains a deterministic function


• B. If the virtual warehouse has been suspended
• C. If the referenced data in the table has changed Most Voted
• D. If multiple users are using the same virtual warehouse

Correct Answer: C 🗳️

Which of the following is an example of an operation that can be completed without requiring
compute, assuming no queries have been executed previously?

• A. SELECT SUM (ORDER_AMT) FROM SALES;


• B. SELECT AVG(ORDER_QTY) FROM SALES;
• C. SELECT MIN(ORDER_AMT) FROM SALES; Most Voted
• D. SELECT ORDER_AMT * ORDER_QTY FROM SALES;

Correct Answer: C 🗳️

How many days is load history for Snowpipe retained?

• A. 1 day
• B. 7 days
• C. 14 days Most Voted
• D. 64 days
Correct Answer: C 🗳️

What Snowflake features allow virtual warehouses to handle high concurrency workloads?
(Choose two.)

• A. The ability to scale up warehouses


• B. The use of warehouse auto scaling Most Voted
• C. The ability to resize warehouses
• D. Use of multi-clustered warehouses Most Voted
• E. The use of warehouse indexing

Correct Answer: BD 🗳️

Which COPY INTO command outputs the data into one file?

• A. SINGLE=TRUE Most Voted


• B. MAX_FILE_NUMBER=1
• C. FILE_NUMBER=1
• D. MULTIPLE=FALSE

Correct Answer: A 🗳️

In which scenarios would a user have to pay Cloud Services costs? (Choose two.)

• A. Compute Credits = 50 Credits Cloud Services = 10 Most Voted


• B. Compute Credits = 80 Credits Cloud Services = 5
• C. Compute Credits = 100 Credits Cloud Services = 9
• D. Compute Credits = 120 Credits Cloud Services = 10
• E. Compute Credits = 200 Credits Cloud Services = 26 Most Voted

Correct Answer: AE 🗳️

A user created a new worksheet within the Snowsight UI and wants to share this with
teammates.

How can this worksheet be shared?

• A. Create a zero-copy clone of the worksheet and grant permissions to teammates.


• B. Create a private Data Exchange so that any teammate can use the worksheet.
• C. Share the worksheet with teammates within Snowsight. Most Voted
• D. Create a database and grant all permissions to teammates.

Correct Answer: C 🗳️

How can a row access policy be applied to a table or a view? (Choose two.)

• A. Within the policy DDL


• B. Within the create table or create view DDL Most Voted
• C. By future APPLY for all objects in a schema
• D. Within a control table
• E. Using the command ALTER [object] ADD ROW ACCESS POLICY [policy]; Most Voted
Correct Answer: BE 🗳️
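
A sketch of both attachment styles (the policy, table, and column names are hypothetical):

```sql
-- At creation time, in the CREATE TABLE DDL:
CREATE TABLE orders (region STRING, amount NUMBER)
  WITH ROW ACCESS POLICY region_policy ON (region);

-- Or later, on an existing object:
ALTER TABLE orders ADD ROW ACCESS POLICY region_policy ON (region);
```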

Which command can be used to load data files into a Snowflake stage?

• A. JOIN
• B. COPY INTO
• C. PUT Most Voted
• D. GET

Correct Answer: C 🗳️

What types of data listings are available in the Snowflake Data Marketplace? (Choose two.)

• A. Reader
• B. Consumer
• C. Vendor
• D. Standard Most Voted
• E. Personalized Most Voted

Correct Answer: DE 🗳️

What is the maximum Time Travel retention period for a temporary Snowflake table?

• A. 90 days
• B. 1 day Most Voted
• C. 7 days
• D. 45 days

Correct Answer: B 🗳️

When should a multi-cluster warehouse be used in auto-scaling mode?

• A. When it is unknown how much compute power is needed


• B. If the select statement contains a large number of temporary tables or Common
Table Expressions (CTEs)
• C. If the runtime of the executed query is very slow
• D. When a large number of concurrent queries are run on the same warehouse Most Voted

Correct Answer: D 🗳️

What happens when a cloned table is replicated to a secondary database? (Choose two.)

• A. A read-only copy of the cloned tables is stored.


• B. The replication will not be successful.
• C. The physical data is replicated. Most Voted
• D. Additional costs for storage are charged to a secondary account. Most Voted
• E. Metadata pointers to cloned tables are replicated.

Correct Answer: CD 🗳️

Snowflake supports the use of external stages with which cloud platforms? (Choose three.)

• A. Amazon Web Services Most Voted


• B. Docker
• C. IBM Cloud
• D. Microsoft Azure Cloud Most Voted
• E. Google Cloud Platform Most Voted
• F. Oracle Cloud

Correct Answer: ADE 🗳️

What is a limitation of a Materialized View?

• A. A Materialized View cannot support any aggregate functions


• B. A Materialized View can only reference up to two tables
• C. A Materialized View cannot be joined with other tables
• D. A Materialized View cannot be defined with a JOIN Most Voted

Correct Answer: D 🗳️

In the Snowflake access control model, which entity owns an object by default?

• A. The user who created the object


• B. The SYSADMIN role
• C. Ownership depends on the type of object
• D. The role used to create the object Most Voted

Correct Answer: D 🗳️

What is the minimum Snowflake edition required to use Dynamic Data Masking?

• A. Standard
• B. Enterprise Most Voted
• C. Business Critical
• D. Virtual Private Snowflake (VPC)
Correct Answer: B 🗳️

Which services does the Snowflake Cloud Services layer manage? (Choose two.)

• A. Compute resources
• B. Query execution
• C. Authentication Most Voted
• D. Data storage
• E. Metadata Most Voted

Correct Answer: CE 🗳️

A company needs to allow some users to see Personally Identifiable Information (PII) while
limiting other users from seeing the full value of the PII.

Which Snowflake feature will support this?

• A. Row access policies


• B. Data masking policies Most Voted
• C. Data encryption
• D. Role based access control

Correct Answer: B 🗳️
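
A masking-policy sketch for this kind of partial visibility (the role, policy, table, and column names are hypothetical):

```sql
-- Show a partially redacted email to a privileged role, mask it for everyone else
CREATE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('FUNCTIONAL_MGR') THEN REGEXP_REPLACE(val, '^[^@]+', '*****')
    ELSE '*** MASKED ***'
  END;

ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask;
```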

A user has unloaded data from a Snowflake table to an external stage.

Which command can be used to verify if data has been uploaded to the external stage named
my_stage?

• A. view @my_stage
• B. list @my_stage Most Voted
• C. show @my_stage
• D. display @my_stage

Correct Answer: B 🗳️

Which tasks are performed in the Snowflake Cloud Services layer? (Choose two.)

• A. Management of metadata Most Voted


• B. Computing the data
• C. Maintaining Availability Zones
• D. Infrastructure security
• E. Parsing and optimizing queries Most Voted

Correct Answer: AE 🗳️

What is true about sharing data in Snowflake? (Choose two.)

• A. The Data Consumer pays for data storage as well as for data computing.
• B. The shared data is copied into the Data Consumer account, so the Consumer can
modify it without impacting the base data of the Provider.
• C. A Snowflake account can both provide and consume shared data. Most Voted
• D. The Provider is charged for compute resources used by the Data Consumer to query
the shared data.
• E. The Data Consumer pays only for compute resources to query the shared data. Most Voted

Correct Answer: CE 🗳️

The following JSON is stored in a VARIANT column called src of the CAR_SALES table:

A user needs to extract the dealership information from the JSON.

How can this be accomplished?

• A. select src:dealership from car_sales; Most Voted


• B. select src.dealership from car_sales;
• C. select src:Dealership from car_sales;
• D. select dealership from car_sales;

Correct Answer: A 🗳️

Which of the following significantly improves the performance of selective point lookup queries
on a table?

• A. Clustering
• B. Materialized Views
• C. Zero-copy Cloning
• D. Search Optimization Service Most Voted
Correct Answer: D 🗳️

Which of the following accurately describes shares?

• A. Tables, secure views, and secure UDFs can be shared Most Voted
• B. Shares can be shared
• C. Data consumers can clone a new table from a share
• D. Access to a share cannot be revoked once granted

Correct Answer: A 🗳️

What are best practice recommendations for using the ACCOUNTADMIN system-defined role in
Snowflake? (Choose two.)

• A. Ensure all ACCOUNTADMIN roles use Multi-factor Authentication (MFA).


• B. All users granted ACCOUNTADMIN role must be owned by the ACCOUNTADMIN role.
• C. The ACCOUNTADMIN role must be granted to only one user.
• D. Assign the ACCOUNTADMIN role to at least two users, but as few as possible.
• E. All users granted ACCOUNTADMIN role must also be granted SECURITYADMIN role.

Correct Answer: AD 🗳️

In the query profiler view for a query, which components represent areas that can be used to
help optimize query performance? (Choose two.)

• A. Bytes scanned Most Voted


• B. Bytes sent over the network
• C. Number of partitions scanned Most Voted
• D. Percentage scanned from cache Most Voted
• E. External bytes scanned

Correct Answer: AC 🗳️

What is the minimum Snowflake edition required for row level security?

• A. Standard
• B. Enterprise Most Voted
• C. Business Critical
• D. Virtual Private Snowflake

Correct Answer: B 🗳️

What is the minimum Fail-safe retention time period for transient tables?
• A. 1 day
• B. 7 days
• C. 12 hours
• D. 0 days Most Voted

Correct Answer: D 🗳️

What is a machine learning and data science partner within the Snowflake Partner Ecosystem?

• A. Informatica
• B. Power BI
• C. Adobe
• D. DataRobot Most Voted

Correct Answer: D 🗳️

Which statements are correct concerning the leveraging of third-party data from the Snowflake
Data Marketplace? (Choose two.)

• A. Data is live, ready-to-query, and can be personalized. Most Voted


• B. Data needs to be loaded into a cloud provider as a consumer account.
• C. Data is not available for copying or moving to an individual Snowflake account.
• D. Data is available without copying or moving. Most Voted
• E. Data transformations are required when combining Data Marketplace datasets with
existing data in Snowflake.

Correct Answer: AD 🗳️

What impacts the credit consumption of maintaining a materialized view? (Choose two.)

• A. Whether or not it is also a secure view


• B. How often the underlying base table is queried
• C. How often the base table changes Most Voted
• D. Whether the materialized view has a cluster key defined Most Voted
• E. How often the materialized view is queried

Correct Answer: CD 🗳️

What COPY INTO SQL command should be used to unload data into multiple files?

• A. SINGLE=TRUE
• B. MULTIPLE=TRUE
• C. MULTIPLE=FALSE
• D. SINGLE=FALSE Most Voted
Correct Answer: D 🗳️

When cloning a database containing stored procedures and regular views, that have fully
qualified table references, which of the following will occur?

• A. The cloned views and the stored procedures will reference the cloned tables in the
cloned database.
• B. An error will occur, as views with qualified references cannot be cloned.
• C. An error will occur, as stored objects cannot be cloned.
• D. The stored procedures and views will refer to tables in the source database. Most Voted

Correct Answer: D 🗳️
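
The behavior can be sketched with a minimal example (all object names here are illustrative, not from the question):

```sql
-- Source database with a view that uses a fully qualified table reference.
CREATE DATABASE src_db;
CREATE TABLE src_db.public.t1 (id INT);

-- The view hard-codes the source database name.
CREATE VIEW src_db.public.v1 AS
  SELECT id FROM src_db.public.t1;

-- Zero-copy clone of the whole database.
CREATE DATABASE clone_db CLONE src_db;

-- clone_db.public.v1 still selects from src_db.public.t1, because the fully
-- qualified reference was copied verbatim into the cloned view definition.
```

Using unqualified (or relative) references in views and stored procedures avoids this, since they then resolve against the cloned database.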

When loading data into Snowflake, how should the data be organized?

• A. Into single files with 100-250 MB of compressed data per file


• B. Into single files with 1-100 MB of compressed data per file
• C. Into files of maximum size of 1 GB of compressed data per file
• D. Into files of maximum size of 4 GB of compressed data per file

Correct Answer: A 🗳️

Which of the following objects can be directly restored using the UNDROP command? (Choose
two.)

• A. Schema Most Voted


• B. View
• C. Internal stage
• D. Table Most Voted
• E. User
• F. Role

Correct Answer: AD 🗳️

Which Snowflake SQL statement would be used to determine which users and roles have
access to a role called MY_ROLE?

• A. SHOW GRANTS OF ROLE MY_ROLE Most Voted


• B. SHOW GRANTS TO ROLE MY_ROLE
• C. SHOW GRANTS FOR ROLE MY_ROLE
• D. SHOW GRANTS ON ROLE MY_ROLE
Correct Answer: A 🗳️

What is the MINIMUM edition of Snowflake that is required to use a SCIM security integration?

• A. Business Critical Edition


• B. Standard Edition Most Voted
• C. Virtual Private Snowflake (VPS)
• D. Enterprise Edition

Correct Answer: B 🗳️

A user created a transient table and made several changes to it over the course of several
days. Three days after the table was created, the user would like to go back to the first version
of the table.

How can this be accomplished?

• A. Use Time Travel, as long as DATA_RETENTION_TIME_IN_DAYS was set to at least 3


days.
• B. The transient table version cannot be retrieved after 24 hours. Most Voted
• C. Contact Snowflake Support to have the data retrieved from Fail-safe storage.
• D. Use the FAIL_SAFE parameter for Time Travel to retrieve the data from Fail-safe
storage.

Correct Answer: B 🗳️

When reviewing the load for a warehouse using the load monitoring chart, the chart indicates
that a high volume of queries is always queuing in the warehouse.

According to recommended best practice, what should be done to reduce the queue volume?
(Choose two.)

• A. Use multi-clustered warehousing to scale out warehouse capacity. Most Voted


• B. Scale up the warehouse size to allow queries to execute faster.
• C. Stop and start the warehouse to clear the queued queries.
• D. Migrate some queries to a new warehouse to reduce load. Most Voted
• E. Limit user access to the warehouse so fewer queries are run against it.

Correct Answer: AD 🗳️

Which of the following features, associated with Continuous Data Protection (CDP), require
additional Snowflake-provided data storage? (Choose two.)

• A. Tri-Secret Secure
• B. Time Travel
• C. Fail-safe
• D. Data encryption
• E. External stages

Correct Answer: BC 🗳️

Where can a user find and review the failed logins of a specific user for the past 30 days?

• A. The USERS view in ACCOUNT_USAGE


• B. The LOGIN_HISTORY view in ACCOUNT_USAGE
• C. The ACCESS_HISTORY view in ACCOUNT_USAGE
• D. The SESSIONS view in ACCOUNT_USAGE

Correct Answer: B 🗳️

What is the purpose of an External Function?

• A. To call code that executes outside of Snowflake Most Voted


• B. To run a function in another Snowflake database
• C. To share data in Snowflake with external parties
• D. To ingest data from on-premises data sources

Correct Answer: A 🗳️

Which of the following statements apply to Snowflake in terms of security? (Choose two.)

• A. Snowflake leverages a Role-Based Access Control (RBAC) model. Most Voted


• B. Snowflake requires a user to configure an IAM user to connect to the database.
• C. All data in Snowflake is encrypted. Most Voted
• D. Snowflake can run within a user's own Virtual Private Cloud (VPC).
• E. All data in Snowflake is compressed.

Correct Answer: AC 🗳️

A single user of a virtual warehouse has set the warehouse to auto-resume and auto-suspend
after 10 minutes. The warehouse is currently suspended and the user performs the following
actions:

1. Runs a query that takes 3 minutes to complete


2. Leaves for 15 minutes
3. Returns and runs a query that takes 10 seconds to complete
4. Manually suspends the warehouse as soon as the last query was completed
When the user returns, how much billable compute time will have been consumed?

• A. 4 minutes
• B. 10 minutes
• C. 14 minutes Most Voted
• D. 24 minutes

Correct Answer: C 🗳️

Explanation: The warehouse runs for 3 minutes, then idles for 10 minutes before auto-suspending (13 minutes billed). The second query resumes the warehouse, which incurs the 60-second minimum billing charge, for a total of 14 minutes.

What can be used to view warehouse usage over time? (Choose two.)

• A. The LOAD HISTORY view


• B. The query history view
• C. The SHOW WAREHOUSES command
• D. The WAREHOUSE_METERING_HISTORY view Most Voted
• E. The billing and usage tab in the Snowflake web UI Most Voted

Correct Answer: DE 🗳️
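
As a sketch, warehouse usage over time can be pulled from the ACCOUNT_USAGE share like this (the 7-day window is an arbitrary choice for illustration):

```sql
-- Credits consumed per warehouse per hour over the last 7 days.
-- Requires access to the SNOWFLAKE.ACCOUNT_USAGE share.
SELECT warehouse_name,
       start_time,
       credits_used
FROM snowflake.account_usage.warehouse_metering_history
WHERE start_time >= DATEADD('day', -7, CURRENT_TIMESTAMP())
ORDER BY start_time;
```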

What actions will prevent leveraging of the ResultSet cache? (Choose two.)

• A. Removing a column from the query SELECT list Most Voted


• B. Stopping the virtual warehouse that the query is running against
• C. Clustering of the data used by the query Most Voted
• D. Executing the RESULT_SCAN() table function
• E. Changing a column that is not in the cached query

Correct Answer: AC 🗳️

Which statement is true about running tasks in Snowflake?

• A. A task can be called using a CALL statement to run a set of predefined SQL
commands.
• B. A task allows a user to execute a single SQL statement/command using a predefined
schedule. Most Voted
• C. A task allows a user to execute a set of SQL commands on a predefined schedule.
• D. A task can be executed using a SELECT statement to run a predefined SQL
command.

Correct Answer: B 🗳️

Which data types does Snowflake support when querying semi-structured data? (Choose two.)
• A. VARIANT Most Voted
• B. VARCHAR
• C. XML
• D. ARRAY Most Voted
• E. BLOB

Correct Answer: AD 🗳️

In an auto-scaling multi-cluster virtual warehouse with the setting SCALING_POLICY =


ECONOMY enabled, when is another cluster started?

• A. When the system has enough load for 2 minutes


• B. When the system has enough load for 6 minutes Most Voted
• C. When the system has enough load for 8 minutes
• D. When the system has enough load for 10 minutes

Correct Answer: B 🗳️

What is the following SQL command used for?

Select * from table(validate(t1, job_id => '_last'));

• A. To validate external table files in table t1 across all sessions


• B. To validate task SQL statements against table t1 in the last 14 days
• C. To validate a file for errors before it gets executed using a COPY command
• D. To return errors from the last executed COPY command into table t1 in the current
session Most Voted

Correct Answer: D 🗳️

A sales table FCT_SALES has 100 million records.

The following query was executed:

SELECT COUNT (1) FROM FCT_SALES;

How did Snowflake fulfill this query?

• A. Query against the result set cache


• B. Query against a virtual warehouse cache
• C. Query against the most-recently created micro-partition
• D. Query against the metadata cache Most Voted

Correct Answer: D 🗳️
What happens when a virtual warehouse is resized?

• A. When increasing the size of an active warehouse the compute resource for all
running and queued queries on the warehouse are affected.
• B. When reducing the size of a warehouse the compute resources are removed only
when they are no longer being used to execute any current statements. Most Voted
• C. The warehouse will be suspended while the new compute resource is provisioned
and will resume automatically once provisioning is complete.
• D. Users who are trying to use the warehouse will receive an error message until the
resizing is complete.

Correct Answer: B 🗳️

What tasks can be completed using the COPY command? (Choose two.)

• A. Columns can be aggregated.


• B. Columns can be joined with an existing table.
• C. Columns can be reordered. Most Voted
• D. Columns can be omitted. Most Voted
• E. Data can be loaded without the need to spin up a virtual warehouse.

Correct Answer: CD 🗳️

Which Snowflake layer can be configured?

• A. Database Storage
• B. Cloud Services
• C. Query Processing Most Voted
• D. Application Services

Correct Answer: C 🗳️

Query compilation occurs in which architecture layer of the Snowflake Cloud Data Platform?

• A. Compute layer
• B. Storage layer
• C. Cloud infrastructure layer
• D. Cloud services layer Most Voted

Correct Answer: D 🗳️

If a size Small virtual warehouse is made up of two servers, how many servers make up a Large
warehouse?
• A. 4
• B. 8 Most Voted
• C. 16
• D. 32

Correct Answer: B 🗳️

A clustering key was defined on a table, but it is no longer needed.

How can the key be removed?

• A. ALTER TABLE [TABLE NAME] PURGE CLUSTERING KEY


• B. ALTER TABLE [TABLE NAME] DELETE CLUSTERING KEY
• C. ALTER TABLE [TABLE NAME] DROP CLUSTERING KEY Most Voted
• D. ALTER TABLE [TABLE NAME] REMOVE CLUSTERING KEY

Correct Answer: C 🗳️
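
A minimal sketch of the full lifecycle, assuming an illustrative table and column names:

```sql
-- Define a clustering key on an existing table.
ALTER TABLE sales CLUSTER BY (sale_date, region);

-- Inspect how well the table is clustered on those columns.
SELECT SYSTEM$CLUSTERING_INFORMATION('sales', '(sale_date, region)');

-- Remove the clustering key when it is no longer needed.
ALTER TABLE sales DROP CLUSTERING KEY;
```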

What is a core benefit of clustering?

• A. To guarantee uniquely identifiable records in the database


• B. To increase scan efficiency in queries by improving pruning Most Voted
• C. To improve performance by creating a separate file for point lookups
• D. To provide data redundancy by duplicating micro-partitions

Correct Answer: B 🗳️

Which statement is true about Multi-Factor Authentication (MFA) in Snowflake?

• A. MFA can be enforced or applied for a given role.


• B. Snowflake users are automatically enrolled in MFA.
• C. Users enroll in MFA by submitting a request to Snowflake Support.
• D. MFA is an integrated Snowflake feature. Most Voted

Correct Answer: D 🗳️

What data type should be used to store JSON data natively in Snowflake?

• A. JSON
• B. String
• C. Object
• D. VARIANT Most Voted
Correct Answer: D 🗳️

What should be considered when deciding to use a Secure View? (Choose two.)

• A. No details of the query execution plan will be available in the query profiler. Most Voted
• B. Once created there is no way to determine if a view is secure or not.
• C. Secure views do not take advantage of the same internal optimizations as standard
views. Most Voted
• D. It is not possible to create secure materialized views.
• E. The view definition of a secure view is still visible to users by way of the information
schema.

Correct Answer: AC 🗳️

The information schema provides storage information for which of the following objects?
(Choose two.)

• A. Users
• B. Databases Most Voted
• C. Internal stages Most Voted
• D. Resource monitors
• E. Pipes Most Voted

Correct Answer: BC 🗳️

What is a responsibility of Snowflake’s virtual warehouses?

• A. Infrastructure management
• B. Metadata management
• C. Query execution Most Voted
• D. Query parsing and optimization
• E. Management of the storage layer

Correct Answer: C 🗳️

Which data type is supported by Snowflake data classification?

• A. Binary
• B. Float Most Voted
• C. Geography
• D. Variant
Correct Answer: B 🗳️

When unloading data to an external stage, which compression format can be used for Parquet
files with the COPY INTO command?

• A. BROTLI
• B. GZIP
• C. LZO Most Voted
• D. ZSTD

Correct Answer: C 🗳️

Which SQL command can be used to verify the privileges that are granted to a role?

• A. SHOW GRANTS ON ROLE


• B. SHOW ROLES
• C. SHOW GRANTS TO ROLE Most Voted
• D. SHOW GRANTS FOR ROLE

Correct Answer: C 🗳️

Which Query Profile result indicates that a warehouse is sized too small?

• A. There are a lot of filter nodes.


• B. Bytes are spilling to external storage.
• C. The number of processed rows is very high.
• D. The number of partitions scanned is the same as partitions total.

Correct Answer: B 🗳️

What is the default Time Travel retention period?

• A. 1 day Most Voted


• B. 7 days
• C. 45 days
• D. 90 days

Correct Answer: A 🗳️

Which of the following are best practice recommendations that should be considered when
loading data into Snowflake? (Choose two.)

• A. Load files that are approximately 25 MB or smaller.


• B. Remove all dates and timestamps.
• C. Load files that are approximately 100-250 MB (or larger). Most Voted
• D. Avoid using embedded characters such as commas for numeric data types. Most Voted
• E. Remove semi-structured data types.

Correct Answer: CD 🗳️

Which schema has the RESOURCE_MONITORS view?

• A. ACCOUNT_USAGE
• B. READER_ACCOUNT_USAGE Most Voted
• C. INFORMATION_SCHEMA
• D. WAREHOUSE_USAGE_SCHEMA

Correct Answer: B 🗳️

What is the purpose of enabling Federated Authentication on a Snowflake account?

• A. Disables the ability to use key pair and basic authentication (e.g.,
username/password) when connecting
• B. Allows dual Multi-Factor Authentication (MFA) when connecting to Snowflake
• C. Forces users to connect through a secure network proxy
• D. Allows users to connect using secure single sign-on (SSO) through an external
identity provider Most Voted

Correct Answer: D 🗳️

Which Snowflake partner category is represented at the top of this diagram (labeled 1)?

• A. Business Intelligence
• B. Machine Learning and Data Science
• C. Security and Governance
• D. Data Integration Most Voted

Correct Answer: D 🗳️
1. The fail-safe retention period is how many days?
• 1 day
• 7 days correct
• 45 days
• 90 days
--

2. True or False: A 4X-Large Warehouse may, at times, take longer to provision than an
X-Small Warehouse.
• True correct
• False
--

Explanation: Larger warehouses must provision more compute resources, so they can occasionally take longer to start. You can verify this behavior in the Snowflake UI.

3. How would you determine the size of the virtual warehouse used for a task?
• Root task may be executed concurrently (i.e. multiple instances), it is recommended to
leave some margins in the execution window to avoid missing instances of execution
• Querying (select) the size of the stream content would help determine the warehouse
size. For example, if querying large stream content, use a larger warehouse size
• If using the stored procedure to execute multiple SQL statements, it's best to test run the
stored procedure separately to size the compute resource first
• Since task infrastructure is based on running the task body on schedule, it's
recommended to configure the virtual warehouse for automatic concurrency
handling using Multi-cluster warehouse (MCW) to match the task schedule correct

4. The Information Schema and Account Usage Share provide storage information for
which of the following objects? (Choose three.)
• Users
• Tables correct
• Databases correct
• Internal Stages correct

5. What is the default File Format used in the COPY command if one is not specified?
• CSV correct
• JSON
• Parquet
• XML

6. True or False: Reader Accounts are able to extract data from shared data objects for
use outside of Snowflake.
• True
• False correct

7. True or False: Loading data into Snowflake requires that source data files be no larger
than 16MB.
• True
• False correct

Explanation:
There is no 16 MB limit on source files when loading data; the recommended practice is files of
roughly 100-250 MB compressed. The 16 MB figure is the default MAX_FILE_SIZE for unloading: by
default, COPY INTO location statements separate table data into a set of output files to take
advantage of parallel operations, with a maximum size per file of 16777216 bytes (16 MB). This
can be increased, up to 5 GB per file for Amazon S3, Google Cloud Storage, or Microsoft Azure
stages. To unload data to a single output file (at the potential cost of decreased performance),
specify the SINGLE = TRUE copy option, optionally naming the file in the path.
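
These unload options can be sketched as follows (the stage and table names are illustrative):

```sql
-- Default unload: data is split into multiple files of at most 16 MB each.
COPY INTO @my_stage/unload/ FROM my_table;

-- Raise the per-file cap (here to ~100 MB) to produce fewer, larger files.
COPY INTO @my_stage/unload/ FROM my_table MAX_FILE_SIZE = 104857600;

-- Force a single output file (may reduce unload parallelism).
COPY INTO @my_stage/unload/result.csv.gz FROM my_table SINGLE = TRUE;
```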

8. True or False: A Virtual Warehouse can be resized while suspended.


• True correct
• False

9. True or False: When you create a custom role, it is a best practice to immediately grant
that role to ACCOUNTADMIN.
• True
• False correct

10. What are two ways to create and manage Data Shares in Snowflake? (Choose two.)
• Via the Snowflake Web Interface (UI) correct
• Via the data_share=true parameter
• Via SQL commands correct
• Via Virtual Warehouses

11. True or False: Fail-safe can be disabled within a Snowflake account.


• True
• False correct

Explanation: Separate and distinct from Time Travel, Fail-safe ensures historical data is
protected in the event of a system failure or other catastrophic event, e.g. a hardware failure or
security breach. The Fail-safe feature cannot be enabled or disabled by the user.
12. True or False: It is possible for a user to run a query against the query result cache
without requiring an active Warehouse.
• True correct
• False

Explanation: The query result cache is served from the cloud services layer, so cached results
can be returned without resuming (and paying for) a virtual warehouse.
13. A virtual warehouse's auto-suspend and auto-resume settings apply to which of the
following?
• The primary cluster in the virtual warehouse
• The entire virtual warehouse correct
• The database in which the virtual warehouse resides
• The Queries currently being run on the virtual warehouse

14. Which of the following Snowflake features provide continuous data protection
automatically? (Select TWO).
• Internal stages
• Incremental backups
• Time Travel correct
• Zero-copy clones
• Fail-safe correct
Explanation: Time Travel and Fail-safe are the two Continuous Data Protection features that
support the recovery of data automatically. Snowflake provides powerful CDP features for ensuring the
maintenance and availability of your historical data (i.e. data that has been changed or deleted):
Querying, cloning, and restoring historical data in tables, schemas, and databases for up to 90
days through Snowflake Time Travel. Disaster recovery of historical data (by Snowflake) through
Snowflake Fail-safe.

15. Which of the following conditions must be met in order to return results from the
results cache? (Select TWO).
• The user has the appropriate privileges on the objects associated with the query correct
• Micro-partitions have been reclustered since the query was last run
• The new query is run using the same virtual warehouse as the previous query
• The query includes a User Defined Function (UDF)
• The query has been run within 24 hours of the previously-run query correct

16. Which of the following are benefits of micro-partitioning? (Select TWO)


• Micro-partitions cannot overlap in their range of values
• Micro-partitions are immutable objects that support the use of Time Travel. correct
• Micro-partitions can reduce the amount of I/O from object storage to virtual
warehouses correct
• Rows are automatically stored in sorted order within micro-partitions
• Micro-partitions can be defined on a schema-by-schema basis

17. What is the minimum Snowflake edition required to create a materialized view?
• Standard Edition
• Enterprise Edition correct
• Business Critical Edition
• Virtual Private Snowflake Edition

Explanation: Materialized views require Enterprise Edition. To inquire about upgrading, please
contact Snowflake Support
18. What happens to the underlying table data when a CLUSTER BY clause is added to a
Snowflake table?
• Data is hashed by the cluster key to facilitate fast searches for common data values
• Larger micro-partitions are created for common data values to reduce the number of
partitions that must be scanned
• Smaller micro-partitions are created for common data values to allow for more parallelism
• Data may be colocated by the cluster key within the micro-partitions to improve
pruning performance correct

19. Which feature is only available in the Enterprise or higher editions of Snowflake?
• Column-level security correct
• SOC 2 type II certification
• Multi-factor Authentication (MFA)
• Object-level access control

20. Which of the following are valid methods for authenticating users for access into
Snowflake? (Select THREE)
• SCIM
• Federated authentication correct
• TLS 1.2
• Key-pair authentication correct
• OAuth correct
• OCSP authentication
--

21. During periods of warehouse contention which parameter controls the maximum
length of time a warehouse will hold a query for processing?
• STATEMENT_TIMEOUT_IN_SECONDS
• STATEMENT_QUEUED_TIMEOUT_IN_SECONDS correct
• MAX_CONCURRENCY_LEVEL
• QUERY_TIMEOUT_IN_SECONDS
--

Explanation: The parameter STATEMENT_QUEUED_TIMEOUT_IN_SECONDS sets the limit for


a query to wait in the queue in order to get its chance of running on the warehouse. The query
will quit after reaching this limit. By default, the value of this parameter is 0, which means
queries will wait indefinitely in the waiting queue.
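
As a sketch, the parameter can be set at the warehouse or session level (the warehouse name and timeout values here are illustrative):

```sql
-- Queries queued on this warehouse for more than 5 minutes are cancelled.
ALTER WAREHOUSE my_wh SET STATEMENT_QUEUED_TIMEOUT_IN_SECONDS = 300;

-- The parameter can also be set per session; 0 (the default) means
-- queued statements wait indefinitely.
ALTER SESSION SET STATEMENT_QUEUED_TIMEOUT_IN_SECONDS = 120;
```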
22. Which of the following indicates that it may be appropriate to use a clustering key for a
table? (Select TWO).
• The table contains a column that has very low cardinality
• DML statements that are being issued against the table are blocked
• The table has a small number of micro-partitions
• Queries on the table are running slower than expected correct
• The clustering depth for the table is large correct
--

23. Which Snowflake object enables loading data from files as soon as they are available
in a cloud storage location?
• Pipe correct
• External stage
• Task
• Stream
--

Explanation: Snowpipe enables loading data from files as soon as they’re available in a stage.
This means you can load data from files in micro-batches, making it available to users within
minutes, rather than manually executing COPY statements on a schedule to load larger batches.
https://docs.snowflake.com/en/user-guide/data-load-snowpipe-intro.html
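
A minimal Snowpipe auto-ingest setup might look like this (the stage URL, integration, and object names are illustrative assumptions):

```sql
-- External stage pointing at the cloud storage location.
CREATE STAGE my_ext_stage
  URL = 's3://my-bucket/landing/'
  STORAGE_INTEGRATION = my_s3_int;

-- Pipe that loads new files automatically via cloud event notifications.
CREATE PIPE my_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO my_table
  FROM @my_ext_stage
  FILE_FORMAT = (TYPE = 'CSV');
```

With AUTO_INGEST = TRUE, the cloud provider's event notifications trigger the COPY as soon as files land in the bucket, rather than on a schedule.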
24. A user needs to create a materialized view in the schema MYDB.MYSCHEMA.

Which statements will provide this access?


• GRANT ROLE MYROLE TO USER USER1;
GRANT CREATE MATERIALIZED VIEW ON SCHEMA MYDB.MYSCHEMA TO ROLE MYROLE; correct
• GRANT ROLE MYROLE TO USER USER1;
GRANT CREATE MATERIALIZED VIEW ON SCHEMA MYDB.MYSCHEMA TO USER USER1;
• GRANT ROLE MYROLE TO USER USER1;
GRANT CREATE MATERIALIZED VIEW ON SCHEMA MYDB.MYSCHEMA TO USER1;
• GRANT ROLE MYROLE TO USER USER1;
GRANT CREATE MATERIALIZED VIEW ON SCHEMA MYDB.MYSCHEMA TO MYROLE;
--

25. What is the default character set used when loading CSV files into Snowflake?
• UTF-8 correct
• UTF-16
• ISO 8859-1
• ANSI_X3.4
--

For delimited files (CSV, TSV, etc.), the default character set is UTF-8. To use any other
characters sets, you must explicitly specify the encoding to use for loading. For the list of
supported character sets, see Supported Character Sets for Delimited Files (in this topic).

26. A sales table FCT_SALES has 100 million records.

The following Query was executed

SELECT COUNT (1) FROM FCT_SALES;

How did Snowflake fulfill this query?


• Query against the result set cache
• Query against a virtual warehouse cache
• Query against the most-recently created micro-partition
• Query against the metadata cache correct
--

27. Which cache type is used to cache data output from SQL queries?
• Metadata cache
• Result cache correct
• Remote cache
• Local file cache
--

28. What is a key feature of Snowflake architecture?


• Zero-copy cloning creates a mirror copy of a database that updates with the original
• Software updates are automatically applied on a quarterly basis
• Snowflake eliminates resource contention with its virtual warehouse
implementation correct
• Multi-cluster warehouses allow users to run a query that spans across multiple clusters
• Snowflake automatically sorts DATE columns during ingest for fast retrieval by date
--

29. What is a limitation of a Materialized View?


• A Materialized View cannot support any aggregate functions
• A Materialized View can only reference up to two tables
• A Materialized View cannot be joined with other tables
• A Materialized View cannot be defined with a JOIN correct
--

Explanation: There are several limitations to using materialized views: a materialized view can
query only a single table. Joins, including self-joins, are not supported.
30. What features does Snowflake Time Travel enable?
• Querying data-related objects that were created within the past 365 days
• Restoring data-related objects that have been deleted within the past 90 days
correct
• Conducting point-in-time analysis for Bl reporting
• Analyzing data usage/manipulation over all periods of time
--

Explanation: Snowflake Time Travel enables accessing historical data (i.e. data that has been
changed or deleted) at any point within a defined period. It serves as a powerful tool for
performing the following tasks: Restoring data-related objects (tables, schemas, and databases)
that might have been accidentally or intentionally deleted. Duplicating and backing up data from
key points in the past. Analyzing data usage/manipulation over specified periods of time.
https://docs.snowflake.com/en/user-guide/data-time-travel.html
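
The Time Travel tasks above can be sketched as follows (table name and timestamps are illustrative):

```sql
-- Query a table as it was 30 minutes ago.
SELECT * FROM my_table AT(OFFSET => -60 * 30);

-- Query a table as of a specific point in time.
SELECT * FROM my_table AT(TIMESTAMP => '2023-01-15 08:00:00'::TIMESTAMP_LTZ);

-- Restore an accidentally dropped table (within its retention period).
DROP TABLE my_table;
UNDROP TABLE my_table;
```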
31. Which statement about billing applies to Snowflake credits?
• Credits are billed per-minute with a 60-minute minimum
• Credits are used to pay for cloud data storage usage
• Credits are consumed based on the number of credits billed for each hour that a
warehouse runs
• Credits are consumed based on the warehouse size and the time the warehouse is
running correct
--

Explanation: Snowflake credits are used to pay for the consumption of resources on Snowflake.
A Snowflake credit is a unit of measure, and it is consumed only when a customer is using
resources, such as when a virtual warehouse is running, the cloud services layer is performing
work, or serverless features are used.
32. What Snowflake features allow virtual warehouses to handle high concurrency
workloads? (Select TWO)
• The ability to scale up warehouses
• The use of warehouse auto scaling correct
• The ability to resize warehouses
• Use of multi-clustered warehouses correct
• The use of warehouse indexing
--

33. When reviewing the load for a warehouse using the load monitoring chart, the chart
indicates that a high volume of Queries are always queuing in the warehouse.

According to recommended best practice, what should be done to reduce the Queue
volume? (Select TWO).
• Use multi-clustered warehousing to scale out warehouse capacity. correct
• Scale up the warehouse size to allow Queries to execute faster.
• Stop and start the warehouse to clear the queued queries
• Migrate some queries to a new warehouse to reduce load correct
• Limit user access to the warehouse so fewer queries are run against it.
--

34. Which of the following objects can be shared through secure data sharing?
• Masking policy
• Stored procedure
• Task
• External table correct
--
Explanation: Secure Data Sharing enables sharing selected objects in a database in your
account with other Snowflake accounts. The following Snowflake database objects can be
shared:
✑ Tables

✑ External tables

✑ Secure views

✑ Secure materialized views

✑ Secure UDFs

Snowflake enables the sharing of databases through shares, which are created by data providers
and “imported” by data consumers.

35. Which of the following commands cannot be used within a reader account?
• CREATE SHARE correct
• ALTER WAREHOUSE correct
• DROP ROLE correct
• SHOW SCHEMAS
• DESCRBE TABLE
--

36. A user unloaded a Snowflake table called mytable to an internal stage called mystage.

Which command can be used to view the list of files that has been uploaded to the
staged?
• list @mytable;
• list @%mytable;
• list @%mystage;
• list @mystage; correct
--

37. Which of the following Snowflake capabilities are available in all Snowflake editions?
(Select TWO)
• Customer-managed encryption keys through Tri-Secret Secure
• Automatic encryption of all data correct
• Up to 90 days of data recovery through Time Travel
• Object-level access control correct
• Column-level security to apply data masking policies to tables and views
--

38. Which command is used to unload data from a Snowflake table into a file in a stage?
• COPY INTO correct
• GET
• WRITE
• EXTRACT INTO
--

39. How often are encryption keys automatically rotated by Snowflake?


• 30 Days correct
• 60 Days
• 90 Days
• 365 Days
--

40. What are value types that a VARIANT column can store? (Select TWO)
• STRUCT
• OBJECT correct
• BINARY
• ARRAY correct
• CLOB
--

Explanation: Characteristics of a VARIANT: a VARIANT can store a value of any other type,
including OBJECT and ARRAY. The maximum length of a VARIANT is 16 MB.
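
A small sketch of storing and traversing a VARIANT (the table and JSON contents are illustrative):

```sql
-- A VARIANT column holding a JSON document.
CREATE TABLE events (payload VARIANT);
INSERT INTO events
  SELECT PARSE_JSON(
    '{"user": "alice", "tags": ["a", "b"], "meta": {"ip": "10.0.0.1"}}');

-- Traverse OBJECT fields with ':' / '.' and ARRAY elements with [n],
-- casting each extracted value to a SQL type.
SELECT payload:user::STRING    AS user_name,
       payload:tags[0]::STRING AS first_tag,
       payload:meta.ip::STRING AS ip_addr
FROM events;
```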
41. A user has an application that writes a new file to a cloud storage location every 5
minutes.

What would be the MOST efficient way to get the files into Snowflake?
• Create a task that runs a copy into operation from an external stage every 5 minutes
• Create a task that puts the files in an internal stage and automate the data loading wizard
• Create a task that runs a GET operation to intermittently check for new files
• Set up cloud provider notifications on the file location and use Snowpipe with
auto-ingest correct
--

42. Which of the following are best practice recommendations that should be considered
when loading data into Snowflake? (Select TWO).
• Load files that are approximately 25 MB or smaller.
• Remove all dates and timestamps.
• Load files that are approximately 100-250 MB (or larger) correct
• Avoid using embedded characters such as commas for numeric data types correct
• Remove semi-structured data types
--

43. A user has 10 files in a stage containing new customer data.

The ingest operation completes with no errors, using the following command: COPY INTO
my_table FROM @my_stage;

The next day the user adds 10 files to the stage so that now the stage contains a mixture
of new

customer data and updates to the previous data. The user did not remove the 10 original
files.

If the user runs the same copy into command what will happen?
• All data from all of the files on the stage will be appended to the table
• Only data about new customers from the new files will be appended to the table
• The operation will fail with the error uncertain files in stage.
• All data from only the newly-added files will be appended to the table. correct
--
44. A user has unloaded data from Snowflake to a stage.

Which SQL command should be used to validate which data was loaded into the stage?
• list @file_stage correct
• show @file_stage
• view @file_stage
• verify @file_stage
--

45. What happens when a cloned table is replicated to a secondary database? (Select
TWO)
• A read-only copy of the cloned tables is stored.
• The replication will not be successful.
• The physical data is replicated correct
• Additional costs for storage are charged to a secondary account correct
• Metadata pointers to cloned tables are replicated
--

46. Which data types does Snowflake support when querying semi-structured data?
(Select TWO)
• VARIANT correct
• ARRAY correct
• VARCHAR
• XML
• BLOB
--

Explanation: A VARIANT stores semi-structured data in Snowflake. It can store a value of any
other type, including OBJECT and ARRAY. The maximum length of a VARIANT is 16 MB.
A Snowflake ARRAY is similar to an array in many other programming languages. An ARRAY
contains 0 or more pieces of data. Each element is accessed by specifying its position in the
array.

47. Which of the following describes how multiple Snowflake accounts in a single
organization relate to various cloud providers?
• Each Snowflake account can be hosted in a different cloud vendor and region.
correct
• Each Snowflake account must be hosted in a different cloud vendor and region
• All Snowflake accounts must be hosted in the same cloud vendor and region
• Each Snowflake account can be hosted in a different cloud vendor, but must be in the
same region.
--

48. A user is loading JSON documents composed of a huge array containing multiple
records into Snowflake. The user enables the STRIP_OUTER_ARRAY file format option.

What does the STRIP_OUTER_ARRAY file format do?


• It removes the last element of the outer array.
• It removes the outer array structure and loads the records into separate table
rows correct
• It removes the trailing spaces in the last element of the outer array and loads the records
into separate table columns
• It removes the NULL elements from the JSON object eliminating invalid data and enables
the ability to load the records
--

Explanation: Data Size Limitations. The VARIANT data type imposes a 16 MB size limit on
individual rows. For some semi-structured data formats (e.g. JSON), data sets are frequently a
simple concatenation of multiple documents. The JSON output from some software is composed
of a single huge array containing multiple records. There is no need to separate the documents
with line breaks or commas, though both are supported.
If the data exceeds 16 MB, enable the STRIP_OUTER_ARRAY file format option for the COPY
INTO <table> command to remove the outer array structure and load the records into separate
table rows:

copy into <table>
from @~/<file>.json
file_format = (type = 'JSON' strip_outer_array = true);

49. What are the default Time Travel and Fail-safe retention periods for transient tables?
• Time Travel - 1 day. Fail-safe - 1 day
• Time Travel - 0 days. Fail-safe - 1 day
• Time Travel - 1 day. Fail-safe - 0 days
• Transient tables are retained in neither Fail-safe nor Time Travel correct
--

50. What is a best practice after creating a custom role?


• Create the custom role using the SYSADMIN role.
• Assign the custom role to the SYSADMIN role correct
• Assign the custom role to the PUBLIC role
• Add _CUSTOM to all custom role names
--

Explanation: When creating roles that will serve as the owners of securable objects in the
system, Snowflake recommends creating a hierarchy of custom roles, with the top-most custom
role assigned to the system role SYSADMIN. This role structure allows system administrators to
manage all objects in the account, such as warehouses and database objects, while restricting
management of users and roles to the USERADMIN role.
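The recommended hierarchy can be sketched as follows (ANALYST is a hypothetical role name):

```sql
-- USERADMIN (or SECURITYADMIN) creates the custom role
USE ROLE USERADMIN;
CREATE ROLE ANALYST;

-- Attach the custom role to the SYSADMIN hierarchy so that
-- SYSADMIN can manage any objects the role will own
GRANT ROLE ANALYST TO ROLE SYSADMIN;
```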

51. Which of the following Snowflake objects can be shared using a secure share? (Select
TWO).
• Materialized views
• Sequences
• Procedures
• Tables correct
• Secure User Defined Functions (UDFs) correct
--

52. Will data cached in a warehouse be lost when the warehouse is resized?
• Possibly, if the warehouse is resized to a smaller size and the cache no longer fits.
• Yes, because the compute resource is replaced in its entirety with a new compute
resource.
• No, because the size of the cache is independent from the warehouse size correct
• Yes, because the new compute resource will no longer have access to the cache
encryption key
--
53. Which Snowflake partner specializes in data catalog solutions?
• Alation correct
• DataRobot
• dbt
• Tableau
--

Explanation: Alation provides Data Cataloging functionality. They state they are the 'One Place to
Find, Understand, & Govern Data Across an Enterprise
54. What is the MOST performant file format for loading data in Snowflake?
• CSV (Unzipped)
• Parquet correct
• CSV (Gzipped)
• ORC
--

55. Which COPY INTO command option outputs the data into one file?
• SINGLE=TRUE correct
• MAX_FILE_NUMBER=1
• FILE_NUMBER=1
• MULTIPLE=FALSE
--
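For reference, the copy option that produces a single output file on unload is SINGLE; a sketch with illustrative names:

```sql
-- Unload an entire table into one gzip-compressed CSV file
COPY INTO @my_stage/export/data.csv.gz
  FROM my_table
  FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP)
  SINGLE = TRUE;
```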

56. Where would a Snowflake user find information about query activity from 90 days
ago?
• ACCOUNT_USAGE.QUERY_HISTORY view correct
• ACCOUNT_USAGE.QUERY_HISTORY_ARCHIVE view
• INFORMATION_SCHEMA.QUERY_HISTORY view
• INFORMATION_SCHEMA.QUERY_HISTORY_BY_SESSION view
--
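A sketch of such a query against the shared SNOWFLAKE database (the ACCOUNT_USAGE QUERY_HISTORY view retains a full year of history, far longer than the Information Schema table functions):

```sql
SELECT query_id, user_name, query_text, start_time
FROM SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY
WHERE start_time >= DATEADD(day, -90, CURRENT_TIMESTAMP())
ORDER BY start_time;
```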

57. Which Snowflake technique can be used to improve the performance of a query?
• Clustering correct
• Indexing
• Fragmenting
• Using INDEX_HINTS
--
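A clustering key is defined per table; a minimal sketch with an illustrative table and columns:

```sql
-- Designate clustering columns for a large table
ALTER TABLE sales CLUSTER BY (sale_date, region);

-- Check how well the table is clustered on those columns
SELECT SYSTEM$CLUSTERING_INFORMATION('sales', '(sale_date, region)');
```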

58. User-level network policies can be created by which of the following roles? (Select
TWO).
• ROLEADMIN
• ACCOUNTADMIN correct
• SYSADMIN
• SECURITYADMIN correct
• USERADMIN
--

59. Which command can be used to load data into an internal stage?
• LOAD
• copy
• GET
• PUT correct
--
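PUT is issued from a client such as SnowSQL (it is not available in the web worksheets); an illustrative upload to a table stage, with assumed file and table names:

```sql
-- Upload a local file to the table stage of MY_TABLE;
-- the file is gzip-compressed and encrypted automatically
PUT file:///tmp/data/customers.csv @%my_table AUTO_COMPRESS = TRUE;
```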

60. What happens when an external or an internal stage is dropped? (Select TWO).
• When dropping an external stage, the files are not removed and only the stage is
dropped correct
• When dropping an external stage, both the stage and the files within the stage are
removed
• When dropping an internal stage, the files are deleted with the stage and the files are
recoverable
• When dropping an internal stage, the files are deleted with the stage and the files
are not recoverable correct
• When dropping an internal stage, only selected files are deleted with the stage and are
not recoverable
--

61. How long is Snowpipe data load history retained?


• As configured in the create pipe settings
• Until the pipe is dropped
• 64 days
• 14 days correct
--

62. In which use cases does Snowflake apply egress charges?


• Data sharing within a specific region correct
• Query result retrieval correct
• Database replication correct
• Loading data into Snowflake
--

63. Which ACCOUNT_USAGE views are used to evaluate the details of dynamic data
masking? (Select TWO)
• ROLES
• POLICY_REFERENCES correct
• QUERY_HISTORY
• RESOURCE_MONITORS
• ACCESS_HISTORY correct
--

64. Query compilation occurs in which architecture layer of the Snowflake Cloud Data
Platform?
• Compute layer
• Storage layer
• Cloud infrastructure layer
• Cloud services layer correct
--

65. Which is the MINIMUM required Snowflake edition that a user must have if they want to
use AWS/Azure Privatelink or Google Cloud Private Service Connect?
• Standard
• Premium
• Enterprise correct
• Business Critical
--
66. In the query profiler view for a query, which components represent areas that can be
used to help optimize query performance? (Select TWO)
• Bytes scanned correct
• Bytes sent over the network
• Number of partitions scanned correct
• Percentage scanned from cache
• External bytes scanned
--

67. A marketing co-worker has requested the ability to change a warehouse size on their
medium virtual warehouse called MKTG_WH.

Which of the following statements will accommodate this request?


• ALLOW RESIZE ON WAREHOUSE MKTG_WH TO USER MKTG_LEAD;
• GRANT MODIFY ON WAREHOUSE MKTG_WH TO ROLE MARKETING; correct
• GRANT MODIFY ON WAREHOUSE MKTG_WH TO USER MKTG_LEAD;
• GRANT OPERATE ON WAREHOUSE MKTG_WH TO ROLE MARKETING;
--
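The MODIFY privilege is what permits resizing; a sketch of the grant and the subsequent resize (warehouse and role names from the question):

```sql
GRANT MODIFY ON WAREHOUSE MKTG_WH TO ROLE MARKETING;

-- A user with the MARKETING role can then resize the warehouse:
ALTER WAREHOUSE MKTG_WH SET WAREHOUSE_SIZE = 'LARGE';
```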

68. When reviewing a query profile, what is a symptom that a query is too large to fit into
the memory?
• A single join node uses more than 50% of the query time
• Partitions scanned is equal to partitions total
• An Aggregate Operator node is present
• The query is spilling to remote storage correct
--

69. Which stage type can be altered and dropped?


• Database stage
• External stage correct
• Table stage
• User stage
--

70. From which Snowflake interface can the PUT command be used to stage local files?
• SnowSQL correct
• Snowflake classic web interface (UI)
• Snowsight
• .NET driver
--

71. What is the recommended file sizing for data loading using Snowpipe?
• A compressed file size greater than 100 MB, and up to 250 MB correct
• A compressed file size greater than 100 GB, and up to 250 GB
• A compressed file size greater than 10 MB, and up to 100 MB
• A compressed file size greater than 1 GB, and up to 2 GB
--

72. Which services does the Snowflake Cloud Services layer manage? (Select TWO).
• Compute resources
• Query execution
• Authentication correct
• Data storage
• Metadata correct
--

Explanation: The cloud services layer is a collection of services that coordinate activities across
Snowflake. These services tie together all of the different components of Snowflake in order to
process user requests, from login to query dispatch. The cloud services layer also runs on
compute instances provisioned by Snowflake from the cloud provider.
Services managed in this layer include:

✑ Authentication

✑ Infrastructure management

✑ Metadata management

✑ Query parsing and optimization

✑ Access control

73. What data is stored in the Snowflake storage layer? (Select TWO).
• Snowflake parameters
• Micro-partitions correct
• Query history
• Persisted query results correct
• Standard and secure view results
--

74. In which scenarios would a user have to pay Cloud Services costs? (Select TWO).
• Compute Credits = 50 Credits Cloud Services = 10 correct
• Compute Credits = 80 Credits Cloud Services = 5
• Compute Credits = 10 Credits Cloud Services = 9
• Compute Credits = 120 Credits Cloud Services = 10
• Compute Credits = 200 Credits Cloud Services = 26 correct
--

75. What transformations are supported in a CREATE PIPE ... AS COPY ... FROM (....)
statement? (Select TWO.)
• Data can be filtered by an optional where clause correct
• Incoming data can be joined with other tables
• Columns can be reordered
• Columns can be omitted correct
• Row level access can be defined
--
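A sketch of a pipe whose COPY statement omits and reorders columns (object names and positions are illustrative; the optional filter described in the answer would go in the inner SELECT):

```sql
CREATE PIPE my_pipe AUTO_INGEST = TRUE AS
  COPY INTO my_table (id, amount)
  FROM (
    -- select, reorder, and cast a subset of the staged columns
    SELECT $1::NUMBER, $3::NUMBER(10,2)
    FROM @my_stage
  )
  FILE_FORMAT = (TYPE = CSV);
```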

76. What is a responsibility of Snowflake's virtual warehouses?


• Infrastructure management
• Metadata management
• Query execution correct
• Query parsing and optimization
• Management of the storage layer
--
77. Which of the following compute resources or features are managed by Snowflake?
(Select TWO).
• Execute a COPY command
• Updating data
• Snowpipe correct
• AUTOMATIC_CLUSTERING correct
• Scaling up a warehouse
--

78. What happens when a virtual warehouse is resized?


• When increasing the size of an active warehouse the compute resource for all running
and queued queries on the warehouse are affected
• When reducing the size of a warehouse the compute resources are removed only
when they are no longer being used to execute any current statements. correct
• The warehouse will be suspended while the new compute resource is provisioned and
will resume automatically once provisioning is complete.
• Users who are trying to use the warehouse will receive an error message until the
resizing is complete
--

79. A developer is granted ownership of a table that has a masking policy. The developer's
role is not able to see the masked data.

Will the developer be able to modify the table to read the masked data?
• Yes, because a table owner has full control and can unset masking policies. correct
• Yes, because masking policies only apply to cloned tables.
• No, because masking policies must always reference specific access roles.
• No, because ownership of a table does not include the ability to change masking policies
--

80. Which of the following describes how clustering keys work in Snowflake?
• Clustering keys update the micro-partitions in place with a full sort, and impact the DML
operations.
• Clustering keys sort the designated columns over time, without blocking DML
operations correct
• Clustering keys create a distributed, parallel data structure of pointers to a table's rows
and columns
• Clustering keys establish a hashed key on each node of a virtual warehouse to optimize
joins at run-time
--

81. What is a machine learning and data science partner within the Snowflake Partner
Ecosystem?
• Informatica
• Power BI
• Adobe
• DataRobot correct
--

82. Which of the following is a valid source for an external stage when the Snowflake
account is located on Microsoft Azure?
• An FTP server with TLS encryption
• An HTTPS server with WebDAV
• A Google Cloud storage bucket correct
• A Windows server file share on Azure
--

83. Which data type can be used to store geospatial data in Snowflake?
• Variant
• Object
• Geometry
• Geography correct
--

84. What can be used to view warehouse usage over time? (Select Two).
• The LOAD_HISTORY view
• The QUERY_HISTORY view
• The SHOW WAREHOUSES command
• The WAREHOUSE_METERING_HISTORY view correct
• The billing and usage tab in the Snowflake web UI correct
--

85. Which Snowflake feature is used for both querying and restoring data?
• Cluster keys
• Time Travel correct
• Fail-safe
• Cloning
--

86. A company strongly encourages all Snowflake users to self-enroll in Snowflake's


default Multi-Factor Authentication (MFA) service to provide increased login security for
users connecting to Snowflake.

Which application will the Snowflake users need to install on their devices in order to
connect with MFA?
• Okta Verify
• Duo Mobile correct
• Microsoft Authenticator
• Google Authenticator
--

87. Which Snowflake objects track DML changes made to tables, like inserts, updates, and
deletes?
• Pipes
• Streams correct
• Tasks
• Procedures
--

88. What tasks can be completed using the copy command? (Select TWO)
• Columns can be aggregated
• Columns can be joined with an existing table
• Columns can be reordered correct
• Columns can be omitted correct
• Data can be loaded without the need to spin up a virtual warehouse
--
89. What feature can be used to reorganize a very large table on one or more columns?
• Micro-partitions
• Clustering keys correct
• Key partitions
• Clustered partitions
--

90. What SQL command would be used to view all roles that were granted to USER1?
• show grants to user USER1; correct
• show grants of user USER1;
• describe user USER1;
• show grants on user USER1;
--

91. Which of the following can be executed/called with Snowpipe?


• A User Defined Function (UDF)
• A stored procedure
• A single COPY INTO statement correct
• A single INSERT INTO statement
--

92. What Snowflake role must be granted for a user to create and manage accounts?
• ACCOUNTADMIN
• ORGADMIN correct
• SECURITYADMIN
• SYSADMIN
--

93. When unloading to a stage, which of the following is a recommended practice or


approach?
• Set SINGLE = TRUE for larger files
• Use OBJECT_CONSTRUCT ( * ) when using Parquet correct
• Avoid the use of the CAST function
• Define an individual file format
--

94. When is the result set cache no longer available? (Select TWO)
• When another warehouse is used to execute the query
• When another user executes the query
• When the underlying data has changed correct
• When the warehouse used to execute the query is suspended
• When it has been 24 hours since the last query correct
--

95. Which of the following describes external functions in Snowflake?


• They are a type of User-defined Function (UDF). correct
• They contain their own SQL code.
• They call code that is stored inside of Snowflake.
• They can return multiple rows for each row received
--
Explanation: External functions are user-defined functions that are stored and executed outside
of Snowflake. External functions make it easier to access external API services such as
geocoders, machine learning models, and other custom code running outside of Snowflake. This
feature eliminates the need to export and reimport data when using third-party services,
significantly simplifying your data pipelines.
96. What are ways to create and manage data shares in Snowflake? (Select TWO)
• Through the Snowflake web interface (UI) correct
• Through the DATA_SHARE=TRUE parameter
• Through SQL commands correct
• Through the ENABLE_SHARE=TRUE parameter
• Using the CREATE SHARE AS SELECT * TABLE command
--
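Creating and populating a share via SQL can be sketched as follows (object names and consumer account identifier are illustrative):

```sql
CREATE SHARE sales_share;
GRANT USAGE ON DATABASE sales_db TO SHARE sales_share;
GRANT USAGE ON SCHEMA sales_db.public TO SHARE sales_share;
GRANT SELECT ON TABLE sales_db.public.orders TO SHARE sales_share;

-- Make the share visible to a consumer account
ALTER SHARE sales_share ADD ACCOUNTS = xy12345;
```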

97. A company's security audit requires generating a report listing all Snowflake logins
(e.g.. date and user) within the last 90 days.

Which of the following statements will return the required information?


• SELECT LAST_SUCCESS_LOGIN, LOGIN_NAME
FROM ACCOUNT_USAGE.USERS;
• SELECT EVENT_TIMESTAMP, USER_NAME
FROM table(information_schema.login_history_by_user())
• SELECT EVENT_TIMESTAMP, USER_NAME
FROM ACCOUNT_USAGE.ACCESS_HISTORY;
• SELECT EVENT_TIMESTAMP, USER_NAME
FROM ACCOUNT_USAGE.LOGIN_HISTORY; correct
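The LOGIN_HISTORY view in ACCOUNT_USAGE retains a year of login events, so a 90-day report only needs a time filter; a sketch:

```sql
SELECT event_timestamp, user_name
FROM SNOWFLAKE.ACCOUNT_USAGE.LOGIN_HISTORY
WHERE event_timestamp >= DATEADD(day, -90, CURRENT_TIMESTAMP())
ORDER BY event_timestamp;
```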

98. Which semi-structured file formats are supported when unloading data from a table?
(Select TWO).
• ORC correct
• XML
• Avro
• Parquet correct
• JSON correct
--

Explanation: Snowflake supports unloading to the semi-structured formats JSON and Parquet.

99. What is the purpose of an External Function?


• To call code that executes outside of Snowflake correct
• To run a function in another Snowflake database
• To share data in Snowflake with external parties
• To ingest data from on-premises data sources
--

100. A user created a new worksheet within the Snowsight UI and wants to share this with
teammates. How can this worksheet be shared?
• Create a zero-copy clone of the worksheet and grant permissions to teammates
• Create a private Data Exchange so that any teammate can use the worksheet
• Share the worksheet with teammates within Snowsight correct
• Create a database and grant all permissions to teammates
--
1. What privileges are required to create a task?
• The global privilege create task is required to create a new task.
• Tasks are created at the Application level and can only be created by the Account Admin
role.
• Many Snowflake DDLs are metadata operations only, and create task DDL can be
executed without virtual warehouse requirement or task specific grants.
• The role must have access to the target schema and the create task privilege on
the schema itself. correct
--

Explanation: All tasks in a simple tree must have the same task owner (i.e. a single role must
have the OWNERSHIP privilege on all of the tasks in the tree). All tasks in a simple tree must
exist in the same schema.
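The required grants and a minimal task can be sketched as follows (database, schema, role, and warehouse names are illustrative):

```sql
GRANT USAGE ON DATABASE my_db TO ROLE etl_role;
GRANT USAGE ON SCHEMA my_db.etl TO ROLE etl_role;
GRANT CREATE TASK ON SCHEMA my_db.etl TO ROLE etl_role;

-- With those privileges, the role can create a scheduled task
CREATE TASK my_db.etl.hourly_refresh
  WAREHOUSE = etl_wh
  SCHEDULE = '60 MINUTE'
AS
  INSERT INTO my_db.etl.target SELECT * FROM my_db.etl.staging;
```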

2. True or False: AWS Private Link provides a secure connection from the Customer’s on-
premise data center to the Snowflake.
• True correct
• False
--

Explanation: AWS PrivateLink provides a secure connection from the customer's on-premise
data center to Snowflake. It establishes a private connection between the customer's VPC
(Virtual Private Cloud) and Snowflake, ensuring that the data is transferred securely over the
Amazon network and not exposed to the public internet.

3. Why would a customer size a Virtual Warehouse from an X-Small to a Medium?


• To accommodate more queries
• To accommodate more users
• To accommodate fluctuations in workload
• To accommodate a more complex workload correct
--

4. When a Pipe is recreated using the CREATE OR REPLACE PIPE command:


• The Pipe load history is reset to empty
• The REFRESH parameter is set to TRUE
• Previously loaded files will be ignored
• All of the above correct
--

Explanation:
When a pipe is recreated using the CREATE OR REPLACE PIPE command, the following will
happen:

The pipe load history is reset to empty.

The REFRESH parameter is set to TRUE.

Previously loaded files will be ignored.


This command allows you to replace the existing pipe with a new one while maintaining its name
and any associated metadata.

5. True or False: Query ID’s are unique across all Snowflake deployments and can be
used in communication with Snowflake Support to help troubleshoot issues.
• True correct
• False
--

6. True or False: During data unloading, only JSON and CSV files can be compressed.
• True
• False correct
--

7. What is the minimum Snowflake edition that customers planning on storing protected
information in Snowflake should consider for regulatory compliance?
• Standard
• Premier
• Enterprise
• Business Critical Edition correct
--

8. What is the minimum Snowflake edition that provides multi-cluster warehouses and up
to 90 days of Time Travel?
• Standard
• Premier
• Enterprise correct
• Business Critical Edition
--

9. True or False: Snowflake allows its customers to directly access the micro-partition
files that make up its tables.
• True
• False correct
--

10. Which type of table corresponds to a single Snowflake session?


• Temporary correct
• Transient
• Provisional
• Permanent
--

Explanation: Snowflake supports creating temporary tables for storing non-permanent, transitory
data (e.g. ETL data, session-specific data). Temporary tables only exist within the session in
which they were created and persist only for the remainder of the session.

11. If a Small Warehouse is made up of 2 servers/cluster, how many servers/cluster make


up a Medium Warehouse?
• 4 correct
• 16
• 32
• 128
--

12. Select the different types of Internal Stages: (Choose three.)


• Named Stage correct
• User Stage correct
• Table Stage correct
• Schema Stage
--
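The three internal stage types are referenced with distinct prefixes; an illustrative sketch (file and table names are assumptions):

```sql
-- User stage (one per user)
PUT file:///tmp/data.csv @~;

-- Table stage (one per table)
PUT file:///tmp/data.csv @%my_table;

-- Named stage (created explicitly)
CREATE STAGE my_stage;
PUT file:///tmp/data.csv @my_stage;
```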

13. True or False: When a new Snowflake object is created, it is automatically owned by
the user who created it.
• True
• False correct
--

Explanation: When a new Snowflake object is created, it is not automatically owned by the user
who created it. Instead, the object is owned by the role that was active for the user when the
object was created. This means that ownership and associated privileges belong to the role,
allowing for easier management of access control and permissions across multiple users within
an organization.

14. Which of the following statements are true of Snowflake releases: (Choose two.)
• They happen approximately weekly correct
• They roll up and release approximately monthly, but customers can request early release
application
• During a release, new customer requests/queries/connections transparently move
over to the newer version correct
• A customer is assigned a 30 minute window (that can be moved anytime within a week)
during which the system will be unavailable and customer is upgraded
--

15. True or False: Snowflake supports federated authentication in all editions.


• True
• False correct
--

Explanation:
Snowflake supports federated authentication only in the Enterprise and higher editions.
Federated authentication allows users to authenticate with Snowflake using their existing identity
provider (IdP) credentials, enabling single sign-on (SSO) and simplifying the management of
user access and authentication. If federated authentication is a requirement for a customer, they
need to consider at least the Enterprise edition of Snowflake.

16. Which of the following statements are true of Virtual Warehouses? (Choose all that
apply.)
• Customers can change the size of the Warehouse after creation correct
• A Warehouse can be resized while running correct
• A Warehouse can be configured to suspend after a period of inactivity correct
• A Warehouse can be configured to auto-resume when new queries are submitted correct
--

17. True or False: Snowflake charges a premium for storing semi-structured data.
• True
• False correct
--

18. True or False: Reader Accounts incur no additional Compute costs to the Data
Provider since they are simply reading the shared data without making changes.
• True
• False correct
--

19. True or False: The user has to specify which cluster a query will run on in multi-
clustering Warehouse.
• True
• False correct
--

20. The PUT command: (Choose two.)


• Automatically creates a File Format object
• Automatically uses the last Stage created
• Automatically compresses files using Gzip correct
• Automatically encrypts files correct
--

21. How are Snowpipe charges calculated?


• Per-second/per Warehouse size
• Per-second/per-core granularity correct
• Number of Pipes in account
• Total storage bucket size
--

22. True or False: A Snowflake account is charged for data stored in both Internal and
External Stages.
• True
• False correct
--

Explanation:
A Snowflake account is charged for data stored in Internal Stages. However, data stored in
External Stages such as Amazon S3, Google Cloud Storage, or Microsoft Azure Blob Storage is
not billed by Snowflake. Storage costs for External Stages are billed separately by the respective
cloud storage provider.

23. Which is true of Snowflake network policies? A Snowflake network policy: (Choose
two.)
• Is available to all Snowflake Editions correct
• Is only available to customers with Business Critical Edition
• Restricts or enables access to specific IP addresses correct
• Is activated using an “ALTER DATABASE” command
--

24. True or False: A Virtual Warehouse consumes Snowflake credits even when inactive.
• True
• False correct
--

25. The number of queries that a Virtual Warehouse can concurrently process is
determined by: Choose 2 answers
• The complexity of each query correct
• The CONCURRENT_QUERY_LIMIT parameter set on the Snowflake account
• The size of the data required for each query correct
• The tool that is executing the query
--

26. True or False: A customer using SnowSQL / native connectors will be unable to also
use the Snowflake Web Interface (UI) unless access to the UI is explicitly granted.
• True
• False correct
--

27. True or False: Micro-partition metadata enables some operations to be completed


without requiring Compute.
• True correct
• False
--

28. Each incremental increase in Virtual Warehouse size (e.g. Medium to Large) generally
results in what?

Select one.
• More micro-partitions
• Better query scheduling
• Double the number of servers in the compute cluster correct
• Higher storage costs
--

29. True or False: It is possible to query data from an Internal or named External stage
without loading the data into Snowflake.
• True correct
• False
--

30. True or False: Data Providers can share data with only the Data Consumer.
• True
• False correct
--

31. True or False: Snowflake’s Global Services Layer gathers and maintains statistics on
all columns in all micro-partitions.
• True correct
• False
--

Explanation:
Snowflake is a single, integrated platform delivered as-a-service. It features storage, compute,
and global services layers that are physically separated but logically integrated.

32. True or False: It is possible to load data into Snowflake without creating a named File
Format object.
• True correct
• False
--

33. True or False: A table in Snowflake can only be queried using the Virtual Warehouse
that was used to load the data.
• True
• False correct
--

34. True or False: A single database can exist in more than one Snowflake account.
• True
• False correct
--

35. True or False: Some queries can be answered through the metadata cache and do not
require an active Virtual Warehouse.
• True correct
• False
--

Explanation: Some aggregate queries (e.g. COUNT, MIN, MAX on certain columns) can be
answered from micro-partition metadata alone, without spinning up a virtual warehouse.

36. Which interfaces can be used to create and/or manage Virtual Warehouses?
• The Snowflake Web Interface (UI)
• SQL commands
• Data integration tools
• All of the above correct
--

37. When can a Virtual Warehouse start running queries?


• 12am-5am
• Only during administrator defined time slots
• When its provisioning is complete correct
• After replication
--

38. Which of the following DML commands isn’t supported by Snowflake?


• UPSERT correct
• MERGE
• UPDATE
• TRUNCATE TABLE
--

39. Which of the following are true of multi-cluster Warehouses? Select all that apply
below.
• A multi-cluster Warehouse can add clusters automatically based on query activity correct
• A multi-cluster Warehouse can automatically turn itself off after a period of
inactivity correct
• A multi-cluster Warehouse can scale down when query activity slows correct
• A multi-cluster Warehouse can automatically turn itself on when a query is
executed against it correct
--

40. Which of the following statement is true of Snowflake?


• It was built specifically for the cloud correct
• It was built as an on-premises solution and then ported to the cloud
• It was designed as a hybrid database to allow customers to store data either on premises
or in the cloud
• It was built for Hadoop architecture
• It's based on an Oracle Architecture
--

41. True or False: Pipes can be suspended and resumed.


• True correct
• False
--

Explanation: Pipes can be paused and resumed. In addition to the pipe owner, a role that has
the required minimum permissions can pause or resume a pipe.

42. Which of the following connectors allow Multi-Factor Authentication (MFA)


authorization when connecting? (Choose all that apply.)
• JDBC correct
• SnowSQL correct
• Snowflake Web Interface (UI) correct
• ODBC correct
• Python correct
--

43. Credit Consumption by the Compute Layer (Virtual Warehouses) is based on: (Choose
two.)
• Number of users
• Warehouse size correct
• Amount of data processed
• # of clusters for the Warehouse correct
--

Explanation: "Snowflake credits are charged based on the number of virtual warehouses you
use, how long they run, and their size."

44. The Query History in the Snowflake Web Interface (UI) is kept for approximately:
• 60 minutes
• 24 hours
• 14 days correct
• 30 days
• 1 year
--
45. A role is created and owns 2 tables. This role is then dropped. Who will now own the
two tables?
• The tables are now orphaned
• The user that deleted the role
• SYSADMIN
• The assumed role that dropped the role correct
--

Explanation:
When a role is dropped in Snowflake, ownership of any objects owned by that role (such as
tables) is transferred to the role that executed the DROP ROLE command. Ownership can later
be moved to another role with a GRANT OWNERSHIP statement, for example:
GRANT OWNERSHIP ON TABLE table_name TO ROLE new_owner_role;

46. Which of the following statements are true about Schemas in Snowflake? (Choose
two.)
• A Schema may contain one or more Databases
• A Database may contain one or more Schemas correct
• A Schema is a logical grouping of Database Objects correct
• Each Schema is contained within a Warehouse
--

47. When scaling out by adding clusters to a multi-cluster warehouse, you are primarily
scaling for improved:
• Concurrency correct
• Performance
--

48. What are the three layers that make up Snowflake’s architecture? Choose 3 answer
• Compute correct
• Tri-Secret Secure
• Storage correct
• Cloud Services correct
--

49. Which of the following objects can be cloned? (Choose four.)


• Tables correct
• Named File Formats correct
• Schemas correct
• Shares
• Databases correct
• Users
--

Explanation:
The following objects can be cloned in Snowflake:

Tables: CREATE TABLE ... CLONE creates a new table with the same structure and data as the
source table at the time of cloning.

Named File Formats: CREATE FILE FORMAT ... CLONE creates a copy of an existing named file
format.

Schemas: CREATE SCHEMA ... CLONE creates a new schema containing clones of the objects in
the source schema.

Databases: CREATE DATABASE ... CLONE creates a new database containing clones of the
schemas and objects in the source database.

Shares and Users cannot be cloned.

50. True or False: An active warehouse is required to run a COPY INTO statement.
• True correct
• False
--

51. To run a Multi-Cluster Warehouse in auto-scale mode, a user would:


• Configure the Maximum Clusters setting to “Auto-Scale”
• Set the Warehouse type to “Auto”
• Set the Minimum Clusters and Maximum Clusters settings to the same value
• Set the Minimum Clusters and Maximum Clusters settings to different values
correct
--
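A sketch of a warehouse configured for auto-scale mode (names and values illustrative):

```sql
CREATE WAREHOUSE analytics_wh
  WAREHOUSE_SIZE = 'MEDIUM'
  MIN_CLUSTER_COUNT = 1   -- differing min/max => auto-scale mode
  MAX_CLUSTER_COUNT = 4
  SCALING_POLICY = 'STANDARD';
```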

52. Which of the following statements describes a benefit of Snowflake’s separation of


compute and storage? (Choose all that apply.)
• Growth of storage and compute are tightly coupled together
• Storage expands without the requirement to add more compute correct
• Compute can be scaled up or down without the requirement to add more storage
correct
• Multiple compute clusters can access stored data without contention correct
--

53. Which of the following roles is recommended to be used to create and manage users
and roles?
• SYSADMIN
• SECURITYADMIN correct
• PUBLIC
• ACCOUNTADMIN
--

Explanation: "Security admin: Role that can manage any object grant globally, as well as create,
monitor, and manage users and roles"

54. Which of the following are valid Snowflake Virtual Warehouse Scaling Policies?
(Choose two.)
• Custom
• Economy correct
• Optimized
• Standard correct
--

55. Which of the following statements is true of Snowflake?


• It was built specifically for the cloud correct
• It was built as an on-premises solution and then ported to the cloud
• It was designed as a hybrid database to allow customers to store data either on premises
or in the cloud
• It was built for Hadoop architecture
• It's based on an Oracle Architecture
--

56. True or False: It is possible to unload structured data to semi-structured formats such
as JSON and parquet.
• True correct
• False
--

Explanation:
Snowflake does support unloading structured (relational) data to semi-structured formats. The
COPY INTO <location> command can unload query results to JSON (for example, by building
objects with OBJECT_CONSTRUCT) or to Parquet files, in addition to delimited text formats
such as CSV and TSV.

57. Which of the following terms best describes Snowflake’s database architecture?
• Columnar shared nothing
• Shared disk
• Multi-cluster, shared data correct
• Cloud-native shared memory
--

Explanation:
Built from the ground up for the cloud, Snowflake’s unique multi-cluster shared data architecture
delivers the performance, scale, elasticity, and concurrency today’s organizations require.

58. Which of the following objects is not covered by Time Travel?


• Tables
• Schemas
• Databases
• Stages correct
--

59. Which of the following statements would be used to export/unload data from
Snowflake?
• COPY INTO @stage correct
• EXPORT TO @stage
• INSERT INTO @stage
• EXPORT_TO_STAGE(stage = > @Wage, select = > 'select * from t1);
--
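A hedged sketch of an unload (the stage path, table name, and format options are placeholders); note that a SELECT statement can be used as the source, as question 62 below confirms:

```sql
-- Unload query results to a named internal stage as gzip-compressed CSV files.
COPY INTO @my_stage/unload/
FROM (SELECT * FROM t1)
FILE_FORMAT = (TYPE = 'CSV' COMPRESSION = 'GZIP');
```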

60. Which of the following are options when creating a Virtual Warehouse?
• Auto-suspend correct
• Auto-resume correct
• Local SSD size
• User count
--

61. True or False: The COPY command must specify a File Format in order to execute.
• True
• False correct
--

Explanation:
Create Stage:

Create Table (STAGE_FILE_FORMAT option):

62. True or False: Bulk unloading of data from Snowflake supports the use of a SELECT
statement.
• True correct
• False
--

63. Which of the following statements are true of VALIDATION_MODE in Snowflake?


(Choose two.)
• The validation_mode option is used when creating an Internal Stage
• validation_mode=return_all_errors is a parameter of the copy command correct
• The validation_mode option will validate data to be loaded by the copy statement while
completing the load and will return the rows that could not be loaded without error
• The validation_mode option will validate data to be loaded by the copy statement
without completing the load and will return possible errors correct
--
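As a sketch (the table and stage names are assumed), VALIDATION_MODE dry-runs the load and reports problems without committing any rows:

```sql
-- Validate staged files without loading them; the statement returns
-- rows describing every error found instead of inserting data.
COPY INTO my_table
FROM @my_stage
VALIDATION_MODE = RETURN_ALL_ERRORS;
```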

64. True or False: You can query the files in an External Stage directly without having to
load the data into a table.
• True correct
• False
--

Explanation:
External tables are read-only, therefore no DML operations can be performed on them; however,
external tables can be used for query and join operations. Views can be created against external
tables.

65. True or False: Snowflake’s data warehouse was built from the ground up for the cloud
in lieu of using an existing database or a platform, like Hadoop, as a base.
• True correct
• False
--
Explanation:
Snowflake's data warehouse was built from the ground up specifically for the cloud, instead of
using an existing database or platform like Hadoop as a base. This design choice allowed
Snowflake to create a unique architecture that takes advantage of the elasticity, scalability, and
flexibility of cloud resources, resulting in a fully managed, highly performant, and cost-effective
data warehouse solution. Snowflake's multi-cluster, shared data architecture separates compute
and storage resources, allowing for independent scaling and optimized performance for various
workloads and use cases.

66. Snowflake is designed for which type of workloads? (Choose two.)


• OLAP (Analytics) workloads correct
• OLTP (Transactional) workloads
• Concurrent workloads correct
• On-premise workloads
--

67. What are the three things customers want most from their enterprise data warehouse
solution? Choose 3 answers
• On-premise availability
• Simplicity correct
• Open source based
• Concurrency correct
• Performance correct
--

68. When reviewing a query profile, what is a symptom that a query is too large to fit into
the memory?
• A single join node uses more than 50% of the query time
• Partitions scanned is equal to partitions total
• An AggregateOperator node is present
• The query is spilling to remote storage correct
--

69. Which of the following are valid methods for authenticating users for access into
Snowflake? (Select THREE)
• SCIM
• Federated authentication correct
• TLS 1.2
• Key-pair authentication correct
• OAuth correct
• OCSP authentication
--

70. A developer is granted ownership of a table that has a masking policy. The developer's
role is not able to see the masked data.

Will the developer be able to modify the table to read the masked data?
• Yes, because a table owner has full control and can unset masking policies.
• Yes, because masking policies only apply to cloned tables.
• No, because masking policies must always reference specific access roles.
• No, because ownership of a table does not include the ability to change masking
policies correct
--
71. A virtual warehouse's auto-suspend and auto-resume settings apply to which of the
following?
• The primary cluster in the virtual warehouse
• The entire virtual warehouse correct
• The database in which the virtual warehouse resides
• The Queries currently being run on the virtual warehouse
--
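Both settings are defined on the warehouse object itself, as in this illustrative DDL (the name and timeout are assumptions):

```sql
-- AUTO_SUSPEND is specified in seconds and, like AUTO_RESUME,
-- applies to the entire warehouse rather than individual clusters.
CREATE WAREHOUSE reporting_wh
  AUTO_SUSPEND = 300   -- suspend after 5 minutes of inactivity
  AUTO_RESUME = TRUE;  -- resume automatically when a query arrives
```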

72. When reviewing the load for a warehouse using the load monitoring chart, the chart
indicates that a high volume of Queries are always queuing in the warehouse

According to recommended best practice, what should be done to reduce the Queue
volume? (Select TWO).
• Use multi-clustered warehousing to scale out warehouse capacity. correct
• Scale up the warehouse size to allow Queries to execute faster.
• Stop and start the warehouse to clear the queued queries
• Migrate some queries to a new warehouse to reduce load correct
• Limit user access to the warehouse so fewer queries are run against it.
--

73. Which ACCOUNT_USAGE views are used to evaluate the details of dynamic data
masking? (Select TWO)
• ROLES
• POLICY_REFERENCES correct
• QUERY_HISTORY
• RESOURCE_MONITORS
• ACCESS_HISTORY correct
--

Explanation:
To evaluate the details of dynamic data masking in Snowflake, you can use the following account
usage views:

POLICY_REFERENCES: This view provides information about the masking policies applied to
various objects (tables, columns, etc.) within the account. It includes details like policy name,
object name, and the masking policy expression.

ACCESS_HISTORY: This view provides a historical record of access attempts to objects (tables,
columns, etc.) in the account, including whether the access was allowed or blocked, and if
masking was applied during the access. It can help you understand how data masking policies
are being enforced and identify any potential issues with the masking configuration.

74. How would you determine the size of the virtual warehouse used for a task?
• Root task may be executed concurrently (i.e. multiple instances), it is recommended to
leave some margins in the execution window to avoid missing instances of execution
• Querying (select) the size of the stream content would help determine the warehouse
size. For example, if querying large stream content, use a larger warehouse size
• If using the stored procedure to execute multiple SQL statements, it's best to test
run the stored procedure separately to size the compute resource first correct
• Since task infrastructure is based on running the task body on schedule, it's
recommended to configure the virtual warehouse for automatic concurrency handling
using Multi-cluster warehouse (MCW) to match the task schedule
--
75. Which of the following are best practice recommendations that should be considered
when loading data into Snowflake? (Select TWO).
• Load files that are approximately 25 MB or smaller.
• Remove all dates and timestamps.
• Load files that are approximately 100-250 MB (or larger) correct
• Avoid using embedded characters such as commas for numeric data types correct
• Remove semi-structured data types
--

76. A company's security audit requires generating a report listing all Snowflake logins
(e.g.. date and user) within the last 90 days.

Which of the following statements will return the required information?


• SELECT LAST_SUCCESS_LOGIN, LOGIN_NAME
FROM ACCOUNT_USAGE.USERS;
• SELECT EVENT_TIMESTAMP, USER_NAME
FROM table(information_schema.login_history_by_user())
• SELECT EVENT_TIMESTAMP, USER_NAME
FROM ACCOUNT_USAGE.ACCESS_HISTORY;
• SELECT EVENT_TIMESTAMP, USER_NAME
FROM ACCOUNT_USAGE.LOGIN_HISTORY; correct
--
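The ACCOUNT_USAGE.LOGIN_HISTORY view retains 365 days of history (the INFORMATION_SCHEMA login history functions retain only 7 days), so a 90-day audit report can be sketched as:

```sql
-- Logins in the last 90 days; ACCOUNT_USAGE views can lag real time
-- by up to about two hours.
SELECT event_timestamp, user_name
FROM snowflake.account_usage.login_history
WHERE event_timestamp > DATEADD('day', -90, CURRENT_TIMESTAMP());
```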

77. What feature can be used to reorganize a very large table on one or more columns?
• Micro-partitions
• Clustering keys correct
• Key partitions
• Clustered partitions
--
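Defining a clustering key is a one-line ALTER; the table and column names below are placeholders:

```sql
-- Snowflake reclusters the table in the background over time,
-- colocating rows by these columns to improve partition pruning.
ALTER TABLE sales CLUSTER BY (region, sale_date);
```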

78. In the query profiler view for a query, which components represent areas that can be
used to help optimize query performance? (Select TWO)
• Bytes scanned correct
• Bytes sent over the network
• Number of partitions scanned correct
• Percentage scanned from cache
• External bytes scanned
--

79. What is a machine learning and data science partner within the Snowflake Partner
Ecosystem?
• Informatica
• Power BI
• Adobe
• DataRobot correct
--

80. Which command can be used to load data into an internal stage?
• LOAD
• copy
• GET
• PUT correct
--
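For illustration (the file path and table name are assumed), PUT uploads local files to an internal stage and is typically run from SnowSQL:

```sql
-- Upload a local file to the table's internal stage;
-- files are gzip-compressed by default during upload.
PUT file:///tmp/data.csv @%my_table;
```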

81. A user is loading JSON documents composed of a huge array containing multiple
records into Snowflake. The user enables the STRIP_OUTER_ARRAY file format option.

What does the STRIP_OUTER_ARRAY file format do?


• It removes the last element of the outer array.
• It removes the outer array structure and loads the records into separate table
rows, correct
• It removes the trailing spaces in the last element of the outer array and loads the records
into separate table columns
• It removes the NULL elements from the JSON object eliminating invalid data and enables
the ability to load the records
--

Explanation:
Data Size Limitations

The VARIANT data type imposes a 16 MB size limit on individual rows.

For some semi-structured data formats (e.g. JSON), data sets are frequently a simple
concatenation of multiple documents. The JSON output from some software is composed of a
single huge array containing multiple records. There is no need to separate the documents with
line breaks or commas, though both are supported.

If the data exceeds 16 MB, enable the STRIP_OUTER_ARRAY file format option for the COPY
INTO <table> command to remove the outer array structure and load the records into separate
table rows:

copy into <table>

from @~/<file>.json

file_format = (type = 'JSON' strip_outer_array = true);

82. Which COPY INTO command option outputs the data into one file?
• SINGLE=TRUE correct
• MAX_FILE_NUMBER=1
• FILE_NUMBER=1
• MULTIPLE=FALSE
--

83. What happens when a virtual warehouse is resized?


• When increasing the size of an active warehouse the compute resource for all running
and queued queries on the warehouse are affected
• When reducing the size of a warehouse the compute resources are removed only
when they are no longer being used to execute any current statements. correct
• The warehouse will be suspended while the new compute resource is provisioned and
will resume automatically once provisioning is complete.
• Users who are trying to use the warehouse will receive an error message until the
resizing is complete
--

84. What happens to the underlying table data when a CLUSTER BY clause is added to a
Snowflake table?
• Data is hashed by the cluster key to facilitate fast searches for common data values
• Larger micro-partitions are created for common data values to reduce the number of
partitions that must be scanned
• Smaller micro-partitions are created for common data values to allow for more parallelism
• Data may be colocated by the cluster key within the micro-partitions to improve
pruning performance correct
--

85. A user has unloaded data from Snowflake to a stage.

Which SQL command should be used to validate which data was unloaded to the stage?
• list @file_stage correct
• show @file_stage
• view @file_stage
• verify @file_stage
--

86. A user has an application that writes a new file to a cloud storage location every 5
minutes.

What would be the MOST efficient way to get the files into Snowflake?
• Create a task that runs a COPY INTO operation from an external stage every 5 minutes
• Create a task that puts the files in an internal stage and automate the data loading wizard
• Create a task that runs a GET operation to intermittently check for new files
• Set up cloud provider notifications on the file location and use Snowpipe with
auto-ingest correct
--
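An auto-ingest pipe can be sketched as follows (the pipe, table, and stage names are illustrative); cloud event notifications then trigger the pipe whenever a new file lands in the stage location:

```sql
-- Snowpipe wraps a single COPY INTO statement; AUTO_INGEST = TRUE
-- tells it to load files as cloud storage notifications arrive.
CREATE PIPE my_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO my_table
  FROM @my_external_stage
  FILE_FORMAT = (TYPE = 'JSON');
```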

87. What is the default File Format used in the COPY command if one is not specified?
• CSV correct
• JSON
• Parquet
• XML
--

88. True or False: A Virtual Warehouse can be resized while suspended.


• True correct
• False
--

89. What features does Snowflake Time Travel enable?


• Querying data-related objects that were created within the past 365 days
• Restoring data-related objects that have been deleted within the past 90 days
• Conducting point-in-time analysis for BI reporting correct
• Analyzing data usage/manipulation over all periods of time
--

Explanation:
Snowflake Time Travel is a feature that enables you to access historical data within a specified
period, known as the data retention period. It allows you to query data as it existed at a specific
point in time, which can be useful for point-in-time analysis, BI reporting, or data recovery in case
of accidental data changes or deletions.

90. In which use cases does Snowflake apply egress charges?


• Data sharing within a specific region
• Query result retrieval
• Database replication correct
• Loading data into Snowflake
--

Explanation:
Snowflake applies egress charges when data is transferred out of the cloud region hosting the
account, for example when a database is replicated to an account in another region or cloud
platform, or when data is unloaded to storage in a different region. These charges are passed
through from the cloud provider (e.g., AWS, Azure, or Google Cloud Platform).

91. Which feature is only available in the Enterprise or higher editions of Snowflake?
• Column-level security correct
• SOC 2 type II certification
• Multi-factor Authentication (MFA)
• Object-level access control
--

92. Which of the following is a valid source for an external stage when the Snowflake
account is located on Microsoft Azure?
• An FTP server with TLS encryption
• An HTTPS server with WebDAV
• A Google Cloud storage bucket correct
• A Windows server file share on Azure
--

93. Which statement about billing applies to Snowflake credits?


• Credits are billed per-minute with a 60-minute minimum
• Credits are used to pay for cloud data storage usage
• Credits are consumed based on the number of credits billed for each hour that a
warehouse runs
• Credits are consumed based on the warehouse size and the time the warehouse is
running correct
--

Explanation:
Snowflake credits are used to pay for the consumption of resources on Snowflake. A Snowflake
credit is a unit of measure, and it is consumed only when a customer is using resources, such as
when a virtual warehouse is running, the cloud services layer is performing work, or serverless
features are used. https://docs.snowflake.com/en/user-guide/what-are-credits.html

94. Which Snowflake technique can be used to improve the performance of a query?
• Clustering correct
• Indexing
• Fragmenting
• Using INDEX_HINTS
--

95. What are the default Time Travel and Fail-safe retention periods for transient tables?
• Time Travel - 1 day. Fail-safe - 1 day
• Time Travel - 0 days. Fail-safe - 1 day
• Time Travel - 1 day. Fail-safe - 0 days correct
• Transient tables are retained in neither Fail-safe nor Time Travel
--

Explanation:
Transient tables have a default Time Travel retention period of 1 day (which can be reduced to
0), but they have no Fail-safe period. They are intended for temporary or intermediate data that
does not need the 7-day Fail-safe protection of permanent tables, which lowers storage costs.
Once data ages out of Time Travel, it cannot be recovered.

96. Which of the following are benefits of micro-partitioning? (Select TWO)


• Micro-partitions cannot overlap in their range of values
• Micro-partitions are immutable objects that support the use of Time Travel. correct
• Micro-partitions can reduce the amount of I/O from object storage to virtual
warehouses correct
• Rows are automatically stored in sorted order within micro-partitions
• Micro-partitions can be defined on a schema-by-schema basis
--

Explanation:
Micro-partitioning is a key feature of Snowflake's architecture, providing several benefits:

Micro-partitions are immutable objects that support the use of Time Travel. Each time data is
loaded or modified, new micro-partitions are created, preserving the previous versions of data to
enable Time Travel and historical data querying.

Micro-partitions can reduce the amount of I/O from object storage to virtual warehouses.
Snowflake's query optimizer leverages metadata about micro-partitions, such as the range of
values within them, to prune unnecessary micro-partitions from query scans. This reduces the
amount of data read and improves query performance.

97. Which of the following conditions must be met in order to return results from the
results cache? (Select TWO).
• The user has the appropriate privileges on the objects associated with the query
correct
• Micro-partitions have been reclustered since the query was last run
• The new query is run using the same virtual warehouse as the previous query
• The query includes a User Defined Function (UDF)
• The query has been run within 24 hours of the previously-run query correct
--

98. What happens when a cloned table is replicated to a secondary database? (Select
TWO)
• A read-only copy of the cloned tables is stored. correct
• The replication will not be successful.
• The physical data is replicated correct
• Additional costs for storage are charged to a secondary account
• Metadata pointers to cloned tables are replicated
--

Explanation:
When a cloned table is replicated to a secondary database, the secondary copy is read-only, like
all replicated objects. Unlike the original clone, which shares micro-partitions with its source
table through metadata pointers, replication physically copies the table's data to the secondary
deployment, so the replicated clone consumes its own storage there.

99. What can be used to view warehouse usage over time? (Select Two).
• The load HISTORY view
• The Query history view
• The show warehouses command
• The WAREHOUSE_METERING_HISTORY view correct
• The billing and usage tab in the Snowflake web UI correct
--

100. Will data cached in a warehouse be lost when the warehouse is resized?
• Possibly, if the warehouse is resized to a smaller size and the cache no longer fits. correct
• Yes, because the compute resource is replaced in its entirety with a new compute
resource.
• No, because the size of the cache is independent from the warehouse size
• Yes, because the new compute resource will no longer have access to the cache
encryption key
--

1. Will data cached in a warehouse be lost when the warehouse is resized?


• Possibly, if the warehouse is resized to a smaller size and the cache no longer fits. correct
• Yes, because the compute resource is replaced in its entirety with a new compute
resource.
• No, because the size of the cache is independent from the warehouse size
• Yes, because the new compute resource will no longer have access to the cache
encryption key
--

2. What are ways to create and manage data shares in Snowflake? (Select TWO)
• Through the Snowflake web interface (UI) correct
• Through the DATA_SHARE=TRUE parameter
• Through SQL commands correct
• Through the ENABLE_SHARE=TRUE parameter
• Using the CREATE SHARE AS SELECT * TABLE command
--

3. What is the purpose of an External Function?


• To call code that executes outside of Snowflake correct
• To run a function in another Snowflake database
• To share data in Snowflake with external parties
• To ingest data from on-premises data sources
--

4. Which is the MINIMUM required Snowflake edition that a user must have if they want to
use AWS/Azure Privatelink or Google Cloud Private Service Connect?
• Standard
• Premium
• Enterprise
• Business Critical correct
--

5. What happens when a virtual warehouse is resized?


• When increasing the size of an active warehouse the compute resource for all running
and queued queries on the warehouse are affected
• When reducing the size of a warehouse the compute resources are removed only when
they are no longer being used to execute any current statements. correct
• The warehouse will be suspended while the new compute resource is provisioned and
will resume automatically once provisioning is complete.
• Users who are trying to use the warehouse will receive an error message until the
resizing is complete
--

6. What features does Snowflake Time Travel enable?


• Querying data-related objects that were created within the past 365 days
• Restoring data-related objects that have been deleted within the past 90 days correct
• Conducting point-in-time analysis for BI reporting
• Analyzing data usage/manipulation over all periods of time
--

Explanation:
Snowflake Time Travel enables accessing historical data (i.e. data that has been changed or
deleted) at any point within a defined period.

It serves as a powerful tool for performing the following tasks:

✑ Restoring data-related objects (tables, schemas, and databases) that might have been
accidentally or intentionally deleted.

✑ Duplicating and backing up data from key points in the past.

✑ Analyzing data usage/manipulation over specified periods of time.


https://docs.snowflake.com/en/user-guide/data-time-travel.html

7. A virtual warehouse's auto-suspend and auto-resume settings apply to which of the


following?
• The primary cluster in the virtual warehouse
• The entire virtual warehouse correct
• The database in which the virtual warehouse resides
• The Queries currently being run on the virtual warehouse
--
8. What are value types that a VARIANT column can store? (Select TWO)
• STRUCT
• OBJECT correct
• BINARY
• ARRAY correct
• CLOB
--

Explanation:
Characteristics of a VARIANT

A VARIANT can store a value of any other type, including OBJECT and ARRAY. The maximum
length of a VARIANT is 16 MB.
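As an illustration (the table name is assumed), a VARIANT column accepts both OBJECT and ARRAY values produced by PARSE_JSON:

```sql
CREATE TABLE raw_json (v VARIANT);

-- An OBJECT value and an ARRAY value stored in the same VARIANT column.
INSERT INTO raw_json
  SELECT PARSE_JSON('{"city": "Berlin", "pop": 3600000}')
  UNION ALL
  SELECT PARSE_JSON('[1, 2, 3]');
```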

9. True or False: It is possible for a user to run a query against the query result cache
without requiring an active Warehouse.
• True correct
• False
--

Explanation:
The query result cache is served from the cloud services layer, so cached results can be
returned without running (and paying for) a virtual warehouse.

10. Which of the following indicates that it may be appropriate to use a clustering key for a
table? (Select TWO).
• The table contains a column that has very low cardinality
• DML statements that are being issued against the table are blocked
• The table has a small number of micro-partitions
• Queries on the table are running slower than expected correct
• The clustering depth for the table is large correct
--

11. Which of the following can be executed/called with Snowpipe?


• A User Defined Function (UDF)
• A stored procedure
• A single COPY INTO statement correct
• A single INSERT INTO statement
--

12. Query compilation occurs in which architecture layer of the Snowflake Cloud Data
Platform?
• Compute layer
• Storage layer
• Cloud infrastructure layer
• Cloud services layer correct
--

Explanation:
Query parsing, compilation, and optimization happen in the cloud services layer, which
coordinates the platform. Query execution then takes place in the compute layer, where virtual
warehouses process data retrieved from the storage layer.
13. In which scenarios would a user have to pay Cloud Services costs? (Select TWO).
• Compute Credits = 50, Cloud Services = 10 correct
• Compute Credits = 80, Cloud Services = 5
• Compute Credits = 10, Cloud Services = 9
• Compute Credits = 120, Cloud Services = 10
• Compute Credits = 200, Cloud Services = 26 correct
--

14. Which Snowflake partner specializes in data catalog solutions?


• Alation correct
• DataRobot
• dbt
• Tableau
--

Explanation:
Alation provides data cataloging functionality. They describe themselves as the 'One Place to
Find, Understand, & Govern Data Across an Enterprise.'
https://docs.snowflake.com/en/user-guide/ecosystem-all.html

15. Which Snowflake technique can be used to improve the performance of a query?
• Clustering correct
• Indexing
• Fragmenting
• Using INDEX_HINTS
--

16. Which of the following objects can be shared through secure data sharing?
• Masking policy
• Stored procedure
• Task
• External table correct
--

Explanation:
Secure Data Sharing enables sharing selected objects in a database in your account with other
Snowflake accounts.

The following Snowflake database objects can be shared:

✑ Tables

✑ External tables

✑ Secure views

✑ Secure materialized views

✑ Secure UDFs

Snowflake enables the sharing of databases through shares, which are created by data providers
and “imported” by data consumers.
17. Which semi-structured file formats are supported when unloading data from a table?
(Select TWO).
• ORC
• XML
• Avro
• Parquet correct
• JSON correct
--

Explanation:
Semi-structured - JSON, Parquet

18. Which ACCOUNT_USAGE views are used to evaluate the details of dynamic data
masking? (Select TWO)
• ROLES
• POLICY_REFERENCES correct
• QUERY_HISTORY
• RESOURCE_MONITORS
• ACCESS_HISTORY correct
--

19. True or False: A 4X-Large Warehouse may, at times, take longer to provision than a X-
Small Warehouse.
• True correct
• False
--

Explanation:
Larger warehouses provision more compute resources, so a 4X-Large warehouse can
occasionally take longer to start than an X-Small one.

20. What happens to the underlying table data when a CLUSTER BY clause is added to a
Snowflake table?
• Data is hashed by the cluster key to facilitate fast searches for common data values
• Larger micro-partitions are created for common data values to reduce the number of
partitions that must be scanned
• Smaller micro-partitions are created for common data values to allow for more parallelism
• Data may be colocated by the cluster key within the micro-partitions to improve pruning
performance correct
--

21. What is a limitation of a Materialized View?


• A Materialized View cannot support any aggregate functions
• A Materialized View can only reference up to two tables
• A Materialized View cannot be joined with other tables
• A Materialized View cannot be defined with a JOIN correct
--

Explanation:
There are several limitations to using materialized views:

✑ A materialized view can query only a single table.


✑ Joins, including self-joins, are not supported.

22. What can be used to view warehouse usage over time? (Select Two).
• The load HISTORY view
• The Query history view
• The show warehouses command
• The WAREHOUSE_METERING_HISTORY view correct
• The billing and usage tab in the Snowflake web UI correct
--

23. What are the default Time Travel and Fail-safe retention periods for transient tables?
• Time Travel - 1 day. Fail-safe - 1 day
• Time Travel - 0 days. Fail-safe - 1 day
• Time Travel - 1 day. Fail-safe - 0 days correct
• Transient tables are retained in neither Fail-safe nor Time Travel
--

24. Which of the following is a valid source for an external stage when the Snowflake
account is located on Microsoft Azure?
• An FTP server with TLS encryption
• An HTTPS server with WebDAV
• A Google Cloud storage bucket correct
• A Windows server file share on Azure
--

25. Which of the following describes how multiple Snowflake accounts in a single
organization relate to various cloud providers?
• Each Snowflake account can be hosted in a different cloud vendor and region. correct
• Each Snowflake account must be hosted in a different cloud vendor and region
• All Snowflake accounts must be hosted in the same cloud vendor and region
• Each Snowflake account can be hosted in a different cloud vendor, but must be in the
same region.
--

26. When reviewing a query profile, what is a symptom that a query is too large to fit into
the memory?
• A single join node uses more than 50% of the query time
• Partitions scanned is equal to partitions total
• An AggregateOperator node is present
• The query is spilling to remote storage correct
--

27. A user unloaded a Snowflake table called mytable to an internal stage called mystage.

Which command can be used to view the list of files that has been uploaded to the
staged?
• list @mytable;
• list @%mytable;
• list @%mystage;
• list @mystage; correct
--

28. Which Snowflake object enables loading data from files as soon as they are available
in a cloud storage location?
• Pipe correct
• External stage
• Task
• Stream
--

Explanation:
Snowpipe enables loading data from files as soon as they're available in a stage. This means you
can load data from files in micro-batches, making it available to users within minutes, rather
than manually executing COPY statements on a schedule to load larger batches.

29. What SQL command would be used to view all roles that were granted to user.1?
• show grants to user USER1; correct
• show grants of user USER1;
• describe user USER1;
• show grants on user USER1;
--

30. What is a best practice after creating a custom role?


• Create the custom role using the SYSADMIN role.
• Assign the custom role to the SYSADMIN role correct
• Assign the custom role to the PUBLIC role
• Add _CUSTOM to all custom role names
--

Explanation:
When creating roles that will serve as the owners of securable objects in the system, Snowflake
recommends creating a hierarchy of custom roles, with the top-most custom role assigned to the
system role SYSADMIN. This role structure allows system administrators to manage all objects
in the account, such as warehouses and database objects, while restricting management of
users and roles to the USERADMIN role.
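The recommended hierarchy can be sketched with two statements (the role name is a placeholder):

```sql
-- Create the custom role (typically with USERADMIN or SECURITYADMIN),
-- then attach it under SYSADMIN so system administrators inherit
-- its privileges on any objects it will own.
CREATE ROLE analyst_role;
GRANT ROLE analyst_role TO ROLE SYSADMIN;
```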

31. True or False: Loading data into Snowflake requires that source data files be no larger
than 16MB.
• True
• False correct
--

Explanation:
By default, COPY INTO location statements separate table data into a set of output files to take
advantage of parallel operations. The maximum size for each file is set using the
MAX_FILE_SIZE copy option. The default value is 16777216 (16 MB) but can be increased to
accommodate larger files. The maximum file size supported is 5 GB for Amazon S3, Google
Cloud Storage, or Microsoft Azure stages. To unload data to a single output file (at the potential
cost of decreased performance), specify the SINGLE = true copy option in your statement. You
can optionally specify a name for the file in the path.
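The copy options described above can be sketched as follows (stage and table names are illustrative):

```sql
-- Unload into multiple files of up to ~100 MB each (the default cap is 16 MB)
COPY INTO @my_stage/unload/ FROM my_table
  MAX_FILE_SIZE = 104857600;

-- Unload into one named file instead, at a potential performance cost
COPY INTO @my_stage/unload/result.csv.gz FROM my_table
  SINGLE = TRUE;
```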

32. From which Snowflake interface can the PUT command be used to stage local files?
• SnowSQLcorrect
• Snowflake classic web interface (UI)
• Snowsight
• .NET driver
--

33. Which of the following conditions must be met in order to return results from the
results cache? (Select TWO).
• The user has the appropriate privileges on the objects associated with the querycorrect
• Micro-partitions have been reclustered since the query was last run
• The new query is run using the same virtual warehouse as the previous query
• The query includes a User Defined Function (UDF)
• The query has been run within 24 hours of the previously-run querycorrect
--

34. When is the result set cache no longer available? (Select TWO)
• When another warehouse is used to execute the query
• When another user executes the query
• When the underlying data has changedcorrect
• When the warehouse used to execute the query is suspended
• When it has been 24 hours since the last querycorrect
--
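One way to observe the result cache is to repeat an identical query within 24 hours; the session parameter below disables reuse, e.g. for benchmarking (the table name is illustrative):

```sql
SELECT COUNT(*) FROM my_table;  -- first run executes on the warehouse
SELECT COUNT(*) FROM my_table;  -- identical re-run is served from the result cache

-- Turn off result-cache reuse for the current session
ALTER SESSION SET USE_CACHED_RESULT = FALSE;
```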

35. A developer is granted ownership of a table that has a masking policy. The developer's
role is not able to see the masked data. Will the developer be able to modify the table to
read the masked data?
• Yes, because a table owner has full control and can unset masking policies.
• Yes, because masking policies only apply to cloned tables.
• No, because masking policies must always reference specific access roles.
• No, because ownership of a table does not include the ability to change masking
policiescorrect
--

36. Which of the following describes how clustering keys work in Snowflake?
• Clustering keys update the micro-partitions in place with a full sort, and impact the DML
operations.
• Clustering keys sort the designated columns over time, without blocking DML
operationscorrect
• Clustering keys create a distributed, parallel data structure of pointers to a table's rows
and columns
• Clustering keys establish a hashed key on each node of a virtual warehouse to optimize
joins at run-time
--

37. What feature can be used to reorganize a very large table on one or more columns?
• Micro-partitions
• Clustering keys correct
• Key partitions
• Clustered partitions
--
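A sketch of defining and inspecting a clustering key (table and column names are illustrative):

```sql
-- Designate one or more columns as the clustering key;
-- background reclustering proceeds without blocking DML
ALTER TABLE sales CLUSTER BY (sale_date, region);

-- Check how well the table is clustered on those columns
SELECT SYSTEM$CLUSTERING_INFORMATION('sales', '(sale_date, region)');
```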

38. How long is Snowpipe data load history retained?


• As configured in the create pipe settings
• Until the pipe is dropped
• 64 days
• 14 dayscorrect
--

Explanation:
Snowpipe load history is stored in the metadata of the pipe for 14 days. (By contrast, bulk data
load history via COPY is stored in the metadata of the target table for 64 days, available upon
completion of the COPY statement as the statement output.)

39. A user needs to create a materialized view in the schema MYDB.MYSCHEMA.

Which statements will provide this access?


• GRANT ROLE MYROLE TO USER USER1;
GRANT CREATE MATERIALIZED VIEW ON SCHEMA MYDB.MYSCHEMA TO ROLE MYROLE;correct
• GRANT ROLE MYROLE TO USER USER1;
GRANT CREATE MATERIALIZED VIEW ON SCHEMA MYDB.MYSCHEMA TO USER USER1;
• GRANT ROLE MYROLE TO USER USER1;
GRANT CREATE MATERIALIZED VIEW ON SCHEMA MYDB.MYSCHEMA TO USER1;
• GRANT ROLE MYROLE TO USER USER1;
GRANT CREATE MATERIALIZED VIEW ON SCHEMA MYDB.MYSCHEMA TO MYROLE;
--

40. What Snowflake role must be granted for a user to create and manage accounts?
• ACCOUNTADMIN
• ORGADMINcorrect
• SECURITYADMIN
• SYSADMIN
--

41. What data is stored in the Snowflake storage layer? (Select TWO).
• Snowflake parameters
• Micro-partitionscorrect
• Query history
• Persisted query resultscorrect
• Standard and secure view results
--

42. A company's security audit requires generating a report listing all Snowflake logins
(e.g.. date and user) within the last 90 days.

Which of the following statements will return the required information?


• SELECT LAST_SUCCESS_LOGIN, LOGIN_NAME
FROM ACCOUNT_USAGE.USERS;
• SELECT EVENT_TIMESTAMP, USER_NAME
FROM table(information_schema.login_history_by_user())
• SELECT EVENT_TIMESTAMP, USER_NAME
FROM ACCOUNT_USAGE.ACCESS_HISTORY;
• SELECT EVENT_TIMESTAMP, USER_NAME
FROM ACCOUNT_USAGE.LOGIN_HISTORY;correct
--
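A sketch of such a report; ACCOUNT_USAGE.LOGIN_HISTORY retains 365 days of history, whereas the INFORMATION_SCHEMA table function only retains 7 days:

```sql
SELECT event_timestamp, user_name
FROM snowflake.account_usage.login_history
WHERE event_timestamp > DATEADD(day, -90, CURRENT_TIMESTAMP())
ORDER BY event_timestamp DESC;
```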

43. How would you determine the size of the virtual warehouse used for a task?
• Root task may be executed concurrently (i.e. multiple instances), it is recommended to
leave some margins in the execution window to avoid missing instances of execution
• Querying (select) the size of the stream content would help determine the warehouse
size. For example, if querying large stream content, use a larger warehouse size
• If using the stored procedure to execute multiple SQL statements, it's best to test run the
stored procedure separately to size the compute resource firstcorrect
• Since task infrastructure is based on running the task body on schedule, it's
recommended to configure the virtual warehouse for automatic concurrency handling
using Multi-cluster warehouse (MCW) to match the task schedule
--

44. In the query profiler view for a query, which components represent areas that can be
used to help optimize query performance? (Select TWO)
• Bytes scannedcorrect
• Bytes sent over the network
• Number of partitions scanned
• Percentage scanned from cachecorrect
• External bytes scanned
--

45. User-level network policies can be created by which of the following roles? (Select
TWO).
• ROLEADMIN
• ACCOUNTADMINcorrect
• SYSADMIN
• SECURITYADMINcorrect
• USERADMIN
--

Explanation:
By default, Snowflake allows users to connect to the service from any computer or device IP
address. A security administrator (or higher) can create a network policy to allow or deny access
to a single IP address or a list of addresses. Network policies currently support only Internet
Protocol version 4 (i.e. IPv4) addresses.

An administrator with sufficient permissions can create any number of network policies.

46. A user has 10 files in a stage containing new customer data.

The ingest operation completes with no errors, using the following command:

COPY INTO my_table FROM @my_stage;

The next day the user adds 10 files to the stage so that now the stage contains a mixture
of new customer data and updates to the previous data. The user did not remove the 10
original files.

If the user runs the same copy into command what will happen?
• All data from all of the files on the stage will be appended to the table
• Only data about new customers from the new files will be appended to the table
• The operation will fail with the error uncertain files in stage.
• All data from only the newly-added files will be appended to the table.correct
--
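This behavior comes from the 64-day load metadata: files already loaded are skipped unless a reload is forced. For example:

```sql
COPY INTO my_table FROM @my_stage;               -- skips previously loaded files
COPY INTO my_table FROM @my_stage FORCE = TRUE;  -- reloads everything, risking duplicates
```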

47. What happens when a cloned table is replicated to a secondary database? (Select
TWO)
• A read-only copy of the cloned tables is stored.
• The replication will not be successful.
• The physical data is replicatedcorrect
• Additional costs for storage are charged to a secondary accountcorrect
• Metadata pointers to cloned tables are replicated
--

Explanation:
Cloned objects are replicated physically rather than logically to secondary databases. That is,
cloned tables in a standard database do not contribute to the overall data storage unless or until
DML operations on the clone add to or modify existing data.

However, when a cloned table is replicated to a secondary database, the physical data is also
replicated, increasing the data storage usage for your account.

48. Which feature is only available in the Enterprise or higher editions of Snowflake?
• Column-level securitycorrect
• SOC 2 type II certification
• Multi-factor Authentication (MFA)
• Object-level access control
--

Explanation:
https://docs.snowflake.com/en/user-guide/intro-editions.html

49. Which of the following Snowflake objects can be shared using a secure share? (Select
TWO).
• Materialized views
• Sequences
• Procedures
• Tablescorrect
• Secure User Defined Functions (UDFs)correct
--

Explanation:
Secure Data Sharing enables sharing selected objects in a database in your account with other
Snowflake accounts.

The following Snowflake database objects can be shared:

✑ Tables

✑ External tables

✑ Secure views
✑ Secure materialized views

✑ Secure UDFs
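A sketch of building such a share (database, object, and consumer account names are illustrative placeholders):

```sql
CREATE SHARE sales_share;
GRANT USAGE ON DATABASE sales_db TO SHARE sales_share;
GRANT USAGE ON SCHEMA sales_db.public TO SHARE sales_share;
GRANT SELECT ON TABLE sales_db.public.orders TO SHARE sales_share;
-- Add a consumer account (placeholder identifier)
ALTER SHARE sales_share ADD ACCOUNTS = consumer_org.consumer_account;
```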

50. Which of the following are benefits of micro-partitioning? (Select TWO)

• Micro-partitions cannot overlap in their range of values


• Micro-partitions are immutable objects that support the use of Time Travel.correct
• Micro-partitions can reduce the amount of I/O from object storage to virtual
warehousescorrect
• Rows are automatically stored in sorted order within micro-partitions
• Micro-partitions can be defined on a schema-by-schema basis
--

51. Which statement about billing applies to Snowflake credits?


• Credits are billed per-minute with a 60-minute minimum
• Credits are used to pay for cloud data storage usage
• Credits are consumed based on the number of credits billed for each hour that a
warehouse runs
• Credits are consumed based on the warehouse size and the time the warehouse is
runningcorrect
--

Explanation:
Snowflake credits are used to pay for the consumption of resources on Snowflake. A Snowflake
credit is a unit of measure, and it is consumed only when a customer is using resources, such as
when a virtual warehouse is running, the cloud services layer is performing work, or serverless
features are used. https://docs.snowflake.com/en/user-guide/what-are-credits.html

52. Which copy INTO command outputs the data into one file?
• SINGLE=TRUEcorrect
• MAX_FILE_NUMBER=1
• FILE_NUMBER=1
• MULTIPLE=FALSE
--

53. Which of the following compute resources or features are managed by Snowflake?
(Select TWO).
• Execute a COPY command
• Updating data
• Snowpipecorrect
• AUTOMATIC__CLUSTERINGcorrect
• Scaling up a warehouse
--

54. A user has unloaded data from Snowflake to a stage

Which SQL command should be used to validate which files were unloaded to the stage?
• list @file_stagecorrect
• show @file_stage
• view @file_stage
• verify @file_stage
--
55. Which of the following are valid methods for authenticating users for access into
Snowflake? (Select THREE)
• SCIM
• Federated authenticationcorrect
• TLS 1.2
• Key-pair authenticationcorrect
• OAuthcorrect
• OCSP authentication
--

56. Which Snowflake objects track DML changes made to tables, like inserts, updates, and
deletes?
• Pipes
• Streamscorrect
• Tasks
• Procedures
--
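A sketch of creating and consuming a stream (table and column names are illustrative):

```sql
-- Record DML changes on the source table
CREATE STREAM orders_stream ON TABLE orders;

-- Querying the stream returns the changed rows plus metadata columns
-- such as METADATA$ACTION; consuming it in DML advances the offset
INSERT INTO orders_audit
SELECT order_id, amount, METADATA$ACTION FROM orders_stream;
```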

57. What is the recommended file sizing for data loading using Snowpipe?
• A compressed file size greater than 100 MB, and up to 250 MBcorrect
• A compressed file size greater than 100 GB, and up to 250 GB
• A compressed file size greater than 10 MB, and up to 100 MB
• A compressed file size greater than 1 GB, and up to 2 GB
--

58. What is a responsibility of Snowflake's virtual warehouses?


• Infrastructure management
• Metadata management
• Query executioncorrect
• Query parsing and optimization
• Management of the storage layer
--

59. Which data type can be used to store geospatial data in Snowflake?
• Variant
• Object
• Geometry
• Geographycorrect
--

60. Which of the following are best practice recommendations that should be considered
when loading data into Snowflake? (Select TWO).
• Load files that are approximately 25 MB or smaller.
• Remove all dates and timestamps.
• Load files that are approximately 100-250 MB (or larger)correct
• Avoid using embedded characters such as commas for numeric data typescorrect
• Remove semi-structured data types
--

61. In which use cases does Snowflake apply egress charges?


• Data sharing within a specific region
• Query result retrieval
• Database replicationcorrect
• Loading data into Snowflake
--

Explanation:
Cloud providers apply data egress charges in either of the following use cases:

✑ Data is transferred from one region to another within the same cloud platform.

✑ Data is transferred out of the cloud platform.

62. How often are encryption keys automatically rotated by Snowflake?


• 30 Dayscorrect
• 60 Days
• 90 Days
• 365 Days
--

Explanation:
All Snowflake-managed keys are automatically rotated by Snowflake when they are more than
30 days old. Active keys are retired, and new keys are created. When Snowflake determines the
retired key is no longer needed, the key is automatically destroyed.

63. What is a machine learning and data science partner within the Snowflake Partner
Ecosystem?
• Informatica
• Power BI
• Adobe
• DataRobotcorrect
--

64. Which Snowflake feature is used for both querying and restoring data?
• Cluster keys
• Time Travelcorrect
• Fail-safe
• Cloning
--

65. What is a key feature of Snowflake architecture?


• Zero-copy cloning creates a mirror copy of a database that updates with the original
• Software updates are automatically applied on a quarterly basis
• Snowflake eliminates resource contention with its virtual warehouse
implementationcorrect
• Multi-cluster warehouses allow users to run a query that spans across multiple clusters
• Snowflake automatically sorts DATE columns during ingest for fast retrieval by date
--

66. Which services does the Snowflake Cloud Services layer manage? (Select TWO).
• Compute resources
• Query execution
• Authenticationcorrect
• Data storage
• Metadatacorrect
--

Explanation:
https://docs.snowflake.com/en/user-guide/intro-key-concepts.html The cloud services layer is a
collection of services that coordinate activities across Snowflake. These services tie together all
of the different components of Snowflake in order to process user requests, from login to query
dispatch. The cloud services layer also runs on compute instances provisioned by Snowflake
from the cloud provider.

Services managed in this layer include:

✑ Authentication

✑ Infrastructure management

✑ Metadata management

✑ Query parsing and optimization

✑ Access control

67. What happens when an external or an internal stage is dropped? (Select TWO).
• When dropping an external stage, the files are not removed and only the stage is
droppedcorrect
• When dropping an external stage, both the stage and the files within the stage are
removed
• When dropping an internal stage, the files are deleted with the stage and the files are
recoverable
• When dropping an internal stage, the files are deleted with the stage and the files are not
recoverablecorrect
• When dropping an internal stage, only selected files are deleted with the stage and are
not recoverable
--

68. What is the default File Format used in the COPY command if one is not specified?
• CSVcorrect
• JSON
• Parquet
• XML
--

69. When unloading to a stage, which of the following is a recommended practice or


approach?
• Set SINGLE = true for larger files
• Use OBJECT_CONSTRUCT ( * ) when using Parquet
• Avoid the use of the CAST function
• Define an individual file formatcorrect
--

70. Where would a Snowflake user find information about query activity from 90 days
ago?
• ACCOUNT_USAGE.QUERY_HISTORY viewcorrect
• ACCOUNT_USAGE.QUERY_HISTORY_ARCHIVE view
• INFORMATION_SCHEMA.QUERY_HISTORY view
• INFORMATION_SCHEMA.QUERY_HISTORY_BY_SESSION view
--

71. Which of the following Snowflake capabilities are available in all Snowflake editions?
(Select TWO)
• Customer-managed encryption keys through Tri-Secret Secure
• Automatic encryption of all datacorrect
• Up to 90 days of data recovery through Time Travel
• Object-level access controlcorrect
• Column-level security to apply data masking policies to tables and views
--

72. What is the minimum Snowflake edition required to create a materialized view?
• Standard Edition
• Enterprise Editioncorrect
• Business Critical Edition
• Virtual Private Snowflake Edition
--

Explanation:
Materialized views require Enterprise Edition. To inquire about upgrading, please contact
Snowflake Support.

73. The Information Schema and Account Usage Share provide storage information for
which of the following objects? (Choose three.)
• Users
• Tablescorrect
• Databasescorrect
• Internal Stagescorrect
--

74. Which stage type can be altered and dropped?


• Database stage
• External stage
• Table stagecorrect
• User stage
--

75. A user is loading JSON documents composed of a huge array containing multiple
records into Snowflake. The user enables the STRIP_OUTER_ARRAY file format option.

What does the STRIP_OUTER_ARRAY file format do?


• It removes the last element of the outer array.
• It removes the outer array structure and loads the records into separate table
rows.correct
• It removes the trailing spaces in the last element of the outer array and loads the records
into separate table columns
• It removes the NULL elements from the JSON object eliminating invalid data and enables
the ability to load the records
--
Explanation:
Data Size Limitations

The VARIANT data type imposes a 16 MB size limit on individual rows.

For some semi-structured data formats (e.g. JSON), data sets are frequently a simple
concatenation of multiple documents. The JSON output from some software is composed of a
single huge array containing multiple records. There is no need to separate the documents with
line breaks or commas, though both are supported.

If the data exceeds 16 MB, enable the STRIP_OUTER_ARRAY file format option forthe COPY
INTO <table> command to remove the outer array structure and load therecords into separate
table rows:

copy into <table>
from @~/<file>.json
file_format = (type = 'JSON' strip_outer_array = true);

76. Which of the following Snowflake features provide continuous data protection
automatically? (Select TWO).
• Internal stages
• Incremental backups
• Time Travelcorrect
• Zero-copy clones
• Fail-safecorrect
--

Explanation:
Time travel and fail safe are the two continuous data protection features support the recovery of
data automatically.

Snowflake provides powerful CDP features for ensuring the maintenance and availability of your
historical data (i.e. data that has been changed or deleted):

✑ Querying, cloning, and restoring historical data in tables, schemas, and databases for up to 90
days through Snowflake Time Travel.

✑ Disaster recovery of historical data (by Snowflake) through Snowflake Fail-safe.

https://docs.snowflake.com/en/user-guide/data-availability.html

77. During periods of warehouse contention which parameter controls the maximum
length of time a warehouse will hold a query for processing?
• STATEMENT_TIMEOUT_IN_SECONDS
• STATEMENT_QUEUED_TIMEOUT_IN_SECONDScorrect
• MAX_CONCURRENCY_LEVEL
• QUERY_TIMEOUT_IN_SECONDS
--

Explanation:
The parameter STATEMENT_QUEUED_TIMEOUT_IN_SECONDS sets the limit for a query to
wait in the queue in order to get its chance of running on the warehouse. The query will quit after
reaching this limit. By default, the value of this parameter is 0, which means queries will wait
indefinitely in the queue.
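For example, applied to a warehouse (name and value are illustrative):

```sql
-- Cancel any query that has waited in this warehouse's queue for over 5 minutes
ALTER WAREHOUSE my_wh SET STATEMENT_QUEUED_TIMEOUT_IN_SECONDS = 300;
```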

78. True or False: Fail-safe can be disabled within a Snowflake account.


• True
• Falsecorrect
--

Explanation:
Separate and distinct from Time Travel, Fail-safe ensures historical data is protected in the event
of a system failure or other catastrophic event, e.g. a hardware failure or security breach. Fail
safe feature cannot be enabled or disabled from the user end .

79. Which of the following describes external functions in Snowflake?


• They are a type of User-defined Function (UDF).correct
• They contain their own SQL code.
• They call code that is stored inside of Snowflake.
• They can return multiple rows for each row received
--

Explanation:
External functions are user-defined functions that are stored and executed outside of Snowflake.

External functions make it easier to access external API services such as geocoders, machine
learning models, and other custom code running outside of Snowflake. This feature eliminates
the need to export and reimport data when using third-party services, significantly simplifying
your data pipelines..

80. Which data types does Snowflake support when querying semi-structured data?
(Select TWO)
• VARIANTcorrect
• ARRAYcorrect
• VARCHAR
• XML
• BLOB
--

Explanation:
A VARIANT stores semi-structured data in Snowflake. It can store a value of any other type,
including OBJECT and ARRAY.

The maximum length of a VARIANT is 16 MB.

A Snowflake ARRAY is similar to an array in many other programming languages. An ARRAY


contains 0 or more pieces of data. Each element is accessed by specifying its position in the
array.

81. What is the default character set used when loading CSV files into Snowflake?
• UTF-8correct
• UTF-16
• ISO-8859-1
• ANSI_X3.A
--

Explanation:
For delimited files (CSV, TSV, etc.), the default character set is UTF-8. To use any other
character sets, you must explicitly specify the encoding to use for loading. For the list of
supported character sets, see Supported Character Sets for Delimited Files (in this topic).

82. What are two ways to create and manage Data Shares in Snowflake? (Choose two.)
• Via the Snowflake Web Interface (UI)correct
• Via the data_share=true parameter
• Via SQL commandscorrect
• Via Virtual Warehouses
--

83. A marketing co-worker has requested the ability to change the warehouse size on their
medium virtual warehouse called MKTG_WH.

Which of the following statements will accommodate this request?


• ALLOW RESIZE ON WAREHOUSE MKTG_WH TO USER MKTG_LEAD;
• GRANT MODIFY ON WAREHOUSE MKTG_WH TO ROLE MARKETING;correct
• GRANT MODIFY ON WAREHOUSE MKTG_WH TO USER MKTG_LEAD;
• GRANT OPERATE ON WAREHOUSE MKTG_WH TO ROLE MARKETING;
--

84. True or False: A Virtual Warehouse can be resized while suspended.


• Truecorrect
• False
--

Explanation:
Reference: https://docs.snowflake.com/en/user-guide/warehouses-tasks.html#effects-of-resizing-
a-suspended-warehouse

85. When reviewing the load for a warehouse using the load monitoring chart, the chart
indicates that a high volume of Queries are always queuing in the warehouse

According to recommended best practice, what should be done to reduce the Queue
volume? (Select TWO).
• Use multi-clustered warehousing to scale out warehouse capacity.correct
• Scale up the warehouse size to allow Queries to execute faster.
• Stop and start the warehouse to clear the queued queries
• Migrate some queries to a new warehouse to reduce loadcorrect
• Limit user access to the warehouse so fewer queries are run against it.
--

86. A user has an application that writes a new file to a cloud storage location every 5
minutes.

What would be the MOST efficient way to get the files into Snowflake?
• Create a task that runs a copy into operation from an external stage every 5 minutes
• Create a task that puts the files in an internal stage and automate the data loading wizard
• Create a task that runs a GET operation to intermittently check for new files
• Set up cloud provider notifications on the file location and use Snowpipe with auto-
ingestcorrect
--
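A sketch of the auto-ingest setup, assuming event notifications have already been configured on the storage location (object names are illustrative):

```sql
CREATE PIPE my_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO my_table
  FROM @my_ext_stage
  FILE_FORMAT = (TYPE = 'CSV');
```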

87. A company strongly encourages all Snowflake users to self-enroll in Snowflake's


default Multi-Factor Authentication (MFA) service to provide increased login security for
users connecting to Snowflake.

Which application will the Snowflake users need to install on their devices in order to
connect with MFA?
• Okta Verify
• Duo Mobilecorrect
• Microsoft Authenticator
• Google Authenticator
--

88. The fail-safe retention period is how many days?


• 1 day
• 7 dayscorrect
• 45 days
• 90 days
--

89. True or False: Reader Accounts are able to extract data from shared data objects for
use outside of Snowflake.
• Truecorrect
• False
--

90. What is the MOST performant file format for loading data in Snowflake?
• CSV (Unzipped)correct
• Parquet
• CSV (Gzipped)
• ORC
--

91. Which cache type is used to cache data output from SQL queries?
• Metadata cache
• Result cachecorrect
• Remote cache
• Local file cache
--

92. True or False: When you create a custom role, it is a best practice to immediately grant
that role to ACCOUNTADMIN.
• True
• Falsecorrect
--

93. What tasks can be completed using the copy command? (Select TWO)
• Columns can be aggregated
• Columns can be joined with an existing table
• Columns can be reorderedcorrect
• Columns can be omittedcorrect
• Data can be loaded without the need to spin up a virtual warehouse
--
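For example, columns can be reordered or omitted during a COPY by selecting from the staged files positionally (names are illustrative):

```sql
-- Load only two columns, swapping their order relative to the file
COPY INTO my_table (city, country)
FROM (SELECT $2, $1 FROM @my_stage/data.csv.gz);
```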

94. Which of the following commands cannot be used within a reader account?
• CREATE SHAREcorrect
• ALTER WAREHOUSE
• DROP ROLE
• SHOW SCHEMAS
• DESCRIBE TABLE
--

Explanation:
A reader account is intended primarily for querying data shared by the provider of the account.
Adding new data to the account and/or updating shared data in the account is not supported.
Changing the configuration of virtual warehouses is also not permitted as those resources are
owned and managed by the provider of the account which is sharing the data.

95. A sales table FCT_SALES has 100 million records.

The following Query was executed

SELECT COUNT(1) FROM FCT_SALES;

How did Snowflake fulfill this query?


• Query against the result set cache
• Query against a virtual warehouse cache
• Query against the most-recently created micro-partition
• Query against the metadata cachecorrect
--

96. Which command is used to unload data from a Snowflake table into a file in a stage?
• COPY INTOcorrect
• GET
• WRITE
• EXTRACT INTO
--

97. Which command can be used to load data into an internal stage?
• LOAD
• COPY
• GET
• PUTcorrect
--
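PUT runs from a client such as SnowSQL (it is not supported in the web UI). A sketch (file paths are illustrative):

```sql
-- Upload a local file to the table stage for my_table, compressing on the way
PUT file:///tmp/data.csv @%my_table AUTO_COMPRESS = TRUE;
-- Then load it
COPY INTO my_table;
```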

98. What Snowflake features allow virtual warehouses to handle high concurrency
workloads? (Select TWO)
• The ability to scale up warehouses
• The use of warehouse auto scalingcorrect
• The ability to resize warehouses
• Use of multi-clustered warehousescorrect
• The use of warehouse indexing
--

99. What transformations are supported in a CREATE PIPE ... AS COPY ... FROM
(....)statement? (Select TWO.)
• Data can be filtered by an optional where clausecorrect
• Incoming data can be joined with other tables
• Columns can be reorderedcorrect
• Columns can be omittedcorrect
• Row level access can be defined
--

100. What types of data listings are available in the Snowflake Data Marketplace? (Choose
two.)
• Reader
• Consumer
• Vendor
• Standardcorrect
• Personalizedcorrect
--

1. True or False: When a user creates a role, they are initially assigned ownership of the
role and they maintain ownership until it is transferred to another user.
• Truecorrect
• False
--

2. True or False: You can define multiple columns within a clustering key on a table.
• Truecorrect
• False
--

3. True or False: When active, a pipe requires a dedicated Virtual Warehouse to execute.
• True
• Falsecorrect
--

4. Which of the following statements is true of data loading?


• Resizing the virtual warehouse from X-Small to Small will process a single file twice as
fast
• The ideal file size for loading is 16MB to match micro-partition size
• Many files in the 10-100MB range tend to land in the "sweet spot" for load parallelismcorrect
• Once loaded, there is no option to force a reload of an already loaded file
--

5. which of the following are valid approaches to loading data into a snowflake table?
select all the below that apply.
• Bulk copy from an External Stagecorrect
• Continuous load using Snowpipe REST APIcorrect
• The Snowflake Web Interface (UI) data loading wizardcorrect
• Bulk copy from an Internal Stagecorrect
--

6. Which of the following are valid methods for authenticating users for access into
Snowflake? (Select THREE)
• SCIM
• Federated authenticationcorrect
• TLS 1.2
• Key-pair authenticationcorrect
• OAuthcorrect
• OCSP authentication
--

7. Snowflake recommends, at a minimum, that all users with the following role(s) should
be enrolled in Multi-Factor Authentication (MFA):
• SECURITYADMIN, ACCOUNTADMIN, PUBLIC, SYSADMIN
• SECURITYADMIN ACCOUNTADMIN, SYSADMIN
• SECURITYADMIN, ACCOUNTADMIN
• ACCOUNTADMINcorrect
--

8. Which of the following statement is true of Snowflake?


• It was built specifically for the cloudcorrect
• It was built as an on-premises solution and then ported to the cloud
• It was designed as a hybrid database to allow customers to store data either on premises
or in the cloud
• It was built for Hadoop architecture
• It's based on an Oracle Architecture
--

9. True or False: A third-party tool that supports standard JDBC or ODBC but has no
Snowflake-specific driver will be unable to connect to Snowflake.
• True
• Falsecorrect
--

Explanation:
Snowflake provides a JDBC type 4 driver that supports core JDBC functionality. The JDBC driver
must be installed in a 64-bit environment and requires Java 1.8 (or higher). The driver can be
used with most client tools/applications that support JDBC for connecting to a database server.

10. What parameter controls if the Virtual warehouse starts immediately after the CREATE
WAREHOUSE statement?
• INITIALLY_SUSPENDED = TRUE/FALSEcorrect
• START_AFTER_CREATE = TRUE/FALSE
• START_TIME = 60 // (seconds from now)
• START_TIME = CURRENT_DATE()
--
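For example (warehouse name and settings are illustrative):

```sql
-- Created suspended: consumes no credits until it is first resumed
CREATE WAREHOUSE etl_wh
  WAREHOUSE_SIZE = 'MEDIUM'
  INITIALLY_SUSPENDED = TRUE
  AUTO_SUSPEND = 60
  AUTO_RESUME = TRUE;
```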

11. What is the minimum Snowflake edition required to create a materialized view?
• Standard Edition
• Enterprise Editioncorrect
• Business Critical Edition
• Virtual Private Snowflake Edition
--

Explanation:
Materialized views require Enterprise Edition. To inquire about upgrading, please contact
Snowflake Support

12. Which type of table corresponds to a single Snowflake session?

• Temporarycorrect
• Transient
• Provisional
• Permanent
--

Explanation:
Snowflake supports creating temporary tables for storing non-permanent, transitory data (e.g.
ETL data, session-specific data). Temporary tables only exist within the session in which they
were created and persist only for the remainder of the session.

13. True or False: It is possible to query data from an Internal or named External stage
without loading the data into Snowflake.
• Truecorrect
• False
--
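For example, staged files can be queried directly by column position without loading them (stage, file, and format names are illustrative):

```sql
SELECT t.$1, t.$2
FROM @my_stage/data.csv.gz (FILE_FORMAT => 'my_csv_format') t;
```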

14. What are ways to create and manage data shares in Snowflake? (Select TWO)
• Through the Snowflake web interface (UI)correct
• Through the DATA_SHARE=TRUE parameter
• Through SQL commandscorrect
• Through the ENABLE_SHARE=TRUE parameter
• Using the CREATE SHARE AS SELECT * TABLE command
--

15. What are the three things customers want most from their enterprise data warehouse
solution? Choose 3 answers
• On-premise availability
• Simplicitycorrect
• Open source based
• Concurrencycorrect
• Performancecorrect
--

16. True or False: Snowflake charges a premium for storing semi-structured data.
• True
• Falsecorrect
--

17. Which of the following statements are true about Schemas in Snowflake? (Choose
two.)
• A Schema may contain one or more Databases
• A Database may contain one or more Schemascorrect
• A Schema is a logical grouping of Database Objectscorrect
• Each Schema is contained within a Warehouse
--

18. What is the minimum duration charged when starting a virtual warehouse?
• 1 second
• 1 minutecorrect
• 1 hour
• 1 day
--

19. The number of queries that a Virtual Warehouse can concurrently process is
determined by: Choose 2 answers
• The complexity of each querycorrect
• The CONCURRENT_QUERY_LIMIT parameter set on the Snowflake account
• The size of the data required for each querycorrect
• The tool that is executing the query
--

20. On which of the following cloud platform can a Snowflake account be hosted? Choose
2 answers
• Amazon Web Servicescorrect
• Private Virtual Cloud
• Oracle Cloud
• Microsoft Azure Cloudcorrect
--

21. True or False: All Snowflake table types include fail-safe storage.
• True
• Falsecorrect
--

22. True or False: AWS Private Link provides a secure connection from the Customer’s
on-premise data center to the Snowflake.
• True
• Falsecorrect
--

23. True or False: It is possible to unload structured data to semi-structured formats such
as JSON and parquet.
• Truecorrect
• False
--

24. What transformations are supported in a CREATE PIPE ... AS COPY ... FROM (....)
statement? (Select TWO.)
• Data can be filtered by an optional where clausecorrect
• Incoming data can be joined with other tables
• Columns can be reorderedcorrect
• Columns can be omittedcorrect
• Row level access can be defined
--
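The transformations supported in a pipe definition can be illustrated with a short sketch (the database, table, and stage names below are hypothetical):

```sql
-- Columns can be omitted and reordered in the pipe's COPY transformation.
-- Assumed objects: table mydb.raw.orders, stage @mydb.raw.orders_stage
CREATE OR REPLACE PIPE mydb.raw.orders_pipe AS
  COPY INTO mydb.raw.orders (order_id, amount)  -- target column list omits other columns
  FROM (
    SELECT $2, $1                               -- source columns reordered
    FROM @mydb.raw.orders_stage
  );
```

Joins with other tables and row-level access definitions are not part of this syntax.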
25. What is the minimum Snowflake edition that provides data sharing?
• Standardcorrect
• Premier
• Enterprise
• Business Critical Edition
--

26. Which of the following connectors allow Multi-Factor Authentication (MFA) authorization when connecting? (Choose all that apply.)
• JDBCcorrect
• SnowSQLcorrect
• Snowflake Web Interface (UI)correct
• ODBCcorrect
• Pythoncorrect
--

27. How long is Snowpipe data load history retained?


• As configured in the create pipe settings
• Until the pipe is dropped
• 64 dayscorrect
• 14 days
--

Explanation:
Bulk data load

Stored in the metadata of the target table for 64 days. Available upon completion of the COPY
statement as the statement output.

28. True or False: Snowflake’s data warehouse was built from the ground up for the cloud
in lieu of using an existing database or a platform, like Hadoop, as a base.
• True
• Falsecorrect
--

29. What happens when a Data Provider revokes privileges to a Share on an object in their
source database?
• The object immediately becomes unavailable for all Data Consumerscorrect
• Any additional data arriving after this point in time will not be visible to Data Consumers
• The Data Consumers stop seeing data updates and become responsible for storage
charges for the object
• A static copy of the object at the time the privilege was revoked is created In the Data
Consumers' accounts
--

30. Which of the following connectors are available in the Downloads section of the
Snowflake web Interface (UI) Choose 2 answers
• SnowSQLcorrect
• ODBCcorrect
• R
• HIVE
--
31. What privileges are required to create a task?
• The global privilege create task is required to create a new task.
• Tasks are created at the Application level and can only be created by the Account Admin
role.
• Many Snowflake DDLs are metadata operations only, and create task DDL can be
executed without virtual warehouse requirement or task specific grants.
• The role must have access to the target schema and the create task privilege on the
schema itself.correct
--

Explanation:
All tasks in a simple tree must have the same task owner (i.e. a single role must have the
OWNERSHIP privilege on all of the tasks in the tree). All tasks in a simple tree must exist in the
same schema.
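The required privileges can be sketched as follows (role, schema, and warehouse names are hypothetical; note that actually running the task also requires usage on a warehouse):

```sql
-- Grant access to the target schema plus CREATE TASK on the schema itself
GRANT USAGE ON DATABASE mydb TO ROLE etl_role;
GRANT USAGE, CREATE TASK ON SCHEMA mydb.myschema TO ROLE etl_role;
GRANT USAGE ON WAREHOUSE my_wh TO ROLE etl_role;

USE ROLE etl_role;
CREATE TASK mydb.myschema.nightly_load
  WAREHOUSE = my_wh
  SCHEDULE = 'USING CRON 0 2 * * * UTC'  -- run daily at 02:00 UTC
AS
  INSERT INTO mydb.myschema.target SELECT * FROM mydb.myschema.staging;
```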

32. Which data types does Snowflake support when querying semi-structured data?
(Select TWO)

• VARIANTcorrect
• ARRAYcorrect
• VARCHAR
• XML
• BLOB
--

Explanation:
A VARIANT stores semi-structured data in Snowflake. It can store a value of any other type,
including OBJECT and ARRAY. The maximum length of a VARIANT is 16 MB.

A Snowflake ARRAY is similar to an array in many other programming languages. An ARRAY contains 0 or more pieces of data. Each element is accessed by specifying its position in the array.

33. Which of the following statements describes a benefit of Snowflake's separation of compute and storage? (Choose all that apply.)
• Growth of storage and compute are tightly coupled together
• Storage expands without the requirement to add more computecorrect
• Compute can be scaled up or down without the requirement to add more storagecorrect
• Multiple compute clusters can access stored data without contentioncorrect
--

34. What feature can be used to reorganize a very large table on one or more columns?
• Micro-partitions
• Clustering keyscorrect
• Key partitions
• Clustered partitions
--
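Defining and inspecting a clustering key is a one-line operation each way (the table and column names below are hypothetical):

```sql
-- Override natural clustering by declaring a clustering key
ALTER TABLE sales CLUSTER BY (region, sale_date);

-- Check how well the table is clustered on those columns
SELECT SYSTEM$CLUSTERING_INFORMATION('sales', '(region, sale_date)');
```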

35. What are two ways to create and manage Data Shares in Snowflake? (Choose two.)
• Via the Snowflake Web Interface (UI)correct
• Via the data_share=true parameter
• Via SQL commandscorrect
• Via Virtual Warehouses
--

36. Which COPY INTO option outputs the data into one file?
• SINGLE=TRUEcorrect
• MAX_FILE_NUMBER=1
• FILE_NUMBER=1
• MULTIPLE=FALSE
--
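The documented COPY INTO <location> option that forces a single output file is SINGLE = TRUE; a sketch (stage and table names are hypothetical):

```sql
-- Unload the whole table into one file instead of several parallel files
COPY INTO @my_stage/unload/result.csv
FROM my_table
FILE_FORMAT = (TYPE = CSV)
SINGLE = TRUE;
```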

37. How many shares can be consumed by single Data Consumer?


• 1
• 10
• 100, but can be increased by contacting support
• Unlimitedcorrect
--

38. What command is used to load files into an Internal Stage within Snowflake?
• PUTcorrect
• COPY INTO
• TRANSFER
• INSERT
--

Explanation:
You must specify an internal stage in the PUT command when uploading files to Snowflake. You
must specify the same stage in the COPY INTO <table> command when loading data into a
table from the staged files.
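The two-step flow looks like this (local path, stage, and table names are hypothetical; PUT is issued from a client such as SnowSQL, not from the web worksheet):

```sql
-- Step 1: upload local files to the table's internal stage
PUT file:///tmp/data/orders_*.csv @%orders;

-- Step 2: load the staged files into the table
COPY INTO orders
FROM @%orders
FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
```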

39. True or False: The user has to specify which cluster a query will run on in a multi-cluster Warehouse.
• True
• Falsecorrect
--

40. Why would a customer resize a Virtual Warehouse from an X-Small to a Medium?
• To accommodate more queries
• To accommodate more users
• To accommodate fluctuations in workload
• To accommodate a more complex workloadcorrect
--

41. How would you determine the size of the virtual warehouse used for a task?
• Root task may be executed concurrently (i.e. multiple instances), it is recommended to
leave some margins in the execution window to avoid missing instances of execution
• Querying (select) the size of the stream content would help determine the warehouse
size. For example, if querying large stream content, use a larger warehouse size
• If using the stored procedure to execute multiple SQL statements, it's best to test run the
stored procedure separately to size the compute resource firstcorrect
• Since task infrastructure is based on running the task body on schedule, it's
recommended to configure the virtual warehouse for automatic concurrency handling
using Multi-cluster warehouse (MCW) to match the task schedule
--
42. Which of the following are examples of operations that require a Virtual Warehouse to complete, assuming no queries have been executed previously? Choose 3 answers
• MIN(<<column value>>)
• COPYcorrect
• SUM(<<column value>>)correct
• UPDATEcorrect
--

43. What are the default Time Travel and Fail-safe retention periods for transient tables?
• Time Travel - 1 day. Fail-safe - 1 day
• Time Travel - 0 days. Fail-safe - 1 day
• Time Travel - 1 day. Fail-safe - 0 dayscorrect
• Transient tables are retained in neither Fail-safe nor Time Travel
--
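A transient table's retention settings can be made explicit at creation time (the table name and columns below are hypothetical):

```sql
-- Transient tables allow Time Travel of 0 or 1 day and never have Fail-safe
CREATE TRANSIENT TABLE staging_events (
  id      NUMBER,
  payload VARIANT
)
DATA_RETENTION_TIME_IN_DAYS = 1;  -- Time Travel: 1 day; Fail-safe: always 0 days
```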

44. What happens when a virtual warehouse is resized?


• When increasing the size of an active warehouse the compute resource for all running
and queued queries on the warehouse are affected
• When reducing the size of a warehouse the compute resources are removed only when
they are no longer being used to execute any current statements.correct
• The warehouse will be suspended while the new compute resource is provisioned and
will resume automatically once provisioning is complete.
• Users who are trying to use the warehouse will receive an error message until the
resizing is complete
--

45. Which statements are true of micro-partitions? Choose 2 answers


• They are approximately 16MB in sizecorrect
• They are stored compressed only if COMPRESS=TRUE on Table
• They are Immutablecorrect
• They are only encrypted in the Enterprise edition and above
--

46. Which Snowflake partner specializes in data catalog solutions?


• Alationcorrect
• DataRobot
• dbt
• Tableau
--

Explanation:
Alation provides Data Cataloging functionality. They state they are the 'One Place to Find,
Understand, & Govern Data Across an Enterprise.

47. A virtual warehouse's auto-suspend and auto-resume settings apply to which of the
following?

• The primary cluster in the virtual warehouse


• The entire virtual warehousecorrect
• The database in which the virtual warehouse resides
• The Queries currently being run on the virtual warehouse
--
48. True or False: Snowpipe via REST API can only reference External Stages as source.
• True
• Falsecorrect
--

49. Which statement best describes "clustering"?


• Clustering represents the way data is grouped together and stored within snowflake's
micro-partitionscorrect
• The database administrator must define the clustering methodology for each Snowflake
table.
• The clustering key must be included on the COPY command when loading data into Snowflake.
• Clustering can be disabled within a Snowflake account.
--

50. Which of the following statements would be used to export/unload data from
Snowflake?
• COPY INTO @stagecorrect
• EXPORT TO @stage
• INSERT INTO @stage
• EXPORT_TO_STAGE (stage => @stage, select => 'select * from t1');
--

51. What is a machine learning and data science partner within the Snowflake Partner
Ecosystem?
• Informatica
• Power BI
• Adobe
• DataRobotcorrect
--

52. What is a key feature of Snowflake architecture?


• Zero-copy cloning creates a mirror copy of a database that updates with the original
• Software updates are automatically applied on a quarterly basis
• Snowflake eliminates resource contention with its virtual warehouse implementationcorrect
• Multi-cluster warehouses allow users to run a query that spans across multiple clusters
• Snowflake automatically sorts DATE columns during ingest for fast retrieval by date
--

53. Which of the following roles is recommended to be used to create and manage users
and roles?
• SYSADMIN
• SECURITYADMIN correct
• PUBLIC
• ACCOUNTADMIN
--

Explanation:
"Security admin: Role that can manage any object grant globally, as well as create, monitor, and
manage users and roles"
54. Which of the following statements are true of VALIDATION_MODE in Snowflake?
(Choose two.)
• The validation_mode option is used when creating an Internal Stage
• validation_mode=return_all_errors is a parameter of the copy commandcorrect
• The validation_mode option will validate data to be loaded by the copy statement while
completing the load and will return the rows that could not be loaded without error
• The validation_mode option will validate data to be loaded by the copy statement without
completing the load and will return possible errors correct
--
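A validation-only run looks like this (stage and table names are hypothetical):

```sql
-- Check staged files for errors without loading any rows
COPY INTO my_table
FROM @my_stage
VALIDATION_MODE = RETURN_ERRORS;  -- alternatives: RETURN_ALL_ERRORS, RETURN_10_ROWS
```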

55. True or False: Once created, a micro-partition will never be changed.


• True correct
• False
--

56. To run a Multi-Cluster Warehouse in auto-scale mode, a user would:


• Configure the Maximum Clusters setting to “Auto-Scale”
• Set the Warehouse type to “Auto”
• Set the Minimum Clusters and Maximum Clusters settings to the same value
• Set the Minimum Clusters and Maximum Clusters settings to different values correct
--
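Auto-scale mode is configured by giving minimum and maximum cluster counts different values (warehouse name and values below are hypothetical; multi-cluster warehouses require Enterprise Edition or higher):

```sql
CREATE WAREHOUSE reporting_wh
  WAREHOUSE_SIZE    = 'MEDIUM'
  MIN_CLUSTER_COUNT = 1          -- different MIN and MAX => auto-scale mode
  MAX_CLUSTER_COUNT = 4
  SCALING_POLICY    = 'STANDARD';
-- Setting MIN_CLUSTER_COUNT equal to MAX_CLUSTER_COUNT runs in maximized mode instead.
```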

57. Which of the following describes how clustering keys work in Snowflake?
• Clustering keys update the micro-partitions in place with a full sort, and impact the DML
operations.
• Clustering keys sort the designated columns over time, without blocking DML operations
correct
• Clustering keys create a distributed, parallel data structure of pointers to a table's rows
and columns
• Clustering keys establish a hashed key on each node of a virtual warehouse to optimize
joins at run-time
--

58. Which of the following are benefits of micro-partitioning? (Select TWO)


• Micro-partitions cannot overlap in their range of values
• Micro-partitions are immutable objects that support the use of Time Travel.correct
• Micro-partitions can reduce the amount of I/O from object storage to virtual warehousescorrect
• Rows are automatically stored in sorted order within micro-partitions
• Micro-partitions can be defined on a schema-by-schema basis
--

59. Which interfaces can be used to create and/or manage Virtual Warehouses?
• The Snowflake Web Interface (UI)
• SQL commands
• Data integration tools
• All of the abovecorrect
--

60. When scaling out by adding clusters to a multi-cluster warehouse, you are primarily
scaling for improved:
• Concurrencycorrect
• Performance
--

61. True or False: You can query the files in an External Stage directly without having to
load the data into a table.
• Truecorrect
• False
--

External tables are read-only, therefore no DML operations can be performed on them; however,
external tables can be used for query and join operations. Views can be created against external
tables.

62. Which are true of Snowflake roles?


• All grants to objects are given to roles, and never to userscorrect
• In order to do DML/DDL, a user needs to have selected a single role that has that
specific access to the object and operationcorrect
• The PUBLIC role controls all other roles
• Roles are a subset of users and users own objects in Snowflake
--

63. Query results are stored in the Result Cache for how long after they are last accessed,
assuming no data changes have occurred?
• 1 Hour
• 3 Hours
• 12 hours
• 24 hourscorrect
--

64. Which Snowflake object enables loading data from files as soon as they are available
in a cloud storage location?
• Pipecorrect
• External stage
• Task
• Stream
--

Explanation:
Snowpipe enables loading data from files as soon as they’re available in a stage. This means
you can load data from files in micro-batches, making it available to users within minutes, rather
than manually executing COPY statements on a schedule to load larger batches.
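A minimal auto-ingest pipe looks like this (the database, stage, and table names are hypothetical; AUTO_INGEST relies on cloud storage event notifications):

```sql
-- Load new files automatically as they land in the external stage
CREATE PIPE mydb.raw.events_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO mydb.raw.events
  FROM @mydb.raw.events_stage
  FILE_FORMAT = (TYPE = JSON);
```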

65. Which of the following are best practices for users with the
SYSADMIN/ACCOUNTADMIN roles? Choose 3 answers
• Their default role should be set to SYSADMIN (the lower of the two)correct
• They should not set up multi_Factor Authentication (MFA)―as administrator they may
need to change the MFA settings and those enrolled in MFA are unable to do so
• They should only access and 'step into' the ACCOUNTADMIN role temporarily, as
needed to complete a specific operationcorrect
• They should ensure all database objects in the account are owned by the
ACCOUNTADMIN role
• They should use the SYSADMIN role to perform administrative work on database
objectscorrect
--
66. Which statement best describes Snowflake tables?
• Snowflake tables are logical representations of underlying physical datacorrect
• Snowflake tables are the physical instantiation of data loaded into Snowflake
• Snowflake tables require that clustering keys be defined to perform optimally
• Snowflake tables are owned by a user
--

67. True or False: A table in Snowflake can only be queried using the Virtual Warehouse that was used to load the data.
• True
• Falsecorrect
--

68. What is the MOST performant file format for loading data in Snowflake?
• CSV (Unzipped)correct
• Parquet
• CSV (Gzipped)
• ORC
--

69. True or False: Snowflake’s Global Services Layer gathers and maintains statistics on
all columns in all micro-partitions.
• Truecorrect
• False
--

Explanation:
Snowflake is a single, integrated platform delivered as-a-service. It features storage, compute,
and global services layers that are physically separated but logically integrated.

70. True or False: Users are able to see the result sets of queries executed by other users
that share their same role.
• True
• Falsecorrect
--

71. What Snowflake role must be granted for a user to create and manage accounts?
• ACCOUNTADMIN
• ORGADMINcorrect
• SECURITYADMIN
• SYSADMIN
--

72. A company strongly encourages all Snowflake users to self-enroll in Snowflake's default Multi-Factor Authentication (MFA) service to provide increased login security for users connecting to Snowflake.

Which application will the Snowflake users need to install on their devices in order to
connect with MFA?
• Okta Verify
• Duo Mobilecorrect
• Microsoft Authenticator
• Google Authenticator
--

73. True or False: Snowflake bills for a minimum of five minutes each time a Virtual
Warehouse is started.
• True
• Falsecorrect
--

74. A user is preparing to load data from an external stage

Which practice will provide the MOST efficient loading performance?


• Organize files into logical pathscorrect
• Store the files on the external stage to ensure caching is maintained
• Use pattern matching for regular expression execution
• Load the data in one large file
--

75. What SQL command would be used to view all roles that were granted to user.1?
• show grants to user USER1;correct
• show grants of user USER1;
• describe user USER1;
• show grants on user USER1;
--
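The direction of the keyword matters; both forms are shown here for contrast (the user and role names are hypothetical):

```sql
SHOW GRANTS TO USER user1;    -- roles that have been granted to the user
SHOW GRANTS OF ROLE analyst;  -- users and roles that hold the given role
```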

76. How often are encryption keys automatically rotated by Snowflake?


• 30 Dayscorrect
• 60 Days
• 90 Days
• 365 Days
--

Explanation:
All Snowflake-managed keys are automatically rotated by Snowflake when they are more than
30 days old. Active keys are retired, and new keys are created. When Snowflake determines the
retired key is no longer needed, the key is automatically destroyed.

77. The FLATTEN function is used to query which type of data in Snowflake?
• Structured data
• Semi-structured datacorrect
• Both of the above
• None of the above
--

FLATTEN is used to unnest semi-structured data. Structured data is, by definition, not nested, so the function does not apply to it.

78. Which of the following is true of Snowpipe via REST API? Choose 2 answers
• You can only use it on internal Stagescorrect
• All COPY INTO options are available during pipe creation
• Snowflake automatically manages the compute required to execute the Pipe's copy into
commandscorrect
• Snowpipe keeps track of which files it has loadedcorrect
--

79. Which statement about billing applies to Snowflake credits?


• Credits are billed per-minute with a 60-minute minimum
• Credits are used to pay for cloud data storage usage
• Credits are consumed based on the number of credits billed for each hour that a
warehouse runs
• Credits are consumed based on the warehouse size and the time the warehouse is
runningcorrect
--

Explanation:
Snowflake credits are used to pay for the consumption of resources on Snowflake. A Snowflake
credit is a unit of measure, and it is consumed only when a customer is using resources, such as
when a virtual warehouse is running, the cloud services layer is performing work, or serverless
features are used.

80. Which is true of Snowflake network policies? A Snowflake network policy: (Choose
two.)
• Is available to all Snowflake Editionscorrect
• Is only available to customers with Business Critical Edition
• Restricts or enables access to specific IP addressescorrect
• Is activated using an “ALTER DATABASE” command
--
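Creating and activating a network policy is an account-level operation, not a database-level one (the policy name and CIDR ranges below are hypothetical):

```sql
-- Restrict access to a corporate IP range, with one address blocked
CREATE NETWORK POLICY corp_policy
  ALLOWED_IP_LIST = ('192.168.1.0/24')
  BLOCKED_IP_LIST = ('192.168.1.99');

-- Activated via ALTER ACCOUNT, not ALTER DATABASE
ALTER ACCOUNT SET NETWORK_POLICY = corp_policy;
```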

81. Which of the following is a valid source for an external stage when the Snowflake
account is located on Microsoft Azure?
• An FTP server with TLS encryption
• An HTTPS server with WebDAV
• A Google Cloud storage bucketcorrect
• A Windows server file share on Azure
--

82. When reviewing the load for a warehouse using the load monitoring chart, the chart
indicates that a high volume of Queries are always queuing in the warehouse.

According to recommended best practice, what should be done to reduce the Queue
volume? (Select TWO).
• Use multi-clustered warehousing to scale out warehouse capacity.correct
• Scale up the warehouse size to allow Queries to execute faster.
• Stop and start the warehouse to clear the queued queries
• Migrate some queries to a new warehouse to reduce loadcorrect
• Limit user access to the warehouse so fewer queries are run against it.
--

83. What is a limitation of a Materialized View?


• A Materialized View cannot support any aggregate functions
• A Materialized View can only reference up to two tables
• A Materialized View cannot be joined with other tables
• A Materialized View cannot be defined with a JOINcorrect
--

Explanation:
There are several limitations to using materialized views:

• A materialized view can query only a single table.
• Joins, including self-joins, are not supported.

84. What are value types that a VARIANT column can store? (Select TWO)
• STRUCT
• OBJECTcorrect
• BINARY
• ARRAYcorrect
• CLOB
--

Explanation:
Characteristics of a VARIANT

A VARIANT can store a value of any other type, including OBJECT and ARRAY. The maximum
length of a VARIANT is 16 MB.

85. Which Snowflake objects track DML changes made to tables, like inserts, updates, and
deletes?
• Pipes
• Streamscorrect
• Tasks
• Procedures
--
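A stream captures DML changes on its source table and exposes them with metadata columns (the table and stream names below are hypothetical):

```sql
-- Track inserts, updates, and deletes on the source table
CREATE STREAM orders_stream ON TABLE orders;

-- Each change row carries METADATA$ACTION ('INSERT' or 'DELETE'),
-- METADATA$ISUPDATE, and METADATA$ROW_ID
SELECT * FROM orders_stream WHERE METADATA$ACTION = 'INSERT';
```

Consuming the stream inside a DML statement (for example, an INSERT ... SELECT from it) advances the stream's offset.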

86. Which services does the Snowflake Cloud Services layer manage? (Select TWO).
• Compute resources
• Query execution
• Authenticationcorrect
• Data storage
• Metadatacorrect
--

Explanation:
https://docs.snowflake.com/en/user-guide/intro-key-concepts.html The cloud services layer is a
collection of services that coordinate activities across Snowflake. These services tie together all
of the different components of Snowflake in order to process user requests, from login to query
dispatch. The cloud services layer also runs on compute instances provisioned by Snowflake
from the cloud provider.

Services managed in this layer include:

• Authentication
• Infrastructure management
• Metadata management
• Query parsing and optimization
• Access control

87. Select the three types of tables that exist within Snowflake. Choose 3 answers
• Temporarycorrect
• Transientcorrect
• Provisioned
• Permanentcorrect
--

88. Which of the following are common use cases for zero-copy cloning? Choose 3
answers
• Quick provisioning of Dev and Test/QA environmentscorrect
• Data backupscorrect
• Point in time snapshotscorrect
• Performance optimization
--
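Zero-copy cloning covers all three use cases with one statement shape; combined with Time Travel it yields point-in-time snapshots (the object names and offset below are hypothetical):

```sql
-- Dev/Test provisioning: clone shares micro-partitions until data diverges
CREATE TABLE orders_dev CLONE orders;

-- Point-in-time snapshot / backup: clone the database as it was one hour ago
CREATE DATABASE mydb_snapshot CLONE mydb AT (OFFSET => -3600);
```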

89. Which of the following statements are true of Virtual Warehouses? (Choose all that
apply.)
• Customers can change the size of the Warehouse after creationcorrect
• A Warehouse can be resized while runningcorrect
• A Warehouse can be configured to suspend after a period of inactivitycorrect
• A Warehouse can be configured to auto-resume when new queries are submittedcorrect
--

90. What happens when a cloned table is replicated to a secondary database? (Select
TWO)
• A read-only copy of the cloned tables is stored.
• The replication will not be successful.
• The physical data is replicatedcorrect
• Additional costs for storage are charged to a secondary accountcorrect
• Metadata pointers to cloned tables are replicated
--

Explanation:
Cloned objects are replicated physically rather than logically to secondary databases. That is,
cloned tables in a standard database do not contribute to the overall data storage unless or until
DML operations on the clone add to or modify existing data.However, when a cloned table is
replicated to a secondary database, the physical data is also replicated, increasing the data
storage usage for your account.

91. True or False: It is possible for a user to run a query against the query result cache
without requiring an active Warehouse.
• Truecorrect
• False
--

Explanation:
Query result cache is all about fetching the data from cloud services layer and saving the cost by
not running the virtual warehouse.

92. When unloading to a stage, which of the following is a recommended practice or approach?
• Set SINGLE = true for larger files
• Use OBJECT_CONSTRUCT ( * ) when using Parquet
• Avoid the use of the CAST functioncorrect
• Define an individual file format
--

93. Which of the following items does the Cloud services Layer manage? Choose 4
answers
• user authenticationcorrect
• Metadatacorrect
• Query compilation and optimizationcorrect
• external blob storage
• Data securitycorrect
--

94. If a size Small virtual warehouse is made up of two servers, how many servers make up a Large warehouse?
• 4
• 8correct
• 16
• 32
--

Explanation:
Size specifies the amount of compute resources available per cluster in a warehouse.Snowflake
supports the following warehouse sizes:

95. A user needs to create a materialized view in the schema MYDB.MYSCHEMA.

Which statements will provide this access?

(Options A through D were presented as SQL statement images and are not reproduced here.)
• Option Acorrect
• Option B
• Option C
• Option D
--
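The option images are not reproduced above, but the grants typically needed for this scenario have the following shape (role and object names are hypothetical; materialized views require Enterprise Edition or higher):

```sql
-- Access path to the schema, plus the privilege to create materialized views in it
GRANT USAGE ON DATABASE MYDB TO ROLE dev_role;
GRANT USAGE ON SCHEMA MYDB.MYSCHEMA TO ROLE dev_role;
GRANT CREATE MATERIALIZED VIEW ON SCHEMA MYDB.MYSCHEMA TO ROLE dev_role;
```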

96. A developer is granted ownership of a table that has a masking policy. The developer's
role is not able to see the masked data.

Will the developer be able to modify the table to read the masked data?
• Yes, because a table owner has full control and can unset masking policies.correct
• Yes, because masking policies only apply to cloned tables.
• No, because masking policies must always reference specific access roles.
• No, because ownership of a table does not include the ability to change masking policies
--

97. Which of the following are valid Snowflake Virtual Warehouse Scaling Policies?
(Choose two.)
• Custom
• Economy correct
• Optimized
• Standard correct
--

98. True or False: When you create a custom role, it is a best practice to immediately grant
that role to ACCOUNTADMIN.
• True
• False correct
--

99. In which scenarios would a user have to pay Cloud Services costs? (Select TWO).
• Compute Credits = 50 Credits Cloud Services = 10 correct
• Compute Credits = 80 Credits Cloud Services = 5 correct
• Compute Credits = 10 Credits Cloud Services = 9
• Compute Credits = 120 Credits Cloud Services = 10 correct
• Compute Credits = 200 Credits Cloud Services = 26
--

100. When is the result set cache no longer available? (Select TWO)
• When another warehouse is used to execute the query
• When another user executes the query
• When the underlying data has changed correct
• When the warehouse used to execute the query is suspended
• When it has been 24 hours since the last query correct
--

1. When a pipe is recreated using the CREATE OR REPLACE PIPE command: Select one.
• The Pipe load history is reset to emptycorrect
• The REFRESH parameter is set to TRUE
• Previously loaded files will be ignored
• All of the above
--
2. Select the different types of Internal Stages: (Choose three.)
• Named Stagecorrect
• User Stagecorrect
• Table Stagecorrect
• Schema Stage
--
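The three internal stage types are easiest to tell apart by their reference syntax (the table and stage names below are hypothetical):

```sql
LIST @~;               -- user stage: one per user, referenced with @~
LIST @%mytable;        -- table stage: one per table, referenced with @%<table>
CREATE STAGE my_stage; -- named internal stage: created explicitly
LIST @my_stage;
```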

3. Which statement best describes Snowflake tables?


• Snowflake tables are logical representations of underlying physical datacorrect
• Snowflake tables are the physical instantiation of data loaded into Snowflake
• Snowflake tables require that clustering keys be defined to perform optimally
• Snowflake tables are owned by a user
--

4. What happens when a Data Provider revokes privileges to a Share on an object in their
source database?
• The object immediately becomes unavailable for all Data Consumerscorrect
• Any additional data arriving after this point in time will not be visible to Data Consumers
• The Data Consumers stop seeing data updates and become responsible for storage
charges for the object
• A static copy of the object at the time the privilege was revoked is created In the Data
Consumers' accounts
--

5. True or False: Snowpipe via REST API can only reference External Stages as source.
• True
• Falsecorrect
--

6. True or False: When active, a pipe requires a dedicated Virtual Warehouse to execute.
• True
• Falsecorrect
--

7. True or False: You can resize a Virtual Warehouse while queries are running.
• Truecorrect
• False
--

8. Which of the following statements about data sharing are true? Choose 2 answers
• New objects created by a Data Provider are automatically shared with existing Data
Consumers & Reader Accounts
• All database objects can be included in a shared database
• Reader Accounts are created and funded by Data Providerscorrect
• Shared databases are read-onlycorrect
--

9. Which of the following statements is true of Snowflake?


• It was built specifically for the cloudcorrect
• It was built as an on-premises solution and then ported to the cloud
• It was designed as a hybrid database to allow customers to store data either on premises
or in the cloud
• It was built for Hadoop architecture
• It's based on an Oracle Architecture
--

10. What is the recommended method for loading data into Snowflake?
• Load row by row
• Load data in batchcorrect
• Load data by writing it In the Snowflake Web Interface (UI)
• Load data via frequent, single row DMLs
--

11. What is the lowest Snowflake edition that offers Time Travel up to 90 days.
• standard Edition
• Premier Edition
• Enterprise Editioncorrect
• Business Critical Edition
--

12. Which of the following statements are true of Virtual Warehouses? (Choose all that
apply.)
• Customers can change the size of the Warehouse after creationcorrect
• A Warehouse can be resized while runningcorrect
• A Warehouse can be configured to suspend after a period of inactivitycorrect
• A Warehouse can be configured to auto-resume when new queries are submittedcorrect
--

13. True or False: Users are able to see the result sets of queries executed by other users
that share their same role.
• True
• Falsecorrect
--

14. Select the three types of tables that exist within Snowflake.

Choose 3 answers
• Temporarycorrect
• Transientcorrect
• Provisioned
• Permanentcorrect
--

15. Snowflake recommends, as a minimum, that all users with the following role(s) should be enrolled in Multi-Factor Authentication (MFA):
• SECURITYADMIN, ACCOUNTADMIN, PUBLIC, SYSADMIN
• SECURITYADMIN ACCOUNTADMIN, SYSADMIN
• SECURITYADMIN, ACCOUNTADMIN
• ACCOUNTADMINcorrect
--

16. Which of the following are valid Snowflake Virtual Warehouse Scaling Policies?
(Choose two.)
• Custom
• Economycorrect
• Optimized
• Standardcorrect
--

17. True or False: Each worksheet in the Snowflake Web Interface (UI) can be associated
with different roles, databases, schemas, and Virtual Warehouses.
• Truecorrect
• False
--

18. True or False: Reader Accounts are able to extract data from shared data objects for
use outside of Snowflake.
• Truecorrect
• False
--

19. True or False: Micro-partition metadata enables some operations to be completed without requiring Compute.
• Truecorrect
• False
20. Account-level storage usage can be monitored via:
• The Snowflake Web Interface (UI) in the Databases section.
• The Snowflake Web Interface (UI) in the Account -> Billing & Usage sectioncorrect
• The Information Schema -> ACCOUNT_USAGE_HISTORY View
• The Account usage Schema - > ACCOUNT_USAGE_METRICS View
--

21. True or False: Data in Fail-safe can be deleted by a user or the Snowflake team before
it expires.
• True
• Falsecorrect
--

22. Which statement best describes "clustering"?


• Clustering represents the way data is grouped together and stored within snowflake's
micro-partitionscorrect
• The database administrator must define the clustering methodology for each Snowflake
table.
• The clustering key must be included on the COPY command when loading data into Snowflake.
• Clustering can be disabled within a Snowflake account.
--

23. Snowflake is designed for which type of workloads? (Choose two.)


• OLAP (Analytics) workloadscorrect
• OLTP (Transactional) workloads
• Concurrent workloadscorrect
• On-premise workloads
--

24. How many shares can be consumed by single Data Consumer?


• 1
• 10
• 100, but can be increased by contacting support
• Unlimitedcorrect
--

25. What is the maximum compressed row size in Snowflake?


• 8KB
• 16MBcorrect
• 50MB
• 4000GB
--

26. True or False: During data unloading, only JSON and CSV files can be compressed.
• True
• Falsecorrect
--

27. Which of the following statements is true of Snowflake micro-partitioning?


• Micro-partitioning has been known to introduce data skew
• Micro-partitioning: requires a partitioning schema to be defined up front
• Micro-partitioning is transparently completed using the ordering that occurs when the
data is inserted/loadedcorrect
• Micro-partitioning can be disabled within a Snowflake account
--

28. Which of the following are options when creating a virtual Warehouse? Choose 2
answers
• Auto-drop
• Auto resize
• Auto-resumecorrect
• Auto-suspendcorrect
--

29. True or False: When you create a custom role, it is a best practice to immediately grant that role to ACCOUNTADMIN.
• True
• Falsecorrect
--

30. How are Snowpipe charges calculated?


• Per-second/per Warehouse size
• Per-second/per-core granularitycorrect
• Number of Pipes in account
• Total storage bucket size
--

31. When can a Virtual Warehouse start running queries?


• 12am-5am
• Only during administrator-defined time slots
• When its provisioning is completecorrect
• After replication
--

32. For a multi-cluster Warehouse, the number of credits billed is calculated on: Select
one.
• The number of queries that ran using the Warehouse.
• The size of the Warehouse and the number of clusters that ran within a given time
period.correct
• The size of the Warehouse and the maximum number of clusters configured for the Warehouse.
• The number of users who accessed the Warehouse.
--

33. When loading data into Snowflake, the COPY command supports: Choose 2 answers
• Joins
• Filters
• Data type conversionscorrect
• Column reorderingcorrect
• Aggregates
--

34. Which is true of Snowflake network policies? A Snowflake network policy: (Choose
two.)
• Is available to all Snowflake Editionscorrect
• Is only available to customers with Business Critical Edition
• Restricts or enables access to specific IP addressescorrect
• Is activated using an “ALTER DATABASE” command
--
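As the answer above notes, a network policy is an account-level object activated with ALTER ACCOUNT rather than ALTER DATABASE. A minimal sketch (the policy name and IP ranges are hypothetical placeholders):

```sql
-- Create a policy that only allows connections from a corporate CIDR block
-- (policy name and IP ranges here are illustrative placeholders).
CREATE NETWORK POLICY corp_only_policy
  ALLOWED_IP_LIST = ('192.168.1.0/24')
  BLOCKED_IP_LIST = ('192.168.1.99');

-- Activate it for the whole account (requires SECURITYADMIN or higher).
ALTER ACCOUNT SET NETWORK_POLICY = 'CORP_ONLY_POLICY';
```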

35. True or False: A 4X-Large Warehouse may, at times, take longer to provision than a X-
Small Warehouse.

• Truecorrect
• False
--

36. True or False: It is possible to unload structured data to semi-structured formats such
as JSON and parquet.
• Truecorrect
• False
--

37. Query results are stored in the Result Cache for how long after they are last accessed,
assuming no data changes have occurred?
• 1 Hour
• 3 Hours
• 12 hours
• 24 hourscorrect
--

38. Which formats are supported for unloading data from Snowflake? Choose 2 answers

• Delimited (CSV, TSV, etc.)correct


• Avro
• JSONcorrect
• ORC
--

39. True or False: When a user creates a role, they are initially assigned ownership of the
role and they maintain ownership until it is transferred to another user.
• Truecorrect
• False

40. Data storage for individual tables can be monitored using which commands and/or
object(s)? Choose 2 answers
• SHOW TABLES;correct
• SHOW STORAGE BY TABLE;
• Information Schema -> TABLE_STORAGE_METRICScorrect
• Information Schema -> TABLE_HISTORY
--
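The two correct options above can be combined in practice: SHOW TABLES returns size metadata, and the TABLE_STORAGE_METRICS view breaks storage down into active, Time Travel, and Fail-safe bytes. A sketch with placeholder database and schema names:

```sql
-- List tables with their size metadata (SHOW reads metadata only).
SHOW TABLES IN SCHEMA my_db.my_schema;

-- Query per-table storage details from the Information Schema view
-- (database and schema names are illustrative placeholders).
SELECT table_name, active_bytes, time_travel_bytes, failsafe_bytes
FROM my_db.information_schema.table_storage_metrics
WHERE table_schema = 'MY_SCHEMA';
```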

41. Which of the following statements are true about Schemas in Snowflake? (Choose
two.)
• A Schema may contain one or more Databases
• A Database may contain one or more Schemascorrect
• A Schema is a logical grouping of Database Objectscorrect
• Each Schema is contained within a Warehouse
--

42. Which of the following statements are true of Snowflake releases: (Choose two.)
• They happen approximately weeklycorrect
• They roll up and release approximately monthly, but customers can request early release
application
• During a release, new customer requests/queries/connections transparently move over to
the newer versioncorrect
• A customer is assigned a 30 minute window (that can be moved anytime within a week)
during which the system will be unavailable and customer is upgraded
--

43. True or False: Snowflake allows its customers to directly access the micro-partition
files that make up its tables.
• True
• Falsecorrect
--

44. True or False: AWS Private Link provides a secure connection from the Customer’s
on-premise data center to the Snowflake.
• True
• Falsecorrect
--

45. If auto-suspend is enabled for a Virtual Warehouse, the Warehouse is automatically
suspended when:
• All Snowflake sessions using the warehouse are terminated.
• The last query using the warehouse completes.
• There are no users logged into Snowflake.
• The Warehouse is inactive for a specified period of time.correct
--

46. What is the most granular object that the Time Travel retention period can be defined
on?
• Account
• Database
• Schema
• Tablecorrect
--
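Since the table is the most granular level, the retention period can be overridden per table. A sketch with a placeholder table name (Enterprise Edition allows up to 90 days for permanent tables):

```sql
-- Override the Time Travel retention period for one table
-- (table name is an illustrative placeholder).
ALTER TABLE my_db.my_schema.orders SET DATA_RETENTION_TIME_IN_DAYS = 30;
```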

47. True or False: The COPY command must specify a File Format in order to execute.
• Truecorrect
• False
--

48. What parameter controls if the Virtual warehouse starts immediately after the CREATE
WAREHOUSE statement?
• INITIALLY_SUSPENDED = TRUE/FALSEcorrect
• START_AFTER_CREATE = TRUE/FALSE
• START_TIME = 60 // (seconds from now)
• START_TIME = CURRENT_DATE()
--
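The INITIALLY_SUSPENDED parameter sits alongside the auto-suspend and auto-resume options covered earlier. A sketch with hypothetical names and values:

```sql
-- Warehouse name and settings are illustrative placeholders.
CREATE WAREHOUSE my_wh
  WAREHOUSE_SIZE = 'XSMALL'
  AUTO_SUSPEND = 300          -- suspend after 300 seconds of inactivity
  AUTO_RESUME = TRUE          -- resume automatically when a query arrives
  INITIALLY_SUSPENDED = TRUE; -- do not start the warehouse on creation
```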

49. True or False: It is possible for a user to run a query against the query result cache
without requiring an active Warehouse.
• Truecorrect
• False
--

50. To run a Multi-Cluster Warehouse in auto-scale mode, a user would:


• Configure the Maximum Clusters setting to “Auto-Scale”
• Set the Warehouse type to “Auto”
• Set the Minimum Clusters and Maximum Clusters settings to the same value
• Set the Minimum Clusters and Maximum Clusters settings to different valuescorrect
--
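Setting different minimum and maximum cluster counts, as in the correct option above, is what puts a multi-cluster warehouse in auto-scale mode. A sketch with placeholder values:

```sql
-- Min and max cluster counts differ, so the warehouse runs in auto-scale mode
-- (name and numbers are illustrative placeholders).
CREATE WAREHOUSE scaling_wh
  WAREHOUSE_SIZE = 'MEDIUM'
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 4
  SCALING_POLICY = 'STANDARD';  -- or 'ECONOMY'
```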

51. Which of the following accurately represents how a table fits into Snowflake’s logical
container hierarchy?
• Account -> Schema -> Database -> Table
• Account -> Database -> Schema -> Tablecorrect
• Database -> Schema -> Table -> Account
• Database -> Table -> Schema -> Account
--

52. The fail-safe retention period is how many days?


• 1 day
• 7 dayscorrect
• 45 days
• 90 days
--

53. Which type of table corresponds to a single Snowflake session?


• Temporarycorrect
• Transient
• Provisional
• Permanent
--

54. Which of the following commands are not blocking operations?

Choose 2 answers
• UPDATE
• INSERTcorrect
• MERGE
• COPYcorrect
--

55. The Information Schema and Account Usage Share provide storage information for
which of the following objects? (Choose three.)
• Users
• Tablescorrect
• Databasescorrect
• Internal Stagescorrect
--

56. What is the minimum duration charged when starting a virtual warehouse?
• 1 second
• 1 minutecorrect
• 1 hour
• 1 day
--

57. Which of the following are examples of operations that require a Virtual Warehouse to
complete, assuming no queries have been executed previously? Choose 3 answers
• MIN(<<column value>>)
• COPYcorrect
• SUM(<<column value>>)correct
• UPDATEcorrect
--

58. Which of the following commands sets the Virtual Warehouse for a session?
• COPY WAREHOUSE FROM <<config file>>;
• SET warehouse = <<warehouse name>>;
• USE WAREHOUSE <<warehouse name>>;correct
• USE VIRTUAL_WAREHOUSE <<warehouse name>>;
--

59. True or False: A single database can exist in more than one Snowflake account.
• True
• Falsecorrect
--

60. Which of the following statements is true of data loading?


• Resizing the virtual warehouse from X-Small to Small will process a single file twice as
fast
• The ideal file size for loading is 16MB to match micro-partition size
• Many files in the 10-100MB range tend to land in the "sweet spot" for load parallelismcorrect
• Once loaded, there is no option to force a reload of an already loaded file
--

1. True or false: Snowflake enforces unique, primary key, and foreign key constraints
during DML operations.
• True
• Falsecorrect
--

2. Which of the following are best practices for users with the
SYSADMIN/ACCOUNTADMIN roles? Choose 3 answers
• Their default role should be set to SYSADMIN (the lower of the two)correct
• They should not set up Multi-Factor Authentication (MFA)―as administrators they may
need to change the MFA settings and those enrolled in MFA are unable to do so
• They should only access and 'step into' the ACCOUNTADMIN role temporarily, as
needed to complete a specific operationcorrect
• They should ensure all database objects in the account are owned by the
ACCOUNTADMIN role
• They should use the SYSADMIN role to perform administrative work on database
objectscorrect
--

3. True or False: A third-party tool that supports standard JDBC or ODBC but has no
Snowflake-specific driver will be unable to connect to Snowflake.
• True
• Falsecorrect
--

4. Which of the following are common use cases for zero-copy cloning? Choose 3
answers
• Quick provisioning of Dev and Test/QA environmentscorrect
• Data backupscorrect
• Point in time snapshotscorrect
• Performance optimization
--
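The use cases above can be sketched with CLONE, optionally combined with Time Travel for a point-in-time copy (object names and the timestamp are placeholders):

```sql
-- Clone a production database for a Dev environment (names are placeholders).
CREATE DATABASE dev_db CLONE prod_db;

-- Combine cloning with Time Travel for a point-in-time snapshot.
CREATE TABLE orders_snapshot CLONE orders
  AT (TIMESTAMP => '2024-01-01 00:00:00'::TIMESTAMP_LTZ);
```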

5. True or False: Snowflake allows its customers to directly access the micro-partition
files that make up its tables.
• True
• Falsecorrect
--
6. If a Small Warehouse is made up of 2 servers/cluster, how many servers/cluster make
up a Medium

Warehouse?
• 4correct
• 16
• 32
• 128
--

7. Increasing the maximum number of clusters in a Multi-Cluster Warehouse is an


example of:
• Scaling rhythmically
• Scaling max
• Scaling outcorrect
• Scaling Up
--

8. What command is used to load files into an Internal Stage within Snowflake? Select
one.
• PUTcorrect
• COPY INTO
• TRANSFER
• INSERT
--

9. When loading data into Snowflake, the COPY command supports: Choose 2 answers
• Joins
• Filters
• Data type conversionscorrect
• Column reorderingcorrect
• Aggregates
--

10. Which of the following are main sections of the top navigation of the Snowflake web
Interface (UI)?
• Databasescorrect
• Tables
• Warehousescorrect
• Worksheetscorrect
--

11. Which is true of Snowflake network policies? A Snowflake network policy: (Choose
two.)
• Is available to all Snowflake Editionscorrect
• Is only available to customers with Business Critical Edition
• Restricts or enables access to specific IP addressescorrect
• Is activated using an “ALTER DATABASE” command
--

12. Which of the following are examples of operations that require a Virtual Warehouse to
complete, assuming no quires have been executed previously? Choose 3 answers
• MIN(<<column value>>)
• COPYcorrect
• SUM(<< column value >>)correct
• UPDATEcorrect
--

13. When a Pipe is recreated using the CREATE OR REPLACE PIPE command:
• The Pipe load history is reset to emptycorrect
• The REFRESH parameter is set to TRUE
• Previously loaded files will be ignored
• All of the above
--

14. True or False: The user has to specify which cluster a query will run on in multi-
clustering Warehouse.
• True
• Falsecorrect
--

15. Which of the following statements describes a benefit of Snowflake’s separation of


compute and storage? (Choose all that apply.)
• Growth of storage and compute are tightly coupled together
• Storage expands without the requirement to add more computecorrect
• Compute can be scaled up or down without the requirement to add more storagecorrect
• Multiple compute clusters can access stored data without contentioncorrect
--

16. Which are true of Snowflake roles?


• All grants to objects are given to roles, and never to userscorrect
• In order to do DML/DDL, a user needs to have selected a single role that has the
specific access to the object and operationcorrect
• The PUBLIC role controls all other roles
• Roles are a subset of users and users own objects in Snowflake
--

17. For a multi-cluster Warehouse, the number of credits billed is calculated on:

Select one.
• The number of queries that ran using the Warehouse.
• The size of the Warehouse and the number of clusters that ran within a given time
period.correct
• The size of the Warehouse and the maximum number of clusters configured for the
Warehouse.
• The number of users who accessed the Warehouse.
--

18. To run a Multi-Cluster Warehouse in auto-scale mode, a user would:


• Configure the Maximum Clusters setting to “Auto-Scale”
• Set the Warehouse type to “Auto”
• Set the Minimum Clusters and Maximum Clusters settings to the same value
• Set the Minimum Clusters and Maximum Clusters settings to the different valuescorrect
--
19. True or False: Snowflake charges a premium for storing semi-structured data.
• True
• Falsecorrect
--

20. Which object allows you to limit the number of credits consumed within a Snowflake
account?

Select one.
• Account Usage Tracking
• Resource Monitorcorrect
• Warehouse Limit Parameter
• Credit Consumption Tracker
--
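A resource monitor, the correct answer above, caps credit consumption through a quota and triggers. A sketch with hypothetical names and thresholds:

```sql
-- Monitor name, quota, and warehouse are illustrative placeholders.
CREATE RESOURCE MONITOR monthly_limit
  WITH CREDIT_QUOTA = 100
  FREQUENCY = MONTHLY
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS ON 75 PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND;

-- Attach the monitor to a warehouse.
ALTER WAREHOUSE my_wh SET RESOURCE_MONITOR = monthly_limit;
```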

21. True or False: When a user creates a role, they are initially assigned ownership of the
role and they maintain ownership until it is transferred to another user.
• Truecorrect
• False
--

22. Which statements are true of micro-partitions? Choose 2 answers


• They are approximately 16MB in sizecorrect
• They are stored compressed only if COMPRESS=TRUE on Table
• They are Immutablecorrect
• They are only encrypted in the Enterprise edition and above
--

23. True or False: It is possible to query data from an Internal or named External stage
without loading the data into Snowflake.
• Truecorrect
• False
--

24. True or False: When data share is established between a Data Provider and a data
Consumer, the Data Consumer can extend that data share to other Data Consumers.
• True
• Falsecorrect
--

25. True or False: Micro-partition metadata enables some operations to be completed


without requiring Compute.
• Truecorrect
• False
--

26. Which item in the Data Warehouse migration process does not apply in Snowflake?
• Migrate Users
• Migrate Schemas
• Migrate Indexescorrect
• Build the Data pipeline
--
27. When can a Virtual Warehouse start running queries?
• 12am-5am
• Only during administrator defined time slots
• When its provisioning is completecorrect
• After replication
--

28. When creating a user it is advisable to: Choose 2 answers


• Set the user to be initially disabled
• Force an immediate password changecorrect
• Set a default role for the usercorrect
• Set the number of minutes to unlock to 15 minutes
• Set the user's access to expire within a specified timeframe
--

29. When scaling up Virtual Warehouse by increasing Virtual Warehouse t-shirt size, you
are primarily scaling for improved: Select one.
• Concurrency
• Performancecorrect
--

30. The number of queries that a Warehouse can concurrently process is determined by:
Choose 2 answers
• The complexity of each querycorrect
• The CONCURRENT_QUERY_LIMIT parameter set on the Snowflake account
• The size of the data required for each querycorrect
• The tool that is executing the query
--

31. Account-level storage usage can be monitored via:


• The Snowflake web interface (UI) in the Databases section.correct
• The Snowflake web interface (UI) in the Account -> Billing & Usage sectioncorrect
• The Information Schema -> ACCOUNT_USAGE_HISTORY View
• The Account usage Schema - > ACCOUNT_USAGE_METRICS View
--

32. True or False: All Snowflake table types include fail-safe storage.
• True
• Falsecorrect
--

33. Which formats are supported for unloading data from Snowflake? Choose 2 answers
• Delimited (CSV, TSV, etc.)correct
• Avro
• JSONcorrect
• ORC
--

34. The PUT command: (Choose two.)


• Automatically creates a File Format object
• Automatically uses the last Stage created
• Automatically compresses files using Gzipcorrect
• Automatically encrypts filescorrect
--
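The two correct behaviors above (gzip compression and encryption) happen by default when PUT uploads a file; note that PUT runs from a client such as SnowSQL, not from the web UI worksheet. A sketch with a placeholder file path and stage name:

```sql
-- Upload a local file to a named internal stage (file path and stage name
-- are illustrative placeholders). PUT gzip-compresses and encrypts by default.
PUT file:///tmp/data.csv @my_internal_stage AUTO_COMPRESS = TRUE;
```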

35. Which of the following objects can be cloned? (Choose four.)


• Tablescorrect
• Named File Formatscorrect
• Schemascorrect
• Shares
• Databasescorrect
• Users
--

36. If auto-suspend is enabled for a Virtual Warehouse, the Warehouse is automatically
suspended when:
• All Snowflake sessions using the warehouse are terminated.
• The last query using the warehouse completes.
• There are no users logged into Snowflake.
• The Warehouse is inactive for a specified period of time.correct
--

37. True or False: The COPY command must specify a File Format in order to execute.
• Truecorrect
• False
--

38. Which of the following are true of multi-cluster Warehouses? Select all that apply
below.
• A multi-cluster Warehouse can add clusters automatically based on query activitycorrect
• A multi-cluster Warehouse can automatically turn itself off after a period of
inactivitycorrect
• A multi-cluster Warehouse can scale down when query activity slowscorrect
• A multi-cluster Warehouse can automatically turn itself on when a query is executed
against itcorrect
--

39. True or False: Reader Accounts incur no additional Compute costs to the Data
Provider since they are simply reading the shared data without making changes.
• True
• Falsecorrect
--

40. Which of the following are options when creating a Virtual Warehouse?
• Auto-suspend correct
• Auto-resume correct
• Local SSD size
• User count
--

1. True or False: It is possible to unload structured data to semi-structured formats such


as JSON and parquet.
• True correct
• False
--

2. True or False: The user has to specify which cluster a query will run on in multi-
clustering Warehouse.
• True
• False correct
--

3. Data storage for individual tables can be monitored using which commands and/or
object(s)?

Choose 2 answers
• SHOW TABLES; correct
• SHOW STORAGE BY TABLE;
• Information Schema -> TABLE_STORAGE_METRICS correct
• Information Schema -> TABLE_HISTORY
--

4. When creating a user it is advisable to: Choose 2 answers


• Set the user to be initially disabled
• Force an immediate password change correct
• Set a default role for the user correct
• Set the number of minutes to unlock to 15 minutes
• Set the user's access to expire within a specified timeframe
--

5. True or False: When Snowflake is configured to use Single Sign-On (SSO), Snowflake
receives the usernames and credentials from the SSO service and loads them into the
customer's Snowflake account.
• True
• False correct
--

6. What is the recommended method for loading data into Snowflake?


• Load row by row
• Load data in batch correct
• Load data by writing it in the Snowflake Web Interface (UI)
• Load data via frequent, single row DMLs
--

7. True or False: Snowflake allows its customers to directly access the micro-partition
files that make up its tables.
• True
• False correct
--

8. True or False: Each worksheet in the Snowflake Web Interface (UI) can be associated
with different roles, databases, schemas, and Virtual Warehouses.
• True correct
• False
--

9. Which of the following statements are true of Snowflake data loading? Choose 3
answers
• VARIANT "null" values are not the same as SQL NULL values correct
• It is recommended to do frequent, single row DMLs
• It is recommended to validate the data before loading into the Snowflake target table
correct
• It is recommended to use staging tables to manage MERGE statements correct
--

10. True or false: Snowflake enforces unique, primary key, and foreign key constraints
during DML operations.
• True
• False correct
--

11. Which of the following are valid approaches to loading data into a Snowflake table?
Select all the below that apply.


• Bulk copy from an External Stage correct
• Continuous load using Snowpipe REST API correct
• The Snowflake Web Interface (UI) data loading wizard correct
• Bulk copy from an Internal Stage correct
--

12. As a best practice, clustering keys should only be defined on tables of which minimum
size?
• Multi-Kilobyte (KB) Range
• Multi-Megabyte (MB) Range
• Multi-Gigabyte (GB) Range
• Multi-Terabyte (TB) Range correct
--
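For tables in that size range, a clustering key is defined with ALTER TABLE, and clustering quality can be inspected afterwards. A sketch with placeholder table and column names:

```sql
-- Define a clustering key on a large table (names are placeholders).
ALTER TABLE events CLUSTER BY (event_date, customer_id);

-- Check how well the table is clustered on those columns.
SELECT SYSTEM$CLUSTERING_INFORMATION('events', '(event_date, customer_id)');
```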

13. What is the maximum compressed row size in Snowflake?


• 8KB
• 16MBcorrect
• 50MB
• 4000GB
--

14. In which layer of its architecture does Snowflake store its metadata statistics?
• Storage Layer
• Compute Layer
• Database Layer
• Cloud Service Layer correct
--

15. True or False: All Snowflake table types include fail-safe storage.
• True
• False correct
--

16. Which of the following languages can be used to implement Snowflake User Defined
Functions (UDFs)? Choose 2 answers
• Java
• JavaScript correct
• SQLcorrect
• Python
--

17. A Virtual Warehouse's auto-suspend and auto-resume settings apply to:


• The primary cluster in the virtual warehouse
• The entire Virtual Warehouse correct
• The database the Virtual Warehouse resides in
• The queries currently being run by the Virtual Warehouse
--

18. True or False: A third-party tool that supports standard JDBC or ODBC but has no
Snowflake-specific driver will be unable to connect to Snowflake.
• True
• False correct
--

19. When scaling up Virtual Warehouse by increasing Virtual Warehouse t-shirt size, you
are primarily scaling for improved:
• Concurrency
• Performance correct
--

20. Which of the following are main sections of the top navigation of the Snowflake web
Interface (UI)?
• Database correct
• Tables
• Warehouses correct
• Worksheets correct
--

21. The number of queries that a Warehouse can concurrently process is determined by:
Choose 2 answers
• The complexity of each query correct
• The CONCURRENT_QUERY_LIMIT parameter set on the Snowflake account
• The size of the data required for each query correct
• The tool that is executing the query
--

22. What happens when a Data Provider revokes privileges to a Share on an object in their
source database?
• The object immediately becomes unavailable for all Data Consumers correct
• Any additional data arriving after this point in time will not be visible to Data Consumers
• The Data Consumers stop seeing data updates and become responsible for storage
charges for the object
• A static copy of the object at the time the privilege was revoked is created In the Data
Consumers' accounts
--

23. Which type of table corresponds to a single Snowflake session?


• Temporary correct
• Transient
• Provisional
• Permanent
--

24. Fail-safe is unavailable on which table types?


• Temporary correct
• Transient correct
• Provisional
• Permanent
--

25. Increasing the size of a Virtual Warehouse from an X-Small to an X-Large is an


example of:
• Scaling rhythmically
• Scaling max
• Scaling out
• Scaling up correct
--

26. Which of the following statements is true of data loading?


• Resizing the virtual warehouse from X-Small to Small will process a single file twice as
fast
• The ideal file size for loading is 16MB to match micro-partition size
• Many files in the 10-100MB range tend to land in the "sweet spot" for load parallelism correct
• Once loaded, there is no option to force a reload of an already loaded file
--

27. On which of the following cloud platform can a Snowflake account be hosted? Choose
2 answers
• Amazon Web Servicescorrect
• Private Virtual Cloud
• Oracle Cloud
• Microsoft Azure Cloudcorrect
--

28. Which of the following are common use cases for zero-copy cloning? Choose 3
answers
• Quick provisioning of Dev and Test/QA environments correct
• Data backups correct
• Point in time snapshots correct
• Performance optimization
--

29. True or False: Micro-partition metadata enables some operations to be completed


without requiring Compute.
• True correct
• False

30. Which of the following accurately represents how a table fits into Snowflake’s logical
container hierarchy?
• Account -> Schema -> Database -> Table
• Account -> Database -> Schema -> Table correct
• Database -> Schema -> Table -> Account
• Database -> Table -> Schema -> Account

31. True or False: When data share is established between a Data Provider and a data
Consumer, the Data Consumer can extend that data share to other Data Consumers.
• True
• False correct

32. Increasing the maximum number of clusters in a Multi-Cluster Warehouse is an


example of:
• Scaling rhythmically
• Scaling max
• Scaling out correct
• Scaling Up

33. Which of the following connectors are available in the Downloads section of the
Snowflake web Interface (UI) Choose 2 answers
• SnowSQL correct
• ODBC correct
• R
• HIVE

34. Which of the following items does the Cloud services Layer manage? Choose 4
answers
• user authentication correct
• Metadatacorrect
• Query compilation and optimization correct
• external blob storage
• Data security correct

35. Each incremental increase in Virtual Warehouse size (e,g. Medium to Large) generally
results in what?
• More micro-partitions
• Better query scheduling
• Double the number of servers in the compute cluster correct
• Higher storage costs

36. For a multi-cluster Warehouse, the number of credits billed is calculated on:
• The number of queries that ran using the Warehouse.
• The size of the Warehouse and the number of clusters that ran within a given time
period. correct
• The size of the Warehouse and the maximum number of clusters configured for the
Warehouse.
• The number of users who accessed the Warehouse.

37. Which of the following are options when creating a Virtual Warehouse?
• Auto-suspend correct
• Auto-resume correct
• Local SSD size
• User count

38. The FLATTEN function is used to query which type of data in Snowflake?
• Structured data
• Semi-structured data correct
• Both of the above
• None of the above
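
A typical use of FLATTEN on semi-structured data, with placeholder table and column names (json_col holds a VARIANT array such as '["a","b","c"]'):

```sql
-- Expand each array element of the VARIANT column into its own row
-- (table and column names are illustrative placeholders).
SELECT t.id, f.value::STRING AS element
FROM my_table t,
     LATERAL FLATTEN(INPUT => t.json_col) f;
```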

39. True or False: Multi-Factor Authentication (MFA) in Snowflake is only supported in
conjunction with Single Sign-On (SSO).
• True
• False correct

40. Which of the following statements is true of zero-copy cloning?


• Zero-copy clones objects inherit
• All zero-copy clone objects inherit the privileges of their original objects
• Zero-copy cloning is licensed as an additional Snowflake feature
• At the instant a clone is created, all micro-partitions in the original table and
the clone are fully shared. correct

Question 1
To run a Multi-Cluster Warehouse in auto-scale mode, a user would:

• A. Configure the Maximum Clusters setting to “Auto-Scale”

• B. Set the Warehouse type to “Auto”
• C. Set the Minimum Clusters and Maximum Clusters settings to the same value
• D. Set the Minimum Clusters and Maximum Clusters settings to different
values

Answer : D

Question 2
Which of the following terms best describes Snowflake's database architecture?

• A. Columnar shared nothing


• B. Shared disk
• C. Multi-cluster, shared data
• D. Cloud-native shared memory

• Answer : C

Question 3
Which of the following are options when creating a Virtual Warehouse? (Choose two.)

• A. Auto-drop
• B. Auto-resize
• C. Auto-resume
• D. Auto-suspend

Answer : CD

Question 4
A Virtual Warehouse's auto-suspend and auto-resume settings apply to:

• A. The primary cluster in the Virtual Warehouse


• B. The entire Virtual Warehouse
• C. The database the Virtual Warehouse resides in
• D. The queries currently being run by the Virtual Warehouse

Answer : B

Question 5
Fail-safe is unavailable on which table types? (Choose two.)

• A. Temporary
• B. Transient
• C. Provisional
• D. Permanent

Answer : AB

Question 6
Which of the following objects is not covered by Time Travel?

• A. Tables
• B. Schemas
• C. Databases
• D. Stages

Answer : D

Question 7
True or False: Micro-partition metadata enables some operations to be completed
without requiring Compute.
• A. True
• B. False

Answer : A

Question 8
Which of the following commands are not blocking operations? (Choose two.)

• A. UPDATE
• B. INSERT
• C. MERGE
• D. COPY

Answer : BD

Question 9
Which of the following is true of Snowpipe via REST API? (Choose two.)

• A. You can only use it on Internal Stages


• B. All COPY INTO options are available during pipe creation
• C. Snowflake automatically manages the compute required to execute the Pipe’s
COPY INTO commands
• D. Snowpipe keeps track of which files it has loaded

Answer : CD

Question 10
Snowflake recommends, as a minimum, that all users with the following role(s) should
be enrolled in Multi-Factor Authentication (MFA):

• A. SECURITYADMIN, ACCOUNTADMIN, PUBLIC, SYSADMIN


• B. SECURITYADMIN, ACCOUNTADMIN, SYSADMIN
• C. SECURITYADMIN, ACCOUNTADMIN
• D. ACCOUNTADMIN

Answer : D

Question 11
When can a Virtual Warehouse start running queries?

• A. 12am-5am
• B. Only during administrator defined time slots
• C. When its provisioning is complete
• D. After replication
Answer : C

Question 12
True or False: Users are able to see the result sets of queries executed by other users
that share their same role.
• A. True
• B. False
Answer : B

Question 13
True or False: The user has to specify which cluster a query will run on in a multi-
cluster Warehouse.

• A. True
• B. False

Answer : B

Question 14
True or False: Pipes can be suspended and resumed.

• A. True
• B. False

Answer : A

Question 15
Which of the following languages can be used to implement Snowflake User Defined
Functions (UDFs)? (Choose two.)

• A. Java
• B. Javascript
• C. SQL
• D. Python

Answer : BC
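
Both supported languages can be illustrated briefly (function names and bodies are hypothetical):

```sql
-- A SQL UDF (names are placeholders).
CREATE FUNCTION area_of_circle(radius FLOAT)
  RETURNS FLOAT
  AS 'PI() * radius * radius';

-- A JavaScript UDF.
CREATE FUNCTION js_double(x FLOAT)
  RETURNS FLOAT
  LANGUAGE JAVASCRIPT
  AS 'return X * 2;';  -- JavaScript UDF arguments are referenced in uppercase
```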

Which Snowflake function will parse a JSON-null into a SQL-null?

A TO_CHAR

B TO_VARIANT

C TO_VARCHAR

D STRIP_NULL_VALUE



Correct Answer: D

The STRIP_NULL_VALUE function in Snowflake is used to convert a JSON null value into a SQL
NULL value.
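The behavior can be seen side by side (runnable in any worksheet; no table needed):

```sql
-- A JSON null stays a VARIANT null unless it is stripped:
SELECT PARSE_JSON('{"a": null}'):a                    AS json_null,  -- VARIANT null
       STRIP_NULL_VALUE(PARSE_JSON('{"a": null}'):a) AS sql_null;   -- SQL NULL
```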

Question #2

What are the least privileges needed to view and modify resource monitors? (Select TWO).

A SELECT

B OWNERSHIP

C MONITOR

D MODIFY

E USAGE


Correct Answer: C, D

To view and modify resource monitors, the least privileges needed are MONITOR and
MODIFY. These privileges allow a user to monitor credit usage and make changes to resource
monitors.

Question #3

Which object can be used with Secure Data Sharing?

A View

B Materialized view

C External table

D User-Defined Function (UDF)



Correct Answer: A

Views can be used with Secure Data Sharing in Snowflake. Materialized views, external tables,
and UDFs are not typically shared directly for security and performance reasons.


Question #5

A tag object has been assigned to a table (TABLE_A) in a schema within a Snowflake database.

Which CREATE object statement will automatically assign the TABLE_A tag to a target object?

A CREATE TABLE <table_name> LIKE TABLE_A;

B CREATE VIEW <view_name> AS SELECT * FROM TABLE_A;

C CREATE TABLE <table_name> AS SELECT * FROM TABLE_A;


D CREATE MATERIALIZED VIEW <view name> AS SELECT * FROM TABLE A;


Correct Answer: C

When a tag object is assigned to a table, using the statement CREATE TABLE <table_name> AS
SELECT * FROM TABLE_A will automatically assign the TABLE_A tag to the newly created
table.

Question #1

What does Snowflake attempt to do if any of the compute resources for a virtual warehouse
fail to provision during start-up?

A Repair the failed resources.

B Restart failed resources.

C Queue the failed resources

D Provision the failed resources


Correct Answer: A

Question #2

When sharing data in Snowflake, what privileges does a Provider need to grant along with a
share? (Select TWO).

A USAGE on the specific tables in the database.

B SELECT on the specific tables in the database.

C MODIFY on the specific tables in the database.

D USAGE on the database and the schema containing the tables to share

E OPERATE on the database and the schema containing the tables to share.

Correct Answer: B, D

Question #3

Which view in SNOWFLAKE.ACCOUNT_USAGE shows from which IP address a user
connected to Snowflake?

A ACCESS_HISTORY

B LOGIN_HISTORY

C SESSIONS

D QUERY_HISTORY


Correct Answer: B

Question #5

What does Snowflake recommend a user do if they need to connect to Snowflake with a tool
or technology that is not listed in the Snowflake partner ecosystem?

A Use Snowflake's native API.

B Use a custom-built connector.

C Contact Snowflake Support for a new driver.

D Connect through Snowflake's JDBC or ODBC drivers


Correct Answer: D

Question 1

Among the options listed, which is not a valid Snowflake Warehouse Size?

A) S

B) XL

C) M

D) XXS

Question 2

Does the Fail-Safe period for temporary and transient tables equal 0?
A) True

B) False

Question 3

Does the time travel feature in Snowflake preserve data at the expense of running

continuous backups?

A) True

B) False

Question 4

How many predecessor tasks can a child task have?

A) 1

B) 100

C) 1000

D) 5

Question 5

How frequently does Snowpipe load data into Snowflake?

A) As soon as data files are available in a stage.


B) Once every 1 minute.

C) Once every 5 minutes.

D) When we manually execute the COPY procedure.

Question 6

What types of stages are available in Snowflake?

A) External stages.

B) Mix stages.

C) Internal Stages.

D) Provider Stages.

Question 7

Will Snowpipe reload a file with the same name if it was modified 15 days after the

original load and copied over to the stage?

A) No. Snowpipe ignores any files that have already been loaded.

B) Yes, it will copy the file, causing potential duplicates. This is because Snowpipe
keeps load history for only 14 days.


Question 8

Are shared databases exclusively read-only databases?

A) Yes

B) No

Question 9

Can a single storage integration support multiple external stages?

A) Yes

B) No

Question 10

Which of the following scenarios represents scaling out for a Snowflake virtual

Warehouse?

A) Changing the size of a warehouse from L to XL.

B) Adding a new virtual warehouse of the same size.

C) Adding more data storage to your Snowflake account.

D) Adding more EC2 instances for SQS.


Solutions

Question 1

D) XXS.

Explanation:

There is no XXS (Double Extra Small) warehouse size. The smallest available

warehouse size is X-Small (XS). Refer to Snowflake's documentation for the full list of sizes.

Question 2

A) True.

Explanation:

The Fail-Safe retention period for temporary and transient tables in Snowflake is

indeed 0. Unlike regular tables, they do not have a Fail-Safe retention period.

Question 3

B) False.
Explanation:

The Time Travel feature in Snowflake allows you to access historical data, including

data that has been altered or deleted, within a specified time frame. While using Time

Travel may result in additional storage costs, this feature is not dependent on

continuous backups like traditional databases.

When any DML operations are performed on a table, Snowflake retains previous

versions of the table data for a defined period of time. This enables querying earlier

versions of the data using the AT | BEFORE clause.
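As a sketch of this clause (the table name and statement ID below are hypothetical), a Time Travel query can look like:

```sql
-- Query the table as it existed 30 minutes ago (table name hypothetical).
SELECT * FROM orders AT(OFFSET => -60 * 30);

-- Query the state of the table just before a specific statement ran
-- (the statement ID here is a placeholder, not a real query ID).
SELECT * FROM orders BEFORE(STATEMENT => '01a2b3c4-0000-1234-0000-000000000001');
```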

Question 4

B) 100

Explanation:

A single task can have a maximum of 100 predecessor tasks and 100 child tasks.

Question 5

A) As soon as data files are available in a stage.

Explanation:

Snowpipe enables loading data from files as soon as they’re available in a stage
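A minimal Snowpipe definition might look like the following sketch; the pipe, stage, and table names are hypothetical, and AUTO_INGEST relies on cloud event notifications being configured:

```sql
-- Continuously load JSON files from the stage into the target table
-- as soon as event notifications announce their arrival.
CREATE PIPE my_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO raw_events
  FROM @my_stage
  FILE_FORMAT = (TYPE = 'JSON');
```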

Question 6

A) External stages.
C) Internal Stages.

Explanation:

In Snowflake, there are primarily two types of stages:

A) External Stages: These stages reference data files that are stored externally, such as

in a cloud storage service like AWS S3, Azure Blob Storage, or Google Cloud Storage.

C) Internal Stages: These stages store data files internally within Snowflake, making

them a part of the Snowflake ecosystem.
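Assuming hypothetical names, a bucket URL, and an existing storage integration, the two stage types can be created like this:

```sql
-- Internal named stage: files are stored inside Snowflake.
CREATE STAGE my_internal_stage;

-- External stage: references files in cloud storage
-- (the URL and integration name are placeholders).
CREATE STAGE my_external_stage
  URL = 's3://my-bucket/data/'
  STORAGE_INTEGRATION = my_s3_integration;
```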

Question 7

B) Yes it will copy the file causing potential duplicates. This is because Snowpipe

keeps history for only 14 days.

Explanation:

For Files modified and staged again within 14 days

Snowpipe ignores modified files that are staged again. To reload modified data files, it

is currently necessary to recreate the pipe object using the CREATE OR REPLACE

PIPE syntax.

For Files modified and staged again after 14 days

Snowpipe loads the data again, potentially resulting in duplicate records in the target

table.
Question 8

A) Yes

Explanation:

All database objects shared between accounts are read-only (i.e. the objects cannot

be modified or deleted, including adding or modifying table data).

Sharing accounts cannot modify or delete objects or data within the shared database.

The primary purpose of sharing databases is to enable other accounts to query the

data without altering it.

Question 9

A) Yes

Explanation:

A single storage integration can support multiple external stages. The URL in the

stage definition must align with the storage location specified for the

STORAGE_ALLOWED_LOCATIONS parameter.

STORAGE_ALLOWED_LOCATIONS = ('cloud_specific_url')

Explicitly limits external stages that use the integration to reference one or more

storage locations (i.e. S3 bucket, GCS bucket, or Azure container). Supports a comma-

separated list of URLs for existing buckets and, optionally, paths used to store data
files for loading/unloading. Alternatively supports the * wildcard, meaning “allow

access to all buckets and/or paths”.
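A sketch of such an integration, with a hypothetical IAM role ARN and bucket paths, could look like this:

```sql
-- Storage integration allowing two S3 paths (all identifiers are placeholders).
CREATE STORAGE INTEGRATION my_s3_integration
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/snowflake-access'
  STORAGE_ALLOWED_LOCATIONS = ('s3://my-bucket/load/', 's3://my-bucket/unload/');

-- Multiple external stages can then reference the same integration,
-- as long as each stage URL falls within the allowed locations.
CREATE STAGE stage_load
  URL = 's3://my-bucket/load/'
  STORAGE_INTEGRATION = my_s3_integration;
```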

Question 10

B) Adding a new virtual warehouse of the same size.

Explanation:

Scaling out in Snowflake involves adding more virtual warehouses of the same size to

work in parallel. This approach increases concurrency and allows for more

simultaneous processing of queries. In contrast, scaling up involves changing the size

of an existing warehouse to provide more compute power, while the option related to

requesting more data storage from the cloud provider is unrelated to scaling virtual

warehouses.

Snowflake supports two ways to scale warehouses:

• Scale up by resizing a warehouse.

• Scale out by adding clusters to a multi-cluster warehouse

(requires Snowflake Enterprise Edition or higher).
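The two approaches can be sketched with ALTER WAREHOUSE statements (the warehouse name is hypothetical):

```sql
-- Scale up: resize an existing warehouse for more compute power per query.
ALTER WAREHOUSE my_wh SET WAREHOUSE_SIZE = 'XLARGE';

-- Scale out: allow the warehouse to run as a multi-cluster warehouse
-- for higher concurrency (requires Enterprise Edition or higher).
ALTER WAREHOUSE my_wh SET MIN_CLUSTER_COUNT = 1, MAX_CLUSTER_COUNT = 3;
```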

Question 11

What is the recommended method for storing JSON data in Snowflake?

Options:

A) Using a column of the JSON data type.


B) Using a column of the VARCHAR data type.

C) Using a column of the VARIANT data type.

D) Using a column of the NULL data type.

Question 12

Is Fail-Safe period configurable?

A) Yes

B) No

Question 13

Can you use the GET command in Snowflake to download files from an external

stage?

A) Yes, the GET command can be used to download files from external stages.

B) No, Snowflake does not support the GET command for downloading files.

C) The GET command is used for querying data, not for file downloads.

D) The DOWNLOAD command should be used instead of GET for file retrieval from

external stages.
Question 14

What are the various types of stages that exist in Snowflake?

Options:

A) User.

B) Table.

C) Named internal.

D) Named external.

E) Account.

Question 15

Which command can be employed in SnowSQL to retrieve files from a stage?

A) GET.

B) PUT.

C) UNLOAD.

D) INSERT INTO.
Question 16

Which table function in Snowflake enables the transformation of semi-structured data

into a relational format?

A) FLATTEN.

B) CHECK_JSON.

C) PARSE_JSON.

Question 17

What method does Snowflake employ to restrict the retrieval of a minimal number of

micro-partitions as part of a query?

A) Pruning.

B) Clustering.

C) Indexing.

D) Computing.

Question 18

In which of the following situations is it advisable to utilize Snowpipe for loading data

into Snowflake?

A) We have a small volume of frequent data.


B) We have a huge volume of data generated as part of a batch schedule.

C) In both of the previous scenarios.

D) For running real-time reports on large data sets.

Question 19

Does Fail-Safe on tables result in additional storage costs ?

A) Yes

B) No

Question 20

Does Snowflake trigger the re-clustering of a table only when it would benefit from

the operation?

A) True

B) False

Answer Key

Question 11

Solution: C) Using a column of the VARIANT data type.


Explanation: In Snowflake, it is best practice to store JSON data as the VARIANT

data type within the tables. VARIANT columns can accommodate semi-structured

data, including JSON, with a maximum size limit of 16MB. This allows for efficient

storage and querying of JSON data using JSON notation, enabling the storage of

arrays, objects, and other JSON structures within Snowflake.
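A small sketch of this pattern, using a hypothetical table and JSON payload:

```sql
-- Table with a VARIANT column holding raw JSON.
CREATE TABLE events (payload VARIANT);

-- PARSE_JSON converts a JSON string into a VARIANT value.
INSERT INTO events
SELECT PARSE_JSON('{"user": {"name": "Ada"}, "tags": ["x", "y"]}');

-- Traverse the JSON with path notation and cast to a SQL type.
SELECT payload:user.name::STRING FROM events;
```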

Question 12

Solution: B) No

Explanation: Fail-safe provides a (non-configurable) 7-day period during which

historical data may be recoverable by Snowflake.

Question 13

Solution: B) No, Snowflake does not support the GET command for downloading

files.

Explanation:

Get command downloads data files from one of the following Snowflake stages to a

local directory/folder on a client machine:

• Named internal stage.

• Internal stage for a specified table.

• Internal stage for the current user.


• GET does not support downloading files from external stages. To

download files from external stages, use the utilities provided by the cloud

service.

Question 14

Solution:

A) User.

B) Table.

C) Named internal.

D) Named external.

Explanation: Snowflake provides different types of stages to handle data loading

and retrieval. These include User stages, Table stages (associated with individual

tables), Named internal stages (for internal storage), and Named external stages (for

external data sources). Each type of stage serves distinct purposes in managing and

organizing data within Snowflake. Snowflake supports creating named external

stages.

Question 15

Solution: A) GET.

Explanation: The GET command is utilized in SnowSQL to download files from a

Snowflake internal stage, such as a user stage, table stage, or named internal stage, to
a directory or folder on a client machine. This operation can be performed using

SnowSQL, allowing for easy retrieval of data loaded using the COPY INTO command.
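For illustration (stage name and local path are hypothetical), a GET command run from SnowSQL looks like:

```sql
-- Download staged files from a named internal stage to a local folder.
GET @my_internal_stage/reports/ file:///tmp/downloads/;
```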

Question 16

Solution: A) FLATTEN.

Explanation: The FLATTEN table function in Snowflake is designed for the

conversion of semi-structured data into a relational representation. It works with

VARIANT, OBJECT, or ARRAY columns and produces a lateral view, facilitating the

handling of semi-structured data in a structured, tabular format.
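As a sketch, assuming a hypothetical table events with a VARIANT column payload that contains an array:

```sql
-- FLATTEN turns each array element into its own row; the LATERAL keyword
-- lets the function reference columns of the preceding table.
SELECT f.value::STRING AS tag
FROM events,
     LATERAL FLATTEN(INPUT => payload:tags) f;
```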

Question 17

Solution: A) Pruning.

Explanation: Snowflake utilizes query pruning to analyze and access the minimum

number of micro-partitions necessary to answer a query efficiently. This technique

optimizes query performance by retrieving only the relevant data, significantly

reducing query processing time.

Question 18

Solution: A) We have a small volume of frequent data.

Explanation: Snowpipe enables loading data from files as soon as they’re available

in a stage. This means you can load data from files in micro-batches, making it

available to users within minutes, rather than manually executing COPY statements

on a schedule to load larger batches.


Question 19

Solution: A) Yes

Explanation: Fail-safe provides a (non-configurable) 7-day period during which

historical data may be recoverable by Snowflake. Storage fees are incurred for

maintaining historical data during both the Time Travel and Fail-safe periods.

Question 20

A) True.

Explanation: In Snowflake, when performing Data Manipulation Language (DML)

operations such as INSERT, UPDATE, DELETE, MERGE, or COPY, the data in a

clustered table can become less optimally clustered. To address this, Snowflake

provides automatic and periodic re-clustering to maintain optimal data organization.

Importantly, Snowflake only initiates re-clustering for a table when it would result in

a benefit to the table’s data organization and performance.
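Defining a clustering key (the table and columns below are hypothetical) is enough to opt a table into this automatic re-clustering:

```sql
-- Snowflake then re-clusters the table automatically, and only when
-- doing so would benefit data organization and query pruning.
ALTER TABLE sales CLUSTER BY (sale_date, region);

-- Inspect how well the table is clustered on those columns.
SELECT SYSTEM$CLUSTERING_INFORMATION('sales', '(sale_date, region)');
```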

Question 1

A Snowflake account has activated federated authentication.


What will occur when a user with a password that was defined by Snowflake
attempts to log in to Snowflake?
A. The user will be unable to enter a password.

B. The user will encounter an error, and will not be able to log in.

C. The user will be able to log into Snowflake successfully.

D. After entering the username and password, the user will be redirected to an
Identity Provider (IdP) login page.

Answer : A

When federated authentication is activated in Snowflake, users authenticate via


an external identity provider (IdP) rather than using Snowflake-managed
credentials. Therefore, a user with a password defined by Snowflake will be
unable to enter a password and must use their IdP credentials to log in.

Question 2

Which Snowflake feature provides increased login security for users connecting
to Snowflake that is powered by Duo Security service?
A. OAuth

B. Network policies

C. Single Sign-On (SSO)

D. Multi-Factor Authentication (MFA)

Answer : D

Multi-Factor Authentication (MFA) provides increased login security for users


connecting to Snowflake. Snowflake's MFA is powered by Duo Security service,
which adds an additional layer of security during the login process.
Question 3

A column named "Data" contains VARIANT data and stores values as follows:

How will Snowflake extract the employee's name from the column data?
A. Data:employee.name

B. DATA:employee.name

C. data:Employee.name

D. data:employee.name

Answer : D

In Snowflake, to extract a specific value from a VARIANT column, you use the
column name followed by a colon and then the key. The keys are case-sensitive.
Therefore, to extract the employee's name from the "Data" column, the correct
syntax is data:employee.name.

Question 4

What is the purpose of the Snowflake SPLIT_TO_TABLE function?


A. To count the number of characters in a string

B. To split a string into an array of sub-strings

C. To split a string and flatten the results into rows

D. To split a string and flatten the results into columns

Answer : C
The purpose of the Snowflake SPLIT_TO_TABLE function is to split a string based on a
specified delimiter and flatten the results into rows. This table function is useful for
transforming a delimited string into a set of rows that can be further processed or queried.

Question 5

How can a Snowflake user traverse semi-structured data?


A. Insert a colon (:) between the VARIANT column name and any first-level element.

B. Insert a colon (:) between the VARIANT column name and any second-level element.

C. Insert a double colon (::) between the VARIANT column name and any first-level element.

D. Insert a double colon (::) between the VARIANT column name and any second-level element.

Answer : A

To traverse semi-structured data in Snowflake, a user can insert a colon (:) between
the VARIANT column name and any first-level element. This path syntax is used to
retrieve elements in a VARIANT column.


Question 6

Which Snowflake feature allows a user to track sensitive data for compliance,
discovery, protection, and resource usage?
A. Tags

B. Comments

C. Internal tokenization

D. Row access policies

Answer : A
Tags in Snowflake allow users to track sensitive data for compliance, discovery, protection,
and resource usage. They enable the categorization and tracking of data, supporting
compliance with privacy regulations. Reference: [COF-C02] SnowPro Core Certification
Exam Study Guide
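As a sketch (the tag, table, and column names are hypothetical), tagging a sensitive column looks like:

```sql
-- Tag tracking column sensitivity, restricted to two allowed values.
CREATE TAG pii_level ALLOWED_VALUES 'low', 'high';

-- Attach the tag to a column so it can be tracked and audited.
ALTER TABLE customers
  MODIFY COLUMN email SET TAG pii_level = 'high';
```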

Question 7

What does the LATERAL modifier for the FLATTEN function do?
A. Casts the values of the flattened data

B. Extracts the path of the flattened data

C. Joins information outside the object with the flattened data

D. Retrieves a single instance of a repeating element in the flattened data

Answer : C
The LATERAL modifier for the FLATTEN function allows joining information outside the object
(such as other columns in the source table) with the flattened data, creating a lateral view that
correlates with the preceding tables in the FROM clause. Reference: [COF-C02] SnowPro
Core Certification Exam Study Guide


01. Which copy option is used to delete the file from the Snowflake stage
when data from staged files are loaded successfully?
a) DELETE = TRUE
b) DEL = TRUE
c) PURGE = TRUE
d) REMOVE = TRUE

02. A clustering key is added or modified for a large table. Which type of
queries will likely see performance improvement?
a) Queries that select all rows in the table
b) Queries that sort on the columns which are part of the cluster key
c) Queries that join on the columns which are part of the cluster key
d) Queries that select all columns in the table
e) Queries that group on the columns which are part of the cluster key
f) Queries that filter on the columns which are part of the cluster key

03. For a non-ACCOUNTADMIN user, what privileges are required to create a


share?
a) MANAGE ACCOUNT privileges
b) CREATE ACCOUNT privileges
c) SECURITY privileges
d) CREATE SHARE privileges

04. Dynamic Data Masking provides what sort of security in Snowflake?


a) Object Security
b) Row-level security
c) Column-level security
d) Database-level security

05. What is Snowflake's behaviour when enforcing a network policy with an


IP address in both the block list and the allow list?
a) The specific IP address is ignored from the network policy
b) Snowflake uses the allow list first, ensuring that the IP address can connect even
if it is also in the block list.
c) Because both the allowed and blocked lists cannot be filled, the network policy is
invalid.
d) Snowflake initially applies the block list, preventing the IP address from connecting,
even if it is also defined in the allow list.
06. How can directory table metadata be refreshed automatically and
efficiently to synchronize the metadata with the latest associated files in the
external stage and path?
a) Using Streams
b) Using Tasks
c) Using a cloud event notification service
d) Using both Tasks and Streams
e) It is a manual process and can't be automatically refreshed

07. When the Virtual Warehouse data cache gets filled up, in which fashion
does the data get flushed out from the data cache?
a) LEAST-RECENTLY USED (LRU)
b) First In First Out (FIFO)
c) Last In Last Out (LILO)
d) MOST-RECENTLY USED (MRU)

08. External tables are a good solution for which of the following is true?
a) Typically only a subset of data is being accessed
b) Data is in binary format and can not be loaded into Snowflake.
c) The data is not accessed frequently.
d) Data is already in a data lake on a cloud platform (e.g., S3, Azure Blob Storage)

09. You need to see the history of all queries executed in the last 60 minutes.
Which of the following method should you use?
a) Request Snowflake support to provide query history
b) Use the QUERY_HISTORY table function in the INFORMATION schema
c) Use the QUERY_HISTORY view in the ACCOUNT_USAGE schema
d) View the historical queries using the history tab

10. What all options are available for data transformation while loading data
into a table using the COPY command?
a) Column omission
b) Column reordering
c) Casts
d) Truncation of Text Strings
e) Join

Questions 1

Which kind of Snowflake table stores file-level metadata for each file in a stage?

Options:

A. Directory

B. External

C. Temporary

D. Transient

Answer: A

Explanation:

The kind of Snowflake table that stores file-level metadata for each file in a stage is a directory
table. A directory table is an implicit object layered on a stage and stores file-level metadata about
the data files in the stage.
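For illustration (the stage name is hypothetical), enabling and querying a directory table looks like:

```sql
-- Create a stage with its directory table enabled.
CREATE STAGE my_stage DIRECTORY = (ENABLE = TRUE);

-- Query file-level metadata from the directory table.
SELECT relative_path, size, last_modified
FROM DIRECTORY(@my_stage);
```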

Questions 2

What is the minimum Snowflake edition needed for database failover and fail-back between
Snowflake accounts for business continuity and disaster recovery?

Options:

A. Standard

B. Enterprise

C. Business Critical

D. Virtual Private Snowflake


Answer: C

Explanation:

The minimum Snowflake edition required for database failover and fail-back between Snowflake
accounts for business continuity and disaster recovery is the Business Critical edition. References:
Snowflake Documentation.

Questions 3

Which database objects can be shared with the Snowflake secure data sharing feature? (Choose
two.)
Options:

A. Files

B. External tables

C. Secure User-Defined Functions (UDFs)

D. Sequences

E. Streams


Answer:

B, C

Explanation:

Snowflake's secure data sharing feature allows sharing of certain database objects with other
Snowflake accounts. Among the options provided, external tables and secure User-Defined
Functions (UDFs) can be shared.

Question 1

In the Snowflake access control model, which entity owns an object by default?

• A. The user who created the object


• B. The SYSADMIN role
• C. Ownership depends on the type of object
• D. The role used to create the object

Answer: D

Question 2

What is the default access of a securable object until other access is granted?

• A. No access
• B. Read access
• C. Write access
• D. Full access

Answer: A
Question 3

A user created a new worksheet within the Snowsight UI and wants to share this with
teammates.

How can this worksheet be shared?

• A. Create a zero-copy clone of the worksheet and grant permissions to


teammates.
• B. Create a private Data Exchange so that any teammate can use the
worksheet.
• C. Share the worksheet with teammates within Snowsight.
• D. Create a database and grant all permissions to teammates.

Answer: C

Question 4

Which command is used to generate a zero-copy snapshot of any table, schema, or


database?

• A. ALTER
• B. CREATE ... CLONE
• C. COPY INTO
• D. CREATE REPLICATION GROUP

Answer: B

Question 5

Which sequence (order) of object privileges should be used to grant a custom role
read-only access on a table?

• D. None

Answer: C

Question 6

Which of the following objects can be directly restored using the UNDROP
command? (Choose two.)

• A. Schema
• B. View
• C. Internal stage
• D. Table
• E. User
• F. Role

Answer: AD

Question 7

What is the default compression type when unloading data from Snowflake?

• A. Brotli
• B. bzip2
• C. Zstandard
• D. gzip

Answer: D

Question 8

Who can create network policies within Snowflake? (Choose two.)

• A. SYSADMIN only
• B. ORGADMIN only
• C. SECURITYADMIN or higher roles
• D. A role with the CREATE NETWORK POLICY privilege
• E. A role with the CREATE SECURITY INTEGRATION privilege

Answer: CD

Question 9

Which table function should be used to view details on a Directed Acyclic Graph
(DAG) run that is presently scheduled or is executing?

• A. TASK_HISTORY
• B. TASK_DEPENDENTS
• C. CURRENT_TASK_GRAPHS
• D. COMPLETE_TASK_GRAPHS

Answer: C

Question 10

What is used to extract the content of PDF files stored in Snowflake stages?

• A. FLATTEN function
• B. Window function
• C. HyperLogLog (HLL) function
• D. Java User-Defined Function (UDF)

Answer: D

Question 11

Account-level storage usage can be monitored via:

• A. The Snowflake Web Interface (UI) in the Databases section


• B. The Snowflake Web Interface (UI) in the Account -> Billing & Usage section
• C. The Information Schema -> ACCOUNT_USAGE_HISTORY View
• D. The Account Usage Schema -> ACCOUNT_USAGE_METRICS View

Answer: B

Question 12

By default, which role has access to the


SYSTEM$GLOBAL_ACCOUNT_SET_PARAMETER function?

• A. ACCOUNTADMIN
• B. SECURITYADMIN
• C. SYSADMIN
• D. ORGADMIN

Answer: D

Question 13

Which Snowflake features can be enabled by calling the


SYSTEM$GLOBAL_ACCOUNT_SET_PARAMETER function by a user with the
ORGADMIN role? (Choose two.)

• A. Clustering
• B. Client redirect
• C. Fail-safe
• D. Search optimization service
• E. Account and database replication

Answer: BE

Question 14

What is the maximum total Continuous Data Protection (CDP) charges incurred for a
temporary table?
• A. 30 days
• B. 7 days
• C. 48 hours
• D. 24 hours

Answer: D

Question 15

What happens to the shared objects for users in a consumer account from a share,
once a database has been created in that account?

• A. The shared objects are transferred.


• B. The shared objects are copied.
• C. The shared objects become accessible.
• D. The shared objects can be re-shared.

Answer: C

Question 16

What services does Snowflake automatically provide for customers that they may
have been responsible for with their on-premise system? (Choose all that apply.)

• A. Installing and configuring hardware


• B. Patching software
• C. Physical security
• D. Maintaining metadata and statistics

Answer: ABD

Question 17

Which COPY INTO statement accurately describes how to unload data from a
Snowflake table?

• A. The default value for the SINGLE option is set to TRUE.


• B. By default, COPY INTO statements do not separate table data into a set of
output files.
• C. The OBJECT_CONSTRUCT function can be combined with the COPY
command to convert the rows in a relational table to a single VARIANT
column.
• D. If the COMPRESSION option is set to TRUE, a file's name can be specified
with the appropriate file extension so that the output file can be compressed.
Answer: C

Question 18

What tasks can an account administrator perform in the Data Exchange? (Choose
two.)

• A. Add and remove members.


• B. Delete data categories.
• C. Approve and deny listing approval requests.
• D. Transfer listing ownership.
• E. Transfer ownership of a provider profile.

Answer: AC

Question 19

How is unstructured data retrieved from data storage?

• A. SQL functions like the GET command can be used to copy the unstructured
data to a location on the client.
• B. SQL functions can be used to create different types of URLs pointing to the
unstructured data. These URLs can be used to download the data to a client.
• C. SQL functions can be used to retrieve the data from the query results cache.
When the query results are output to a client. the unstructured data will be
output to the client as files.
• D. SQL functions can call on different web extensions designed to display
different types of files as a web page. The web extensions will allow the files to
be downloaded to the client.

Answer: B

Which Snowflake database object can be shared with other accounts?

• A. Tasks
• B. Pipes
• C. Secure User-Defined Functions (UDFs)
• D. Stored Procedures

Answer: C
Questions 4

What happens when a virtual warehouse is resized?

Options:

A.

When increasing the size of an active warehouse, the compute resources for all running and
queued queries on the warehouse are affected

B.

When reducing the size of a warehouse the compute resources are removed only when they are
no longer being used to execute any current statements.

C.

The warehouse will be suspended while the new compute resource is provisioned and will resume
automatically once provisioning is complete.

D.

Users who are trying to use the warehouse will receive an error message until the resizing is
complete
Answer: A

Explanation:

When a virtual warehouse in Snowflake is resized, specifically when it is increased in size, the
additional compute resources become immediately available to all running and queued queries.
This means that the performance of these queries can improve due to the increased
resources. Conversely, when the size of a warehouse is reduced, the compute resources are not
removed until they are no longer being used by any current operations.

Questions 5

Which Snowflake objects track DML changes made to tables, like inserts, updates, and deletes?

Options:

A.

Pipes

B.

Streams

C.

Tasks

D.

Procedures


Answer: B

Explanation:

In Snowflake, Streams are the objects that track Data Manipulation Language (DML) changes
made to tables, such as inserts, updates, and deletes. Streams record these changes along with
metadata about each change, enabling actions to be taken using the changed data. This process is
known as change data capture (CDC).
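A minimal sketch of this, assuming a hypothetical source table named orders:

```sql
-- The stream records row-level changes made to the table after this point.
CREATE STREAM orders_stream ON TABLE orders;

-- Reading the stream returns the changed rows plus metadata columns
-- such as METADATA$ACTION and METADATA$ISUPDATE.
SELECT * FROM orders_stream;
```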

Questions 6

When unloading to a stage, which of the following is a recommended practice or approach?

Options:

A.

Set SINGLE = TRUE for larger files

B.

Use OBJECT_CONSTRUCT ( * ) when using Parquet

C.

Avoid the use of the CAST function

D.

Define an individual file format


Answer: D

Explanation:

When unloading data to a stage, it is recommended to define an individual file format. This
ensures that the data is unloaded in a consistent and expected format, which can be crucial for
downstream processing and analysis.
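As a sketch (the format, stage, and table names are hypothetical), a named file format used during unloading looks like:

```sql
-- Named file format controlling how unloaded data is written.
CREATE FILE FORMAT my_csv_format
  TYPE = 'CSV'
  FIELD_OPTIONALLY_ENCLOSED_BY = '"'
  COMPRESSION = 'GZIP';

-- Unload the table to a stage using that format.
COPY INTO @my_stage/out/
FROM my_table
FILE_FORMAT = (FORMAT_NAME = 'my_csv_format');
```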

Questions 7

Will data cached in a warehouse be lost when the warehouse is resized?

Options:

A.

Possibly, if the warehouse is resized to a smaller size and the cache no longer fits.
B.

Yes, because the compute resource is replaced in its entirety with a new compute resource.

C.

No, because the size of the cache is independent of the warehouse size

D.

Yes, because the new compute resource will no longer have access to the cache encryption key


Answer: C

Explanation:

When a Snowflake virtual warehouse is resized, the data cached in the warehouse is not lost. This
is because the cache is maintained independently of the warehouse size. Resizing a warehouse,
whether scaling up or down, does not affect the cached data, ensuring that query performance is
not impacted by such changes.

References:

• [COF-C02] SnowPro Core Certification Exam Study Guide


• Snowflake Documentation on Virtual Warehouse Performance1

Questions 8

What is the minimum Snowflake edition required to create a materialized view?

Options:

A.

Standard Edition

B.

Enterprise Edition

C.
Business Critical Edition

D.

Virtual Private Snowflake Edition


Answer: B

Explanation:

Materialized views in Snowflake are a feature that allows for the pre-computation and storage of
query results for faster query performance. This feature is available starting from the Enterprise
Edition of Snowflake. It is not available in the Standard Edition, and while it is also available in
higher editions like Business Critical and Virtual Private Snowflake, the Enterprise Edition is the
minimum requirement.
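For illustration (the view and table names are hypothetical), creating a materialized view on an Enterprise Edition account looks like:

```sql
-- Pre-computed aggregate that Snowflake keeps up to date automatically.
CREATE MATERIALIZED VIEW daily_totals AS
SELECT sale_date, SUM(amount) AS total
FROM sales
GROUP BY sale_date;
```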

Questions 9

What can be used to view warehouse usage over time? (Select Two).

Options:

A.

The LOAD_HISTORY view

B.

The QUERY_HISTORY view

C.

The SHOW WAREHOUSES command

D.

The WAREHOUSE_METERING_HISTORY view

E.

The Billing & Usage tab in the Snowflake web UI



Answer:

B, D

Explanation:

To view warehouse usage over time, the QUERY_HISTORY view and the
WAREHOUSE_METERING_HISTORY view can be utilized. The QUERY_HISTORY view allows users
to monitor the performance of their queries and the load on their warehouses over a specified
period. The WAREHOUSE_METERING_HISTORY view provides detailed information about
the workload on a warehouse within a specified date range, including average running and
queued loads. References: [COF-C02] SnowPro Core Certification Exam Study Guide

Questions 10

From which Snowflake interface can the command to stage local files be executed?

Options:

A.

SnowSQL

B.

Snowflake Classic web interface (UI)

C.

Snowsight

D.

.NET driver

Answer:

A

Explanation:

SnowSQL is the command-line client for Snowflake that allows users to execute SQL queries and
perform all DDL and DML operations, including staging files for bulk data loading. It is specifically
designed for scripting and automating tasks.

Questions 11

Which data type can be used to store geospatial data in Snowflake?

Options:

A.

Variant

B.

Object

C.

Geometry

D.

Geography

Answer:

D

Explanation:

Snowflake supports two geospatial data types: GEOGRAPHY and GEOMETRY.


The GEOGRAPHY data type is used to store geospatial data that models the Earth as a perfect
sphere, which is suitable for global geospatial data. This data type follows the WGS 84 standard
and is used for storing points, lines, and polygons on the Earth’s surface. The GEOMETRY data
type, on the other hand, represents features in a planar (Euclidean, Cartesian) coordinate system
and is typically used for local spatial reference systems. Since the question specifically asks about
geospatial data, which commonly refers to Earth-related spatial data, the correct answer
is GEOGRAPHY3. References: [COF-C02] SnowPro Core Certification Exam Study Guide

Questions 12

What are two ways to create and manage Data Shares in Snowflake? (Choose two.)

Options:

A.

Via the Snowflake Web Interface (UI)

B.

Via the data_share=true parameter

C.

Via SQL commands

D.

Via Virtual Warehouses

Answer:

A, C

Explanation:

In Snowflake, Data Shares can be created and managed in two primary ways:

• Via the Snowflake Web Interface (UI): Users can create and manage shares through the
graphical interface provided by Snowflake, which allows for a user-friendly experience.
• Via SQL commands: Snowflake also allows the creation and management of shares using
SQL commands. This method is more suited for users who prefer scripting or need to
automate the process.

Questions 13
What transformations are supported in a CREATE PIPE ... AS COPY ... FROM (....) statement?
(Select TWO.)

Options:

A.

Data can be filtered by an optional where clause

B.

Incoming data can be joined with other tables

C.

Columns can be reordered

D.

Columns can be omitted

E.

Row level access can be defined

Answer:

A, D

Explanation:

In a CREATE PIPE ... AS COPY ... FROM (....) statement, the supported transformations include
filtering data using an optional WHERE clause and omitting columns. The WHERE clause allows
for the specification of conditions to filter the data that is being loaded, ensuring only relevant
data is inserted into the table. Omitting columns enables the exclusion of certain columns from
the data load, which can be useful when the incoming data contains more columns than are
needed for the target table.
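The two supported transformations can be sketched in plain Python (this is an illustrative analog, not Snowflake code): the predicate plays the role of the optional WHERE clause, and the column projection plays the role of column omission.

```python
# Illustrative sketch of the two transformations a CREATE PIPE ... AS COPY
# statement supports: row filtering (WHERE) and column omission.
def copy_with_transform(rows, keep_columns, predicate):
    """Project each matching row onto keep_columns, dropping everything else."""
    return [
        {col: row[col] for col in keep_columns}   # columns can be omitted
        for row in rows
        if predicate(row)                         # rows can be filtered
    ]

incoming = [
    {"id": 1, "region": "EU", "raw_payload": "..."},
    {"id": 2, "region": "US", "raw_payload": "..."},
]
loaded = copy_with_transform(incoming, ["id", "region"], lambda r: r["region"] == "EU")
# loaded keeps only the EU row, without the raw_payload column
```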

Questions 14

What tasks can be completed using the copy command? (Select TWO)
Options:

A.

Columns can be aggregated

B.

Columns can be joined with an existing table

C.

Columns can be reordered

D.

Columns can be omitted

E.

Data can be loaded without the need to spin up a virtual warehouse

Answer:

C, D

Explanation:

The COPY command in Snowflake allows for the reordering of columns as they are loaded into a
table, and it also permits the omission of columns from the source file during the load process.
This provides flexibility in handling the schema of the data being ingested. References: [COF-C02]
SnowPro Core Certification Exam Study Guide

Questions 15

A sales table FCT_SALES has 100 million records.

The following Query was executed

SELECT COUNT(1) FROM FCT_SALES;

How did Snowflake fulfill this query?


Options:

A.

Query against the result set cache

B.

Query against a virtual warehouse cache

C.

Query against the most-recently created micro-partition

D.

Query against the metadata cache

Answer:

D

Explanation:

Snowflake is designed to optimize query performance by utilizing metadata for certain types of
queries. When executing a COUNT query, Snowflake can often fulfill the request by accessing
metadata about the table’s row count, rather than scanning the entire table or micro-partitions.
This is particularly efficient for large tables like FCT_SALES with a significant number of records.
The metadata layer maintains statistics about the table, including the row count, which enables
Snowflake to quickly return the result of a COUNT query without the need to perform a full scan.
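A hypothetical model of this shortcut: if each micro-partition records its row count as metadata when it is written, a COUNT can be answered by summing the stored counts without reading any row data.

```python
# Hypothetical model of answering COUNT(1) from metadata alone: each
# micro-partition records its row count at write time, so the service sums
# those stored counts instead of scanning the table.
class MicroPartition:
    def __init__(self, rows):
        self.rows = rows
        self.row_count = len(rows)   # metadata captured at write time

def count_via_metadata(partitions):
    # Note: p.rows is never read here -- only the metadata field is consulted.
    return sum(p.row_count for p in partitions)

fct_sales = [MicroPartition(list(range(100_000))), MicroPartition(list(range(42)))]
```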

Questions 16

The fail-safe retention period is how many days?

Options:

A.

1 day

B.

7 days
C.

45 days

D.

90 days

Answer:

B

Explanation:

Fail-safe is a feature in Snowflake that provides an additional layer of data protection. After the
Time Travel retention period ends, Fail-safe offers a non-configurable 7-day period during which
historical data may be recoverable by Snowflake. This period is designed to protect against
accidental data loss and is not intended for customer access.

References: Understanding and viewing Fail-safe | Snowflake Documentation

Questions 17

Which command is used to unload data from a Snowflake table into a file in a stage?

Options:

A.

COPY INTO

B.

GET

C.

WRITE

D.

EXTRACT INTO

Answer:

A

Explanation:

The COPY INTO command is used in Snowflake to unload data from a table into a file in a stage.
This command allows for the export of data from Snowflake tables into flat files, which can then
be used for further analysis, processing, or storage in external systems.

Questions 18
Which command can be used to load data into an internal stage?

Options:

A.

LOAD

B.

COPY

C.

GET

D.

PUT

Answer:

D

Explanation:

The PUT command is used to load data into an internal stage in Snowflake. This command
uploads data files from a local file system to a named internal stage, making the data available for
subsequent loading into a Snowflake table using the COPY INTO command.
Questions 19
Which of the following conditions must be met in order to return results from the results cache?
(Select TWO).

Options:

A.

The user has the appropriate privileges on the objects associated with the query

B.

Micro-partitions have been reclustered since the query was last run

C.

The new query is run using the same virtual warehouse as the previous query

D.

The query includes a User Defined Function (UDF)

E.

The query has been run within 24 hours of the previously-run query

Answer:

A, E

Explanation:

To return results from the results cache in Snowflake, certain conditions must be met:

• Privileges: The user must have the appropriate privileges on the objects associated with
the query. This ensures that only authorized users can access cached data.
• Time Frame: The query must have been run within 24 hours of the previously-run query.
Snowflake’s results cache is designed to store the results of queries for a short period,
typically 24 hours, to improve performance for repeated queries.
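The two conditions above can be expressed as a single predicate (an illustrative sketch, not Snowflake's internal logic): the cached result is served only when the user holds privileges on every referenced object and the prior run is less than 24 hours old.

```python
# Sketch of the result-cache reuse conditions: appropriate privileges on the
# referenced objects, and a prior run within the 24-hour retention window.
RESULT_CACHE_TTL_HOURS = 24

def can_serve_from_result_cache(user_privileges, referenced_objects, hours_since_run):
    authorized = all(obj in user_privileges for obj in referenced_objects)
    fresh = hours_since_run < RESULT_CACHE_TTL_HOURS
    return authorized and fresh
```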

Questions 20
True or False: It is possible for a user to run a query against the query result cache without
requiring an active Warehouse.

Options:

A.

True

B.

False

Answer:

A

Explanation:

Snowflake’s architecture allows for the use of a query result cache that stores the results of
queries for a period of time. If the same query is run again and the underlying data has not
changed, Snowflake can retrieve the result from this cache without needing to re-run the query
on an active warehouse, thus saving on compute resources.

Questions 21

A company's security audit requires generating a report listing all Snowflake logins (e.g., date and
user) within the last 90 days. Which of the following statements will return the required
information?

Options:

A.

SELECT LAST_SUCCESS_LOGIN, LOGIN_NAME

FROM ACCOUNT_USAGE.USERS;

B.

SELECT EVENT_TIMESTAMP, USER_NAME

FROM table(information_schema.login_history_by_user())
C.

SELECT EVENT_TIMESTAMP, USER_NAME

FROM ACCOUNT_USAGE.ACCESS_HISTORY;

D.

SELECT EVENT_TIMESTAMP, USER_NAME

FROM ACCOUNT_USAGE.LOGIN_HISTORY;

Answer:

D

Explanation:

To generate a report listing all Snowflake logins within the last 90 days,
the ACCOUNT_USAGE.LOGIN_HISTORY view should be used. This view provides information
about login attempts, including successful and unsuccessful logins, and is suitable for security
audits4.

Questions 22

Which Snowflake partner specializes in data catalog solutions?

Options:

A.

Alation

B.

DataRobot

C.

dbt

D.

Tableau
Answer:

A

Explanation:

Alation is known for specializing in data catalog solutions and is a partner of Snowflake. Data
catalog solutions are essential for organizations to effectively manage their metadata and make it
easily accessible and understandable for users, which aligns with the capabilities provided by
Alation.

Questions 23

When is the result set cache no longer available? (Select TWO)

Options:

A.

When another warehouse is used to execute the query

B.

When another user executes the query

C.

When the underlying data has changed

D.

When the warehouse used to execute the query is suspended

E.

When it has been 24 hours since the last query

Answer:

C, E

Explanation:

The result set cache in Snowflake is invalidated and no longer available when the underlying data
of the query results has changed, ensuring that queries return the most current data. Additionally,
the cache expires after 24 hours to maintain the efficiency and accuracy of data retrieval.

Questions 24

Which of the following describes how multiple Snowflake accounts in a single organization relate
to various cloud providers?

Options:

A.

Each Snowflake account can be hosted in a different cloud vendor and region.

B.

Each Snowflake account must be hosted in a different cloud vendor and region

C.

All Snowflake accounts must be hosted in the same cloud vendor and region

D.

Each Snowflake account can be hosted in a different cloud vendor, but must be in the same
region.

Answer:

A

Explanation:

Snowflake’s architecture allows for flexibility in account hosting across different cloud vendors
and regions. This means that within a single organization, different Snowflake accounts can be set
up in various cloud environments, such as AWS, Azure, or GCP, and in different geographical
regions. This allows organizations to leverage the global infrastructure of multiple cloud providers
and optimize their data storage and computing needs based on regional requirements, data
sovereignty laws, and other considerations.

https://docs.snowflake.com/en/user-guide/intro-regions.html

Questions 25

What features does Snowflake Time Travel enable? (Select TWO).

Options:

A.

Querying data-related objects that were created within the past 365 days

B.

Restoring data-related objects that have been deleted within the past 90 days

C.

Conducting point-in-time analysis for BI reporting

D.

Analyzing data usage/manipulation over all periods of time

Answer:

B, C

Explanation:

Snowflake Time Travel is a powerful feature that allows users to access historical data within a
defined period. It enables two key capabilities:

• B. Restoring data-related objects that have been deleted within the past 90 days: Time
Travel can be used to restore tables, schemas, and databases that have been accidentally
or intentionally deleted within the Time Travel retention period.
• C. Conducting point-in-time analysis for BI reporting: It allows users to query historical
data as it appeared at a specific point in time within the Time Travel retention period,
which is crucial for business intelligence and reporting purposes.
While Time Travel does allow querying of past data, it is limited to the retention period set for the
Snowflake account, which is typically 1 day for standard accounts and can be extended up to 90
days for enterprise accounts. It does not enable querying or restoring objects created or deleted
beyond the retention period, nor does it provide analysis over all periods of time.

Questions 26

What is a limitation of a Materialized View?

Options:

A.

A Materialized View cannot support any aggregate functions

B.

A Materialized View can only reference up to two tables

C.

A Materialized View cannot be joined with other tables

D.

A Materialized View cannot be defined with a JOIN

Answer:

D

Explanation:

Materialized Views in Snowflake are designed to store the result of a query and can be refreshed
to maintain up-to-date data. However, they have certain limitations, one of which is that they
cannot be defined using a JOIN clause. This means that a Materialized View can only be created
based on a single source table and cannot combine data from multiple tables using JOIN
operations.

Questions 27

Which command can be used to list all the file formats for which a user has access privileges?
Options:

A.

LIST

B.

ALTER FILE FORMAT

C.

DESCRIBE FILE FORMAT

D.

SHOW FILE FORMATS

Answer:

D

Explanation:

The command to list all the file formats for which a user has access privileges in Snowflake
is SHOW FILE FORMATS. This command provides a list of all file formats defined in the user's
current session or specified database/schema, along with details such as the name, type, and
creation time of each file format. It is a valuable tool for users to understand and manage the file
formats available for data loading and unloading operations.

Questions 28

Which data types optimally store semi-structured data? (Select TWO).

Options:

A.

ARRAY

B.

CHARACTER
C.

STRING

D.

VARCHAR

E.

VARIANT

Answer:

A, E

Explanation:

In Snowflake, semi-structured data is optimally stored using specific data types that are designed
to handle the flexibility and complexity of such data. The VARIANT data type can store structured
and semi-structured data types, including JSON, Avro, ORC, Parquet, or XML, in a single column.
The ARRAY data type, on the other hand, is suitable for storing ordered sequences of elements,
which can be particularly useful for semi-structured data types like JSON arrays. These data types
provide the necessary flexibility to store and query semi-structured data efficiently in Snowflake.

Questions 29

Which function will provide the proxy information needed to protect Snowsight?

Options:

A.

SYSTEMADMIN_TAG

B.

SYSTEM$GET_PRIVATELINK

C.

SYSTEM$ALLOWLIST

D.
SYSTEM$AUTHORIZE

Answer:

B

Explanation:

The SYSTEM$GET_PRIVATELINK function in Snowflake provides proxy information necessary


for configuring PrivateLink connections, which can protect Snowsight as well as other Snowflake
services. PrivateLink enhances security by allowing Snowflake to be accessed via a private
connection within a cloud provider’s network, reducing exposure to the public internet.

Questions 30

Which view can be used to determine if a table has frequent row updates or deletes?

Options:

A.

TABLES

B.

TABLE_STORAGE_METRICS

C.

STORAGE_DAILY_HISTORY

D.

STORAGE_USAGE

Answer:

B

Explanation:

The TABLE_STORAGE_METRICS view can be used to determine if a table has frequent row
updates or deletes. This view provides detailed metrics on the storage utilization of tables within
Snowflake, including metrics that reflect the impact of DML operations such as updates and
deletes on table storage. For example, metrics related to the number of active and deleted rows
can help identify tables that experience high levels of row modifications, indicating frequent
updates or deletions.

Questions 31

Which use case does the search optimization service support?

Options:

A.

Disjuncts (OR) in join predicates

B.

LIKE/ILIKE/RLIKE join predicates

C.

Join predicates on VARIANT columns

D.

Conjunctions (AND) of multiple equality predicates

Answer:

D

Explanation:

The search optimization service in Snowflake supports use cases involving conjunctions (AND) of
multiple equality predicates. This service enhances the performance of queries that include
multiple equality conditions by utilizing search indexes to quickly filter data without scanning
entire tables or partitions. It's particularly beneficial for improving the response times of complex
queries that rely on specific data matching across multiple conditions.

Questions 32

Why would a Snowflake user decide to use a materialized view instead of a regular view?

Options:

A.

The base tables do not change frequently.

B.

The results of the view change often.

C.

The query is not resource intensive.

D.

The query results are not used frequently.

Answer:

A

Explanation:

A Snowflake user would decide to use a materialized view instead of a regular view primarily
when the base tables do not change frequently. Materialized views store the result of the view
query and update it as the underlying data changes, making them ideal for situations where the
data is relatively static and query performance is critical. By precomputing and storing the query
results, materialized views can significantly reduce query execution times for complex
aggregations, joins, and calculations.

References:

• Snowflake Documentation: Materialized Views

Questions 33
What is the only supported character set for loading and unloading data from all supported file
formats?

Options:

A.

UTF-8

B.

UTF-16

C.

ISO-8859-1

D.

WINDOWS-1253

Answer:

A

Explanation:

UTF-8 is the only supported character set for loading and unloading data from all supported file
formats in Snowflake. UTF-8 is a widely used encoding that supports a large range of characters
from various languages, making it suitable for internationalization and ensuring data compatibility
across different systems and platforms.
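A quick illustration of why a single encoding suffices: UTF-8 round-trips text from any script through the bytes that would be written to a data file.

```python
# UTF-8 can encode text from many scripts and decode it back unchanged,
# which is why one character set works for every supported file format.
samples = ["héllo", "データ", "snowflake"]
encoded = [s.encode("utf-8") for s in samples]   # bytes as written to a file
decoded = [b.decode("utf-8") for b in encoded]   # bytes as read back
```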

Questions 34

What happens when a network policy includes values that appear in both the allowed and
blocked IP address list?

Options:

A.

Those IP addresses are allowed access to the Snowflake account as Snowflake applies the allowed
IP address list first.
B.

Those IP addresses are denied access to the Snowflake account as Snowflake applies the blocked
IP address list first.

C.

Snowflake issues an alert message and adds the duplicate IP address values to both the allowed
and blocked IP address lists.

D.

Snowflake issues an error message and adds the duplicate IP address values to both the allowed
and blocked IP address lists.

Answer:

B

Explanation:

In Snowflake, when setting up a network policy that specifies both allowed and blocked IP
address lists, if an IP address appears in both lists, access from that IP address will be denied. The
reason is that Snowflake prioritizes security, and the presence of an IP address in the blocked list
indicates it should not be allowed regardless of its presence in the allowed list. This ensures that
access controls remain stringent and that any potentially unsafe IP addresses are not
inadvertently permitted access.
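The precedence rule can be sketched as a small function (an illustrative analog of the behavior, not Snowflake code): the blocked list is consulted first, so an IP appearing in both lists is denied.

```python
# Sketch of network-policy precedence: the blocked list is applied first,
# so an IP that appears in both lists is denied access.
def is_access_allowed(ip, allowed_ips, blocked_ips):
    if ip in blocked_ips:        # blocked list wins, even for duplicated entries
        return False
    return ip in allowed_ips
```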

Questions 35

If a virtual warehouse runs for 61 seconds, shut down, and then restart and runs for 30 seconds,
for how many seconds is it billed?

Options:

A.

60

B.

91
C.

120

D.

121

Answer:

D

Explanation:

Snowflake bills virtual warehouse usage per second, with a 60-second minimum each time the
warehouse starts or resumes. The first run of 61 seconds exceeds the minimum and is billed as
61 seconds. The second run of 30 seconds falls under the minimum and is billed as 60 seconds,
so the total billed time is 61 + 60 = 121 seconds.
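Snowflake's documented billing rule is per-second metering with a 60-second minimum applied each time a warehouse starts or resumes; under that rule the scenario above bills 121 seconds. A minimal sketch of the arithmetic:

```python
# Per-second billing with a 60-second minimum applied to each run
# (each warehouse start/resume).
MINIMUM_BILLED_SECONDS = 60

def billed_seconds(run_durations_seconds):
    return sum(max(MINIMUM_BILLED_SECONDS, d) for d in run_durations_seconds)

# A 61-second run bills 61s; a 30-second run bills the 60s minimum.
total = billed_seconds([61, 30])
```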

Questions 36

What are valid sub-clauses to the OVER clause for a window function? (Select TWO).

Options:

A.

GROUP BY

B.

LIMIT

C.

ORDER BY

D.

PARTITION BY

E.
UNION ALL

Answer:

C, D

Explanation:

Valid sub-clauses to the OVER clause for a window function in SQL are:

• C. ORDER BY: This clause specifies the order in which the rows in a partition are
processed by the window function. It is essential for functions that depend on the row
order, such as ranking functions.
• D. PARTITION BY: This clause divides the result set into partitions to which the window
function is applied. Each partition is processed independently of other partitions, making it
crucial for functions that compute values across sets of rows that share common
characteristics.

These clauses are fundamental to defining the scope and order of data over which the window
function operates, enabling complex analytical computations within SQL queries.
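The interplay of the two sub-clauses can be mimicked in plain Python (an illustrative analog, not SQL): rows are split into partitions, each partition is ordered, and a rank is assigned within the partition, much like RANK() OVER (PARTITION BY ... ORDER BY ...).

```python
from itertools import groupby

# Python analog of OVER (PARTITION BY region ORDER BY amount): partition the
# rows, order each partition, and number the rows within it.
def rank_over(rows, partition_key, order_key):
    ranked = []
    ordered = sorted(rows, key=lambda r: (r[partition_key], r[order_key]))
    for _, partition in groupby(ordered, key=lambda r: r[partition_key]):
        for rank, row in enumerate(partition, start=1):   # ORDER BY drives the rank
            ranked.append({**row, "rank": rank})
    return ranked

sales = [
    {"region": "EU", "amount": 50},
    {"region": "EU", "amount": 20},
    {"region": "US", "amount": 70},
]
```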

Questions 37

Which SQL command can be used to verify the privileges that are granted to a role?

Options:

A.

SHOW GRANTS ON ROLE

B.

SHOW ROLES

C.

SHOW GRANTS TO ROLE

D.

SHOW GRANTS FOR ROLE


Answer:

C

Explanation:

To verify the privileges that have been granted to a specific role in Snowflake, the correct SQL
command is SHOW GRANTS TO ROLE <Role Name>. This command lists all the privileges
granted to the specified role, including access to schemas, tables, and other database objects. This
is a useful command for administrators and users with sufficient privileges to audit and manage
role permissions within the Snowflake environment.

Questions 38

Which Snowflake layer is associated with virtual warehouses?

Options:

A.

Cloud services

B.

Query processing

C.

Elastic memory

D.

Database storage

Answer:

B

Explanation:
The layer of Snowflake's architecture associated with virtual warehouses is the Query Processing
layer. Virtual warehouses in Snowflake are dedicated compute clusters that execute SQL queries
against the stored data. This layer is responsible for the entire query execution process, including
parsing, optimization, and the actual computation. It operates independently of the storage layer,
enabling Snowflake to scale compute and storage resources separately for efficiency and cost-
effectiveness.

References:

• Snowflake Documentation: Snowflake Architecture

Questions 39

What are characteristics of transient tables in Snowflake? (Select TWO).

Options:

A.

Transient tables have a Fail-safe period of 7 days.

B.

Transient tables can be cloned to permanent tables.

C.

Transient tables persist until they are explicitly dropped.

D.

Transient tables can be altered to make them permanent tables.

E.

Transient tables have Time Travel retention periods of 0 or 1 day.

Answer:

B, C

Explanation:
Transient tables in Snowflake are designed for temporary or intermediate workloads with the
following characteristics:

• B. Transient tables can be cloned to permanent tables: This feature allows users to create
copies of transient tables for permanent use, providing flexibility in managing data
lifecycles.
• C. Transient tables persist until they are explicitly dropped: Unlike temporary tables that
exist for the duration of a session, transient tables remain in the database until explicitly
removed by a user, offering more durability for short-term data storage needs.

References:

• Snowflake Documentation: Transient Tables

Questions 40

Which function is used to convert rows in a relational table to a single VARIANT column?

Options:

A.

ARRAY_AGG

B.

OBJECT_AGG

C.

ARRAY_CONSTRUCT

D.

OBJECT_CONSTRUCT

Answer:

D

Explanation:
The OBJECT_CONSTRUCT function in Snowflake is used to convert rows in a relational table
into a single VARIANT column that represents each row as a JSON object. This function
dynamically creates a JSON object from a list of key-value pairs, where each key is a column
name and each value is the corresponding column value for a row. This is particularly useful for
aggregating and transforming structured data into semi-structured JSON format for further
processing or analysis.

References:

• Snowflake Documentation: Semi-structured Data Functions
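The shape of the result can be illustrated with a plain Python analog (not Snowflake code): each row becomes one JSON object keyed by column name, which is what a VARIANT column would hold.

```python
import json

# Python analog of OBJECT_CONSTRUCT: pair each column name with its row
# value and serialize the result as a single JSON object.
def object_construct(column_names, row_values):
    return json.dumps(dict(zip(column_names, row_values)), sort_keys=True)

doc = object_construct(["id", "name"], [1, "Ada"])
```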

Questions 41

Which Snowflake object does not consume any storage costs?

Options:

A.

Secure view

B.

Materialized view

C.

Temporary table

D.

Transient table

Answer:

A

Explanation:

A secure view does not consume storage because a view stores only its query definition, not data.
Temporary and transient tables contribute to storage charges for as long as they exist, and a
materialized view consumes storage for its precomputed results.

References:

• Snowflake Documentation: Overview of Views

Questions 42

When floating-point number columns are unloaded to CSV or JSON files, Snowflake truncates the
values to approximately what?

Options:

A.

(12,2)

B.

(10,4)

C.

(14,8)

D.

(15,9)

Answer:

D

Explanation:

When unloading floating-point number columns to CSV or JSON files, Snowflake truncates the
values to approximately 15 significant digits with 9 digits following the decimal point, which can
be represented as (15,9). This ensures a balance between accuracy and efficiency in representing
floating-point numbers in text-based formats, which is essential for data interchange and
processing applications that consume these files.

References:
• Snowflake Documentation: Data Unloading Considerations
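A hedged sketch of the (15,9) shape described above, using nine digits after the decimal point (Snowflake's exact rounding behavior at the margins may differ):

```python
# Illustrative formatter: write a float with nine digits after the decimal
# point, approximating the (15,9) truncation applied when unloading.
def unload_repr(value):
    return f"{value:.9f}"
```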

Questions 43

What information does the Query Profile provide?

Options:

A.

Graphical representation of the data model

B.

Statistics for each component of the processing plan

C.

Detailed information about the database schema

D.

Real-time monitoring of the database operations

Answer:

B

Explanation:

The Query Profile in Snowflake provides a graphical representation and statistics for each
component of the query's execution plan. This includes details such as the execution time, the
number of rows processed, and the amount of data scanned for each operation within the query.
The Query Profile is a crucial tool for understanding and optimizing the performance of queries,
as it helps identify potential bottlenecks and inefficiencies.

References:

• Snowflake Documentation: Understanding the Query Profile

Questions 44

What are characteristics of reader accounts in Snowflake? (Select TWO).


Options:

A.

Reader account users cannot add new data to the account.

B.

Reader account users can share data to other reader accounts.

C.

A single reader account can consume data from multiple provider accounts.

D.

Data consumers are responsible for reader account setup and data usage costs.

E.

Reader accounts enable data consumers to access and query data shared by the provider.

Answer:

A, E

Explanation:

Characteristics of reader accounts in Snowflake include:

• A. Reader account users cannot add new data to the account: Reader accounts are
intended for data consumption only. Users of these accounts can query and analyze the
data shared with them but cannot upload or add new data to the account.
• E. Reader accounts enable data consumers to access and query data shared by the
provider: One of the primary purposes of reader accounts is to allow data consumers to
access and perform queries on the data shared by another Snowflake account, facilitating
secure and controlled data sharing.

References:

• Snowflake Documentation: Reader Accounts


Questions 45

What is it called when a customer managed key is combined with a Snowflake managed key to
create a composite key for encryption?

Options:

A.

Hierarchical key model

B.

Client-side encryption

C.

Tri-secret secure encryption

D.

Key pair authentication

Answer:

C

Explanation:

Tri-secret secure encryption is a security model employed by Snowflake that involves combining a
customer-managed key with a Snowflake-managed key to create a composite key for encrypting
data. This model enhances data security by requiring both the customer-managed key and the
Snowflake-managed key to decrypt data, thus ensuring that neither party can access the data
independently. It represents a balanced approach to key management, leveraging both customer
control and Snowflake's managed services for robust data encryption.

References:

• Snowflake Documentation: Encryption and Key Management

Questions 46

Which file function generates a Snowflake-hosted URL that must be authenticated when used?
Options:

A.

GET_STAGE_LOCATION

B.

GET_PRESIGNED_URL

C.

BUILD_SCOPED_FILE_URL

D.

BUILD_STAGE_FILE_URL

Answer:

D

Explanation:

The BUILD_STAGE_FILE_URL function generates a Snowflake-hosted file URL that points to a
file in a stage (internal or external). Unlike a pre-signed URL, this URL requires the caller to
authenticate to Snowflake and to hold sufficient privileges on the stage before the file can be
accessed.

References:

• Snowflake Documentation: BUILD_STAGE_FILE_URL (https://docs.snowflake.com/en/sql-reference/functions/build_stage_file_url.html)

Questions 47

There are two Snowflake accounts in the same cloud provider region: one is production and the
other is non-production. How can data be easily transferred from the production account to the
non-production account?

Options:

A.
Clone the data from the production account to the non-production account.

B.

Create a data share from the production account to the non-production account.

C.

Create a subscription in the production account and have it publish to the non-production
account.

D.

Create a reader account using the production account and link the reader account to the non-
production account.

Answer:

B

Explanation:

To easily transfer data from a production account to a non-production account in Snowflake


within the same cloud provider region, creating a data share is the most efficient approach. Data
sharing allows for live, read-only access to selected data objects from the production account to
the non-production account without the need to duplicate or move the actual data. This method
facilitates seamless access to the data for development, testing, or analytics purposes in the non-
production environment.

References:

• Snowflake Documentation: Data Sharing

Questions 48

Which command removes a role from another role or a user in Snowflake?

Options:

A.

ALTER ROLE
B.

REVOKE ROLE

C.

USE ROLE

D.

USE SECONDARY ROLES

Answer:

B

Explanation:

The REVOKE ROLE command is used to remove a role from another role or a user in Snowflake.
This command is part of Snowflake's role-based access control system, allowing administrators to
manage permissions and access to database objects efficiently by adding or removing roles from
users or other roles.

References:

• Snowflake Documentation: REVOKE ROLE
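As a minimal sketch of the grant/revoke lifecycle (the role and user names here are hypothetical):

```sql
-- Grant a functional role to a user and to a parent role
GRANT ROLE analyst TO USER jsmith;
GRANT ROLE analyst TO ROLE sysadmin;

-- REVOKE ROLE removes the role again, from either a user or a role
REVOKE ROLE analyst FROM USER jsmith;
REVOKE ROLE analyst FROM ROLE sysadmin;
```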

Questions 49

How does a Snowflake stored procedure compare to a User-Defined Function (UDF)?

Options:

A.

A single executable statement can call only two stored procedures. In contrast, a single SQL
statement can call multiple UDFs.

B.

A single executable statement can call only one stored procedure. In contrast, a single SQL
statement can call multiple UDFs.
C.

A single executable statement can call multiple stored procedures. In contrast, multiple SQL
statements can call the same UDFs.

D.

Multiple executable statements can call more than one stored procedure. In contrast, a single SQL
statement can call multiple UDFs.

Answer:

B

Explanation:

In Snowflake, stored procedures and User-Defined Functions (UDFs) have different invocation
patterns within SQL:

• Option B is correct: A single executable statement can call only one stored procedure due
to the procedural and potentially transactional nature of stored procedures. In contrast, a
single SQL statement can call multiple UDFs because UDFs are designed to operate more
like functions in traditional programming, where they return a value and can be embedded
within SQL queries.References: Snowflake documentation comparing the operational
differences between stored procedures and UDFs.

Questions 50

Regardless of which notation is used, what are considerations for writing the column name and
element names when traversing semi-structured data?

Options:

A.

The column name and element names are both case-sensitive.

B.

The column name and element names are both case-insensitive.


C.

The column name is case-sensitive but element names are case-insensitive.

D.

The column name is case-insensitive but element names are case-sensitive.

Answer:

D

Explanation:

When querying semi-structured data in Snowflake, the behavior towards case sensitivity is
distinct between column names and the names of elements within the semi-structured data.
Column names follow the general SQL norm of being case-insensitive, meaning you can reference
them in any case without affecting the query. However, element names within JSON, XML, or
other semi-structured data are case-sensitive. This distinction is crucial for accurate data retrieval
and manipulation in Snowflake, especially when working with JSON objects where the case of
keys can significantly alter the outcome of queries.

References:

• Snowflake Documentation: Querying Semi-structured Data
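A minimal illustration of this behavior (table, column, and JSON keys are hypothetical): the VARIANT column name can be written in any case, but the element name must match the JSON key exactly.

```sql
CREATE OR REPLACE TABLE car_sales (src VARIANT);
INSERT INTO car_sales
  SELECT PARSE_JSON('{"Make": "Honda", "model": "Civic"}');

-- SRC / src / Src all resolve to the same column (case-insensitive),
-- but the element name is case-sensitive:
SELECT SRC:"Make" FROM car_sales;  -- returns "Honda"
SELECT src:model  FROM car_sales;  -- returns "Civic"
SELECT src:make   FROM car_sales;  -- returns NULL (wrong case for key "Make")
```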

Questions 51

While clustering a table, columns with which data types can be used as clustering keys? (Select
TWO).

Options:

A.

BINARY

B.

GEOGRAPHY

C.
GEOMETRY

D.

OBJECT

E.

VARIANT

Answer:

A, C

Explanation:

A clustering key can be defined when a table is created by appending a CLUSTER BY clause to the CREATE TABLE statement. Each clustering key consists of one or more table columns/expressions, which can be of any data type except GEOGRAPHY, VARIANT, OBJECT, or ARRAY. https://docs.snowflake.com/en/user-guide/tables-clustering-keys
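As a sketch of both forms described above (table and column names are hypothetical):

```sql
-- Define a clustering key at table creation
CREATE TABLE sales (
  sale_date DATE,
  region    VARCHAR,
  amount    NUMBER
)
CLUSTER BY (sale_date, region);

-- Or add/change the clustering key later
ALTER TABLE sales CLUSTER BY (region);
```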

Questions 52

A Snowflake user is writing a User-Defined Function (UDF) that includes some unqualified object
names.

How will those object names be resolved during execution?

Options:

A.

Snowflake will resolve them according to the SEARCH_PATH parameter.

B.

Snowflake will only check the schema the UDF belongs to.

C.

Snowflake will first check the current schema, and then the schema the previous query used

D.
Snowflake will first check the current schema, and them the PUBLIC schema of the current
database.

Answer:

B

Explanation:

• Object Name Resolution: When unqualified object names (e.g., a table name without a schema) are used in a UDF body, Snowflake resolves them in the schema that contains the UDF.
• Note: The SEARCH_PATH parameter influences object resolution for queries, not within UDFs.

References:

• Snowflake Documentation (Object Naming


Resolution): https://docs.snowflake.com/en/sql-reference/name-resolution.html

Questions 53

Which function should be used to insert JSON formatted string data into a VARIANT field?

Options:

A.

FLATTEN

B.

CHECK_JSON

C.

PARSE_JSON

D.

TO_VARIANT
Answer:

C

Explanation:

To insert JSON formatted string data into a VARIANT field in Snowflake, the correct function to
use is PARSE_JSON. The PARSE_JSON function is specifically designed to interpret a JSON
formatted string and convert it into a VARIANT type, which is Snowflake's flexible format for
handling semi-structured data like JSON, XML, and Avro. This function is essential for loading and
querying JSON data within Snowflake, allowing users to store and manage JSON data efficiently
while preserving its structure for querying purposes. This function's usage and capabilities are
detailed in the Snowflake documentation, providing users with guidance on how to handle semi-
structured data effectively within their Snowflake environments.

References:

• Snowflake Documentation: PARSE_JSON
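A minimal sketch of the pattern (table and keys are hypothetical):

```sql
CREATE OR REPLACE TABLE events (payload VARIANT);

-- PARSE_JSON converts a JSON-formatted string into a VARIANT value
INSERT INTO events
  SELECT PARSE_JSON('{"event": "login", "user_id": 42}');

-- The stored elements can then be queried with path notation and casts
SELECT payload:event::STRING, payload:user_id::NUMBER FROM events;
```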

Questions 54

What is the MAXIMUM number of clusters that can be provisioned with a multi-cluster virtual
warehouse?

Options:

A.

B.

C.

10

D.

100
Answer:

C

Explanation:

In Snowflake, the maximum number of clusters that can be provisioned within a multi-cluster
virtual warehouse is 10. This allows for significant scalability and performance management by
enabling Snowflake to handle varying levels of query load by adjusting the number of active
clusters within the warehouse.References: Snowflake documentation on virtual warehouses,
particularly the scalability options available in multi-cluster configurations.

Questions 55

Which data formats are supported by Snowflake when unloading semi-structured data? (Select
TWO).

Options:

A.

Binary file in Avro

B.

Binary file in Parquet

C.

Comma-separated JSON

D.

Newline Delimited JSON

E.

Plain text file containing XML elements

Answer:

B, D

Explanation:

Snowflake supports a variety of file formats for unloading semi-structured data, among which
Parquet and Newline Delimited JSON (NDJSON) are two widely used formats.

• B. Binary file in Parquet: Parquet is a columnar storage file format optimized for large-scale
data processing and analysis. It is especially suited for complex nested data structures.
• D. Newline Delimited JSON (NDJSON): This format represents JSON records separated by
newline characters, facilitating the storage and processing of multiple, separate JSON
objects in a single file.

These formats are chosen for their efficiency and compatibility with data analytics tools and
ecosystems, enabling seamless integration and processing of exported data.

References:

• Snowflake Documentation: Data Unloading
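As a sketch of unloading to each of the two supported formats (stage and table names are hypothetical):

```sql
-- Unload to binary Parquet files
COPY INTO @my_stage/parquet/
  FROM my_table
  FILE_FORMAT = (TYPE = PARQUET);

-- Unload to newline-delimited JSON: each row becomes one JSON object per line
COPY INTO @my_stage/json/
  FROM (SELECT OBJECT_CONSTRUCT(*) FROM my_table)
  FILE_FORMAT = (TYPE = JSON);
```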

Questions 56

When using the ALLOW_CLIENT_MFA_CACHING parameter, how long is a cached Multi-Factor Authentication (MFA) token valid for?

Options:

A.

1 hour

B.

2 hours

C.

4 hours

D.

8 hours
Answer:

C

Explanation:

A cached MFA token is valid for up to four hours. https://docs.snowflake.com/en/user-guide/security-mfa#using-mfa-token-caching-to-minimize-the-number-of-prompts-during-authentication-optional

Questions 57

Which URL provides access to files in Snowflake without authorization?

Options:

A.

File URL

B.

Scoped URL

C.

Pre-signed URL

D.

Scoped file URL

Answer:

C

Explanation:
A Pre-signed URL provides access to files stored in Snowflake without requiring authorization at
the time of access. This feature allows users to generate a URL with a limited validity period that
grants temporary access to a file in a secure manner. It's particularly useful for sharing data with
external parties or applications without the need for them to authenticate directly with
Snowflake.

References:

• Snowflake Documentation: Using Pre-signed URLs

Questions 58

Which of the following statements apply to Snowflake in terms of security? (Choose two.)

Options:

A.

Snowflake leverages a Role-Based Access Control (RBAC) model.

B.

Snowflake requires a user to configure an IAM user to connect to the database.

C.

All data in Snowflake is encrypted.

D.

Snowflake can run within a user's own Virtual Private Cloud (VPC).

E.

All data in Snowflake is compressed.

Answer:

A, C

Explanation:
Snowflake uses a Role-Based Access Control (RBAC) model to manage access to data and
resources. Additionally, Snowflake ensures that all data is encrypted, both at rest and in transit, to
provide a high level of security for data stored within the platform. References: [COF-C02]
SnowPro Core Certification Exam Study Guide

Questions 59

Which services does the Snowflake Cloud Services layer manage? (Choose two.)

Options:

A.

Compute resources

B.

Query execution

C.

Authentication

D.

Data storage

E.

Metadata

Answer:

C, E

Explanation:

The Snowflake Cloud Services layer manages various services, including authentication and
metadata management. This layer ties together all the different components of Snowflake to
process user requests, manage sessions, and control access3.

Questions 60
Which Snowflake SQL statement would be used to determine which users and roles have access
to a role called MY_ROLE?

Options:

A.

SHOW GRANTS OF ROLE MY_ROLE

B.

SHOW GRANTS TO ROLE MY_ROLE

C.

SHOW GRANTS FOR ROLE MY_ROLE

D.

SHOW GRANTS ON ROLE MY_ROLE

Answer:

A

Explanation:

The SQL statement SHOW GRANTS OF ROLE MY_ROLE lists all the users and roles to which MY_ROLE has been granted, which shows who has access to the role. By contrast, SHOW GRANTS TO ROLE MY_ROLE lists the privileges and roles granted to MY_ROLE. References: [COF-C02] SnowPro Core Certification Exam Study Guide
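Per the Snowflake documentation, the two related SHOW GRANTS variants answer opposite questions:

```sql
-- Lists the users and roles to which MY_ROLE has been granted (its grantees)
SHOW GRANTS OF ROLE MY_ROLE;

-- Lists the privileges and roles granted TO MY_ROLE (what the role can do)
SHOW GRANTS TO ROLE MY_ROLE;
```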

Questions 61

What is the SNOWFLAKE.ACCOUNT_USAGE view that contains information about which objects were read by queries within the last 365 days (1 year)?

Options:

A.

VIEWS_HISTORY

B.
OBJECT_HISTORY

C.

ACCESS_HISTORY

D.

LOGIN_HISTORY

Answer:

C

Explanation:

The ACCESS_HISTORY view in the SNOWFLAKE.ACCOUNT_USAGE schema contains information about the access history of Snowflake objects, such as tables and views, within the last 365 days1.

Questions 62

What is the following SQL command used for?

Select * from table(validate(t1, job_id => '_last'));

Options:

A.

To validate external table files in table t1 across all sessions

B.

To validate task SQL statements against table t1 in the last 14 days

C.

To validate a file for errors before it gets executed using a COPY command

D.

To return errors from the last executed COPY command into table t1 in the current session

Answer:

D

Explanation:

The SQL command Select * from table(validate(t1, job_id => '_last')); is used to return errors from
the last executed COPY command into table t1 in the current session. It checks the results of the
most recent data load operation and provides details on any errors that occurred during that
process1.
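A sketch of the pattern (stage, table, and file format here are hypothetical):

```sql
-- Load data, then inspect any errors from that load
COPY INTO t1 FROM @my_stage/data/ FILE_FORMAT = (TYPE = CSV);

-- '_last' refers to the most recent COPY INTO executed in the current session
SELECT * FROM TABLE(VALIDATE(t1, JOB_ID => '_last'));
```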

Questions 63

By default, which Snowflake role is required to create a share?

Options:

A.

ORGADMIN

B.

SECURITYADMIN

C.

SHAREADMIN

D.

ACCOUNTADMIN

Answer:

D

Explanation:
By default, the Snowflake role required to create a share is ACCOUNTADMIN (D). This role has
the necessary privileges to perform administrative tasks, including creating shares for data sharing
purposes

Questions 64

Why does Snowflake recommend file sizes of 100-250 MB compressed when loading data?

Options:

A.

Optimizes the virtual warehouse size and multi-cluster setting to economy mode

B.

Allows a user to import the files in a sequential order

C.

Increases the latency staging and accuracy when loading the data

D.

Allows optimization of parallel operations

Answer:

D

Explanation:

Snowflake recommends file sizes between 100-250 MB compressed when loading data to
optimize parallel processing. Smaller, compressed files can be loaded in parallel, which maximizes
the efficiency of the virtual warehouses and speeds up the data loading process

Questions 65

Snowflake supports the use of external stages with which cloud platforms? (Choose three.)

Options:

A.
Amazon Web Services

B.

Docker

C.

IBM Cloud

D.

Microsoft Azure Cloud

E.

Google Cloud Platform

F.

Oracle Cloud

Answer:

A, D, E

Explanation:

Snowflake supports the use of external stages with Amazon Web Services (AWS), Microsoft
Azure Cloud, and Google Cloud Platform (GCP). These platforms allow users to stage data
externally and integrate with Snowflake for data loading operations

Questions 66

Which columns are part of the result set of the Snowflake LATERAL FLATTEN command?
(Choose two.)

Options:

A.

CONTENT

B.
PATH

C.

BYTE_SIZE

D.

INDEX

E.

DATATYPE

Answer:

B, D

Explanation:

The LATERAL FLATTEN command in Snowflake produces a result set that includes several columns, among which PATH and INDEX are included. PATH indicates the path to the element within a data structure that needs to be flattened, and INDEX represents the index of the element if it is an array2.

Questions 67

A virtual warehouse is created using the following command:

Create warehouse my_WH with

warehouse_size = MEDIUM

min_cluster_count = 1

max_cluster_count = 1

auto_suspend = 60

auto_resume = true;

The image below is a graphical representation of the warehouse utilization across two days.
What action should be taken to address this situation?

Options:

A.

Increase the warehouse size from Medium to 2XL.

B.

Increase the value for the parameter MAX_CONCURRENCY_LEVEL.

C.

Configure the warehouse to a multi-cluster warehouse.

D.

Lower the value of the parameter STATEMENT_QUEUED_TIMEOUT_IN_SECONDS.

Answer:

C

Explanation:

The graphical representation of warehouse utilization indicates periods of significant queuing, suggesting that the current single cluster cannot efficiently handle all incoming queries. Configuring the warehouse to a multi-cluster warehouse will distribute the load among multiple clusters, reducing queuing times and improving overall performance1.

References = Snowflake Documentation on Multi-cluster Warehouses1
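Converting the warehouse from the question into a multi-cluster warehouse could be sketched as follows (Enterprise edition or higher is required for multi-cluster warehouses; the cluster counts are illustrative):

```sql
-- With MIN < MAX the warehouse runs in auto-scale mode:
-- additional clusters start and stop automatically with query load
ALTER WAREHOUSE my_WH SET
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 3;
```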

Questions 68

Network policies can be set at which Snowflake levels? (Choose two.)

Options:

A.

Role

B.

Schema

C.

User

D.

Database

E.

Account

F.

Tables

Answer:

C, E

Explanation:

Network policies in Snowflake can be set at the user level and at the account level2.

Reference: https://docs.snowflake.com/en/user-guide/network-policies.html#creating-network-policies
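A sketch of creating a policy and applying it at each of the two supported levels (policy name, user name, and CIDR ranges are hypothetical):

```sql
CREATE NETWORK POLICY corp_policy
  ALLOWED_IP_LIST = ('192.168.1.0/24')
  BLOCKED_IP_LIST = ('192.168.1.99');

-- Apply at the account level...
ALTER ACCOUNT SET NETWORK_POLICY = corp_policy;

-- ...or at the user level (a user-level policy overrides the account-level one)
ALTER USER jsmith SET NETWORK_POLICY = corp_policy;
```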

Questions 69

When loading data into Snowflake via Snowpipe what is the compressed file size
recommendation?

Options:

A.

10-50 MB

B.

100-250 MB

C.

300-500 MB

D.

1000-1500 MB

Answer:

B

Explanation:

For loading data into Snowflake via Snowpipe, the recommended compressed file size is between
100-250 MB. This size range is optimal for balancing the performance of parallel processing and
minimizing the overhead associated with handling many small files2.

Questions 70
Which minimum Snowflake edition allows for a dedicated metadata store?

Options:

A.

Standard

B.

Enterprise

C.

Business Critical

D.

Virtual Private Snowflake

Answer:

D

Explanation:

Virtual Private Snowflake (VPS) is the minimum edition that provides a dedicated metadata store. VPS offers a completely separate Snowflake environment, isolated from all other Snowflake accounts, including an isolated metadata store.

Reference: https://docs.snowflake.com/en/user-guide/intro-editions.html

Questions 71

What impacts the credit consumption of maintaining a materialized view? (Choose two.)

Options:

A.

Whether or not it is also a secure view

B.

How often the underlying base table is queried


C.

How often the base table changes

D.

Whether the materialized view has a cluster key defined

E.

How often the materialized view is queried

Answer:

C, D

Explanation:

The credit consumption for maintaining a materialized view is impacted by how often the base table changes (C) and whether the materialized view has a cluster key defined (D). Changes to the base table can trigger a refresh of the materialized view, consuming credits. Additionally, having a cluster key defined affects the work performed, and therefore the credit usage, during the materialized view's maintenance. References: SnowPro Core Certification materialized view credit consumption

Questions 72

How many days is load history for Snowpipe retained?

Options:

A.

1 day

B.

7 days

C.

14 days

D.
64 days

Answer:

C

Explanation:

Snowpipe retains load history for 14 days. This allows users to view and audit the data that has
been loaded into Snowflake using Snowpipe within this time frame3.

Questions 73

In a Snowflake role hierarchy, what is the top-level role?

Options:

A.

SYSADMIN

B.

ORGADMIN

C.

ACCOUNTADMIN

D.

SECURITYADMIN

Answer:

C

Explanation:
In a Snowflake role hierarchy, the top-level role is ACCOUNTADMIN. This role has the highest
level of privileges and is capable of performing all administrative functions within the Snowflake
account

Questions 74

What types of data listings are available in the Snowflake Data Marketplace? (Choose two.)

Options:

A.

Reader

B.

Consumer

C.

Vendor

D.

Standard

E.

Personalized

Answer:

D, E

Explanation:

In the Snowflake Data Marketplace, the types of data listings available are Standard and Personalized. Standard listings provide instant access to the published data, while Personalized listings are customized data offerings that consumers must request from the provider.

Questions 75

Which of the following accurately describes shares?


Options:

A.

Tables, secure views, and secure UDFs can be shared

B.

Shares can be shared

C.

Data consumers can clone a new table from a share

D.

Access to a share cannot be revoked once granted

Answer:

A

Explanation:

Shares in Snowflake are named objects that encapsulate all the information required to share
databases, schemas, tables, secure views, and secure UDFs. These objects can be added to a
share by granting privileges on them to the share via a database role

Questions 76

Which of the following describes the Snowflake Cloud Services layer?

Options:

A.

Coordinates activities in the Snowflake account

B.

Executes queries submitted by the Snowflake account users

C.

Manages quotas on the Snowflake account storage


D.

Manages the virtual warehouse cache to speed up queries

Answer:

A

Explanation:

The Snowflake Cloud Services layer is a collection of services that coordinate activities across
Snowflake, tying together all the different components to process user requests, from login to
query dispatch1.

References = [COF-C02] SnowPro Core Certification Exam Study Guide, Snowflake Documentation1

Questions 77

What features that are part of the Continuous Data Protection (CDP) feature set in Snowflake do
not require additional configuration? (Choose two.)

Options:

A.

Row level access policies

B.

Data masking policies

C.

Data encryption

D.

Time Travel

E.

External tokenization
Answer:

C, D

Explanation:

Data encryption and Time Travel are part of Snowflake’s Continuous Data Protection (CDP)
feature set that do not require additional configuration. Data encryption is automatically applied
to all files stored on internal stages, and Time Travel allows for querying and restoring data
without any extra setup

Questions 78

Which statement is true about running tasks in Snowflake?

Options:

A.

A task can be called using a CALL statement to run a set of predefined SQL commands.

B.

A task allows a user to execute a single SQL statement/command using a predefined schedule.

C.

A task allows a user to execute a set of SQL commands on a predefined schedule.

D.

A task can be executed using a SELECT statement to run a predefined SQL command.

Answer:

B

Explanation:
In Snowflake, a task allows a user to execute a single SQL statement/command using a
predefined schedule (B). Tasks are used to automate the execution of SQL statements at
scheduled intervals.

Questions 79

What is the purpose of multi-cluster virtual warehouses?

Options:

A.

To create separate data warehouses to increase query optimization

B.

To allow users the ability to choose the type of compute nodes that make up a virtual warehouse
cluster

C.

To eliminate or reduce Queuing of concurrent queries

D.

To allow the warehouse to resize automatically

Answer:

C

Explanation:

Multi-cluster virtual warehouses in Snowflake are designed to manage user and query
concurrency needs. They allow for the allocation of additional clusters of compute resources,
either statically or dynamically, to handle increased loads and reduce or eliminate the queuing of
concurrent queries2.

https://docs.snowflake.com/en/user-guide/warehouses-
multicluster.html#:~:text=Multi%2Dcluster%20warehouses%20enable%20you,during%20peak%2
0and%20off%20hours .

Questions 80
When publishing a Snowflake Data Marketplace listing into a remote region what should be taken
into consideration? (Choose two.)

Options:

A.

There is no need to have a Snowflake account in the target region, a share will be created for each
user.

B.

The listing is replicated into all selected regions automatically, the data is not.

C.

The user must have the ORGADMIN role available in at least one account to link accounts for
replication.

D.

Shares attached to listings in remote regions can be viewed from any account in an organization.

E.

For a standard listing the user can wait until the first customer requests the data before
replicating it to the target region.

Answer:

B, C

Explanation:

When publishing a Snowflake Data Marketplace listing into a remote region, it’s important to note
that while the listing is replicated into all selected regions automatically, the data itself is not.
Therefore, the data must be replicated separately. Additionally, the user must have the
ORGADMIN role in at least one account to manage the replication of accounts1.

Questions 81

What is true about sharing data in Snowflake? (Choose two.)


Options:

A.

The Data Consumer pays for data storage as well as for data computing.

B.

The shared data is copied into the Data Consumer account, so the Consumer can modify it
without impacting the base data of the Provider.

C.

A Snowflake account can both provide and consume shared data.

D.

The Provider is charged for compute resources used by the Data Consumer to query the shared
data.

E.

The Data Consumer pays only for compute resources to query the shared data.

Answer:

C, E

Explanation:

In Snowflake’s data sharing model, any full Snowflake account can both provide and consume
shared data. Additionally, the data consumer pays only for the compute resources used to query
the shared data. No actual data is copied or transferred between accounts, and shared data does
not take up any storage in a consumer account, so the consumer does not pay for data storage1.

References = Introduction to Secure Data Sharing | Snowflake Documentation

Questions 82

What is the minimum Snowflake edition required to use Dynamic Data Masking?

Options:

A.
Standard

B.

Enterprise

C.

Business Critical

D.

Virtual Private Snowflake (VPC)

Answer:

B

Explanation:

The minimum Snowflake edition required to use Dynamic Data Masking is the Enterprise
edition. This feature is not available in the Standard edition2.

Questions 83

Which Snowflake architectural layer is responsible for a query execution plan?

Options:

A.

Compute

B.

Data storage

C.

Cloud services

D.

Cloud provider
Answer:

C

Explanation:

In Snowflake’s architecture, the Cloud Services layer is responsible for generating the query
execution plan. This layer handles all the coordination, optimization, and management tasks,
including query parsing, optimization, and compilation into an execution plan that can be
processed by the Compute layer.

Questions 84

What COPY INTO SQL command should be used to unload data into multiple files?

Options:

A.

SINGLE=TRUE

B.

MULTIPLE=TRUE

C.

MULTIPLE=FALSE

D.

SINGLE=FALSE

Answer:

D

Explanation:
The COPY INTO SQL command with the option SINGLE=FALSE is used to unload data into
multiple files. This option allows the data to be split into multiple files during the unload
process. References: SnowPro Core Certification COPY INTO SQL command unload multiple files
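A sketch of a multi-file unload (stage, table, and the byte cap are hypothetical):

```sql
-- SINGLE = FALSE (the default) splits the unloaded data across multiple files;
-- MAX_FILE_SIZE caps the size of each output file, in bytes
COPY INTO @my_stage/unload/
  FROM my_table
  FILE_FORMAT = (TYPE = CSV)
  SINGLE = FALSE
  MAX_FILE_SIZE = 250000000;
```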

Questions 85

Which Snowflake feature allows a user to substitute a randomly generated identifier for sensitive data, in order to prevent unauthorized users from accessing the data, before loading it into Snowflake?

Options:

A.

External Tokenization

B.

External Tables

C.

Materialized Views

D.

User-Defined Table Functions (UDTF)

Answer:

A

Explanation:

The feature in Snowflake that allows a user to substitute a randomly generated identifier for
sensitive data before loading it into Snowflake is known as External Tokenization. This process
helps to secure sensitive data by ensuring that it is not exposed in its original form, thus
preventing unauthorized access3.

Questions 86

How should a virtual warehouse be configured if a user wants to ensure that additional multi-
clusters are resumed with no delay?
Options:

A.

Configure the warehouse to a size larger than generally required

B.

Set the minimum and maximum clusters to autoscale

C.

Use the standard warehouse scaling policy

D.

Use the economy warehouse scaling policy

Answer:

C

Explanation:

With a multi-cluster warehouse, the scaling policy determines how quickly additional clusters are resumed. The Standard scaling policy starts additional clusters as soon as queuing is detected, with no delay, whereas the Economy policy waits for sustained load before starting clusters, conserving credits at the cost of slower scale-out.
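Scaling behavior is set when creating or altering the warehouse; as a sketch (warehouse name and cluster counts are hypothetical):

```sql
CREATE WAREHOUSE scaled_wh WITH
  WAREHOUSE_SIZE = 'MEDIUM'
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 4
  SCALING_POLICY = 'STANDARD'  -- start additional clusters immediately on queuing
  AUTO_SUSPEND = 60
  AUTO_RESUME = TRUE;
```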

Questions 87

A table needs to be loaded. The input data is in JSON format and is a concatenation of multiple
JSON documents. The file size is 3 GB. A Small warehouse is being used. The following COPY
INTO command was executed:

COPY INTO SAMPLE FROM @~/SAMPLE.JSON (TYPE=JSON)

The load failed with this error:

Max LOB size (16777216) exceeded, actual size of parsed column is 17894470.

How can this issue be resolved?


Options:

A.

Compress the file and load the compressed file.

B.

Split the file into multiple files in the recommended size range (100 MB - 250 MB).

C.

Use a larger-sized warehouse.

D.

Set STRIP_OUTER_ARRAY=TRUE in the COPY INTO command.

Answer:

B

Explanation:

The error “Max LOB size (16777216) exceeded” indicates that the size of the parsed column
exceeds the maximum size allowed for a single column value in Snowflake, which is 16 MB. To
resolve this issue, the file should be split into multiple smaller files that are within the
recommended size range of 100 MB to 250 MB. This will ensure that each JSON document
within the files is smaller than the maximum LOB size allowed. Compressing the file, using a
larger-sized warehouse, or setting STRIP_OUTER_ARRAY=TRUE will not resolve the issue of the
column size exceeding the maximum allowed. References: COPY INTO Error during Structured
Data Load: “Max LOB size (16777216) exceeded…”

Questions 88

Which statements are correct concerning the leveraging of third-party data from the Snowflake
Data Marketplace? (Choose two.)

Options:

A.
Data is live, ready-to-query, and can be personalized.

B.

Data needs to be loaded into a cloud provider as a consumer account.

C.

Data is not available for copying or moving to an individual Snowflake account.

D.

Data is available without copying or moving.

E.

Data transformations are required when combining Data Marketplace datasets with existing data
in Snowflake.

Answer:

A, D

Explanation:

When leveraging third-party data from the Snowflake Data Marketplace, the data is live, ready-
to-query, and can be personalized. Additionally, the data is available without the need for copying
or moving it to an individual Snowflake account, allowing for seamless integration with existing
data

Questions 89

Which Snowflake database object can be used to track data changes made to table data?

Options:

A.

Tag

B.

Task
C.

Stream

D.

Stored procedure

Answer:

C

Explanation:

A Stream object in Snowflake is used for change data capture (CDC), which records data
manipulation language (DML) changes made to tables, including inserts, updates, and deletes3.
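A sketch of the stream lifecycle (the table names are hypothetical):

```sql
-- Create a stream to capture DML changes made to a table
CREATE OR REPLACE STREAM orders_stream ON TABLE orders;

-- The stream exposes changed rows plus METADATA$ACTION / METADATA$ISUPDATE
SELECT * FROM orders_stream;

-- Consuming the stream in a DML statement advances its offset
INSERT INTO orders_audit SELECT * FROM orders_stream;
```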

Questions 90

When working with a managed access schema, who has the OWNERSHIP privilege of any tables
added to the schema?

Options:

A.

The database owner

B.

The object owner

C.

The schema owner

D.

The Snowflake user's role

Answer:

C

Explanation:

In a managed access schema, the schema owner retains the OWNERSHIP privilege of any tables
added to the schema. This means that while object owners have certain privileges over the
objects they create, only the schema owner can manage privilege grants on these objects1.
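A minimal sketch (the database and schema names are hypothetical):

```sql
-- In a managed access schema, the schema owner (not individual object owners)
-- controls privilege grants on the objects contained in the schema
CREATE SCHEMA analytics.secure_area WITH MANAGED ACCESS;
```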

Questions 91

Which object can be used with Secure Data Sharing?

Options:

A.

View

B.

Materialized view

C.

External table

D.

User-Defined Function (UDF)

Answer:

A

Explanation:

Views can be used with Secure Data Sharing in Snowflake. Materialized views, external tables,
and UDFs are not typically shared directly for security and performance reasons2.

Questions 92
Which Snowflake view is used to support compliance auditing?

Options:

A.

ACCESS_HISTORY

B.

COPY_HISTORY

C.

QUERY_HISTORY

D.

ROW ACCESS POLICIES

Answer:

A

Explanation:

The ACCESS_HISTORY view in Snowflake is utilized to support compliance auditing. It provides detailed information on data access within Snowflake, including reads and writes by user queries. This view is essential for regulatory compliance auditing as it offers insights into the usage of tables and columns, and maintains a direct link between the user, the query, and the accessed data1.

References: [COF-C02] SnowPro Core Certification Exam Study Guide

Questions 93

How is unstructured data retrieved from data storage?

Options:

A.

SQL functions like the GET command can be used to copy the unstructured data to a location on
the client.
B.

SQL functions can be used to create different types of URLs pointing to the unstructured data.
These URLs can be used to download the data to a client.

C.

SQL functions can be used to retrieve the data from the query results cache. When the query
results are output to a client, the unstructured data will be output to the client as files.

D.

SQL functions can call on different web extensions designed to display different types of files as a
web page. The web extensions will allow the files to be downloaded to the client.


Questions 94

What does a Notify & Suspend action for a resource monitor do?

Options:

A.

Send an alert notification to all account users who have notifications enabled.

B.

Send an alert notification to all virtual warehouse users when thresholds over 100% have been
met.

C.

Send a notification to all account administrators who have notifications enabled, and suspend all
assigned warehouses after all statements being executed by the warehouses have completed.

D.

Send a notification to all account administrators who have notifications enabled, and suspend all
assigned warehouses immediately, canceling any statements being executed by the warehouses.

Answer:

C

Explanation:

The Notify & Suspend action for a resource monitor in Snowflake sends a notification to all
account administrators who have notifications enabled and suspends all assigned
warehouses. However, the suspension only occurs after all currently running statements in the
warehouses have been completed1. References: [COF-C02] SnowPro Core Certification Exam
Study Guide

Questions 95

Which views are included in the DATA SHARING USAGE schema? (Select TWO).

Options:

A.

ACCESS_HISTORY

B.

DATA_TRANSFER_HISTORY

C.

WAREHOUSE_METERING_HISTORY

D.

MONETIZED_USAGE_DAILY

E.

LISTING_TELEMETRY_DAILY

Answer:

D, E

Explanation:

The DATA_SHARING_USAGE schema includes views that display information about listings
published in the Snowflake Marketplace or a data exchange, including
MONETIZED_USAGE_DAILY and LISTING_TELEMETRY_DAILY2.

Questions 96

What are key characteristics of virtual warehouses in Snowflake? (Select TWO).

Options:

A.

Warehouses that are multi-cluster can have nodes of different sizes.

B.

Warehouses can be started and stopped at any time.

C.

Warehouses can be resized at any time, even while running.

D.

Warehouses are billed on a per-minute usage basis.

E.

Warehouses can only be used for querying and cannot be used for data loading.

Answer:

B, C

Explanation:

Virtual warehouses in Snowflake can be started and stopped at any time, providing flexibility in
managing compute resources. They can also be resized at any time, even while running, to
accommodate varying workloads910. References: [COF-C02] SnowPro Core Certification Exam
Study Guide

Questions 97

Which Snowflake command can be used to unload the result of a query to a single file?
Options:

A.

Use COPY INTO followed by a GET command to download the file.

B.

Use COPY INTO followed by a PUT command to download the file.

C.

Use COPY INTO with SINGLE = TRUE followed by a GET command to download the file.

D.

Use COPY INTO with SINGLE = TRUE followed by a PUT command to download the file.

Answer:

C

Explanation:

The Snowflake command to unload the result of a query to a single file is COPY INTO with
SINGLE = TRUE followed by a GET command to download the file. This command unloads the
query result into a single file in the specified internal stage
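
As a sketch of this pattern (the stage, table, and file names here are illustrative, not from the question):

```sql
-- Unload the query result to a single compressed file in a named internal stage
COPY INTO @my_stage/result.csv.gz
  FROM (SELECT * FROM my_table)
  FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP)
  SINGLE = TRUE;

-- Download the unloaded file to the local client
GET @my_stage/result.csv.gz file:///tmp/;
```

For large results, the MAX_FILE_SIZE copy option can be raised alongside SINGLE = TRUE, since the default per-file size limit is small.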

Questions 98

What does SnowCD help Snowflake users to do?

Options:

A.

Copy data into files.

B.

Manage different databases and schemas.

C.

Troubleshoot network connections to Snowflake.


D.

Write SELECT queries to retrieve data from external tables.

Answer:

C

Explanation:

SnowCD is a connectivity diagnostic tool that helps users troubleshoot network connections to
Snowflake. It performs a series of checks to evaluate the network connection and provides
suggestions for resolving any issues4.

Questions 99

How can performance be optimized for a query that returns a small amount of data from a very
large base table?

Options:

A.

Use clustering keys

B.

Create materialized views

C.

Use the search optimization service

D.

Use the query acceleration service

Answer:

C

Explanation:

The search optimization service in Snowflake is designed to improve the performance of selective
point lookup queries on large tables, which is ideal for scenarios where a query returns a small
amount of data from a very large base table1. References: [COF-C02] SnowPro Core Certification
Exam Study Guide

Questions 100

Which statistics are displayed in a Query Profile that indicate that intermediate results do not fit
in memory? (Select TWO).

Options:

A.

Bytes scanned

B.

Partitions scanned

C.

Bytes spilled to local storage

D.

Bytes spilled to remote storage

E.

Percentage scanned from cache

Answer:

C, D

Explanation:

The Query Profile statistics that indicate intermediate results do not fit in memory are the bytes
spilled to local storage and bytes spilled to remote storage2.
Questions 101

Which Snowflake feature allows a user to track sensitive data for compliance, discovery,
protection, and resource usage?

Options:

A.

Tags

B.

Comments

C.

Internal tokenization

D.

Row access policies

Answer:

A

Explanation:

Tags in Snowflake allow users to track sensitive data for compliance, discovery, protection, and
resource usage. They enable the categorization and tracking of data, supporting compliance with
privacy regulations678. References: [COF-C02] SnowPro Core Certification Exam Study Guide
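
As a minimal sketch of how tags support this tracking (the tag, table, and column names are illustrative):

```sql
-- Create a tag restricted to a known set of classification values
CREATE TAG pii_type ALLOWED_VALUES 'email', 'ssn';

-- Attach the tag to a sensitive column so it can be discovered and tracked
ALTER TABLE users MODIFY COLUMN email SET TAG pii_type = 'email';
```

Tag assignments can then be queried through Snowflake's tag reference views to locate all tagged objects.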

Questions 102

A Snowflake account has activated federated authentication.

What will occur when a user with a password that was defined by Snowflake attempts to log in to
Snowflake?

Options:

A.
The user will be unable to enter a password.

B.

The user will encounter an error, and will not be able to log in.

C.

The user will be able to log into Snowflake successfully.

D.

After entering the username and password, the user will be redirected to an Identity Provider
(IdP) login page.

Answer:

A

Explanation:

When federated authentication is activated in Snowflake, users authenticate via an external
identity provider (IdP) rather than using Snowflake-managed credentials. Therefore, a user with a
password defined by Snowflake will be unable to enter a password and must use their IdP
credentials to log in.

QUESTION NO: 579

What value provides information about disk usage for operations where intermediate results do
not fit in memory in a Query Profile?

A. IO

B. Network

C. Pruning

D. Spilling

Answer: D

In Snowflake, when a query execution requires more memory than what is available, Snowflake
handles these situations by spilling the intermediate results to disk. This process is known as
"spilling." The Query Profile in Snowflake includes a metric that helps users identify when and
how much data spilling occurs during the execution of a query. This information is crucial for
optimizing queries as excessive spilling can significantly slow down query performance. The value
that provides this information about disk usage due to intermediate results not fitting in memory
is appropriately labeled as "Spilling" in the Query Profile.

References:

• Snowflake Documentation on Query Profile and Performance: This section explains the
various components of the query profile, including the spilling metric, which indicates disk
usage for operations where intermediate results exceed available memory.

Questions 103

What metadata does Snowflake store for rows in micro-partitions? (Select TWO).

Options:

A.

Range of values

B.

Distinct values

C.

Index values

D.

Sorted values

E.

Null values

Answer:

A, B

Explanation:

Snowflake stores metadata for rows in micro-partitions, including the range of values for each
column and the number of distinct values1.

Questions 104

Which metadata table will store the storage utilization information even for dropped tables?

Options:

A.

DATABASE_STORAGE_USAGE_HISTORY

B.

TABLE_STORAGE_METRICS

C.

STORAGE_DAILY_HISTORY

D.

STAGE_STORAGE_USAGE_HISTORY

Answer:

B

Explanation:

The TABLE_STORAGE_METRICS metadata table stores the storage utilization information,
including for tables that have been dropped but are still incurring storage costs2.

Questions 105

Which Snowflake function will parse a JSON-null into a SQL-null?

Options:

A.

TO_CHAR

B.
TO_VARIANT

C.

TO_VARCHAR

D.

STRIP_NULL_VALUE

Answer:

D

Explanation:

The STRIP_NULL_VALUE function in Snowflake is used to convert a JSON null value into a SQL
NULL value1.
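
This distinction matters because a JSON null stored inside a VARIANT is not the same as a SQL NULL. A minimal sketch:

```sql
-- PARSE_JSON('{"a": null}'):a yields a VARIANT holding a JSON null;
-- STRIP_NULL_VALUE converts that JSON null into a SQL NULL
SELECT STRIP_NULL_VALUE(PARSE_JSON('{"a": null}'):a) AS result;  -- result is SQL NULL
```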

Questions 106

Which type of loop requires a BREAK statement to stop executing?

Options:

A.

FOR

B.

LOOP

C.

REPEAT

D.

WHILE

Answer:

B

Explanation:

The LOOP type of loop in Snowflake Scripting does not have a built-in termination condition and
requires a BREAK statement to stop executing4.
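
A sketch in Snowflake Scripting (the counter logic is illustrative):

```sql
-- A LOOP has no built-in exit condition of its own,
-- so a BREAK statement is required to terminate it
EXECUTE IMMEDIATE $$
DECLARE
  counter INTEGER DEFAULT 0;
BEGIN
  LOOP
    counter := counter + 1;
    IF (counter >= 5) THEN
      BREAK;
    END IF;
  END LOOP;
  RETURN counter;
END;
$$;
```

By contrast, FOR, WHILE, and REPEAT loops each carry their own termination condition.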

Questions 107

Which commands are restricted in owner's rights stored procedures? (Select TWO).

Options:

A.

SHOW

B.

MERGE

C.

INSERT

D.

DELETE

E.

DESCRIBE

Answer:

A, E

Explanation:

In owner’s rights stored procedures, certain commands are restricted to maintain security and
integrity. The SHOW and DESCRIBE commands are limited because they can reveal metadata and
structure information that may not be intended for all roles.

Questions 108

What factors impact storage costs in Snowflake? (Select TWO).

Options:

A.

The account type

B.

The storage file format

C.

The cloud region used by the account

D.

The type of data being stored

E.

The cloud platform being used

Answer:

A, C

Explanation:

The factors that impact storage costs in Snowflake include the account type (Capacity or On
Demand) and the cloud region used by the account. These factors determine the rate at which
storage is billed, with different regions potentially having different rates3.

Questions 109

A permanent table and temporary table have the same name, TBL1, in a schema.
What will happen if a user executes SELECT * FROM TBL1;?

Options:

A.

The temporary table will take precedence over the permanent table.

B.

The permanent table will take precedence over the temporary table.

C.

An error will say there cannot be two tables with the same name in a schema.

D.

The table that was created most recently will take precedence over the older table.

Answer:

A

Explanation:

In Snowflake, if a temporary table and a permanent table have the same name within the same
schema, the temporary table takes precedence over the permanent table within the session
where the temporary table was created4.

Questions 110

What does a masking policy consist of in Snowflake?

Options:

A.

A single data type, with one or more conditions, and one or more masking functions

B.

A single data type, with only one condition, and only one masking function

C.
Multiple data types, with only one condition, and one or more masking functions

D.

Multiple data types, with one or more conditions, and one or more masking functions

Answer:

A

Explanation:

A masking policy in Snowflake consists of a single data type, with one or more conditions, and
one or more masking functions. These components define how the data is masked based on the
specified conditions3.
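
A sketch showing all three components in one policy (the policy, role, and column names are illustrative):

```sql
-- One input data type (VARCHAR), one or more conditions (the CASE
-- branches), and one or more masking functions (REGEXP_REPLACE here)
CREATE MASKING POLICY email_mask AS (val VARCHAR) RETURNS VARCHAR ->
  CASE
    WHEN CURRENT_ROLE() IN ('ANALYST') THEN val
    ELSE REGEXP_REPLACE(val, '.+@', '*****@')
  END;

-- Attach the policy to a column so it is applied at query time
ALTER TABLE users MODIFY COLUMN email SET MASKING POLICY email_mask;
```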

Questions 111

What step can reduce data spilling in Snowflake?

Options:

A.

Using a larger virtual warehouse

B.

Increasing the virtual warehouse maximum timeout limit

C.

Increasing the amount of remote storage for the virtual warehouse

D.

Using a common table expression (CTE) instead of a temporary table

Answer:

A

Explanation:

To reduce data spilling in Snowflake, using a larger virtual warehouse is effective because it
provides more memory and local disk space, which can accommodate larger data operations and
minimize the need to spill data to disk or remote storage1. References: [COF-C02] SnowPro Core
Certification Exam Study Guide

Questions 112

What objects in Snowflake are supported by Dynamic Data Masking? (Select TWO).

Options:

A.

Views

B.

Materialized views

C.

Tables

D.

External tables

E.

Future grants

Answer:

A, C

Explanation:

Dynamic Data Masking in Snowflake supports tables and views. These objects can have masking
policies applied to their columns to dynamically mask data at query time3.
Questions 113

Which Snowflake data types can be used to build nested hierarchical data? (Select TWO)

Options:

A.

INTEGER

B.

OBJECT

C.

VARIANT

D.

VARCHAR

E.

LIST

Answer:

B, C

Explanation:

The Snowflake data types that can be used to build nested hierarchical data are OBJECT and
VARIANT. These data types support the storage and querying of semi-structured data, allowing
for the creation of complex, nested data structures

Questions 114

A column named "Data" contains VARIANT data and stores values as follows:
How will Snowflake extract the employee's name from the column data?

Options:

A.

Data:employee.name

B.

DATA:employee.name

C.

data:Employee.name

D.

data:employee.name

Answer:

D

Explanation:

In Snowflake, to extract a specific value from a VARIANT column, you use the column name
followed by a colon and then the key. The keys are case-sensitive. Therefore, to extract the
employee’s name from the “Data” column, the correct syntax is data:employee.name.
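
A sketch of the traversal (the table name is illustrative; the cast simply renders the value without JSON quotes):

```sql
-- Path keys after the colon are case-sensitive, so data:employee.name
-- matches only a lowercase "employee" key with a lowercase "name" key
SELECT data:employee.name::VARCHAR AS employee_name
FROM my_table;
```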

Questions 115

Which function unloads data from a relational table to JSON?


Options:

A.

TO_OBJECT

B.

TO_JSON

C.

TO_VARIANT

D.

OBJECT_CONSTRUCT

Answer:

B

Explanation:

The TO_JSON function is used to convert a VARIANT value into a string containing the JSON
representation of the value. This function is suitable for unloading data from a relational table to
JSON format. References: [COF-C02] SnowPro Core Certification Exam Study Guide
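
A common sketch combines the two functions named in the options (table and column names are illustrative):

```sql
-- Build a VARIANT per row with OBJECT_CONSTRUCT, then serialize it
-- to a JSON string with TO_JSON before unloading
SELECT TO_JSON(OBJECT_CONSTRUCT('id', id, 'name', name)) AS json_row
FROM my_table;
```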

Questions 116

Who can grant object privileges in a regular schema?

Options:

A.

Object owner

B.

Schema owner

C.

Database owner
D.

SYSADMIN

Answer:

A

Explanation:

In a regular schema within Snowflake, the object owner has the privilege to grant object
privileges. The object owner is typically the role that created the object or to whom the
ownership of the object has been transferred78.

References = [COF-C02] SnowPro Core Certification Exam Study Guide

Questions 117

Which VALIDATION_MODE value will return the errors across the files specified in a COPY
command, including files that were partially loaded during an earlier load?

Options:

A.

RETURN_-1_ROWS

B.

RETURN_n_ROWS

C.

RETURN_ERRORS

D.

RETURN_ALL_ERRORS

Answer:

D

Explanation:

The RETURN_ALL_ERRORS value in the VALIDATION_MODE option of the COPY command instructs
Snowflake to validate the data files and return all errors encountered across all specified files,
including files that were partially loaded during an earlier load. RETURN_ERRORS, by contrast,
does not cover partially loaded files. References: [COF-C02]
SnowPro Core Certification Exam Study Guide
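
A validation run of this kind can be sketched as follows (the table and stage names are illustrative); per the Snowflake documentation, RETURN_ALL_ERRORS also covers files partially loaded during an earlier load:

```sql
-- Validate the staged files without loading any data, returning all
-- errors, including those in files partially loaded earlier
COPY INTO my_table
  FROM @my_stage
  VALIDATION_MODE = RETURN_ALL_ERRORS;
```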

Questions 118

Which data types can be used in Snowflake to store semi-structured data? (Select TWO)

Options:

A.

ARRAY

B.

BLOB

C.

CLOB

D.

JSON

E.

VARIANT

Answer:

A, E

Explanation:

Snowflake supports the storage of semi-structured data using the ARRAY and VARIANT data
types. The ARRAY data type can directly contain VARIANT, and thus indirectly contain any other
data type, including itself. The VARIANT data type can store a value of any other type,
including OBJECT and ARRAY, and is often used to represent semi-structured data formats like
JSON, Avro, ORC, Parquet, or XML34.

References: [COF-C02] SnowPro Core Certification Exam Study Guide

Questions 119

In the Data Exchange, who can get or request data from the listings? (Select TWO).

Options:

A.

Users with ACCOUNTADMIN role

B.

Users with sysadmin role

C.

Users with ORGADMIN role

D.

Users with import share privilege

E.

Users with manage grants privilege

Answer:

A, D

Explanation:

In the Snowflake Data Exchange, the ability to get or request data from listings is generally
controlled by specific roles and privileges:

• A. Users with ACCOUNTADMIN role: This role typically has the highest level of access
within a Snowflake account, including the ability to manage and access all features and
functions. Users with this role can access data listings within the Data Exchange.
• D. Users with import share privilege: This specific privilege is necessary for users who
need to import shared data from the Data Exchange. This privilege allows them to request
and access data listings explicitly shared with them.

Questions 120

Which solution improves the performance of point lookup queries that return a small number of
rows from large tables using highly selective filters?

Options:

A.

Automatic clustering

B.

Materialized views

C.

Query acceleration service

D.

Search optimization service

Answer:

D

Explanation:

The search optimization service improves the performance of point lookup queries on large tables
by using selective filters to quickly return a small number of rows. It creates an optimized data
structure that helps in pruning the micro-partitions that do not contain the queried values3.
References: [COF-C02] SnowPro Core Certification Exam Study Guide

Questions 121

How does Snowflake handle the data retention period for a table if a stream has not been
consumed?

Options:

A.

The data retention period is reduced to a minimum of 14 days.

B.

The data retention period is permanently extended for the table.

C.

The data retention period is temporarily extended to the stream's offset.

D.

The data retention period is not affected by the stream consumption.

Answer:

C

Explanation:

In Snowflake, the use of streams impacts how the data retention period for a table is handled,
particularly in scenarios where the stream has not been consumed. The key point to understand is
that Snowflake's streams are designed to capture data manipulation language (DML) changes such
as INSERTS, UPDATES, and DELETES that occur on a source table. Streams maintain a record of
these changes until they are consumed by a DML operation or a COPY command that references
the stream.

When a stream is created on a table and remains unconsumed, Snowflake extends the data
retention period of the table to ensure that the changes captured by the stream are preserved.
This extension is specifically up to the point in time represented by the stream's offset, which
effectively ensures that the data necessary for consuming the stream's contents is retained. This
mechanism is in place to prevent data loss and ensure the integrity of the stream's data,
facilitating accurate and reliable data processing and analysis based on the captured DML
changes.

This behavior emphasizes the importance of managing streams and their consumption
appropriately to balance between data retention needs and storage costs. It's also crucial to
understand how this temporary extension of the data retention period impacts the overall
management of data within Snowflake, including aspects related to data lifecycle, storage cost
implications, and the planning of data consumption strategies.

References:

• Snowflake Documentation on Streams: Using Streams


• Snowflake Documentation on Data Retention: Understanding Data Retention

Questions 122

What is a non-configurable feature that provides historical data that Snowflake may recover
during a 7-day period?

Options:

A.

Fail-safe

B.

Time Travel

C.

Cloning

D.

Account replication

Answer:

A

Explanation:

Fail-safe is a non-configurable feature in Snowflake that provides an additional layer of data
protection beyond Time Travel. Time Travel allows users to access historical data within a
configurable period (up to 90 days), while Fail-safe provides an additional 7-day period during
which Snowflake retains historical data to recover from significant data loss or corruption
incidents. This period is not accessible by users but can be used by Snowflake support to assist in
data recovery efforts.References: Snowflake Documentation on Fail-safe and Time Travel

Questions 123

Which statement describes Snowflake tables?

Options:

A.

Snowflake tables are a logical representation of underlying physical data.

B.

Snowflake tables are the physical instantiation of data loaded into Snowflake.

C.

Snowflake tables require that clustering keys be defined to perform optimally.

D.

Snowflake tables are owned by a user.

Answer:

A

Explanation:

In Snowflake, tables represent a logical structure through which users interact with the stored
data. The actual physical data is stored in micro-partitions managed by Snowflake, and the logical
table structure provides the means by which SQL operations are mapped to this data. This
architecture allows Snowflake to optimize storage and querying across its distributed, cloud-
based data storage system.References: Snowflake Documentation on Tables

Questions 124

Which Snowflake table type is only visible to the user who creates it, can have the same name as
permanent tables in the same schema, and is dropped at the end of the session?

Options:

A.

Temporary

B.

Local

C.

User

D.

Transient

Answer:

A

Explanation:

In Snowflake, a Temporary table is a type of table that is only visible to the user who creates it,
can have the same name as permanent tables in the same schema, and is automatically dropped at
the end of the session in which it was created. Temporary tables are designed for transient data
processing needs, where data is needed for the duration of a specific task or session but not
beyond. Since they are automatically cleaned up at the end of the session, they help manage
storage usage efficiently and ensure that sensitive data is not inadvertently persisted.

References:

• Snowflake Documentation on Temporary Tables: Temporary Tables


Questions 125

How does Snowflake define its approach to Discretionary Access Control (DAC)?

Options:

A.

A defined level of access to an object

B.

An entity in which access can be granted

C.

Each object has an owner, who can in turn grant access to that object.

D.

Access privileges are assigned to roles, which are in turn assigned to users.

Answer:

C

Explanation:

Snowflake defines Discretionary Access Control (DAC) as a model in which each object has an
owner, who can in turn grant access to that object. This complements Snowflake's Role-based
Access Control (RBAC), in which access privileges are assigned to roles, which are in turn
assigned to users. Together, these two models allow granular control over database access,
making it easier to manage permissions in a scalable and flexible manner. References:
Snowflake Documentation on Access Control

Questions 126

A Snowflake user wants to optimize performance for a query that queries only a small number of
rows in a table. The rows require significant processing. The data in the table does not change
frequently.

What should the user do?


Options:

A.

Add a clustering key to the table.

B.

Add the search optimization service to the table.

C.

Create a materialized view based on the query.

D.

Enable the query acceleration service for the virtual warehouse.

Answer:

C

Explanation:

In a scenario where a Snowflake user queries only a small number of rows that require significant
processing and the data in the table does not change frequently, the most effective way to
optimize performance is by creating a materialized view based on the query. Materialized views
store the result of the query and can significantly reduce the computation time for queries that
are executed frequently over unchanged data.

• Why Materialized Views: Materialized views precompute and store the result of the query.
This is especially beneficial for queries that require heavy processing. Since the data does
not change frequently, the materialized view will not need to be refreshed often, making it
an ideal solution for this use case.
• Implementation Steps:

CREATE MATERIALIZED VIEW my_materialized_view AS SELECT ... FROM my_table WHERE ...;

Reference: For more information on materialized views and how they can be used to optimize
query performance, refer to the Snowflake documentation on materialized views:
https://docs.snowflake.com/en/user-guide/views-materialized.html
Questions 127

Which chart type is supported in Snowsight for Snowflake users to visualize data with
dashboards?

Options:

A.

Area chart

B.

Box plot

C.

Heat grid

D.

Pie chart

Answer:

A

Explanation:

Snowsight, Snowflake's user interface for exploring, analyzing, and visualizing data, supports a
variety of chart types for creating dashboards and visualizations. One of the supported chart
types in Snowsight is the Area Chart (A). Area charts are useful for representing quantities over
time and can be used to highlight volume change and rate of change, as well as to compare
multiple quantities.

While Snowsight supports many types of visualizations to help users analyze their data
effectively, including line charts, bar charts, and scatter plots, it's important to select the specific
reference documentation or release notes for the most current list of supported chart types, as
Snowflake continues to enhance and update Snowsight's capabilities.

As of the last update, Box plots (B), Heat grids (C), and Pie charts (D) are types of visualizations
that may be supported in various analytics and visualization tools, but for the specific context of
Snowsight's currently confirmed features, Area charts are a verified option for users to visualize
their data.

Reference: Snowflake Documentation on Snowsight
(https://docs.snowflake.com/en/user-guide/ui-snowsight.html)

Questions 128

What can a user with a reader account do in Snowflake?

Options:

A.

Load new data

B.

Update existing data

C.

Create a new share

D.

Query shared data

Answer:

D

Explanation:

In Snowflake, a user within a reader account primarily has read-only access to the shared data.
Reader accounts are created to enable third parties or separate business units to access and query
data shared with them without allowing them to modify the underlying data. This means a user
with a reader account can perform operations like querying shared data to analyze or report on it
but cannot load new data, update existing data, or create new shares. This setup is crucial for
maintaining data governance and security while enabling data sharing and
collaboration.References: Snowflake Documentation on Reader Accounts

Questions 129

Which object type is granted permissions for reading a table?

Options:

A.

User

B.

Role

C.

Attribute

D.

Schema

Answer:

B

Explanation:

In Snowflake, permissions for accessing database objects, including tables, are not granted directly
to users but rather to roles. A role encapsulates a collection of privileges on various Snowflake
objects. Users are then granted roles, and through those roles, they inherit the permissions
necessary to read a table or perform other actions. This approach adheres to the principle of least
privilege, allowing for granular control over database access and simplifying the management of
user permissions.

Reference: Snowflake's official documentation on access control introduces the concept of
roles and how they are used to manage permissions:
https://docs.snowflake.com/en/user-guide/security-access-control-overview.html#roles
Questions 130

What does Snowflake recommend for a user assigned the ACCOUNTADMIN role?

Options:

A.

The ACCOUNTADMIN role should be set as the user's default role.

B.

The user should use federated authentication instead of a password

C.

The user should be required to use Multi-Factor Authentication (MFA).

D.

There should be just one user with the ACCOUNTADMIN role in each Snowflake account.

Answer:

C

Explanation:

For users assigned the ACCOUNTADMIN role, Snowflake recommends enforcing Multi-Factor
Authentication (MFA) to enhance security. The ACCOUNTADMIN role has extensive
permissions, making it crucial to secure accounts held by such users against unauthorized access.
MFA adds an additional layer of security by requiring a second form of verification beyond just
the username and password, significantly reducing the risk of account compromise.References:
Snowflake Security Best Practices

Questions 131

What does Snowflake attempt to do if any of the compute resources for a virtual warehouse fail
to provision during start-up?

Options:

A.
Repair the failed resources.

B.

Restart failed resources.

C.

Queue the failed resources

D.

Provision the failed resources

Answer:

B

Explanation:

If any compute resources for a virtual warehouse fail to provision during startup, Snowflake will
attempt to restart those failed resources. The system is designed to automatically handle transient
issues that might occur during the provisioning of compute resources. By restarting the failed
resources, Snowflake aims to ensure that the virtual warehouse has the necessary compute
capacity to handle the user's workloads without manual intervention.References: Snowflake
Documentation on Virtual Warehouses

Questions 132

When unloading data with the COPY INTO command, what is the purpose of the PARTITION
BY parameter option?

Options:

A.

To sort the contents of the output file by the specified expression.

B.

To delimit the records in the output file using the specified expression.

C.
To include a new column in the output using the specified window function expression.

D.

To split the output into multiple files, one for each distinct value of the specified expression.

Answer:

D

Explanation:

The PARTITION BY <expression> parameter option in the COPY INTO <location> command is
used to split the output into multiple files based on the distinct values of the specified expression.
This feature is particularly useful for organizing large datasets into smaller, more manageable files
and can help with optimizing downstream processing or consumption of the data. For example, if
you are unloading a large dataset of transactions and use PARTITION BY
DATE(transactions.transaction_date), Snowflake generates a separate output file for each unique
transaction date, facilitating easier data management and access.

This approach to data unloading can significantly improve efficiency when dealing with large
volumes of data by enabling parallel processing and simplifying data retrieval based on specific
criteria or dimensions.

References:

• Snowflake Documentation on Unloading Data: COPY INTO
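
The transaction-date scenario described above can be sketched as follows (the stage and table names are illustrative):

```sql
-- Unload transactions into one output file (or set of files)
-- per distinct transaction date; the PARTITION BY expression
-- must evaluate to a string
COPY INTO @my_stage/transactions/
  FROM transactions
  PARTITION BY (TO_VARCHAR(transaction_date, 'YYYY-MM-DD'))
  FILE_FORMAT = (TYPE = PARQUET);
```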

Questions 133

A user needs to MINIMIZE the cost of large tables that are used to store transitory data. The data
does not need to be protected against failures, because the data can be reconstructed outside of
Snowflake.

What table type should be used?

Options:

A.

Permanent
B.

Transient

C.

Temporary

D.

External

Answer:

B

Explanation:

For minimizing the cost of large tables that are used to store transitory data, which does not need
to be protected against failures because it can be reconstructed outside of Snowflake, the best
table type to use is Transient. Transient tables in Snowflake are designed for temporary or
transitory data storage and offer reduced storage costs compared to permanent tables. However,
unlike temporary tables, they persist across sessions until explicitly dropped.

• Why Transient Tables: Transient tables provide a cost-effective solution for storing data
that is temporary but needs to be available longer than a single session. They have lower
data storage costs because Snowflake does not maintain historical data (Time Travel) for as
long as it does for permanent tables.
• Creating a Transient Table:

CREATE TRANSIENT TABLE my_transient_table (...);

• Use Case Considerations: Transient tables are ideal for scenarios where the data is not
critical, can be easily recreated, and where cost optimization is a priority. They are suitable
for development, testing, or staging environments where data longevity is not a concern.

Reference: [Reference: For more details on transient tables and their usage scenarios, refer to the
Snowflake documentation on table types: https://docs.snowflake.com/en/sql-
reference/sql/create-table.html#table-types, , ]
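
A minimal sketch of such a table; the table and column names are illustrative:

```sql
-- Transient table for transitory data: no Fail-safe, and Time Travel
-- reduced to 0 days to further minimize storage costs.
CREATE TRANSIENT TABLE stage_events (
    event_id NUMBER,
    payload  VARIANT
)
DATA_RETENTION_TIME_IN_DAYS = 0;
```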
Questions 134

Which statements reflect valid commands when using secondary roles? (Select TWO).

Options:

A.

USE SECONDARY ROLES RESUME

B.

USE SECONDARY ROLES SUSPEND

C.

USE SECONDARY ROLES ALL

D.

USE SECONDARY ROLES ADD

E.

USE SECONDARY ROLES NONE

Answer:

C, E

Explanation:


• Valid Commands: USE SECONDARY ROLES ALL activates all roles granted to the user (other
than the current primary role) as secondary roles for the session, and USE SECONDARY
ROLES NONE deactivates them.
• Incorrect Commands: The options referencing "RESUME", "SUSPEND", and "ADD" are not
valid commands in the context of secondary roles.

References:

• Snowflake Documentation (USE SECONDARY


ROLES): https://docs.snowflake.com/en/sql-reference/sql/use-secondary-roles.html
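
The two valid statements look like this in a session:

```sql
-- Activate all roles granted to the user (besides the primary role)
-- as secondary roles for the current session.
USE SECONDARY ROLES ALL;

-- Deactivate secondary roles for the current session.
USE SECONDARY ROLES NONE;
```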

Questions 135
Who can create network policies within Snowflake? (Select TWO).

Options:

A.

SYSADMIN only

B.

ORGADMIN only

C.

SECURITYADMIN or higher roles

D.

A role with the CREATE NETWORK POLICY privilege

E.

A role with the CREATE SECURITY INTEGRATION privilege

Answer:

C, D

Explanation:

In Snowflake, network policies define the allowed IP address ranges from which users can
connect to Snowflake, enhancing security by restricting access based on network location. The
creation and management of network policies require sufficient privileges. Specifically, a user with
the SECURITYADMIN role or any role with higher privileges, such as ACCOUNTADMIN, can
create network policies. Additionally, a custom role can be granted the CREATE NETWORK
POLICY privilege, enabling users assigned to that role to also create network policies. This
approach allows for flexible and secure management of network access to Snowflake.References:
Snowflake Documentation on Network Policies
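
As a sketch, assuming a hypothetical policy name, IP ranges, and custom role:

```sql
-- Run as SECURITYADMIN (or higher), or as a role that has been
-- granted the CREATE NETWORK POLICY privilege.
CREATE NETWORK POLICY corp_only
  ALLOWED_IP_LIST = ('192.168.1.0/24')
  BLOCKED_IP_LIST = ('192.168.1.99');

-- Delegating creation to a custom role:
GRANT CREATE NETWORK POLICY ON ACCOUNT TO ROLE network_admin;
```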

Questions 136

Which table function should be used to view details on a Directed Acyclic Graphic (DAG) run that
is presently scheduled or is executing?
Options:

A.

TASK_HISTORY

B.

TASK_DEPENDENTS

C.

CURRENT_TASK_GRAPHS

D.

COMPLETE_TASK_GRAPHS

Answer:

C

Explanation:

The CURRENT_TASK_GRAPHS table function is designed to provide information on Directed
Acyclic Graphs (DAGs) that are currently scheduled or executing within Snowflake. This function
offers insights into the structure and status of task chains, enabling users to monitor and
troubleshoot task executions. DAGs in Snowflake represent sequences of tasks with
dependencies, and understanding their current state is crucial for managing complex
workflows.References: Snowflake Documentation on Task Management
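
A typical call, run against the database's INFORMATION_SCHEMA:

```sql
-- Details on DAG runs that are currently scheduled or executing.
SELECT *
FROM TABLE(INFORMATION_SCHEMA.CURRENT_TASK_GRAPHS());
```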

Questions 137

Which Snowflake feature or tool helps troubleshoot issues in SQL query expressions that
commonly cause performance bottlenecks?

Options:

A.

Persisted query results

B.
QUERY_HISTORY View

C.

Query acceleration service

D.

Query Profile

Answer:

D

Explanation:

The Snowflake feature that helps troubleshoot issues in SQL query expressions and commonly
identify performance bottlenecks is the Query Profile. The Query Profile provides a detailed
breakdown of a query's execution plan, including each operation's time and resources consumed.
It visualizes the steps involved in the query execution, highlighting areas that may be causing
inefficiencies, such as full table scans, large joins, or operations that could benefit from
optimization.

By examining the Query Profile, developers and database administrators can identify and
troubleshoot performance issues, optimize query structures, and make informed decisions about
potential schema or indexing changes to improve performance.

References:

• Snowflake Documentation on Query Profile: Using the Query Profile

Questions 138

The following settings are configured:

For how many days will the data be retained at the object level?
Options:

A.

B.

C.

D.

Answer:

Explanation:

The settings indicate that data retention is configured at two levels: at the account level,
MIN_DATA_RETENTION_TIME_IN_DAYS is set to 5 days, and at the object level,
DATA_RETENTION_TIME_IN_DAYS is set to 2 days. MIN_DATA_RETENTION_TIME_IN_DAYS
enforces a minimum rather than being overridden by a lower object-level value: the effective
retention period is MAX(DATA_RETENTION_TIME_IN_DAYS,
MIN_DATA_RETENTION_TIME_IN_DAYS). Therefore, the data will be retained for 5 days at the
object level. References: Snowflake Documentation on Data Retention Policies

Questions 139

Which activities are managed by Snowflake's Cloud Services layer? (Select TWO).

Options:

A.

Authorisation

B.
Access delegation

C.

Data pruning

D.

Data compression

E.

Query parsing and optimization

Answer:

A, E

Explanation:

Snowflake's Cloud Services layer is responsible for managing various aspects of the platform that
are not directly related to computing or storage. Specifically, it handles authorisation, ensuring
that users have appropriate access rights to perform actions or access data. Additionally, it takes
care of query parsing and optimization, interpreting SQL queries and optimizing their execution
plans for better performance. This layer abstracts much of the platform's complexity, allowing
users to focus on their data and queries without managing the underlying
infrastructure.References: Snowflake Architecture Documentation

Questions 140

What happens when a suspended virtual warehouse is resized in Snowflake?

Options:

A.

It will return an error.

B.

It will return a warning.

C.
The suspended warehouse is resumed and new compute resources are provisioned immediately.

D.

The additional compute resources are provisioned when the warehouse is resumed.

Answer:

D

Explanation:

In Snowflake, resizing a virtual warehouse that is currently suspended does not immediately
provision the new compute resources. Instead, the change in size is recorded, and the additional
compute resources are provisioned when the warehouse is resumed. This means that the action
of resizing a suspended warehouse does not cause it to resume operation automatically. The
warehouse remains suspended until an explicit command to resume it is issued, or until it
automatically resumes upon the next query execution that requires it.

This behavior allows for efficient management of compute resources, ensuring that credits are not
consumed by a warehouse that is not in use, even if its size is adjusted while it is suspended.

Reference: [Reference: Snowflake Documentation on Resizing Warehouses


(https://docs.snowflake.com/en/user-guide/warehouses-tasks.html#resizing-a-warehouse), , ]

Questions 141

Which type of role can be granted to a share?

Options:

A.

Account role

B.

Custom role

C.

Database role
D.

Secondary role

Answer:

C

Explanation:

In Snowflake, shares are used to share data between Snowflake accounts. Account-level roles
(including custom account roles) cannot be granted to a share. Instead, Snowflake supports
granting database roles to a share. A database role is created within the database that holds the
shared objects (CREATE DATABASE ROLE), is granted privileges on a subset of those objects,
and is then granted to the share with GRANT DATABASE ROLE ... TO SHARE. This lets the
provider expose different subsets of a shared database to different consumers, giving more
granular control over what each consumer can access.

References:

• Snowflake Documentation on Shares: Shares

• Snowflake Documentation on Database Roles: Access Control

Questions 142

Snowflake's access control framework combines which models for securing data? (Select TWO).

Options:

A.

Attribute-based Access Control (ABAC)

B.

Discretionary Access Control (DAC)


C.

Access Control List (ACL)

D.

Role-based Access Control (RBAC)

E.

Rule-based Access Control (RuBAC)

Answer:

B, D

Explanation:

Snowflake's access control framework utilizes a combination of Discretionary Access Control
(DAC) and Role-based Access Control (RBAC). DAC in Snowflake allows the object owner to grant
access privileges to other roles. RBAC involves assigning roles to users and then granting
privileges to those roles. Through roles, Snowflake manages which users have access to specific
objects and what actions they can perform, which is central to security and governance in the
Snowflake environment.References: Snowflake Documentation on Access Control,

Questions 143

Based on Snowflake recommendations, when creating a hierarchy of custom roles, the top-most
custom role should be assigned to which role?

Options:

A.

ACCOUNTADMIN

B.

SECURITYADMIN

C.

SYSADMIN
D.

USERADMIN

Answer:

C

Explanation:

Based on Snowflake recommendations, when creating a hierarchy of custom roles, the top-most
custom role should be assigned to the SYSADMIN role. Granting custom roles up to SYSADMIN
allows system administrators to manage all objects created by those roles, while keeping the
ACCOUNTADMIN role reserved for account-level administration. This structure follows the
principle of least privilege: day-to-day object ownership and role management roll up to
SYSADMIN, and the most powerful role, ACCOUNTADMIN, is used only when account-wide
oversight is required.

References:

• Snowflake Documentation on Access Control: Managing Access Control

Questions 144

If a virtual warehouse is suspended, what happens to the warehouse cache?

Options:

A.

The cache is dropped when the warehouse is suspended and is no longer available upon restart.

B.

The warehouse cache persists for as long as the warehouse exists, regardless of its suspension
status.

C.

The cache is maintained for up to two hours and can be restored if the warehouse is restarted
within this limit.

D.

The cache is maintained for the auto-suspend duration and can be restored if the warehouse is
restarted within this limit.

Answer:

A

Explanation:

When a virtual warehouse in Snowflake is suspended, the cache is dropped and is no longer
available upon restart. This means that all cached data, including results and temporary data, are
cleared from memory. The purpose of this behavior is to conserve resources while the warehouse
is not active. Upon restarting the warehouse, it will need to reload any data required for queries
from storage, which may result in a slower initial performance until the cache is repopulated. This
is a critical consideration for managing performance and cost in Snowflake.

Questions 145

Which Snowflake feature records changes made to a table so that actions can be taken using the
change data capture?

Options:

A.

Materialized View

B.

Pipe

C.

Stream

D.

Task
Answer:

C

Explanation:

Snowflake's Streams feature is specifically designed for change data capture (CDC). A stream
records insert, update, and delete operations performed on a table, and allows users to query
these changes. This enables actions to be taken on the changed data, facilitating processes like
incremental data loads and real-time analytics. Streams provide a powerful mechanism for
applications to respond to data changes in Snowflake tables efficiently.References: Snowflake
Documentation on Streams
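
A minimal sketch; the table names are illustrative:

```sql
-- Capture inserts, updates, and deletes on the orders table.
CREATE STREAM orders_stream ON TABLE orders;

-- SELECT * on a stream returns the table columns plus the change
-- metadata columns METADATA$ACTION, METADATA$ISUPDATE, METADATA$ROW_ID.
SELECT * FROM orders_stream;
```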

Questions 146

Which privilege is required on a virtual warehouse to abort any existing executing queries?

Options:

A.

USAGE

B.

OPERATE

C.

MODIFY

D.

MONITOR

Answer:

B

Explanation:

The privilege required on a virtual warehouse to abort any existing executing queries
is OPERATE. The OPERATE privilege on a virtual warehouse allows a user to perform operational
tasks on the warehouse, including starting, stopping, and restarting the warehouse, as well as
aborting running queries. This level of control is essential for managing resource utilization and
ensuring that the virtual warehouse operates efficiently.

References:

• Snowflake Documentation on Access Control: Access Control Privileges

Questions 147

What happens to the privileges granted to Snowflake system-defined roles?

Options:

A.

The privileges cannot be revoked.

B.

The privileges can be revoked by an ACCOUNTADMIN.

C.

The privileges can be revoked by an orgadmin.

D.

The privileges can be revoked by any user-defined role with appropriate privileges.

Answer:

A

Explanation:
The privileges granted to Snowflake's system-defined roles cannot be revoked. System-defined
roles, such as SYSADMIN, ACCOUNTADMIN, SECURITYADMIN, and others, come with a set of
predefined privileges that are essential for the roles to function correctly within the Snowflake
environment. These privileges are intrinsic to the roles and ensure that users assigned these roles
can perform the necessary tasks and operations relevant to their responsibilities.

The design of Snowflake's role-based access control (RBAC) model ensures that system-defined
roles have a specific set of non-revocable privileges to maintain the security, integrity, and
operational efficiency of the Snowflake environment. This approach prevents accidental or
intentional modification of privileges that could disrupt the essential functions or compromise the
security of the Snowflake account.

References:

• Snowflake Documentation on Access Control: Understanding Role-Based Access Control (RBAC)

Questions 148

Which function can be used with the COPY INTO statement to convert rows from a relational table
to a single variant column, and to unload rows into a JSON file?

Options:

A.

FLATTEN

B.

OBJECT_AS

C.

OBJECT_CONSTRUCT

D.

TO VARIANT

Answer:

C

Explanation:

The OBJECT_CONSTRUCT function builds an OBJECT value (stored as VARIANT) from key-value
pairs, and it is the function Snowflake documents for converting rows from a relational table into a
single variant column when unloading to JSON. Used in the SELECT list of a COPY INTO
<location> statement, OBJECT_CONSTRUCT aggregates the columns of each row into one JSON
object, which is then written to the output file when the file format is TYPE = JSON. This approach
is commonly used for data integration scenarios, backups, or when data needs to be shared in a
format that is easily consumed by applications or services that support JSON.

References:

• Snowflake Documentation on Data Unloading: Unloading Data


• Snowflake Documentation on VARIANT Data Type: Working with JSON
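
The documented unload pattern can be sketched as follows; the stage and table names are placeholders:

```sql
-- Each row is collapsed into one JSON object in a single VARIANT column.
COPY INTO @my_stage/json_unload/
FROM (
  SELECT OBJECT_CONSTRUCT('id', id, 'name', name, 'amount', amount)
  FROM my_table
)
FILE_FORMAT = (TYPE = JSON);
```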

Questions 149

Which Snowflake data governance feature can support auditing when a user query reads column
data?

Options:

A.

Access History

B.

Data classification

C.

Column-level security

D.

Object dependencies

Answer:

A

Explanation:

Access History in Snowflake is a feature designed to support auditing by tracking access to data
within Snowflake, including when a user's query reads column data. It provides detailed
information on queries executed, including the user who ran the query, the query text, and the
objects (e.g., tables, views) accessed by the query. This feature is instrumental for auditing
purposes, helping organizations to monitor and audit data access for security and compliance.

Reference: [Reference: Snowflake Documentation on Access History which details its use for
auditing access to data: https://docs.snowflake.com/en/user-guide/access-history.html, , ]

Questions 150

How would a user run a multi-cluster warehouse in maximized mode?

Options:

A.

Configure the maximum clusters setting to "Maximum."

B.

Turn on the additional clusters manually after starting the warehouse.

C.

Set the minimum Clusters and maximum Clusters settings to the same value.

D.

Set the minimum clusters and maximum clusters settings to different values.

Answer:

C

Explanation:

To run a multi-cluster warehouse in maximized mode, a user should set the minimum and
maximum number of clusters to the same value. This ensures that all clusters are available when
the warehouse is started, providing maximum resources for query execution. References:
Snowflake Documentation.
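
A sketch of a warehouse created in maximized mode; the name and size are illustrative:

```sql
-- MIN_CLUSTER_COUNT = MAX_CLUSTER_COUNT puts the multi-cluster
-- warehouse in maximized mode: all three clusters run while it runs.
CREATE WAREHOUSE reporting_wh
  WAREHOUSE_SIZE = 'MEDIUM'
  MIN_CLUSTER_COUNT = 3
  MAX_CLUSTER_COUNT = 3;
```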

Questions 151

By default, which role can create resource monitors?

Options:

A.

ACCOUNTADMIN

B.

SECURITYADMIN

C.

SYSADMIN

D.

USERADMIN

Answer:

A

Explanation:

The role that can by default create resource monitors in Snowflake is the ACCOUNTADMIN role.
Resource monitors are a crucial feature in Snowflake that allows administrators to track and
control the consumption of compute resources, ensuring that usage stays within specified limits.
The creation and management of resource monitors involve defining thresholds for credits usage,
setting up notifications, and specifying actions to be taken when certain thresholds are exceeded.
Given the significant impact that resource monitors can have on the operational aspects and
billing of a Snowflake account, the capability to create and manage them is restricted to
the ACCOUNTADMIN role. This role has the broadest set of privileges in Snowflake, including
the ability to manage all aspects of the account, such as users, roles, warehouses, databases, and
resource monitors, among others.

References:

• Snowflake Documentation on Resource Monitors: Managing Resource Monitors
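
A sketch of creating and attaching a monitor; the names and limits are illustrative:

```sql
-- Run as ACCOUNTADMIN.
CREATE RESOURCE MONITOR monthly_limit
  WITH CREDIT_QUOTA = 100
  FREQUENCY = MONTHLY
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS ON 90 PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND;

ALTER WAREHOUSE my_wh SET RESOURCE_MONITOR = monthly_limit;
```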

Questions 152

If file format options are specified in multiple locations, the load operation selects which option
FIRST to apply in order of precedence?

Options:

A.

Table definition

B.

Stage definition

C.

Session level

D.

COPY INTO TABLE statement

Answer:

D

Explanation:

When file format options are specified in multiple locations, the load operation applies the
options in the following order of precedence: first, the COPY INTO TABLE statement; second, the
stage definition; and third, the table definition.
Questions 153

What effect does WAIT_FOR_COMPLETION = TRUE have when running an ALTER
WAREHOUSE command and changing the warehouse size?

Options:

A.

The warehouse size does not change until all queries currently running in the warehouse have
completed.

B.

The warehouse size does not change until all queries currently in the warehouse queue have
completed.

C.

The warehouse size does not change until the warehouse is suspended and restarted.

D.

It does not return from the command until the warehouse has finished changing its size.

Answer:

D

Explanation:

The WAIT_FOR_COMPLETION = TRUE parameter in an ALTER WAREHOUSE command ensures
that the command does not return until the warehouse has completed resizing. This means that
the command will wait until all the necessary compute resources have been provisioned and the
warehouse size has been changed. References: [COF-C02] SnowPro Core Certification Exam
Study Guide
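
For example (warehouse name illustrative):

```sql
-- The statement blocks until the new LARGE compute resources
-- are fully provisioned, instead of returning immediately.
ALTER WAREHOUSE my_wh SET
  WAREHOUSE_SIZE = 'LARGE'
  WAIT_FOR_COMPLETION = TRUE;
```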

Questions 154

What action can a user take to address query concurrency issues?


Options:

A.

Enable the query acceleration service.

B.

Enable the search optimization service.

C.

Add additional clusters to the virtual warehouse

D.

Resize the virtual warehouse to a larger instance size.

Answer:

C

Explanation:

To address query concurrency issues, a user can add additional clusters to the virtual
warehouse. This allows for the distribution of queries across multiple clusters, reducing the load
on any single cluster and improving overall query performance.

Questions 155

A user needs to create a materialized view in the schema MYDB.MYSCHEMA. Which statements
will provide this access?

Options:

A.

GRANT ROLE MYROLE TO USER USER1;

GRANT CREATE MATERIALIZED VIEW ON SCHEMA MYDB.MYSCHEMA TO ROLE MYROLE;

B.

GRANT ROLE MYROLE TO USER USER1;


GRANT CREATE MATERIALIZED VIEW ON SCHEMA MYDB.MYSCHEMA TO USER USER1;

C.

GRANT ROLE MYROLE TO USER USER1;

GRANT CREATE MATERIALIZED VIEW ON SCHEMA MYDB.MYSCHEMA TO USER1;

D.

GRANT ROLE MYROLE TO USER USER1;

GRANT CREATE MATERIALIZED VIEW ON SCHEMA MYDB.MYSCHEMA TO MYROLE;

Answer:

A

Explanation:

To provide a user with the necessary access to create a materialized view in a schema, the user
must be granted a role that has the CREATE MATERIALIZED VIEW privilege on that
schema. First, the role is granted to the user, and then the privilege is granted to the role

Questions 156

Which REST API can be used with unstructured data?

Options:

A.

insertFiles

B.

insertReport

C.

GET /api/files/

D.

loadHistoryScan
Answer:

C

Explanation:

The REST API used with unstructured data in Snowflake is GET /api/files/, which retrieves
(downloads) a data file from an internal or external stage.

Questions 157

For the ALLOWED_VALUES tag property, what is the MAXIMUM number of possible string
values for a single tag?

Options:

A.

10

B.

50

C.

64

D.

256

Answer:

D

Explanation:
For the ALLOWED_VALUES tag property, the maximum number of possible string values for a
single tag is 256. This allows for a wide range of values to be assigned to a tag when it is set on an
object.

Questions 158

What is the recommended compressed file size range for continuous data loads using Snowpipe?

Options:

A.

8-16 MB

B.

16-24 MB

C.

10-99 MB

D.

100-250 MB

Answer:

D

Explanation:

For continuous data loads using Snowpipe, the recommended compressed file size range is
between 100-250 MB. This size range is suggested to optimize the number of parallel operations
for a load and to avoid size limitations, ensuring efficient and cost-effective data loading.

Questions 159

What internal stages are available in Snowflake? (Choose three.)

Options:

A.
Schema stage

B.

Named stage

C.

User stage

D.

Stream stage

E.

Table stage

F.

Database stage

Answer:

B, C, E

Explanation:

Snowflake supports three types of internal stages: Named, User, and Table stages. These stages
are used for staging data files to be loaded into Snowflake tables. Schema, Stream, and Database
stages are not supported as internal stages in Snowflake. References: Snowflake
Documentation.

Questions 160

How can a user change which columns are referenced in a view?

Options:

A.

Modify the columns in the underlying table

B.
Use the ALTER VIEW command to update the view

C.

Recreate the view with the required changes

D.

Materialize the view to perform the changes

Answer:

C

Explanation:

In Snowflake, to change the columns referenced in a view, the view must be recreated with the
required changes. The ALTER VIEW command does not allow changing the definition of a view; it
can only be used to rename a view, convert it to or from a secure view, or add, overwrite, or
remove a comment for a view. Therefore, the correct approach is to drop the existing view and
create a new one with the desired column references.
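
A sketch of the recreate approach; the view and column names are illustrative:

```sql
-- ALTER VIEW cannot change the column list, so replace the view.
CREATE OR REPLACE VIEW sales_v AS
SELECT order_id, customer_id, total_amount  -- updated column list
FROM sales;
```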

Questions 161

How does Snowflake recommend handling the bulk loading of data batches from files already
available in cloud storage?

Options:

A.

Use Snowpipe.

B.

Use the INSERT command.

C.

Use an external table.

D.

Use the COPY command.


Answer:

D

Explanation:

Snowflake recommends using the COPY command for bulk loading data batches from files
already available in cloud storage. This command allows for efficient and large-scale data loading
operations from files staged in cloud storage into Snowflake tables.
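
A sketch of such a bulk load; the table, stage, and format options are illustrative:

```sql
-- Load all staged CSV files under batch/ into my_table.
COPY INTO my_table
FROM @my_stage/batch/
FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
```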

Questions 162

What are benefits of using Snowpark with Snowflake? (Select TWO).

Options:

A.

Snowpark uses a Spark engine to generate optimized SQL query plans.

B.

Snowpark automatically sets up Spark within Snowflake virtual warehouses.

C.

Snowpark does not require that a separate cluster be running outside of Snowflake.

D.

Snowpark allows users to run existing Spark code on virtual warehouses without the need to
reconfigure the code.

E.

Snowpark executes as much work as possible in the source databases for all operations including
User-Defined Functions (UDFs).

Answer:

C, D

Explanation:

Snowpark is designed to bring data programmability to Snowflake, enabling developers to
write code in familiar languages like Scala, Java, and Python. It allows for the execution of these
codes directly within Snowflake’s virtual warehouses, eliminating the need for a separate
cluster. Additionally, Snowpark’s compatibility with Spark allows users to leverage their existing
Spark code with minimal changes.

Questions 163

A Snowflake user executed a query and received the results. Another user executed the same
query 4 hours later. The data had not changed.

What will occur?

Options:

A.

No virtual warehouse will be used, data will be read from the result cache.

B.

No virtual warehouse will be used, data will be read from the local disk cache.

C.

The default virtual warehouse will be used to read all data.

D.

The virtual warehouse that is defined at the session level will be used to read all data.

Answer:

A

Explanation:

Snowflake maintains a result cache that stores the results of every query for 24 hours. If the
same query is executed again within this time frame and the data has not changed, Snowflake will
retrieve the data from the result cache instead of using a virtual warehouse to recompute the
results.

Questions 164

What is cached during a query on a virtual warehouse?

Options:

A.

All columns in a micro-partition

B.

Any columns accessed during the query

C.

The columns in the result set of the query

D.

All rows accessed during the query

Answer:

C

Explanation:

During a query on a virtual warehouse, the columns in the result set of the query are cached. This
allows for faster retrieval of data if the same or a similar query is run again, as the system can
retrieve the data from the cache rather than reprocessing the entire query. References: [COF-
C02] SnowPro Core Certification Exam Study Guide

Questions 165

Which native data types are used for storing semi-structured data in Snowflake? (Select TWO)

Options:

A.
NUMBER

B.

OBJECT

C.

STRING

D.

VARCHAR

E.

VARIANT

Answer:

B, E

Explanation:

Snowflake supports semi-structured data types, which include OBJECT and VARIANT. These
data types are capable of storing JSON-like data structures, allowing for flexibility in data
representation. OBJECT can directly contain VARIANT, and thus indirectly contain any other data
type, including itself.
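
A minimal sketch of storing and querying semi-structured data; the names are illustrative:

```sql
CREATE TABLE events (raw VARIANT);

-- PARSE_JSON produces a VARIANT value from a JSON string.
INSERT INTO events
SELECT PARSE_JSON('{"user": {"id": 7, "name": "ana"}, "type": "click"}');

-- Path notation navigates the nested structure.
SELECT raw:user.name::STRING AS user_name FROM events;
```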

Questions 166

Which statement accurately describes a characteristic of a materialized view?

Options:

A.

A materialized view can query only a single table.

B.

Data accessed through materialized views can be stale.

C.
Materialized view refreshes need to be maintained by the user.

D.

Querying a materialized view is slower than executing a query against the base table of the view.

Answer:

B

Explanation:

A characteristic of a materialized view is that the data accessed through it can be stale. This is
because the data in a materialized view may not reflect the latest changes in the base tables until
the view is refreshed.

Questions 167

Which parameter can be used to instruct a COPY command to verify data files instead of loading
them into a specified table?

Options:

A.

STRIP_NULL_VALUES

B.

SKIP_BYTE_ORDER_MARK

C.

REPLACE_INVALID_CHARACTERS

D.

VALIDATION_MODE

Answer:

D

Explanation:

The VALIDATION_MODE parameter can be used with the COPY command to verify data files
without loading them into the specified table. This parameter allows users to check for errors in
the files.
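
A sketch of a validation-only run; the table and stage names are illustrative:

```sql
-- Returns the errors the load would hit, without loading any rows.
COPY INTO my_table
FROM @my_stage/data/
FILE_FORMAT = (TYPE = CSV)
VALIDATION_MODE = RETURN_ERRORS;
```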

Questions 168

A materialized view should be created when which of the following occurs? (Choose two.)

Options:

A.

There is minimal cost associated with running the query.

B.

The query consumes many compute resources every time it runs.

C.

The base table gets updated frequently.

D.

The query is highly optimized and does not consume many compute resources.

E.

The results of the query do not change often and are used frequently.

Answer:

B, E

Explanation:
A materialized view is beneficial when the query consumes many compute resources every time it
runs (B), and when the results of the query do not change often and are used frequently (E). This
is because materialized views store pre-computed data, which can speed up query performance
for workloads that are run frequently or are complex

Questions 169

A data provider wants to share data with a consumer who does not have a Snowflake account.
The provider creates a reader account for the consumer following these steps:

1. Created a user called "CONSUMER"

2. Created a database to hold the share and an extra-small warehouse to query the data

3. Granted the role PUBLIC the following privileges: Usage on the warehouse, database, and
schema, and SELECT on all the objects in the share

Based on this configuration what is true of the reader account?

Options:

A.

The reader account will automatically use the Standard edition of Snowflake.

B.

The reader account compute will be billed to the provider account.

C.

The reader account can clone data the provider has shared, but cannot re-share it.

D.

The reader account can create a copy of the shared data using CREATE TABLE AS...

Answer:

B

Explanation:
The reader account compute will be billed to the provider account. Very Comprehensive
Explanation: In Snowflake, when a provider creates a reader account for a consumer who does
not have a Snowflake account, the compute resources used by the reader account are billed to
the provider’s account. This allows the consumer to query the shared data without incurring any
costs. References: [COF-C02] SnowPro Core Certification Exam Study Guide
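As a sketch, a reader account like the one described above is created with the CREATE MANAGED
ACCOUNT command (the account name, admin name, and password below are assumptions):

```sql
CREATE MANAGED ACCOUNT reader_acct
  ADMIN_NAME = consumer_admin,
  ADMIN_PASSWORD = '<password>',
  TYPE = READER;
```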

Questions 170

What file formats does Snowflake support for loading semi-structured data? (Choose three.)

Options:

A.

TSV

B.

JSON

C.

PDF

D.

Avro

E.

Parquet

F.

JPEG


Answer:

B, D, E

Explanation:

Snowflake supports several semi-structured data formats for loading data. The supported
formats include JSON, Avro, and Parquet. These formats allow for efficient storage and
querying of data that does not conform to a traditional relational database schema.
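For example, one of these formats can be declared in a named file format and used during a load
(the object names are illustrative):

```sql
CREATE FILE FORMAT my_json_format TYPE = JSON;
COPY INTO my_table FROM @my_stage
  FILE_FORMAT = (FORMAT_NAME = 'my_json_format');
```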

Questions 171

Which Snowflake tool would be BEST to troubleshoot network connectivity?

Options:

A.

SnowCLI

B.

SnowUI

C.

SnowSQL

D.

SnowCD

Answer:

D

Explanation:

SnowCD (Snowflake Connectivity Diagnostic Tool) is the best tool provided by Snowflake for
troubleshooting network connectivity issues. It helps diagnose and resolve issues related to
connecting to Snowflake services.

https://docs.snowflake.com/en/user-guide/snowcd.html

Questions 172
What is the recommended way to change the existing file format type in my_format from CSV to
JSON?

Options:

A.

ALTER FILE FORMAT my_format SET TYPE = JSON;

B.

ALTER FILE FORMAT my_format SWAP TYPE WITH JSON;

C.

CREATE OR REPLACE FILE FORMAT my_format TYPE = JSON;

D.

REPLACE FILE FORMAT my_format TYPE = JSON;

Answer:

C

Explanation:

The TYPE property of a file format cannot be changed with ALTER FILE FORMAT, so the
recommended way to change my_format from CSV to JSON is to recreate the format with
CREATE OR REPLACE FILE FORMAT. References: Snowflake documentation on ALTER FILE
FORMAT and CREATE FILE FORMAT.
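If the format is recreated rather than altered, a minimal sketch looks like this (the
STRIP_OUTER_ARRAY option is illustrative, not part of the question):

```sql
CREATE OR REPLACE FILE FORMAT my_format
  TYPE = JSON
  STRIP_OUTER_ARRAY = TRUE;
```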

Questions 173

Which user object property requires contacting Snowflake Support in order to set a value for it?

Options:

A.

DISABLED

B.

MINS TO BYPASS MFA

C.

MINS TO BYPASS NETWORK POLICY

D.

MINS TO UNLOCK

Answer:

C

Explanation:

The user property MINS_TO_BYPASS_NETWORK_POLICY can only be set by Snowflake, so
contacting Snowflake Support is required to set a value for it. In contrast, a property such as
MINS_TO_BYPASS_MFA can be set by an account administrator without contacting Snowflake
Support.

Questions 174

A view is defined on a permanent table. A temporary table with the same name is created in the
same schema as the referenced table. What will the query from the view return?

Options:

A.

The data from the permanent table.

B.

The data from the temporary table.

C.

An error stating that the view could not be compiled.

D.

An error stating that the referenced object could not be uniquely identified.

Answer:

A

Explanation:

When a view is defined on a permanent table, and a temporary table with the same name is
created in the same schema, the query from the view will return the data from the permanent
table. Temporary tables are session-specific and do not affect the data returned by views defined
on permanent tables.

Questions 175

Which type of join will list all rows in the specified table, even if those rows have no match in the
other table?

Options:

A.

Cross join

B.

Inner join

C.

Natural join

D.

Outer join

Answer:

D

Explanation:

An outer join, specifically a left outer join, will list all rows from the left table and match them
with rows from the right table. If there is no match, the result will still include the row from the
left table, with NULLs for columns from the right table. References: Based on general SQL
knowledge as of 2021.
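A sketch of a left outer join (the table and column names are illustrative):

```sql
SELECT c.customer_name, o.order_id
FROM customers c
LEFT OUTER JOIN orders o
  ON c.customer_id = o.customer_id;
-- Customers with no orders still appear, with NULL in order_id.
```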

Questions 176

What is the difference between a stored procedure and a User-Defined Function (UDF)?

Options:

A.

Stored procedures can execute database operations while UDFs cannot.

B.

Returning a value is required in a stored procedure while returning values in a UDF is optional.

C.

Values returned by a stored procedure can be used directly in a SQL statement while the values
returned by a UDF cannot.

D.

Multiple stored procedures can be called as part of a single executable statement while a single
SQL statement can only call one UDF at a time.

Answer:

A

Explanation:

Stored procedures in Snowflake can perform a variety of database operations, including DDL and
DML, whereas UDFs are designed to return values and cannot execute database operations.
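A sketch contrasting the two object types (the names and bodies are illustrative):

```sql
-- A UDF returns a value and can be used directly in a SQL expression.
CREATE FUNCTION area(radius FLOAT)
  RETURNS FLOAT
  AS 'PI() * radius * radius';

-- A stored procedure can run DML and is invoked with CALL.
CREATE PROCEDURE purge_old_rows()
  RETURNS VARCHAR
  LANGUAGE SQL
AS
$$
BEGIN
  DELETE FROM my_table WHERE created_at < DATEADD(year, -1, CURRENT_DATE());
  RETURN 'done';
END
$$;
```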

Questions 177

Snowflake's hierarchical key model includes which keys? (Select TWO).


Options:

A.

Account master keys

B.

Database master keys

C.

File keys

D.

Secure view keys

E.

Schema master keys


Answer:

A, C

Explanation:

Snowflake's hierarchical key model includes several levels of keys, where account master keys
and file keys are part of this hierarchy. Account master keys are used to encrypt all the data
within an account, while file keys are used to encrypt individual files within the database.

Questions 178

Which kind of Snowflake table stores file-level metadata for each file in a stage?

Options:

A.

Directory

B.

External

C.

Temporary

D.

Transient

Answer:

A

Explanation:

A directory table stores file-level metadata, such as the file path, size, and last-modified
timestamp, for each file in the stage it is attached to.

Questions 179

What is the minimum Snowflake edition needed for database failover and fail-back between
Snowflake accounts for business continuity and disaster recovery?

Options:

A.

Standard

B.

Enterprise

C.

Business Critical

D.

Virtual Private Snowflake

Answer:

C

Explanation:

The minimum Snowflake edition required for database failover and fail-back between Snowflake
accounts for business continuity and disaster recovery is the Business Critical edition. References:
Snowflake Documentation.

Questions 180
Which database objects can be shared with the Snowflake secure data sharing feature? (Choose
two.)

Options:

A.

Files

B.

External tables

C.

Secure User-Defined Functions (UDFs)

D.

Sequences

E.

Streams


Answer:

B, C

Explanation:

Snowflake's secure data sharing feature allows sharing of certain database objects with other
Snowflake accounts. Among the options provided, external tables and secure UDFs can be shared.
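Granting a secure UDF to a share might look like this sketch (the object names and function
signature are assumptions):

```sql
CREATE SHARE my_share;
GRANT USAGE ON DATABASE my_db TO SHARE my_share;
GRANT USAGE ON SCHEMA my_db.my_schema TO SHARE my_share;
GRANT USAGE ON FUNCTION my_db.my_schema.my_secure_udf(FLOAT) TO SHARE my_share;
```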

Questions 181

Which operations are handled in the Cloud Services layer of Snowflake? (Select TWO).

Options:

A.

Security

B.
Data storage

C.

Data visualization

D.

Query computation

E.

Metadata management


Answer:

A, E

Explanation:

The Cloud Services layer in Snowflake is responsible for various services, including security (like
authentication and authorization) and metadata management (like query parsing and
optimization). References: Based on general cloud architecture knowledge as of 2021.

Q1. What can you easily check to see if a large table will
benefit from explicitly defining a clustering key?

A. Clustering depth

B. Clustering ratio

C. Values in a table
Answer: A. The clustering depth measures the average depth of the
overlapping micro-partitions for specified columns in a table (1 or
greater). The smaller the cluster depth is, the better clustered the
table is. You can get the clustering depth of a Snowflake table using
this command:

SELECT SYSTEM$CLUSTERING_DEPTH('TABLE NAME');

Further Reading: https://docs.snowflake.com/en/sql-reference/functions/system_clustering_depth

Q2. A query executed a couple of hours ago, which took
more than 5 minutes to run, is executed again, and it
returned the results in less than a second. What might
have happened?

A. Snowflake used the persisted query results from the metadata
cache

B. Snowflake used the persisted query results from the query result
cache

C. Snowflake used the persisted query results from the warehouse
cache

D. A new Snowflake version has been released in the last two hours,
improving the speed of the service
Answer: B. The query result cache stores the results of our queries
for 24 hours, so as long as we perform the same query and the data
hasn’t changed in the storage layer, it will return the same result
without using the warehouse and without consuming credits.
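This behavior can be confirmed by disabling the result cache for the session with the
USE_CACHED_RESULT parameter:

```sql
ALTER SESSION SET USE_CACHED_RESULT = FALSE;
-- Re-running the query now bypasses the 24-hour result cache.
```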

Q3. Which type of data incurs Snowflake storage costs?

A. Data Stored in permanent tables.

B. Data Stored in temporary tables.

C. Cache results.

D. Data retained for Fail-Safe & Time-Travel.

Answer: A, B, D. Storage fees are incurred for maintaining
historical data during both the Time Travel and Fail-safe periods. To
help manage the storage costs, Snowflake provides temporary and
transient tables, which do not incur the same fees as permanent
tables but also incur charges. We can also include data stored in
Snowflake locations (i.e., user and table stages or internal named
stages).

Q4. What option will you specify to delete the stage files
after a successful load into a Snowflake table with the
COPY INTO command?

A. DELETE = TRUE
B. REMOVE = TRUE

C. PURGE = TRUE

D. TRUNCATE = TRUE

Answer: C. If the PURGE option is set to TRUE, Snowflake will try
its best to remove successfully loaded data files from stages. If the
purge operation fails for any reason, no error is returned:

COPY INTO <TABLE NAME> PURGE = TRUE;

Q5. We need to temporarily store intermediate data, which
an ETL process will only use. We don't need the data
outside the ETL process. If you want to optimize storage
cost, what type of table will you create to store this data?

A. Permanent

B. Temporary

C. Transient

D. External
Answer: B. With temporary tables, you can optimize storage costs:
when the Snowflake session ends, data stored in the table is
entirely purged from the system. They do incur storage costs
while the session is active, and the Time Travel retention period is
24 hours or the remainder of the session, whichever is shorter.
Question # 1

When does a materialized view get suspended in Snowflake?

A.

When a column is added to the base table

B.

When a column is dropped from the base table

C.

When a DML operation is run on the base table

D.

When the base table is reclustered



Answer:

B
Explanation:

A materialized view in Snowflake gets suspended when structural changes that could impact the
view's integrity are made to the base table, such as B. When a column is dropped from the base
table. Dropping a column from the base table on which a materialized view is defined can
invalidate the view's data, as the view might rely on the column that is being removed. To
maintain data consistency and prevent the materialized view from serving stale or incorrect data,
Snowflake automatically suspends the materialized view.

Upon suspension, the materialized view does not reflect changes to the base table until it is
refreshed or re-created. This ensures that only accurate and current data is presented to users
querying the materialized view.

References:

• Snowflake Documentation on Materialized Views: Materialized Views
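A sketch of a materialized view over a base table (the names are illustrative; materialized views
require Enterprise Edition or higher):

```sql
CREATE MATERIALIZED VIEW mv_region_totals AS
  SELECT region, SUM(amount) AS total_amount
  FROM sales
  GROUP BY region;
```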

Question # 2
Which task is supported by the use of Access History in Snowflake?

A.

Data backups

B.

Cost monitoring

C.

Compliance auditing

D.

Performance optimization

Answer:

C
Explanation:

Access History in Snowflake is primarily utilized for compliance auditing. The Access History
feature provides detailed logs that track data access and modifications, including queries that read
from or write to database objects. This information is crucial for organizations to meet regulatory
requirements and to perform audits related to data access and usage.

• Role of Access History: Access History logs are designed to help organizations understand
who accessed what data and when. This is particularly important for compliance with
various regulations that require detailed auditing capabilities.
• How Access History Supports Compliance Auditing:

Reference: For more information on how Access History supports compliance auditing, refer to
the Snowflake documentation on Access History:
https://docs.snowflake.com/en/sql-reference/account-usage/access_history.html
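A sketch of an audit query against this view (the time window is illustrative):

```sql
SELECT query_id, user_name, query_start_time, direct_objects_accessed
FROM SNOWFLAKE.ACCOUNT_USAGE.ACCESS_HISTORY
WHERE query_start_time > DATEADD(day, -7, CURRENT_TIMESTAMP());
```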
Question # 3

The property mins_to_bypass_network_policy is set at which level?

A.

User

B.

Role

C.

Account

D.

Organization


Answer:

C
Explanation:

The property mins_to_bypass_network_policy is set at the account level in Snowflake. This
setting allows administrators to specify a time frame during which users can bypass network
policies that have been set on their account. It is particularly useful in scenarios where temporary
access needs to be granted from IPs not covered by the existing network policies. By adjusting
this property at the account level, Snowflake administrators can manage and enforce network
access controls efficiently across the entire account.

References:

• Snowflake Documentation on Network Policies: Network Policies

Question # 4
In which hierarchy is tag inheritance possible?

A.

Organization » Account» Role

B.

Account » User » Schema

C.

Database » View » Column

D.

Schema » Table » Column


Answer:

D
Explanation:

In Snowflake, tag inheritance is a feature that allows tags, which are key-value pairs assigned to
objects for the purpose of data governance and metadata management, to be inherited within a
hierarchy. The hierarchy in which tag inheritance is possible is from Schema to Table to Column.
This means that a tag applied to a schema can be inherited by the tables within that schema, and a
tag applied to a table can be inherited by the columns within that table. References: Snowflake
Documentation on Tagging and Object Hierarchy
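A sketch of tag inheritance down the Schema » Table » Column hierarchy (the tag and object
names are assumptions):

```sql
CREATE TAG cost_center ALLOWED_VALUES 'finance', 'engineering';
ALTER SCHEMA my_db.my_schema SET TAG cost_center = 'finance';
-- Tables and columns in my_schema inherit the tag value unless they set their own.
```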

Question # 5

What command is used to export or unload data from Snowflake?

A.

PUT @mystage

B.

GET @mystage

C.

COPY INTO @mystage

D.

INSERT @mystage

Answer:

C

Explanation:

The command used to export or unload data from Snowflake to a stage (such as a file in an S3
bucket, Azure Blob Storage, or Google Cloud Storage) is the COPY INTO <location> command, as
in COPY INTO @mystage FROM <table>. This functionality is critical for scenarios where data
needs to be extracted from Snowflake for use in external systems, backups, or further processing.

It's important to distinguish the related commands: the PUT command uploads data files from a
local file system to a stage, the GET command downloads files from a stage to the local file
system, and the COPY INTO <table> command loads data into Snowflake from a stage. Only
COPY INTO <location> unloads (exports) table data out of Snowflake.
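For reference, a sketch of unloading a table to a named stage (the stage, path, and table names are
assumptions):

```sql
COPY INTO @mystage/unload/
FROM my_table
FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP);
-- The resulting files can then be downloaded locally with GET.
```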

Question # 6

A user with which privileges can create or manage other users in a Snowflake account? (Select
TWO).

A.

GRANT

B.

SELECT

C.

MODIFY

D.

OWNERSHIP

E.

CREATE USER


Answer:

D, E
Explanation:

A user with the OWNERSHIP privilege on a user object or the CREATE USER privilege on the
account can create or manage other users in a Snowflake account.

Question # 7

For the ALLOWED VALUES tag property, what is the MAXIMUM number of possible string
values for a single tag?

A.

10
B.

50

C.

64

D.

256


Answer:

D
Explanation:

For the ALLOWED_VALUES tag property, the maximum number of possible string values for a
single tag is 256. This allows for a wide range of values to be assigned to a tag when it is set on an
object.

Question # 8

Which view can be used to determine if a table has frequent row updates or deletes?

A.

TABLES

B.

TABLE_STORAGE_METRICS

C.

STORAGE_DAILY_HISTORY

D.

STORAGE USAGE



Answer:

B
Explanation:

The TABLE_STORAGE_METRICS view can be used to determine if a table has frequent row
updates or deletes. This view provides detailed metrics on the storage utilization of tables within
Snowflake, including metrics that reflect the impact of DML operations such as updates and
deletes on table storage. For example, metrics related to the number of active and deleted rows
can help identify tables that experience high levels of row modifications, indicating frequent
updates or deletions.

References:

• Snowflake Documentation: TABLE_STORAGE_METRICS View
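A sketch of using this view: tables whose Time Travel and Fail-safe bytes are large relative to their
active bytes are candidates for frequent updates or deletes (the database name is an assumption):

```sql
SELECT table_name, active_bytes, time_travel_bytes, failsafe_bytes
FROM SNOWFLAKE.ACCOUNT_USAGE.TABLE_STORAGE_METRICS
WHERE table_catalog = 'MY_DB'
ORDER BY failsafe_bytes DESC;
```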

Question # 9
What type of columns does Snowflake recommend to be used as clustering keys? (Select TWO).

A.

A VARIANT column

B.

A column with very low cardinality

C.

A column with very high cardinality

D.

A column that is most actively used in selective filters

E.

A column that is most actively used in join predicates


Answer:

C, D
Explanation:

Snowflake recommends using columns with very high cardinality and those that are most actively
used in selective filters as clustering keys. High cardinality columns have a wide range of unique
values, which helps in evenly distributing the data across micro-partitions. Columns used in
selective filters help in pruning the number of micro-partitions to scan, thus improving query
performance. References: Based on general database optimization principles.
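Defining a clustering key on such columns might look like this sketch (the table and column names
are assumptions):

```sql
ALTER TABLE sales CLUSTER BY (sale_date, customer_id);
-- sale_date and customer_id: high-cardinality columns used in selective filters.
```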
