snowpro_core_dumps
Which of the following are valid Snowflake Virtual Warehouse Scaling Policies? (Choose two.)
A. Custom
B. Economy
C. Optimized
D. Standard
Correct Answer: BD
True or False: A single database can exist in more than one Snowflake account.
A. True
B. False
Correct Answer: B
Which of the following roles is recommended to be used to create and manage users and roles?
A. SYSADMIN
B. SECURITYADMIN
C. PUBLIC
D. ACCOUNTADMIN
Correct Answer: B
Question #5 Topic 1
True or False: Bulk unloading of data from Snowflake supports the use of a SELECT statement.
A. True
B. False
Correct Answer: A
True or False: A customer using SnowSQL / native connectors will be unable to also use the Snowflake
Web Interface (UI) unless access to the UI is explicitly granted by support.
A. True
B. False
Correct Answer: B
Question #8 Topic 1
Account-level storage usage can be monitored via:
A. The Snowflake Web Interface (UI) in the Databases section
B. The Snowflake Web Interface (UI) in the Account -> Billing & Usage section
C. The Information Schema -> ACCOUNT_USAGE_HISTORY View
D. The Account Usage Schema -> ACCOUNT_USAGE_METRICS View
Correct Answer: B
Credit Consumption by the Compute Layer (Virtual Warehouses) is based on: (Choose two.)
A. Number of users
B. Warehouse size
C. Amount of data processed
D. # of clusters for the Warehouse Most Voted
Correct Answer: BD
True or False: The COPY command must specify a File Format in order to execute.
A. True
B. False Most Voted
Correct Answer: B
Which of the following commands sets the Virtual Warehouse for a session?
A. COPY WAREHOUSE FROM <<config file>>;
B. SET WAREHOUSE = <<warehouse name>>;
C. USE WAREHOUSE <<warehouse name>>;
D. USE VIRTUAL_WAREHOUSE <<warehouse name>>;
Correct Answer: C
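A minimal sketch of the correct command (the warehouse name `my_wh` is hypothetical):

```sql
-- Set the active virtual warehouse for the current session
USE WAREHOUSE my_wh;

-- Verify which warehouse the session is now using
SELECT CURRENT_WAREHOUSE();
```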
Which object allows you to limit the number of credits consumed within a Snowflake account?
A. Account Usage Tracking
B. Resource Monitor
C. Warehouse Limit Parameter
D. Credit Consumption Tracker
Correct Answer: B
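A sketch of how a Resource Monitor caps credit consumption (names and quota are hypothetical):

```sql
-- Cap the account at 100 credits per month:
-- notify at 90%, suspend all warehouses at 100%
CREATE RESOURCE MONITOR credit_cap
  WITH CREDIT_QUOTA = 100
  FREQUENCY = MONTHLY
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS ON 90 PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND;

-- Attach the monitor at the account level (requires ACCOUNTADMIN)
ALTER ACCOUNT SET RESOURCE_MONITOR = credit_cap;
```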
Question #15 Topic 1
Snowflake is designed for which type of workloads? (Choose two.)
A. OLAP (Analytics) workloads Most Voted
B. OLTP (Transactional) workloads
C. Concurrent workloads Most Voted
D. On-premise workloads
Correct Answer: AC
Which of the following statements describes a benefit of Snowflake's separation of compute and storage?
(Choose all that apply.)
A. Growth of storage and compute are tightly coupled together
B. Storage expands without the requirement to add more compute Most Voted
C. Compute can be scaled up or down without the requirement to add more storage Most Voted
D. Multiple compute clusters can access stored data without contention Most Voted
Correct Answer: BCD
What is the minimum Snowflake edition that customers planning on storing protected information in
Snowflake should consider for regulatory compliance?
A. Standard
B. Premier
C. Enterprise
D. Business Critical Edition Most Voted
Correct Answer: D
True or False: When a data share is established between a Data Provider and a Data Consumer, the Data
Consumer can extend that data share to other Data
Consumers.
A. True
B. False
Correct Answer: B
Question #60 Topic 1
The Query History in the Snowflake Web Interface (UI) is kept for approximately:
A. 60 minutes
B. 24 hours
C. 14 days Most Voted
D. 30 days
E. 1 year
Correct Answer: C
Question #73 Topic 1
True or False: The user has to specify which cluster a query will run on in a multi-cluster Warehouse.
A. True
B. False Most Voted
Correct Answer: B
Question #80 Topic 1
The number of queries that a Virtual Warehouse can concurrently process is determined by (Choose
two.):
A. The complexity of each query Most Voted
B. The CONCURRENT_QUERY_LIMIT parameter set on the Snowflake account
C. The size of the data required for each query Most Voted
D. The tool that is executing the query
Correct Answer: AC
Which of the following statements are true of VALIDATION_MODE in Snowflake? (Choose two.)
A. The VALIDATION_MODE option is used when creating an Internal Stage
B. VALIDATION_MODE=RETURN_ALL_ERRORS is a parameter of the COPY command Most Voted
C. The VALIDATION_MODE option will validate data to be loaded by the COPY statement while
completing the load and will return the rows that could not be loaded without error
D. The VALIDATION_MODE option will validate data to be loaded by the COPY statement without
completing the load and will return possible errors Most Voted
Correct Answer: BD
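A sketch of VALIDATION_MODE as a dry run (table and stage names are hypothetical):

```sql
-- Validate the staged files and report every error,
-- without loading any rows into the table
COPY INTO my_table
  FROM @my_stage/data/
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
  VALIDATION_MODE = RETURN_ALL_ERRORS;
```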
True or False: AWS PrivateLink provides a secure connection from the customer's on-premises data
center to Snowflake.
A. True
B. False Most Voted
Correct Answer: B
Question #93 Topic 1
True or False: It is best practice to define a clustering key on every table.
A. True
B. False
Correct Answer: B
True or False: When a new Snowflake object is created, it is automatically owned by the user who created
it.
A. True
B. False Most Voted
Correct Answer: B
True or False: A Virtual Warehouse consumes Snowflake credits even when inactive.
A. True
B. False
Correct Answer: B
True or False: During data unloading, only JSON and CSV files can be compressed.
• A. True
• B. False Most Voted
Correct Answer: B 🗳️
Which of the following are options when creating a Virtual Warehouse? (Choose two.)
• A. Auto-suspend
• B. Auto-resume
• C. Local SSD size
• D. User count
Correct Answer: AB 🗳️
Which formats are supported for unloading data from Snowflake? (Choose two.)
Correct Answer: AC 🗳️
True or False: Data Providers can share data with only the Data Consumer.
• A. True
• B. False Most Voted
Correct Answer: B 🗳️
• A. 1 day
• B. 7 days Most Voted
• C. 45 days
• D. 90 days
Correct Answer: B 🗳️
What services does Snowflake automatically provide for customers that they may have been
responsible for with their on-premise system? (Choose all that apply.)
Which of the following statements would be used to export/unload data from Snowflake?
Correct Answer: A 🗳️
True or False: A 4X-Large Warehouse may, at times, take longer to provision than a X-Small
Warehouse.
Correct Answer: A 🗳️
How would you determine the size of the virtual warehouse used for a task?
Correct Answer: C 🗳️
The Information Schema and Account Usage Share provide storage information for which of
the following objects? (Choose three.)
• A. Users
• B. Tables Most Voted
• C. Databases Most Voted
• D. Internal Stages Most Voted
Correct Answer: BCD 🗳️
What is the default File Format used in the COPY command if one is not specified?
• A. CSV
• B. JSON
• C. Parquet
• D. XML
Correct Answer: A 🗳️
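A sketch showing that CSV is the implicit default (table, stage, and file names are hypothetical):

```sql
-- With no FILE_FORMAT clause, COPY parses the staged file as CSV
COPY INTO my_table FROM @my_stage/sales.csv;

-- Equivalent, with the default stated explicitly
COPY INTO my_table
  FROM @my_stage/sales.csv
  FILE_FORMAT = (TYPE = 'CSV');
```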
True or False: Reader Accounts are able to extract data from shared data objects for use
outside of Snowflake.
Correct Answer: A 🗳️
True or False: You can define multiple columns within a clustering key on a table.
Correct Answer: A 🗳️
True or False: Snowflake enforces unique, primary key, and foreign key constraints during DML
operations.
• A. True
• B. False Most Voted
Correct Answer: B 🗳️
True or False: Loading data into Snowflake requires that source data files be no larger than
16MB.
• A. True
• B. False Most Voted
Correct Answer: B 🗳️
True or False: When you create a custom role, it is a best practice to immediately grant that role
to ACCOUNTADMIN.
• A. True
• B. False Most Voted
Correct Answer: B 🗳️
Which of the following accurately represents how a table fits into Snowflake's logical container
hierarchy?
Correct Answer: B 🗳️
What are two ways to create and manage Data Shares in Snowflake? (Choose two.)
Correct Answer: AC 🗳️
True or False: It is possible for a user to run a query against the query result cache without
requiring an active Warehouse.
Correct Answer: A 🗳️
True or False: When Snowflake is configured to use Single Sign-On (SSO), Snowflake receives
the usernames and credentials from the SSO service and loads them into the customer's
Snowflake account.
• A. True
• B. False Most Voted
Correct Answer: B 🗳️
Which of the following are best practices for loading data into Snowflake? (Choose three.)
• A. Aim to produce data files that are between 100 MB and 250 MB in size,
compressed. Most Voted
• B. Load data from files in a cloud storage service in a different region or cloud platform
from the service or region containing the Snowflake account, to save on cost.
• C. Enclose fields that contain delimiter characters in single or double quotes. Most Voted
• D. Split large files into a greater number of smaller files to distribute the load among
the compute resources in an active warehouse. Most Voted
• E. When planning which warehouse(s) to use for data loading, start with the largest
warehouse possible.
• F. Partition the staged data into large folders with random paths, allowing Snowflake to
determine the best way to load each file.
Correct Answer: ACD 🗳️
Which Snowflake feature is used for both querying and restoring data?
• A. Cluster keys
• B. Time Travel Most Voted
• C. Fail-safe
• D. Cloning
Correct Answer: B 🗳️
What do the terms scale up and scale out refer to in Snowflake? (Choose two.)
• A. Scaling out adds clusters of the same size to a virtual warehouse to handle more
concurrent queries. Most Voted
• B. Scaling out adds clusters of varying sizes to a virtual warehouse.
• C. Scaling out adds additional database servers to an existing running cluster to handle
more concurrent queries.
• D. Snowflake recommends using both scaling up and scaling out to handle more
concurrent queries.
• E. Scaling up resizes a virtual warehouse so it can handle more complex
workloads. Most Voted
• F. Scaling up adds additional database servers to an existing running cluster to handle
larger workloads.
Correct Answer: AE 🗳️
What is the minimum Snowflake edition that has column-level security enabled?
• A. Standard
• B. Enterprise
• C. Business Critical
• D. Virtual Private Snowflake
Correct Answer: B 🗳️
What parameter controls if the Virtual Warehouse starts immediately after the CREATE
WAREHOUSE statement?
• A. INITIALLY_SUSPENDED = TRUE/FALSE
• B. START_AFTER_CREATE = TRUE/FALSE Most Voted
• C. START_TIME = 60 // (seconds from now)
• D. START_TIME = CURRENT_DATE()
Correct Answer: A 🗳️
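A sketch of the real parameter in context (warehouse name and settings are hypothetical):

```sql
-- Create the warehouse without starting it; it resumes on first use
-- because AUTO_RESUME is enabled (the default)
CREATE WAREHOUSE my_wh
  WAREHOUSE_SIZE = 'XSMALL'
  AUTO_SUSPEND = 60
  AUTO_RESUME = TRUE
  INITIALLY_SUSPENDED = TRUE;
```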
When cloning a database, what is cloned with the database? (Choose two.)
Correct Answer: BE 🗳️
Which of the following describes the Snowflake Cloud Services layer?
What is the maximum total Continuous Data Protection (CDP) charges incurred for a temporary
table?
• A. 30 days
• B. 7 days
• C. 48 hours
• D. 24 hours Most Voted
Correct Answer: D 🗳️
When reviewing a query profile, what is a symptom that a query is too large to fit into the
memory?
• A. A single join node uses more than 50% of the query time
• B. Partitions scanned is equal to partitions total
• C. An AggregateOperator node is present
• D. The query is spilling to remote storage Most Voted
Correct Answer: D 🗳️
What transformations are supported in a CREATE PIPE ... AS COPY ... FROM (...) statement?
(Choose two.)
Correct Answer: CD 🗳️
Which of the following are characteristics of Snowflake virtual warehouses? (Choose two.)
• A. Auto-resume applies only to the last warehouse that was started in a multi-cluster
warehouse.
• B. The ability to auto-suspend a warehouse is only available in the Enterprise edition or
above.
• C. SnowSQL supports both a configuration file and a command line option for
specifying a default warehouse. Most Voted
• D. A user cannot specify a default warehouse when using the ODBC driver.
• E. The default virtual warehouse size can be changed at any time.
Correct Answer: C 🗳️
Which command should be used to load data from a file, located in an external stage, into a
table in Snowflake?
• A. PUT
• B. INSERT
• C. GET
• D. COPY
Correct Answer: D 🗳️
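A sketch of COPY loading from an external stage (stage, table, and pattern are hypothetical):

```sql
-- Load only the staged files whose names match a pattern
COPY INTO my_table
  FROM @my_ext_stage
  PATTERN = '.*sales_.*[.]csv'
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);
```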
The Snowflake Cloud Data Platform is described as having which of the following
architectures?
• A. Shared-disk
• B. Shared-nothing
• C. Multi-cluster shared data Most Voted
• D. Serverless query engine
Correct Answer: C 🗳️
What versions of Snowflake should be used to manage compliance with Personal Identifiable
Information (PII) requirements? (Choose two.)
• A. Custom Edition
• B. Virtual Private Snowflake Most Voted
• C. Business Critical Edition Most Voted
• D. Standard Edition
• E. Enterprise Edition
Correct Answer: BC 🗳️
What are supported file formats for unloading data from Snowflake? (Choose three.)
• A. XML
• B. JSON Most Voted
• C. Parquet Most Voted
• D. ORC
• E. AVRO
• F. CSV Most Voted
Correct Answer: BCF 🗳️
The Snowflake cloud services layer is responsible for which tasks? (Choose two.)
Correct Answer: CD 🗳️
• A. Software updates are automatically applied on a quarterly basis.
• B. Zero-copy cloning creates a mirror copy of a database that updates with the original.
• C. Snowflake eliminates resource contention with its virtual warehouse
implementation. Most Voted
• D. Multi-cluster warehouses allow users to run a query that spans across multiple
clusters.
• E. Snowflake automatically sorts DATE columns during ingest, for fast retrieval by date.
Correct Answer: C 🗳️
When publishing a Snowflake Data Marketplace listing into a remote region what should be
taken into consideration? (Choose two.)
• A. There is no need to have a Snowflake account in the target region, a share will be
created for each user.
• B. The listing is replicated into all selected regions automatically, the data is not. Most
Voted
• C. The user must have the ORGADMIN role available in at least one account to link
accounts for replication.
• D. Shares attached to listings in remote regions can be viewed from any account in an
organization.
• E. For a standard listing the user can wait until the first customer requests the data
before replicating it to the target region. Most Voted
Correct Answer: BE 🗳️
When loading data into Snowflake via Snowpipe what is the compressed file size
recommendation?
• A. 10-50 MB
• B. 100-250 MB
• C. 300-500 MB
• D. 1000-1500 MB
Correct Answer: B 🗳️
Which Snowflake feature allows a user to substitute a randomly generated identifier for
sensitive data, in order to prevent unauthorized users access to the data, before loading it into
Snowflake?
Correct Answer: A 🗳️
Which of the following are examples of operations that require a Virtual Warehouse to
complete, assuming no queries have been executed previously? (Choose three.)
What is the SNOWFLAKE.ACCOUNT_USAGE view that contains information about which objects
were read by queries within the last 365 days (1 year)?
• A. VIEWS_HISTORY
• B. OBJECT_HISTORY
• C. ACCESS_HISTORY Most Voted
• D. LOGIN_HISTORY
Correct Answer: C 🗳️
• A. Column-level security
• B. SOC 2 Type II certification
• C. Multi-factor Authentication (MFA)
• D. Object-level access control
Correct Answer: A 🗳️
• A. Possibly, if the warehouse is resized to a smaller size and the cache no longer
fits. Most Voted
• B. Yes, because the compute resource is replaced in its entirety with a new compute
resource.
• C. No, because the size of the cache is independent from the warehouse size.
• D. Yes, because the new compute resource will no longer have access to the cache
encryption key.
Correct Answer: A 🗳️
Which semi-structured file formats are supported when unloading data from a table? (Choose
two.)
• A. ORC
• B. XML
• C. Avro
• D. Parquet Most Voted
• E. JSON Most Voted
Correct Answer: DE 🗳️
• A. 1 second
• B. 60 seconds
• C. 5 minutes
• D. 60 minutes
Correct Answer: B 🗳️
What are the responsibilities of Snowflake's Cloud Service layer? (Choose three.)
How long is the Fail-safe period for temporary and transient tables?
Correct Answer: A 🗳️
Which command should be used to download files from a Snowflake stage to a local folder on
a client's machine?
• A. PUT
• B. GET Most Voted
• C. COPY
• D. SELECT
Correct Answer: B 🗳️
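A sketch of GET in use (stage and local path are hypothetical):

```sql
-- Download files from an internal stage to a local folder
-- (GET runs from a client such as SnowSQL, not from the worksheet UI)
GET @my_int_stage/unload/ file:///tmp/unload/;
```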
A virtual warehouse is created using the following command:
Correct Answer: B 🗳️
• A. Standard
• B. Enterprise
• C. Business Critical
• D. Virtual Private Snowflake Most Voted
Correct Answer: D 🗳️
• A. Role
• B. Schema
• C. User Most Voted
• D. Database
• E. Account Most Voted
• F. Tables
Correct Answer: CE 🗳️
What are the correct parameters for time travel and fail-safe in the Snowflake Enterprise
Edition?
• A. Default Time Travel Retention is set to 0 days. Maximum Time Travel Retention is 30
days. Fail Safe retention time is 1 day.
• B. Default Time Travel Retention is set to 1 day. Maximum Time Travel Retention is 365
days. Fail Safe retention time is 7 days.
• C. Default Time Travel Retention is set to 0 days. Maximum Time Travel Retention is 90
days. Fail Safe retention time is 7 days.
• D. Default Time Travel Retention is set to 1 day. Maximum Time Travel Retention is 90
days. Fail Safe retention time is 7 days. Most Voted
• E. Default Time Travel Retention is set to 7 days. Maximum Time Travel Retention is 1
day. Fail Safe retention time is 90 days.
• F. Default Time Travel Retention is set to 90 days. Maximum Time Travel Retention is 7
days. Fail Safe retention time is 356 days.
Correct Answer: D 🗳️
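A sketch of the Enterprise-edition retention parameters in practice (table name is hypothetical):

```sql
-- Enterprise edition allows raising table retention up to 90 days
ALTER TABLE my_table SET DATA_RETENTION_TIME_IN_DAYS = 90;

-- Query the table as it existed one hour ago (offset is in seconds)
SELECT * FROM my_table AT (OFFSET => -3600);
```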
Which of the following objects are contained within a schema? (Choose two.)
• A. Role
• B. Stream Most Voted
• C. Warehouse
• D. External table Most Voted
• E. User
• F. Share
Correct Answer: BD 🗳️
Which of the following statements describe features of Snowflake data caching? (Choose two.)
• A. When a virtual warehouse is suspended, the data cache is saved on the remote
storage layer.
• B. When the data cache is full, the least-recently used data will be cleared to make
room. Most Voted
• C. A user can only access their own queries from the query result cache.
• D. A user must set USE_METADATA_CACHE to TRUE to use the metadata cache in
queries.
• E. The RESULT_SCAN table function can access and filter the contents of the query
result cache. Most Voted
Correct Answer: BE 🗳️
A table needs to be loaded. The input data is in JSON format and is a concatenation of multiple
JSON documents. The file size is 3 GB. A warehouse size S is being used. The following COPY
INTO command was executed:
COPY INTO SAMPLE FROM @~/SAMPLE.JSON (TYPE=JSON)
The load failed with this error:
Max LOB size (16777216) exceeded, actual size of parsed column is 17894470.
How can this issue be resolved?
Correct Answer: A 🗳️
• A. They can be created as secure and hide the underlying metadata from all users.
• B. They can access tables from a single database.
• C. They can only contain a single SQL statement.
• D. They can be created to run with a caller's rights or an owner's rights. Most Voted
Correct Answer: D 🗳️
Which columns are part of the result set of the Snowflake LATERAL FLATTEN command?
(Choose two.)
• A. CONTENT
• B. PATH Most Voted
• C. BYTE_SIZE
• D. INDEX Most Voted
• E. DATATYPE
Correct Answer: BD 🗳️
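A sketch showing the FLATTEN result columns (table and column names are hypothetical):

```sql
-- PATH and INDEX are among the columns FLATTEN returns
-- (alongside SEQ, KEY, VALUE, and THIS)
SELECT f.path, f.index, f.value
FROM my_json_table t,
     LATERAL FLATTEN(INPUT => t.src) f;
```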
• A. Standard Edition
• B. Enterprise Edition Most Voted
• C. Business Critical Edition
• D. Virtual Private Snowflake Edition
Correct Answer: B 🗳️
Which Snowflake function will interpret an input string as a JSON document, and produce a
VARIANT value?
Correct Answer: B 🗳️
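The options for this question were lost, but the function being tested is PARSE_JSON, which interprets a string as a JSON document and produces a VARIANT. A minimal sketch:

```sql
-- Returns a VARIANT holding the parsed JSON document
SELECT PARSE_JSON('{"city": "Oslo", "temp": -4}') AS v;
```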
• A. Per second multiplied by an automatic sizing for the job Most Voted
• B. Per minute multiplied by an automatic sizing for the job, with a minimum of one
minute
• C. Per second multiplied by the size, as determined by the
SERVERLESS_FEATURES_SIZE account parameter
• D. Serverless features are not billed, unless the total cost for the month exceeds 10% of
the warehouse credits, on the account
Correct Answer: A 🗳️
• A. Compute
• B. Data storage
• C. Cloud services Most Voted
• D. Cloud provider
Correct Answer: C 🗳️
Which SQL commands, when committed, will consume a stream and advance the stream
offset? (Choose two.)
Correct Answer: CD 🗳️
Which methods can be used to delete staged files from a Snowflake stage? (Choose two.)
What Snowflake role must be granted for a user to create and manage accounts?
o A. ACCOUNTADMIN
o B. ORGADMIN
o C. SECURITYADMIN
o D. SYSADMIN
Correct Answer: B 🗳️
Assume there is a table consisting of five micro-partitions with values ranging from A to
Z.
o B.
o C.
o D.
Correct Answer: A 🗳️
What feature can be used to reorganize a very large table on one or more columns?
o A. Micro-partitions
o B. Clustering keys
o C. Key partitions
o D. Clustered partitions
Correct Answer: B 🗳️
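A sketch of defining a clustering key (table and column names are hypothetical):

```sql
-- Define a clustering key on one or more columns of a large table
ALTER TABLE my_big_table CLUSTER BY (region, sale_date);

-- Inspect how well the table is clustered on those columns
SELECT SYSTEM$CLUSTERING_INFORMATION('my_big_table', '(region, sale_date)');
```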
What is an advantage of using an explain plan instead of the query profiler to evaluate
the performance of a query?
Correct Answer: B 🗳️
Which data types are supported by Snowflake when using semi-structured data?
(Choose two.)
o A. VARIANT
o B. VARRAY
o C. STRUCT
o D. ARRAY
o E. QUEUE
Correct Answer: AD 🗳️
Why does Snowflake recommend file sizes of 100-250 MB compressed when loading
data?
Correct Answer: D 🗳️
Which of the following features are available with the Snowflake Enterprise edition?
(Choose two.)
Correct Answer: AD 🗳️
What is the default file size when unloading data from Snowflake using the COPY command?
• A. 5 MB
• B. 8 GB
• C. 16 MB Most Voted
• D. 32 MB
Correct Answer: C 🗳️
What features that are part of the Continuous Data Protection (CDP) feature set in Snowflake
do not require additional configuration? (Choose two.)
• A. Data masking policies
• B. Row level access policies
• C. Data encryption
• D. Time Travel
• E. External tokenization
Correct Answer: CD 🗳️
Which Snowflake layer is always leveraged when accessing a query from the result cache?
• A. Metadata
• B. Data Storage
• C. Compute
• D. Cloud Services
Correct Answer: D 🗳️
Which connectors are available in the downloads section of the Snowflake web interface (UI)?
(Choose two.)
Correct Answer: CD 🗳️
A Snowflake Administrator needs to ensure that sensitive corporate data in Snowflake tables is
not visible to end users, but is partially visible to functional managers.
Correct Answer: B 🗳️
Users are responsible for data storage costs until what occurs?
Correct Answer: B 🗳️
A user has an application that writes a new file to a cloud storage location every 5 minutes.
What would be the MOST efficient way to get the files into Snowflake?
• A. Create a task that runs a COPY INTO operation from an external stage every 5
minutes.
• B. Create a task that PUTS the files in an internal stage and automate the data loading
wizard.
• C. Create a task that runs a GET operation to intermittently check for new files.
• D. Set up cloud provider notifications on the file location and use Snowpipe with auto-
ingest. Most Voted
Correct Answer: D 🗳️
Which of the following is an example of an operation that can be completed without requiring
compute, assuming no queries have been executed previously?
Correct Answer: B 🗳️
• A. 1 day
• B. 7 days
• C. 14 days Most Voted
• D. 64 days
Correct Answer: C 🗳️
What Snowflake features allow virtual warehouses to handle high concurrency workloads?
(Choose two.)
Correct Answer: BD 🗳️
Which COPY INTO command outputs the data into one file?
Correct Answer: A 🗳️
In which scenarios would a user have to pay Cloud Services costs? (Choose two.)
Correct Answer: DE 🗳️
A user created a new worksheet within the Snowsight UI and wants to share this with
teammates.
Correct Answer: B 🗳️
How can a row access policy be applied to a table or a view? (Choose two.)
Which command can be used to load data files into a Snowflake stage?
• A. JOIN
• B. COPY INTO
• C. PUT Most Voted
• D. GET
Correct Answer: C 🗳️
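A sketch of PUT in use (local path and stage name are hypothetical):

```sql
-- Upload a local file into a named internal stage
-- (PUT runs from a client such as SnowSQL, not from the worksheet UI)
PUT file:///tmp/data.csv @my_int_stage AUTO_COMPRESS = TRUE;
```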
What types of data listings are available in the Snowflake Data Marketplace? (Choose two.)
• A. Reader
• B. Consumer
• C. Vendor
• D. Standard Most Voted
• E. Personalized Most Voted
Correct Answer: DE 🗳️
What is the maximum Time Travel retention period for a temporary Snowflake table?
• A. 90 days
• B. 1 day Most Voted
• C. 7 days
• D. 45 days
Correct Answer: B 🗳️
What happens when a cloned table is replicated to a secondary database? (Choose two.)
Correct Answer: CD 🗳️
Snowflake supports the use of external stages with which cloud platforms? (Choose three.)
Correct Answer: C 🗳️
In the Snowflake access control model, which entity owns an object by default?
Correct Answer: D 🗳️
What is the minimum Snowflake edition required to use Dynamic Data Masking?
• A. Standard
• B. Enterprise Most Voted
• C. Business Critical
• D. Virtual Private Snowflake (VPS)
Correct Answer: B 🗳️
Which services does the Snowflake Cloud Services layer manage? (Choose two.)
• A. Compute resources
• B. Query execution
• C. Authentication Most Voted
• D. Data storage
• E. Metadata Most Voted
Correct Answer: CE 🗳️
A company needs to allow some users to see Personally Identifiable Information (PII) while
limiting other users from seeing the full value of the PII.
Correct Answer: B 🗳️
Which command can be used to verify if data has been uploaded to the external stage named
my_stage?
• A. view @my_stage
• B. list @my_stage Most Voted
• C. show @my_stage
• D. display @my_stage
Correct Answer: B 🗳️
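A sketch of the LIST command (stage name and path are hypothetical):

```sql
-- Show the files currently in the stage, with size and MD5
LIST @my_stage;

-- Restrict the listing to a path prefix
LIST @my_stage/data/;
```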
Which tasks are performed in the Snowflake Cloud Services layer? (Choose two.)
Correct Answer: AE 🗳️
• A. The Data Consumer pays for data storage as well as for data computing.
• B. The shared data is copied into the Data Consumer account, so the Consumer can
modify it without impacting the base data of the Provider.
• C. A Snowflake account can both provide and consume shared data. Most Voted
• D. The Provider is charged for compute resources used by the Data Consumer to query
the shared data.
• E. The Data Consumer pays only for compute resources to query the shared data. Most
Voted
Correct Answer: CE 🗳️
The following JSON is stored in a VARIANT column called src of the CAR_SALES table:
Correct Answer: A 🗳️
Which of the following significantly improves the performance of selective point lookup queries
on a table?
• A. Clustering
• B. Materialized Views
• C. Zero-copy Cloning
• D. Search Optimization Service Most Voted
Correct Answer: D 🗳️
• A. Tables, secure views, and secure UDFs can be shared Most Voted
• B. Shares can be shared
• C. Data consumers can clone a new table from a share
• D. Access to a share cannot be revoked once granted
Correct Answer: A 🗳️
What are best practice recommendations for using the ACCOUNTADMIN system-defined role in
Snowflake? (Choose two.)
Correct Answer: AD 🗳️
In the query profiler view for a query, which components represent areas that can be used to
help optimize query performance? (Choose two.)
Correct Answer: BD 🗳️
What is the minimum Snowflake edition required for row level security?
• A. Standard
• B. Enterprise Most Voted
• C. Business Critical
• D. Virtual Private Snowflake
Correct Answer: B 🗳️
What is the minimum Fail-safe retention time period for transient tables?
• A. 1 day
• B. 7 days
• C. 12 hours
• D. 0 days Most Voted
Correct Answer: D 🗳️
What is a machine learning and data science partner within the Snowflake Partner Ecosystem?
• A. Informatica
• B. Power BI
• C. Adobe
• D. DataRobot Most Voted
Correct Answer: D 🗳️
Which statements are correct concerning the leveraging of third-party data from the Snowflake
Data Marketplace? (Choose two.)
Correct Answer: AD 🗳️
What impacts the credit consumption of maintaining a materialized view? (Choose two.)
Correct Answer: DE 🗳️
What COPY INTO SQL command should be used to unload data into multiple files?
• A. SINGLE=TRUE
• B. MULTIPLE=TRUE
• C. MULTIPLE=FALSE
• D. SINGLE=FALSE Most Voted
Correct Answer: D 🗳️
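A sketch of a multi-file unload (stage and table names are hypothetical):

```sql
-- Unload into multiple files (SINGLE = FALSE is the default);
-- MAX_FILE_SIZE caps each output file, here at the 16 MB default
COPY INTO @my_stage/unload/
  FROM my_table
  FILE_FORMAT = (TYPE = 'CSV' COMPRESSION = 'GZIP')
  SINGLE = FALSE
  MAX_FILE_SIZE = 16777216;
```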
When cloning a database containing stored procedures and regular views, that have fully
qualified table references, which of the following will occur?
• A. The cloned views and the stored procedures will reference the cloned tables in the
cloned database.
• B. An error will occur, as views with qualified references cannot be cloned.
• C. An error will occur, as stored objects cannot be cloned.
• D. The stored procedures and views will refer to tables in the source database. Most Voted
Correct Answer: D 🗳️
When loading data into Snowflake, how should the data be organized?
Correct Answer: A 🗳️
Which of the following objects can be directly restored using the UNDROP command? (Choose
two.)
Correct Answer: AD 🗳️
Which Snowflake SQL statement would be used to determine which users and roles have
access to a role called MY_ROLE?
What is the MINIMUM edition of Snowflake that is required to use a SCIM security integration?
Correct Answer: D 🗳️
A user created a transient table and made several changes to it over the course of several
days. Three days after the table was created, the user would like to go back to the first version
of the table.
Correct Answer: B 🗳️
When reviewing the load for a warehouse using the load monitoring chart, the chart indicates
that a high volume of queries is always queuing in the warehouse.
According to recommended best practice, what should be done to reduce the queue volume?
(Choose two.)
Correct Answer: AD 🗳️
Which of the following features, associated with Continuous Data Protection (CDP), require
additional Snowflake-provided data storage? (Choose two.)
• A. Tri-Secret Secure
• B. Time Travel
• C. Fail-safe
• D. Data encryption
• E. External stages
Correct Answer: BC 🗳️
Where can a user find and review the failed logins of a specific user for the past 30 days?
Correct Answer: B 🗳️
Which of the following statements apply to Snowflake in terms of security? (Choose two.)
Correct Answer: AD 🗳️
A single user of a virtual warehouse has set the warehouse to auto-resume and auto-suspend
after 10 minutes. The warehouse is currently suspended and the user performs the following
actions:
• A. 4 minutes
• B. 10 minutes
• C. 14 minutes Most Voted
• D. 24 minutes
Correct Answer: C 🗳️
What can be used to view warehouse usage over time? (Choose two.)
Correct Answer: AD 🗳️
What actions will prevent leveraging of the ResultSet cache? (Choose two.)
Correct Answer: AD 🗳️
• A. A task can be called using a CALL statement to run a set of predefined SQL
commands.
• B. A task allows a user to execute a single SQL statement/command using a predefined
schedule. Most Voted
• C. A task allows a user to execute a set of SQL commands on a predefined schedule.
• D. A task can be executed using a SELECT statement to run a predefined SQL
command.
Correct Answer: B 🗳️
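A sketch of a scheduled task (task, warehouse, and table names are hypothetical):

```sql
-- A task runs one SQL statement on a predefined schedule
CREATE TASK nightly_refresh
  WAREHOUSE = my_wh
  SCHEDULE = 'USING CRON 0 2 * * * UTC'
AS
  INSERT INTO audit_log SELECT CURRENT_TIMESTAMP();

-- Tasks are created suspended; resume to start the schedule
ALTER TASK nightly_refresh RESUME;
```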
Which data types does Snowflake support when querying semi-structured data? (Choose two.)
• A. VARIANT Most Voted
• B. VARCHAR
• C. XML
• D. ARRAY Most Voted
• E. BLOB
Correct Answer: AD 🗳️
What happens when a virtual warehouse is resized?
• A. When increasing the size of an active warehouse the compute resource for all
running and queued queries on the warehouse are affected.
• B. When reducing the size of a warehouse the compute resources are removed only
when they are no longer being used to execute any current statements. Most Voted
• C. The warehouse will be suspended while the new compute resource is provisioned
and will resume automatically once provisioning is complete.
• D. Users who are trying to use the warehouse will receive an error message until the
resizing is complete.
Correct Answer: B 🗳️
What tasks can be completed using the COPY command? (Choose two.)
Correct Answer: CD 🗳️
• A. Database Storage
• B. Cloud Services
• C. Query Processing Most Voted
• D. Application Services
Correct Answer: C 🗳️
Query compilation occurs in which architecture layer of the Snowflake Cloud Data Platform?
• A. Compute layer
• B. Storage layer
• C. Cloud infrastructure layer
• D. Cloud services layer Most Voted
Correct Answer: D 🗳️
If a size Small virtual warehouse is made up of two servers, how many servers make up a Large
warehouse?
• A. 4
• B. 8 Most Voted
• C. 16
• D. 32
Correct Answer: B 🗳️
What data type should be used to store JSON data natively in Snowflake?
• A. JSON
• B. String
• C. Object
• D. VARIANT Most Voted
Correct Answer: D 🗳️
What should be considered when deciding to use a Secure View? (Choose two.)
• A. No details of the query execution plan will be available in the query profiler. Most Voted
• B. Once created there is no way to determine if a view is secure or not.
• C. Secure views do not take advantage of the same internal optimizations as standard
views. Most Voted
• D. It is not possible to create secure materialized views.
• E. The view definition of a secure view is still visible to users by way of the information
schema.
Correct Answer: AC 🗳️
The information schema provides storage information for which of the following objects?
(Choose two.)
• A. Users
• B. Databases Most Voted
• C. Internal stages Most Voted
• D. Resource monitors
• E. Pipes Most Voted
Correct Answer: BD 🗳️
• A. Infrastructure management
• B. Metadata management
• C. Query execution Most Voted
• D. Query parsing and optimization
• E. Management of the storage layer
Correct Answer: C 🗳️
• A. Binary
• B. Float Most Voted
• C. Geography
• D. Variant
Correct Answer: D 🗳️
When unloading data to an external stage, which compression format can be used for Parquet
files with the COPY INTO command?
• A. BROTLI
• B. GZIP
• C. LZO Most Voted
• D. ZSTD
Correct Answer: C 🗳️
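Snowflake supports SNAPPY (the default) and LZO compression when unloading Parquet files. A minimal sketch, using hypothetical stage and table names:

```sql
-- Hypothetical names. COMPRESSION = LZO selects LZO for the unloaded
-- Parquet files; omitting it uses SNAPPY by default.
COPY INTO @my_ext_stage/export/
FROM my_table
FILE_FORMAT = (TYPE = PARQUET COMPRESSION = LZO);
```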
Which SQL command can be used to verify the privileges that are granted to a role?
Correct Answer: D 🗳️
Which Query Profile result indicates that a warehouse is sized too small?
Correct Answer: B 🗳️
Which of the following are best practice recommendations that should be considered when
loading data into Snowflake? (Choose two.)
Correct Answer: AD 🗳️
• A. ACCOUNT_USAGE
• B. READER_ACCOUNT_USAGE Most Voted
• C. INFORMATION_SCHEMA
• D. WAREHOUSE_USAGE_SCHEMA
Correct Answer: C 🗳️
• A. Disables the ability to use key pair and basic authentication (e.g.,
username/password) when connecting
• B. Allows dual Multi-Factor Authentication (MFA) when connecting to Snowflake
• C. Forces users to connect through a secure network proxy
• D. Allows users to connect using secure single sign-on (SSO) through an external
identity provider Most Voted
Correct Answer: D 🗳️
Which Snowflake partner category is represented at the top of this diagram (labeled 1)?
• A. Business Intelligence
• B. Machine Learning and Data Science
• C. Security and Governance
• D. Data Integration Most Voted
Correct Answer: D 🗳️
1. The fail-safe retention period is how many days?
• 1 day
• 7 days correct
• 45 days
• 90 days
--
2. True or False: A 4X-Large Warehouse may, at times, take longer to provision than an
X-Small Warehouse.
• True correct
• False
--
3. How would you determine the size of the virtual warehouse used for a task?
• Root task may be executed concurrently (i.e. multiple instances), it is recommended to
leave some margins in the execution window to avoid missing instances of execution
• Querying (select) the size of the stream content would help determine the warehouse
size. For example, if querying large stream content, use a larger warehouse size
• If using the stored procedure to execute multiple SQL statements, it's best to test run the
stored procedure separately to size the compute resource first
• Since task infrastructure is based on running the task body on schedule, it's
recommended to configure the virtual warehouse for automatic concurrency
handling using Multi-cluster warehouse (MCW) to match the task schedule correct
4. The Information Schema and Account Usage Share provide storage information for
which of the following objects? (Choose three.)
• Users
• Tables correct
• Databases correct
• Internal Stages correct
5. What is the default File Format used in the COPY command if one is not specified?
• CSV correct
• JSON
• Parquet
• XML
6. True or False: Reader Accounts are able to extract data from shared data objects for
use outside of Snowflake.
• True
• False correct
7. True or False: Loading data into Snowflake requires that source data files be no larger
than 16MB.
• True
• False correct
Explanation:
By default, COPY INTO location statements separate table data into a set of output files to take
advantage of parallel operations. The maximum size for each file is set using the
MAX_FILE_SIZE copy option. The default value is 16777216 (16 MB) but can be increased to
accommodate larger files. The maximum file size supported is 5 GB for Amazon S3, Google
Cloud Storage, or Microsoft Azure stages. To unload data to a single output file (at the potential
cost of decreased performance), specify the SINGLE = true copy option in your statement. You
can optionally specify a name for the file in the path.
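The unload options described above can be sketched as follows; the stage and table names are hypothetical.

```sql
-- Hypothetical names. MAX_FILE_SIZE raises the default 16 MB (16777216 bytes)
-- per-file limit for unloaded output files.
COPY INTO @my_stage/unload/data_
FROM my_table
FILE_FORMAT = (TYPE = CSV)
MAX_FILE_SIZE = 104857600;  -- 100 MB per output file

-- SINGLE = TRUE writes one output file, at the potential cost of parallelism.
COPY INTO @my_stage/unload/single.csv
FROM my_table
FILE_FORMAT = (TYPE = CSV)
SINGLE = TRUE;
```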
9. True or False: When you create a custom role, it is a best practice to immediately grant
that role to ACCOUNTADMIN.
• True
• False correct
10. What are two ways to create and manage Data Shares in Snowflake? (Choose two.)
• Via the Snowflake Web Interface (UI) correct
• Via the data_share=true parameter
• Via SQL commands correct
• Via Virtual Warehouses
Explanation: Separate and distinct from Time Travel, Fail-safe ensures historical data is
protected in the event of a system failure or other catastrophic event, e.g. a hardware failure or
security breach. The Fail-safe feature cannot be enabled or disabled by the user.
12. True or False: It is possible for a user to run a query against the query result cache
without requiring an active Warehouse.
• True correct
• False
Explanation: Query result cache is all about fetching the data from cloud services layer and
saving the cost by not running the virtual warehouse.
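A brief sketch of how a cached result can be reused. Re-running the identical query text within 24 hours (with unchanged underlying data and sufficient privileges) can be served from the result cache; RESULT_SCAN re-reads a prior result set by query ID. The table name is hypothetical.

```sql
-- First execution runs on a warehouse and populates the result cache.
SELECT region, SUM(amount) FROM sales GROUP BY region;

-- Re-read the previous statement's result set as a table.
SELECT * FROM TABLE(RESULT_SCAN(LAST_QUERY_ID()));
```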
13. A virtual warehouse's auto-suspend and auto-resume settings apply to which of the
following?
• The primary cluster in the virtual warehouse
• The entire virtual warehouse correct
• The database in which the virtual warehouse resides
• The Queries currently being run on the virtual warehouse
14. Which of the following Snowflake features provide continuous data protection
automatically? (Select TWO).
• Internal stages
• Incremental backups
• Time Travel correct
• Zero-copy clones
• Fail-safe correct
Explanation: Time Travel and Fail-safe are the two continuous data protection features that
automatically support the recovery of data. Snowflake provides powerful CDP features for ensuring the
maintenance and availability of your historical data (i.e. data that has been changed or deleted):
Querying, cloning, and restoring historical data in tables, schemas, and databases for up to 90
days through Snowflake Time Travel. Disaster recovery of historical data (by Snowflake) through
Snowflake Fail-safe.
15. Which of the following conditions must be met in order to return results from the
results cache? (Select TWO).
• The user has the appropriate privileges on the objects associated with the query correct
• Micro-partitions have been reclustered since the query was last run
• The new query is run using the same virtual warehouse as the previous query
• The query includes a User Defined Function (UDF)
• The query has been run within 24 hours of the previously-run query correct
17. What is the minimum Snowflake edition required to create a materialized view?
• Standard Edition
• Enterprise Edition correct
• Business Critical Edition
• Virtual Private Snowflake Edition
Explanation: Materialized views require Enterprise Edition. To inquire about upgrading, please
contact Snowflake Support
18. What happens to the underlying table data when a CLUSTER BY clause is added to a
Snowflake table?
• Data is hashed by the cluster key to facilitate fast searches for common data values
• Larger micro-partitions are created for common data values to reduce the number of
partitions that must be scanned
• Smaller micro-partitions are created for common data values to allow for more parallelism
• Data may be colocated by the cluster key within the micro-partitions to improve
pruning performance correct
19. Which feature is only available in the Enterprise or higher editions of Snowflake?
• Column-level security correct
• SOC 2 type II certification
• Multi-factor Authentication (MFA)
• Object-level access control
20. Which of the following are valid methods for authenticating users for access into
Snowflake? (Select THREE)
• SCIM
• Federated authentication correct
• TLS 1.2
• Key-pair authentication correct
• OAuth correct
• OCSP authentication
--
21. During periods of warehouse contention which parameter controls the maximum
length of time a warehouse will hold a query for processing?
• STATEMENT_TIMEOUT_IN_SECONDS
• STATEMENT_QUEUED_TIMEOUT_IN_SECONDS correct
• MAX_CONCURRENCY_LEVEL
• QUERY_TIMEOUT_IN_SECONDS
--
23. Which Snowflake object enables loading data from files as soon as they are available
in a cloud storage location?
• Pipe correct
• External stage
• Task
• Stream
--
Explanation: Snowpipe enables loading data from files as soon as they’re available in a stage.
This means you can load data from files in micro-batches, making it available to users within
minutes, rather than manually executing COPY statements on a schedule to load larger batches.
https://docs.snowflake.com/en/user-guide/data-load-snowpipe-intro.html
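A minimal Snowpipe sketch; all names are hypothetical, and it assumes an external stage backed by cloud storage with event notifications configured on the bucket.

```sql
-- Hypothetical names. AUTO_INGEST = TRUE makes the pipe load files
-- as soon as cloud storage event notifications announce them.
CREATE PIPE mydb.myschema.my_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO mydb.myschema.raw_events
  FROM @mydb.myschema.my_ext_stage
  FILE_FORMAT = (TYPE = JSON);
```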
24. A user needs to create a materialized view in the schema MYDB.MYSCHEMA.
25. What is the default character set used when loading CSV files into Snowflake?
• UTF-8 correct
• UTF-16
• ISO 8859-1
• ANSI_X3.A
--
For delimited files (CSV, TSV, etc.), the default character set is UTF-8. To use any other
character set, you must explicitly specify the encoding to use for loading. For the list of
supported character sets, see Supported Character Sets for Delimited Files (in this topic).
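Overriding the UTF-8 default can be sketched as follows; the format, stage, and table names are hypothetical.

```sql
-- Hypothetical names. ENCODING overrides the UTF-8 default for delimited files.
CREATE FILE FORMAT my_latin1_csv
  TYPE = CSV
  ENCODING = 'ISO-8859-1';

COPY INTO my_table
FROM @my_stage/legacy/
FILE_FORMAT = (FORMAT_NAME = my_latin1_csv);
```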
27. Which cache type is used to cache data output from SQL queries?
• Metadata cache
• Result cache correct
• Remote cache
• Local file cache
--
Explanation: Snowflake Time Travel enables accessing historical data (i.e. data that has been
changed or deleted) at any point within a defined period. It serves as a powerful tool for
performing the following tasks: Restoring data-related objects (tables, schemas, and databases)
that might have been accidentally or intentionally deleted. Duplicating and backing up data from
key points in the past. Analyzing data usage/manipulation over specified periods of time.
https://docs.snowflake.com/en/user-guide/data-time-travel.html
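The querying and restoring tasks described above can be sketched as follows; the table name is hypothetical and the query ID is a placeholder.

```sql
-- Query the table as it was one hour ago.
SELECT * FROM my_table AT(OFFSET => -3600);

-- Query the table as it was just before a given statement ran.
SELECT * FROM my_table BEFORE(STATEMENT => '<query_id>');

-- Restore a dropped table from Time Travel.
UNDROP TABLE my_table;
```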
31. Which statement about billing applies to Snowflake credits?
• Credits are billed per-minute with a 60-minute minimum
• Credits are used to pay for cloud data storage usage
• Credits are consumed based on the number of credits billed for each hour that a
warehouse runs
• Credits are consumed based on the warehouse size and the time the warehouse is
running correct
--
Explanation: Snowflake credits are used to pay for the consumption of resources on Snowflake.
A Snowflake credit is a unit of measure, and it is consumed only when a customer is using
resources, such as when a virtual warehouse is running, the cloud services layer is performing
work, or serverless features are used.
32. What Snowflake features allow virtual warehouses to handle high concurrency
workloads? (Select TWO)
• The ability to scale up warehouses
• The use of warehouse auto scaling correct
• The ability to resize warehouses
• Use of multi-clustered warehouses correct
• The use of warehouse indexing
--
33. When reviewing the load for a warehouse using the load monitoring chart, the chart
indicates that a high volume of queries is always queuing in the warehouse.
According to recommended best practice, what should be done to reduce the queue
volume? (Select TWO).
• Use multi-clustered warehousing to scale out warehouse capacity. correct
• Scale up the warehouse size to allow Queries to execute faster.
• Stop and start the warehouse to clear the queued queries
• Migrate some queries to a new warehouse to reduce load correct
• Limit user access to the warehouse so fewer queries are run against it.
--
34. Which of the following objects can be shared through secure data sharing?
• Masking policy
• Stored procedure
• Task
• External table correct
--
Explanation: Secure Data Sharing enables sharing selected objects in a database in your
account with other Snowflake accounts. The following Snowflake database objects can be
shared:
✑ Tables
✑ External tables
✑ Secure views
✑ Secure UDFs
Snowflake enables the sharing of databases through shares, which are created by data providers
and “imported” by data consumers.
35. Which of the following commands cannot be used within a reader account?
• CREATE SHARE correct
• ALTER WAREHOUSE correct
• DROP ROLE correct
• SHOW SCHEMAS
• DESCRIBE TABLE
--
36. A user unloaded a Snowflake table called mytable to an internal stage called mystage.
Which command can be used to view the list of files that have been uploaded to the
stage?
• list @mytable;
• list @%mytable;
• list @%mystage;
• list @mystage; correct
--
37. Which of the following Snowflake capabilities are available in all Snowflake editions?
(Select TWO)
• Customer-managed encryption keys through Tri-Secret Secure
• Automatic encryption of all data correct
• Up to 90 days of data recovery through Time Travel
• Object-level access control correct
• Column-level security to apply data masking policies to tables and views
--
38. Which command is used to unload data from a Snowflake table into a file in a stage?
• COPY INTO correct
• GET
• WRITE
• EXTRACT INTO
--
40. What are value types that a VARIANT column can store? (Select TWO)
• STRUCT
• OBJECT correct
• BINARY
• ARRAY correct
• CLOB
--
Explanation: Characteristics of a VARIANT . A VARIANT can store a value of any other type,
including OBJECT and ARRAY. The maximum length of a VARIANT is 16 MB.
41. A user has an application that writes a new file to a cloud storage location every 5
minutes.
What would be the MOST efficient way to get the files into Snowflake?
• Create a task that runs a COPY INTO operation from an external stage every 5 minutes
• Create a task that puts the files in an internal stage and automate the data loading wizard
• Create a task that runs a GET operation to intermittently check for new files
• Set up cloud provider notifications on the file location and use Snowpipe with
auto-ingest correct
--
42. Which of the following are best practice recommendations that should be considered
when loading data into Snowflake? (Select TWO).
• Load files that are approximately 25 MB or smaller.
• Remove all dates and timestamps.
• Load files that are approximately 100-250 MB (or larger) correct
• Avoid using embedded characters such as commas for numeric data types correct
• Remove semi-structured data types
--
The ingest operation completes with no errors, using the following command:
COPY INTO my_table FROM @my_stage;
The next day the user adds 10 files to the stage so that now the stage contains a mixture
of new customer data and updates to the previous data. The user did not remove the 10
original files.
If the user runs the same copy into command what will happen?
• All data from all of the files on the stage will be appended to the table
• Only data about new customers from the new files will be appended to the table
• The operation will fail with the error uncertain files in stage.
• All data from only the newly-added files will be appended to the table. correct
--
44. A user has unloaded data from Snowflake to a stage.
Which SQL command should be used to validate which data was loaded into the stage?
• list @file_stage correct
• show @file_stage
• view @file_stage
• verify @file_stage
--
45. What happens when a cloned table is replicated to a secondary database? (Select
TWO)
• A read-only copy of the cloned tables is stored.
• The replication will not be successful.
• The physical data is replicated correct
• Additional costs for storage are charged to a secondary account correct
• Metadata pointers to cloned tables are replicated
--
46. Which data types does Snowflake support when querying semi-structured data?
(Select TWO)
• VARIANT correct
• ARRAY correct
• VARCHAR
• XML
• BLOB
--
Explanation: A VARIANT stores semi-structured data in Snowflake. It can store a value of any
other type, including OBJECT and ARRAY. The maximum length of a VARIANT is 16 MB.
A Snowflake ARRAY is similar to an array in many other programming languages. An ARRAY
contains 0 or more pieces of data. Each element is accessed by specifying its position in the
array.
47. Which of the following describes how multiple Snowflake accounts in a single
organization relate to various cloud providers?
• Each Snowflake account can be hosted in a different cloud vendor and region.
correct
• Each Snowflake account must be hosted in a different cloud vendor and region
• All Snowflake accounts must be hosted in the same cloud vendor and region
• Each Snowflake account can be hosted in a different cloud vendor, but must be in the
same region.
--
48. A user is loading JSON documents composed of a huge array containing multiple
records into Snowflake. The user enables the STRIP_OUTER_ARRAY file format option.
Explanation: Data Size Limitations. The VARIANT data type imposes a 16 MB size limit on
individual rows. For some semi-structured data formats (e.g. JSON), data sets are frequently a
simple concatenation of multiple documents. The JSON output from some software is composed
of a single huge array containing multiple records. There is no need to separate the documents
with line breaks or commas, though both are supported.
If the data exceeds 16 MB, enable the STRIP_OUTER_ARRAY file format option for the COPY
INTO <table> command to remove the outer array structure and load the records into separate
table rows: copy into <table>
from @~/<file>.json
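A fuller sketch of the pattern described above, using hypothetical format, file, and table names:

```sql
-- Hypothetical names. STRIP_OUTER_ARRAY removes the outer JSON array
-- so that each array element loads as a separate row.
CREATE FILE FORMAT my_json_fmt
  TYPE = JSON
  STRIP_OUTER_ARRAY = TRUE;

COPY INTO my_table
FROM @~/myfile.json
FILE_FORMAT = (FORMAT_NAME = my_json_fmt);
```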
49. What are the default Time Travel and Fail-safe retention periods for transient tables?
• Time Travel - 1 day. Fail-safe - 1 day
• Time Travel - 0 days. Fail-safe - 1 day
• Time Travel - 1 day. Fail-safe - 0 days correct
• Transient tables are retained in neither Fail-safe nor Time Travel
--
Explanation: When creating roles that will serve as the owners of securable objects in the
system, Snowflake recommends creating a hierarchy of custom roles, with the top-most custom
role assigned to the system role SYSADMIN. This role structure allows system administrators to
manage all objects in the account, such as warehouses and database objects, while restricting
management of users and roles to the USERADMIN role.
51. Which of the following Snowflake objects can be shared using a secure share? (Select
TWO).
• Materialized views
• Sequences
• Procedures
• Tables correct
• Secure User Defined Functions (UDFs) correct
--
52. Will data cached in a warehouse be lost when the warehouse is resized?
• Possibly, if the warehouse is resized to a smaller size and the cache no longer fits.
• Yes, because the compute resource is replaced in its entirety with a new compute
resource.
• No, because the size of the cache is independent from the warehouse size correct
• Yes, because the new compute resource will no longer have access to the cache
encryption key
--
53. Which Snowflake partner specializes in data catalog solutions?
• Alation correct
• DataRobot
• dbt
• Tableau
--
Explanation: Alation provides Data Cataloging functionality. They state they are the 'One Place to
Find, Understand, & Govern Data Across an Enterprise'.
54. What is the MOST performant file format for loading data in Snowflake?
• CSV (Unzipped)
• Parquet
• CSV (Gzipped) correct
• ORC
--
55. Which COPY INTO option outputs the data into one file?
• SINGLE=TRUE correct
• MAX_FILE_NUMBER=1
• FILE_NUMBER=1
• MULTIPLE=FALSE
--
56. Where would a Snowflake user find information about query activity from 90 days
ago?
• account_usage.query_history view correct
• account_usage.query_history_archive view
• information_schema.query_history view
• information_schema.query_history_by_session view
--
57. Which Snowflake technique can be used to improve the performance of a query?
• Clustering correct
• Indexing
• Fragmenting
• Using INDEX_HINTS
--
58. User-level network policies can be created by which of the following roles? (Select
TWO).
• ROLEADMIN
• ACCOUNTADMIN correct
• SYSADMIN
• SECURITYADMIN correct
• USERADMIN
--
59. Which command can be used to load data into an internal stage?
• LOAD
• copy
• GET
• PUT correct
--
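A minimal sketch of staging a local file with PUT; the paths, stage, and table names are hypothetical. PUT runs from a client such as SnowSQL, not from the classic web UI.

```sql
-- Hypothetical names. PUT uploads a local file to an internal stage,
-- compressing it by default.
PUT file:///tmp/data/orders.csv @my_int_stage AUTO_COMPRESS = TRUE;

-- Then load the staged file into a table with COPY.
COPY INTO orders
FROM @my_int_stage/orders.csv.gz
FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
```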
60. What happens when an external or an internal stage is dropped? (Select TWO).
• When dropping an external stage, the files are not removed and only the stage is
dropped correct
• When dropping an external stage, both the stage and the files within the stage are
removed
• When dropping an internal stage, the files are deleted with the stage and the files are
recoverable
• When dropping an internal stage, the files are deleted with the stage and the files
are not recoverable correct
• When dropping an internal stage, only selected files are deleted with the stage and are
not recoverable
--
63. Which account_usage views are used to evaluate the details of dynamic data
masking? (Select TWO)
• ROLES
• POLICY_REFERENCES correct
• QUERY_HISTORY
• RESOURCE_MONITORS
• ACCESS_HISTORY correct
--
64. Query compilation occurs in which architecture layer of the Snowflake Cloud Data
Platform?
• Compute layer
• Storage layer
• Cloud infrastructure layer
• Cloud services layer correct
--
65. Which is the MINIMUM required Snowflake edition that a user must have if they want to
use AWS/Azure Privatelink or Google Cloud Private Service Connect?
• Standard
• Premium
• Enterprise
• Business Critical correct
--
66. In the query profiler view for a query, which components represent areas that can be
used to help optimize query performance? (Select TWO)
• Bytes scanned correct
• Bytes sent over the network
• Number of partitions scanned correct
• Percentage scanned from cache
• External bytes scanned
--
67. A marketing co-worker has requested the ability to change a warehouse size on their
medium virtual warehouse called mktg_WH.
68. When reviewing a query profile, what is a symptom that a query is too large to fit into
the memory?
• A single join node uses more than 50% of the query time
• Partitions scanned is equal to partitions total
• An Aggregate Operator node is present
• The query is spilling to remote storage correct
--
70. From which Snowflake interface can the PUT command be used to stage local files?
• SnowSQL correct
• Snowflake classic web interface (UI)
• Snowsight
• .NET driver
--
71. What is the recommended file sizing for data loading using Snowpipe?
• A compressed file size greater than 100 MB, and up to 250 MB correct
• A compressed file size greater than 100 GB, and up to 250 GB
• A compressed file size greater than 10 MB, and up to 100 MB
• A compressed file size greater than 1 GB, and up to 2 GB
--
72. Which services does the Snowflake Cloud Services layer manage? (Select TWO).
• Compute resources
• Query execution
• Authentication correct
• Data storage
• Metadata correct
--
Explanation: The cloud services layer is a collection of services that coordinate activities across
Snowflake. These services tie together all of the different components of Snowflake in order to
process user requests, from login to query dispatch. The cloud services layer also runs on
compute instances provisioned by Snowflake from the cloud provider.
Services managed in this layer include:
✑ Authentication
✑ Infrastructure management
✑ Metadata management
✑ Access control
73. What data is stored in the Snowflake storage layer? (Select TWO).
• Snowflake parameters
• Micro-partitions correct
• Query history
• Persisted query results correct
• Standard and secure view results
--
74. In which scenarios would a user have to pay Cloud Services costs? (Select TWO).
• Compute Credits = 50 Credits Cloud Services = 10 correct
• Compute Credits = 80 Credits Cloud Services = 5
• Compute Credits = 10 Credits Cloud Services = 9
• Compute Credits = 120 Credits Cloud Services = 10
• Compute Credits = 200 Credits Cloud Services = 26 correct
--
75. What transformations are supported in a CREATE PIPE ... AS COPY ... FROM (....)
statement? (Select TWO.)
• Data can be filtered by an optional where clause
• Incoming data can be joined with other tables
• Columns can be reordered correct
• Columns can be omitted correct
• Row level access can be defined
--
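A sketch of the COPY transformations a pipe definition can use; all names are hypothetical. The subquery can reorder, omit, and cast columns, but joins and row filtering are not supported in COPY transformations.

```sql
-- Hypothetical names. $1, $2 reference staged file columns by position;
-- the SELECT reorders them and omits the rest.
CREATE PIPE my_pipe AS
  COPY INTO my_table (customer_id, amount)
  FROM (SELECT $2, $1 FROM @my_stage)
  FILE_FORMAT = (TYPE = CSV);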
79. A developer is granted ownership of a table that has a masking policy. The developer's
role is not able to see the masked data.
Will the developer be able to modify the table to read the masked data?
• Yes, because a table owner has full control and can unset masking policies.
• Yes, because masking policies only apply to cloned tables.
• No, because masking policies must always reference specific access roles.
• No, because ownership of a table does not include the ability to change masking
policies correct
--
80. Which of the following describes how clustering keys work in Snowflake?
• Clustering keys update the micro-partitions in place with a full sort, and impact the DML
operations.
• Clustering keys sort the designated columns over time, without blocking DML
operations correct
• Clustering keys create a distributed, parallel data structure of pointers to a table's rows
and columns
• Clustering keys establish a hashed key on each node of a virtual warehouse to optimize
joins at run-time
--
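The clustering-key behavior described above can be sketched as follows; the table and column names are hypothetical.

```sql
-- Hypothetical names. A clustering key asks the background clustering
-- service to co-locate rows by these columns over time; DML is not blocked.
ALTER TABLE sales CLUSTER BY (sale_date, region);

-- Inspect how well the table is clustered on those columns.
SELECT SYSTEM$CLUSTERING_INFORMATION('sales', '(sale_date, region)');
```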
81. What is a machine learning and data science partner within the Snowflake Partner
Ecosystem?
• Informatica
• Power BI
• Adobe
• DataRobot correct
--
82. Which of the following is a valid source for an external stage when the Snowflake
account is located on Microsoft Azure?
• An FTP server with TLS encryption
• An HTTPS server with WebDAV
• A Google Cloud storage bucket correct
• A Windows server file share on Azure
--
83. Which data type can be used to store geospatial data in Snowflake?
• Variant
• Object
• Geometry
• Geography correct
--
84. What can be used to view warehouse usage over time? (Select Two).
• The LOAD_HISTORY view
• The QUERY_HISTORY view
• The SHOW WAREHOUSES command
• The WAREHOUSE_METERING_HISTORY view correct
• The billing and usage tab in the Snowflake web UI correct
--
85. Which Snowflake feature is used for both querying and restoring data?
• Cluster keys
• Time Travel correct
• Fail-safe
• Cloning
--
Which application will the Snowflake users need to install on their devices in order to
connect with MFA?
• Okta Verify
• Duo Mobile correct
• Microsoft Authenticator
• Google Authenticator
--
87. Which Snowflake objects track DML changes made to tables, like inserts, updates, and
deletes?
• Pipes
• Streams correct
• Tasks
• Procedures
--
88. What tasks can be completed using the copy command? (Select TWO)
• Columns can be aggregated
• Columns can be joined with an existing table
• Columns can be reordered correct
• Columns can be omitted correct
• Data can be loaded without the need to spin up a virtual warehouse
--
89. What feature can be used to reorganize a very large table on one or more columns?
• Micro-partitions
• Clustering keys correct
• Key partitions
• Clustered partitions
--
90. What SQL command would be used to view all roles that were granted to user USER1?
• show grants to user USER1; correct
• show grants of user USER1;
• describe user USER1;
• show grants on user USER1;
--
92. What Snowflake role must be granted for a user to create and manage accounts?
• ACCOUNTADMIN
• ORGADMIN correct
• SECURITYADMIN
• SYSADMIN
--
94. When is the result set cache no longer available? (Select TWO)
• When another warehouse is used to execute the query
• When another user executes the query
• When the underlying data has changed correct
• When the warehouse used to execute the query is suspended
• When it has been 24 hours since the last query correct
--
97. A company's security audit requires generating a report listing all Snowflake logins
(e.g.. date and user) within the last 90 days.
98. Which semi-structured file formats are supported when unloading data from a table?
(Select TWO).
• ORC
• XML
• Avro
• Parquet correct
• JSON correct
--
100. A user created a new worksheet within the Snowsight UI and wants to share this with
teammates. How can this worksheet be shared?
• Create a zero-copy clone of the worksheet and grant permissions to teammates
• Create a private Data Exchange so that any teammate can use the worksheet
• Share the worksheet with teammates within Snowsight correct
• Create a database and grant all permissions to teammates
1. What privileges are required to create a task?
• The global privilege create task is required to create a new task.
• Tasks are created at the Application level and can only be created by the Account Admin
role.
• Many Snowflake DDLs are metadata operations only, and create task DDL can be
executed without virtual warehouse requirement or task specific grants.
• The role must have access to the target schema and the create task privilege on
the schema itself. correct
--
Explanation: All tasks in a simple tree must have the same task owner (i.e. a single role must
have the OWNERSHIP privilege on all of the tasks in the tree). All tasks in a simple tree must
exist in the same schema.
2. True or False: AWS PrivateLink provides a secure connection from the customer's on-
premise data center to Snowflake.
• True correct
• False
--
Explanation: AWS PrivateLink provides a secure connection from the customer's on-premise
data center to Snowflake. It establishes a private connection between the customer's VPC
(Virtual Private Cloud) and Snowflake, ensuring that the data is transferred securely over the
Amazon network and not exposed to the public internet.
5. True or False: Query IDs are unique across all Snowflake deployments and can be
used in communication with Snowflake Support to help troubleshoot issues.
• True correct
• False
--
6. True or False: During data unloading, only JSON and CSV files can be compressed.
• True
• False correct
--
7. What is the minimum Snowflake edition that customers planning on storing protected
information in Snowflake should consider for regulatory compliance?
• Standard
• Premier
• Enterprise
• Business Critical Edition correct
--
8. What is the minimum Snowflake edition that provides multi-cluster warehouses and up
to 90 days of Time Travel?
• Standard
• Premier
• Enterprise correct
• Business Critical Edition
--
9. True or False: Snowflake allows its customers to directly access the micro-partition
files that make up its tables.
• True
• False correct
--
Explanation: Snowflake supports creating temporary tables for storing non-permanent, transitory
data (e.g. ETL data, session-specific data). Temporary tables only exist within the session in
which they were created and persist only for the remainder of the session.
13. True or False: When a new Snowflake object is created, it is automatically owned by
the user who created it.
• True
• False correct
--
Explanation: When a new Snowflake object is created, it is not automatically owned by the user
who created it. Instead, the object is owned by the role that was active for the user when the
object was created. This means that ownership and associated privileges belong to the role,
allowing for easier management of access control and permissions across multiple users within
an organization.
14. Which of the following statements are true of Snowflake releases: (Choose two.)
• They happen approximately weekly correct
• They roll up and release approximately monthly, but customers can request early release
application
• During a release, new customer requests/queries/connections transparently move
over to the newer version correct
• A customer is assigned a 30 minute window (that can be moved anytime within a week)
during which the system will be unavailable and customer is upgraded
--
Explanation:
Snowflake supports federated authentication only in the Enterprise and higher editions.
Federated authentication allows users to authenticate with Snowflake using their existing identity
provider (IdP) credentials, enabling single sign-on (SSO) and simplifying the management of
user access and authentication. If federated authentication is a requirement for a customer, they
need to consider at least the Enterprise edition of Snowflake.
16. Which of the following statements are true of Virtual Warehouses? (Choose all that
apply.)
• Customers can change the size of the Warehouse after creation correct
• A Warehouse can be resized while running correct
• A Warehouse can be configured to suspend after a period of inactivity correct
• A Warehouse can be configured to auto-resume when new queries are submitted correct
--
17. True or False: Snowflake charges a premium for storing semi-structured data.
• True
• False correct
--
18. True or False: Reader Accounts incur no additional Compute costs to the Data
Provider since they are simply reading the shared data without making changes.
• True
• False correct
--
19. True or False: The user has to specify which cluster a query will run on in a multi-cluster
Warehouse.
• True
• False correct
--
22. True or False: A Snowflake account is charged for data stored in both Internal and
External Stages.
• True
• False correct
--
Explanation:
A Snowflake account is charged for data stored in Internal Stages. However, data stored in
External Stages such as Amazon S3, Google Cloud Storage, or Microsoft Azure Blob Storage is
not billed by Snowflake. Storage costs for External Stages are billed separately by the respective
cloud storage provider.
23. Which is true of Snowflake network policies? A Snowflake network policy: (Choose
two.)
• Is available to all Snowflake Editions correct
• Is only available to customers with Business Critical Edition
• Restricts or enables access to specific IP addresses correct
• Is activated using an “ALTER DATABASE” command
--
24. True or False: A Virtual Warehouse consumes Snowflake credits even when inactive.
• True
• False correct
--
25. The number of queries that a Virtual Warehouse can concurrently process is
determined by: Choose 2 answers
• The complexity of each query correct
• The CONCURRENT_QUERY_LIMIT parameter set on the Snowflake account
• The size of the data required for each query correct
• The tool that is executing the query
--
26. True or False: A customer using SnowSQL / native connectors will be unable to also use
the Snowflake Web Interface (UI) unless access to the UI is explicitly granted by support.
• True
• False correct
--
28. Each incremental increase in Virtual Warehouse size (e.g. Medium to Large) generally
results in what?
Select one.
• More micro-partitions
• Better query scheduling
• Double the number of servers in the compute cluster correct
• Higher storage costs
--
29. True or False: It is possible to query data from an Internal or named External stage
without loading the data into Snowflake.
• True correct
• False
--
30. True or False: Data Providers can share data with only the Data Consumer.
• True
• False correct
--
31. True or False: Snowflake’s Global Services Layer gathers and maintains statistics on
all columns in all micro-partitions.
• True correct
• False
--
Explanation:
Snowflake is a single, integrated platform delivered as-a-service. It features storage, compute,
and global services layers that are physically separated but logically integrated.
32. True or False: It is possible to load data into Snowflake without creating a named File
Format object.
• True correct
• False
--
33. True or False: A table in Snowflake can only be queried using the Virtual Warehouse
that was used to load the data.
• True
• False correct
--
34. True or False: A single database can exist in more than one Snowflake account.
• True
• False correct
--
35. True or False: Some queries can be answered through the metadata cache and do not
require an active Virtual Warehouse.
• True correct
• False
--
Explanation: Some aggregate queries (e.g. COUNT, MIN, MAX) can be answered from
micro-partition metadata alone, without requiring any Virtual Warehouse to spin up.
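For illustration, a minimal sketch of a query that Snowflake can typically answer from micro-partition metadata alone (the table name is hypothetical):

```sql
-- COUNT, MIN, and MAX over a whole table can often be served
-- from micro-partition metadata, with no warehouse needed.
SELECT COUNT(*), MIN(order_date), MAX(order_date)
FROM orders;
```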
36. Which interfaces can be used to create and/or manage Virtual Warehouses?
• The Snowflake Web Interface (UI)
• SQL commands
• Data integration tools
• All of the above correct
--
39. Which of the following are true of multi-cluster Warehouses? Select all that apply
below.
• A multi-cluster Warehouse can add clusters automatically based on query activity correct
• A multi-cluster Warehouse can automatically turn itself off after a period of
inactivity correct
• A multi-cluster Warehouse can scale down when query activity slows correct
• A multi-cluster Warehouse can automatically turn itself on when a query is
executed against it correct
--
43. Credit Consumption by the Compute Layer (Virtual Warehouses) is based on: (Choose
two.)
• Number of users
• Warehouse size correct
• Amount of data processed
• # of clusters for the Warehouse correct
--
Explanation: "Snowflake credits are charged based on the number of virtual warehouses you
use, how long they run, and their size."
44. The Query History in the Snowflake Web Interface (UI) is kept for approximately:
• 60 minutes
• 24 hours
• 14 days correct
• 30 days
• 1 year
--
45. A role is created and owns 2 tables. This role is then dropped. Who will now own the
two tables?
• The tables are now orphaned correct
• The user that deleted the role
• SYSADMIN
• The assumed role that dropped the role
--
Explanation:
When a role is dropped in Snowflake, any objects owned by the role, such as tables, become
orphaned. Orphaned objects cannot be accessed or altered by any user or role until ownership is
transferred to an existing role. To transfer ownership, a user with the appropriate privileges
(typically one with the ACCOUNTADMIN or SECURITYADMIN role) can use the GRANT
OWNERSHIP command to change the object's ownership: GRANT OWNERSHIP ON TABLE
table_name TO ROLE new_owner_role;
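A sketch of the ownership transfer described above (object and role names are hypothetical):

```sql
-- Transfer ownership of a table left behind by a dropped role.
-- COPY CURRENT GRANTS preserves any outstanding grants on the table.
GRANT OWNERSHIP ON TABLE mydb.myschema.orders
  TO ROLE analyst_role
  COPY CURRENT GRANTS;
```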
46. Which of the following statements are true about Schemas in Snowflake? (Choose
two.)
• A Schema may contain one or more Databases
• A Database may contain one or more Schemas correct
• A Schema is a logical grouping of Database Objects correct
• Each Schema is contained within a Warehouse
--
47. When scaling out by adding clusters to a multi-cluster warehouse, you are primarily
scaling for improved:
• Concurrency correct
• Performance
--
48. What are the three layers that make up Snowflake’s architecture? Choose 3 answer
• Compute correct
• Tri-Secret Secure
• Storage correct
• Cloud Services correct
--
Explanation:
The following objects can be cloned in Snowflake:
Tables: You can create a clone of a table using the CREATE TABLE ... CLONE statement. This
creates a new table with the same structure and data as the source table at the time of cloning.
Schemas: You can create a clone of a schema using the CREATE SCHEMA ... CLONE
statement. This creates a new schema with the same structure and objects as the source
schema at the time of cloning.
Databases: You can create a clone of a database using the CREATE DATABASE ... CLONE
statement. This creates a new database with the same structure and objects as the source
database at the time of cloning.
Users: Users cannot be cloned. A new user with the same properties, roles, and privileges as
an existing user must be created explicitly with CREATE USER and the appropriate grants.
Named File Formats and Shares cannot be cloned directly using a single statement in
Snowflake.
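The cloning statements described above can be sketched as follows (all object names are hypothetical):

```sql
-- Zero-copy clones: metadata-only operations at creation time,
-- referencing the source object's micro-partitions.
CREATE TABLE orders_clone CLONE orders;
CREATE SCHEMA analytics_clone CLONE analytics;
CREATE DATABASE prod_clone CLONE prod;
```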
50. True or False: An active warehouse is required to run a COPY INTO statement.
• True correct
• False
--
53. Which of the following roles is recommended to be used to create and manage users
and roles?
• SYSADMIN
• SECURITYADMIN correct
• PUBLIC
• ACCOUNTADMIN
--
Explanation: "Security admin: Role that can manage any object grant globally, as well as create,
monitor, and manage users and roles"
54. Which of the following are valid Snowflake Virtual Warehouse Scaling Policies?
(Choose two.)
• Custom
• Economy correct
• Optimized
• Standard correct
--
56. True or False: It is possible to unload structured data to semi-structured formats such
as JSON and Parquet.
• True correct
• False
--
Explanation:
Snowflake supports unloading structured (relational) data to the semi-structured formats
JSON and Parquet. The COPY INTO <location> statement used for unloading accepts file
format types CSV (and other delimited text), JSON, and Parquet. Formats such as Avro,
ORC, and XML are supported for loading only, not for unloading. To produce JSON output,
rows are typically converted to objects first, e.g. with OBJECT_CONSTRUCT.
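Snowflake's COPY INTO <location> statement accepts JSON and Parquet file format types for unloading; a minimal sketch (table name and stage paths are hypothetical):

```sql
-- Unload rows as JSON objects to the user stage.
COPY INTO @~/unload/orders_json_
FROM (SELECT OBJECT_CONSTRUCT(*) FROM orders)
FILE_FORMAT = (TYPE = JSON);

-- Unload the same table as Parquet.
COPY INTO @~/unload/orders_parquet_
FROM orders
FILE_FORMAT = (TYPE = PARQUET);
```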
57. Which of the following terms best describes Snowflake’s database architecture?
• Columnar shared nothing
• Shared disk
• Multi-cluster, shared data correct
• Cloud-native shared memory
--
Explanation:
Built from the ground up for the cloud, Snowflake’s unique multi-cluster shared data architecture
delivers the performance, scale, elasticity, and concurrency today’s organizations require.
59. Which of the following statements would be used to export/unload data from
Snowflake?
• COPY INTO @stage correct
• EXPORT TO @stage
• INSERT INTO @stage
• EXPORT_TO_STAGE(stage => @stage, select => 'select * from t1');
--
60. Which of the following are options when creating a Virtual Warehouse?
• Auto-suspend correct
• Auto-resume correct
• Local SSD size
• User count
--
61. True or False: The COPY command must specify a File Format in order to execute.
• True
• False correct
--
62. True or False: Bulk unloading of data from Snowflake supports the use of a SELECT
statement.
• True correct
• False
--
64. True or False: You can query the files in an External Stage directly without having to
load the data into a table.
• True correct
• False
--
Explanation:
External tables are read-only, therefore no DML operations can be performed on them; however,
external tables can be used for query and join operations. Views can be created against external
tables.
65. True or False: Snowflake’s data warehouse was built from the ground up for the cloud
in lieu of using an existing database or a platform, like Hadoop, as a base.
• True correct
• False
--
Explanation:
Snowflake's data warehouse was built from the ground up specifically for the cloud, instead of
using an existing database or platform like Hadoop as a base. This design choice allowed
Snowflake to create a unique architecture that takes advantage of the elasticity, scalability, and
flexibility of cloud resources, resulting in a fully managed, highly performant, and cost-effective
data warehouse solution. Snowflake's multi-cluster, shared data architecture separates compute
and storage resources, allowing for independent scaling and optimized performance for various
workloads and use cases.
67. What are the three things customers want most from their enterprise data warehouse
solution? Choose 3 answers
• On-premise availability
• Simplicity correct
• Open source based
• Concurrency correct
• Performance correct
--
68. When reviewing a query profile, what is a symptom that a query is too large to fit into
memory?
• A single join node uses more than 50% of the query time
• Partitions scanned is equal to partitions total
• An AggregateOperator node is present
• The query is spilling to remote storage correct
--
69. Which of the following are valid methods for authenticating users for access into
Snowflake? (Select THREE)
• SCIM
• Federated authentication correct
• TLS 1.2
• Key-pair authentication correct
• OAuth correct
• OCSP authentication
--
70. A developer is granted ownership of a table that has a masking policy. The developer's
role is not able to see the masked data.
Will the developer be able to modify the table to read the masked data?
• Yes, because a table owner has full control and can unset masking policies. correct
• Yes, because masking policies only apply to cloned tables.
• No, because masking policies must always reference specific access roles.
• No, because ownership of a table does not include the ability to change masking policies
--
71. A virtual warehouse's auto-suspend and auto-resume settings apply to which of the
following?
• The primary cluster in the virtual warehouse
• The entire virtual warehouse correct
• The database in which the virtual warehouse resides
• The Queries currently being run on the virtual warehouse
--
72. When reviewing the load for a warehouse using the load monitoring chart, the chart
indicates that a high volume of Queries are always queuing in the warehouse
According to recommended best practice, what should be done to reduce the Queue
volume? (Select TWO).
• Use multi-clustered warehousing to scale out warehouse capacity. correct
• Scale up the warehouse size to allow Queries to execute faster.
• Stop and start the warehouse to clear the queued queries
• Migrate some queries to a new warehouse to reduce load correct
• Limit user access to the warehouse so fewer queries are run against it.
--
73. Which ACCOUNT_USAGE views are used to evaluate the details of dynamic data
masking? (Select TWO)
• ROLES
• POLICY_REFERENCES correct
• QUERY_HISTORY
• RESOURCE_MONITORS
• ACCESS_HISTORY correct
--
Explanation:
To evaluate the details of dynamic data masking in Snowflake, you can use the following account
usage views:
POLICY_REFERENCES: This view provides information about the masking policies applied to
various objects (tables, columns, etc.) within the account. It includes details like policy name,
object name, and the masking policy expression.
ACCESS_HISTORY: This view provides a historical record of access attempts to objects (tables,
columns, etc.) in the account, including whether the access was allowed or blocked, and if
masking was applied during the access. It can help you understand how data masking policies
are being enforced and identify any potential issues with the masking configuration.
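A sketch of how the POLICY_REFERENCES view might be queried (the SNOWFLAKE.ACCOUNT_USAGE schema is standard; the filter value shown is illustrative):

```sql
-- Which masking policies are attached to which columns?
SELECT policy_name, ref_entity_name, ref_column_name
FROM SNOWFLAKE.ACCOUNT_USAGE.POLICY_REFERENCES
WHERE policy_kind = 'MASKING_POLICY';
```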
74. How would you determine the size of the virtual warehouse used for a task?
• Root task may be executed concurrently (i.e. multiple instances), it is recommended to
leave some margins in the execution window to avoid missing instances of execution
• Querying (select) the size of the stream content would help determine the warehouse
size. For example, if querying large stream content, use a larger warehouse size
• If using the stored procedure to execute multiple SQL statements, it's best to test
run the stored procedure separately to size the compute resource first correct
• Since task infrastructure is based on running the task body on schedule, it's
recommended to configure the virtual warehouse for automatic concurrency handling
using Multi-cluster warehouse (MCW) to match the task schedule
--
75. Which of the following are best practice recommendations that should be considered
when loading data into Snowflake? (Select TWO).
• Load files that are approximately 25 MB or smaller.
• Remove all dates and timestamps.
• Load files that are approximately 100-250 MB (or larger) correct
• Avoid using embedded characters such as commas for numeric data types correct
• Remove semi-structured data types
--
76. A company's security audit requires generating a report listing all Snowflake logins
(e.g.. date and user) within the last 90 days.
77. What feature can be used to reorganize a very large table on one or more columns?
• Micro-partitions
• Clustering keys correct
• Key partitions
• Clustered partitions
--
78. In the query profiler view for a query, which components represent areas that can be
used to help optimize query performance? (Select TWO)
• Bytes scanned correct
• Bytes sent over the network
• Number of partitions scanned correct
• Percentage scanned from cache
• External bytes scanned
--
79. What is a machine learning and data science partner within the Snowflake Partner
Ecosystem?
• Informatica
• Power BI
• Adobe
• DataRobot correct
--
80. Which command can be used to load data into an internal stage?
• LOAD
• COPY
• GET
• PUT correct
--
81. A user is loading JSON documents composed of a huge array containing multiple
records into Snowflake. The user enables the STRIP_OUTER_ARRAY file format option.
Explanation:
Data Size Limitations
For some semi-structured data formats (e.g. JSON), data sets are frequently a simple
concatenation of multiple documents. The JSON output from some software is composed of a
single huge array containing multiple records. There is no need to separate the documents with
line breaks or commas, though both are supported.
If the data exceeds 16 MB, enable the STRIP_OUTER_ARRAY file format option for the COPY
INTO <table> command to remove the outer array structure and load the records into separate
table rows:
copy into <table>
from @~/<file>.json
file_format = (type = 'JSON' strip_outer_array = true);
82. Which COPY INTO option outputs the data into one file?
• SINGLE=TRUE correct
• MAX_FILE_NUMBER=1
• FILE_NUMBER=1
• MULTIPLE=FALSE
--
84. What happens to the underlying table data when a CLUSTER BY clause is added to a
Snowflake table?
• Data is hashed by the cluster key to facilitate fast searches for common data values
• Larger micro-partitions are created for common data values to reduce the number of
partitions that must be scanned
• Smaller micro-partitions are created for common data values to allow for more parallelism
• Data may be colocated by the cluster key within the micro-partitions to improve
pruning performance correct
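A sketch of defining a clustering key and inspecting its effect (table and column names are hypothetical):

```sql
-- Co-locate rows by the cluster key to improve partition pruning.
ALTER TABLE orders CLUSTER BY (order_date);

-- Inspect clustering quality for the chosen key.
SELECT SYSTEM$CLUSTERING_INFORMATION('orders', '(order_date)');
```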
--
Which SQL command should be used to validate which data was loaded into the stage?
• list @file_stage correct
• show @file_stage
• view @file_stage
• verify @file_stage
--
86. A user has an application that writes a new file to a cloud storage location every 5
minutes.
What would be the MOST efficient way to get the files into Snowflake?
• Create a task that runs a copy into operation from an external stage every 5 minutes
• Create a task that puts the files in an internal stage and automate the data loading wizard
• Create a task that runs a GET operation to intermittently check for new files
• Set up cloud provider notifications on the file location and use Snowpipe with
auto-ingest correct
--
87. What is the default File Format used in the COPY command if one is not specified?
• CSV correct
• JSON
• Parquet
• XML
--
Explanation:
Snowflake Time Travel is a feature that enables you to access historical data within a specified
period, known as the data retention period. It allows you to query data as it existed at a specific
point in time, which can be useful for point-in-time analysis, BI reporting, or data recovery in case
of accidental data changes or deletions.
Explanation:
Snowflake applies egress charges in cases where data is transferred out of Snowflake's
infrastructure, such as when retrieving query results. These charges are typically imposed by the
cloud service provider (e.g., AWS, Azure, or Google Cloud Platform) for data transfer out of their
respective data centers.
91. Which feature is only available in the Enterprise or higher editions of Snowflake?
• Column-level security correct
• SOC 2 type II certification
• Multi-factor Authentication (MFA)
• Object-level access control
--
92. Which of the following is a valid source for an external stage when the Snowflake
account is located on Microsoft Azure?
• An FTP server with TLS encryption
• An HTTPS server with WebDAV
• A Google Cloud storage bucket correct
• A Windows server file share on Azure
--
Explanation:
Snowflake credits are used to pay for the consumption of resources on Snowflake. A Snowflake
credit is a unit of measure, and it is consumed only when a customer is using resources, such as
when a virtual warehouse is running, the cloud services layer is performing work, or serverless
features are used. https://docs.snowflake.com/en/user-guide/what-are-credits.html
94. Which Snowflake technique can be used to improve the performance of a query?
• Clustering correct
• Indexing
• Fragmenting
• Using INDEX_HINTS
--
95. What are the default Time Travel and Fail-safe retention periods for transient tables?
• Time Travel - 1 day. Fail-safe - 1 day
• Time Travel - 0 days. Fail-safe - 1 day
• Time Travel - 1 day. Fail-safe - 0 days correct
• Transient tables are retained in neither Fail-safe nor Time Travel
--
Explanation:
Transient tables have a default Time Travel retention period of 1 day (which can be reduced
to 0), but no Fail-safe period. They are designed for temporary storage and intermediate
processing, so once data ages out of the Time Travel window it cannot be recovered, since
Fail-safe does not apply to transient tables.
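For reference, transient tables default to 1 day of Time Travel (reducible to 0) and carry no Fail-safe storage; a sketch with hypothetical names:

```sql
-- Transient table: up to 1 day of Time Travel, no Fail-safe storage.
CREATE TRANSIENT TABLE staging_orders (
  id NUMBER,
  payload VARIANT
)
DATA_RETENTION_TIME_IN_DAYS = 1;
```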
Explanation:
Micro-partitioning is a key feature of Snowflake's architecture, providing several benefits:
Micro-partitions are immutable objects that support the use of Time Travel. Each time data is
loaded or modified, new micro-partitions are created, preserving the previous versions of data to
enable Time Travel and historical data querying.
Micro-partitions can reduce the amount of I/O from object storage to virtual warehouses.
Snowflake's query optimizer leverages metadata about micro-partitions, such as the range of
values within them, to prune unnecessary micro-partitions from query scans. This reduces the
amount of data read and improves query performance.
97. Which of the following conditions must be met in order to return results from the
results cache? (Select TWO).
• The user has the appropriate privileges on the objects associated with the query
correct
• Micro-partitions have been reclustered since the query was last run
• The new query is run using the same virtual warehouse as the previous query
• The query includes a User Defined Function (UDF)
• The query has been run within 24 hours of the previously-run query correct
--
98. What happens when a cloned table is replicated to a secondary database? (Select
TWO)
• A read-only copy of the cloned tables is stored. correct
• The replication will not be successful.
• The physical data is replicated
• Additional costs for storage are charged to a secondary account
• Metadata pointers to cloned tables are replicated correct
--
Explanation:
When a cloned table is replicated to a secondary database in Snowflake, the following occurs:
A read-only copy of the cloned tables is stored in the secondary database. This ensures that the
replicated data is consistent with the primary database and can be used for reporting, querying,
or other read-only operations.
Metadata pointers to cloned tables are replicated, rather than the physical data. Snowflake's
cloning operation is metadata-based, which means that cloning a table does not create a full
copy of the data. Instead, the new table references the same underlying data as the original
table. When replicating a cloned table, Snowflake replicates these metadata pointers, ensuring
efficient replication without duplicating the physical data.
99. What can be used to view warehouse usage over time? (Select TWO).
• The LOAD_HISTORY view
• The QUERY_HISTORY view
• The SHOW WAREHOUSES command
• The WAREHOUSE_METERING_HISTORY view correct
• The Billing & Usage tab in the Snowflake web UI correct
--
100. Will data cached in a warehouse be lost when the warehouse is resized?
• Possibly, if the warehouse is resized to a smaller size and the cache no longer fits.
• Yes, because the compute resource is replaced in its entirety with a new compute
resource.
• No, because the size of the cache is independent from the warehouse size correct
• Yes, because the new compute resource will no longer have access to the cache
encryption key
--
2. What are ways to create and manage data shares in Snowflake? (Select TWO)
• Through the Snowflake web interface (UI) correct
• Through the DATA_SHARE=TRUE parameter
• Through SQL commands correct
• Through the ENABLE_SHARE=TRUE parameter
• Using the CREATE SHARE AS SELECT * TABLE command
--
4. Which is the MINIMUM required Snowflake edition that a user must have if they want to
use AWS/Azure Privatelink or Google Cloud Private Service Connect?
• Standard
• Premier
• Enterprise
• Business Critical correct
--
Explanation:
Snowflake Time Travel enables accessing historical data (i.e. data that has been changed or
deleted) at any point within a defined period.
✑ Restoring data-related objects (tables, schemas, and databases) that might have been
accidentally or intentionally deleted.
Explanation:
Characteristics of a VARIANT
A VARIANT can store a value of any other type, including OBJECT and ARRAY. The maximum
length of a VARIANT is 16 MB.
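A sketch of storing and querying VARIANT data (table and column names are hypothetical):

```sql
-- A VARIANT column can hold any JSON value up to 16 MB.
CREATE TABLE events (v VARIANT);
INSERT INTO events
  SELECT PARSE_JSON('{"user": "alice", "tags": ["a", "b"]}');

-- Traverse the semi-structured data with path notation and casts.
SELECT v:user::STRING AS user_name, v:tags[0]::STRING AS first_tag
FROM events;
```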
9. True or False: It is possible for a user to run a query against the query result cache
without requiring an active Warehouse.
• True correct
• False
--
Explanation:
Query result cache is all about fetching the data from cloud services layer and saving the cost by
not running the virtual warehouse.
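A sketch of toggling the result cache at the session level (USE_CACHED_RESULT is a standard session parameter):

```sql
-- Disable reuse of cached query results for this session,
-- e.g. when benchmarking warehouse performance.
ALTER SESSION SET USE_CACHED_RESULT = FALSE;
```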
10. Which of the following indicates that it may be appropriate to use a clustering key for a
table? (Select TWO).
• The table contains a column that has very low cardinality
• DML statements that are being issued against the table are blocked
• The table has a small number of micro-partitions
• Queries on the table are running slower than expected correct
• The clustering depth for the table is large correct
--
12. Query compilation occurs in which architecture layer of the Snowflake Cloud Data
Platform?
• Compute layer
• Storage layer
• Cloud infrastructure layer
• Cloud services layer correct
--
Explanation:
Query compilation and optimization happen in the Cloud Services layer. Query execution
then takes place in the compute layer, where Virtual Warehouses process the data retrieved
from the storage layer.
13. In which scenarios would a user have to pay Cloud Services costs? (Select TWO).
• Compute Credits = 50 Credits Cloud Services = 10 correct
• Compute Credits = 80 Credits Cloud Services = 5
• Compute Credits = 10 Credits Cloud Services = 9
• Compute Credits = 120 Credits Cloud Services = 10
• Compute Credits = 200 Credits Cloud Services = 26 correct
--
Explanation:
Alation provides Data Cataloging functionality. They state they are the 'One Place to Find,
Understand, & Govern Data Across an Enterprise. https://docs.snowflake.com/en/user-
guide/ecosystem-all.html
15. Which Snowflake technique can be used to improve the performance of a query?
• Clustering correct
• Indexing
• Fragmenting
• Using INDEX_HINTS
--
16. Which of the following objects can be shared through secure data sharing?
• Masking policy
• Stored procedure
• Task
• External table correct
--
Explanation:
Secure Data Sharing enables sharing selected objects in a database in your account with other
Snowflake accounts.
✑ Tables
✑ External tables
✑ Secure views
✑ Secure UDFs
Snowflake enables the sharing of databases through shares, which are created by data providers
and “imported” by data consumers.
17. Which semi-structured file formats are supported when unloading data from a table?
(Select TWO).
• ORC
• XML
• Avro
• Parquet correct
• JSON correct
--
Explanation:
Semi-structured - JSON, Parquet
18. Which ACCOUNT_USAGE views are used to evaluate the details of dynamic data
masking? (Select TWO)
• ROLES
• POLICY_REFERENCES correct
• QUERY_HISTORY
• RESOURCE_MONITORS
• ACCESS_HISTORY correct
--
19. True or False: A 4X-Large Warehouse may, at times, take longer to provision than a X-
Small Warehouse.
• True correct
• False
--
Explanation:
Larger warehouses require more compute resources, so they can at times take longer to
provision. You can observe this behavior in the Snowflake UI.
20. What happens to the underlying table data when a CLUSTER BY clause is added to a
Snowflake table?
• Data is hashed by the cluster key to facilitate fast searches for common data values
• Larger micro-partitions are created for common data values to reduce the number of
partitions that must be scanned
• Smaller micro-partitions are created for common data values to allow for more parallelism
• Data may be colocated by the cluster key within the micro-partitions to improve pruning
performance correct
--
Explanation:
There are several limitations to using materialized views:
22. What can be used to view warehouse usage over time? (Select Two).
• The LOAD_HISTORY view
• The QUERY_HISTORY view
• The SHOW WAREHOUSES command
• The WAREHOUSE_METERING_HISTORY view correct
• The Billing & Usage tab in the Snowflake web UI correct
--
23. What are the default Time Travel and Fail-safe retention periods for transient tables?
• Time Travel - 1 day. Fail-safe - 1 day
• Time Travel - 0 days. Fail-safe - 1 day
• Time Travel - 1 day. Fail-safe - 0 days correct
• Transient tables are retained in neither Fail-safe nor Time Travel
--
24. Which of the following is a valid source for an external stage when the Snowflake
account is located on Microsoft Azure?
• An FTP server with TLS encryption
• An HTTPS server with WebDAV
• A Google Cloud storage bucket correct
• A Windows server file share on Azure
--
25. Which of the following describes how multiple Snowflake accounts in a single
organization relate to various cloud providers?
• Each Snowflake account can be hosted in a different cloud vendor and region. correct
• Each Snowflake account must be hosted in a different cloud vendor and region
• All Snowflake accounts must be hosted in the same cloud vendor and region
• Each Snowflake account can be hosted in a different cloud vendor, but must be in the
same region.
--
26. When reviewing a query profile, what is a symptom that a query is too large to fit into
the memory?
• A single join node uses more than 50% of the query time
• Partitions scanned is equal to partitions total
• An AggregateOperator node is present
• The query is spilling to remote storage correct
--
27. A user unloaded a Snowflake table called mytable to an internal stage called mystage.
Which command can be used to view the list of files that have been uploaded to the
stage?
• list @mytable;
• list @%mytable;
• list @%mystage;
• list @mystage; correct
--
28. Which Snowflake object enables loading data from files as soon as they are available
in a cloud storage location?
• Pipe correct
• External stage
• Task
• Stream
--
Explanation:
Snowpipe enables loading data from files as soon as they’re available in a stage. This means
you can load data from files in micro-batches, making it available to users within minutes,
rather than manually executing COPY statements on a schedule to load larger batches.
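A sketch of a Snowpipe with auto-ingest, assuming an external stage and target table already exist (all object names are hypothetical):

```sql
-- Load new files automatically as cloud event notifications arrive.
CREATE PIPE mydb.public.orders_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO mydb.public.orders
  FROM @mydb.public.orders_stage
  FILE_FORMAT = (TYPE = CSV);
```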
29. What SQL command would be used to view all roles that were granted to USER1?
• show grants to user USER1; correct
• show grants of user USER1;
• describe user USER1;
• show grants on user USER1;
--
Explanation:
When creating roles that will serve as the owners of securable objects in the system, Snowflake
recommends creating a hierarchy of custom roles, with the top-most custom role assigned to the
system role SYSADMIN. This role structure allows system administrators to manage all objects
in the account, such as warehouses and database objects, while restricting management of
users and roles to the USERADMIN role.
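The recommended hierarchy can be sketched as follows; the custom role name is hypothetical:

```sql
-- Run with USERADMIN (or higher) to create the custom role
CREATE ROLE analyst;

-- Attach the top-most custom role to SYSADMIN so system
-- administrators can manage the objects it will own
GRANT ROLE analyst TO ROLE sysadmin;
```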
31. True or False: Loading data into Snowflake requires that source data files be no larger
than 16MB.
• True
• Falsecorrect
--
Explanation:
By default, COPY INTO location statements separate table data into a set of output files to take
advantage of parallel operations. The maximum size for each file is set using the
MAX_FILE_SIZE copy option. The default value is 16777216 (16 MB) but can be increased to
accommodate larger files. The maximum file size supported is 5 GB for Amazon S3, Google
Cloud Storage, or Microsoft Azure stages. To unload data to a single output file (at the potential
cost of decreased performance), specify the SINGLE = true copy option in your statement. You
can optionally specify a name for the file in the path.
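A hedged sketch of a single-file unload using the options described above (the stage and table names are hypothetical):

```sql
-- Unload an entire table to one gzipped CSV file on a named stage
COPY INTO @mystage/unload/mytable.csv.gz
FROM mytable
FILE_FORMAT = (TYPE = 'CSV' COMPRESSION = 'GZIP')
SINGLE = TRUE
MAX_FILE_SIZE = 5368709120;  -- raise the 16 MB default, up to 5 GB on cloud stages
```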
32. From which of the following Snowflake interfaces can the PUT command be used to
stage local files?
• SnowSQLcorrect
• Snowflake classic web interface (UI)
• Snowsight
• .NET driver
--
33. Which of the following conditions must be met in order to return results from the
results cache? (Select TWO).
• The user has the appropriate privileges on the objects associated with the querycorrect
• Micro-partitions have been reclustered since the query was last run
• The new query is run using the same virtual warehouse as the previous query
• The query includes a User Defined Function (UDF)
• The query has been run within 24 hours of the previously-run querycorrect
--
34. When is the result set cache no longer available? (Select TWO)
• When another warehouse is used to execute the query
• When another user executes the query
• When the underlying data has changedcorrect
• When the warehouse used to execute the query is suspended
• When it has been 24 hours since the last querycorrect
--
35. A developer is granted ownership of a table that has a masking policy. The developer's
role is not able to see the masked data. Will the developer be able to modify the table to
read the masked data?
• Yes, because a table owner has full control and can unset masking policies.
• Yes, because masking policies only apply to cloned tables.
• No, because masking policies must always reference specific access roles.
• No, because ownership of a table does not include the ability to change masking
policiescorrect
--
36. Which of the following describes how clustering keys work in Snowflake?
• Clustering keys update the micro-partitions in place with a full sort, and impact the DML
operations.
• Clustering keys sort the designated columns over time, without blocking DML
operationscorrect
• Clustering keys create a distributed, parallel data structure of pointers to a table's rows
and columns
• Clustering keys establish a hashed key on each node of a virtual warehouse to optimize
joins at run-time
--
37. What feature can be used to reorganize a very large table on one or more columns?
• Micro-partitions
• Clustering keys correct
• Key partitions
• Clustered partitions
--
Explanation:
Bulk data load: Stored in the metadata of the target table for 64 days. Available upon completion
of the COPY statement as the statement output.
40. What Snowflake role must be granted for a user to create and manage accounts?
• ACCOUNTADMIN
• ORGADMINcorrect
• SECURITYADMIN
• SYSADMIN
--
41. What data is stored in the Snowflake storage layer? (Select TWO).
• Snowflake parameters
• Micro-partitionscorrect
• Query history
• Persisted query resultscorrect
• Standard and secure view results
--
42. A company's security audit requires generating a report listing all Snowflake logins
(e.g., date and user) within the last 90 days.
43. How would you determine the size of the virtual warehouse used for a task?
• Root task may be executed concurrently (i.e. multiple instances), it is recommended to
leave some margins in the execution window to avoid missing instances of execution
• Querying (select) the size of the stream content would help determine the warehouse
size. For example, if querying large stream content, use a larger warehouse size
• If using the stored procedure to execute multiple SQL statements, it's best to test run the
stored procedure separately to size the compute resource firstcorrect
• Since task infrastructure is based on running the task body on schedule, it's
recommended to configure the virtual warehouse for automatic concurrency handling
using Multi-cluster warehouse (MCW) to match the task schedule
--
44. In the query profiler view for a query, which components represent areas that can be
used to help optimize query performance? (Select TWO)
• Bytes scannedcorrect
• Bytes sent over the network
• Number of partitions scanned
• Percentage scanned from cachecorrect
• External bytes scanned
--
45. User-level network policies can be created by which of the following roles? (Select
TWO).
• ROLEADMIN
• ACCOUNTADMINcorrect
• SYSADMIN
• SECURITYADMINcorrect
• USERADMIN
--
Explanation:
By default, Snowflake allows users to connect to the service from any computer or device IP
address. A security administrator (or higher) can create a network policy to allow or deny access
to a single IP address or a list of addresses. Network policies currently support only Internet
Protocol version 4 (i.e. IPv4) addresses.
An administrator with sufficient permissions can create any number of network policies.
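A minimal sketch of the commands involved; the policy name, user name, and IP ranges are examples only:

```sql
-- Run as SECURITYADMIN (or higher)
CREATE NETWORK POLICY corp_policy
  ALLOWED_IP_LIST = ('192.168.1.0/24')
  BLOCKED_IP_LIST = ('192.168.1.99');  -- IPv4 addresses only

-- Account-level policy
ALTER ACCOUNT SET NETWORK_POLICY = corp_policy;

-- User-level policy (takes precedence over the account-level policy)
ALTER USER user1 SET NETWORK_POLICY = corp_policy;
```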
The ingest operation completes with no errors, using the following command:
The next day the user adds 10 files to the stage so that now the stage contains a mixture
of new customer data and updates to the previous data. The user did not remove the 10
original files.
If the user runs the same copy into command what will happen?
• All data from all of the files on the stage will be appended to the table
• Only data about new customers from the new files will be appended to the table
• The operation will fail with the error uncertain files in stage.
• All data from only the newly-added files will be appended to the table.correct
--
47. What happens when a cloned table is replicated to a secondary database? (Select
TWO)
• A read-only copy of the cloned tables is stored.
• The replication will not be successful.
• The physical data is replicatedcorrect
• Additional costs for storage are charged to a secondary accountcorrect
• Metadata pointers to cloned tables are replicated
--
Explanation:
Cloned objects are replicated physically rather than logically to secondary databases. That is,
cloned tables in a standard database do not contribute to the overall data storage unless or until
DML operations on the clone add to or modify existing data.
However, when a cloned table is replicated to a secondary database, the physical data is also
replicated, increasing the data storage usage for your account.
48. Which feature is only available in the Enterprise or higher editions of Snowflake?
• Column-level securitycorrect
• SOC 2 type II certification
• Multi-factor Authentication (MFA)
• Object-level access control
--
Explanation:
https://docs.snowflake.com/en/user-guide/intro-editions.html
49. Which of the following Snowflake objects can be shared using a secure share? (Select
TWO).
• Materialized views
• Sequences
• Procedures
• Tablescorrect
• Secure User Defined Functions (UDFs)correct
--
Explanation:
Secure Data Sharing enables sharing selected objects in a database in your account with other
Snowflake accounts.
✑ Tables
✑ External tables
✑ Secure views
✑ Secure materialized views
✑ Secure UDFs
Explanation:
Snowflake credits are used to pay for the consumption of resources on Snowflake. A Snowflake
credit is a unit of measure, and it is consumed only when a customer is using resources, such as
when a virtual warehouse is running, the cloud services layer is performing work, or serverless
features are used. https://docs.snowflake.com/en/user-guide/what-are-credits.html
52. Which COPY INTO option outputs the data into one file?
• SINGLE=TRUEcorrect
• MAX_FILE_NUMBER=1
• FILE_NUMBER=1
• MULTIPLE=FALSE
--
53. Which of the following compute resources or features are managed by Snowflake?
(Select TWO).
• Execute a COPY command
• Updating data
• Snowpipecorrect
• AUTOMATIC_CLUSTERINGcorrect
• Scaling up a warehouse
--
54. Which SQL command should be used to validate which data was loaded into the stage?
• list @file_stagecorrect
• show @file_stage
• view @file_stage
• verify @file_stage
--
55. Which of the following are valid methods for authenticating users for access into
Snowflake? (Select THREE)
• SCIM
• Federated authenticationcorrect
• TLS 1.2
• Key-pair authenticationcorrect
• OAuthcorrect
• OCSP authentication
--
56. Which Snowflake objects track DML changes made to tables, like inserts, updates, and
deletes?
• Pipes
• Streamscorrect
• Tasks
• Procedures
--
57. What is the recommended file sizing for data loading using Snowpipe?
• A compressed file size greater than 100 MB, and up to 250 MBcorrect
• A compressed file size greater than 100 GB, and up to 250 GB
• A compressed file size greater than 10 MB, and up to 100 MB
• A compressed file size greater than 1 GB, and up to 2 GB
--
59. Which data type can be used to store geospatial data in Snowflake?
• Variant
• Object
• Geometry
• Geographycorrect
--
60. Which of the following are best practice recommendations that should be considered
when loading data into Snowflake? (Select TWO).
• Load files that are approximately 25 MB or smaller.
• Remove all dates and timestamps.
• Load files that are approximately 100-250 MB (or larger)correct
• Avoid using embedded characters such as commas for numeric data typescorrect
• Remove semi-structured data types
--
Explanation:
Cloud providers apply data egress charges in either of the following use cases:
✑ Data is transferred from one region to another within the same cloud platform.
Explanation:
All Snowflake-managed keys are automatically rotated by Snowflake when they are more than
30 days old. Active keys are retired, and new keys are created. When Snowflake determines the
retired key is no longer needed, the key is automatically destroyed.
63. What is a machine learning and data science partner within the Snowflake Partner
Ecosystem?
• Informatica
• Power BI
• Adobe
• Data Robotcorrect
--
64. Which Snowflake feature is used for both querying and restoring data?
• Cluster keys
• Time Travelcorrect
• Fail-safe
• Cloning
--
66. Which services does the Snowflake Cloud Services layer manage? (Select TWO).
• Compute resources
• Query execution
• Authenticationcorrect
• Data storage
• Metadatacorrect
--
Explanation:
https://docs.snowflake.com/en/user-guide/intro-key-concepts.html The cloud services layer is a
collection of services that coordinate activities across Snowflake. These services tie together all
of the different components of Snowflake in order to process user requests, from login to query
dispatch. The cloud services layer also runs on compute instances provisioned by Snowflake
from the cloud provider.
✑ Authentication
✑ Infrastructure management
✑ Metadata management
✑ Access control
67. What happens when an external or an internal stage is dropped? (Select TWO).
• When dropping an external stage, the files are not removed and only the stage is
droppedcorrect
• When dropping an external stage, both the stage and the files within the stage are
removed
• When dropping an internal stage, the files are deleted with the stage and the files are
recoverable
• When dropping an internal stage, the files are deleted with the stage and the files are not
recoverablecorrect
• When dropping an internal stage, only selected files are deleted with the stage and are
not recoverable
--
68. What is the default File Format used in the COPY command if one is not specified?
• CSVcorrect
• JSON
• Parquet
• XML
--
70. Where would a Snowflake user find information about query activity from 90 days
ago?
• ACCOUNT_USAGE.QUERY_HISTORY viewcorrect
• ACCOUNT_USAGE.QUERY_HISTORY_ARCHIVE view
• INFORMATION_SCHEMA.QUERY_HISTORY view
• INFORMATION_SCHEMA.QUERY_HISTORY_BY_SESSION view
--
71. Which of the following Snowflake capabilities are available in all Snowflake editions?
(Select TWO)
• Customer-managed encryption keys through Tri-Secret Secure
• Automatic encryption of all datacorrect
• Up to 90 days of data recovery through Time Travel
• Object-level access controlcorrect
• Column-level security to apply data masking policies to tables and views
--
72. What is the minimum Snowflake edition required to create a materialized view?
• Standard Edition
• Enterprise Editioncorrect
• Business Critical Edition
• Virtual Private Snowflake Edition
--
Explanation:
Materialized views require Enterprise Edition. To inquire about upgrading,please contact
Snowflake Support
73. The Information Schema and Account Usage Share provide storage information for
which of the following objects? (Choose three.)
• Users
• Tablescorrect
• Databasescorrect
• Internal Stagescorrect
--
75. A user is loading JSON documents composed of a huge array containing multiple
records into Snowflake. The user enables the STRIP_OUTER_ARRAY file format option.
For some semi-structured data formats (e.g. JSON), data sets are frequently a simple
concatenation of multiple documents. The JSON output from some software is composed of a
single huge array containing multiple records. There is no need to separate the documents with
line breaks or commas, though both are supported.
If the data exceeds 16 MB, enable the STRIP_OUTER_ARRAY file format option for the COPY
INTO <table> command to remove the outer array structure and load the records into separate
table rows:
from @~/<file>.json
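A hedged sketch of the COPY statement described above (the table name and file path are hypothetical):

```sql
-- Remove the outer array so each element becomes its own row
COPY INTO my_json_table
FROM @~/staged/myfile.json
FILE_FORMAT = (TYPE = 'JSON' STRIP_OUTER_ARRAY = TRUE);
```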
76. Which of the following Snowflake features provide continuous data protection
automatically? (Select TWO).
• Internal stages
• Incremental backups
• Time Travelcorrect
• Zero-copy clones
• Fail-safecorrect
--
Explanation:
Time Travel and Fail-safe are the two continuous data protection features that support the
recovery of data automatically.
Snowflake provides powerful CDP features for ensuring the maintenance and availability of your
historical data (i.e. data that has been changed or deleted):
✑ Querying, cloning, and restoring historical data in tables, schemas, and databases for up to 90
days through Snowflake Time Travel.
https://docs.snowflake.com/en/user-guide/data-availability.html
77. During periods of warehouse contention which parameter controls the maximum
length of time a warehouse will hold a query for processing?
• STATEMENT_TIMEOUT_IN_SECONDS
• STATEMENT_QUEUED_TIMEOUT_IN_SECONDScorrect
• MAX_CONCURRENCY_LEVEL
• QUERY_TIMEOUT_IN_SECONDS
--
Explanation:
The parameter STATEMENT_QUEUED_TIMEOUT_IN_SECONDS sets the limit for a query to
wait in the queue in order to get its chance of running on the warehouse. The query will quit after
reaching this limit. By default, the value of this parameter is 0, which means queries will wait
indefinitely in the waiting queue.
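For illustration, the parameter can be set on a warehouse like this (the warehouse name and timeout value are hypothetical):

```sql
-- Cancel any query that has waited in the queue for more than 5 minutes
ALTER WAREHOUSE my_wh SET STATEMENT_QUEUED_TIMEOUT_IN_SECONDS = 300;
```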
Explanation:
Separate and distinct from Time Travel, Fail-safe ensures historical data is protected in the event
of a system failure or other catastrophic event, e.g. a hardware failure or security breach. The
Fail-safe feature cannot be enabled or disabled by the user.
Explanation:
External functions are user-defined functions that are stored and executed outside of Snowflake.
External functions make it easier to access external API services such as geocoders, machine
learning models, and other custom code running outside of Snowflake. This feature eliminates
the need to export and reimport data when using third-party services, significantly simplifying
your data pipelines.
80. Which data types does Snowflake support when querying semi-structured data?
(Select TWO)
• VARIANTcorrect
• ARRAYcorrect
• VARCHAR
• XML
• BLOB
--
Explanation:
A VARIANT stores semi-structured data in Snowflake. It can store a value of any other type,
including OBJECT and ARRAY.
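A brief sketch of querying a VARIANT column with path notation and casts (the table and keys are hypothetical):

```sql
-- v is a VARIANT column holding JSON documents
SELECT v:customer.name::STRING AS customer_name,
       v:items[0].sku::STRING  AS first_sku
FROM   raw_events;
```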
81. What is the default character set used when loading CSV files into Snowflake?
• UTF-8correct
• UTF-16
• ISO-8859-1
• ANSI_X3.4
--
Explanation:
For delimited files (CSV, TSV, etc.), the default character set is UTF-8. To use any other
character sets, you must explicitly specify the encoding to use for loading. For the list of
supported character sets, see Supported Character Sets for Delimited Files.
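A sketch of explicitly declaring a non-default encoding on a file format (the format name is hypothetical):

```sql
CREATE FILE FORMAT my_latin1_csv
  TYPE = 'CSV'
  ENCODING = 'ISO-8859-1';  -- override the UTF-8 default
```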
82. What are two ways to create and manage Data Shares in Snowflake? (Choose two.)
• Via the Snowflake Web Interface (UI)correct
• Via the data_share=true parameter
• Via SQL commandscorrect
• Via Virtual Warehouses
--
83. A marketing co-worker has requested the ability to change a warehouse size on their
medium virtual warehouse called mktg_WH.
Explanation:
Reference: https://docs.snowflake.com/en/user-guide/warehouses-tasks.html#effects-of-resizing-
a-suspended-warehouse
85. When reviewing the load for a warehouse using the load monitoring chart, the chart
indicates that a high volume of Queries are always queuing in the warehouse
According to recommended best practice, what should be done to reduce the Queue
volume? (Select TWO).
• Use multi-clustered warehousing to scale out warehouse capacity.correct
• Scale up the warehouse size to allow Queries to execute faster.
• Stop and start the warehouse to clear the queued queries
• Migrate some queries to a new warehouse to reduce loadcorrect
• Limit user access to the warehouse so fewer queries are run against it.
--
86. A user has an application that writes a new file to a cloud storage location every 5
minutes.
What would be the MOST efficient way to get the files into Snowflake?
• Create a task that runs a copy into operation from an external stage every 5 minutes
• Create a task that puts the files in an internal stage and automate the data loading wizard
• Create a task that runs a GET operation to intermittently check for new files
• Set up cloud provider notifications on the file location and use Snowpipe with auto-
ingestcorrect
--
Which application will the Snowflake users need to install on their devices in order to
connect with MFA?
• Okta Verify
• Duo Mobilecorrect
• Microsoft Authenticator
• Google Authenticator
--
89. True or False: Reader Accounts are able to extract data from shared data objects for
use outside of Snowflake.
• Truecorrect
• False
--
90. What is the MOST performant file format for loading data in Snowflake?
• CSV (Unzipped)correct
• Parquet
• CSV (Gzipped)
• ORC
--
91. Which cache type is used to cache data output from SQL queries?
• Metadata cache
• Result cachecorrect
• Remote cache
• Local file cache
--
92. True or False: When you create a custom role, it is a best practice to immediately grant
that role to ACCOUNTADMIN.
• True
• Falsecorrect
--
93. What tasks can be completed using the copy command? (Select TWO)
• Columns can be aggregated
• Columns can be joined with an existing table
• Columns can be reorderedcorrect
• Columns can be omittedcorrect
• Data can be loaded without the need to spin up a virtual warehouse
--
94. Which of the following commands cannot be used within a reader account?
• CREATE SHAREcorrect
• ALTER WAREHOUSE
• DROP ROLE
• SHOW SCHEMAS
• DESCRBE TABLE
--
Explanation:
A reader account is intended primarily for querying data shared by the provider of the account.
Adding new data to the account and/or updating shared data in the account is not supported.
Changing the configuration of virtual warehouses is also not permitted as those resources are
owned and managed by the provider of the account which is sharing the data.
96. Which command is used to unload data from a Snowflake table into a file in a stage?
• COPY INTOcorrect
• GET
• WRITE
• EXTRACT INTO
--
97. Which command can be used to load data into an internal stage?
• LOAD
• COPY
• GET
• PUTcorrect
--
98. What Snowflake features allow virtual warehouses to handle high concurrency
workloads? (Select TWO)
• The ability to scale up warehouses
• The use of warehouse auto scalingcorrect
• The ability to resize warehouses
• Use of multi-clustered warehousescorrect
• The use of warehouse indexing
--
99. What transformations are supported in a CREATE PIPE ... AS COPY ... FROM
(....)statement? (Select TWO.)
• Data can be filtered by an optional where clausecorrect
• Incoming data can be joined with other tables
• Columns can be reorderedcorrect
• Columns can be omittedcorrect
• Row level access can be defined
--
100. What types of data listings are available in the Snowflake Data Marketplace? (Choose
two.)
• Reader
• Consumer
• Vendor
• Standard correct
• Personalized correct
--
1. True or False: When a user creates a role, they are initially assigned ownership of the
role and they maintain ownership until it is transferred to another user.
• Truecorrect
• False
--
2. True or False: You can define multiple columns within a clustering key on a table.
• Truecorrect
• False
--
3. True or False: When active, a pipe requires a dedicated Virtual Warehouse to execute.
• True
• Falsecorrect
--
5. Which of the following are valid approaches to loading data into a Snowflake table?
Select all the below that apply.
• Bulk copy from an External Stagecorrect
• Continuous load using Snowpipe REST APIcorrect
• The Snowflake Web Interface (UI) data loading wizardcorrect
• Bulk copy from an Internal Stage
--
6. Which of the following are valid methods for authenticating users for access into
Snowflake? (Select THREE)
• SCIM
• Federated authenticationcorrect
• TLS 1.2
• Key-pair authenticationcorrect
• OAuthcorrect
• OCSP authentication
--
7. Snowflake recommends, at a minimum, that all users with the following role(s) should
be enrolled in Multi-Factor Authentication (MFA):
• SECURITYADMIN, ACCOUNTADMIN, PUBLIC, SYSADMIN
• SECURITYADMIN ACCOUNTADMIN, SYSADMIN
• SECURITYADMIN, ACCOUNTADMIN
• ACCOUNTADMINcorrect
--
9. True or False: A third-party tool that supports standard JDBC or ODBC but has no
Snowflake-specific driver will be unable to connect to Snowflake.
• True
• Falsecorrect
--
Explanation:
Snowflake provides a JDBC type 4 driver that supports core JDBC functionality. The JDBC driver
must be installed in a 64-bit environment and requires Java 1.8 (or higher). The driver can be
used with most client tools/applications that support JDBC for connecting to a database server.
10. What parameter controls if the Virtual warehouse starts immediately after the CREATE
WAREHOUSE statement?
• INITIALLY_SUSPENDED = TRUE/FALSEcorrect
• START_AFTER_CREATE = TRUE/FALSE
• START_TIME = 60 // (seconds from now)
• START_TIME = CURRENT_DATE()
--
11. What is the minimum Snowflake edition required to create a materialized view?
• Standard Edition
• Enterprise Editioncorrect
• Business Critical Edition
• Virtual Private Snowflake Edition
--
Explanation:
Materialized views require Enterprise Edition. To inquire about upgrading, please contact
Snowflake Support
12. What type of table should be used for storing non-permanent, transitory data that is
only needed for the duration of the session?
• Temporarycorrect
• Transient
• Provisional
• Permanent
--
Explanation:
Snowflake supports creating temporary tables for storing non-permanent, transitory data (e.g.
ETL data, session-specific data). Temporary tables only exist within the session in which they
were created and persist only for the remainder of the session.
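For illustration, a session-scoped table can be created as follows (the names are hypothetical):

```sql
-- Exists only for the current session; dropped automatically at session end
CREATE TEMPORARY TABLE etl_scratch (
    id      INTEGER,
    payload VARIANT
);
```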
13. True or False: It is possible to query data from an Internal or named External stage
without loading the data into Snowflake.
• Truecorrect
• False
--
14. What are ways to create and manage data shares in Snowflake? (Select TWO)
• Through the Snowflake web interface (Ul)correct
• Through the DATA_SHARE=TRUE parameter
• Through SQL commandscorrect
• Through the ENABLE_SHARE=TRUE parameter
• Using the CREATE SHARE AS SELECT * TABLE command
--
15. What are the three things customers want most from their enterprise data warehouse
solution? Choose 3 answers
• On-premise availability
• Simplicitycorrect
• Open source based
• Concurrencycorrect
• Performancecorrect
--
16. True or False: Snowflake charges a premium for storing semi-structured data.
• True
• Falsecorrect
--
17. Which of the following statements are true about Schemas in Snowflake? (Choose
two.)
• A Schema may contain one or more Databases
• A Database may contain one or more Schemascorrect
• A Schema is a logical grouping of Database Objectscorrect
• Each Schema is contained within a Warehouse
--
18. What is the minimum duration charged when starting a virtual warehouse?
• 1 second
• 1 minutecorrect
• 1 hour
• 1 day
--
19. The number of queries that a Virtual Warehouse can concurrently process is
determined by: Choose 2 answers
• The complexity of each querycorrect
• The CONCURRENT_QUERY_LIMIT parameter set on the Snowflake account
• The size of the data required for each querycorrect
• The tool that is executing the query
--
20. On which of the following cloud platforms can a Snowflake account be hosted?
Choose 2 answers
• Amazon Web Servicescorrect
• Private Virtual Cloud
• Oracle Cloud
• Microsoft Azure Cloudcorrect
--
21. True or False: All Snowflake table types include fail-safe storage.
• True
• Falsecorrect
--
22. True or False: AWS PrivateLink provides a secure connection from the Customer’s
on-premise data center to Snowflake.
• True
• Falsecorrect
--
23. True or False: It is possible to unload structured data to semi-structured formats such
as JSON and Parquet.
• Truecorrect
• False
--
24. What transformations are supported in a CREATE PIPE ... AS COPY ... FROM (....)
statement? (Select TWO.)
• Data can be filtered by an optional where clausecorrect
• Incoming data can be joined with other tables
• Columns can be reorderedcorrect
• Columns can be omittedcorrect
• Row level access can be defined
--
25. What is the minimum Snowflake edition that provides data sharing?
• Standardcorrect
• Premier
• Enterprise
• Business Critical Edition
--
Explanation:
Bulk data load
Stored in the metadata of the target table for 64 days. Available upon completion of the COPY
statement as the statement output.
28. True or False: Snowflake’s data warehouse was built from the ground up for the cloud
in lieu of using an existing database or a platform, like Hadoop, as a base.
• Truecorrect
• False
--
29. What happens when a Data Provider revokes privileges to a Share on an object in their
source database?
• The object immediately becomes unavailable for all Data Consumerscorrect
• Any additional data arriving after this point in time will not be visible to Data Consumers
• The Data Consumers stop seeing data updates and become responsible for storage
charges for the object
• A static copy of the object at the time the privilege was revoked is created In the Data
Consumers' accounts
--
30. Which of the following connectors are available in the Downloads section of the
Snowflake web Interface (UI) Choose 2 answers
• SnowSQLcorrect
• ODBCcorrect
• R
• HIVE
--
31. What privileges are required to create a task?
• The global privilege create task is required to create a new task.
• Tasks are created at the Application level and can only be created by the Account Admin
role.
• Many Snowflake DDLs are metadata operations only, and create task DDL can be
executed without virtual warehouse requirement or task specific grants.
• The role must have access to the target schema and the create task privilege on the
schema itself.correct
--
Explanation:
All tasks in a simple tree must have the same task owner (i.e. a single role must have the
OWNERSHIP privilege on all of the tasks in the tree). All tasks in a simple tree must exist in the
same schema.
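A hedged sketch of the grants and a task definition (the role, schema, warehouse, and table names are hypothetical):

```sql
-- The creating role needs access to the schema plus CREATE TASK on it
GRANT USAGE ON SCHEMA mydb.public TO ROLE etl_role;
GRANT CREATE TASK ON SCHEMA mydb.public TO ROLE etl_role;

CREATE TASK mydb.public.nightly_load
  WAREHOUSE = my_wh
  SCHEDULE = 'USING CRON 0 2 * * * UTC'
AS
  INSERT INTO target_table SELECT * FROM staging_table;
```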
32. Which data types does Snowflake support when querying semi-structured data?
(Select TWO)
• VARIANTcorrect
• ARRAYcorrect
• VARCHAR
• XML
• BLOB
--
Explanation:
A VARIANT stores semi-structured data in Snowflake. It can store a value of any other type,
including OBJECT and ARRAY. The maximum length of a VARIANT is 16 MB.
34. What feature can be used to reorganize a very large table on one or more columns?
• Micro-partitions
• Clustering keyscorrect
• Key partitions
• Clustered partitions
--
35. What are two ways to create and manage Data Shares in Snowflake? (Choose two.)
• Via the Snowflake Web Interface (Ul)correct
• Via the data_share=true parameter
• Via SQL commandscorrect
• Via Virtual Warehouses
--
36. Which copy INTO command outputs the data into one file?
• SINGLE=TRUEcorrect
• MAX_FILE_NUMBER=1
• FILE_NUMBER=1
• MULTIPLE=FALSE
--
38. What command is used to load files into an Internal Stage within Snowflake?
• PUTcorrect
• COPY INTO
• TRANSFER
• INSERT
--
Explanation:
You must specify an internal stage in the PUT command when uploading files to Snowflake. You
must specify the same stage in the COPY INTO <table> command when loading data into a
table from the staged files.
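The two-step flow described above can be sketched as follows (the file path, stage, and table names are hypothetical):

```sql
-- 1. Upload a local file to an internal stage (run from SnowSQL)
PUT file:///tmp/data/mydata.csv @my_int_stage;

-- 2. Load the staged file into a table
COPY INTO mytable
FROM @my_int_stage
FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);
```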
39. True or False: The user has to specify which cluster a query will run on in multi-
clustering Warehouse.
• True
• Falsecorrect
--
40. Why would a customer size a Virtual Warehouse from an X-Small to a Medium?
• To accommodate more queries
• To accommodate more users
• To accommodate fluctuations in workload
• To accommodate a more complex workloadcorrect
--
41. How would you determine the size of the virtual warehouse used for a task?
• Root task may be executed concurrently (i.e. multiple instances), it is recommended to
leave some margins in the execution window to avoid missing instances of execution
• Querying (select) the size of the stream content would help determine the warehouse
size. For example, if querying large stream content, use a larger warehouse size
• If using the stored procedure to execute multiple SQL statements, it's best to test run the
stored procedure separately to size the compute resource firstcorrect
• Since task infrastructure is based on running the task body on schedule, it's
recommended to configure the virtual warehouse for automatic concurrency handling
using Multi-cluster warehouse (MCW) to match the task schedule
--
42. Which of the following are examples of operations that require a Virtual Warehouse to
complete, assuming no queries have been executed previously? Choose 3 answers
• MIN(<<column value>>)
• COPYcorrect
• SUM(<< column value >>)correct
• UPDATEcorrect
--
43. What are the default Time Travel and Fail-safe retention periods for transient tables?
• Time Travel - 1 day. Fail-safe - 1 day
• Time Travel - 0 days. Fail-safe - 1 day
• Time Travel - 1 day. Fail-safe - 0 dayscorrect
• Transient tables are retained in neither Fail-safe nor Time Travel
--
Explanation:
Alation provides Data Cataloging functionality. They state they are the 'One Place to Find,
Understand, & Govern Data Across an Enterprise.'
47. A virtual warehouse's auto-suspend and auto-resume settings apply to which of the
following?
50. Which of the following statements would be used to export/unload data from
Snowflake?
• COPY INTO @stagecorrect
• EXPORT TO @stage
• INSERT INTO @stage
• EXPORT_TO_STAGE (stage => @stage, select => 'select * from t1');
--
51. What is a machine learning and data science partner within the Snowflake Partner
Ecosystem?
• Informatica
• Power BI
• Adobe
• Data Robotcorrect
--
53. Which of the following roles is recommended to be used to create and manage users
and roles?
• SYSADMIN
• SECURITYADMIN correct
• PUBLIC
• ACCOUNTADMIN
--
Explanation:
"Security admin: Role that can manage any object grant globally, as well as create, monitor, and
manage users and roles"
54. Which of the following statements are true of VALIDATION_MODE in Snowflake?
(Choose two.)
• The validation_mode option is used when creating an Internal Stage
• validation_mode=return_all_errors is a parameter of the copy commandcorrect
• The validation_mode option will validate data to be loaded by the copy statement while
completing the load and will return the rows that could not be loaded without error
• The validation_mode option will validate data to be loaded by the copy statement without
completing the load and will return possible errors correct
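A sketch of the dry-run behavior described above (table and stage names are illustrative):

```sql
-- Validates the staged files against the COPY statement and returns any
-- errors, WITHOUT actually loading rows into the table.
COPY INTO my_table
FROM @my_stage/data/
VALIDATION_MODE = RETURN_ALL_ERRORS;
```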
--
57. Which of the following describes how clustering keys work in Snowflake?
• Clustering keys update the micro-partitions in place with a full sort, and impact the DML
operations.
• Clustering keys sort the designated columns over time, without blocking DML operations
correct
• Clustering keys create a distributed, parallel data structure of pointers to a table's rows
and columns
• Clustering keys establish a hashed key on each node of a virtual warehouse to optimize
joins at run-time
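A minimal sketch of defining a clustering key (table and column names are illustrative); Snowflake reclusters the micro-partitions in the background without blocking DML:

```sql
ALTER TABLE sales CLUSTER BY (region, sale_date);

-- Inspect how well the table is clustered on those columns:
SELECT SYSTEM$CLUSTERING_INFORMATION('sales', '(region, sale_date)');
```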
--
59. Which interfaces can be used to create and/or manage Virtual Warehouses?
• The Snowflake Web Interface (UI)
• SQL commands
• Data integration tools
• All of the abovecorrect
--
60. When scaling out by adding clusters to a multi-cluster warehouse, you are primarily
scaling for improved:
• Concurrencycorrect
• Performance
--
61. True or False: You can query the files in an External Stage directly without having to
load the data into a table.
• Truecorrect
• False
--
External tables are read-only, therefore no DML operations can be performed on them; however,
external tables can be used for query and join operations. Views can be created against external
tables.
63. Query results are stored in the Result Cache for how long after they are last accessed,
assuming no data changes have occurred?
• 1 Hour
• 3 Hours
• 12 hours
• 24 hourscorrect
--
64. Which Snowflake object enables loading data from files as soon as they are available
in a cloud storage location?
• Pipecorrect
• External stage
• Task
• Stream
--
Explanation:
Snowpipe enables loading data from files as soon as they’re available in a stage. This means
you can load data from files in micro-batches, making it available to users within minutes, rather
than manually executing COPY statements on a schedule to load larger batches.
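A pipe simply wraps a COPY statement; with AUTO_INGEST, cloud storage event notifications trigger the load as files arrive. A hedged sketch with illustrative names:

```sql
CREATE PIPE ingest_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO raw_events
  FROM @ext_stage/events/
  FILE_FORMAT = (TYPE = JSON);
```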
65. Which of the following are best practices for users with the
SYSADMIN/ACCOUNTADMIN roles? Choose 3 answers
• Their default role should be set to SYSADMIN (the lower of the two)correct
• They should not set up Multi-Factor Authentication (MFA)―as administrators they may
need to change the MFA settings and those enrolled in MFA are unable to do so
• They should only access and 'step into' the ACCOUNTADMIN role temporarily, as
needed to complete a specific operationcorrect
• They should ensure all database objects in the account are owned by the
ACCOUNTADMIN role
• They should use the SYSADMIN role to perform administrative work on database
objectscorrect
--
66. Which statement best describes Snowflake tables?
• Snowflake tables are logical representations of underlying physical datacorrect
• Snowflake tables are the physical instantiation of data loaded into Snowflake
• Snowflake tables require that clustering keys be defined to perform optimally
• Snowflake tables are owned by a user
--
67. True or False: A table in Snowflake can only be queried using the Virtual Warehouse
that was used to load the
data.
• True
• Falsecorrect
--
68. What is the MOST performant file format for loading data in Snowflake?
• CSV (Unzipped)
• Parquet
• CSV (Gzipped)correct
• ORC
--
69. True or False: Snowflake’s Global Services Layer gathers and maintains statistics on
all columns in all micro-partitions.
• Truecorrect
• False
--
Explanation:
Snowflake is a single, integrated platform delivered as-a-service. It features storage, compute,
and global services layers that are physically separated but logically integrated.
70. True or False: Users are able to see the result sets of queries executed by other users
that share their same role.
• True
• Falsecorrect
--
71. What Snowflake role must be granted for a user to create and manage accounts?
• ACCOUNTADMIN
• ORGADMINcorrect
• SECURITYADMIN
• SYSADMIN
--
Which application will the Snowflake users need to install on their devices in order to
connect with MFA?
• Okta Verify
• Duo Mobilecorrect
• Microsoft Authenticator
• Google Authenticator
--
73. True or False: Snowflake bills for a minimum of five minutes each time a Virtual
Warehouse is started.
• True
• Falsecorrect
--
75. What SQL command would be used to view all roles that were granted to USER1?
• show grants to user USER1;correct
• show grants of user USER1;
• describe user USER1;
• show grants on user USER1;
--
Explanation:
All Snowflake-managed keys are automatically rotated by Snowflake when they are more than
30 days old. Active keys are retired, and new keys are created. When Snowflake determines the
retired key is no longer needed, the key is automatically destroyed.
77. The FLATTEN function is used to query which type of data in Snowflake?
• Structured data
• Semi-structured datacorrect
• Both of the above
• None of the above
--
FLATTEN is used to unnest semi-structured data; structured data is, by definition, not
nested, so the function does not apply to it.
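A minimal FLATTEN sketch (table and column names are illustrative), unnesting an array held inside a VARIANT column:

```sql
SELECT f.value:name::STRING AS item_name
FROM events,
     LATERAL FLATTEN(input => payload:items) f;
```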
78. Which of the following is true of Snowpipe via REST API? Choose 2 answers
• You can only use it on Internal Stages
• All COPY INTO options are available during pipe creation
• Snowflake automatically manages the compute required to execute the Pipe's COPY INTO
commandscorrect
• Snowpipe keeps track of which files it has loadedcorrect
--
Explanation:
Snowflake credits are used to pay for the consumption of resources on Snowflake. A Snowflake
credit is a unit of measure, and it is consumed only when a customer is using resources, such as
when a virtual warehouse is running, the cloud services layer is performing work, or serverless
features are used.
80. Which is true of Snowflake network policies? A Snowflake network policy: (Choose
two.)
• Is available to all Snowflake Editionscorrect
• Is only available to customers with Business Critical Edition
• Restricts or enables access to specific IP addressescorrect
• Is activated using an “ALTER DATABASE” command
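A sketch of the two correct points above (policy name and IP ranges are illustrative): a network policy restricts access by IP, and it is activated at the account or user level, not with ALTER DATABASE.

```sql
CREATE NETWORK POLICY corp_only
  ALLOWED_IP_LIST = ('192.168.1.0/24')
  BLOCKED_IP_LIST = ('192.168.1.99');

ALTER ACCOUNT SET NETWORK_POLICY = corp_only;
```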
--
81. Which of the following is a valid source for an external stage when the Snowflake
account is located on Microsoft Azure?
• An FTP server with TLS encryption
• An HTTPS server with WebDAV
• A Google Cloud storage bucketcorrect
• A Windows server file share on Azure
--
82. When reviewing the load for a warehouse using the load monitoring chart, the chart
indicates that a high volume of Queries are always queuing in the warehouse.
According to recommended best practice, what should be done to reduce the Queue
volume? (Select TWO).
• Use multi-clustered warehousing to scale out warehouse capacity.correct
• Scale up the warehouse size to allow Queries to execute faster.
• Stop and start the warehouse to clear the queued queries
• Migrate some queries to a new warehouse to reduce loadcorrect
• Limit user access to the warehouse so fewer queries are run against it.
--
84. What are value types that a VARIANT column can store? (Select TWO)
• STRUCT
• OBJECTcorrect
• BINARY
• ARRAYcorrect
• CLOB
--
Explanation:
Characteristics of a VARIANT
A VARIANT can store a value of any other type, including OBJECT and ARRAY. The maximum
length of a VARIANT is 16 MB.
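A small sketch of the VARIANT behavior described above (table name is illustrative): OBJECT and ARRAY values nest naturally inside a VARIANT column.

```sql
CREATE TABLE json_demo (v VARIANT);

INSERT INTO json_demo
  SELECT PARSE_JSON('{"name": "widget", "tags": ["a", "b"]}');

SELECT v:name::STRING, v:tags[1]::STRING FROM json_demo;
```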
85. Which Snowflake objects track DML changes made to tables, like inserts, updates, and
deletes?
• Pipes
• Streamscorrect
• Tasks
• Procedures
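A minimal stream sketch (table names are illustrative): a stream records DML changes on its source table until they are consumed.

```sql
CREATE STREAM orders_stream ON TABLE orders;

-- Reading the stream shows changes since it was last consumed:
SELECT * FROM orders_stream WHERE METADATA$ACTION = 'INSERT';
```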
--
86. Which services does the Snowflake Cloud Services layer manage? (Select TWO).
• Compute resources
• Query execution
• Authenticationcorrect
• Data storage
• Metadatacorrect
--
Explanation:
https://docs.snowflake.com/en/user-guide/intro-key-concepts.html The cloud services layer is a
collection of services that coordinate activities across Snowflake. These services tie together all
of the different components of Snowflake in order to process user requests, from login to query
dispatch. The cloud services layer also runs on compute instances provisioned by Snowflake
from the cloud provider.
✑ Authentication
✑ Infrastructure management
✑ Metadata management
✑ Access control
87. Select the three types of tables that exist within Snowflake. Choose 3 answers
• Temporarycorrect
• Transientcorrect
• Provisioned
• Permanentcorrect
--
88. Which of the following are common use cases for zero-copy cloning? Choose 3
answers
• Quick provisioning of Dev and Test/QA environmentscorrect
• Data backupscorrect
• Point in time snapshotscorrect
• Performance optimization
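The use cases above all come from cloning being metadata-only: storage is shared until either side changes. A hedged sketch with illustrative names:

```sql
CREATE TABLE orders_dev CLONE orders;   -- quick Dev/Test provisioning
CREATE DATABASE qa_db CLONE prod_db;    -- point-in-time snapshot / backup
```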
--
89. Which of the following statements are true of Virtual Warehouses? (Choose all that
apply.)
• Customers can change the size of the Warehouse after creationcorrect
• A Warehouse can be resized while runningcorrect
• A Warehouse can be configured to suspend after a period of inactivitycorrect
• A Warehouse can be configured to auto-resume when new queries are submittedcorrect
--
90. What happens when a cloned table is replicated to a secondary database? (Select
TWO)
• A read-only copy of the cloned tables is stored.
• The replication will not be successful.
• The physical data is replicatedcorrect
• Additional costs for storage are charged to a secondary accountcorrect
• Metadata pointers to cloned tables are replicated
--
Explanation:
Cloned objects are replicated physically rather than logically to secondary databases. That is,
cloned tables in a standard database do not contribute to the overall data storage unless or until
DML operations on the clone add to or modify existing data.However, when a cloned table is
replicated to a secondary database, the physical data is also replicated, increasing the data
storage usage for your account.
91. True or False: It is possible for a user to run a query against the query result cache
without requiring an active Warehouse.
• Truecorrect
• False
--
Explanation:
Query result cache is all about fetching the data from cloud services layer and saving the cost by
not running the virtual warehouse.
93. Which of the following items does the Cloud services Layer manage? Choose 4
answers
• user authenticationcorrect
• Metadatacorrect
• Query compilation and optimizationcorrect
• external blob storage
• Data securitycorrect
--
94. If a size Small virtual warehouse is made up of two servers, how many servers make
up a Large warehouse?
• 4
• 8correct
• 16
• 32
--
Explanation:
Size specifies the amount of compute resources available per cluster in a warehouse.Snowflake
supports the following warehouse sizes:
(Options A–D were presented as images in the original.)
• Option Acorrect
• Option B
• Option C
• Option D
--
96. A developer is granted ownership of a table that has a masking policy. The developer's
role is not able to see the masked data.
Will the developer be able to modify the table to read the masked data?
• Yes, because a table owner has full control and can unset masking policies.correct
• Yes, because masking policies only apply to cloned tables.
• No, because masking policies must always reference specific access roles.
• No, because ownership of a table does not include the ability to change masking policies
--
97. Which of the following are valid Snowflake Virtual Warehouse Scaling Policies?
(Choose two.)
• Custom
• Economy correct
• Optimized
• Standard correct
--
98. True or False: When you create a custom role, it is a best practice to immediately grant
that role to ACCOUNTADMIN.
• True
• False correct
--
99. In which scenarios would a user have to pay Cloud Services costs? (Select TWO).
• Compute Credits = 50 Credits Cloud Services = 10correct
• Compute Credits = 80 Credits Cloud Services = 5
• Compute Credits = 10 Credits Cloud Services = 9correct
• Compute Credits = 120 Credits Cloud Services = 10
• Compute Credits = 200 Credits Cloud Services = 26
--
100. When is the result set cache no longer available? (Select TWO)
• When another warehouse is used to execute the query
• When another user executes the query
• When the underlying data has changed correct
• When the warehouse used to execute the query is suspended
• When it has been 24 hours since the last query correct
--
1. When a pipe is recreated using the CREATE OR REPLACE PIPE command: Select one.
• The Pipe load history is reset to emptycorrect
• The REFRESH parameter is set to TRUE
• Previously loaded files will be ignored
• All of the above
--
2. Select the different types of Internal Stages: (Choose three.)
• Named Stagecorrect
• User Stagecorrect
• Table Stagecorrect
• Schema Stage
--
4. What happens when a Data Provider revokes privileges to a Share on an object in their
source database?
• The object immediately becomes unavailable for all Data Consumerscorrect
• Any additional data arriving after this point in time will not be visible to Data Consumers
• The Data Consumers stop seeing data updates and become responsible for storage
charges for the object
• A static copy of the object at the time the privilege was revoked is created In the Data
Consumers' accounts
--
5. True or False: Snowpipe via REST API can only reference External Stages as source.
• True
• Falsecorrect
--
6. True or False: When active, a pipe requires a dedicated Virtual Warehouse to execute.
• True
• Falsecorrect
--
7. True or False: You can resize a Virtual Warehouse while queries are running.
• Truecorrect
• False
--
8. Which of the following statements about data sharing are true? Choose 2 answers
• New objects created by a Data Provider are automatically shared with existing Data
Consumers & Reader Accounts
• All database objects can be included in a shared database
• Reader Accounts are created and funded by Data Providerscorrect
• Shared databases are read-onlycorrect
--
10. What is the recommended method for loading data into Snowflake?
• Load row by row
• Load data in batchcorrect
• Load data by writing it in the Snowflake Web Interface (UI)
• Load data via frequent, single-row DMLs
--
11. What is the lowest Snowflake edition that offers Time Travel up to 90 days.
• standard Edition
• Premier Edition
• Enterprise Editioncorrect
• Business Critical Edition
--
12. Which of the following statements are true of Virtual Warehouses? (Choose all that
apply.)
• Customers can change the size of the Warehouse after creationcorrect
• A Warehouse can be resized while runningcorrect
• A Warehouse can be configured to suspend after a period of inactivitycorrect
• A Warehouse can be configured to auto-resume when new queries are submittedcorrect
--
13. True or False: Users are able to see the result sets of queries executed by other users
that share their same role.
• True
• Falsecorrect
--
14. Select the three types of tables that exist within Snowflake.
Choose 3 answers
• Temporarycorrect
• Transientcorrect
• Provisioned
• Permanentcorrect
--
15. Snowflake recommends, at a minimum, that all users with the following role(s)
should be enrolled in Multi-Factor Authentication (MFA):
• SECURITYADMIN, ACCOUNTADMIN, PUBLIC, SYSADMIN
• SECURITYADMIN ACCOUNTADMIN, SYSADMIN
• SECURITYADMIN, ACCOUNTADMIN
• ACCOUNTADMINcorrect
--
16. Which of the following are valid Snowflake Virtual Warehouse Scaling Policies?
(Choose two.)
• Custom
• Economycorrect
• Optimized
• Standardcorrect
--
17. True or False: Each worksheet in the Snowflake Web Interface (UI) can be associated
with different roles, databases, schemas, and Virtual Warehouses.
• Truecorrect
• False
--
18. True or False: Reader Accounts are able to extract data from shared data objects for
use outside of Snowflake.
• Truecorrect
• False
--
21. True or False: Data in Fail-safe can be deleted by a user or the Snowflake team before
it expires.
• True
• Falsecorrect
--
26. True or False: During data unloading, only JSON and CSV files can be compressed.
• True
• Falsecorrect
--
28. Which of the following are options when creating a virtual Warehouse? Choose 2
answers
• Auto-drop
• Auto-resize
• Auto-resumecorrect
• Auto-suspendcorrect
--
29. True or False: When you create a custom role, it is a best practice to immediately grant that
role to ACCOUNTADMIN.
• True
• Falsecorrect
--
32. For a multi-cluster Warehouse, the number of credits billed is calculated on: Select
one.
• The number of queries that ran using the Warehouse.
• The size of the Warehouse and the number of clusters that ran within a given time
period.correct
• The size of the Warehouse and the maximum number of clusters configured for the
Warehouse.
• The number of users who accessed the Warehouse.
--
33. When loading data into Snowflake, the COPY command supports: Choose 2 answers
• Joins
• Filters
• Data type conversionscorrect
• Column reorderingcorrect
• Aggregates
--
34. Which is true of Snowflake network policies? A Snowflake network policy: (Choose
two.)
• Is available to all Snowflake Editionscorrect
• Is only available to customers with Business Critical Edition
• Restricts or enables access to specific IP addressescorrect
• Is activated using an “ALTER DATABASE” command
--
35. True or False: A 4X-Large Warehouse may, at times, take longer to provision than a X-
Small Warehouse.
• Truecorrect
• False
--
36. True or False: It is possible to unload structured data to semi-structured formats such
as JSON and parquet.
• Truecorrect
• False
--
37. Query results are stored in the Result Cache for how long after they are last accessed,
assuming no data changes have occurred?
• 1 Hour
• 3 Hours
• 12 hours
• 24 hourscorrect
--
38. Which formats are supported for unloading data from Snowflake? Choose 2 answers
39. True or False: When a user creates a role, they are initially assigned ownership of the
role and they maintain ownership until it is transferred to another user.
• Truecorrect
• False
40. Data storage for individual tables can be monitored using which commands and/or
object(s)? Choose 2 answers
• SHOW TABLES;correct
• SHOW STORAGE BY TABLE;
• Information Schema -> TABLE_STORAGE_METRICScorrect
• Information Schema -> TABLE_HISTORY
--
41. Which of the following statements are true about Schemas in Snowflake? (Choose
two.)
• A Schema may contain one or more Databases
• A Database may contain one or more Schemascorrect
• A Schema is a logical grouping of Database Objectscorrect
• Each Schema is contained within a Warehouse
--
42. Which of the following statements are true of Snowflake releases: (Choose two.)
• They happen approximately weeklycorrect
• They roll up and release approximately monthly, but customers can request early release
application
• During a release, new customer requests/queries/connections transparently move over to
the newer versioncorrect
• A customer is assigned a 30 minute window (that can be moved anytime within a week)
during which the system will be unavailable and customer is upgraded
--
43. True or False: Snowflake allows its customers to directly access the micro-partition
files that make up its tables.
• True
• Falsecorrect
--
44. True or False: AWS Private Link provides a secure connection from the Customer’s
on-premise data center to the Snowflake.
• True
• Falsecorrect
--
46. What is the most granular object that the Time Travel retention period can be defined
on?
• Account
• Database
• Schema
• Tablecorrect
--
47. True or False: The COPY command must specify a File Format in order to execute.
• True
• Falsecorrect
--
48. What parameter controls if the Virtual warehouse starts immediately after the CREATE
WAREHOUSE statement?
• INITIALLY_SUSPENDED = TRUE/FALSEcorrect
• START_AFTER_CREATE = TRUE/FALSE
• START_TIME = 60 // (seconds from now)
• START_TIME = CURRENT_DATE()
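A sketch of the parameter in context (warehouse name and settings are illustrative):

```sql
CREATE WAREHOUSE wh_demo
  WAREHOUSE_SIZE = 'SMALL'
  AUTO_SUSPEND = 300           -- suspend after 300 s of inactivity
  AUTO_RESUME = TRUE
  INITIALLY_SUSPENDED = TRUE;  -- created suspended; does not start immediately
```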
--
49. True or False: It is possible for a user to run a query against the query result cache
without requiring an active Warehouse.
• Truecorrect
• False
--
51. Which of the following accurately represents how a table fits into Snowflake’s logical
container hierarchy?
• Account -> Schema -> Database -> Table
• Account -> Database -> Schema -> Tablecorrect
• Database -> Schema -> Table -> Account
• Database -> Table -> Schema -> Account
--
Which of the following commands are not blocking operations? Choose 2 answers
• UPDATE
• INSERTcorrect
• MERGE
• COPYcorrect
--
55. The Information Schema and Account Usage Share provide storage information for
which of the following objects? (Choose three.)
• Users
• Tablescorrect
• Databasescorrect
• Internal Stagescorrect
--
56. What is the minimum duration charged when starting a virtual warehouse?
• 1 second
• 1 minutecorrect
• 1 hour
• 1 day
--
57. Which of the following are examples of operations that require a Virtual Warehouse to
complete, assuming no queries have been executed previously? Choose 3 answers
• MIN(<<column value>>)
• COPYcorrect
• SUM(<<column value>>)correct
• UPDATEcorrect
--
58. Which of the following commands sets the Virtual Warehouse for a session?
• COPY WAREHOUSE FROM <<config file>>;
• SET warehouse = <<warehouse name>>;
• USE WAREHOUSE <<warehouse name>>;correct
• USE VIRTUAL_WAREHOUSE <<warehouse name>>;
--
59. True or False: A single database can exist in more than one Snowflake account.
• True
• Falsecorrect
--
1. True or false: Snowflake enforces unique, primary key, and foreign key constraints
during DML operations.
• True
• Falsecorrect
--
2. Which of the following are best practices for users with the
SYSADMIN/ACCOUNTADMIN roles? Choose 3 answers
• Their default role should be set to SYSADMIN (the lower of the two)correct
• They should not set up Multi-Factor Authentication (MFA)―as administrators they may
need to change the MFA settings and those enrolled in MFA are unable to do so
• They should only access and 'step into' the ACCOUNTADMIN role temporarily, as
needed to complete a specific operationcorrect
• They should ensure all database objects in the account are owned by the
ACCOUNTADMIN role
• They should use the SYSADMIN role to perform administrative work on database
objectscorrect
--
3. True or False: A third-party tool that supports standard JDBC or ODBC but has no
Snowflake-specific driver will be unable to connect to Snowflake.
• True
• Falsecorrect
--
4. Which of the following are common use cases for zero-copy cloning? Choose 3
answers
• Quick provisioning of Dev and Test/QA environmentscorrect
• Data backupscorrect
• Point in time snapshotscorrect
• Performance optimization
--
5. True or False: Snowflake allows its customers to directly access the micro-partition
files that make up its tables.
• True
• Falsecorrect
--
6. If a Small Warehouse is made up of 2 servers/cluster, how many servers/cluster make
up a Medium
Warehouse?
• 4correct
• 16
• 32
• 128
--
8. What command is used to load files into an Internal Stage within Snowflake? Select
one.
• PUTcorrect
• COPY INTO
• TRANSFER
• INSERT
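A sketch of the load path via an Internal Stage (file path and table name are illustrative); PUT is issued from a client such as SnowSQL, not the web UI:

```sql
PUT file:///tmp/data.csv @%my_table;   -- upload into the table stage

COPY INTO my_table
FROM @%my_table
FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
```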
--
9. When loading data into Snowflake, the COPY command supports: Choose 2 answers
• Joins
• Filters
• Data type conversionscorrect
• Column reorderingcorrect
• Aggregates
--
10. Which of the following are main sections of the top navigation of the Snowflake web
Interface (UI)?
• Databasescorrect
• Tables
• Warehousescorrect
• Worksheetscorrect
--
11. Which is true of Snowflake network policies? A Snowflake network policy: (Choose
two.)
• Is available to all Snowflake Editionscorrect
• Is only available to customers with Business Critical Edition
• Restricts or enables access to specific IP addressescorrect
• Is activated using an “ALTER DATABASE” command
--
12. Which of the following are examples of operations that require a Virtual Warehouse to
complete, assuming no queries have been executed previously? Choose 3 answers
• MIN(<<column value>>)
• COPYcorrect
• SUM(<< column value >>)correct
• UPDATEcorrect
--
13. When a Pipe is recreated using the CREATE OR REPLACE PIPE command:
• The Pipe load history is reset to emptycorrect
• The REFRESH parameter is set to TRUE
• Previously loaded files will be ignored
• All of the above
--
14. True or False: The user has to specify which cluster a query will run on in multi-
clustering Warehouse.
• True
• Falsecorrect
--
17. For a multi-cluster Warehouse, the number of credits billed is calculated on:
Select one.
• The number of queries that ran using the Warehouse.
• The size of the Warehouse and the number of clusters that ran within a given time
period.correct
• The size of the Warehouse and the maximum number of clusters configured for the
Warehouse.
• The number of users who accessed the Warehouse.
--
20. Which object allows you to limit the number of credits consumed within a Snowflake
account?
Select one.
• Account usage Tracking
• Resource Monitorcorrect
• Warehouse Limit Parameter
• Credit Consumption Tracker
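A Resource Monitor sketch (monitor and warehouse names are illustrative), capping credit consumption with notify/suspend triggers:

```sql
CREATE RESOURCE MONITOR monthly_cap
  WITH CREDIT_QUOTA = 100
  FREQUENCY = MONTHLY
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS ON 90 PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND;

ALTER WAREHOUSE wh_demo SET RESOURCE_MONITOR = monthly_cap;
```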
--
21. True or False: When a user creates a role, they are initially assigned ownership of the
role and they maintain ownership until it is transferred to another user.
• Truecorrect
• False
--
23. True or False: It is possible to query data from an Internal or named External stage
without loading the data into Snowflake.
• Truecorrect
• False
--
24. True or False: When data share is established between a Data Provider and a data
Consumer, the Data Consumer can extend that data share to other Data Consumers.
• True
• Falsecorrect
--
26. Which item in the Data Warehouse migration process does not apply in Snowflake?
• Migrate Users
• Migrate Schemas
• Migrate Indexescorrect
• Build the Data pipeline
--
27. When can a Virtual Warehouse start running queries?
• 12am-5am
• Only during administrator defined time slots
• When its provisioning is completecorrect
• After replication
--
29. When scaling up Virtual Warehouse by increasing Virtual Warehouse t-shirt size, you
are primarily scaling for improved: Select one.
• Concurrency
• Performancecorrect
--
30. The number of queries that a Warehouse can concurrently process is determined by:
Choose 2 answers
• The complexity of each querycorrect
• The CONCURRENT_QUERY_LIMIT parameter set on the Snowflake account
• The size of the data required for each querycorrect
• The tool that is executing the query
--
32. True or False: All Snowflake table types include fail-safe storage.
• True
• Falsecorrect
--
33. Which formats are supported for unloading data from Snowflake? Choose 2 answers
• Delimited (CSV, TSV, etc.)correct
• Avro
• JSONcorrect
• ORC
--
37. True or False: The COPY command must specify a File Format in order to execute.
• True
• Falsecorrect
--
38. Which of the following are true of multi-cluster Warehouses? Select all that apply
below.
• A multi-cluster Warehouse can add clusters automatically based on query activitycorrect
• A multi-cluster Warehouse can automatically turn itself off after a period of
inactivitycorrect
• A multi-cluster Warehouse can scale down when query activity slowscorrect
• A multi-cluster Warehouse can automatically turn itself on when a query is executed
against itcorrect
--
39. True or False: Reader Accounts incur no additional Compute costs to the Data
Provider since they are simply reading the shared data without making changes.
• True
• Falsecorrect
--
40. Which of the following are options when creating a Virtual Warehouse?
• Auto-suspend correct
• Auto-resume correct
• Local SSD size
• User count
--
2. True or False: The user has to specify which cluster a query will run on in multi-
clustering Warehouse.
• True
• False correct
--
3. Data storage for individual tables can be monitored using which commands and/or
object(s)?
Choose 2 answers
• SHOW TABLES; correct
• SHOW STORAGE BY TABLE;
• Information Schema -> TABLE_STORAGE_METRICS
• Information Schema -> TASLE_HISTORY correct
--
5. True or False: When Snowflake is configured to use Single Sign-On (SSO), Snowflake
receives the usernames and credentials from the SSO service and loads them into the
customer's Snowflake account.
• True
• Falsecorrect
--
7. True or False: Snowflake allows its customers to directly access the micro-partition
files that make up its tables.
• True
• False correct
--
8. True or False: Each worksheet in the Snowflake Web Interface (UI) can be associated
with different roles, databases, schemas, and Virtual Warehouses.
• True correct
• False
--
9. Which of the following statements are true of Snowflake data loading? Choose 3
answers
• VARIANT "null" values are not the same as SQL NULL values correct
• It is recommended to do frequent, single-row DMLs
• It is recommended to validate the data before loading into the Snowflake target table
correct
• It is recommended to use staging tables to manage MERGE statements correct
--
10. True or false: Snowflake enforces unique, primary key, and foreign key constraints
during DML operations.
• True
• False correct
--
11. Which of the following are valid approaches to loading data into a Snowflake table?
Select
12. As a best practice, clustering keys should only be defined on tables of which minimum
size?
• Multi-Kilobyte (KB) Range
• Multi-Megabyte (MB) Range
• Multi-Gigabyte (GB) Range
• Multi-Terabyte (TB) Range correct
--
14. In which layer of its architecture does Snowflake store its metadata statistics?
• Storage Layer
• Compute Layer
• Database Layer
• Cloud Service Layer correct
--
15. True or False: All Snowflake table types include fail-safe storage.
• True
• False correct
--
16. Which of the following languages can be used to implement Snowflake User Defined
Functions (UDFs)? Choose 2 answers
• Java
• JavaScript correct
• SQLcorrect
• Python
--
18. True or False: A third-party tool that supports standard JDBC or ODBC but has no
Snowflake-specific driver will be unable to connect to Snowflake.
• True
• False correct
--
19. When scaling up Virtual Warehouse by increasing Virtual Warehouse t-shirt size, you
are primarily scaling for improved:
• Concurrency
• Performance correct
--
20. Which of the following are main sections of the top navigation of the Snowflake web
Interface (UI)?
• Databases correct
• Tables
• Warehouses correct
• Worksheets correct
--
21. The number of queries that a Warehouse can concurrently process is determined by:
Choose 2 answers
• The complexity of each query correct
• The CONCURRENT_QUERY_LIMIT parameter set on the Snowflake account
• The size of the data required for each query correct
• The tool that is executing the query
--
22. What happens when a Data Provider revokes privileges to a Share on an object in their
source database?
• The object immediately becomes unavailable for all Data Consumers correct
• Any additional data arriving after this point in time will not be visible to Data Consumers
• The Data Consumers stop seeing data updates and become responsible for storage
charges for the object
• A static copy of the object at the time the privilege was revoked is created In the Data
Consumers' accounts
--
27. On which of the following cloud platform can a Snowflake account be hosted? Choose
2 answers
• Amazon Web Servicescorrect
• Private Virtual Cloud
• Oracle Cloud
• Microsoft Azure Cloudcorrect
--
28. Which of the following are common use cases for zero-copy cloning? Choose 3
answers
• Quick provisioning of Dev and Test/QA environments correct
• Data backups correct
• Point in time snapshots correct
• Performance optimization
--
30. Which of the following accurately represents how a table fits into Snowflake’s logical
container hierarchy?
• Account -> Schema -> Database -> Table
• Account -> Database -> Schema -> Table correct
• Database -> Schema -> Table -> Account
• Database -> Table -> Schema -> Account
31. True or False: When data share is established between a Data Provider and a data
Consumer, the Data Consumer can extend that data share to other Data Consumers.
• True
• False correct
33. Which of the following connectors are available in the Downloads section of the
Snowflake web Interface (UI) Choose 2 answers
• SnowSQL correct
• ODBC correct
• R
• HIVE
34. Which of the following items does the Cloud services Layer manage? Choose 4
answers
• user authentication correct
• Metadatacorrect
• Query compilation and optimization correct
• external blob storage
• Data security correct
35. Each incremental increase in Virtual Warehouse size (e,g. Medium to Large) generally
results in what?
• More micro-partitions
• Better query scheduling
• Double the number of servers in the compute cluster correct
• Higher storage costs
36. For a multi-cluster Warehouse, the number of credits billed is calculated on:
• The number of queries that ran using the Warehouse.
• The size of the Warehouse and the number of clusters that ran within a given time
period. correct
• The size of the Warehouse and the maximum number of clusters configured for the
Warehouse.
• The number of users who accessed the Warehouse.
37. Which of the following are options when creating a Virtual Warehouse?
• Auto-suspend correct
• Auto-resume correct
• Local SSD size
• User count
38. The FLATTEN function is used to query which type of data in Snowflake?
• Structured data
• Semi-structured data correct
• Both of the above
• None of the above
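For illustration, a minimal sketch of FLATTEN on a semi-structured value (the literal JSON array here is only an example):

```sql
-- FLATTEN expands an ARRAY/OBJECT/VARIANT into one row per element
SELECT value
FROM TABLE(FLATTEN(input => PARSE_JSON('["a","b","c"]')));
-- returns three rows: "a", "b", "c"
```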
Question 1
To run a Multi-Cluster Warehouse in auto-scale mode, a user would:
Answer : D
Question 2
Which of the following terms best describes Snowflake's database architecture?
• Answer : B
Question 3
Which of the following are options when creating a Virtual Warehouse? (Choose two.)
• A. Auto-drop
• B. Auto-resize
• C. Auto-resume
• D. Auto-suspend
Answer : CD
Question 4
A Virtual Warehouse's auto-suspend and auto-resume settings apply to:
Answer : B
Question 5
Fail-safe is unavailable on which table types? (Choose two.)
• A. Temporary
• B. Transient
• C. Provisional
• D. Permanent
Answer : AB
Question 6
Which of the following objects is not covered by Time Travel?
• A. Tables
• B. Schemas
• C. Databases
• D. Stages
Answer : D
Question 7
True or False: Micro-partition metadata enables some operations to be completed
without requiring Compute.
• A. True
• B. False
Answer : A
Question 8
Which of the following commands are not blocking operations? (Choose two.)
• A. UPDATE
• B. INSERT
• C. MERGE
• D. COPY
Answer : BD
Question 9
Which of the following is true of Snowpipe via REST API? (Choose two.)
Answer : BD
Question 10
Snowflake recommends, as a minimum, that all users with the following role(s) should
be enrolled in Multi-Factor Authentication (MFA):
Answer : D
Question 11
When can a Virtual Warehouse start running queries?
• A. 12am-5am
• B. Only during administrator defined time slots
• C. When its provisioning is complete
• D. After replication
Answer : C
Question 12
True or False: Users are able to see the result sets of queries executed by other users
that share their same role.
• A. True
• B. False
Answer : B
Question 13
True or False: The user has to specify which cluster a query will run on in a multi-
cluster Warehouse.
• A. True
• B. False
Answer : B
Question 14
True or False: Pipes can be suspended and resumed.
• A. True
• B. False
Answer : A
Question 15
Which of the following languages can be used to implement Snowflake User Defined
Functions (UDFs)? (Choose two.)
• A. Java
• B. Javascript
• C. SQL
• D. Python
Answer : BC
A TO_CHAR
B TO_VARIANT
C TO_VARCHAR
Question #2
What are the least privileges needed to view and modify resource monitors? (Select TWO).
A SELECT
B OWNERSHIP
C MONITOR
D MODIFY
E USAGE
Correct Answer: C, D
To view and modify resource monitors, the least privileges needed are MONITOR and
MODIFY. These privileges allow a user to monitor credit usage and make changes to resource
monitors.
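A hedged sketch of granting these two privileges (the monitor and role names are hypothetical):

```sql
-- Minimum privileges to view and modify a resource monitor
GRANT MONITOR ON RESOURCE MONITOR my_monitor TO ROLE analyst_role;
GRANT MODIFY  ON RESOURCE MONITOR my_monitor TO ROLE analyst_role;
```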
Question #3
A View
B Materialized view
C External table
Views can be used with Secure Data Sharing in Snowflake. Materialized views, external tables,
and UDFs are not typically shared directly for security and performance reasons.
Question #5
A tag object has been assigned to a table (TABLE_A) in a schema within a Snowflake database.
Which CREATE object statement will automatically assign the TABLE_A tag to a target object?
Correct Answer: C
When a tag is assigned to a table, the statement CREATE TABLE <table_name> CLONE
TABLE_A will automatically carry the TABLE_A tag over to the newly created table, since
cloning replicates tag associations.
Question #1
What does Snowflake attempt to do if any of the compute resources for a virtual warehouse
fail to provision during start-up?
Correct Answer: B
Question #2
When sharing data in Snowflake, what privileges does a Provider need to grant along with a
share? (Select TWO).
D USAGE on the database and the schema containing the tables to share
E OPERATE on the database and the schema containing the tables to share.
Correct Answer: C, D
Question #3
A ACCESS_HISTORY
B LOGIN_HISTORY
C SESSIONS
D QUERY_HISTORY
Correct Answer: B
Question #5
What does Snowflake recommend a user do if they need to connect to Snowflake with a tool
or technology that is not listed in the Snowflake partner ecosystem?
Correct Answer: D
Question 1
Among the options listed, which is not a valid Snowflake Warehouse Size?
A) S
B) XL
C) M
D) XXS
Question 2
Does the Fail-Safe period for temporary and transient tables equal 0?
A) True
B) False
Question 3
Does the time travel feature in Snowflake preserve data at the expense of running
continuous backups?
A) True
B) False
Question 4
A) 1
B) 100
C) 1000
D) 5
Question 5
Question 6
A) External stages.
B) Mix stages.
C) Internal Stages.
D) Provider Stages.
Question 7
Will Snowpipe reload a file with the same name if it was modified 15 days after the initial load?
A) No. Snowpipe ignores any files that have already been loaded.
B) Yes, it will copy the file, causing potential duplicates. This is because Snowpipe
A) Yes
B) No
Question 9
A) Yes
B) No
Question 10
Which of the following scenarios represents scaling out for a Snowflake virtual
Warehouse?
Question 1
D) XXS.
Explanation:
There is no XXS Warehouse Size. The smallest available Warehouse Size is X-Small (XS).
Question 2
A) True.
Explanation:
The Fail-Safe retention period for temporary and transient tables in Snowflake is
indeed 0. Unlike regular tables, they do not have a Fail-Safe retention period.
Question 3
B) False.
Explanation:
The Time Travel feature in Snowflake allows you to access historical data, including
data that has been altered or deleted, within a specified time frame. While using Time
Travel may result in additional storage costs, this feature is not dependent on
running continuous backups. When any DML operations are performed on a table,
Snowflake retains previous versions of the table data for a defined period of time.
This enables querying earlier versions of the data.
Question 4
B) 100
Explanation:
A single task can have a maximum of 100 predecessor tasks and 100 child tasks.
Question 5
Explanation:
Snowpipe enables loading data from files as soon as they’re available in a stage
Question 6
A) External stages.
C) Internal Stages.
Explanation:
A) External Stages: These stages reference data files that are stored externally, such as
in a cloud storage service like AWS S3, Azure Blob Storage, or Google Cloud Storage.
C) Internal Stages: These stages store data files internally within Snowflake, making
them readily available for loading into tables.
Question 7
B) Yes, it will copy the file, causing potential duplicates. This is because Snowpipe
Explanation:
Snowpipe's load-history metadata expires after 14 days. Within that window, Snowpipe
ignores modified files that are staged again; to reload modified data files, it is currently
necessary to recreate the pipe object using the CREATE OR REPLACE PIPE syntax.
After the metadata expires, Snowpipe loads the data again, potentially resulting in
duplicate records in the target table.
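A sketch of recreating a pipe for this purpose (pipe, table, and stage names are hypothetical):

```sql
-- Recreating the pipe resets Snowpipe's load-history metadata,
-- so previously loaded (and since modified) files can be ingested again
CREATE OR REPLACE PIPE my_pipe
  AUTO_INGEST = TRUE
  AS COPY INTO my_table FROM @my_stage FILE_FORMAT = (TYPE = 'CSV');
```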
Question 8
A) Yes
Explanation:
All database objects shared between accounts are read-only (i.e. the objects cannot be
modified or dropped). Consumer accounts cannot modify or delete objects or data within
the shared database. The primary purpose of sharing databases is to enable other accounts
to query the shared data.
Question 9
A) Yes
Explanation:
A single storage integration can support multiple external stages. The URL in the
stage definition must align with the storage location specified for the
STORAGE_ALLOWED_LOCATIONS parameter.
STORAGE_ALLOWED_LOCATIONS = (‘cloud_specific_url’)
Explicitly limits external stages that use the integration to reference one or more
storage locations (i.e. S3 bucket, GCS bucket, or Azure container). Supports a comma-
separated list of URLs for existing buckets and, optionally, paths used to store data
files for loading/unloading. Alternatively supports the * wildcard, meaning "allow
access to all buckets and/or paths".
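As a sketch (the integration name, role ARN, and bucket URLs below are hypothetical), one integration can back several external stages as long as each stage URL falls under STORAGE_ALLOWED_LOCATIONS:

```sql
CREATE STORAGE INTEGRATION s3_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/snowflake_role'
  STORAGE_ALLOWED_LOCATIONS = ('s3://mybucket/raw/', 's3://mybucket/curated/');

-- Two stages sharing the same integration
CREATE STAGE raw_stage     URL = 's3://mybucket/raw/'     STORAGE_INTEGRATION = s3_int;
CREATE STAGE curated_stage URL = 's3://mybucket/curated/' STORAGE_INTEGRATION = s3_int;
```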
Question 10
Explanation:
Scaling out in Snowflake involves adding more virtual warehouses of the same size to
work in parallel. This approach increases concurrency and allows for more queries to
run at the same time. Scaling up, by contrast, increases the size of an existing warehouse
to provide more compute power, while the option related to requesting more data storage
from the cloud provider is unrelated to scaling virtual warehouses.
Question 11
Options:
Question 12
1. Yes
2. No
Question 13
Can you use the GET command in Snowflake to download files from an external
stage?
A) Yes, the GET command can be used to download files from external stages.
B) No, Snowflake does not support the GET command for downloading files.
C) The GET command is used for querying data, not for file downloads.
D) The DOWNLOAD command should be used instead of GET for file retrieval from
external stages.
Question 14
Options:
A) User.
B) Table.
C) Named internal.
D) Named external.
E) Account.
Question 15
A) GET.
B) PUT.
C) UNLOAD.
D) INSERT INTO.
Question 16
A) FLATTEN.
B) CHECK_JSON.
C) PARSE_JSON.
Question 17
What method does Snowflake employ to restrict retrieval to a minimal number of micro-partitions?
A) Pruning.
B) Clustering.
C) Indexing.
D) Computing.
Question 18
In which of the following situations is it advisable to utilize Snowpipe for loading data
into Snowflake?
Question 19
A) Yes
B) No
Question 20
Does Snowflake trigger the re-clustering of a table only when it would benefit from
the operation?
A) True
B) False
Answer Key
Question 11
Snowflake natively supports JSON through the VARIANT data type within the tables.
VARIANT columns can accommodate semi-structured data, including JSON, with a
maximum size limit of 16MB. This allows for efficient storage and querying of JSON
data using JSON notation.
Question 12
Solution: B) No
Question 13
Solution: B) No, Snowflake does not support the GET command for downloading
files.
Explanation:
The GET command downloads data files from one of the Snowflake internal stages (user,
table, or named internal) to a local directory or folder on a client machine. To
download files from external stages, use the utilities provided by the cloud
service.
Question 14
Solution:
A) User.
B) Table.
C) Named internal.
D) Named external.
and retrieval. These include User stages, Table stages (associated with individual
tables), Named internal stages (for internal storage), and Named external stages (for
external data sources). Each type of stage serves distinct purposes in managing and
staging data files.
Question 15
Solution: A) GET.
Explanation: The GET command downloads data files from a Snowflake internal stage, such as a user stage, table stage, or named internal stage, to
a directory or folder on a client machine. This operation can be performed using
SnowSQL, allowing for easy retrieval of data loaded using the COPY INTO command.
Question 16
Solution: A) FLATTEN.
Explanation: The FLATTEN function accepts VARIANT, OBJECT, or ARRAY columns and produces a lateral view, facilitating the conversion of semi-structured data into relational rows.
Question 17
Solution: A) Pruning.
Explanation: Snowflake utilizes query pruning to analyze and access only the minimum number of micro-partitions required to answer a query.
Question 18
Explanation: Snowpipe enables loading data from files as soon as they’re available
in a stage. This means you can load data from files in micro-batches, making it
available to users within minutes, rather than manually executing COPY statements
Solution: A) Yes
Explanation: After the Time Travel retention period ends, historical data may be recoverable by Snowflake during the Fail-safe period. Storage fees are incurred for
maintaining historical data during both the Time Travel and Fail-safe periods.
Question 20
A) True.
Explanation: As DML operations modify data over time, a clustered table can become less
optimally clustered. To address this, Snowflake performs automatic re-clustering in the
background. Importantly, Snowflake only initiates re-clustering for a table when it would
result in a benefit.
Question 1
B The user will encounter an error, and will not be able to log in.
Answer : A
Question 2
Which Snowflake feature provides increased login security for users connecting
to Snowflake that is powered by Duo Security service?
A OAuth
B Network policies
Answer : D
A column named "Data" contains VARIANT data and stores values as follows:
How will Snowflake extract the employee's name from the column data?
A Data:employee.name
B DATA:employee.name
C data:Employee.name
D data:employee.name
Answer : D
In Snowflake, to extract a specific value from a VARIANT column, you use the
column name followed by a colon and then the key. The keys are case-sensitive.
Therefore, to extract the employee's name from the "Data" column, the correct
syntax is data:employee.name.
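A minimal sketch of this path syntax (the table name is hypothetical):

```sql
-- Keys inside a VARIANT are case-sensitive; the column name itself is not
SELECT data:employee.name::STRING AS employee_name
FROM my_table;
```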
Question 4
Answer : C
The purpose of the Snowflake SPLIT_TO_TABLE function is to split a string based on a
specified delimiter and flatten the results into rows. This table function is useful for
transforming a delimited string into a set of rows that can be further processed or queried.
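For illustration:

```sql
-- Split a delimited string into rows
SELECT value
FROM TABLE(SPLIT_TO_TABLE('a,b,c', ','));
-- returns three rows: a, b, c
```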
Question 5
B. Insert a colon (:) between the VARIANT column name and any second-level element.
C. Insert a double colon (: :) between the VARIANT column name and any first-level element.
D. Insert a double colon (: :) between the VARIANT column name and any second-level element.
Which Snowflake feature allows a user to track sensitive data for compliance,
discovery, protection, and resource usage?
A Tags
B Comments
C Internal tokenization
Answer : A
Tags in Snowflake allow users to track sensitive data for compliance, discovery, protection,
and resource usage. They enable the categorization and tracking of data, supporting
compliance with privacy regulations. Reference: [COF-C02] SnowPro Core Certification
Exam Study Guide
Question 7
What does the LATERAL modifier for the FLATTEN function do?
A Casts the values of the flattened data
Answer : C
The LATERAL modifier for the FLATTEN function allows joining information outside the object
(such as other columns in the source table) with the flattened data, creating a lateral view that
correlates with the preceding tables in the FROM clause. Reference: [COF-C02] SnowPro
Core Certification Exam Study Guide
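A sketch of the lateral join (table and column names are hypothetical):

```sql
-- LATERAL lets the flattened rows reference columns of the preceding table
SELECT o.order_id,
       f.value:sku::STRING AS sku
FROM orders o,
     LATERAL FLATTEN(input => o.line_items) f;
```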
Options:
A The user will be unable to enter a password.
B The user will encounter an error, and will not be able to log in.
D After entering the username and password, the user will be redirected to an Identity Provider
(IdP) login page.
Answer: A
Explanation:
When federated authentication is activated in Snowflake, users
authenticate via an external identity provider (IdP) rather than using
Snowflake-managed credentials. Therefore, a user with a password
defined by Snowflake will be unable to enter a password and must
use their IdP credentials to log in.
Options:
A OAuth
B Network policies
Answer: D
Explanation:
Multi-Factor Authentication (MFA) provides increased login security
for users connecting to Snowflake. Snowflake's MFA is powered by
Duo Security service, which adds an additional layer of security
during the login process.
How will Snowflake extract the employee's name from the column data?
Options:
A Data:employee.name
B DATA:employee.name
C data:Employee.name
D data:employee.name
Answer: D
Explanation:
In Snowflake, to extract a specific value from a VARIANT column,
you use the column name followed by a colon and then the key.
The keys are case-sensitive. Therefore, to extract the employee's
name from the "Data" column, the correct syntax
is data:employee.name.
Options:
A To count the number of characters in a string
Answer: C
Explanation:
The purpose of the Snowflake SPLIT_TO_TABLE function is to split
a string based on a specified delimiter and flatten the results into
rows. This table function is useful for transforming a delimited string
into a set of rows that can be further processed or queried.
Options:
A Insert a colon (:) between the VARIANT column name and any first-level element.
B Insert a colon (:) between the VARIANT column name and any second-level element.
C Insert a double colon (: :) between the VARIANT column name and any first-level element.
D Insert a double colon (: :) between the VARIANT column name and any second-level element.
Answer: A
Explanation:
To traverse semi-structured data in Snowflake, a user can insert a
colon (:) between the VARIANT column name and any first-level
element. This path syntax is used to retrieve elements in a
VARIANT column.
01. Which copy option is used to delete the file from the Snowflake stage
when data from staged files are loaded successfully?
a) DELETE = TRUE
b) DEL = TRUE
c) PURGE = TRUE
d) REMOVE = TRUE
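For illustration, a sketch of the PURGE copy option (table and stage names are hypothetical):

```sql
COPY INTO my_table
FROM @my_stage
FILE_FORMAT = (TYPE = 'CSV')
PURGE = TRUE;  -- staged files are deleted after a successful load
```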
02. A clustering key is added or modified for a large table. Which type of
queries will likely see performance improvement?
a) Queries that select all rows in the table
b) Queries that sort on the columns which are part of the cluster key
c) Queries that join on the columns which are part of the cluster key
d) Queries that select all columns in the table
e) Queries that group on the columns which are part of the cluster key
f) Queries that filter on the columns which are part of the cluster key
07. When the Virtual Warehouse data cache gets filled up, in which fashion
does the data get flushed out from the data cache?
a) LEAST-RECENTLY USED (LRU)
b) First In First Out (FIFO)
c) Last In Last Out (LILO)
d) MOST-RECENTLY USED (MRU)
08. External tables are a good solution for which of the following is true?
a) Typically only a subset of data is being accessed
b) Data is in binary format and can not be loaded into Snowflake.
c) The data is not accessed frequently.
d) Data is already in a data lake on a cloud platform (e.g., S3, Azure Blob Storage)
09. You need to see the history of all queries executed in the last 60 minutes.
Which of the following method should you use?
a) Request Snowflake support to provide query history
b) Use the QUERY_HISTORY table function in the INFORMATION schema
c) Use the QUERY_HISTORY view in the ACCOUNT_USAGE schema
d) View the historical queries using the history tab
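A sketch of the Information Schema table function for the last 60 minutes:

```sql
SELECT query_id, query_text, start_time
FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY(
       end_time_range_start => DATEADD('minute', -60, CURRENT_TIMESTAMP())))
ORDER BY start_time DESC;
```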
10. What all options are available for data transformation while loading data
into a table using the COPY command?
a) Column omission
b) Column reordering
c) Casts
d) Truncation of Text Strings
e) Join
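A sketch of COPY-time transformations (the target table, stage, and column positions are hypothetical); note that joins are not supported in this context:

```sql
-- Reorder columns ($2 before $1), cast, and omit any remaining staged columns
COPY INTO target_table (id, amount)
FROM (SELECT $2::NUMBER, $1::NUMBER FROM @my_stage)
FILE_FORMAT = (TYPE = 'CSV');
```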
Questions 1
Which kind of Snowflake table stores file-level metadata for each file in a stage?
Options:
A. Directory
B. External
C. Temporary
D. Transient
Answer:
Explanation:
The kind of Snowflake table that stores file-level metadata for each file in a stage is a directory
table. A directory table is an implicit object layered on a stage and stores file-level metadata about
the data files in the stage.
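A sketch of querying a directory table (the stage name is hypothetical, and the stage must have been created with DIRECTORY = (ENABLE = TRUE)):

```sql
-- File-level metadata for each file in the stage
SELECT relative_path, size, last_modified
FROM DIRECTORY(@my_stage);
```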
Questions 2
What is the minimum Snowflake edition needed for database failover and fail-back between
Snowflake accounts for business continuity and disaster recovery?
Options:
A. Standard
B. Enterprise
C. Business Critical
Answer:
Explanation:
The minimum Snowflake edition required for database failover and fail-back between Snowflake
accounts for business continuity and disaster recovery is the Business Critical edition. References:
Snowflake Documentation.
Questions 3
Which database objects can be shared with the Snowflake secure data sharing feature? (Choose
two.)
Options:
A. Files
B. External tables
C. Secure User-Defined Functions (UDFs)
D. Sequences
E. Streams
Answer:
B, C
Explanation:
Snowflake’s secure data sharing feature allows sharing of certain database objects with other
Snowflake accounts. Among the options provided, external tables and secure UDFs can be shared
Question 1
In the Snowflake access control model, which entity owns an object by default?
Answer:D
Question 2
What is the default access of a securable object until other access is granted?
• A. No access
• B. Read access
• C. Write access
• D. Full access
Answer:A
Question 3
A user created a new worksheet within the Snowsight UI and wants to share this with
teammates.
Answer:B
Question 4
• A. ALTER
• B. CREATE ... CLONE
• C. COPY INTO
• D. CREATE REPLICATION GROUP
Answer:D
Question 5
Which sequence (order) of object privileges should be used to grant a custom role
read-only access on a table?
• D. None
Answer:C
Question 6
Which of the following objects can be directly restored using the UNDROP
command? (Choose two.)
• A. Schema
• B. View
• C. Internal stage
• D. Table
• E. User
• F. Role
Answer: AD
Question 7
What is the default compression type when unloading data from Snowflake?
• A. Brotli
• B. bzip2
• C. Zstandard
• D. gzip
Answer: D
Question 8
• A. SYSADMIN only
• B. ORGADMIN only
• C. SECURITYADMIN or higher roles
• D. A role with the CREATE NETWORK POLICY privilege
• E. A role with the CREATE SECURITY INTEGRATION privilege
Answer: CD
Question 9
Which table function should be used to view details on a Directed Acyclic Graph
(DAG) run that is presently scheduled or is executing?
• A. TASK_HISTORY
• B. TASK_DEPENDENTS
• C. CURRENT_TASK_GRAPHS
• D. COMPLETE_TASK_GRAPHS
Answer: C
Question 10
What is used to extract the content of PDF files stored in Snowflake stages?
• A. FLATTEN function
• B. Window function
• C. HyperLogLog (HLL) function
• D. Java User-Defined Function (UDF)
Answer: D
Question 11
Answer: B
Question 12
• A. ACCOUNTADMIN
• B. SECURITYADMIN
• C. SYSADMIN
• D. ORGADMIN
Answer: D
Question 13
• A. Clustering
• B. Client redirect
• C. Fail-safe
• D. Search optimization service
• E. Account and database replication
Answer: BE
Question 14
What is the maximum total Continuous Data Protection (CDP) charges incurred for a
temporary table?
• A. 30 days
• B. 7 days
• C. 48 hours
• D. 24 hours
Answer: D
Question 15
What happens to the shared objects for users in a consumer account from a share,
once a database has been created in that account?
Answer: B
Question 16
What services does Snowflake automatically provide for customers that they may
have been responsible for with their on-premises system? (Choose all that apply.)
Answer: ABD
Question 17
Which COPY INTO statement accurately describes how to unload data from a
Snowflake table?
Question 18
What tasks can an account administrator perform in the Data Exchange? (Choose
two.)
Answer: AC
Question 19
• A. SQL functions like the GET command can be used to copy the unstructured
data to a location on the client.
• B. SQL functions can be used to create different types of URLs pointing to the
unstructured data. These URLs can be used to download the data to a client.
• C. SQL functions can be used to retrieve the data from the query results cache.
When the query results are output to a client. the unstructured data will be
output to the client as files.
• D. SQL functions can call on different web extensions designed to display
different types of files as a web page. The web extensions will allow the files to
be downloaded to the client.
Answer: B
• A. Tasks
• B. Pipes
• C. Secure User-Defined Functions (UDFs)
• D. Stored Procedures
Answer: C
Questions 4
Options:
A.
When increasing the size of an active warehouse, the compute resources for all running and
queued queries on the warehouse are affected.
B.
When reducing the size of a warehouse the compute resources are removed only when they are
no longer being used to execute any current statements.
C.
The warehouse will be suspended while the new compute resource is provisioned and will resume
automatically once provisioning is complete.
D.
Users who are trying to use the warehouse will receive an error message until the resizing is
complete
Answer: A
Explanation:
When a virtual warehouse in Snowflake is resized, specifically when it is increased in size, the
additional compute resources become immediately available to all running and queued queries.
This means that the performance of these queries can improve due to the increased
resources. Conversely, when the size of a warehouse is reduced, the compute resources are not
removed until they are no longer being used by any current operations.
Questions 5
Which Snowflake objects track DML changes made to tables, like inserts, updates, and deletes?
Options:
A.
Pipes
B.
Streams
C.
Tasks
D.
Procedures
Answer:
Explanation:
In Snowflake, Streams are the objects that track Data Manipulation Language (DML) changes
made to tables, such as inserts, updates, and deletes. Streams record these changes along with
metadata about each change, enabling actions to be taken using the changed data. This process is
known as change data capture (CDC).
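A sketch of creating and reading a stream (the table name is hypothetical):

```sql
-- Track inserts, updates, and deletes on the orders table
CREATE STREAM orders_stream ON TABLE orders;

-- Reading the stream also returns METADATA$ACTION and METADATA$ISUPDATE,
-- which describe the type of each captured change
SELECT * FROM orders_stream;
```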
Questions 6
Options:
A.
B.
C.
D.
Answer:
Explanation:
When unloading data to a stage, it is recommended to define an individual file format. This
ensures that the data is unloaded in a consistent and expected format, which can be crucial for
downstream processing and analysis.
Questions 7
Options:
A.
Possibly, if the warehouse is resized to a smaller size and the cache no longer fits.
B.
Yes, because the compute resource is replaced in its entirety with a new compute resource.
C.
No, because the size of the cache is independent from the warehouse size.
D.
Yes, because the new compute resource will no longer have access to the cache encryption key.
Answer:
Explanation:
When a Snowflake virtual warehouse is resized, the data cached in the warehouse is not lost. This
is because the cache is maintained independently of the warehouse size. Resizing a warehouse,
whether scaling up or down, does not affect the cached data, ensuring that query performance is
not impacted by such changes.
References:
Questions 8
Options:
A.
Standard Edition
B.
Enterprise Edition
C.
Business Critical Edition
D.
Answer:
Explanation:
Materialized views in Snowflake are a feature that allows for the pre-computation and storage of
query results for faster query performance. This feature is available starting from the Enterprise
Edition of Snowflake. It is not available in the Standard Edition, and while it is also available in
higher editions like Business Critical and Virtual Private Snowflake, the Enterprise Edition is the
minimum requirement.
Questions 9
What can be used to view warehouse usage over time? (Select Two).
Options:
A.
B.
C.
D.
E.
Answer:
B, D
Explanation:
To view warehouse usage over time, the Query History view and the
WAREHOUSE_METERING_HISTORY view can be utilized. The Query History view allows users
to monitor the performance of their queries and the load on their warehouses over a specified
period. The WAREHOUSE_METERING_HISTORY view provides detailed information about
the workload on a warehouse within a specified date range, including average running and
queued loads. References: [COF-C02] SnowPro Core Certification Exam Study Guide
Questions 10
Which command can be used to stage local files from which Snowflake interface?
Options:
A.
SnowSQL
B.
C.
Snowsight
D.
.NET driver
Answer:
A
Explanation:
SnowSQL is the command-line client for Snowflake that allows users to execute SQL queries and
perform all DDL and DML operations, including staging files for bulk data loading. It is specifically
designed for scripting and automating tasks.
Questions 11
Options:
A.
Variant
B.
Object
C.
Geometry
D.
Geography
Answer:
Explanation:
Questions 12
What are two ways to create and manage Data Shares in Snowflake? (Choose two.)
Options:
A.
B.
C.
D.
Answer:
A, C
Explanation:
In Snowflake, Data Shares can be created and managed in two primary ways:
• Via the Snowflake Web Interface (UI): Users can create and manage shares through the
graphical interface provided by Snowflake, which allows for a user-friendly experience.
• Via SQL commands: Snowflake also allows the creation and management of shares using
SQL commands. This method is more suited for users who prefer scripting or need to
automate the process.
Questions 13
What transformations are supported in a CREATE PIPE ... AS COPY ... FROM (....) statement?
(Select TWO.)
Options:
A.
B.
C.
D.
E.
Answer:
A, D
Explanation:
In a CREATE PIPE ... AS COPY ... FROM (....) statement, the supported transformations include
filtering data using an optional WHERE clause and omitting columns. The WHERE clause allows
for the specification of conditions to filter the data that is being loaded, ensuring only relevant
data is inserted into the table. Omitting columns enables the exclusion of certain columns from
the data load, which can be useful when the incoming data contains more columns than are
needed for the target table.
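A sketch of column omission in a pipe definition (pipe, table, and stage names are hypothetical):

```sql
-- Only two of the staged columns are loaded; the rest are omitted
CREATE PIPE events_pipe AS
  COPY INTO events (event_id, event_type)
  FROM (SELECT $1, $2 FROM @my_stage);
```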
Questions 14
What tasks can be completed using the copy command? (Select TWO)
Options:
A.
B.
C.
D.
E.
Answer:
C, D
Explanation:
The COPY command in Snowflake allows for the reordering of columns as they are loaded into a
table, and it also permits the omission of columns from the source file during the load process.
This provides flexibility in handling the schema of the data being ingested. References: [COF-C02]
SnowPro Core Certification Exam Study Guide
Questions 15
A.
B.
C.
D.
Answer:
Explanation:
Snowflake is designed to optimize query performance by utilizing metadata for certain types of
queries. When executing a COUNT query, Snowflake can often fulfill the request by accessing
metadata about the table’s row count, rather than scanning the entire table or micro-partitions.
This is particularly efficient for large tables like FCT_SALES with a significant number of records.
The metadata layer maintains statistics about the table, including the row count, which enables
Snowflake to quickly return the result of a COUNT query without the need to perform a full scan.
Questions 16
Options:
A.
1 day
B.
7 days
C.
45 days
D.
90 days
Answer:
Explanation:
Fail-safe is a feature in Snowflake that provides an additional layer of data protection. After the
Time Travel retention period ends, Fail-safe offers a non-configurable 7-day period during which
historical data may be recoverable by Snowflake. This period is designed to protect against
accidental data loss and is not intended for customer access.
Questions 17
Which command is used to unload data from a Snowflake table into a file in a stage?
Options:
A.
COPY INTO
B.
GET
C.
WRITE
D.
EXTRACT INTO
Explanation:
The COPY INTO command is used in Snowflake to unload data from a table into a file in a stage.
This command allows for the export of data from Snowflake tables into flat files, which can then
be used for further analysis, processing, or storage in external systems.
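A sketch of unloading to a stage (the stage path and table name are hypothetical):

```sql
-- Unload (export) table data to gzipped CSV files in a stage
COPY INTO @my_stage/exports/sales_
FROM sales
FILE_FORMAT = (TYPE = 'CSV' COMPRESSION = 'GZIP');
```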
Questions 18
Which command can be used to load data into an internal stage?
Options:
A.
LOAD
B.
COPY
C.
GET
D.
PUT
Answer:
Explanation:
The PUT command is used to load data into an internal stage in Snowflake. This command
uploads data files from a local file system to a named internal stage, making the data available for
subsequent loading into a Snowflake table using the COPY INTO command.
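A sketch of the two-step flow, run from a client such as SnowSQL (the local path and table name are hypothetical):

```sql
-- Upload a local file to the table's internal stage, then load it
PUT file:///tmp/data.csv @%my_table;
COPY INTO my_table FROM @%my_table FILE_FORMAT = (TYPE = 'CSV');
```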
Questions 19
Which of the following conditions must be met in order to return results from the results cache?
(Select TWO).
Options:
A.
The user has the appropriate privileges on the objects associated with the query
B.
Micro-partitions have been reclustered since the query was last run
C.
The new query is run using the same virtual warehouse as the previous query
D.
E.
The query has been run within 24 hours of the previously-run query
Answer:
A, E
Explanation:
To return results from the results cache in Snowflake, certain conditions must be met:
• Privileges: The user must have the appropriate privileges on the objects associated with
the query. This ensures that only authorized users can access cached data.
• Time Frame: The query must have been run within 24 hours of the previously-run query.
Snowflake’s results cache is designed to store the results of queries for a short period,
typically 24 hours, to improve performance for repeated queries.
Questions 20
True or False: It is possible for a user to run a query against the query result cache without
requiring an active Warehouse.
Options:
A.
True
B.
False
Answer:
A
Explanation:
Snowflake’s architecture allows for the use of a query result cache that stores the results of
queries for a period of time. If the same query is run again and the underlying data has not
changed, Snowflake can retrieve the result from this cache without needing to re-run the query
on an active warehouse, thus saving on compute resources.
Questions 21
A company's security audit requires generating a report listing all Snowflake logins (e.g., date and
user) within the last 90 days. Which of the following statements will return the required
information?
Options:
A.
FROM ACCOUNT_USAGE.USERS;
B.
FROM table(information_schema.login_history_by_user())
C.
FROM ACCOUNT_USAGE.ACCESS_HISTORY;
D.
FROM ACCOUNT_USAGE.LOGIN_HISTORY;
Answer:
D
Explanation:
To generate a report listing all Snowflake logins within the last 90 days,
the ACCOUNT_USAGE.LOGIN_HISTORY view should be used. This view provides information
about login attempts, including successful and unsuccessful logins, and is suitable for security
audits.
Questions 22
Options:
A.
Alation
B.
DataRobot
C.
dbt
D.
Tableau
Answer:
A
Explanation:
Alation is known for specializing in data catalog solutions and is a partner of Snowflake. Data
catalog solutions are essential for organizations to effectively manage their metadata and make it
easily accessible and understandable for users, which aligns with the capabilities provided by
Alation.
Questions 23
Options:
A.
B.
C.
D.
E.
Answer:
C, E
Explanation:
The result set cache in Snowflake is invalidated and no longer available when the underlying data
of the query results has changed, ensuring that queries return the most current data. Additionally,
the cache expires after 24 hours to maintain the efficiency and accuracy of data retrieval.
Questions 24
Which of the following describes how multiple Snowflake accounts in a single organization relate
to various cloud providers?
Options:
A.
Each Snowflake account can be hosted in a different cloud vendor and region.
B.
Each Snowflake account must be hosted in a different cloud vendor and region
C.
All Snowflake accounts must be hosted in the same cloud vendor and region
D.
Each Snowflake account can be hosted in a different cloud vendor, but must be in the same
region.
Answer:
A
Explanation:
Snowflake’s architecture allows for flexibility in account hosting across different cloud vendors
and regions. This means that within a single organization, different Snowflake accounts can be set
up in various cloud environments, such as AWS, Azure, or GCP, and in different geographical
regions. This allows organizations to leverage the global infrastructure of multiple cloud providers
and optimize their data storage and computing needs based on regional requirements, data
sovereignty laws, and other considerations.
https://docs.snowflake.com/en/user-guide/intro-regions.html
Questions 25
Options:
A.
Querying data-related objects that were created within the past 365 days
B.
Restoring data-related objects that have been deleted within the past 90 days
C.
D.
Answer:
B, C
Explanation:
Snowflake Time Travel is a powerful feature that allows users to access historical data within a
defined period. It enables two key capabilities:
• B. Restoring data-related objects that have been deleted within the past 90 days: Time
Travel can be used to restore tables, schemas, and databases that have been accidentally
or intentionally deleted within the Time Travel retention period.
• C. Conducting point-in-time analysis for BI reporting: It allows users to query historical
data as it appeared at a specific point in time within the Time Travel retention period,
which is crucial for business intelligence and reporting purposes.
While Time Travel does allow querying of past data, it is limited to the retention period set for the
Snowflake account, which is typically 1 day for standard accounts and can be extended up to 90
days for enterprise accounts. It does not enable querying or restoring objects created or deleted
beyond the retention period, nor does it provide analysis over all periods of time.
Questions 26
Options:
A.
B.
C.
D.
Answer:
Explanation:
Materialized Views in Snowflake are designed to store the result of a query and can be refreshed
to maintain up-to-date data. However, they have certain limitations, one of which is that they
cannot be defined using a JOIN clause. This means that a Materialized View can only be created
based on a single source table and cannot combine data from multiple tables using JOIN
operations.
Questions 27
Which command can be used to list all the file formats for which a user has access privileges?
Options:
A.
LIST
B.
C.
D.
Answer:
Explanation:
The command to list all the file formats for which a user has access privileges in Snowflake
is SHOW FILE FORMATS. This command provides a list of all file formats defined in the user's
current session or specified database/schema, along with details such as the name, type, and
creation time of each file format. It is a valuable tool for users to understand and manage the file
formats available for data loading and unloading operations.
Questions 28
Options:
A.
ARRAY
B.
CHARACTER
C.
STRING
D.
VARCHAR
E.
VARIANT
Answer:
A, E
Explanation:
In Snowflake, semi-structured data is optimally stored using specific data types that are designed
to handle the flexibility and complexity of such data. The VARIANT data type can store structured
and semi-structured data types, including JSON, Avro, ORC, Parquet, or XML, in a single column.
The ARRAY data type, on the other hand, is suitable for storing ordered sequences of elements,
which can be particularly useful for semi-structured data types like JSON arrays. These data types
provide the necessary flexibility to store and query semi-structured data efficiently in Snowflake.
Questions 29
Which function will provide the proxy information needed to protect Snowsight?
Options:
A.
SYSTEMADMIN_TAG
B.
SYSTEM$GET_PRIVATELINK
C.
SYSTEM$ALLOWLIST
D.
SYSTEMAUTHORIZE
Answer:
B
Explanation:
SYSTEM$GET_PRIVATELINK returns the private connectivity configuration for the current account,
including the URLs (such as the Snowsight URL) that must be added to a firewall or proxy allowlist
to protect access to Snowsight.
Questions 30
Which view can be used to determine if a table has frequent row updates or deletes?
Options:
A.
TABLES
B.
TABLE_STORAGE_METRICS
C.
STORAGE_DAILY_HISTORY
D.
STORAGE USAGE
Answer:
B
Explanation:
The TABLE_STORAGE_METRICS view can be used to determine if a table has frequent row
updates or deletes. This view provides detailed metrics on the storage utilization of tables within
Snowflake, including metrics that reflect the impact of DML operations such as updates and
deletes on table storage. For example, metrics related to the number of active and deleted rows
can help identify tables that experience high levels of row modifications, indicating frequent
updates or deletions.
Questions 31
Options:
A.
B.
C.
D.
Answer:
Explanation:
The search optimization service in Snowflake supports use cases involving conjunctions (AND) of
multiple equality predicates. This service enhances the performance of queries that include
multiple equality conditions by utilizing search indexes to quickly filter data without scanning
entire tables or partitions. It's particularly beneficial for improving the response times of complex
queries that rely on specific data matching across multiple conditions.
Questions 32
Why would a Snowflake user decide to use a materialized view instead of a regular view?
Options:
A.
B.
C.
D.
Answer:
Explanation:
A Snowflake user would decide to use a materialized view instead of a regular view primarily
when the base tables do not change frequently. Materialized views store the result of the view
query and update it as the underlying data changes, making them ideal for situations where the
data is relatively static and query performance is critical. By precomputing and storing the query
results, materialized views can significantly reduce query execution times for complex
aggregations, joins, and calculations.
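A minimal sketch, assuming a hypothetical sales table, of a materialized view that precomputes a
daily aggregate:

```sql
-- Snowflake maintains the stored results automatically as the base table changes.
CREATE MATERIALIZED VIEW daily_sales_mv AS
SELECT sale_date, SUM(amount) AS total_amount
FROM sales
GROUP BY sale_date;
```

Note that the view is defined over a single table; joins are not supported in materialized view
definitions.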
Questions 33
What is the only supported character set for loading and unloading data from all supported file
formats?
Options:
A.
UTF-8
B.
UTF-16
C.
ISO-8859-1
D.
WINDOWS-1253
Answer:
A
Explanation:
UTF-8 is the only supported character set for loading and unloading data from all supported file
formats in Snowflake. UTF-8 is a widely used encoding that supports a large range of characters
from various languages, making it suitable for internationalization and ensuring data compatibility
across different systems and platforms.
Questions 34
What happens when a network policy includes values that appear in both the allowed and
blocked IP address list?
Options:
A.
Those IP addresses are allowed access to the Snowflake account as Snowflake applies the allowed
IP address list first.
B.
Those IP addresses are denied access to the Snowflake account, as Snowflake applies the blocked
IP address list first.
C.
Snowflake issues an alert message and adds the duplicate IP address values to both the allowed
and blocked IP address lists.
D.
Snowflake issues an error message and adds the duplicate IP address values to both the allowed
and blocked IP address lists.
Answer:
B
Explanation:
In Snowflake, when setting up a network policy that specifies both allowed and blocked IP
address lists, if an IP address appears in both lists, access from that IP address will be denied. The
reason is that Snowflake prioritizes security, and the presence of an IP address in the blocked list
indicates it should not be allowed regardless of its presence in the allowed list. This ensures that
access controls remain stringent and that any potentially unsafe IP addresses are not
inadvertently permitted access.
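A sketch with hypothetical addresses illustrates the precedence rule: 192.168.1.99 appears in both
lists, so connections from it are denied.

```sql
-- The blocked list is applied first, so the overlapping address is denied.
CREATE NETWORK POLICY my_policy
  ALLOWED_IP_LIST = ('192.168.1.0/24')
  BLOCKED_IP_LIST = ('192.168.1.99');

ALTER ACCOUNT SET NETWORK_POLICY = my_policy;
```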
Questions 35
If a virtual warehouse runs for 61 seconds, shuts down, and then restarts and runs for 30
seconds, for how many seconds is it billed?
Options:
A.
60
B.
91
C.
120
D.
121
Answer:
D
Explanation:
Snowflake bills virtual warehouse usage per second, with a 60-second minimum each time the
warehouse starts or resumes. The first run lasts 61 seconds, which exceeds the minimum and is
billed as 61 seconds. The second run lasts 30 seconds, which falls under the minimum and is
billed as 60 seconds. The total is therefore 61 + 60 = 121 seconds.
Questions 36
What are valid sub-clauses to the OVER clause for a window function? (Select TWO).
Options:
A.
GROUP BY
B.
LIMIT
C.
ORDER BY
D.
PARTITION BY
E.
UNION ALL
Answer:
C, D
Explanation:
Valid sub-clauses to the OVER clause for a window function in SQL are:
• C. ORDER BY: This clause specifies the order in which the rows in a partition are
processed by the window function. It is essential for functions that depend on the row
order, such as ranking functions.
• D. PARTITION BY: This clause divides the result set into partitions to which the window
function is applied. Each partition is processed independently of other partitions, making it
crucial for functions that compute values across sets of rows that share common
characteristics.
These clauses are fundamental to defining the scope and order of data over which the window
function operates, enabling complex analytical computations within SQL queries.
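A brief sketch over a hypothetical employees table showing both sub-clauses together:

```sql
-- Rank rows within each department (PARTITION BY) by salary (ORDER BY).
SELECT department,
       employee,
       salary,
       RANK() OVER (PARTITION BY department ORDER BY salary DESC) AS salary_rank
FROM employees;
```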
Questions 37
Which SQL command can be used to verify the privileges that are granted to a role?
Options:
A.
B.
SHOW ROLES
C.
D.
Answer:
Explanation:
To verify the privileges that have been granted to a specific role in Snowflake, the correct SQL
command is SHOW GRANTS TO ROLE <Role Name>. This command lists all the privileges
granted to the specified role, including access to schemas, tables, and other database objects. This
is a useful command for administrators and users with sufficient privileges to audit and manage
role permissions within the Snowflake environment.
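For example, with a hypothetical role name:

```sql
-- List every privilege granted to the role.
SHOW GRANTS TO ROLE analyst_role;
```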
Questions 38
Options:
A.
Cloud services
B.
Query processing
C.
Elastic memory
D.
Database storage
Answer:
B
Explanation:
The layer of Snowflake's architecture associated with virtual warehouses is the Query Processing
layer. Virtual warehouses in Snowflake are dedicated compute clusters that execute SQL queries
against the stored data. This layer is responsible for the entire query execution process, including
parsing, optimization, and the actual computation. It operates independently of the storage layer,
enabling Snowflake to scale compute and storage resources separately for efficiency and cost-
effectiveness.
Questions 39
Options:
A.
B.
C.
D.
E.
Answer:
B, C
Explanation:
Transient tables in Snowflake are designed for temporary or intermediate workloads with the
following characteristics:
• B. Transient tables can be cloned to permanent tables: This feature allows users to create
copies of transient tables for permanent use, providing flexibility in managing data
lifecycles.
• C. Transient tables persist until they are explicitly dropped: Unlike temporary tables that
exist for the duration of a session, transient tables remain in the database until explicitly
removed by a user, offering more durability for short-term data storage needs.
Questions 40
Which function is used to convert rows in a relational table to a single VARIANT column?
Options:
A.
ARRAY_AGG
B.
OBJECT_AGG
C.
ARRAY_CONSTRUCT
D.
OBJECT_CONSTRUCT
Answer:
D
Explanation:
The OBJECT_CONSTRUCT function in Snowflake is used to convert rows in a relational table
into a single VARIANT column that represents each row as a JSON object. This function
dynamically creates a JSON object from a list of key-value pairs, where each key is a column
name and each value is the corresponding column value for a row. This is particularly useful for
aggregating and transforming structured data into semi-structured JSON format for further
processing or analysis.
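A short sketch over a hypothetical customers table:

```sql
-- Build a JSON object per row from explicit key-value pairs.
SELECT OBJECT_CONSTRUCT('id', id, 'name', name) AS row_json
FROM customers;

-- OBJECT_CONSTRUCT(*) uses all column names and values in the row.
SELECT OBJECT_CONSTRUCT(*) AS row_json
FROM customers;
```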
Questions 41
Options:
A.
Secure view
B.
Materialized view
C.
Temporary table
D.
Transient table
Answer:
C
Explanation:
Temporary tables exist only for the duration of the session in which they were created and are
dropped automatically when the session ends. Because their data is purged at session end, they
do not incur ongoing storage costs beyond the lifetime of the session.
Questions 42
When floating-point number columns are unloaded to CSV or JSON files, Snowflake truncates the
values to approximately what?
Options:
A.
(12,2)
B.
(10,4)
C.
(14,8)
D.
(15,9)
Answer:
D
Explanation:
When unloading floating-point number columns to CSV or JSON files, Snowflake truncates the
values to approximately 15 significant digits with 9 digits following the decimal point, which can
be represented as (15,9). This ensures a balance between accuracy and efficiency in representing
floating-point numbers in text-based formats, which is essential for data interchange and
processing applications that consume these files.
References:
• Snowflake Documentation: Data Unloading Considerations
Questions 43
Options:
A.
B.
C.
D.
Answer:
Explanation:
The Query Profile in Snowflake provides a graphical representation and statistics for each
component of the query's execution plan. This includes details such as the execution time, the
number of rows processed, and the amount of data scanned for each operation within the query.
The Query Profile is a crucial tool for understanding and optimizing the performance of queries,
as it helps identify potential bottlenecks and inefficiencies.
Questions 44
A.
Reader account users cannot add new data to the account.
B.
C.
A single reader account can consume data from multiple provider accounts.
D.
Data consumers are responsible for reader account setup and data usage costs.
E.
Reader accounts enable data consumers to access and query data shared by the provider.
Answer:
A, E
Explanation:
• A. Reader account users cannot add new data to the account: Reader accounts are
intended for data consumption only. Users of these accounts can query and analyze the
data shared with them but cannot upload or add new data to the account.
• E. Reader accounts enable data consumers to access and query data shared by the
provider: One of the primary purposes of reader accounts is to allow data consumers to
access and perform queries on the data shared by another Snowflake account, facilitating
secure and controlled data sharing.
Questions 45
What is it called when a customer-managed key is combined with a Snowflake-managed key to
create a composite key for encryption?
Options:
A.
B.
Client-side encryption
C.
D.
Answer:
Explanation:
Tri-secret secure encryption is a security model employed by Snowflake that involves combining a
customer-managed key with a Snowflake-managed key to create a composite key for encrypting
data. This model enhances data security by requiring both the customer-managed key and the
Snowflake-managed key to decrypt data, thus ensuring that neither party can access the data
independently. It represents a balanced approach to key management, leveraging both customer
control and Snowflake's managed services for robust data encryption.
Questions 46
Which file function generates a Snowflake-hosted URL that must be authenticated when used?
Options:
A.
GET_STAGE_LOCATION
B.
GET_PRESIGNED_URL
C.
BUILD_SCOPED_FILE_URL
D.
BUILD_STAGE_FILE_URL
Answer:
D
Explanation:
BUILD_STAGE_FILE_URL generates a Snowflake-hosted file URL to a staged file. The URL does not
expire, and accessing it requires the user to be authenticated with Snowflake and authorized on
the underlying stage.
Questions 47
There are two Snowflake accounts in the same cloud provider region: one is production and the
other is non-production. How can data be easily transferred from the production account to the
non-production account?
Options:
A.
Clone the data from the production account to the non-production account.
B.
Create a data share from the production account to the non-production account.
C.
Create a subscription in the production account and have it publish to the non-production
account.
D.
Create a reader account using the production account and link the reader account to the non-
production account.
Answer:
B
Explanation:
Secure Data Sharing allows a provider account to share databases and other objects with another
account in the same cloud provider region without copying or moving any data, making a data
share the easiest way to transfer data from the production account to the non-production
account.
Questions 48
Options:
A.
ALTER ROLE
B.
REVOKE ROLE
C.
USE ROLE
D.
Answer:
B
Explanation:
The REVOKE ROLE command is used to remove a role from another role or a user in Snowflake.
This command is part of Snowflake's role-based access control system, allowing administrators to
manage permissions and access to database objects efficiently by adding or removing roles from
users or other roles.
Questions 49
Options:
A.
A single executable statement can call only two stored procedures. In contrast, a single SQL
statement can call multiple UDFs.
B.
A single executable statement can call only one stored procedure. In contrast, a single SQL
statement can call multiple UDFs.
C.
A single executable statement can call multiple stored procedures. In contrast, multiple SQL
statements can call the same UDFs.
D.
Multiple executable statements can call more than one stored procedure. In contrast, a single SQL
statement can call multiple UDFs.
Answer:
B
Explanation:
In Snowflake, stored procedures and User-Defined Functions (UDFs) have different invocation
patterns within SQL:
• Option B is correct: A single executable statement can call only one stored procedure due
to the procedural and potentially transactional nature of stored procedures. In contrast, a
single SQL statement can call multiple UDFs because UDFs are designed to operate more
like functions in traditional programming, where they return a value and can be embedded
within SQL queries. References: Snowflake documentation comparing the operational
differences between stored procedures and UDFs.
Questions 50
Regardless of which notation is used, what are considerations for writing the column name and
element names when traversing semi-structured data?
Options:
A.
B.
D.
Answer:
Explanation:
When querying semi-structured data in Snowflake, the behavior towards case sensitivity is
distinct between column names and the names of elements within the semi-structured data.
Column names follow the general SQL norm of being case-insensitive, meaning you can reference
them in any case without affecting the query. However, element names within JSON, XML, or
other semi-structured data are case-sensitive. This distinction is crucial for accurate data retrieval
and manipulation in Snowflake, especially when working with JSON objects where the case of
keys can significantly alter the outcome of queries.
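A sketch with a hypothetical table and VARIANT column makes the distinction concrete:

```sql
-- Assume my_json_table has a VARIANT column v holding {"Name": "Alice"}.
SELECT V:Name FROM my_json_table;  -- column name case is ignored; returns "Alice"
SELECT v:name FROM my_json_table;  -- element name case matters; returns NULL
```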
Questions 51
While clustering a table, columns with which data types can be used as clustering keys? (Select
TWO).
Options:
A.
BINARY
B.
GEOGRAPHY
C.
GEOMETRY
D.
OBJECT
E.
VARIANT
Answer:
A, C
Explanation:
A clustering key can be defined when a table is created by appending a CLUSTER BY clause to
CREATE TABLE. Each clustering key consists of one or more table columns/expressions, which can
be of any data type except GEOGRAPHY, VARIANT, OBJECT, or ARRAY.
https://docs.snowflake.com/en/user-guide/tables-clustering-keys
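A minimal sketch (table and column names are illustrative):

```sql
-- BINARY and GEOMETRY columns may appear in a clustering key;
-- VARIANT, OBJECT, ARRAY, and GEOGRAPHY columns may not.
CREATE TABLE events (
    event_date DATE,
    event_key  BINARY,
    payload    VARIANT
)
CLUSTER BY (event_date, event_key);
```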
Questions 52
A Snowflake user is writing a User-Defined Function (UDF) that includes some unqualified object
names.
Options:
A.
B.
Snowflake will only check the schema the UDF belongs to.
C.
Snowflake will first check the current schema, and then the schema the previous query used
D.
Snowflake will first check the current schema, and then the PUBLIC schema of the current
database.
Answer:
Explanation:
• Object Name Resolution: When unqualified object names (e.g., table name without
schema) are used in a UDF, Snowflake follows a specific hierarchy to resolve them. Here's
the order:
• Note: The SEARCH_PATH parameter influences object resolution for queries, not within
UDFs.
Questions 53
Which function should be used to insert JSON format string data into a VARIANT field?
Options:
A.
FLATTEN
B.
CHECK_JSON
C.
PARSE_JSON
D.
TO_VARIANT
Answer:
C
Explanation:
To insert JSON formatted string data into a VARIANT field in Snowflake, the correct function to
use is PARSE_JSON. The PARSE_JSON function is specifically designed to interpret a JSON
formatted string and convert it into a VARIANT type, which is Snowflake's flexible format for
handling semi-structured data like JSON, XML, and Avro. This function is essential for loading and
querying JSON data within Snowflake, allowing users to store and manage JSON data efficiently
while preserving its structure for querying purposes. This function's usage and capabilities are
detailed in the Snowflake documentation, providing users with guidance on how to handle semi-
structured data effectively within their Snowflake environments.
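A minimal sketch (the table name is illustrative):

```sql
CREATE TABLE json_demo (v VARIANT);

-- PARSE_JSON converts the JSON string into a VARIANT value.
INSERT INTO json_demo
SELECT PARSE_JSON('{"name": "Alice", "age": 30}');

SELECT v:name FROM json_demo;  -- returns "Alice"
```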
Questions 54
What is the MAXIMUM number of clusters that can be provisioned with a multi-cluster virtual
warehouse?
Options:
A.
B.
C.
10
D.
100
Answer:
C
Explanation:
In Snowflake, the maximum number of clusters that can be provisioned within a multi-cluster
virtual warehouse is 10. This allows for significant scalability and performance management by
enabling Snowflake to handle varying levels of query load by adjusting the number of active
clusters within the warehouse. References: Snowflake documentation on virtual warehouses,
particularly the scalability options available in multi-cluster configurations.
Questions 55
Which data formats are supported by Snowflake when unloading semi-structured data? (Select
TWO).
Options:
A.
B.
C.
Comma-separated JSON
D.
E.
Answer:
B, D
Explanation:
Snowflake supports a variety of file formats for unloading semi-structured data, among which
Parquet and Newline Delimited JSON (NDJSON) are two widely used formats.
• B. Binary file in Parquet: Parquet is a columnar storage file format optimized for large-scale
data processing and analysis. It is especially suited for complex nested data structures.
• D. Newline Delimited JSON (NDJSON): This format represents JSON records separated by
newline characters, facilitating the storage and processing of multiple, separate JSON
objects in a single file.
These formats are chosen for their efficiency and compatibility with data analytics tools and
ecosystems, enabling seamless integration and processing of exported data.
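As a sketch (stage and table names are placeholders), unloading to each format might look like:

```sql
-- Unload to Parquet files.
COPY INTO @my_stage/parquet/
  FROM my_table
  FILE_FORMAT = (TYPE = PARQUET);

-- Unload to newline delimited JSON; JSON unloads require a single VARIANT column,
-- so the rows are first wrapped with OBJECT_CONSTRUCT.
COPY INTO @my_stage/json/
  FROM (SELECT OBJECT_CONSTRUCT(*) FROM my_table)
  FILE_FORMAT = (TYPE = JSON);
```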
Questions 56
Options:
A.
1 hour
B.
2 hours
C.
4 hours
D.
8 hours
Answer:
Explanation:
Questions 57
Options:
A.
File URL
B.
Scoped URL
C.
Pre-signed URL
D.
Answer:
C
Explanation:
A Pre-signed URL provides access to files stored in Snowflake without requiring authorization at
the time of access. This feature allows users to generate a URL with a limited validity period that
grants temporary access to a file in a secure manner. It's particularly useful for sharing data with
external parties or applications without the need for them to authenticate directly with
Snowflake.
Questions 58
Which of the following statements apply to Snowflake in terms of security? (Choose two.)
Options:
A.
B.
C.
D.
Snowflake can run within a user's own Virtual Private Cloud (VPC).
E.
Answer:
A, C
Explanation:
Snowflake uses a Role-Based Access Control (RBAC) model to manage access to data and
resources. Additionally, Snowflake ensures that all data is encrypted, both at rest and in transit, to
provide a high level of security for data stored within the platform. References: [COF-C02]
SnowPro Core Certification Exam Study Guide
Questions 59
Which services does the Snowflake Cloud Services layer manage? (Choose two.)
Options:
A.
Compute resources
B.
Query execution
C.
Authentication
D.
Data storage
E.
Metadata
Answer:
C, E
Explanation:
The Snowflake Cloud Services layer manages various services, including authentication and
metadata management. This layer ties together all the different components of Snowflake to
process user requests, manage sessions, and control access.
Questions 60
Which Snowflake SQL statement would be used to determine which users and roles have access
to a role called MY_ROLE?
Options:
A.
B.
C.
D.
Answer:
Explanation:
The SQL statement SHOW GRANTS TO ROLE MY_ROLE is used to determine which users and
roles have access to a role called MY_ROLE. This statement lists all the privileges granted to the
role, including which roles and users can assume MY_ROLE. References: [COF-C02] SnowPro
Core Certification Exam Study Guide
Questions 61
Options:
A.
VIEWS_HISTORY
B.
OBJECT_HISTORY
C.
ACCESS_HISTORY
D.
LOGIN_HISTORY
Answer:
Explanation:
Questions 62
Options:
A.
B.
C.
To validate a file for errors before it gets executed using a COPY command
D.
To return errors from the last executed COPY command into table t1 in the current session
Answer:
D
Explanation:
The SQL command Select * from table(validate(t1, job_id => '_last')); is used to return errors from
the last executed COPY command into table t1 in the current session. It checks the results of the
most recent data load operation and provides details on any errors that occurred during that
process.
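Using the exact form from the question:

```sql
-- Return the rows that errored in the most recent COPY INTO t1
-- executed in the current session.
SELECT * FROM TABLE(VALIDATE(t1, JOB_ID => '_last'));
```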
Questions 63
Options:
A.
ORGADMIN
B.
SECURITYADMIN
C.
SHAREADMIN
D.
ACCOUNTADMIN
Answer:
D
Explanation:
By default, the Snowflake role required to create a share is ACCOUNTADMIN (D). This role has
the necessary privileges to perform administrative tasks, including creating shares for data sharing
purposes.
Questions 64
Why does Snowflake recommend file sizes of 100-250 MB compressed when loading data?
Options:
A.
Optimizes the virtual warehouse size and multi-cluster setting to economy mode
B.
C.
Increases the latency staging and accuracy when loading the data
D.
Answer:
Explanation:
Snowflake recommends file sizes between 100-250 MB compressed when loading data to
optimize parallel processing. Smaller, compressed files can be loaded in parallel, which maximizes
the efficiency of the virtual warehouses and speeds up the data loading process.
Questions 65
Snowflake supports the use of external stages with which cloud platforms? (Choose three.)
Options:
A.
Amazon Web Services
B.
Docker
C.
IBM Cloud
D.
E.
F.
Oracle Cloud
Answer:
A, D, E
Explanation:
Snowflake supports the use of external stages with Amazon Web Services (AWS), Microsoft
Azure Cloud, and Google Cloud Platform (GCP). These platforms allow users to stage data
externally and integrate with Snowflake for data loading operations.
Questions 66
Which columns are part of the result set of the Snowflake LATERAL FLATTEN command?
(Choose two.)
Options:
A.
CONTENT
B.
PATH
C.
BYTE_SIZE
D.
INDEX
E.
DATATYPE
Answer:
B, D
Explanation:
The LATERAL FLATTEN command in Snowflake produces a result set that includes several
columns, among which PATH and INDEX are included. PATH indicates the path to the element
within a data structure that needs to be flattened, and INDEX represents the index of the element
if it is an array.
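A self-contained sketch showing the PATH and INDEX columns:

```sql
-- Flatten a three-element JSON array.
SELECT f.path, f.index, f.value
FROM TABLE(FLATTEN(INPUT => PARSE_JSON('["a","b","c"]'))) f;
-- INDEX is 0, 1, 2 and PATH is [0], [1], [2] for the three elements.
```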
Questions 67
warehouse_size = MEDIUM
min_cluster_count = 1
max_cluster_count = 1
auto_suspend = 60
auto_resume = true;
The image below is a graphical representation of the warehouse utilization across two days.
What action should be taken to address this situation?
Options:
A.
B.
C.
D.
Answer:
C
Explanation:
Questions 68
Options:
A.
Role
B.
Schema
C.
User
D.
Database
E.
Account
F.
Tables
Answer:
C, E
Explanation:
Network policies in Snowflake can be set at the user level and at the account level.
Questions 69
When loading data into Snowflake via Snowpipe what is the compressed file size
recommendation?
Options:
A.
10-50 MB
B.
100-250 MB
C.
300-500 MB
D.
1000-1500 MB
Answer:
B
Explanation:
For loading data into Snowflake via Snowpipe, the recommended compressed file size is between
100-250 MB. This size range is optimal for balancing the performance of parallel processing and
minimizing the overhead associated with handling many small files.
Questions 70
Which minimum Snowflake edition allows for a dedicated metadata store?
Options:
A.
Standard
B.
Enterprise
C.
Business Critical
D.
Answer:
Explanation:
The Enterprise edition of Snowflake allows for a dedicated metadata store, providing additional
features designed for large-scale enterprises.
Questions 71
What impacts the credit consumption of maintaining a materialized view? (Choose two.)
Options:
A.
B.
D.
E.
Answer:
C, D
Explanation:
The credit consumption for maintaining a materialized view is impacted by how often the base
table changes (C) and whether the materialized view has a cluster key defined (D). Changes to the
base table can trigger a refresh of the materialized view, consuming credits. Additionally, having a
cluster key defined can optimize the performance and credit usage during the materialized view’s
maintenance. References: SnowPro Core Certification materialized view credit consumption
Questions 72
Options:
A.
1 day
B.
7 days
C.
14 days
D.
64 days
Answer:
C
Explanation:
Snowpipe retains load history for 14 days. This allows users to view and audit the data that has
been loaded into Snowflake using Snowpipe within this time frame.
Questions 73
Options:
A.
SYSADMIN
B.
ORGADMIN
C.
ACCOUNTADMIN
D.
SECURITYADMIN
Answer:
C
Explanation:
In a Snowflake role hierarchy, the top-level role is ACCOUNTADMIN. This role has the highest
level of privileges and is capable of performing all administrative functions within the Snowflake
account
Questions 74
What types of data listings are available in the Snowflake Data Marketplace? (Choose two.)
Options:
A.
Reader
B.
Consumer
C.
Vendor
D.
Standard
E.
Personalized
Answer:
C, E
Explanation:
In the Snowflake Data Marketplace, the types of data listings available include ‘Vendor’, which
refers to the providers of data, and ‘Personalized’, which indicates customized data offerings
tailored to specific consumer needs45.
Questions 75
A.
B.
C.
D.
Answer:
Explanation:
Shares in Snowflake are named objects that encapsulate all the information required to share
databases, schemas, tables, secure views, and secure UDFs. These objects can be added to a
share by granting privileges on them to the share via a database role
Questions 76
Options:
A.
B.
C.
Answer:
Explanation:
The Snowflake Cloud Services layer is a collection of services that coordinate activities across
Snowflake, tying together all the different components to process user requests, from login to
query dispatch1.
Questions 77
What features that are part of the Continuous Data Protection (CDP) feature set in Snowflake do
not require additional configuration? (Choose two.)
Options:
A.
B.
C.
Data encryption
D.
Time Travel
E.
External tokenization
Answer:
C, D
Explanation:
Data encryption and Time Travel are part of Snowflake’s Continuous Data Protection (CDP)
feature set that do not require additional configuration. Data encryption is automatically applied
to all files stored on internal stages, and Time Travel allows for querying and restoring data
without any extra setup
Questions 78
Options:
A.
A task can be called using a CALL statement to run a set of predefined SQL commands.
B.
A task allows a user to execute a single SQL statement/command using a predefined schedule.
C.
D.
A task can be executed using a SELECT statement to run a predefined SQL command.
Answer:
B
Explanation:
In Snowflake, a task allows a user to execute a single SQL statement/command using a
predefined schedule (B). Tasks are used to automate the execution of SQL statements at
scheduled intervals.
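As a sketch of this behavior (the task, warehouse, and table names below are illustrative, not from the exam text), a task runs exactly one SQL statement on a predefined schedule:

```sql
-- Hypothetical example: run a single INSERT once every hour
CREATE TASK refresh_daily_summary
  WAREHOUSE = my_wh               -- compute used by the task (assumed name)
  SCHEDULE = '60 MINUTE'          -- the predefined schedule
AS
  INSERT INTO daily_summary
    SELECT CURRENT_DATE, COUNT(*) FROM events;

-- Tasks are created suspended and must be resumed to start running
ALTER TASK refresh_daily_summary RESUME;
```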
Questions 79
Options:
A.
B.
To allow users the ability to choose the type of compute nodes that make up a virtual warehouse
cluster
C.
D.
Answer:
Explanation:
Multi-cluster virtual warehouses in Snowflake are designed to manage user and query
concurrency needs. They allow for the allocation of additional clusters of compute resources,
either statically or dynamically, to handle increased loads and reduce or eliminate the queuing of
concurrent queries2.
https://docs.snowflake.com/en/user-guide/warehouses-multicluster.html
Questions 80
When publishing a Snowflake Data Marketplace listing into a remote region what should be taken
into consideration? (Choose two.)
Options:
A.
There is no need to have a Snowflake account in the target region, a share will be created for each
user.
B.
The listing is replicated into all selected regions automatically, the data is not.
C.
The user must have the ORGADMIN role available in at least one account to link accounts for
replication.
D.
Shares attached to listings in remote regions can be viewed from any account in an organization.
E.
For a standard listing the user can wait until the first customer requests the data before
replicating it to the target region.
Answer:
B, C
Explanation:
When publishing a Snowflake Data Marketplace listing into a remote region, it’s important to note
that while the listing is replicated into all selected regions automatically, the data itself is not.
Therefore, the data must be replicated separately. Additionally, the user must have the
ORGADMIN role in at least one account to manage the replication of accounts1.
Questions 81
A.
The Data Consumer pays for data storage as well as for data computing.
B.
The shared data is copied into the Data Consumer account, so the Consumer can modify it
without impacting the base data of the Provider.
C.
D.
The Provider is charged for compute resources used by the Data Consumer to query the shared
data.
E.
The Data Consumer pays only for compute resources to query the shared data.
Answer:
C, E
Explanation:
In Snowflake’s data sharing model, any full Snowflake account can both provide and consume
shared data. Additionally, the data consumer pays only for the compute resources used to query
the shared data. No actual data is copied or transferred between accounts, and shared data does
not take up any storage in a consumer account, so the consumer does not pay for data storage1.
Questions 82
What is the minimum Snowflake edition required to use Dynamic Data Masking?
Options:
A.
Standard
B.
Enterprise
C.
Business Critical
D.
Answer:
B
Explanation:
The minimum Snowflake edition required to use Dynamic Data Masking is the Enterprise
edition. This feature is not available in the Standard edition2.
Questions 83
Which layer of Snowflake's architecture is responsible for generating the query execution plan?
Options:
A.
Compute
B.
Data storage
C.
Cloud services
D.
Cloud provider
Answer:
C
Explanation:
In Snowflake’s architecture, the Cloud Services layer is responsible for generating the query
execution plan. This layer handles all the coordination, optimization, and management tasks,
including query parsing, optimization, and compilation into an execution plan that can be
processed by the Compute layer.
Questions 84
What COPY INTO SQL command should be used to unload data into multiple files?
Options:
A.
SINGLE=TRUE
B.
MULTIPLE=TRUE
C.
MULTIPLE=FALSE
D.
SINGLE=FALSE
Answer:
D
Explanation:
The COPY INTO SQL command with the option SINGLE=FALSE is used to unload data into
multiple files. This option allows the data to be split into multiple files during the unload
process. References: SnowPro Core Certification COPY INTO SQL command unload multiple files
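A sketch of the option in context (stage, table, and size values are illustrative):

```sql
-- Hypothetical example: unload a table to a stage as multiple compressed files
COPY INTO @my_stage/unload/
  FROM my_table
  FILE_FORMAT = (TYPE = CSV)
  SINGLE = FALSE               -- the default; output is split across files
  MAX_FILE_SIZE = 52428800;    -- optional per-file size cap, in bytes
```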
Questions 85
Which Snowflake feature allows a user to substitute a randomly generated identifier for sensitive
data, in order to prevent unauthorized users access to the data, before loading it into Snowflake?
Options:
A.
External Tokenization
B.
External Tables
C.
Materialized Views
D.
Answer:
A
Explanation:
The feature in Snowflake that allows a user to substitute a randomly generated identifier for
sensitive data before loading it into Snowflake is known as External Tokenization. This process
helps to secure sensitive data by ensuring that it is not exposed in its original form, thus
preventing unauthorized access3.
Questions 86
How should a virtual warehouse be configured if a user wants to ensure that additional multi-
clusters are resumed with no delay?
Options:
A.
B.
C.
D.
Answer:
Explanation:
To ensure that additional multi-clusters are resumed with no delay, a virtual warehouse should be
configured to a size larger than generally required. This configuration allows for immediate
availability of additional resources when needed, without waiting for new clusters to start up
Questions 87
A table needs to be loaded. The input data is in JSON format and is a concatenation of multiple
JSON documents. The file size is 3 GB. A Small warehouse is being used. The following COPY
INTO command was executed, and it failed with this error:
Max LOB size (16777216) exceeded, actual size of parsed column is 17894470.
What should be done to load the data?
A.
B.
Split the file into multiple files in the recommended size range (100 MB - 250 MB).
C.
D.
Answer:
B
Explanation:
The error “Max LOB size (16777216) exceeded” indicates that the size of the parsed column
exceeds the maximum size allowed for a single column value in Snowflake, which is 16 MB. To
resolve this issue, the file should be split into multiple smaller files that are within the
recommended size range of 100 MB to 250 MB. This will ensure that each JSON document
within the files is smaller than the maximum LOB size allowed. Compressing the file, using a
larger-sized warehouse, or setting STRIP_OUTER_ARRAY=TRUE will not resolve the issue of the
column size exceeding the maximum allowed. References: COPY INTO Error during Structured
Data Load: “Max LOB size (16777216) exceeded…”
Questions 88
Which statements are correct concerning the leveraging of third-party data from the Snowflake
Data Marketplace? (Choose two.)
Options:
A.
Data is live, ready-to-query, and can be personalized.
B.
C.
D.
E.
Data transformations are required when combining Data Marketplace datasets with existing data
in Snowflake.
Answer:
A, D
Explanation:
When leveraging third-party data from the Snowflake Data Marketplace, the data is live, ready-
to-query, and can be personalized. Additionally, the data is available without the need for copying
or moving it to an individual Snowflake account, allowing for seamless integration with existing
data
Questions 89
Which Snowflake database object can be used to track data changes made to table data?
Options:
A.
Tag
B.
Task
C.
Stream
D.
Stored procedure
Answer:
C
Explanation:
A Stream object in Snowflake is used for change data capture (CDC), which records data
manipulation language (DML) changes made to tables, including inserts, updates, and deletes3.
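A sketch of change data capture with a stream (the table and stream names are illustrative):

```sql
-- Hypothetical example: track DML changes on a source table
CREATE STREAM orders_stream ON TABLE orders;

-- Query pending changes; METADATA$ACTION shows INSERT or DELETE,
-- and METADATA$ISUPDATE flags rows that belong to an UPDATE
SELECT * FROM orders_stream;

-- Consuming the stream in a DML statement advances its offset
INSERT INTO orders_audit
  SELECT * FROM orders_stream;
```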
Questions 90
When working with a managed access schema, who has the OWNERSHIP privilege of any tables
added to the schema?
Options:
A.
B.
C.
D.
Answer:
Explanation:
In a managed access schema, the schema owner retains the OWNERSHIP privilege of any tables
added to the schema. This means that while object owners have certain privileges over the
objects they create, only the schema owner can manage privilege grants on these objects1.
Questions 91
Which Snowflake object can be used with Secure Data Sharing?
Options:
A.
View
B.
Materialized view
C.
External table
D.
Answer:
A
Explanation:
Views can be used with Secure Data Sharing in Snowflake. Materialized views, external tables,
and UDFs are not typically shared directly for security and performance reasons2.
Questions 92
Which Snowflake view is used to support compliance auditing?
Options:
A.
ACCESS_HISTORY
B.
COPY_HISTORY
C.
QUERY_HISTORY
D.
Answer:
A
Explanation:
The ACCESS_HISTORY view in the ACCOUNT_USAGE schema records read and write operations on
objects and columns, which supports compliance auditing.
Questions 93
Options:
A.
SQL functions like the GET command can be used to copy the unstructured data to a location on
the client.
B.
SQL functions can be used to create different types of URLs pointing to the unstructured data.
These URLs can be used to download the data to a client.
C.
SQL functions can be used to retrieve the data from the query results cache. When the query
results are output to a client, the unstructured data will be output to the client as files.
D.
SQL functions can call on different web extensions designed to display different types of files as a
web page. The web extensions will allow the files to be downloaded to the client.
Answer:
B
Questions 94
What does a Notify & Suspend action for a resource monitor do?
Options:
A.
Send an alert notification to all account users who have notifications enabled.
B.
Send an alert notification to all virtual warehouse users when thresholds over 100% have been
met.
C.
Send a notification to all account administrators who have notifications enabled, and suspend all
assigned warehouses after all statements being executed by the warehouses have completed.
D.
Send a notification to all account administrators who have notifications enabled, and suspend all
assigned warehouses immediately, canceling any statements being executed by the warehouses.
Answer:
C
Explanation:
The Notify & Suspend action for a resource monitor in Snowflake sends a notification to all
account administrators who have notifications enabled and suspends all assigned
warehouses. However, the suspension only occurs after all currently running statements in the
warehouses have been completed1. References: [COF-C02] SnowPro Core Certification Exam
Study Guide
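A sketch of a resource monitor with this action (the monitor name, quota, and threshold are illustrative):

```sql
-- Hypothetical example: notify administrators and suspend gracefully
CREATE RESOURCE MONITOR monthly_quota
  WITH CREDIT_QUOTA = 100
  FREQUENCY = MONTHLY
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS ON 90 PERCENT DO SUSPEND;  -- Notify & Suspend: waits for running
                                      -- statements to finish first
-- (DO SUSPEND_IMMEDIATE would instead cancel running statements)

ALTER WAREHOUSE my_wh SET RESOURCE_MONITOR = monthly_quota;
```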
Questions 95
Which views are included in the DATA SHARING USAGE schema? (Select TWO).
Options:
A.
ACCESS_HISTORY
B.
DATA_TRANSFER_HISTORY
C.
WAREHOUSE_METERING_HISTORY
D.
MONETIZED_USAGE_DAILY
E.
LISTING_TELEMETRY_DAILY
Answer:
D, E
Explanation:
The DATA_SHARING_USAGE schema includes views that display information about listings
published in the Snowflake Marketplace or a data exchange, including
MONETIZED_USAGE_DAILY and LISTING_TELEMETRY_DAILY2.
Questions 96
Options:
A.
B.
C.
D.
E.
Warehouses can only be used for querying and cannot be used for data loading.
Answer:
B, C
Explanation:
Virtual warehouses in Snowflake can be started and stopped at any time, providing flexibility in
managing compute resources. They can also be resized at any time, even while running, to
accommodate varying workloads910. References: [COF-C02] SnowPro Core Certification Exam
Study Guide
Questions 97
Which Snowflake command can be used to unload the result of a query to a single file?
Options:
A.
B.
C.
Use COPY INTO with SINGLE = TRUE followed by a GET command to download the file.
D.
Use COPY INTO with SINGLE = TRUE followed by a PUT command to download the file.
Answer:
C
Explanation:
The Snowflake command to unload the result of a query to a single file is COPY INTO with
SINGLE = TRUE followed by a GET command to download the file. This command unloads the
query result into a single file in the specified internal stage
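A sketch of the unload-then-download sequence (stage, file, and query are illustrative):

```sql
-- Hypothetical example: unload a query result to a single file on a stage
COPY INTO @my_stage/result.csv.gz
  FROM (SELECT region, SUM(amount) FROM sales GROUP BY region)
  FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP)
  SINGLE = TRUE;

-- GET runs from a client such as SnowSQL (not from the web UI)
-- to download the staged file to the local machine
GET @my_stage/result.csv.gz file:///tmp/;
```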
Questions 98
Options:
A.
B.
C.
Answer:
Explanation:
SnowCD is a connectivity diagnostic tool that helps users troubleshoot network connections to
Snowflake. It performs a series of checks to evaluate the network connection and provides
suggestions for resolving any issues4.
Questions 99
How can performance be optimized for a query that returns a small amount of data from a very
large base table?
Options:
A.
B.
C.
D.
Answer:
C
Explanation:
The search optimization service in Snowflake is designed to improve the performance of selective
point lookup queries on large tables, which is ideal for scenarios where a query returns a small
amount of data from a very large base table1. References: [COF-C02] SnowPro Core Certification
Exam Study Guide
Questions 100
Which statistics are displayed in a Query Profile that indicate that intermediate results do not fit
in memory? (Select TWO).
Options:
A.
Bytes scanned
B.
Partitions scanned
C.
Bytes spilled to local storage
D.
Bytes spilled to remote storage
E.
Answer:
C, D
Explanation:
The Query Profile statistics that indicate intermediate results do not fit in memory are the bytes
spilled to local storage and bytes spilled to remote storage2.
Questions 101
Which Snowflake feature allows a user to track sensitive data for compliance, discovery,
protection, and resource usage?
Options:
A.
Tags
B.
Comments
C.
Internal tokenization
D.
Answer:
A
Explanation:
Tags in Snowflake allow users to track sensitive data for compliance, discovery, protection, and
resource usage. They enable the categorization and tracking of data, supporting compliance with
privacy regulations678. References: [COF-C02] SnowPro Core Certification Exam Study Guide
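A sketch of tagging a sensitive column (the tag, values, table, and column are illustrative):

```sql
-- Hypothetical example: define a tag with a constrained set of values
CREATE TAG data_sensitivity ALLOWED_VALUES 'public', 'pii';

-- Attach the tag to a column holding sensitive data
ALTER TABLE customers MODIFY COLUMN email
  SET TAG data_sensitivity = 'pii';

-- Tag references can then be audited via the ACCOUNT_USAGE views
```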
Questions 102
What will occur when a user with a password that was defined by Snowflake attempts to log in to
Snowflake?
Options:
A.
The user will be unable to enter a password.
B.
The user will encounter an error, and will not be able to log in.
C.
D.
After entering the username and password, the user will be redirected to an Identity Provider
(IdP) login page.
Answer:
Explanation:
What value provides information about disk usage for operations where intermediate results do
not fit in memory in a Query Profile?
A. IO
B. Network
C. Pruning
D. Spilling
Answer: D
In Snowflake, when a query execution requires more memory than what is available, Snowflake
handles these situations by spilling the intermediate results to disk. This process is known as
"spilling." The Query Profile in Snowflake includes a metric that helps users identify when and
how much data spilling occurs during the execution of a query. This information is crucial for
optimizing queries as excessive spilling can significantly slow down query performance. The value
that provides this information about disk usage due to intermediate results not fitting in memory
is appropriately labeled as "Spilling" in the Query Profile.
References:
• Snowflake Documentation on Query Profile and Performance: This section explains the
various components of the query profile, including the spilling metric, which indicates disk
usage for operations where intermediate results exceed available memory.
Questions 103
What metadata does Snowflake store for rows in micro-partitions? (Select TWO).
Options:
A.
Range of values
B.
Distinct values
C.
Index values
D.
Sorted values
E.
Null values
Answer:
A, B
Explanation:
Snowflake stores metadata for rows in micro-partitions, including the range of values for each
column and the number of distinct values1.
Questions 104
Which metadata table will store the storage utilization information even for dropped tables?
Options:
A.
DATABASE_STORAGE_USAGE_HISTORY
B.
TABLE_STORAGE_METRICS
C.
STORAGE_DAILY_HISTORY
D.
Answer:
B
Explanation:
The TABLE_STORAGE_METRICS view reports storage utilization at the table level, including tables
that have been dropped but are still incurring storage costs during their Time Travel retention
and Fail-safe periods.
Questions 105
Which function is used to convert a JSON null value into a SQL NULL value?
Options:
A.
TO_CHAR
B.
TO_VARIANT
C.
TO_VARCHAR
D.
STRIP_NULL_VALUE
Answer:
D
Explanation:
The STRIP_NULL_VALUE function in Snowflake is used to convert a JSON null value into a SQL
NULL value1.
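A sketch of the difference (the JSON literal is illustrative):

```sql
-- Hypothetical example: a JSON null stays a VARIANT null until stripped
SELECT
  PARSE_JSON('{"a": null}'):a                     AS json_null,  -- VARIANT null
  STRIP_NULL_VALUE(PARSE_JSON('{"a": null}'):a)   AS sql_null;   -- SQL NULL
```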
Questions 106
Which type of loop in Snowflake Scripting has no built-in termination condition and requires a BREAK statement to stop executing?
Options:
A.
FOR
B.
LOOP
C.
REPEAT
D.
WHILE
Answer:
B
Explanation:
The LOOP type of loop in Snowflake Scripting does not have a built-in termination condition and
requires a BREAK statement to stop executing4.
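A minimal Snowflake Scripting sketch (the counter logic is illustrative):

```sql
-- Hypothetical example: LOOP has no exit condition of its own,
-- so BREAK is required to terminate it
DECLARE
  counter INTEGER DEFAULT 0;
BEGIN
  LOOP
    counter := counter + 1;
    IF (counter >= 5) THEN
      BREAK;        -- without this, the loop would run forever
    END IF;
  END LOOP;
  RETURN counter;
END;
```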
Questions 107
Which commands are restricted in owner's rights stored procedures? (Select TWO).
Options:
A.
SHOW
B.
MERGE
C.
INSERT
D.
DELETE
E.
DESCRIBE
Answer:
A, E
Explanation:
In owner’s rights stored procedures, certain commands are restricted to maintain security and
integrity. The SHOW and DESCRIBE commands are limited because they can reveal metadata and
structure information that may not be intended for all roles.
Questions 108
Options:
A.
B.
C.
D.
E.
Answer:
A, C
Explanation:
The factors that impact storage costs in Snowflake include the account type (Capacity or On
Demand) and the cloud region used by the account. These factors determine the rate at which
storage is billed, with different regions potentially having different rates3.
Questions 109
A permanent table and temporary table have the same name, TBL1, in a schema.
What will happen if a user executes select * from TBL1 ;?
Options:
A.
The temporary table will take precedence over the permanent table.
B.
The permanent table will take precedence over the temporary table.
C.
An error will say there cannot be two tables with the same name in a schema.
D.
The table that was created most recently will take precedence over the older table.
Answer:
A
Explanation:
In Snowflake, if a temporary table and a permanent table have the same name within the same
schema, the temporary table takes precedence over the permanent table within the session
where the temporary table was created4.
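A sketch of the name-shadowing behavior (table and column names are illustrative):

```sql
-- Hypothetical example: a temporary table may share a name with a
-- permanent table in the same schema
CREATE TABLE tbl1 (c INT);             -- permanent
CREATE TEMPORARY TABLE tbl1 (c INT);   -- session-scoped, same name allowed

-- Within this session, the temporary table takes precedence
SELECT * FROM tbl1;
```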
Questions 110
What are the components of a masking policy in Snowflake?
Options:
A.
A single data type, with one or more conditions, and one or more masking functions
B.
A single data type, with only one condition, and only one masking function
C.
Multiple data types, with only one condition, and one or more masking functions
D.
Multiple data types, with one or more conditions, and one or more masking functions
Answer:
A
Explanation:
A masking policy in Snowflake consists of a single data type, with one or more conditions, and
one or more masking functions. These components define how the data is masked based on the
specified conditions3.
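A sketch of the structure (the policy name, role, and masking value are illustrative):

```sql
-- Hypothetical example: one data type (STRING), with conditions and
-- masking functions expressed as CASE branches
CREATE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('ANALYST_FULL') THEN val   -- condition: see raw value
    ELSE '*** MASKED ***'                              -- masking function
  END;

ALTER TABLE customers MODIFY COLUMN email
  SET MASKING POLICY email_mask;
```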
Questions 111
Options:
A.
B.
C.
D.
Answer:
A
Explanation:
To reduce data spilling in Snowflake, using a larger virtual warehouse is effective because it
provides more memory and local disk space, which can accommodate larger data operations and
minimize the need to spill data to disk or remote storage1. References: [COF-C02] SnowPro Core
Certification Exam Study Guide
Questions 112
What objects in Snowflake are supported by Dynamic Data Masking? (Select TWO)
Options:
A.
Views
B.
Materialized views
C.
Tables
D.
External tables
E.
Future grants
Answer:
A, C
Explanation:
Dynamic Data Masking in Snowflake supports tables and views. These objects can have masking
policies applied to their columns to dynamically mask data at query time3.
Questions 113
Which Snowflake data types can be used to build nested hierarchical data? (Select TWO)
Options:
A.
INTEGER
B.
OBJECT
C.
VARIANT
D.
VARCHAR
E.
LIST
Answer:
B, C
Explanation:
The Snowflake data types that can be used to build nested hierarchical data are OBJECT and
VARIANT. These data types support the storage and querying of semi-structured data, allowing
for the creation of complex, nested data structures
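A sketch of building nested hierarchical data (the keys and values are illustrative):

```sql
-- Hypothetical example: OBJECT and ARRAY constructors produce nested
-- semi-structured data that can be stored in a VARIANT column
SELECT OBJECT_CONSTRUCT(
         'employee', OBJECT_CONSTRUCT(
           'name',   'Ana',
           'skills', ARRAY_CONSTRUCT('SQL', 'Python')
         )
       ) AS v;
```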
Questions 114
A column named "Data" contains VARIANT data and stores values as follows:
How will Snowflake extract the employee's name from the column data?
Options:
A.
Data:employee.name
B.
DATA:employee.name
C.
data:Employee.name
D.
data:employee.name
Answer:
D
Explanation:
In Snowflake, to extract a specific value from a VARIANT column, you use the column name
followed by a colon and then the key. The keys are case-sensitive. Therefore, to extract the
employee’s name from the “Data” column, the correct syntax is data:employee.name.
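A sketch of the path syntax (the table name is illustrative; the keys after the colon are case-sensitive):

```sql
-- Hypothetical example: extract a nested value from a VARIANT column
SELECT data:employee.name            AS name_variant,
       data:employee.name::STRING    AS name_string   -- cast drops the quotes
FROM my_table;
```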
Questions 115
Which function returns a string containing the JSON representation of a VARIANT value, suitable for unloading relational data to JSON format?
Options:
A.
TO_OBJECT
B.
TO_JSON
C.
TO_VARIANT
D.
OBJECT_CONSTRUCT
Answer:
B
Explanation:
The TO_JSON function is used to convert a VARIANT value into a string containing the JSON
representation of the value. This function is suitable for unloading data from a relational table to
JSON format. References: [COF-C02] SnowPro Core Certification Exam Study Guide
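A sketch of both directions (the object contents are illustrative):

```sql
-- Hypothetical example: serialize a VARIANT to a JSON string
SELECT TO_JSON(OBJECT_CONSTRUCT('id', 1, 'name', 'Ana'));

-- And the reverse: parse a JSON string into a VARIANT
SELECT PARSE_JSON('{"id": 1, "name": "Ana"}');
```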
Questions 116
In a regular (non-managed) schema, which role can grant object privileges on objects in the schema?
Options:
A.
Object owner
B.
Schema owner
C.
Database owner
D.
SYSADMIN
Answer:
A
Explanation:
In a regular schema within Snowflake, the object owner has the privilege to grant object
privileges. The object owner is typically the role that created the object or to whom the
ownership of the object has been transferred78.
Questions 117
Which VALIDATION_MODE value will return the errors across the files specified in a COPY
command, including files that were partially loaded during an earlier load?
Options:
A.
RETURN_-1_ROWS
B.
RETURN_n_ROWS
C.
RETURN_ERRORS
D.
RETURN_ALL_ERRORS
Answer:
D
Explanation:
The RETURN_ALL_ERRORS value in the VALIDATION_MODE option of the COPY command instructs
Snowflake to validate the data files and return all errors encountered across the specified files,
including files with errors that were partially loaded during an earlier load2. References: [COF-C02]
SnowPro Core Certification Exam Study Guide
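A sketch of validation in context (stage, table, and format are illustrative):

```sql
-- Hypothetical example: validate staged files without loading any rows
COPY INTO my_table
  FROM @my_stage
  FILE_FORMAT = (TYPE = CSV)
  VALIDATION_MODE = RETURN_ERRORS;   -- dry run; reports errors across files
-- RETURN_ALL_ERRORS additionally reports errors from files that were
-- partially loaded during an earlier load
```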
Questions 118
Which data types can be used in Snowflake to store semi-structured data? (Select TWO)
Options:
A.
ARRAY
B.
BLOB
C.
CLOB
D.
JSON
E.
VARIANT
Answer:
A, E
Explanation:
Snowflake supports the storage of semi-structured data using the ARRAY and VARIANT data
types. The ARRAY data type can directly contain VARIANT, and thus indirectly contain any other
data type, including itself. The VARIANT data type can store a value of any other type,
including OBJECT and ARRAY, and is often used to represent semi-structured data formats like
JSON, Avro, ORC, Parquet, or XML34.
Questions 119
In the Data Exchange, who can get or request data from the listings? (Select TWO).
Options:
A.
Users with the ACCOUNTADMIN role
B.
C.
D.
Users with the IMPORT SHARE privilege
E.
Answer:
A, D
Explanation:
In the Snowflake Data Exchange, the ability to get or request data from listings is generally
controlled by specific roles and privileges:
• A. Users with ACCOUNTADMIN role: This role typically has the highest level of access
within a Snowflake account, including the ability to manage and access all features and
functions. Users with this role can access data listings within the Data Exchange.
• D. Users with import share privilege: This specific privilege is necessary for users who
need to import shared data from the Data Exchange. This privilege allows them to request
and access data listings explicitly shared with them.
Questions 120
Which solution improves the performance of point lookup queries that return a small number of
rows from large tables using highly selective filters?
Options:
A.
Automatic clustering
B.
Materialized views
C.
Search optimization service
D.
Answer:
C
Explanation:
The search optimization service improves the performance of point lookup queries on large tables
by using selective filters to quickly return a small number of rows. It creates an optimized data
structure that helps in pruning the micro-partitions that do not contain the queried values3.
References: [COF-C02] SnowPro Core Certification Exam Study Guide
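A sketch of enabling the service (the table name and filter value are illustrative):

```sql
-- Hypothetical example: enable the search optimization service on a table
ALTER TABLE big_events ADD SEARCH OPTIMIZATION;

-- Highly selective point lookups can then prune micro-partitions
-- that do not contain the queried value
SELECT * FROM big_events WHERE event_id = 'abc123';
```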
Questions 121
How does Snowflake handle the data retention period for a table if a stream has not been
consumed?
Options:
A.
B.
C.
D.
Answer:
Explanation:
In Snowflake, the use of streams impacts how the data retention period for a table is handled,
particularly in scenarios where the stream has not been consumed. The key point to understand is
that Snowflake's streams are designed to capture data manipulation language (DML) changes such
as INSERTS, UPDATES, and DELETES that occur on a source table. Streams maintain a record of
these changes until they are consumed by a DML operation or a COPY command that references
the stream.
When a stream is created on a table and remains unconsumed, Snowflake extends the data
retention period of the table to ensure that the changes captured by the stream are preserved.
This extension is specifically up to the point in time represented by the stream's offset, which
effectively ensures that the data necessary for consuming the stream's contents is retained. This
mechanism is in place to prevent data loss and ensure the integrity of the stream's data,
facilitating accurate and reliable data processing and analysis based on the captured DML
changes.
This behavior emphasizes the importance of managing streams and their consumption
appropriately to balance between data retention needs and storage costs. It's also crucial to
understand how this temporary extension of the data retention period impacts the overall
management of data within Snowflake, including aspects related to data lifecycle, storage cost
implications, and the planning of data consumption strategies.
Questions 122
What is a non-configurable feature that provides historical data that Snowflake may recover
during a 7-day period?
Options:
A.
Fail-safe
B.
Time Travel
C.
Cloning
D.
Account replication
Answer:
A
Explanation:
Fail-safe is a non-configurable feature that provides a 7-day period during which historical data
may be recoverable by Snowflake after the Time Travel retention period ends.
Questions 123
Options:
A.
B.
Snowflake tables are the physical instantiation of data loaded into Snowflake.
C.
D.
Answer:
Explanation:
In Snowflake, tables represent a logical structure through which users interact with the stored
data. The actual physical data is stored in micro-partitions managed by Snowflake, and the logical
table structure provides the means by which SQL operations are mapped to this data. This
architecture allows Snowflake to optimize storage and querying across its distributed, cloud-
based data storage system.References: Snowflake Documentation on Tables
Questions 124
Which Snowflake table type is only visible to the user who creates it, can have the same name as
permanent tables in the same schema, and is dropped at the end of the session?
Options:
A.
Temporary
B.
Local
C.
User
D.
Transient
Answer:
A
Explanation:
In Snowflake, a Temporary table is a type of table that is only visible to the user who creates it,
can have the same name as permanent tables in the same schema, and is automatically dropped at
the end of the session in which it was created. Temporary tables are designed for transient data
processing needs, where data is needed for the duration of a specific task or session but not
beyond. Since they are automatically cleaned up at the end of the session, they help manage
storage usage efficiently and ensure that sensitive data is not inadvertently persisted.
Questions 125
How does Snowflake define its approach to Discretionary Access Control (DAC)?
Options:
A.
B.
C.
Each object has an owner, who can in turn grant access to that object.
D.
Access privileges are assigned to roles, which are in turn assigned to users.
Answer:
C
Explanation:
Snowflake defines Discretionary Access Control (DAC) as a model in which each securable object
has an owner, who can in turn grant access to that object. Snowflake combines this with
Role-Based Access Control (RBAC), in which access privileges are assigned to roles, which are in
turn assigned to users. Together these models allow granular, scalable control over database
access.References: Snowflake Documentation on Access Control
Questions 126
A Snowflake user wants to optimize performance for a query that queries only a small number of
rows in a table. The rows require significant processing. The data in the table does not change
frequently.
A.
B.
C.
D.
Answer:
Explanation:
In a scenario where a Snowflake user queries only a small number of rows that require significant
processing and the data in the table does not change frequently, the most effective way to
optimize performance is by creating a materialized view based on the query. Materialized views
store the result of the query and can significantly reduce the computation time for queries that
are executed frequently over unchanged data.
• Why Materialized Views: Materialized views precompute and store the result of the query.
This is especially beneficial for queries that require heavy processing. Since the data does
not change frequently, the materialized view will not need to be refreshed often, making it
an ideal solution for this use case.
• Implementation Steps:
CREATE MATERIALIZED VIEW my_materialized_view AS SELECT ... FROM my_table WHERE ...;
Reference: For more information on materialized views and how they can be used to optimize
query performance, refer to the Snowflake documentation on materialized views:
https://docs.snowflake.com/en/user-guide/views-materialized.html
Questions 127
Which chart type is supported in Snowsight for Snowflake users to visualize data with
dashboards?
Options:
A.
Area chart
B.
Box plot
C.
Heat grid
D.
Pie chart
Answer:
A
Explanation:
Snowsight, Snowflake's user interface for exploring, analyzing, and visualizing data, supports a
variety of chart types for creating dashboards and visualizations. One of the supported chart
types in Snowsight is the Area Chart (A). Area charts are useful for representing quantities over
time and can be used to highlight volume change and rate of change, as well as to compare
multiple quantities.
While Snowsight supports many types of visualizations to help users analyze their data
effectively, including line charts, bar charts, and scatter plots, it is important to consult the
current Snowflake documentation or release notes for the most up-to-date list of supported chart
types, as Snowflake continues to enhance and update Snowsight's capabilities.
As of the last update, Box plots (B), Heat grids (C), and Pie charts (D) are types of visualizations
that may be supported in various analytics and visualization tools, but for the specific context of
Snowsight's currently confirmed features, Area charts are a verified option for users to visualize
their data.
Questions 128
Options:
A.
B.
C.
D.
Answer:
Explanation:
In Snowflake, a user within a reader account primarily has read-only access to the shared data.
Reader accounts are created to enable third parties or separate business units to access and query
data shared with them without allowing them to modify the underlying data. This means a user
with a reader account can perform operations like querying shared data to analyze or report on it
but cannot load new data, update existing data, or create new shares. This setup is crucial for
maintaining data governance and security while enabling data sharing and
collaboration.References: Snowflake Documentation on Reader Accounts
Questions 129
Options:
A.
User
B.
Role
C.
Attribute
D.
Schema
Answer:
B
Explanation:
In Snowflake, permissions for accessing database objects, including tables, are not granted directly
to users but rather to roles. A role encapsulates a collection of privileges on various Snowflake
objects. Users are then granted roles, and through those roles, they inherit the permissions
necessary to read a table or perform other actions. This approach adheres to the principle of least
privilege, allowing for granular control over database access and simplifying the management of
user permissions.
Questions 130
What does Snowflake recommend for a user assigned the ACCOUNTADMIN role?
Options:
A.
B.
C.
D.
There should be just one user with the ACCOUNTADMIN role in each Snowflake account.
Answer:
Explanation:
For users assigned the ACCOUNTADMIN role, Snowflake recommends enforcing Multi-Factor
Authentication (MFA) to enhance security. The ACCOUNTADMIN role has extensive
permissions, making it crucial to secure accounts held by such users against unauthorized access.
MFA adds an additional layer of security by requiring a second form of verification beyond just
the username and password, significantly reducing the risk of account compromise.References:
Snowflake Security Best Practices
Questions 131
What does Snowflake attempt to do if any of the compute resources for a virtual warehouse fail
to provision during start-up?
Options:
A.
Repair the failed resources.
B.
C.
D.
Answer:
A
Explanation:
If any of the compute resources for a virtual warehouse fail to provision during start-up,
Snowflake attempts to repair the failed resources. The system is designed to automatically handle
transient issues that might occur during the provisioning of compute resources, ensuring that the
virtual warehouse gains the compute capacity needed to handle the user's workloads without
manual intervention. References: Snowflake Documentation on Virtual Warehouses
Questions 132
When unloading data with the COPY into command, what is the purpose of the PARTITION
BY parameter option?
Options:
A.
B.
To delimit the records in the output file using the specified expression.
C.
To include a new column in the output using the specified window function expression.
D.
To split the output into multiple files, one for each distinct value of the specified expression.
Answer:
D
Explanation:
The PARTITION BY <expression> parameter option in the COPY INTO <location> command is
used to split the output into multiple files based on the distinct values of the specified expression.
This feature is particularly useful for organizing large datasets into smaller, more manageable files
and can help with optimizing downstream processing or consumption of the data. For example, if
you are unloading a large dataset of transactions and use PARTITION BY
DATE(transactions.transaction_date), Snowflake generates a separate output file for each unique
transaction date, facilitating easier data management and access.
This approach to data unloading can significantly improve efficiency when dealing with large
volumes of data by enabling parallel processing and simplifying data retrieval based on specific
criteria or dimensions.
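As a sketch of the pattern described above (the stage name, table name, and column names are hypothetical), a partitioned unload might look like this:

```sql
-- Unload one set of files per distinct transaction date.
-- @my_stage and the transactions table are illustrative names.
COPY INTO @my_stage/unload/
  FROM transactions
  PARTITION BY ('date=' || TO_VARCHAR(transaction_date, 'YYYY-MM-DD'))
  FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP);
```

Each distinct value of the PARTITION BY expression becomes a separate path prefix in the output, which simplifies downstream, partition-aware processing.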
References:
Questions 133
A user needs to MINIMIZE the cost of large tables that are used to store transitory data. The data
does not need to be protected against failures, because the data can be reconstructed outside of
Snowflake.
Options:
A.
Permanent
B.
Transient
C.
Temporary
D.
External
Answer:
B
Explanation:
For minimizing the cost of large tables that are used to store transitory data, which does not need
to be protected against failures because it can be reconstructed outside of Snowflake, the best
table type to use is Transient. Transient tables in Snowflake are designed for temporary or
transitory data storage and offer reduced storage costs compared to permanent tables. However,
unlike temporary tables, they persist across sessions until explicitly dropped.
• Why Transient Tables: Transient tables provide a cost-effective solution for storing data
that is temporary but needs to be available longer than a single session. They have lower
data storage costs because Snowflake does not maintain historical data (Time Travel) for as
long as it does for permanent tables.
• Creating a Transient Table:
• Use Case Considerations: Transient tables are ideal for scenarios where the data is not
critical, can be easily recreated, and where cost optimization is a priority. They are suitable
for development, testing, or staging environments where data longevity is not a concern.
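The "Creating a Transient Table" step above can be sketched as follows (the table and column names are hypothetical):

```sql
-- Transient tables skip Fail-safe and support at most 1 day of Time Travel,
-- which lowers storage costs for data that can be reconstructed externally.
CREATE TRANSIENT TABLE staging_events (
    event_id   NUMBER,
    payload    VARIANT,
    loaded_at  TIMESTAMP_NTZ
)
DATA_RETENTION_TIME_IN_DAYS = 0;  -- disable Time Travel entirely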
Reference: For more details on transient tables and their usage scenarios, refer to the Snowflake
documentation on table types: https://docs.snowflake.com/en/sql-reference/sql/create-table.html#table-types
Questions 134
Which statements reflect valid commands when using secondary roles? (Select TWO).
Options:
A.
B.
C.
D.
E.
Answer:
C, E
Explanation:
• Valid Commands: The valid statements are USE SECONDARY ROLES ALL and USE
SECONDARY ROLES NONE, which enable or disable all of the user's granted roles as secondary
roles for the session.
• Incorrect Commands: The options referencing "RESUME", "SUSPEND", and "ADD" are not
valid commands in the context of secondary roles.
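As a sketch, the two valid secondary-role statements look like this:

```sql
-- Enable all roles granted to the current user as secondary roles.
USE SECONDARY ROLES ALL;

-- Disable secondary roles for the current session.
USE SECONDARY ROLES NONE;
```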
References:
Questions 135
Who can create network policies within Snowflake? (Select TWO).
Options:
A.
SYSADMIN only
B.
ORGADMIN only
C.
D.
E.
Answer:
C, D
Explanation:
In Snowflake, network policies define the allowed IP address ranges from which users can
connect to Snowflake, enhancing security by restricting access based on network location. The
creation and management of network policies require sufficient privileges. Specifically, a user with
the SECURITYADMIN role or any role with higher privileges, such as ACCOUNTADMIN, can
create network policies. Additionally, a custom role can be granted the CREATE NETWORK
POLICY privilege, enabling users assigned to that role to also create network policies. This
approach allows for flexible and secure management of network access to Snowflake.References:
Snowflake Documentation on Network Policies
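A minimal sketch of creating and applying a network policy (the policy name and IP ranges are hypothetical):

```sql
-- Requires SECURITYADMIN (or higher), or a role granted the
-- CREATE NETWORK POLICY privilege.
CREATE NETWORK POLICY corp_only
  ALLOWED_IP_LIST = ('192.168.1.0/24')
  BLOCKED_IP_LIST = ('192.168.1.99');

-- Apply the policy at the account level.
ALTER ACCOUNT SET NETWORK_POLICY = corp_only;
```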
Questions 136
Which table function should be used to view details on a Directed Acyclic Graphic (DAG) run that
is presently scheduled or is executing?
Options:
A.
TASK_HISTORY
B.
TASK_DEPENDENTS
C.
CURRENT_TASK_GRAPHS
D.
COMPLETE_TASK_GRAPHS
Answer:
C
Explanation:
The CURRENT_TASK_GRAPHS table function returns the status of task graph (DAG) runs that are
currently scheduled or executing. By contrast, TASK_HISTORY returns the history of individual
task runs, COMPLETE_TASK_GRAPHS returns only completed graph runs, and
TASK_DEPENDENTS lists the child tasks of a given task rather than run details.
Questions 137
Which Snowflake feature or tool helps troubleshoot issues in SQL query expressions that
commonly cause performance bottlenecks?
Options:
A.
B.
QUERY_HISTORY View
C.
D.
Query Profile
Answer:
D
Explanation:
The Snowflake feature that helps troubleshoot issues in SQL query expressions and commonly
identify performance bottlenecks is the Query Profile. The Query Profile provides a detailed
breakdown of a query's execution plan, including each operation's time and resources consumed.
It visualizes the steps involved in the query execution, highlighting areas that may be causing
inefficiencies, such as full table scans, large joins, or operations that could benefit from
optimization.
By examining the Query Profile, developers and database administrators can identify and
troubleshoot performance issues, optimize query structures, and make informed decisions about
potential schema or indexing changes to improve performance.
References:
Questions 138
For how many days will the data be retained at the object level?
Options:
A.
B.
C.
D.
Answer:
Explanation:
The settings shown in the image indicate that the data retention time in days is configured at two
different levels: the account level and the object level. At the account level,
the MIN_DATA_RETENTION_TIME_IN_DAYS is set to 5 days, and at the object level,
the DATA_RETENTION_TIME_IN_DAYS is set to 2 days. Since the object level setting has a
lower value, it takes precedence over the account level setting for the specific object. Therefore,
the data will be retained for 2 days at the object level.References: Snowflake Documentation on
Data Retention Policies
Questions 139
Which activities are managed by Snowflake's Cloud Services layer? (Select TWO).
Options:
A.
Authorisation
B.
Access delegation
C.
Data pruning
D.
Data compression
E.
Query parsing and optimization
Answer:
A, E
Explanation:
Snowflake's Cloud Services layer is responsible for managing various aspects of the platform that
are not directly related to computing or storage. Specifically, it handles authorisation, ensuring
that users have appropriate access rights to perform actions or access data. Additionally, it takes
care of query parsing and optimization, interpreting SQL queries and optimizing their execution
plans for better performance. This layer abstracts much of the platform's complexity, allowing
users to focus on their data and queries without managing the underlying
infrastructure.References: Snowflake Architecture Documentation
Questions 140
Options:
A.
B.
C.
The suspended warehouse is resumed and new compute resources are provisioned immediately.
D.
The additional compute resources are provisioned when the warehouse is resumed.
Answer:
D
Explanation:
In Snowflake, resizing a virtual warehouse that is currently suspended does not immediately
provision the new compute resources. Instead, the change in size is recorded, and the additional
compute resources are provisioned when the warehouse is resumed. This means that the action
of resizing a suspended warehouse does not cause it to resume operation automatically. The
warehouse remains suspended until an explicit command to resume it is issued, or until it
automatically resumes upon the next query execution that requires it.
This behavior allows for efficient management of compute resources, ensuring that credits are not
consumed by a warehouse that is not in use, even if its size is adjusted while it is suspended.
Questions 141
Options:
A.
Account role
B.
Custom role
C.
Database role
D.
Secondary role
Answer:
C
Explanation:
In Snowflake, shares are used to share data between Snowflake accounts. When creating a share,
privileges on objects are granted to the share itself, and the only type of role that can be granted
to a share is a Database role. Database roles are created within a database (CREATE DATABASE
ROLE), are granted privileges on objects in that database, and can then be granted to a share
with GRANT DATABASE ROLE ... TO SHARE. Account-level roles, including custom account
roles, cannot be granted to a share.
Granting database roles to a share lets the provider bundle different subsets of the shared
objects, so that consumers can expose distinct subsets of the share to different roles in their own
account. It is important to carefully manage the privileges granted to these database roles to
ensure that data sharing aligns with organizational policies and data governance standards.
References:
Questions 142
Snowflake's access control framework combines which models for securing data? (Select TWO).
Options:
A.
B.
D.
E.
Answer:
B, D
Explanation:
Snowflake's access control framework combines Discretionary Access Control (DAC), in which
each object has an owner that can grant access to it, with Role-Based Access Control (RBAC), in
which privileges are assigned to roles and roles are in turn assigned to users.
Questions 143
Based on Snowflake recommendations, when creating a hierarchy of custom roles, the top-most
custom role should be assigned to which role?
Options:
A.
ACCOUNTADMIN
B.
SECURITYADMIN
C.
SYSADMIN
D.
USERADMIN
Answer:
C
Explanation:
Based on Snowflake recommendations, when creating a hierarchy of custom roles, the top-most
custom role should be granted to the SYSADMIN role. Snowflake's best practice is to align custom
roles in a hierarchy that ultimately rolls up to SYSADMIN, so that system administrators can
manage all objects created by custom roles across the account. ACCOUNTADMIN is reserved for
account-level administration and should not be used for day-to-day object management, in
keeping with the principle of least privilege.
References:
Questions 144
Options:
A.
The cache is dropped when the warehouse is suspended and is no longer available upon restart.
B.
The warehouse cache persists for as long the warehouse exists, regardless of its suspension
status.
C.
The cache is maintained for up to two hours and can be restored if the warehouse is restarted
within this limit.
D.
The cache is maintained for the auto-suspend duration and can be restored if the warehouse is
restarted within this limit.
Answer:
A
Explanation:
When a virtual warehouse in Snowflake is suspended, the cache is dropped and is no longer
available upon restart. This means that all cached data, including results and temporary data, are
cleared from memory. The purpose of this behavior is to conserve resources while the warehouse
is not active. Upon restarting the warehouse, it will need to reload any data required for queries
from storage, which may result in a slower initial performance until the cache is repopulated. This
is a critical consideration for managing performance and cost in Snowflake.
Questions 145
Which Snowflake feature records changes made to a table so actions can be taken using that
change data capture?
Options:
A.
Materialized View
B.
Pipe
C.
Stream
D.
Task
Answer:
C
Explanation:
Snowflake's Streams feature is specifically designed for change data capture (CDC). A stream
records insert, update, and delete operations performed on a table, and allows users to query
these changes. This enables actions to be taken on the changed data, facilitating processes like
incremental data loads and real-time analytics. Streams provide a powerful mechanism for
applications to respond to data changes in Snowflake tables efficiently.References: Snowflake
Documentation on Streams
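A minimal sketch of the stream workflow described above (the table names are hypothetical):

```sql
-- Track DML changes on a source table.
CREATE OR REPLACE STREAM orders_stream ON TABLE orders;

-- Querying the stream returns the changed rows plus METADATA$ACTION,
-- METADATA$ISUPDATE, and METADATA$ROW_ID columns describing each change.
SELECT * FROM orders_stream;

-- Consuming the stream inside a DML statement advances its offset
-- (orders_audit is a hypothetical target with matching columns).
INSERT INTO orders_audit SELECT * FROM orders_stream;
```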
Questions 146
Which privilege is required on a virtual warehouse to abort any existing executing queries?
Options:
A.
USAGE
B.
OPERATE
C.
MODIFY
D.
MONITOR
Answer:
B
Explanation:
The privilege required on a virtual warehouse to abort any existing executing queries
is OPERATE. The OPERATE privilege on a virtual warehouse allows a user to perform operational
tasks on the warehouse, including starting, stopping, and restarting the warehouse, as well as
aborting running queries. This level of control is essential for managing resource utilization and
ensuring that the virtual warehouse operates efficiently.
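As a sketch, the grant might look like this (the warehouse and role names are hypothetical):

```sql
-- Allow an operations role to suspend/resume the warehouse and abort
-- queries running on it.
GRANT OPERATE ON WAREHOUSE analytics_wh TO ROLE ops_role;

-- A user with ops_role can then abort a running query by its ID, e.g.:
-- SELECT SYSTEM$CANCEL_QUERY('<query_id>');
```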
References:
Questions 147
Options:
A.
B.
C.
D.
The privileges can be revoked by any user-defined role with appropriate privileges.
Answer:
Explanation:
The privileges granted to Snowflake's system-defined roles cannot be revoked. System-defined
roles, such as SYSADMIN, ACCOUNTADMIN, SECURITYADMIN, and others, come with a set of
predefined privileges that are essential for the roles to function correctly within the Snowflake
environment. These privileges are intrinsic to the roles and ensure that users assigned these roles
can perform the necessary tasks and operations relevant to their responsibilities.
The design of Snowflake's role-based access control (RBAC) model ensures that system-defined
roles have a specific set of non-revocable privileges to maintain the security, integrity, and
operational efficiency of the Snowflake environment. This approach prevents accidental or
intentional modification of privileges that could disrupt the essential functions or compromise the
security of the Snowflake account.
References:
Questions 148
Which function can be used with the COPY INTO statement to convert rows from a relational table
to a single VARIANT column, and to unload rows into a JSON file?
Options:
A.
FLATTEN
B.
OBJECT_AS
C.
OBJECT_CONSTRUCT
D.
TO_VARIANT
Answer:
C
Explanation:
The function to use with the COPY INTO <location> statement to convert rows from a relational
table into a single VARIANT column and unload them into a JSON file is OBJECT_CONSTRUCT.
OBJECT_CONSTRUCT builds an OBJECT (key-value) value from pairs of column names and
values, so each relational row becomes one semi-structured object.
In the context of unloading data, combining OBJECT_CONSTRUCT with COPY INTO <location>
and FILE_FORMAT = (TYPE = JSON) writes each row as a JSON document in the output file. This
is the pattern shown in Snowflake's documentation for unloading a relational table to JSON, and
it is often used for data integration scenarios, backups, or when data needs to be shared in a
format that is easily consumed by applications or services that support JSON.
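A minimal sketch of this unload pattern (the stage, table, and column names are hypothetical):

```sql
-- Unload each row of a relational table as one JSON object.
COPY INTO @my_stage/json_unload/
  FROM (
    SELECT OBJECT_CONSTRUCT('id', id, 'name', name, 'dept', dept)
    FROM employees
  )
  FILE_FORMAT = (TYPE = JSON);
```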
References:
Questions 149
Which Snowflake data governance feature can support auditing when a user query reads column
data?
Options:
A.
Access History
B.
Data classification
C.
Column-level security
D.
Object dependencies
Answer:
A
Explanation:
Access History in Snowflake is a feature designed to support auditing by tracking access to data
within Snowflake, including when a user's query reads column data. It provides detailed
information on queries executed, including the user who ran the query, the query text, and the
objects (e.g., tables, views) accessed by the query. This feature is instrumental for auditing
purposes, helping organizations to monitor and audit data access for security and compliance.
Reference: Snowflake Documentation on Access History, which details its use for auditing access
to data: https://docs.snowflake.com/en/user-guide/access-history.html
Questions 150
Options:
A.
B.
C.
Set the minimum Clusters and maximum Clusters settings to the same value.
D.
Set the minimum clusters and maximum clusters settings to different values.
Answer:
C
Explanation:
To run a multi-cluster warehouse in maximized mode, a user should set the minimum and
maximum number of clusters to the same value. This ensures that all clusters are available when
the warehouse is started, providing maximum resources for query execution. References:
Snowflake Documentation2.
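As a sketch (the warehouse name, size, and cluster counts are hypothetical; multi-cluster warehouses require Enterprise edition or higher):

```sql
-- Maximized mode: minimum and maximum cluster counts are equal,
-- so all clusters run whenever the warehouse is running.
CREATE OR REPLACE WAREHOUSE reporting_wh
  WAREHOUSE_SIZE = 'MEDIUM'
  MIN_CLUSTER_COUNT = 3
  MAX_CLUSTER_COUNT = 3;
```

Setting different minimum and maximum values instead would run the warehouse in auto-scale mode.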
Questions 151
Options:
A.
ACCOUNTADMIN
B.
SECURITYADMIN
C.
SYSADMIN
D.
USERADMIN
Answer:
A
Explanation:
The role that can by default create resource monitors in Snowflake is the ACCOUNTADMIN role.
Resource monitors are a crucial feature in Snowflake that allows administrators to track and
control the consumption of compute resources, ensuring that usage stays within specified limits.
The creation and management of resource monitors involve defining thresholds for credits usage,
setting up notifications, and specifying actions to be taken when certain thresholds are exceeded.
Given the significant impact that resource monitors can have on the operational aspects and
billing of a Snowflake account, the capability to create and manage them is restricted to
the ACCOUNTADMIN role. This role has the broadest set of privileges in Snowflake, including
the ability to manage all aspects of the account, such as users, roles, warehouses, databases, and
resource monitors, among others.
References:
Questions 152
If file format options are specified in multiple locations, the load operation selects which option
FIRST to apply in order of precedence?
Options:
A.
Table definition
B.
Stage definition
C.
Session level
D.
COPY INTO TABLE statement
Answer:
D
Explanation:
When file format options are specified in multiple locations, the load operation applies the
options in the following order of precedence: first, the COPY INTO TABLE statement; second, the
stage definition; and third, the table definition1
Questions 153
Options:
A.
The warehouse size does not change until all queries currently running in the warehouse have
completed.
B.
The warehouse size does not change until all queries currently in the warehouse queue have
completed.
C.
The warehouse size does not change until the warehouse is suspended and restarted.
D.
It does not return from the command until the warehouse has finished changing its size.
Answer:
Explanation:
Questions 154
A.
B.
C.
D.
Answer:
Explanation:
To address query concurrency issues, a user can add additional clusters to the virtual
warehouse. This allows for the distribution of queries across multiple clusters, reducing the load
on any single cluster and improving overall query performance2.
Questions 155
A user needs to create a materialized view in the schema MYDB.MYSCHEMA. Which statements
will provide this access?
Options:
A.
B.
C.
D.
Answer:
Explanation:
To provide a user with the necessary access to create a materialized view in a schema, the user
must be granted a role that has the CREATE MATERIALIZED VIEW privilege on that schema.
First, the role is granted to the user, and then the privilege is granted to the role.
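A minimal sketch of the two grants (the role and user names are hypothetical):

```sql
-- Grant the privilege to a role, then the role to the user.
GRANT CREATE MATERIALIZED VIEW ON SCHEMA MYDB.MYSCHEMA TO ROLE analyst_role;
GRANT ROLE analyst_role TO USER some_user;

-- The role also needs USAGE on the database and schema to resolve them.
GRANT USAGE ON DATABASE MYDB TO ROLE analyst_role;
GRANT USAGE ON SCHEMA MYDB.MYSCHEMA TO ROLE analyst_role;
```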
Questions 156
Options:
A.
insertFiles
B.
insertReport
C.
GET /api/files/
D.
loadHistoryScan
Answer:
C
Explanation:
The REST API used with unstructured data in Snowflake is GET /api/files/, which retrieves
(downloads) a data file from an internal or external stage4.
Questions 157
For the ALLOWED VALUES tag property, what is the MAXIMUM number of possible string
values for a single tag?
Options:
A.
10
B.
50
C.
64
D.
256
Answer:
D
Explanation:
For the ALLOWED VALUES tag property, the maximum number of possible string values for a
single tag is 256. This allows for a wide range of values to be assigned to a tag when it is set on an
object
Questions 158
What is the recommended compressed file size range for continuous data loads using Snowpipe?
Options:
A.
8-16 MB
B.
16-24 MB
C.
10-99 MB
D.
100-250 MB
Answer:
D
Explanation:
For continuous data loads using Snowpipe, the recommended compressed file size range is
between 100-250 MB. This size range is suggested to optimize the number of parallel operations
for a load and to avoid size limitations, ensuring efficient and cost-effective data loading
Questions 159
Options:
A.
Schema stage
B.
Named stage
C.
User stage
D.
Stream stage
E.
Table stage
F.
Database stage
Answer:
B, C, E
Explanation:
Snowflake supports three types of internal stages: Named, User, and Table stages. These stages
are used for staging data files to be loaded into Snowflake tables. Schema, Stream, and Database
stages are not supported as internal stages in Snowflake. References: Snowflake
Documentation1.
Questions 160
Options:
A.
B.
Use the ALTER VIEW command to update the view
C.
D.
Answer:
Explanation:
In Snowflake, to change the columns referenced in a view, the view must be recreated with the
required changes. The ALTER VIEW command does not allow changing the definition of a view; it
can only be used to rename a view, convert it to or from a secure view, or add, overwrite, or
remove a comment for a view. Therefore, the correct approach is to drop the existing view and
create a new one with the desired column references.
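As a sketch of the recreate approach (the view, table, and column names are hypothetical):

```sql
-- ALTER VIEW cannot change the column list; recreate the view instead.
CREATE OR REPLACE VIEW sales_summary AS
  SELECT region, SUM(amount) AS total_amount   -- new column set
  FROM sales
  GROUP BY region;
```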
Questions 161
How does Snowflake recommend handling the bulk loading of data batches from files already
available in cloud storage?
Options:
A.
Use Snowpipe.
B.
C.
D.
Answer:
Explanation:
Snowflake recommends using the COPY command for bulk loading data batches from files
already available in cloud storage. This command allows for efficient and large-scale data loading
operations from files staged in cloud storage into Snowflake tables3.
Questions 162
Options:
A.
B.
C.
Snowpark does not require that a separate cluster be running outside of Snowflake.
D.
Snowpark allows users to run existing Spark code on virtual warehouses without the need to
reconfigure the code.
E.
Snowpark executes as much work as possible in the source databases for all operations including
User-Defined Functions (UDFs).
Answer:
C, D
Explanation:
Questions 163
A Snowflake user executed a query and received the results. Another user executed the same
query 4 hours later. The data had not changed.
Options:
A.
No virtual warehouse will be used, data will be read from the result cache.
B.
No virtual warehouse will be used, data will be read from the local disk cache.
C.
D.
The virtual warehouse that is defined at the session level will be used to read all data.
Answer:
A
Explanation:
Snowflake maintains a result cache that stores the results of every query for 24 hours. If the
same query is executed again within this time frame and the data has not changed, Snowflake will
retrieve the data from the result cache instead of using a virtual warehouse to recompute the
results2.
Questions 164
Options:
A.
B.
C.
D.
Answer:
Explanation:
During a query on a virtual warehouse, the columns in the result set of the query are cached. This
allows for faster retrieval of data if the same or a similar query is run again, as the system can
retrieve the data from the cache rather than reprocessing the entire query. References: [COF-
C02] SnowPro Core Certification Exam Study Guide
Questions 165
Which native data types are used for storing semi-structured data in Snowflake? (Select TWO)
Options:
A.
NUMBER
B.
OBJECT
C.
STRING
D.
VARCHAR
E.
VARIANT
Answer:
B, E
Explanation:
Snowflake supports semi-structured data types, which include OBJECT and VARIANT. These
data types are capable of storing JSON-like data structures, allowing for flexibility in data
representation. OBJECT can directly contain VARIANT, and thus indirectly contain any other data
type, including itself1.
Questions 166
Options:
A.
B.
C.
Materialized view refreshes need to be maintained by the user.
D.
Querying a materialized view is slower than executing a query against the base table of the view.
Answer:
Explanation:
A characteristic of a materialized view is that the data accessed through it can be stale. This is
because the data in a materialized view may not reflect the latest changes in the base tables until
the view is refreshed
Questions 167
Which parameter can be used to instruct a COPY command to verify data files instead of loading
them into a specified table?
Options:
A.
STRIP_NULL_VALUES
B.
SKIP_BYTE_ORDER_MARK
C.
REPLACE_INVALID_CHARACTERS
D.
VALIDATION_MODE
Answer:
D
Explanation:
The VALIDATION_MODE parameter can be used with the COPY command to verify data files
without loading them into the specified table. This parameter allows users to check for errors in
the files
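A minimal sketch of a validation-only run (the table and stage names are hypothetical):

```sql
-- Check the staged files for errors without loading any rows.
COPY INTO my_table
  FROM @my_stage
  VALIDATION_MODE = RETURN_ERRORS;

-- Other options include RETURN_<n>_ROWS (e.g. RETURN_10_ROWS)
-- and RETURN_ALL_ERRORS.
```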
Questions 168
A materialized view should be created when which of the following occurs? (Choose two.)
Options:
A.
B.
The query consumes many compute resources every time it runs.
C.
D.
The query is highly optimized and does not consume many compute resources.
E.
The results of the query do not change often and are used frequently.
Answer:
B, E
Explanation:
A materialized view is beneficial when the query consumes many compute resources every time it
runs (B), and when the results of the query do not change often and are used frequently (E). This
is because materialized views store pre-computed data, which can speed up query performance
for workloads that are run frequently or are complex
Questions 169
A data provider wants to share data with a consumer who does not have a Snowflake account.
The provider creates a reader account for the consumer following these steps:
2. Created a database to hold the share and an extra-small warehouse to query the data
3. Granted the role PUBLIC the following privileges: Usage on the warehouse, database, and
schema, and SELECT on all the objects in the share
Options:
A.
The reader account will automatically use the Standard edition of Snowflake.
B.
C.
The reader account can clone data the provider has shared, but cannot re-share it.
D.
The reader account can create a copy of the shared data using CREATE TABLE AS...
Answer:
B
Explanation:
In Snowflake, when a provider creates a reader account for a consumer who does not have a
Snowflake account, the compute resources used by the reader account are billed to the
provider's account. This allows the consumer to query the shared data without incurring any
costs. References: [COF-C02] SnowPro Core Certification Exam Study Guide
Questions 170
What file formats does Snowflake support for loading semi-structured data? (Choose three.)
Options:
A.
TSV
B.
JSON
C.
D.
Avro
E.
Parquet
F.
JPEG
Answer:
B, D, E
Explanation:
Snowflake supports several semi-structured data formats for loading data. The supported
formats include JSON, Avro, and Parquet12. These formats allow for efficient storage and
querying of data that does not conform to a traditional relational database schema.
Questions 171
Options:
A.
SnowCLI
B.
SnowUI
C.
SnowSQL
D.
SnowCD
Answer:
D
Explanation:
SnowCD (Snowflake Connectivity Diagnostic Tool) is the best tool provided by Snowflake for
troubleshooting network connectivity issues. It helps diagnose and resolve issues related to
connecting to Snowflake services
https://docs.snowflake.com/en/user-guide/snowcd.html
Questions 172
What is the recommended way to change the existing file format type in MY_FORMAT from CSV
to JSON?
Options:
A.
B.
C.
D.
Answer:
Explanation:
The TYPE of an existing file format cannot be changed with ALTER FILE FORMAT. The
recommended way is to recreate the file format using CREATE OR REPLACE FILE FORMAT with
TYPE = JSON, which replaces the CSV definition with a JSON one. References: Snowflake
documentation on CREATE FILE FORMAT.
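A minimal sketch (MY_FORMAT and the extra option are hypothetical):

```sql
-- Recreate the file format with the new type; ALTER FILE FORMAT
-- cannot change TYPE on an existing format.
CREATE OR REPLACE FILE FORMAT my_format
  TYPE = JSON
  STRIP_OUTER_ARRAY = TRUE;
```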
Questions 173
Which user object property requires contacting Snowflake Support in order to set a value for it?
Options:
A.
DISABLED
B.
MINS_TO_BYPASS_MFA
C.
MINS_TO_BYPASS_NETWORK_POLICY
D.
MINS_TO_UNLOCK
Answer:
C
Explanation:
The MINS_TO_BYPASS_NETWORK_POLICY property can only be set by Snowflake Support; an
account administrator must contact Support to temporarily allow a user to bypass a network
policy. By contrast, properties such as MINS_TO_BYPASS_MFA and MINS_TO_UNLOCK can be
set by an account administrator directly, without contacting Support.
Questions 174
A view is defined on a permanent table. A temporary table with the same name is created in the
same schema as the referenced table. What will the query from the view return?
Options:
A.
B.
C.
D.
An error stating that the referenced object could not be uniquely identified.
Explanation:
When a view is defined on a permanent table, and a temporary table with the same name is
created in the same schema, the query from the view will return the data from the permanent
table. Temporary tables are session-specific and do not affect the data returned by views defined
on permanent tables2.
Questions 175
Which type of join will list all rows in the specified table, even if those rows have no match in the
other table?
Options:
A.
Cross join
B.
Inner join
C.
Natural join
D.
Outer join
Answer:
D
Explanation:
An outer join, specifically a left outer join, will list all rows from the left table and match them
with rows from the right table. If there is no match, the result will still include the row from the
left table, with NULLs for columns from the right table. References: Based on general SQL
knowledge as of 2021.
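A short sketch of a left outer join (the table and column names are hypothetical):

```sql
-- Every customer appears; order columns are NULL where no match exists.
SELECT c.customer_id, c.name, o.order_id
FROM customers c
LEFT OUTER JOIN orders o
  ON o.customer_id = c.customer_id;
```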
Questions 176
What is the difference between a stored procedure and a User-Defined Function (UDF)?
Options:
A.
B.
Returning a value is required in a stored procedure while returning values in a UDF is optional.
C.
Values returned by a stored procedure can be used directly in a SQL statement while the values
returned by a UDF cannot.
D.
Multiple stored procedures can be called as part of a single executable statement while a single
SQL statement can only call one UDF at a time.
Answer:
A
Explanation:
Stored procedures in Snowflake can perform a variety of database operations, including DDL and
DML, whereas UDFs are designed to return values and cannot execute database operations1.
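A sketch of the contrast (all object names are hypothetical): a UDF returns a value and is called inline in SQL, while a stored procedure can run DDL/DML and is invoked with CALL.

```sql
-- SQL UDF: returns a value, usable directly in a SELECT.
CREATE FUNCTION area(radius FLOAT)
  RETURNS FLOAT
  AS 'PI() * radius * radius';

SELECT area(2.0);

-- Snowflake Scripting procedure: can execute DML, invoked with CALL.
CREATE PROCEDURE purge_temp()
  RETURNS STRING
  LANGUAGE SQL
  AS
  $$
  BEGIN
    DELETE FROM staging_events WHERE loaded = TRUE;
    RETURN 'done';
  END;
  $$;

CALL purge_temp();
```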
Questions 177
A.
B.
C.
File keys
D.
E.
Answer:
A, C
Explanation:
Snowflake’s hierarchical key model includes several levels of keys, where Account master keys
and File keys are part of this hierarchy. Account master keys are used to encrypt all the data
within an account, while File keys are used to encrypt individual files within the database.
Questions 178
Which kind of Snowflake table stores file-level metadata for each file in a stage?
Options:
A.
Directory
B.
External
C.
Temporary
D.
Transient
Answer:
A
Explanation:
Directory tables store file-level metadata (such as the relative path, size, last-modified
timestamp, and file URL) for each file in a stage.
Questions 179
What is the minimum Snowflake edition needed for database failover and fail-back between
Snowflake accounts for business continuity and disaster recovery?
Options:
A.
Standard
B.
Enterprise
C.
Business Critical
D.
Answer:
C
Explanation:
The minimum Snowflake edition required for database failover and fail-back between Snowflake
accounts for business continuity and disaster recovery is the Business Critical edition. References:
Snowflake Documentation.
Questions 180
Which database objects can be shared with the Snowflake secure data sharing feature? (Choose
two.)
Options:
A.
Files
B.
External tables
C.
Secure User-Defined Functions (UDFs)
D.
Sequences
E.
Streams
Answer:
B, C
Explanation:
Snowflake’s secure data sharing feature allows sharing of certain database objects with other
Snowflake accounts. Among the options provided, external tables and secure UDFs can be shared.
Questions 181
Which operations are handled in the Cloud Services layer of Snowflake? (Select TWO).
Options:
A.
Security
B.
Data storage
C.
Data visualization
D.
Query computation
E.
Metadata management
Answer:
A, E
Explanation:
The Cloud Services layer in Snowflake is responsible for various services, including security (like
authentication and authorization) and metadata management (like query parsing and
optimization). References: Based on general cloud architecture knowledge as of 2021.
Q1. What can you easily check to see if a large table will
benefit from explicitly defining a clustering key?
A. Clustering depth
B. Clustering ratio
C. Values in a table
Answer: A. The clustering depth measures the average depth of the
overlapping micro-partitions for specified columns in a table (1 or
greater). The smaller the cluster depth is, the better clustered the
table is. You can get the clustering depth of a Snowflake table using
this command:
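The command referenced above appears to be the SYSTEM$CLUSTERING_DEPTH function; a sketch, with hypothetical table and column names:

```sql
-- Average overlap depth for the table's defined clustering columns:
SELECT SYSTEM$CLUSTERING_DEPTH('my_table');

-- Or for specific candidate columns:
SELECT SYSTEM$CLUSTERING_DEPTH('my_table', '(region, order_date)');

-- SYSTEM$CLUSTERING_INFORMATION returns a fuller JSON report:
SELECT SYSTEM$CLUSTERING_INFORMATION('my_table', '(region, order_date)');
```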
B. Snowflake used the persisted query results from the query result
cache
D. A new Snowflake version has been released in the last two hours,
improving the speed of the service
Answer: B. The query result cache stores the results of our queries
for 24 hours, so as long as we perform the same query and the data
hasn’t changed in the storage layer, it will return the same result
without using the warehouse and without consuming credits.
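As a sketch, the session parameter that controls this behavior is USE_CACHED_RESULT:

```sql
-- The query result cache is on by default; it can be toggled per session.
ALTER SESSION SET USE_CACHED_RESULT = FALSE;  -- force re-execution on the warehouse
ALTER SESSION SET USE_CACHED_RESULT = TRUE;   -- reuse cached results (up to 24 hours)
```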
Q4. What option will you specify to delete the stage files
after a successful load into a Snowflake table with the
COPY INTO command?
A. DELETE = TRUE
B. REMOVE = TRUE
C. PURGE = TRUE
D. TRUNCATE = TRUE
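For reference, PURGE is the documented COPY INTO option that deletes staged files after a successful load; a sketch with hypothetical table and stage names:

```sql
COPY INTO sales
  FROM @my_stage/sales/
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
  PURGE = TRUE;  -- delete the staged files once the load succeeds
```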
A. Permanent
B. Temporary
C. Transient
D. External
Answer: B. With temporary tables, you can optimize storage costs: when the Snowflake session
ends, data stored in the table is entirely purged from the system. They do, however, incur storage
costs while the session is active. Because a temporary table is purged once the session ends, its
Time Travel retention period is 1 day or the remainder of the session, whichever is shorter.
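A sketch of creating a temporary table (names hypothetical):

```sql
-- Session-scoped table; dropped automatically when the session ends.
CREATE TEMPORARY TABLE session_scratch (id INT, payload VARIANT);

-- Time Travel for temporary tables is limited to 0 or 1 day:
ALTER TABLE session_scratch SET DATA_RETENTION_TIME_IN_DAYS = 0;
```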
Question # 1
A.
B.
C.
D.
Answer:
B
Explanation:
A materialized view in Snowflake gets suspended when structural changes that could impact the
view's integrity are made to the base table, such as B. When a column is dropped from the base
table. Dropping a column from the base table on which a materialized view is defined can
invalidate the view's data, as the view might rely on the column that is being removed. To
maintain data consistency and prevent the materialized view from serving stale or incorrect data,
Snowflake automatically suspends the materialized view.
Upon suspension, the materialized view does not reflect changes to the base table until it is
refreshed or re-created. This ensures that only accurate and current data is presented to users
querying the materialized view.
Question # 2
Which task is supported by the use of Access History in Snowflake?
A.
Data backups
B.
Cost monitoring
C.
Compliance auditing
D.
Performance optimization
Answer:
C
Explanation:
Access History in Snowflake is primarily utilized for compliance auditing. The Access History
feature provides detailed logs that track data access and modifications, including queries that read
from or write to database objects. This information is crucial for organizations to meet regulatory
requirements and to perform audits related to data access and usage.
• Role of Access History: Access History logs are designed to help organizations understand
who accessed what data and when. This is particularly important for compliance with
various regulations that require detailed auditing capabilities.
• How Access History Supports Compliance Auditing:
Reference: For more information on how Access History supports compliance auditing, refer to the
Snowflake documentation on Access History: https://docs.snowflake.com/en/sql-reference/account-usage/access_history.html
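A sketch of querying Access History (requires access to the SNOWFLAKE.ACCOUNT_USAGE schema):

```sql
-- Who accessed which objects over the last 7 days:
SELECT query_start_time,
       user_name,
       direct_objects_accessed
FROM snowflake.account_usage.access_history
WHERE query_start_time > DATEADD('day', -7, CURRENT_TIMESTAMP())
ORDER BY query_start_time DESC;
```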
Question # 3
A.
User
B.
Role
C.
Account
D.
Organization
Answer:
C
Explanation:
Question # 4
In which hierarchy is tag inheritance possible?
A.
B.
C.
D.
Answer:
D
Explanation:
In Snowflake, tag inheritance is a feature that allows tags, which are key-value pairs assigned to
objects for the purpose of data governance and metadata management, to be inherited within a
hierarchy. The hierarchy in which tag inheritance is possible is from Schema to Table to Column.
This means that a tag applied to a schema can be inherited by the tables within that schema, and a
tag applied to a table can be inherited by the columns within that table.References: Snowflake
Documentation on Tagging and Object Hierarchy
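A sketch of tag inheritance in practice (database, schema, and tag names are hypothetical):

```sql
-- Create a tag and set it at the schema level:
CREATE TAG governance.tags.cost_center;
ALTER SCHEMA sales_db.public SET TAG governance.tags.cost_center = 'sales';

-- Tables and columns inside sales_db.public now inherit
-- cost_center = 'sales' unless they set the tag explicitly.
```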
Question # 5
A.
PUT @mystage
B.
GET @mystage
C.
D.
INSERT @mystage
Answer:
A
Explanation:
The PUT command uploads data files from a local file system to a specified Snowflake stage (for
example, a named internal stage, a table stage, or a user stage). Staging files with PUT is typically
the first step when loading local data into Snowflake.
The syntax for the PUT command follows the structure: PUT
file://<local_file_path> @<stage_name>, where <local_file_path> specifies the path to the file(s)
on the local file system that you wish to upload, and <stage_name> specifies the destination stage
in Snowflake.
It is important to distinguish the related commands: PUT uploads local files to a stage;
COPY INTO <table> loads staged files into a table; GET downloads files from a stage to the local
file system, essentially the inverse operation of the PUT command; and COPY INTO <location>
unloads table data from Snowflake to a stage.
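A sketch of the three commands side by side (stage, table, and local paths are hypothetical; PUT and GET run from a client such as SnowSQL):

```sql
-- Upload a local CSV file to an internal stage:
PUT file:///tmp/data/sales.csv @mystage AUTO_COMPRESS = TRUE;

-- Download a staged file back to the local machine (inverse of PUT):
GET @mystage/sales.csv.gz file:///tmp/downloads/;

-- Unload table data to the stage (the export path) uses COPY INTO <location>:
COPY INTO @mystage/export/ FROM sales_table FILE_FORMAT = (TYPE = CSV);
```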
Question # 6
A user with which privileges can create or manage other users in a Snowflake account? (Select
TWO).
A.
GRANT
B.
SELECT
C.
MODIFY
D.
OWNERSHIP
E.
CREATE USER
Answer:
D, E
Explanation:
A user with the OWNERSHIP privilege on a user object or the CREATE USER privilege on the
account can create or manage other users in a Snowflake account.
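A sketch of granting these privileges (role and user names are hypothetical):

```sql
-- Global privilege to create users in the account:
GRANT CREATE USER ON ACCOUNT TO ROLE user_admin_role;

-- A role that owns a user object can manage that user:
GRANT OWNERSHIP ON USER jsmith TO ROLE user_admin_role;
```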
Question # 7
For the ALLOWED VALUES tag property, what is the MAXIMUM number of possible string
values for a single tag?
A.
10
B.
50
C.
64
D.
256
Answer:
D
Explanation:
For the ALLOWED VALUES tag property, the maximum number of possible string values for a
single tag is 256. This allows for a wide range of values to be assigned to a tag when it is set on an
object
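A sketch of the ALLOWED_VALUES property (tag name and values are hypothetical):

```sql
-- Restrict a tag to an explicit list of string values (up to 256 per tag):
CREATE TAG cost_center ALLOWED_VALUES 'finance', 'engineering', 'sales';

-- Extend the list later:
ALTER TAG cost_center ADD ALLOWED_VALUES 'marketing';
```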
Question # 8
Which view can be used to determine if a table has frequent row updates or deletes?
A.
TABLES
B.
TABLE_STORAGE_METRICS
C.
STORAGE_DAILY_HISTORY
D.
STORAGE_USAGE
Answer:
B
Explanation:
The TABLE_STORAGE_METRICS view can be used to determine if a table has frequent row
updates or deletes. This view provides detailed metrics on the storage utilization of tables within
Snowflake, including metrics that reflect the impact of DML operations such as updates and
deletes on table storage. For example, metrics related to the number of active and deleted rows
can help identify tables that experience high levels of row modifications, indicating frequent
updates or deletions.
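A sketch of querying the view (requires access to the SNOWFLAKE.ACCOUNT_USAGE schema):

```sql
-- Tables with large amounts of churned data held in Time Travel and Fail-safe:
SELECT table_catalog, table_schema, table_name,
       active_bytes, time_travel_bytes, failsafe_bytes
FROM snowflake.account_usage.table_storage_metrics
ORDER BY failsafe_bytes DESC
LIMIT 10;
```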
Question # 9
What type of columns does Snowflake recommend to be used as clustering keys? (Select TWO).
A.
A VARIANT column
B.
C.
D.
E.
Answer:
C, D
Explanation:
Snowflake recommends choosing clustering key columns that are most actively used in selective
filters or join predicates, and that have enough distinct values to allow effective micro-partition
pruning (extremely high cardinality columns, such as unique IDs or nanosecond timestamps, are
typically poor candidates on their own). Columns used in selective filters help prune the number
of micro-partitions to scan, thus improving query performance. References: Snowflake
documentation on clustering keys.
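A sketch of defining and inspecting a clustering key (table and column names hypothetical):

```sql
-- Cluster on columns commonly used in selective filters:
ALTER TABLE events CLUSTER BY (event_date, region);

-- Review the effect on micro-partition overlap:
SELECT SYSTEM$CLUSTERING_INFORMATION('events', '(event_date, region)');
```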