
Question 120/149

You are implementing row access policies on a 'SALES_DATA' table to restrict access based on the 'REGION' column. Different users are allowed to see data only for specific regions. You have a mapping table 'USER_REGION_MAP' with columns 'USERNAME' and 'REGION'. You want to create a row access policy that dynamically filters 'SALES_DATA' based on the current user and their allowed regions. Which of the following options represents a correct approach to creating and applying this row access policy?
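The answer options are not reproduced on this page, but a minimal sketch of one correct approach is shown below, using the table and column names from the question. The policy name 'region_policy' is an assumed placeholder, and the sketch assumes that 'USERNAME' in 'USER_REGION_MAP' stores values comparable to CURRENT_USER().

  -- Row access policy: a row is visible only when the current user is
  -- mapped to that row's region in USER_REGION_MAP.
  CREATE OR REPLACE ROW ACCESS POLICY region_policy
    AS (region_value STRING) RETURNS BOOLEAN ->
    EXISTS (
      SELECT 1
      FROM USER_REGION_MAP m
      WHERE m.USERNAME = CURRENT_USER()
        AND m.REGION = region_value
    );

  -- Attach the policy to the REGION column of SALES_DATA.
  ALTER TABLE SALES_DATA
    ADD ROW ACCESS POLICY region_policy ON (REGION);

Because the policy body is evaluated against the mapping table at query time, adding or removing a row in 'USER_REGION_MAP' changes which regions a user can see without modifying 'SALES_DATA' or the policy definition itself.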

