Creating Resource Based Policies
Concepts in Access Management
For conceptual background, see How Access Management Works.
Create and configure policies that control access to specific resources.
From the home page, click Access Management > Resource Policies.
Click a service in one of the service groups.
Click Add New Policy.
Configure the new resource policy.
Configuration Settings Common to All Policies
Policies contain access rules associated with a particular data source or a subset of it. Specific policy attributes differ depending on the policy type, but all policies contain the following attributes:
Policy Type: The basis for controlling access. For example, a policy can be based on the resource, on a tag, or on a scheme.
Policy Name: Policies are assigned a name, either by the system or by the portal user who creates them. Default, system-created policies can be renamed. The policy name must be unique and cannot be duplicated across the system.
Normal/Override: This option selects whether the policy is a 'Normal' or an 'Override' policy. If you select 'Override', the access permissions in this policy override the access permissions in existing policies.
Enable/Disable: By default, the policy is enabled. If the policy is not required, you can disable it by switching to 'Disabled' mode.
Policy Id: Each policy is assigned a numeric identifier. These IDs are incremented and unique within each account. Policy identifiers are referenced in audit trail event messages, so that every action recorded in the audit trail is associated with a specific policy.
Policy Label: A descriptive label that helps users find this policy when searching for policies and filtering policy lists.
Resource Specifier: The available specifiers differ for each type of resource, and the set of specifiers changes depending on the choices made at the levels above.
The autocomplete feature is available only if you have defined PolicySync connectors for the following services:
Postgres
Redshift
MSSQL
Snowflake
Databricks SQL
Validity Period: A policy can be defined to be effective only for a period of time, with start and end dates/times (defined to the minute) and a selectable time zone.
Description: A description of the policy that helps identify it among other policies.
Audit Logging: Enables or disables audit logging for this policy. Toggle to 'No' if the policy does not need to be audited. By default, it is set to 'Yes'.
Condition Sets: The rules that allow or deny access to a resource. Available permissions are specific to the type of service. There are four access conditions:
Allow Conditions
Exclude from Allow Conditions
Deny Conditions
Exclude from Deny Conditions
At least one rule must be defined. One or more default 'all...' policies are automatically created for each default service (those named "privacera_<service_type>"); their policy names reflect the type of service.
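The interplay of the four condition sets can be sketched as follows. This is an illustrative simplification assuming Apache Ranger-style precedence (deny conditions are evaluated before allow conditions, each with an exclude list that carves out exceptions); the function name is hypothetical, and the real evaluator also considers groups, roles, and individual permissions.

```python
def evaluate(user, allow, allow_excl, deny, deny_excl):
    """Illustrative sketch of four-condition-set evaluation.

    Each argument is a set of users the condition applies to.
    Deny conditions are checked first; each list's exclude set
    carves out exceptions to it.
    """
    if user in deny and user not in deny_excl:
        return "DENIED"
    if user in allow and user not in allow_excl:
        return "ALLOWED"
    return "DENIED"  # no matching rule: access is not granted

# A deny rule for 'alice' with 'alice' also in Exclude from Deny:
print(evaluate("alice", {"alice", "bob"}, set(), {"alice"}, {"alice"}))  # ALLOWED
print(evaluate("bob", {"alice", "bob"}, set(), {"bob"}, set()))          # DENIED
```

The sketch shows why "Exclude from Deny Conditions" wins over "Deny Conditions" for a specific user while the broader deny still applies to everyone else.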
Service-Specific Policy Configuration Settings
Service Name | Supported Policy Types
---|---
Hive, Presto, MS SQL, Postgres, Snowflake | Access, Masking, Row Level Filter
S3, DynamoDB, Athena, Glue, Redshift, Kinesis, Lambda, ADLS, Kafka, PowerBI, GCS, GBQ, and Files | Access
Hive
Database: Specify the database name.
Table/UDF: Specify the table or UDF name.
Column: Specify the column name.
Note
By default, the 'Include' option is selected to allow access for all the above fields. If you want to deny access, toggle to the 'Exclude' option.
URL: Specify the cloud storage path (for example, s3a://user/poc/sales.txt) when the end user needs permission to read or write Hive data from or to a cloud storage path.
Recursive
Non-recursive
Global: Specify global dataset.
Allow Conditions:
Policy Conditions: This option allows a user to add custom conditions that are evaluated during authorization requests.
Accessed Together?: This option allows access to the specified columns (minimum 2) only when all of them appear together in the query.
For example: default.employeepersonalview.EMP_SSN, default.employeepersonalview.CC
With this condition, a user can access the EMP_SSN and CC columns only when both are referenced together in the query; otherwise, a permission-denied error is returned.
Not Accessed Together?: This option denies access when the specified columns (minimum 2) appear together in the query.
For example: default.employeepersonalview.EMP_SSN, default.employeepersonalview.CC
With this condition, a user is denied access to the EMP_SSN and CC columns when both are referenced together in the query, and a permission-denied error is returned.
Permission: Add permissions as required. The available permissions are:
Select
Update
Create
Drop
Alter
Index
Lock
All
Read
Write
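The Accessed Together? and Not Accessed Together? conditions described above can be sketched as simple set checks. The function names are illustrative assumptions, not Privacera's implementation; the real plugin extracts the queried columns from the parsed query.

```python
def accessed_together_ok(queried_columns, required_columns):
    """'Accessed Together?': allow only when ALL listed columns
    appear together in the query."""
    return set(required_columns).issubset(queried_columns)

def not_accessed_together_ok(queried_columns, forbidden_columns):
    """'Not Accessed Together?': deny when ALL listed columns
    appear together in the query."""
    return not set(forbidden_columns).issubset(queried_columns)

pair = {"EMP_SSN", "CC"}
print(accessed_together_ok({"EMP_SSN", "CC", "NAME"}, pair))  # True: both present
print(accessed_together_ok({"EMP_SSN"}, pair))                # False: CC missing
print(not_accessed_together_ok({"EMP_SSN", "CC"}, pair))      # False: denied
```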
Hive - Masking Policy
Hive Database: Select the appropriate database. This field holds the list of Hive databases.
Hive Table: Select the appropriate table. This field holds the list of Hive tables.
Hive Column: Select the appropriate column. This field holds the list of Hive columns.
Masking Conditions:
Permissions: Tick the permission as 'Select'. At present, only 'Select' permission is available.
Select Masking Options: You can select only one masking option from the following list:
Redact: This option masks all the alphabetic characters with 'x' and all numeric characters with 'n'.
Partial mask: show last 4 – This option shows only the last four characters.
Partial mask: show first 4 – This option shows only the first four characters.
Hash: This option replaces the entire cell value with '#' characters.
Nullify: This option replaces all the characters with NULL value.
Unmasked (retain original value): This option is used when no masking is required.
Date: show only year: This option shows only the year portion of a date string and defaults the month and day to 01/01.
Custom: Use this option to specify a custom masked value or expression. Custom masking can use any valid Hive UDF (a Hive UDF that returns the same data type as the column being masked).
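A few of the masking options above can be illustrated with minimal sketches. These are illustrative only: the function names, the '*' mask character for partial masks, and the assumed YYYY-MM-DD input format are assumptions, not the actual Hive masking implementation.

```python
import re

def redact(value: str) -> str:
    """Redact: alphabetic characters -> 'x', numeric -> 'n' (others kept)."""
    return re.sub(r"[0-9]", "n", re.sub(r"[A-Za-z]", "x", value))

def partial_show_last4(value: str) -> str:
    """Partial mask: show last 4 - mask everything except the last four chars."""
    return "*" * max(len(value) - 4, 0) + value[-4:]

def date_show_year(date_str: str) -> str:
    """Date: show only year - keep the year, default month/day to 01/01
    (assumes a YYYY-MM-DD input string)."""
    return date_str[:4] + "-01-01"

print(redact("Card 1234"))                     # 'xxxx nnnn'
print(partial_show_last4("4111111111111111"))  # '************1111'
print(date_show_year("1987-06-15"))            # '1987-01-01'
```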
Hive - Row Level Filter
Hive Database: Enter the appropriate database name.
Hive Table: Enter the appropriate table name.
Row Level Conditions:
Permissions: Click Add Permissions and select 'Select'. At present, only the 'Select' permission is available.
Row Level Filter: Click Add Row Filter and enter a valid SQL predicate; the policy is applied to the selected roles/groups/users. Note: Row-level filtering works by adding the predicate to the query; if the predicate is not valid SQL, the query can fail. If you do not wish to apply a row-level filter, leave this field blank. In that case, only 'Select' access is applied.
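The note above says row-level filtering works by adding the predicate to the query. That can be sketched as a naive string rewrite; real plugins rewrite at the query-plan level, so this is only a conceptual illustration with a hypothetical helper name.

```python
def apply_row_filter(query: str, table: str, predicate: str) -> str:
    """Conceptual sketch: enforce a row filter by replacing the table
    with a filtered subquery aliased to the same name."""
    filtered = f"(SELECT * FROM {table} WHERE {predicate}) {table}"
    return query.replace(f"FROM {table}", f"FROM {filtered}")

q = apply_row_filter("SELECT name FROM employees", "employees", "region = 'EU'")
print(q)
# SELECT name FROM (SELECT * FROM employees WHERE region = 'EU') employees
```

This also shows why an invalid predicate fails: it is spliced directly into the SQL the engine executes.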
AWS S3
Bucket Name: Specify the bucket name. For example: aws-athena-query-result
Note: Wildcard characters such as '*' are allowed if you want to give access to all buckets.
Object Path: Specify the object path. It accepts wildcard characters such as '*'.
Recursive: This allows you to view multiple folders based on the mentioned object path.
Non-recursive: This allows you to view specific folders based on the mentioned object path.
Example:
If the bucket name is {bucket-AWS} and the object path is {path1}:
Sample 1: s3://bucket-AWS/path1/
Sample 2: s3://bucket-name/path1/path2/
If the Recursive toggle button is enabled (the default behavior), you can view all files within the path1 and path2 folders.
If the Recursive toggle button is disabled, you won't be able to view any files in the path1 folder.
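The recursive toggle in the example above can be sketched as a simple path-matching function. This is one plausible interpretation of the behavior described (the function name and exact semantics are assumptions; the real matcher also supports wildcards such as '*').

```python
def object_matches(policy_path: str, request_path: str, recursive: bool) -> bool:
    """Sketch of recursive vs. non-recursive object-path matching."""
    policy = policy_path.rstrip("/")
    request = request_path.rstrip("/")
    if recursive:
        # recursive: the path itself plus everything beneath it
        return request == policy or request.startswith(policy + "/")
    # non-recursive: only the exact object path itself
    return request == policy

print(object_matches("path1", "path1/path2/report.csv", recursive=True))   # True
print(object_matches("path1", "path1/report.csv", recursive=False))        # False
print(object_matches("path1", "path1", recursive=False))                   # True
```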
Allow Conditions:
Permissions:
Read: READ permission on the URL permits the user to perform HiveServer2 operations which use S3 as a data source for Hive tables.
Write: WRITE permission on the URL permits the user to perform HiveServer2 operations which write data to the specified S3 location.
Delete: DELETE permission allows you to delete the resource.
Metadata Read: METADATA READ permission allows you to run HEAD operations on objects. It also allows listing buckets, listing objects, and retrieving object metadata.
Metadata Write: METADATA WRITE permission allows you to modify an object's metadata, as well as its ACL, tagging, CORS configuration, and so on.
Admin: Administrators can edit or delete the policy, and can also create child policies based on the original policy.
Presto
Catalog: Specify the catalog name.
Schema: Specify the schema name.
Sessionproperty: Specify the session property.
Table: Specify the table name.
Procedure: Specify the procedure name.
Column: Specify the column name.
Prestouser:
Systemproperty:
Function:
Allow Conditions:
Permissions:
Select
Insert
Create
Drop
Delete
Use
Alter
Grant
Revoke
Show
Impersonate
All
Execute
Create View
Delegate Admin: Assign administrator rights to the roles, groups, or users specified in the policy. The administrator can edit or delete the policy, and can also create child policies based on the original policy.
Presto - Masking Policy
Presto Catalog
Presto Schema
Presto Table
Presto Column
Masking Conditions:
Permissions
Select: Tick the permission as 'Select'. At present, only 'Select' permission is available.
Select Masking Option: You can select only one masking option from the following list:
Redact: This option masks all the alphabetic characters with 'x' and all numeric characters with 'n'.
Partial mask: show last 4 – This option shows only the last four characters.
Partial mask: show first 4 – This option shows only the first four characters.
Hash: This option replaces the entire cell value with '#' characters.
Nullify: This option replaces all the characters with NULL value.
Unmasked (retain original value): This option is used when no masking is required.
Date: show only year: This option shows only the year portion of a date string and defaults the month and day to 01/01.
Custom: Use this option to specify a custom masked value or expression.
Presto - Row Level Filter
Presto Catalog
Presto Schema
Presto Table
Row Level Conditions:
Permissions: Click Add Permissions and select 'Select'. At present, only the 'Select' permission is available.
Row Level Filter: Click Add Row Filter and enter a valid SQL predicate; the policy is applied to the selected roles/groups/users. Note: Row-level filtering works by adding the predicate to the query. If the resulting query is not valid, it will fail.
DynamoDB
Table: Specify the table name.
Attribute: Specify the attribute name.
Allow Conditions
Permissions:
Read
Write
Create
Delete
List tables
Admin
Delegate Admin: Select 'Delegate Admin' to assign administrator rights to the roles, groups, or users specified in the policy. The administrator can edit or delete the policy, and can also create child policies based on the original policy.
Athena
Workgroup: Specify the workgroup name of Athena.
Data source: Specify the name of the data source.
Database: Specify the name of the database.
Table: Specify the name of the table.
Column: Specify the name of the column.
URL: Specify the cloud storage path. For example - s3a://user/poc/sales.txt where the end-user permission is needed to access the data from/to a cloud storage path.
Allow Conditions:
Permissions:
BatchGetNamedQuery
BatchGetQueryExecution
CreateNamedQuery
CreateWorkGroup
DeleteNamedQuery
DeleteWorkGroup
GetNamedQuery
GetQueryExecution
GetQueryResults
GetWorkGroup
ListNamedQueries
ListQueryExecutions
ListTagsForResource
ListWorkGroups
StartQueryExecution
StopQueryExecution
TagResource
UntagResource
UpdateWorkGroup
Alter
Create
Describe
Drop
Insert
MSCK Repair
Select
Show
ListDataCatalogs
Delegate Admin: Select 'Delegate Admin' to assign administrator rights to the roles, groups, or users specified in the policy. The administrator can edit or delete the policy, and can also create child policies based on the original policy.
Glue
Database: Specify the database name.
Table: Specify the table name.
Note: Wildcard characters such as '*' are allowed in the above fields.
Allow Conditions:
Permissions:
GetCatalogImportStatus
GetDatabases
GetDatabase
GetTables
GetTable
CreateTable
CreateDatabase
DeleteDatabase
DeleteTable
Delegate Admin: Select 'Delegate Admin' to assign administrator rights to the roles, groups, or users specified in the policy. The administrator can edit or delete the policy, and can also create child policies based on the original policy.
Redshift
Global: Specify the Redshift host IP. To get the Redshift host IP, connect to the Redshift environment and run this query: SELECT inet_server_addr() as host, inet_server_port() as port
Database: Specify the database name.
Schema: Specify the schema name.
Table: Specify the table name.
Column: Specify the column name.
Cluster: Specify the cluster IP.
Allow Condition:
Permissions:
Create Database
Create Schema
Usage Schema
Create Table
Select
Insert
Update
Delete
ListClusters
CreateCluster
UpdateCluster
DeleteCluster
ResizeCluster
PauseCluster
RebootCluster
CreateSnapshot
RestoreSnapshot
Delegate Admin: Select 'Delegate Admin' to assign administrator rights to the roles, groups, or users specified in the policy. The administrator can edit or delete the policy, and can also create child policies based on the original policy.
Kinesis
Kinesis_Datastream: Specify the datastream name.
Kinesis_Firehose: Specify the firehose name.
Allow Conditions:
Permissions:
PutRecord
CreateDeliveryStream
DeleteDeliveryStream
ListDeliveryStreams
UpdateDestination
PutRecordBatch
ListTagsForDeliveryStream
StartDeliveryStreamEncryption
StopDeliveryStreamEncryption
TagDeliveryStream
UntagDeliveryStream
Delegate Admin: Select 'Delegate Admin' to assign administrator rights to the roles, groups, or users specified in the policy. The administrator can edit or delete the policy, and can also create child policies based on the original policy.
Lambda
Function: Specify the function name of Lambda.
Layer: Specify the layer name of Lambda.
Note: You are allowed to enter wildcard characters such as '*'.
Allow Conditions:
Permissions:
ListAliases
ListEventSourceMappings
ListFunctionEventInvokeConfigs
ListFunctions
ListLayers
ListLayerVersions
ListProvisionedConcurrencyConfigs
ListVersionsByFunction
GetAccountSettings
GetAlias
GetEventSourceMapping
GetFunction
GetFunctionConcurrency
GetFunctionConfiguration
GetFunctionEventInvokeConfig
GetLayerVersion
GetLayerVersionByArn
GetLayerVersionPolicy
GetPolicy
GetProvisionedConcurrencyConfig
ListTags
CreateAlias
CreateEventSourceMapping
CreateFunction
DeleteAlias
DeleteEventSourceMapping
DeleteFunction
DeleteFunctionConcurrency
DeleteFunctionEventInvokeConfig
DeleteLayerVersion
DeleteProvisionedConcurrencyConfig
InvokeFunction
PublishLayerVersion
PublishVersion
PutFunctionConcurrency
PutFunctionEventInvokeConfig
PutProvisionedConcurrencyConfig
TagResource
UntagResource
UpdateAlias
UpdateEventSourceMapping
UpdateFunctionCode
UpdateFunctionConfiguration
UpdateFunctionEventInvokeConfig
AddLayerVersionPermission
AddPermission
RemoveLayerVersionPermission
RemovePermission
Delegate Admin: Select 'Delegate Admin' to assign administrator rights to the roles, groups, or users specified in the policy. The administrator can edit or delete the policy, and can also create child policies based on the original policy.
MSSQL
Database
Schema
Table
Column
Allow Conditions:
Permissions
Create Database
Create Schema
Create Table
Select
Insert
Update
Delete
Delegate Admin: Select 'Delegate Admin' to assign administrator rights to the roles, groups, or users specified in the policy. The administrator can edit or delete the policy, and can also create child policies based on the original policy.
MSSQL - Masking Policy
Database
Schema
Table
Column
Masking Conditions:
Permissions
Select
Select Masking Options:
Default
Nullify: This option replaces all the characters with NULL value.
Unmasked: This option is used when no masking is required.
Custom: Use this option to specify a custom masked value or expression.
MSSQL - Row Level Filter
Database
Schema
Table
Row Level Conditions:
Permissions: Click Add Permissions and select 'Select'. At present, only the 'Select' permission is available.
Row Level Filter: Click Add Row Filter and enter a valid SQL predicate; the policy is applied to the selected roles/groups/users. Note: Row-level filtering works by adding the predicate to the query. If the resulting query is not valid, it will fail.
ADLS
Account Name
Container Name
Object Path
Allow Conditions:
Permissions:
Read: READ permission on the URL permits the user to perform HiveServer2 operations that use ADLS as a data source for Hive tables.
Write: WRITE permission on the URL permits the user to perform HiveServer2 operations that write data to the specified ADLS location.
Delete: DELETE permission allows you to delete the resource.
Metadata Read: METADATA READ permission allows you to run HEAD operations on objects. It also allows listing containers, listing objects, and retrieving object metadata.
Metadata Write: METADATA WRITE permission allows you to modify an object's metadata, as well as its ACL, tagging, CORS configuration, and so on.
Admin: Administrators can edit or delete the policy, and can also create child policies based on the original policy.
Delegate Admin: Select 'Delegate Admin' to assign administrator rights to the roles, groups, or users specified in the policy. The administrator can edit or delete the policy, and can also create child policies based on the original policy.
Postgres
Global
Database
Schema
Table
Column
Allow Conditions:
Permissions:
Create Database
Connect Database
Create Schema
Usage Schema
Create Table
Select
Insert
Update
Delete
Truncate
Delegate Admin: Select 'Delegate Admin' to assign administrator rights to the roles, groups, or users specified in the policy. The administrator can edit or delete the policy, and can also create child policies based on the original policy.
Postgres - Masking Policy
Database
Schema
Table
Column
Masking Conditions:
Permissions
Select
Select Masking Option:
Default:
Nullify: This option replaces all the characters with NULL value.
Unmasked: This option is used when no masking is required.
Custom: Use this option to specify a custom masked value or expression.
Postgres - Row Level Filter
Database
Schema
Table
Row Level Conditions:
Permissions: Click Add Permissions and select 'Select'. At present, only the 'Select' permission is available.
Row Level Filter: Click Add Row Filter and enter a valid SQL predicate; the policy is applied to the selected roles/groups/users. Note: Row-level filtering works by adding the predicate to the query. If the resulting query is not valid, it will fail.
Kafka
Topic
Transactionalid
Cluster
Delegationtoken
Consumergroup
Policy Conditions
Add Conditions
Allow Conditions:
Policy Conditions
Add Conditions
Permissions
Consume
Describe
Delete
Delegate Admin: Select 'Delegate Admin' to assign administrator rights to the roles, groups, or users specified in the policy. The administrator can edit or delete the policy, and can also create child policies based on the original policy.
Snowflake
Warehouse: Specify the warehouse name of Snowflake. When you select warehouse, the following warehouse permissions will be displayed in the Allow Conditions > Permissions section:
Operate
UseWarehouse
Monitor
Modify
Database: Specify the database name. When you select database, the following database permissions will be displayed in the Allow Conditions > Permissions section:
CreateSchema
UseDB
Schema: Specify the schema name. When you select schema along with database, the following schema permissions will be displayed in the Allow Conditions > Permissions section:
CreateTmpTable
CreateTable
UseSchema
CreateStream
CreateFunction
CreateProcedure
CreateSequence
CreatePipe
CreateFileFormat
CreateStage
CreateExternalTable
Table: Specify the table name. When you select table along with database and schema, the following table permissions will be displayed in the Allow Conditions > Permissions section:
Select
Insert
Update
Delete
Truncate
References
Stream: Specify the stream that you have created over standard tables. When you select stream along with database and schema, the following stream permission will be displayed in the Allow Conditions > Permissions section:
Select
Function: Specify the function. When you select function along with database and schema, the following function permission will be displayed in the Allow Conditions > Permissions section:
Usage
Procedure: Specify the Snowflake stored procedure. When you select procedure along with database and schema, the following procedure permission will be displayed in the Allow Conditions > Permissions section:
Usage
File_Format: Specify the file format for SQL statements. When you select file_format along with database and schema, the following file_format permission will be displayed in the Allow Conditions > Permissions section:
Usage
Pipe: Specify pipe objects that are created and managed to load data using Snowpipe. When you select pipe along with database and schema, the following pipe permissions will be displayed in the Allow Conditions > Permissions section:
Operate
Monitor
External_stage: Specify external storage, which is the object storage of the cloud platform. When you select external_stage along with database and schema, the following external_stage permission will be displayed in the Allow Conditions > Permissions section:
Usage
Internal_stage: Specify internal storage, which is the database storage. When you select internal_stage along with database and schema, the following internal_stage permissions will be displayed in the Allow Conditions > Permissions section:
Read
Write
Sequence: Specify Snowflake sequence objects. When you select sequence along with database and schema, the following sequence permission will be displayed in the Allow Conditions > Permissions section:
Usage
Column: Specify the column name. When you select column along with database, schema, and table, the following column permissions will be displayed in the Allow Conditions > Permissions section:
Select
Insert
Update
Delete
Truncate
References
Global: Specify the Snowflake account name. To get the Snowflake account name, connect to the Snowflake environment and run this query: select current_account() as account
When you select global, the following global permissions will be displayed in the Allow Conditions > Permissions section:
CreateWarehouse
CreateDatabase
Delegate Admin: Select the Delegate Admin checkbox to assign administrator rights to the roles, groups, or users specified in the policy. The administrator can edit or delete the policy, and can also create child policies based on the original policy.
Note
When you create a policy for a table with UPDATE and DELETE permissions granted to a user/group/role, you must choose the SELECT permission along with it.
Snowflake - Masking Policy
Database: Specify the database name.
Schema: Specify the schema name.
Table/View: Specify the table or view name.
Column: Specify the column name.
Masking Conditions:
Permissions: Tick the permission as 'Select'. At present, only 'Select' permission is available.
Select Masking Option: If a masking option is applied to a data type that is not supported, then the default masking value is applied. You are allowed to select only one masking option from the following list:
Default: This option masks the column with the default value specified for its data type.
The following are the default data type property values:
SNOWFLAKE_MASKED_NUMBER_VALUE=0
SNOWFLAKE_MASKED_DOUBLE_VALUE=0
SNOWFLAKE_MASKED_TEXT_VALUE='{{MASKED}}'
Hash: Returns a hex-encoded string containing the N-bit SHA-2 hash of the value in the column, where N is the specified output digest size.
Internal Function: SHA2({col})
Supported Data Type: Text
For more information see Snowflake Documentation.
Nullify: This option replaces all the characters with NULL value.
Supported Data Type: All Data Types
Unmasked (retain original value): This option is used when no masking is required.
Supported Data Type: All Data Types
Regular expression:
Internal Function: regexp_replace({col},'{value_or_expr}','{replace_value}')
Supported Data Type: Text
For more information see Snowflake Documentation.
Literal mask: This option replaces the entire cell value with a given character.
Supported Data Type: Text
Partial mask: show last 4 - This option shows only the last four characters.
Internal Function: regexp_replace({col},'(..)(.{4})(.)','***\2')
Supported Data Type: Text
For more information see Snowflake Documentation.
Partial mask: show first 4 - This option shows only the first four characters.
Internal Function: regexp_replace({col},'.','*','5')
Supported Data Type: Text
For more information see Snowflake Documentation.
Protect:
Supported Data Type: Text
For more information see /protect.
Unprotect:
Supported Data Type: Text
For more information see /unprotect.
Custom: Use this option to specify a custom masked value or expression.
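The mapping from masking option to the internal Snowflake functions listed above can be sketched as a small lookup that builds the SQL expression for a column. The function and option names here are hypothetical conveniences, and the SQL actually generated by PolicySync may differ; the expressions themselves are the internal functions documented above.

```python
def masking_expression(option: str, col: str,
                       pattern: str = "", replacement: str = "") -> str:
    """Build the Snowflake SQL expression for a masking option
    (illustrative sketch based on the internal functions above)."""
    expressions = {
        "hash": f"SHA2({col})",
        "nullify": "NULL",
        "unmasked": col,
        "regex": f"regexp_replace({col},'{pattern}','{replacement}')",
        "show_last_4": f"regexp_replace({col},'(..)(.{{4}})(.)','***\\2')",
        "show_first_4": f"regexp_replace({col},'.','*','5')",
    }
    return expressions[option]

print(masking_expression("hash", "EMP_SSN"))     # SHA2(EMP_SSN)
print(masking_expression("nullify", "EMP_SSN"))  # NULL
print(masking_expression("show_first_4", "EMP_SSN"))
```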
Snowflake - Row Level Filter
Database: Specify the database name.
Schema: Specify the schema name.
Table: Specify the table name.
Row Level Conditions:
Permissions: Click Add Permissions and select 'Select'. At present, only the 'Select' permission is available.
Row Level Filter: Click Add Row Filter and enter a valid SQL predicate; the policy is applied to the selected roles/groups/users. Note: Row-level filtering works by adding the predicate to the query. If the resulting query is not valid, it will fail.
PowerBI
Workspace
Allow Conditions:
Permissions
Contributor
Member
Admin
None
Delegate Admin: Select 'Delegate Admin' to assign administrator rights to the roles, groups, or users specified in the policy. The administrator can edit or delete the policy, and can also create child policies based on the original policy.
GCS
Project ID
Bucket Name
Object Path
Recursive/Non-recursive:
Allow Conditions
Permissions:
Read: READ permission on the URL permits the user to perform HiveServer2 operations that use GCS as a data source for Hive tables.
Write: WRITE permission on the URL permits the user to perform HiveServer2 operations that write data to the specified GCS location.
Delete: DELETE permission allows you to delete the resource.
Metadata Read: METADATA READ permission allows you to run HEAD operations on objects. It also allows listing buckets, listing objects, and retrieving object metadata.
Metadata Write: METADATA WRITE permission allows you to modify an object's metadata, as well as its ACL, tagging, CORS configuration, and so on.
Admin: Administrators can edit or delete the policy, and can also create child policies based on the original policy.
Delegate Admin: Select 'Delegate Admin' to assign administrator rights to the roles, groups, or users specified in the policy. The administrator can edit or delete the policy, and can also create child policies based on the original policy.
GBQ
Project ID
Dataset Name
TableName
Column Name
Allow Conditions
Permissions
CreateTable
CreateTableAsSelect
CreateView
Delete
DropTable
DropView
Insert
Query
Update
Delegate Admin: Select 'Delegate Admin' to assign administrator rights to the roles, groups, or users specified in the policy. The administrator can edit or delete the policy, and can also create child policies based on the original policy.
Files
Resource Path
Recursive/Non-Recursive:
Allow Conditions
Permissions
Read
Write
Delegate Admin: Select 'Delegate Admin' to assign administrator rights to the roles, groups, or users specified in the policy. The administrator can edit or delete the policy, and can also create child policies based on the original policy.
Databricks
By default, the Databricks File System (DBFS) is protected by Privacera. This blocks common tasks such as adding JARs/libraries to the cluster. For example, when you try to install a library into a protected DBFS cluster, the following exception is displayed:
Exception
Exception while installing a Jar in Databricks Cluster with Plugin enabled? java.lang.RuntimeException: ManagedLibraryInstallFailed: java.security.AccessControlException: Access denied for resource [dbfs:/local_disk0/tmp/addedFile4604599454488620309privacera_crypto_jar_with_dependencies-eba20.jar] action [READ] for library:JavaJarId(dbfs:/privacera/crypto/jars/privacera-crypto-jar-with-dependencies.jar,,NONE),isSharedLibrary=false
To grant permissions to read/write on DBFS, you need to create an access policy. Access to DBFS will be audited.
To create an access policy for Databricks, do the following:
Go to Access Management > Resource Policies > privacera_files.
Click Add New Policy.
Enter the following details:
Policy Name: Access to Temporary Folder for adding libraries
Resource: dbfs:/local_disk0/tmp
Note
Make sure the recursive box next to the Resource field is checked.
Group: public
Permission: read & write
Note
The above policy gives permission to all users. If you want to restrict access to only certain users, then instead of granting permission to the group public, grant it to the appropriate users or groups.
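The policy above can also be expressed as a JSON payload in the Apache Ranger public v2 policy format, which Privacera's Ranger-based access management generally accepts. This is a sketch only: the endpoint URL, credentials, and exact field expectations are deployment-specific assumptions you should verify against your installation.

```python
import json

# Sketch of the policy above in an Apache Ranger-style policy model.
# Service name and resource path come from the steps above; the
# endpoint and credentials below are deployment-specific assumptions.
policy = {
    "service": "privacera_files",
    "name": "Access to Temporary Folder for adding libraries",
    "isEnabled": True,
    "isAuditEnabled": True,
    "resources": {
        "path": {"values": ["dbfs:/local_disk0/tmp"],
                 "isRecursive": True}
    },
    "policyItems": [{
        "groups": ["public"],
        "accesses": [{"type": "read", "isAllowed": True},
                     {"type": "write", "isAllowed": True}],
    }],
}
print(json.dumps(policy, indent=2))

# To submit it (requires the `requests` package and valid credentials):
# requests.post("https://<ranger-host>/service/public/v2/api/policy",
#               json=policy, auth=("admin", "<password>"))
```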