Azure Discovery
This topic describes how to set up the Azure configuration required to install Privacera Discovery.
Prerequisites
Ensure the following prerequisites are met:
Azure storage account
Create an Azure storage account. For more information, refer to Microsoft's documentation Create a storage account.
Create a private-access container. For more information, refer to Microsoft's documentation Create a container.
Get the access key. For more information, refer to Microsoft's documentation View account access keys.
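If you prefer the Azure CLI, a minimal sketch of these three steps is shown below. The resource group, storage account, and container names are placeholders, not values required by Privacera; replace them with your own.

```bash
# Hypothetical names; replace the resource group, account, and container with your own.
az storage account create \
  --name azurestorage \
  --resource-group resource1 \
  --location eastus \
  --sku Standard_LRS

# Create a private-access container (no anonymous access).
az storage container create \
  --name container1 \
  --account-name azurestorage \
  --public-access off

# Retrieve an account access key (used later for DISCOVERY_AZURE_STORAGE_ACCOUNT_KEY).
az storage account keys list \
  --account-name azurestorage \
  --resource-group resource1 \
  --query "[0].value" -o tsv
```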
Azure Cosmos DB account
Create an Azure Cosmos DB account. For more information, refer to Microsoft's documentation Cosmos DB.
Get the URI from the Overview section.
Get the Primary Key from the Settings > Keys section.
Set the consistency to Strong in the Settings > Default Consistency section.
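The equivalent Azure CLI calls are sketched below. The account and resource group names are illustrative only; the portal steps above remain the reference.

```bash
# Hypothetical account name in an existing resource group; Strong consistency as required above.
az cosmosdb create \
  --name database1 \
  --resource-group resource1 \
  --default-consistency-level Strong

# URI for DISCOVERY_COSMOSDB_URL.
az cosmosdb show \
  --name database1 \
  --resource-group resource1 \
  --query documentEndpoint -o tsv

# Primary key for DISCOVERY_COSMOSDB_KEY.
az cosmosdb keys list \
  --name database1 \
  --resource-group resource1 \
  --query primaryMasterKey -o tsv
```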
For Terraform
Assign permissions to create Azure resources using a managed identity. For more information, refer to Create Azure resources.
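As an illustration, granting the instance's managed identity rights to create resources in the target resource group might look like the following. The principal ID, subscription, role, and scope are placeholders; follow the Create Azure resources reference for the exact permissions your deployment needs.

```bash
# Hypothetical values: allow the managed identity to create resources in the resource group.
az role assignment create \
  --assignee <managed-identity-principal-id> \
  --role Contributor \
  --scope /subscriptions/<subscription-id>/resourceGroups/resource1
```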
CLI configuration
SSH to the instance where Privacera is installed.
Configure your environment.
If you are running Discovery in a Kubernetes environment, you need to set the Kubernetes cluster name. For more information, see Discovery (Kubernetes Mode).
For a Docker environment, you can skip this step.
Run the following commands.
```bash
cd ~/privacera/privacera-manager
cp config/sample-vars/vars.kafka.yml config/custom-vars
vi config/custom-vars/vars.kafka.yml
```
Run the following commands.
```bash
cd ~/privacera/privacera-manager
cp config/sample-vars/vars.discovery.azure.yml config/custom-vars
vi config/custom-vars/vars.discovery.azure.yml
```
Edit the following properties. For property details and descriptions, refer to the Configuration properties section below.
```yaml
DISCOVERY_FS_PREFIX: "<PLEASE_CHANGE>"
DISCOVERY_AZURE_STORAGE_ACCOUNT_NAME: "<PLEASE_CHANGE>"
DISCOVERY_COSMOSDB_URL: "<PLEASE_CHANGE>"
DISCOVERY_COSMOSDB_KEY: "<PLEASE_CHANGE>"
DISCOVERY_AZURE_STORAGE_ACCOUNT_KEY: "<PLEASE_CHANGE>"
CREATE_AZURE_RESOURCES: "false"
DISCOVERY_AZURE_RESOURCE_GROUP: "<PLEASE_CHANGE>"
DISCOVERY_AZURE_COSMOS_DB_ACCOUNT: "<PLEASE_CHANGE>"
DISCOVERY_AZURE_LOCATION: "<PLEASE_CHANGE>"
```
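For reference, a non-interactive way to set these properties is sketched below, using the example values from the Configuration properties table. The values are illustrative, not real credentials, and this assumes these are the only properties you need to set; otherwise, edit the copied sample file with vi as shown above.

```bash
# Illustrative values only, matching the examples in the table below.
cat > config/custom-vars/vars.discovery.azure.yml <<'EOF'
DISCOVERY_FS_PREFIX: "container1"
DISCOVERY_AZURE_STORAGE_ACCOUNT_NAME: "azurestorage"
DISCOVERY_COSMOSDB_URL: "https://url1.documents.azure.com:443/"
DISCOVERY_COSMOSDB_KEY: "xavosdocof"
DISCOVERY_AZURE_STORAGE_ACCOUNT_KEY: "GMi0xftgifp=="
CREATE_AZURE_RESOURCES: "false"
DISCOVERY_AZURE_RESOURCE_GROUP: "resource1"
DISCOVERY_AZURE_COSMOS_DB_ACCOUNT: "database1"
DISCOVERY_AZURE_LOCATION: "eastus"
EOF
```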
(Optional) If you want to customize Discovery configuration further, you can add custom Discovery properties. For more information, refer to Discovery Custom Properties.
For example, by default, the username and password for the Discovery service are padmin/padmin. If you want to change them, refer to Add Custom Properties.
To configure real-time scan for audits, refer to Pkafka.
Run the following commands.
```bash
cd ~/privacera/privacera-manager
./privacera-manager.sh update
```
Configuration properties
| Property | Description | Example |
|---|---|---|
| DISCOVERY_ENABLE | In the **Basic** tab, enable/disable Privacera Discovery. | |
| DISCOVERY_REALTIME_ENABLE | In the **Basic** tab, enable/disable real-time scan in Privacera Discovery. For real-time scan to work, ensure the Pkafka/Event Hub properties below are configured. | |
| DISCOVERY_FS_PREFIX | Enter the container name. Get it from the Prerequisites section. | container1 |
| DISCOVERY_AZURE_STORAGE_ACCOUNT_NAME | Enter the name of the Azure storage account. Get it from the Prerequisites section. | azurestorage |
| DISCOVERY_COSMOSDB_URL | Enter the Cosmos DB URI. Get it from the Prerequisites section. | https://url1.documents.azure.com:443/ |
| DISCOVERY_COSMOSDB_KEY | Enter the Cosmos DB Primary Key. Get it from the Prerequisites section. | xavosdocof |
| DISCOVERY_AZURE_STORAGE_ACCOUNT_KEY | Enter the access key of the storage account. Get it from the Prerequisites section. | GMi0xftgifp== |
| [Properties of Topic and Table names](../pm-ig/customize_topic_and_tables_names.md) | Topic and table names are assigned by default in Privacera Discovery. To customize any topic or table name, refer to the link. | |
| PKAFKA_EVENT_HUB | In the **Advanced > Pkafka Configuration** section, enter the Event Hub name. Get it from the Prerequisites section. | eventhub1 |
| PKAFKA_EVENT_HUB_NAMESPACE | In the **Advanced > Pkafka Configuration** section, enter the name of the Event Hub namespace. Get it from the Prerequisites section. | eventhubnamespace1 |
| PKAFKA_EVENT_HUB_CONSUMER_GROUP | In the **Advanced > Pkafka Configuration** section, enter the name of the consumer group. Get it from the Prerequisites section. | congroup1 |
| PKAFKA_EVENT_HUB_CONNECTION_STRING | In the **Advanced > Pkafka Configuration** section, enter the connection string. Get it from the Prerequisites section. | Endpoint=sb://eventhub1.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=sAmPLEP/8PytEsT= |
| CREATE_AZURE_RESOURCES | Set to true if Terraform should create the Azure resources. Default: false. | true |
| DISCOVERY_AZURE_RESOURCE_GROUP | Enter the Azure resource group. Get the value from the Prerequisites section. | resource1 |
| DISCOVERY_AZURE_COSMOS_DB_ACCOUNT | Enter the Cosmos DB account name. Get the value from the Prerequisites section. | database1 |
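If you need to look up the value for PKAFKA_EVENT_HUB_CONNECTION_STRING, one way is via the Azure CLI, as sketched below. The namespace and resource group names are placeholders from the examples above, and your Event Hubs setup (see the Pkafka reference) may use a different authorization rule.

```bash
# Hypothetical namespace and resource group; RootManageSharedAccessKey is the
# default namespace-level authorization rule.
az eventhubs namespace authorization-rule keys list \
  --resource-group resource1 \
  --namespace-name eventhubnamespace1 \
  --name RootManageSharedAccessKey \
  --query primaryConnectionString -o tsv
```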