Configure Kafka destination
This topic describes how to configure a Kafka audit endpoint in AuditServer so that the Ranger plugin and Ranger Admin can send audits to Kafka.
Prerequisites
Ensure the following prerequisites are met:
AuditServer must be configured. For more information, see the AuditServer configuration topic.
CLI configuration
SSH to an instance where Privacera is installed.
Run the following commands.
```
cd ~/privacera/privacera-manager
cp config/sample-vars/vars.auditserver.kafka.destination.yml config/custom-vars/
vi config/custom-vars/vars.auditserver.kafka.destination.yml
```
Modify the properties. For property details and descriptions, refer to the Configuration properties section below.
```yaml
AUDITSERVER_KAFKA_DESTINATION: "<PLEASE_CHANGE>"
AUDITSERVER_KAFKA_BROKER_LIST: "<PLEASE_CHANGE>"
AUDITSERVER_KAFKA_TOPIC_NAME: "<PLEASE_CHANGE>"
AUDITSERVER_KAFKA_SECURITY_PROTOCOL: "<PLEASE_CHANGE>"
AUDITSERVER_KAFKA_SSL_KEYSTORE_LOCATION: "<PLEASE_CHANGE>"
AUDITSERVER_KAFKA_SSL_KEYSTORE_PASSWORD: "<PLEASE_CHANGE>"
AUDITSERVER_KAFKA_SSL_KEY_PASSWORD: "<PLEASE_CHANGE>"
AUDITSERVER_KAFKA_SSL_TRUSTSTORE_LOCATION: "<PLEASE_CHANGE>"
AUDITSERVER_KAFKA_SSL_TRUSTSTORE_PASSWORD: "<PLEASE_CHANGE>"
AUDITSERVER_KAFKA_SASL_JAAS_CONFIG: "<PLEASE_CHANGE>"
AUDITSERVER_KAFKA_SASL_MECHANISM: "<PLEASE_CHANGE>"
AUDITSERVER_KAFKA_SASL_LOGIN_CALLBACK_HANDLER_CLASS: "<PLEASE_CHANGE>"
AUDITSERVER_KAFKA_OAUTH_TOKEN_ENDPOINT_URI: "<PLEASE_CHANGE>"
AUDITSERVER_KAFKA_OAUTH_WITH_SSL: "<PLEASE_CHANGE>"
AUDITSERVER_OAUTH_ACCEPT_UNSECURE_SERVER: "<PLEASE_CHANGE>"
AUDITSERVER_OAUTH_LOGIN_GRANT_TYPE: "<PLEASE_CHANGE>"
AUDITSERVER_KAFKA_OAUTH_CLIENT_ID: "<PLEASE_CHANGE>"
AUDITSERVER_KAFKA_OAUTH_CLIENT_SECRET: "<PLEASE_CHANGE>"
AUDITSERVER_KAFKA_BATCH_FILESPOOL_DIR: "/workdir/privacera-audit-server/kafka-spool"
ADMIN_AUDITSERVER_KAFKA_DESTINATION: "<PLEASE_CHANGE>"
ADMIN_AUDITSERVER_KAFKA_BROKER_LIST: "<PLEASE_CHANGE>"
ADMIN_AUDITSERVER_KAFKA_TOPIC_NAME: "<PLEASE_CHANGE>"
ADMIN_AUDITSERVER_KAFKA_SECURITY_PROTOCOL: "<PLEASE_CHANGE>"
ADMIN_AUDITSERVER_KAFKA_SSL_KEYSTORE_LOCATION: "<PLEASE_CHANGE>"
ADMIN_AUDITSERVER_KAFKA_SSL_KEYSTORE_PASSWORD: "<PLEASE_CHANGE>"
ADMIN_AUDITSERVER_KAFKA_SSL_KEY_PASSWORD: "<PLEASE_CHANGE>"
ADMIN_AUDITSERVER_KAFKA_SSL_TRUSTSTORE_LOCATION: "<PLEASE_CHANGE>"
ADMIN_AUDITSERVER_KAFKA_SSL_TRUSTSTORE_PASSWORD: "<PLEASE_CHANGE>"
ADMIN_AUDITSERVER_KAFKA_SASL_JAAS_CONFIG: "<PLEASE_CHANGE>"
ADMIN_AUDITSERVER_KAFKA_SASL_MECHANISM: "<PLEASE_CHANGE>"
ADMIN_AUDITSERVER_KAFKA_SASL_LOGIN_CALLBACK_HANDLER_CLASS: "<PLEASE_CHANGE>"
ADMIN_AUDITSERVER_KAFKA_OAUTH_TOKEN_ENDPOINT_URI: "<PLEASE_CHANGE>"
ADMIN_AUDITSERVER_KAFKA_OAUTH_WITH_SSL: "<PLEASE_CHANGE>"
ADMIN_AUDITSERVER_OAUTH_ACCEPT_UNSECURE_SERVER: "<PLEASE_CHANGE>"
ADMIN_AUDITSERVER_OAUTH_LOGIN_GRANT_TYPE: "<PLEASE_CHANGE>"
ADMIN_AUDITSERVER_KAFKA_OAUTH_CLIENT_ID: "<PLEASE_CHANGE>"
ADMIN_AUDITSERVER_KAFKA_OAUTH_CLIENT_SECRET: "<PLEASE_CHANGE>"
ADMIN_AUDITSERVER_KAFKA_BATCH_FILESPOOL_DIR: "/workdir/privacera-audit-server/kafka-spool"
```
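As an illustration only, a filled-in fragment for a SASL_SSL setup with the OAUTHBEARER mechanism might look like the following. The broker address, topic name, token endpoint, and client credentials are placeholder assumptions, not values from your environment; the ADMIN_-prefixed counterparts would be filled in the same way.

```yaml
AUDITSERVER_KAFKA_DESTINATION: "true"
AUDITSERVER_KAFKA_BROKER_LIST: "10.0.0.10:9093"
AUDITSERVER_KAFKA_TOPIC_NAME: "privacera-audits"
AUDITSERVER_KAFKA_SECURITY_PROTOCOL: "SASL_SSL"
AUDITSERVER_KAFKA_SSL_TRUSTSTORE_LOCATION: "kafka.server.truststore"
AUDITSERVER_KAFKA_SSL_TRUSTSTORE_PASSWORD: "privacera"
AUDITSERVER_KAFKA_SASL_JAAS_CONFIG: "org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required ;"
AUDITSERVER_KAFKA_SASL_MECHANISM: "OAUTHBEARER"
AUDITSERVER_KAFKA_SASL_LOGIN_CALLBACK_HANDLER_CLASS: "io.strimzi.kafka.oauth.client.JaasClientOauthLoginCallbackHandler"
AUDITSERVER_KAFKA_OAUTH_TOKEN_ENDPOINT_URI: "https://oauth.example.com/oauth2/token"
AUDITSERVER_KAFKA_OAUTH_WITH_SSL: "true"
AUDITSERVER_OAUTH_LOGIN_GRANT_TYPE: "client_credentials"
AUDITSERVER_KAFKA_OAUTH_CLIENT_ID: "broker-kafka"
AUDITSERVER_KAFKA_OAUTH_CLIENT_SECRET: "broker-kafka-secret"
AUDITSERVER_KAFKA_BATCH_FILESPOOL_DIR: "/workdir/privacera-audit-server/kafka-spool"
```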
Run the following commands.
```
cd ~/privacera/privacera-manager
./privacera-manager.sh update
```
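Before running the update, it can be useful to confirm that no property is still set to the `<PLEASE_CHANGE>` placeholder. The following is a minimal sketch (not part of Privacera tooling) that scans YAML text for leftover placeholders; the parsing regex and sample values are assumptions for illustration.

```python
# Illustrative helper (an assumption, not shipped with Privacera Manager):
# report properties in a custom-vars YAML file whose value still contains
# the "<PLEASE_CHANGE>" placeholder.
import re

PLACEHOLDER = "<PLEASE_CHANGE>"

def unset_properties(yaml_text: str) -> list:
    """Return property names whose value still contains the placeholder."""
    unset = []
    for line in yaml_text.splitlines():
        match = re.match(r'\s*([A-Z0-9_]+)\s*:\s*"?(.*?)"?\s*$', line)
        if match and PLACEHOLDER in match.group(2):
            unset.append(match.group(1))
    return unset

sample = '''
AUDITSERVER_KAFKA_DESTINATION: "true"
AUDITSERVER_KAFKA_BROKER_LIST: "<PLEASE_CHANGE>"
AUDITSERVER_KAFKA_TOPIC_NAME: "privacera-audits"
'''
print(unset_properties(sample))  # ['AUDITSERVER_KAFKA_BROKER_LIST']
```

In practice you would read the file under `config/custom-vars/` instead of the inline sample and abort the update if the returned list is non-empty.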
Configuration properties
The property names prefixed with ADMIN_ refer to Privacera Ranger Admin, whereas the others refer to Privacera Portal.
Property | Description | Example |
---|---|---|
AUDITSERVER_KAFKA_DESTINATION ADMIN_AUDITSERVER_KAFKA_DESTINATION | Set to true if the audit destination is Kafka. | |
AUDITSERVER_KAFKA_BROKER_LIST ADMIN_AUDITSERVER_KAFKA_BROKER_LIST | A list of host/port pairs to use for establishing the initial connection to the Kafka cluster. This list should be in the form host1:port1,host2:port2,.... Since these servers are just used for the initial connection to discover the full cluster membership (which may change dynamically), this list need not contain the full set of servers (you may want more than one, though, in case a server is down). | 10.xxx.xx.xxx:9093 |
AUDITSERVER_KAFKA_TOPIC_NAME ADMIN_AUDITSERVER_KAFKA_TOPIC_NAME | Topic name to which audits are to be sent | topic-name |
AUDITSERVER_KAFKA_SECURITY_PROTOCOL ADMIN_AUDITSERVER_KAFKA_SECURITY_PROTOCOL | Protocol used to communicate with brokers. Valid values are: PLAINTEXT, SSL, SASL_PLAINTEXT, SASL_SSL. | SASL_SSL |
AUDITSERVER_KAFKA_SSL_KEYSTORE_LOCATION ADMIN_AUDITSERVER_KAFKA_SSL_KEYSTORE_LOCATION | The location of the keystore file. Make sure the keystore file is copied to the config/ssl folder, and provide the name of the file. | kafka.server.keystore |
AUDITSERVER_KAFKA_SSL_KEYSTORE_PASSWORD ADMIN_AUDITSERVER_KAFKA_SSL_KEYSTORE_PASSWORD | The store password for the keystore file. This is optional and only needed if AUDITSERVER_KAFKA_SSL_KEYSTORE_LOCATION is configured. | privacera |
AUDITSERVER_KAFKA_SSL_KEY_PASSWORD ADMIN_AUDITSERVER_KAFKA_SSL_KEY_PASSWORD | The password of the private key in the keystore file. This is optional. | privacera |
AUDITSERVER_KAFKA_SSL_TRUSTSTORE_LOCATION ADMIN_AUDITSERVER_KAFKA_SSL_TRUSTSTORE_LOCATION | The location of the truststore file. Make sure the truststore file is copied to the config/ssl folder, and provide the name of the file. | kafka.server.truststore |
AUDITSERVER_KAFKA_SSL_TRUSTSTORE_PASSWORD ADMIN_AUDITSERVER_KAFKA_SSL_TRUSTSTORE_PASSWORD | The password for the trust store file. | privacera |
AUDITSERVER_KAFKA_SASL_JAAS_CONFIG ADMIN_AUDITSERVER_KAFKA_SASL_JAAS_CONFIG | Kafka uses the Java Authentication and Authorization Service (JAAS) for SASL configuration. You must provide JAAS configurations for all SASL authentication mechanisms. For example, "org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required ;" if AUDITSERVER_KAFKA_SASL_MECHANISM is "OAUTHBEARER". | org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required ; |
AUDITSERVER_KAFKA_SASL_MECHANISM ADMIN_AUDITSERVER_KAFKA_SASL_MECHANISM | SASL mechanism used for connections. This may be any mechanism for which a security provider is available. GSSAPI is the default mechanism. | OAUTHBEARER |
AUDITSERVER_KAFKA_SASL_LOGIN_CALLBACK_HANDLER_CLASS ADMIN_AUDITSERVER_KAFKA_SASL_LOGIN_CALLBACK_HANDLER_CLASS | The login callback handler class for the selected SASL mechanism. For example, "io.strimzi.kafka.oauth.client.JaasClientOauthLoginCallbackHandler" if AUDITSERVER_KAFKA_SASL_MECHANISM is "OAUTHBEARER". | io.strimzi.kafka.oauth.client.JaasClientOauthLoginCallbackHandler |
AUDITSERVER_KAFKA_OAUTH_TOKEN_ENDPOINT_URI ADMIN_AUDITSERVER_KAFKA_OAUTH_TOKEN_ENDPOINT_URI | The OAuth token endpoint URL used by the application to obtain an access token or a refresh token. | http://10.211.93.140:4444/oauth2/token |
AUDITSERVER_KAFKA_OAUTH_WITH_SSL ADMIN_AUDITSERVER_KAFKA_OAUTH_WITH_SSL | Set to true if SSL is enabled on the OAuth token endpoint. | |
AUDITSERVER_OAUTH_ACCEPT_UNSECURE_SERVER ADMIN_AUDITSERVER_OAUTH_ACCEPT_UNSECURE_SERVER | Set to true to accept an unsecured OAuth server. | |
AUDITSERVER_OAUTH_LOGIN_GRANT_TYPE ADMIN_AUDITSERVER_OAUTH_LOGIN_GRANT_TYPE | The grant type the application uses when requesting authorization. The authorization server needs to know this because it affects the kind of credential it will issue, for example client_credentials. | client_credentials |
AUDITSERVER_KAFKA_OAUTH_CLIENT_ID ADMIN_AUDITSERVER_KAFKA_OAUTH_CLIENT_ID | The ID of the application that asks for authorization. | broker-kafka |
AUDITSERVER_KAFKA_OAUTH_CLIENT_SECRET ADMIN_AUDITSERVER_KAFKA_OAUTH_CLIENT_SECRET | The secret of the application that asks for authorization. | broker-kafka |
AUDITSERVER_KAFKA_BATCH_FILESPOOL_DIR ADMIN_AUDITSERVER_KAFKA_BATCH_FILESPOOL_DIR | If the audit framework detects that an audit destination is down, it buffers the audit messages in memory. Once the memory buffer fills up, it can be configured to spool the unsent messages to disk files to prevent or minimize the loss of audit messages. This property sets the local disk directory where the spool files are kept, and it must be specified. The default location is "/workdir/privacera-audit-server/kafka-spool". | /workdir/privacera-audit-server/kafka-spool |
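The filespool behavior described for AUDITSERVER_KAFKA_BATCH_FILESPOOL_DIR can be sketched conceptually as follows. This is an illustrative toy model under stated assumptions (a fixed-size in-memory buffer that spills to JSON files in a spool directory), not Privacera's actual implementation; class and field names are invented for the example.

```python
# Toy sketch of a memory buffer that spills unsent audit events to spool
# files on disk once it fills up (illustrative only, not Privacera code).
import os
import json
import tempfile

class AuditSpool:
    def __init__(self, spool_dir: str, max_buffer: int = 100):
        self.spool_dir = spool_dir
        self.max_buffer = max_buffer
        self.buffer = []          # in-memory queue of unsent audit events
        self.spooled_files = 0    # count of spool files written so far
        os.makedirs(spool_dir, exist_ok=True)

    def enqueue(self, event: dict) -> None:
        """Buffer an event; spill the buffer to disk when it is full."""
        self.buffer.append(event)
        if len(self.buffer) >= self.max_buffer:
            self._spill()

    def _spill(self) -> None:
        # Write buffered events to a spool file and clear the memory buffer.
        path = os.path.join(self.spool_dir, f"spool-{self.spooled_files}.json")
        with open(path, "w") as f:
            json.dump(self.buffer, f)
        self.buffer.clear()
        self.spooled_files += 1

spool_dir = tempfile.mkdtemp()
spool = AuditSpool(spool_dir, max_buffer=2)
spool.enqueue({"user": "alice", "action": "read"})
spool.enqueue({"user": "bob", "action": "write"})   # triggers a spill to disk
print(sorted(os.listdir(spool_dir)))  # ['spool-0.json']
```

A real audit framework would additionally replay spooled files to Kafka once the destination recovers; that part is omitted here for brevity.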