Connector Guide - AWS EMR - Using Delta Lake
If you are using AWS EMR with Delta Lake, use the following instructions to connect to the cluster and run Spark jobs.
- SSH to your EMR master node:
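A typical command, assuming the default `hadoop` user on EMR; the key file and master-node DNS name below are placeholders for your own values:

```bash
# SSH to the EMR master node (key path and host name are placeholders)
ssh -i ~/my-emr-key.pem hadoop@<emr-master-node-public-dns>
```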
If you are using JWT for authentication, you must pass the JWT token to the EMR cluster. You can do this either by passing the token directly as a command-line argument or by providing the path to a file containing the token.
- To pass the JWT token directly as a command-line argument, use the following configuration when connecting to the cluster:
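A sketch of the invocation; the Spark property name shown is illustrative, so confirm the exact JWT property against your Privacera configuration:

```bash
# Pass the JWT token directly as a Spark configuration option
# (the property name below is illustrative; check your Privacera setup)
pyspark --conf "spark.hadoop.privacera.jwt.token.str=<your-jwt-token>"
```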
- To use a file path containing the JWT token, use the following configuration:
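A sketch of the file-based variant; as above, the property name and file path are illustrative placeholders:

```bash
# Point Spark at a file containing the JWT token
# (the property name below is illustrative; check your Privacera setup)
pyspark --conf "spark.hadoop.privacera.jwt.token.file=/path/to/jwt-token.txt"
```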
Connecting to Apache Spark Cluster
If you are using JWT, add the --conf option from above when connecting to the cluster.
- If you are using OLAC, connect to pyspark as below:
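A minimal invocation, assuming Delta Lake is installed on the cluster; these two properties are the standard Delta Lake Spark settings:

```bash
# Start pyspark with Delta Lake enabled
pyspark \
  --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \
  --conf "spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog"
```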
- Run a Spark read/write:
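A sketch of a Delta write and read from inside pyspark; the S3 bucket and path are placeholders, and access to them is authorized by Privacera:

```python
# Write a small DataFrame as a Delta table, then read it back
# (the S3 path is a placeholder; file access is controlled by Privacera)
df = spark.createDataFrame([(1, "alice"), (2, "bob")], ["id", "name"])
df.write.format("delta").mode("overwrite").save("s3://<your-bucket>/delta/users")

users = spark.read.format("delta").load("s3://<your-bucket>/delta/users")
users.show()
```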
If you are using OLAC, connect to
spark-shellas belowBash -
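The same Delta Lake settings apply when launching the Scala shell:

```bash
# Start spark-shell with Delta Lake enabled
spark-shell \
  --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \
  --conf "spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog"
```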
- Run a Spark read/write:
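A sketch of the equivalent read/write from inside spark-shell; the S3 path is again a placeholder:

```scala
// Write a small DataFrame as a Delta table, then read it back
// (the S3 path is a placeholder; file access is controlled by Privacera)
val df = Seq((1, "alice"), (2, "bob")).toDF("id", "name")
df.write.format("delta").mode("overwrite").save("s3://<your-bucket>/delta/users")

val users = spark.read.format("delta").load("s3://<your-bucket>/delta/users")
users.show()
```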
When using Spark SQL, the query retrieves metadata from the AWS Glue Catalog or Hive Metastore, which provides the location of the data in S3. Access to these files is controlled by Privacera.
To run SQL commands, the cluster must have access to the AWS Glue Catalog or Hive Metastore.
- If you are using OLAC, connect to spark-sql as below:
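The same Delta Lake settings apply to the SQL shell:

```bash
# Start spark-sql with Delta Lake enabled
spark-sql \
  --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \
  --conf "spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog"
```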
- Run a Spark SQL query:
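A sketch of a query against a Delta table registered in the catalog; the database and table names are placeholders:

```sql
-- Query a Delta table registered in the Glue Catalog / Hive Metastore
-- (database and table names are placeholders)
SELECT * FROM my_database.my_delta_table LIMIT 10;
```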