Using S3 Table in Apache Spark OLAC
If you have enabled S3 Table support (see Enable S3 Table), the required S3 Table configurations are automatically applied to spark-defaults.conf. You can start Spark without any additional configuration.
- Navigate to the ${SPARK_HOME}/bin folder and export the JWT token.
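A minimal sketch of this step is shown below. The token file path and the JWT_TOKEN variable name are illustrative assumptions, not names from this guide; use the location and variable your environment expects:

```shell
# Move to the Spark launcher scripts.
cd "${SPARK_HOME}/bin"

# Export the JWT token into an environment variable.
# /path/to/jwt-token.txt is an example path; substitute your own token file.
export JWT_TOKEN="$(cat /path/to/jwt-token.txt)"
```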
- Start a Spark session (choose one of spark-shell, pyspark, or spark-sql).
- To pass the JWT token directly as a command-line argument, use the following configuration when connecting to the cluster:
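A sketch of such an invocation follows. The property name spark.olac.jwt.token is a hypothetical placeholder, not a key confirmed by this guide; substitute the configuration key your distribution documents:

```shell
# spark.olac.jwt.token is a placeholder property name (assumption).
# JWT_TOKEN is assumed to hold the exported token value.
spark-shell \
  --conf spark.olac.jwt.token="${JWT_TOKEN}"
```

The same --conf flag works identically with pyspark and spark-sql.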
- To use the file path containing the JWT token, use the following configuration:
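A sketch under the same assumption: spark.olac.jwt.token.file is a hypothetical placeholder key, and the token file path is an example:

```shell
# spark.olac.jwt.token.file is a placeholder property name (assumption).
# Point it at the file that contains your JWT token.
spark-shell \
  --conf spark.olac.jwt.token.file=/path/to/jwt-token.txt
```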
- If you want to override the warehouse path, add the following configuration:
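Iceberg-style Spark catalogs accept a warehouse location through spark.sql.catalog.&lt;catalog-name&gt;.warehouse. A sketch assuming a catalog named s3tables and an example bucket path (both assumptions, not values from this guide):

```shell
# "s3tables" is an assumed catalog name; use the catalog name from your
# spark-defaults.conf. The s3a:// path is an example warehouse location.
spark-sql \
  --conf spark.sql.catalog.s3tables.warehouse=s3a://my-bucket/warehouse
```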
- Use S3 Table tables.
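Once the session is up, tables are addressed through the configured catalog. A usage sketch, assuming a catalog named s3tables and a namespace demo (both illustrative names):

```shell
# Catalog name "s3tables" and namespace "demo" are assumptions;
# replace them with the names configured in your environment.
spark-sql -e "CREATE NAMESPACE IF NOT EXISTS s3tables.demo"
spark-sql -e "CREATE TABLE IF NOT EXISTS s3tables.demo.events (id BIGINT, name STRING) USING iceberg"
spark-sql -e "SELECT COUNT(*) FROM s3tables.demo.events"
```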