I want to write a function where the user supplies the path to their JDBC driver jar file, and an already-running, already-configured SparkSession reads the data from a table in the JDBC database.
I tried the following:
- Copied the jar file into the Spark jars folder (`/opt/spark/jars`) at runtime
- Updated properties with `SparkSession.sparkContext.getConf().set("spark.driver.extraClassPath", "/path/to/jdbc/driver")`, `SparkSession.sparkContext.addFile("spark.driver.extraClassPath", "/path/to/jdbc/driver")`, etc.
None of these approaches worked for my case.
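For reference, this is roughly the shape of the function I'm after (a sketch, not working code: the `jdbc_options` helper and all the paths, the URL, the table name, and the driver class are placeholders I made up for illustration):

```python
def jdbc_options(url, table, driver_class):
    """Build the option map for Spark's JDBC reader (illustrative helper)."""
    return {"url": url, "dbtable": table, "driver": driver_class}

def read_jdbc_table(spark, url, table, driver_class):
    """Read one table over JDBC using an existing SparkSession.

    This assumes the driver jar is already on the classpath; my problem
    is exactly that I cannot get the jar picked up after the session
    (and its JVM) has already started.
    """
    reader = spark.read.format("jdbc")
    for key, value in jdbc_options(url, table, driver_class).items():
        reader = reader.option(key, value)
    return reader.load()

# Intended usage (placeholders):
# df = read_jdbc_table(spark, "jdbc:postgresql://localhost/db",
#                      "my_table", "org.postgresql.Driver")
```

My understanding is that `spark.driver.extraClassPath` is only honored when the JVM starts, which would explain why setting it on a running session has no effect, but I'd like confirmation and a workaround if one exists.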