Add JDBC Driver dynamically during runtime in PySpark - Stack Overflow

I want to write a function where the user supplies the path to their JDBC driver jar file, and an already-running, already-configured SparkSession reads the data from a JDBC database table.
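For context, the function I have in mind looks roughly like this (the function name, signature, and connection options are placeholders, not working code):

```python
from pyspark.sql import SparkSession, DataFrame

def read_jdbc_table(spark: SparkSession, jar_path: str, url: str,
                    table: str, driver_class: str,
                    user: str, password: str) -> DataFrame:
    # jar_path points to the user-supplied JDBC driver jar; the open
    # question is how to make the already-running session pick it up.
    return (spark.read.format("jdbc")
            .option("url", url)                 # e.g. jdbc:postgresql://host:5432/mydb
            .option("dbtable", table)
            .option("driver", driver_class)     # e.g. org.postgresql.Driver
            .option("user", user)
            .option("password", password)
            .load())
```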

I have tried the following:

  • Copied the jar file into the Spark jars folder (/opt/spark/jars) at runtime
  • Updated properties with spark.sparkContext.getConf().set("spark.driver.extraClassPath", "/path/to/jdbc/driver.jar"), distributed the jar with spark.sparkContext.addFile("/path/to/jdbc/driver.jar"), and so on (a condensed version is in the sketch after this list)
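Condensed, the attempts look like this, assuming a session named `spark` that is already up (paths are illustrative):

```python
import shutil
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # already configured elsewhere
jar_path = "/path/to/jdbc/driver.jar"

# Attempt 1: copy the jar into Spark's jars folder while the session is running
shutil.copy(jar_path, "/opt/spark/jars/")

# Attempt 2: set the driver classpath on the existing conf
spark.sparkContext.getConf().set("spark.driver.extraClassPath", jar_path)

# Attempt 3: ship the jar file to the cluster
spark.sparkContext.addFile(jar_path)

# A subsequent spark.read.format("jdbc")...load() still cannot find the driver class
```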

None of these approaches worked in my case.
