Can we pass parameters or environment variables to a Spark job?
You can pass JVM options to the driver or executor process of your application using the --conf
command-line option followed by spark.driver.extraJavaOptions or spark.executor.extraJavaOptions.
These options let you set JVM (Java) options for the driver or executor processes, respectively.
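For instance, a value supplied as -DmyArg1=value1 through spark.driver.extraJavaOptions shows up on the driver as an ordinary Java system property. Below is a minimal Scala sketch of reading it; the object name is just for illustration, and the property names match the spark-submit example further down.

object JvmOptionReader {
  def main(args: Array[String]): Unit = {
    // -D values set via spark.driver.extraJavaOptions become
    // system properties of the driver JVM.
    val myArg1 = sys.props.getOrElse("myArg1", "default1")
    val myArg2 = Option(System.getProperty("myArg2")) // None if not supplied
    println(s"myArg1=$myArg1, myArg2=${myArg2.getOrElse("<not set>")}")
  }
}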
Arguments passed before the .jar file are treated as arguments to the JVM,
whereas arguments passed after the jar file are passed on to the user's program.
For example, to run a Scala class whose main function takes program arguments (String[] args),
submit the job with spark-submit and pass both the JVM options and the program arguments as in the code below:
spark-submit --class com.example.MySparkApp \
--master local \
--conf "spark.driver.extraJavaOptions=-DmyArg1=value1 -DmyArg2=value2" \
mysparkapp.jar arg1 arg2
package com.example

object MySparkApp {
  def main(args: Array[String]): Unit = {
    // args(0) will contain "arg1"
    // args(1) will contain "arg2"
    // Your Spark application ETL job goes here
  }
}
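As for the environment-variable part of the question: Spark's standard spark.executorEnv.* configuration sets environment variables on the executors (for example --conf spark.executorEnv.MY_ENV=value1 on spark-submit), while in client mode the driver simply inherits the environment of the shell that launched it. A small sketch of reading such a variable, where MY_ENV is just an example name:

object EnvVarExample {
  def main(args: Array[String]): Unit = {
    // Variables set via spark.executorEnv.* are visible inside executor
    // JVMs; the driver (in client mode) sees whatever the launching
    // shell exported.
    val myEnv: String = sys.env.getOrElse("MY_ENV", "<not set>")
    println(s"MY_ENV=$myEnv")
  }
}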