Answers for "Py4JJavaError: an error occurred while calling z:org.apache.spark.api.python.PythonRDD.collectAndServe."


Step 1:

Install Java 8, or switch to it if a newer version is active. Spark 2.x runs on Java 8, and launching PySpark on a newer JVM is a common cause of this error. (On Debian/Ubuntu, sudo update-alternatives --config java lets you pick among installed versions.)
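Whether this step is needed can be checked from the first line of java -version. A minimal sketch of the version parsing; the sample line below is an assumption (typical OpenJDK 8 output), since on a real machine you would capture it with ver_line=$(java -version 2>&1 | head -n 1):

```shell
# Sample `java -version` first line (assumed; replace with the real output).
ver_line='openjdk version "1.8.0_292"'

# Java 8 reports itself as 1.8.x; Java 9+ reports e.g. "11.0.2",
# so strip an optional leading "1." before reading the major number.
major=$(echo "$ver_line" | sed -E 's/.*"(1\.)?([0-9]+).*/\2/')
echo "major=$major"

if [ "$major" -eq 8 ]; then
  echo "Java 8 active - OK for Spark 2.x"
else
  echo "Java $major active - switch to 8"
fi
```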

Step 2:

Add the following to ~/.bashrc

export JAVA_HOME='/usr/lib/jvm/java-8-openjdk-amd64'
export PATH=$JAVA_HOME/bin:$PATH
export SPARK_HOME='/path/to/spark-2.x.x-bin-hadoop2.7'
export PATH=$SPARK_HOME/bin:$PATH
Then run source ~/.bashrc to load the changes, or simply open a new terminal.
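The order of the PATH entries matters: $JAVA_HOME/bin must be prepended so the Java 8 binary shadows any other installed java. A self-contained sketch of that shadowing, using a throwaway stub java in a temp directory (a stand-in for a real JDK 8 install, just to make the resolution visible):

```shell
# Fake JAVA_HOME with a stub `java` executable (hypothetical, for illustration).
fake_home=$(mktemp -d)
mkdir -p "$fake_home/bin"
printf '#!/bin/sh\necho fake-java-8\n' > "$fake_home/bin/java"
chmod +x "$fake_home/bin/java"

# Same pattern as the ~/.bashrc lines above: prepend, don't append.
export JAVA_HOME="$fake_home"
export PATH="$JAVA_HOME/bin:$PATH"

# `java` now resolves to the prepended install, even if another java
# exists elsewhere on the PATH.
command -v java
java
```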

An alternative is to copy /path/to/spark-2.x.x-bin-hadoop2.7/conf/spark-env.sh.template to /path/to/spark-2.x.x-bin-hadoop2.7/conf/spark-env.sh. Spark's launch scripts source spark-env.sh, so variables set there apply only to Spark rather than to your whole shell. Then add the following to spark-env.sh:

export JAVA_HOME='/usr/lib/jvm/java-8-openjdk-amd64'
export PYSPARK_PYTHON=python3
Then add the following to ~/.bashrc

export SPARK_HOME='/path/to/spark-2.x.x-bin-hadoop2.7'
export PATH=$SPARK_HOME/bin:$PATH
export SPARK_CONF_DIR=$SPARK_HOME/conf
and run source ~/.bashrc.
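Once the variables are in place, a quick pre-flight check before launching pyspark can save a debugging round-trip. This is a heuristic sketch, and the path patterns it matches are assumptions based on common JVM install directory names (java-8-openjdk-amd64, jdk1.8.0_292, jdk-8):

```shell
# Heuristic: does $JAVA_HOME's directory name look like a Java 8 install?
# Set to the example path from the steps above for illustration.
JAVA_HOME='/usr/lib/jvm/java-8-openjdk-amd64'

case "$(basename "$JAVA_HOME")" in
  *java-8*|*jdk-8*|*jdk1.8*|*1.8.0*)
    verdict="looks like Java 8" ;;
  *)
    verdict="does not look like Java 8 - check JAVA_HOME" ;;
esac
echo "$JAVA_HOME: $verdict"
```

This only inspects the path string; the authoritative check is still running "$JAVA_HOME/bin/java" -version.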
Posted by: Guest on July-21-2021
