Apache Spark: How to use pyspark with Python 3

I built Spark 1.4 from the GH development master, and the build went through fine. But when I run bin/pyspark I get the Python 2.7.9 version. How can I change this?


Have a look into the file. The shebang line probably points to the 'env' binary, which searches the PATH for the first compatible executable.

You can change python to python3: either change the env line to hardcode the python3 binary, or execute the script directly with python3 and bypass the shebang line entirely.
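To make the shebang behavior concrete, here is a minimal sketch of how an `#!/usr/bin/env python` line is resolved. The helper function and the throwaway script are illustrative only, not part of Spark:

```python
import os
import shutil
import tempfile

def read_shebang(path):
    """Return the interpreter named on a script's shebang line, if any."""
    with open(path) as f:
        first = f.readline().strip()
    if not first.startswith("#!"):
        return None
    parts = first[2:].split()
    # "#!/usr/bin/env python" -> the real interpreter is env's argument
    if parts and parts[0].endswith("/env"):
        return parts[1] if len(parts) > 1 else None
    return parts[0] if parts else None

# Demonstrate on a throwaway script with an env-style shebang.
with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
    f.write("#!/usr/bin/env python\nprint('hello')\n")
    script = f.name

interp = read_shebang(script)    # the name env will look up, e.g. "python"
resolved = shutil.which(interp)  # first match on PATH, or None if absent
print(interp, "->", resolved)
os.unlink(script)
```

This is why simply having python3 earlier on the PATH (or changing the name after `env`) changes which interpreter actually runs.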

PYSPARK_PYTHON=python3
./bin/pyspark

If you want to run it in IPython Notebook, write:

PYSPARK_PYTHON=python3
PYSPARK_DRIVER_PYTHON=ipython
PYSPARK_DRIVER_PYTHON_OPTS="notebook"
./bin/pyspark

If python3 is not on your PATH, you need to pass the full path to it instead.
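If you are unsure what that full path is, `shutil.which` performs the same PATH lookup that `env` does, so it can print the value to use for PYSPARK_PYTHON. A small sketch:

```python
import shutil

# shutil.which mimics the shell's PATH search; 'python3' is the
# interpreter name used in the answers above.
full_path = shutil.which("python3")
if full_path is None:
    print("python3 is not on PATH; set PYSPARK_PYTHON to an absolute path")
else:
    print(f"PYSPARK_PYTHON={full_path}")
```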

Bear in mind that the current documentation (as of 1.4.1) has outdated instructions. Fortunately, it has been patched.

Just set the environment variable:

export PYSPARK_PYTHON=python3

In case you want this to be a permanent change, add this line to the pyspark script.

1. Edit your profile: vim ~/.profile

2. Add this line to the file: export PYSPARK_PYTHON=python3

3. Execute the command: source ~/.profile

4. ./bin/pyspark
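The steps above can be sanity-checked from Python itself. This sketch simulates the effect of the `export` line (using `os.environ` as a stand-in for the shell environment, which pyspark reads at startup) and confirms the running interpreter's major version:

```python
import os
import sys

# Simulate the ~/.profile line: export PYSPARK_PYTHON=python3
os.environ["PYSPARK_PYTHON"] = "python3"

# In a correctly configured pyspark shell, the driver runs under Python 3:
print(sys.version_info.major)  # should print 3 when running under Python 3
```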

For Jupyter Notebook, edit the spark-env.sh file from the command line as shown below:

$ vi $SPARK_HOME/conf/spark-env.sh

Go to the bottom of the file and paste these lines:

export PYSPARK_PYTHON=python3
export PYSPARK_DRIVER_PYTHON=jupyter
export PYSPARK_DRIVER_PYTHON_OPTS="notebook"

Then, simply run the following command to start pyspark in a notebook:

$ pyspark