numpy - Install the right Python version for Spark


I use Python 2.7.6 on my machine:

$ python --version
Python 2.7.6

I have Spark 1.1.0 on my machine, which depends on Python 2.7.6. If I execute:

user@user:~/bin/spark-1.1.0$ ./bin/pyspark 

I get:

Python 2.7.6 (default, Mar 22 2014, 22:59:56)
[GCC 4.8.2] on linux2
Type "help", "copyright", "credits" or "license" for more information.
. . .

Today I installed a new pre-built version of Spark 1.3.1 (which, I don't know why, depends on Python 2.7.5). If I execute the same command with the new version:

user@user:~/bin/spark-1.3.1-bin-hadoop2.6$ ./bin/pyspark 

I get the older Python version:

Python 2.7.5 (default, Jun 18 2014, 09:37:37)
[GCC 4.6.3] on linux2
Type "help", "copyright", "credits" or "license" for more information.
. . .

The main difference is that in the older Spark version I can execute import numpy, but in the new one I cannot.
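One way to narrow this down is to ask each pyspark shell which interpreter it is actually running and where it searches for modules. This is only a diagnostic sketch (nothing here is Spark-specific); type it at the >>> prompt of both versions:

>>> import sys
>>> sys.executable        # path of the python binary this shell is running
>>> sys.version           # full version string, e.g. 2.7.6 vs 2.7.5
>>> sys.path              # directories searched for modules such as numpy

If numpy's install directory shows up in sys.path of one shell but not the other, that would explain why the import succeeds in only one of them.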

I set the following Python path in my .bashrc file:

export PYTHONPATH=$PYTHONPATH:usr/lib/python2.7
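Note that PYTHONPATH only adds directories to the module search path; it does not change which interpreter is launched. Also, as written the entry is a relative path (usr/lib/python2.7, no leading slash), so it is resolved against the current working directory. A quick check that the entry is actually picked up, as a sketch:

$ python -c "import sys; print(sys.path)"

The directory from the export above should appear in the printed list; if it does not, the .bashrc change has not been sourced in the current shell.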

I can't find a way to distinguish between versions 2.7.6 and 2.7.5 in Python, and I don't know where Python 2.7.6 is stored (the find command turned up nothing).
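Standard shell tools can usually locate the installed interpreters; a sketch, with paths that may differ on your machine:

$ type -a python                 # every python found on the PATH, in order
$ ls -l /usr/bin/python*         # system interpreters, often versioned symlinks
$ /usr/bin/python2.7 --version   # confirm the exact micro version of each one

The first entry that type -a reports is the one a bare python command (and anything that launches plain python, such as pyspark by default) will use.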

I ran export PYSPARK_PYTHON=python3 and it worked fine.
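If the setting should apply to one Spark installation rather than the whole shell, Spark also reads environment variables from conf/spark-env.sh inside the Spark directory. A minimal sketch, assuming the 2.7.6 interpreter found above lives at /usr/bin/python2.7:

$ echo 'export PYSPARK_PYTHON=/usr/bin/python2.7' >> ~/bin/spark-1.3.1-bin-hadoop2.6/conf/spark-env.sh
$ ~/bin/spark-1.3.1-bin-hadoop2.6/bin/pyspark

The banner printed by pyspark should then show the 2.7.6 build instead of 2.7.5.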

