Starting a single Spark Slave (or Worker)


When I run this:

spark-1.3.0-bin-hadoop2.4% sbin/start-slave.sh 

I get this message:

failed to launch org.apache.spark.deploy.worker.Worker:                          Default is conf/spark-defaults.conf.

even though I have this:

spark-1.3.0-bin-hadoop2.4% ll conf | grep spark-defaults.conf
-rw-rwxr--+ 1 xxxx.xxxxx ama-unix  507 Apr 29 07:09 spark-defaults.conf
-rw-rwxr--+ 1 xxxx.xxxxx ama-unix  507 Apr 13 12:06 spark-defaults.conf.template

Any idea why?

Thanks

First of all, you should make sure you are using the command correctly:

usage: start-slave.sh <worker#> <spark-master-url> 

where <worker#> is the worker number you want to launch on the machine running the script,
and <spark-master-url> is the URL of the master, e.g. spark://localhost:7077
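
For example, a minimal sketch of a correct invocation, assuming the master is already running on the same machine on the default port 7077 (the worker number 1 and the hostname localhost are illustrative):

spark-1.3.0-bin-hadoop2.4% sbin/start-slave.sh 1 spark://localhost:7077

With both arguments supplied, the script should no longer fall back to printing the Worker usage text that mentions conf/spark-defaults.conf.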

