[dyq@master spark-1.5.0]$ ./sbin/start-all.sh
starting org.apache.spark.deploy.master.Master, logging to /srv/spark-1.5.0/sbin/../logs/spark-dyq-org.apache.spark.deploy.master.Master-1-master.out
slave2: starting org.apache.spark.deploy.worker.Worker, logging to /srv/spark-1.5.0/sbin/../logs/spark-dyq-org.apache.spark.deploy.worker.Worker-1-slave2.out
slave1: starting org.apache.spark.deploy.worker.Worker, logging to /srv/spark-1.5.0/sbin/../logs/spark-dyq-org.apache.spark.deploy.worker.Worker-1-slave1.out
slave2: failed to launch org.apache.spark.deploy.worker.Worker:
slave1: failed to launch org.apache.spark.deploy.worker.Worker:
slave1: full log in /srv/spark-1.5.0/sbin/../logs/spark-dyq-org.apache.spark.deploy.worker.Worker-1-slave1.out
slave2: full log in /srv/spark-1.5.0/sbin/../logs/spark-dyq-org.apache.spark.deploy.worker.Worker-1-slave2.out
What is causing this?

On the master machine:
Spark Command: /srv/jdk1.7.0_79/bin/java -cp /srv/spark-1.5.0/sbin/../conf/:/srv/spark-1.5.0/lib/spark-assembly-1.5.0-hadoop2.6.0.jar:/srv/spark-1.5.0/lib/datanucleus-core-3.2.10.jar:/srv/spark-1.5.0/lib/datanucleus-api-jdo-3.2.6.jar:/srv/spark-1.5.0/lib/datanucleus-rdbms-3.2.9.jar -Xms1g -Xmx1g -XX:MaxPermSize=256m org.apache.spark.deploy.master.Master --ip 192.168.0.100 --port 7077 --webui-port 8080
========================================
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/07/19 09:11:23 INFO Master: Registered signal handlers for [TERM, HUP, INT]
16/07/19 09:11:47 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/07/19 09:12:08 INFO SecurityManager: Changing view acls to: dyq
16/07/19 09:12:08 INFO SecurityManager: Changing modify acls to: dyq
16/07/19 09:12:08 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(dyq); users with modify permissions: Set(dyq)
16/07/19 09:13:45 INFO Slf4jLogger: Slf4jLogger started
16/07/19 09:13:56 INFO Remoting: Starting remoting
Exception in thread "main" java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]
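The master log stops right after "Starting remoting" and the main thread then dies with a 10-second future timeout, so the master apparently never finished binding spark://192.168.0.100:7077 (note also the long gaps between the timestamps above). For reference, a minimal set of checks one might run on the master machine to confirm this (assuming jps, ss, ip and hostname are available) could be:

    jps | grep Master                 # is a Master JVM still running at all?
    ss -ltn | grep 7077               # is anything listening on the master RPC port 7077?
    ip addr | grep 192.168.0.100      # is the --ip address actually configured on this host?
    hostname -f; cat /etc/hosts       # inconsistent or slow hostname resolution is a commonly reported cause of this timeout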
On the slave machine:
Spark Command: /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.65-3.b17.el7.x86_64/jre/bin/java -cp /srv/spark-1.5.0/sbin/../conf/:/srv/spark-1.5.0/lib/spark-assembly-1.5.0-hadoop2.6.0.jar:/srv/spark-1.5.0/lib/datanucleus-core-3.2.10.jar:/srv/spark-1.5.0/lib/datanucleus-api-jdo-3.2.6.jar:/srv/spark-1.5.0/lib/datanucleus-rdbms-3.2.9.jar -Xms1g -Xmx1g org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://192.168.0.100:7077
========================================
Caused by: [Connection refused: /192.168.0.100:7077]
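The worker side only shows the launch command and then "Connection refused" against 192.168.0.100:7077, which is consistent with the master never having come up. For reference, a quick reachability check from slave1/slave2 (assuming nc is installed) might look like:

    nc -zv 192.168.0.100 7077    # does anything accept connections on the master port?
    ping -c 3 192.168.0.100      # is the master host reachable from the slave at all?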