This installation is built on top of Hadoop 2.6.0.
1. Download spark-1.4.0-bin-hadoop2.6.tgz from the official Spark website
2. Extract it into the directory of your choice: tar zxvf spark-1.4.0-bin-hadoop2.6.tgz
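If you prefer doing both steps from a terminal, a minimal sketch (the download URL assumes the Apache release archive; adjust the mirror and the target directory to your setup):
# download the prebuilt package
wget https://archive.apache.org/dist/spark/spark-1.4.0/spark-1.4.0-bin-hadoop2.6.tgz
# extract into your home directory (adjust the path as needed)
tar zxvf spark-1.4.0-bin-hadoop2.6.tgz -C /home/jiahong/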
3. Configure /etc/profile
sudo gedit /etc/profile
Append the following path configuration at the end of the file, save and exit, then run source /etc/profile to apply the changes:
export SPARK_HOME=/home/jiahong/spark-1.4.0-bin-hadoop2.6
export PATH=$SPARK_HOME/bin:$PATH
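To confirm the variables took effect, a quick sanity check (assuming the two export lines above were added and the file re-sourced):
source /etc/profile
echo $SPARK_HOME    # should print /home/jiahong/spark-1.4.0-bin-hadoop2.6
which spark-shell   # should resolve to a path under $SPARK_HOME/bin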
4. Start Spark
jiahong@jiahong-OptiPlex-7010:~$ cd spark-1.4.0-bin-hadoop2.6/
jiahong@jiahong-OptiPlex-7010:~/spark-1.4.0-bin-hadoop2.6$ sbin/start-all.sh
starting org.apache.spark.deploy.master.Master, logging to /home/jiahong/spark-1.4.0-bin-hadoop2.6/sbin/../logs/spark-jiahong-org.apache.spark.deploy.master.Master-1-jiahong-OptiPlex-7010.out
localhost: starting org.apache.spark.deploy.worker.Worker, logging to /home/jiahong/spark-1.4.0-bin-hadoop2.6/sbin/../logs/spark-jiahong-org.apache.spark.deploy.worker.Worker-1-jiahong-OptiPlex-7010.out
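To verify that both daemons are actually running, jps (shipped with the JDK) should list a Master and a Worker process; the PIDs below are only illustrative:
jps
# typical output includes, among others:
# 4412 Master
# 4583 Worker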
5. Visit the Spark web UI
localhost:8080
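To confirm the standalone cluster accepts jobs, one option is the bundled SparkPi example, or an interactive shell attached to the master (the master URL below assumes the default port 7077 and the hostname seen in the start-up logs above):
# run the bundled example
bin/run-example SparkPi 10
# or attach a shell to the standalone master
bin/spark-shell --master spark://jiahong-OptiPlex-7010:7077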