1. Environment
Mac OS X 10.10.3, Java 1.7.0_71, Spark 1.4.0
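Before building, it is worth confirming the machine matches this setup. A quick sketch from a terminal (`sw_vers` is OS X-specific; the expected values are the ones listed above):

```shell
# Verify the toolchain against the versions used in this article.
java -version 2>&1 | head -n 1          # expect something like "1.7.0_71"
sw_vers -productVersion 2>/dev/null \
  || echo "sw_vers not found (not OS X)" # expect 10.10.3 on OS X
```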
2. Build and Install

tar -zxvf spark-1.4.0.tgz
cd spark-1.4.0
./sbt/sbt assembly

Note: if you have compiled before, run ./sbt/sbt clean first; only after cleaning can you rebuild.
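The clean-before-rebuild rule above can be folded into a small script. This is a sketch under assumptions not stated in the article: the `needs_clean` helper is my own name, and the `assembly/target/scala-*/` jar path is the usual sbt output location for this layout, used here as the signal that a previous build exists.

```shell
#!/bin/sh
set -e

# Hypothetical helper: succeeds if a previous build left an assembly
# jar behind, i.e. a clean is needed before rebuilding.
needs_clean() {
  # $1 = Spark source directory
  ls "$1"/assembly/target/scala-*/spark-assembly-*.jar >/dev/null 2>&1
}

# The build flow from this section, with the conditional clean:
#   tar -zxvf spark-1.4.0.tgz
#   cd spark-1.4.0
#   if needs_clean .; then ./sbt/sbt clean; fi
#   ./sbt/sbt assembly
```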
3. Run
adeMacBook-Pro:spark-1.4.0 apple$ ./bin/spark-shell
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/06/14 11:32:25 INFO SecurityManager: Changing view acls to: apple
15/06/14 11:32:25 INFO SecurityManager: Changing modify acls to: apple
15/06/14 11:32:25 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(apple); users with modify permissions: Set(apple)
15/06/14 11:32:25 INFO HttpServer: Starting HTTP Server
15/06/14 11:32:26 INFO Server: jetty-8.y.z-SNAPSHOT
15/06/14 11:32:26 INFO AbstractConnector: Started SocketConnector@0.0.0.0:61566
15/06/14 11:32:26 INFO Utils: Successfully started service 'HTTP class server' on port 61566.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.4.0
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_71)
Type in expressions to have them evaluated.
Type :help for more information.
15/06/14 11:32:31 INFO SparkContext: Running Spark version 1.4.0
15/06/14 11:32:31 INFO SecurityManager: Changing view acls to: apple
15/06/14 11:32:31 INFO SecurityManager: Changing modify acls to: apple
15/06/14 11:32:31 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(apple); users with modify permissions: Set(apple)
15/06/14 11:32:31 INFO Slf4jLogger: Slf4jLogger started
15/06/14 11:32:31 INFO Remoting: Starting remoting
15/06/14 11:32:32 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@192.168.1.106:61567]
15/06/14 11:32:32 INFO Utils: Successfully started service 'sparkDriver' on port 61567.
15/06/14 11:32:32 INFO SparkEnv: Registering MapOutputTracker
15/06/14 11:32:32 INFO SparkEnv: Registering BlockManagerMaster
15/06/14 11:32:32 INFO DiskBlockManager: Created local directory at /private/var/folders/s3/llfgz_mx47572r5b4pbk7xm80000gp/T/spark-cf6feb6b-1464-4d54-89f3-8d97bf15205f/blockmgr-b8410cda-aa29-4069-9406-d6155512cd53
15/06/14 11:32:32 INFO MemoryStore: MemoryStore started with capacity 265.4 MB
15/06/14 11:32:32 INFO HttpFileServer: HTTP File server directory is /private/var/folders/s3/llfgz_mx47572r5b4pbk7xm80000gp/T/spark-cf6feb6b-1464-4d54-89f3-8d97bf15205f/httpd-a1838f08-2ccd-42d2-9419-6e91cb6fdfad
15/06/14 11:32:32 INFO HttpServer: Starting HTTP Server
15/06/14 11:32:32 INFO Server: jetty-8.y.z-SNAPSHOT
15/06/14 11:32:32 INFO AbstractConnector: Started SocketConnector@0.0.0.0:61568
15/06/14 11:32:32 INFO Utils: Successfully started service 'HTTP file server' on port 61568.
15/06/14 11:32:32 INFO SparkEnv: Registering OutputCommitCoordinator
15/06/14 11:32:32 INFO Server: jetty-8.y.z-SNAPSHOT
15/06/14 11:32:32 INFO AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
15/06/14 11:32:32 INFO Utils: Successfully started service 'SparkUI' on port 4040.
15/06/14 11:32:32 INFO SparkUI: Started SparkUI at http://192.168.1.106:4040
15/06/14 11:32:32 INFO Executor: Starting executor ID driver on host localhost
15/06/14 11:32:32 INFO Executor: Using REPL class URI: http://192.168.1.106:61566
15/06/14 11:32:32 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 61569.
15/06/14 11:32:32 INFO NettyBlockTransferService: Server created on 61569
15/06/14 11:32:32 INFO BlockManagerMaster: Trying to register BlockManager
15/06/14 11:32:32 INFO BlockManagerMasterEndpoint: Registering block manager localhost:61569 with 265.4 MB RAM, BlockManagerId(driver, localhost, 61569)
15/06/14 11:32:32 INFO BlockManagerMaster: Registered BlockManager
15/06/14 11:32:33 INFO SparkILoop: Created spark context..
Spark context available as sc.
15/06/14 11:32:33 INFO SparkILoop: Created sql context..
SQL context available as sqlContext.
scala>
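Once the prompt appears, `sc` (the SparkContext announced in the log) is ready to use, and the web UI from the log stays available on port 4040 while the shell runs. As a quick smoke test, a one-line job can also be fed to the shell non-interactively. The job string below is my own example, not from the article; 5050 is simply the sum of 1..100, which Spark should report as a Double:

```shell
# A one-line Spark job, kept in a variable so the expectation is explicit.
JOB='println("sum = " + sc.parallelize(1 to 100).sum)'

# Piping it into the shell built above should print "sum = 5050.0"
# somewhere among the startup log lines (commented out here, since it
# needs the assembled build from section 2):
# echo "$JOB" | ./bin/spark-shell
echo "$JOB"
```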
Reference:
https://spark.apache.org/docs/latest/
Posted: 2024-10-08 15:21:07