spark-shell startup error

18/06/24 16:41:40 ERROR spark.SparkContext: Error initializing SparkContext.
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (starting from 0)! Consider explicitly setting the appropriate port for the service 'sparkDriver' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:433)
at sun.nio.ch.Net.bind(Net.java:425)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:127)
at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:501)
at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1218)
at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:496)
at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:481)
at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:965)
at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:210)
at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:353)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:399)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:446)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
at java.lang.Thread.run(Thread.java:748)
<console>:14: error: not found: value spark
       import spark.implicits._
              ^
<console>:14: error: not found: value spark
       import spark.sql
              ^

Solution: the bind failure occurs because the machine's IP address and its hostname do not match, so the driver tries to bind to an address the hostname resolves to that is not assigned to any local interface. The subsequent "not found: value spark" errors are a consequence: because the SparkContext failed to initialize, the spark session object was never created.

Edit the hosts file:

sudo vim /etc/hosts

Add a line mapping the internal IP (the one reported by ifconfig) to the hostname:

internal-ip hostname

Save and exit, then restart spark-shell.
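
For example, if ifconfig reports 192.168.1.100 and the machine's hostname is node1 (both hypothetical values; substitute your own), the entry would look like:

192.168.1.100   node1

You can then check that the hostname resolves to the internal address rather than 127.0.0.1 or an unassigned IP:

ping -c 1 $(hostname)

Once the hostname resolves correctly, the sparkDriver service should be able to bind on the next startup.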

Original article: https://www.cnblogs.com/Whynot1/p/9220978.html
