Spark on YARN keeps reporting state: ACCEPTED

Today, when running Spark on YARN, the client kept printing:

16/09/20 18:40:41 INFO yarn.Client: Application report for application_1474179312027_0002 (state: ACCEPTED)
16/09/20 18:40:42 INFO yarn.Client: Application report for application_1474179312027_0002 (state: ACCEPTED)
16/09/20 18:40:43 INFO yarn.Client: Application report for application_1474179312027_0002 (state: ACCEPTED)
... (the same line repeats once per second) ...
16/09/20 18:41:12 INFO yarn.Client: Application report for application_1474179312027_0002 (state: ACCEPTED)

I have no idea what the problem is, and there is no log hint on the YARN side either.
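A job that never leaves ACCEPTED usually means the ResourceManager has queued it but cannot allocate a container for the ApplicationMaster, typically because the target queue is full or the cluster lacks free memory/vcores. A few YARN CLI commands help narrow this down (a diagnostic sketch; the application id is the one from the log above):

# See which applications are occupying or waiting on the queue
yarn application -list -appStates RUNNING,ACCEPTED

# Check how much memory and how many vcores each NodeManager offers
yarn node -list -all

# Inspect the report for the stuck application
yarn application -status application_1474179312027_0002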

Posted: 2024-07-28 23:58:36

Related articles on "Spark on YARN keeps reporting state: ACCEPTED"

Spark on YARN job submission stays in ACCEPTED

When a job is submitted with Spark on YARN it keeps showing ACCEPTED, and after about an hour the job fails, yet the log printed in the shell at submission time shows no error, and nothing appears in the logs folder either. Note: Spark on YARN does not require starting a Spark cluster; it is enough to configure Spark on the machine that submits the jobs, because the work is executed by Hadoop and Spark only handles submission. The submit command is bin/spark-submit --class org.apache.spark.examples.JavaWordCount \ --…
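For reference, a complete command of that shape might look like the sketch below; the jar name, input path, and resource numbers are assumptions, not taken from the original post:

bin/spark-submit --class org.apache.spark.examples.JavaWordCount \
    --master yarn \
    --deploy-mode client \
    --num-executors 2 \
    --executor-memory 1g \
    lib/spark-examples-1.6.0-hadoop2.6.0.jar \
    hdfs:///input/words.txt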

Spark (Part 12): Spark on YARN, Spark as a Service, and Spark on Tachyon

Spark on YARN: since version 0.6.0, Spark has been able to run on YARN, with YARN providing unified resource management and scheduling, which makes it possible to run multiple processing frameworks side by side, not just Spark. Deploying Spark on YARN is much the same as Standalone; the difference is that some YARN-related settings must be added to spark-env.sh. When a job is submitted, these settings are used to load the YARN information, and the job is then handed to YARN for management. First make sure YARN is already deployed; for the relevant steps see: hadoop2.2.0 cluster installation and configuration. Once deployed…
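A minimal sketch of the spark-env.sh additions the excerpt refers to, assuming a standard Hadoop layout (both paths are assumptions):

# in $SPARK_HOME/conf/spark-env.sh
export HADOOP_CONF_DIR=/etc/hadoop/conf   # where core-site.xml and yarn-site.xml live
export YARN_CONF_DIR=/etc/hadoop/conf     # often the same directory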

Spark on YARN两种运行模式介绍

本文出自:Spark on YARN两种运行模式介绍http://www.aboutyun.com/thread-12294-1-1.html(出处: about云开发)   问题导读 1.Spark在YARN中有几种模式? 2.Yarn Cluster模式,Driver程序在YARN中运行,应用的运行结果在什么地方可以查看? 3.由client向ResourceManager提交请求,并上传jar到HDFS上包含哪些步骤? 4.传递给app的参数应该通过什么来指定? 5.什么模式下最后将结果输

Spark on Yarn年度知识整理

大数据体系结构: Spark简介 Spark是整个BDAS的核心组件,是一个大数据分布式编程框架,不仅实现了MapReduce的算子map 函数和reduce函数及计算模型,还提供更为丰富的算子,如filter.join.groupByKey等.是一个用来实现快速而同用的集群计算的平台. Spark将分布式数据抽象为弹性分布式数据集(RDD),实现了应用任务调度.RPC.序列化和压缩,并为运行在其上的上层组件提供API.其底层采用Scala这种函数式语言书写而成,并且所提供的API深度借鉴Sca

Spark on Yarn

YARN是什么 YARN在hadoop生态系统中的位置 YARN产生的背景 YARN的基本架构 ResourceManager NodeManager ApplicationMaster container Spark On Yarn 配置和部署 编译时包含yarn 基本配置 在没有配置的前提下试下启动spark-shell 可以看到启动没问题 这里问题就来了!!! 下面我们配上来看看 可以看到报错了!!! 应该是资源不足导致的 先重启一下各个进程 $SPARK_HOME/bin/spark-s
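When spark-shell hangs at startup for lack of resources, one sanity check is to request deliberately tiny resources and see whether the ApplicationMaster can then be scheduled (a sketch; the numbers are assumptions, not recommendations):

$SPARK_HOME/bin/spark-shell --master yarn \
    --driver-memory 512m \
    --executor-memory 512m \
    --executor-cores 1 \
    --num-executors 1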

Spark on YARN memory and CPU allocation

This post draws on http://blog.cloudera.com/blog/2015/03/how-to-tune-your-apache-spark-jobs-part-2/. Software versions: CDH 5.7.2, JDK 1.7. Problem description: when using Spark on YARN (whether in client or cluster mode; a comparison of the two modes follows below), you can add flags such as --executor-memory 8G --executor-cores 5 --num-executors 20 and so on…
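The key point of that post is that YARN grants more than --executor-memory per container: an off-heap overhead is added on top (roughly 7-10% of executor memory with a floor of 384 MB, depending on the Spark version), and the total is rounded up to a multiple of yarn.scheduler.minimum-allocation-mb. A back-of-the-envelope sketch for the flags above:

# --executor-memory 8G  =>  container ≈ 8 GB + max(384 MB, ~10% of 8 GB) ≈ 9 GB
# 20 executors * 9 GB ≈ 180 GB in total; if the NodeManagers' combined
# yarn.nodemanager.resource.memory-mb is below that, executors wait in the queue.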

Spark on YARN fully demystified (DT Big Data Dream Factory)

Contents: 1. The Hadoop YARN workflow demystified; 2. The two Spark on YARN run modes in practice; 3. The Spark on YARN workflow demystified; 4. Spark on YARN internals demystified; 5. Spark on YARN best practices. The resource management framework YARN: Mesos is a resource management framework for distributed clusters; it is not specific to big data, but it can manage big data resources. ========== Hadoop YARN explained ========== 1. YARN is the resource manager introduced by Hadoop, responsible for the resource management of distributed (big data) cluster computing, respon…

Spark keeps reporting a 0.0.0.0:8030 error when running jobs on YARN

Recently, when a newly written Spark job was run on YARN, the YARN slave nodes kept logging this error: cannot connect to 0.0.0.0:8030. The logs are as below: 2014-08-11 20:10:59,795 INFO [main] org.apache.hadoop.yarn.client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8030  2014-08-11 20:11:01,838 INFO [ma…
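8030 is the ResourceManager's scheduler port; a 0.0.0.0 address usually means the process never found a yarn-site.xml naming the real RM, so RMProxy fell back to the default. A hedged check (the config path and hostname below are assumptions):

# make sure the Hadoop config dir is in the environment of whatever runs the job
export HADOOP_CONF_DIR=/etc/hadoop/conf

# yarn-site.xml should name the real ResourceManager, e.g.
#   <property>
#     <name>yarn.resourcemanager.hostname</name>
#     <value>rm-host.example.com</value>      (hypothetical host)
#   </property>
grep -B1 -A2 'yarn.resourcemanager.hostname' "$HADOOP_CONF_DIR/yarn-site.xml"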

Oozie Spark on YARN requirement failed

Software environment: CDH 5.7.3; Oozie 4.1.0-CDH5.7.3; Spark 1.6.0-cdh5.7.3-hadoop2.6.0-cdh5.7.3; Hadoop hadoop2.6.0-cdh5.7.3 (HDFS deployed in HA mode). Problem description: on CDH 5.7.3, an Oozie workflow is launched that submits a Spark program via Spark on YARN, but the program fails in Oozie, and the corresponding job in the YARN monitoring UI shows the error below (if this Spark job is…
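With Oozie-launched Spark actions, a "requirement failed" assertion is often tied to a stale or mismatched Spark sharelib, so a first check worth trying is listing and refreshing it (the Oozie URL is an assumption):

# list the jars the Oozie Spark sharelib is actually serving
oozie admin -oozie http://oozie-host:11000/oozie -shareliblist spark

# refresh the sharelib after fixing it on HDFS
oozie admin -oozie http://oozie-host:11000/oozie -sharelibupdate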