Exception: User class threw exception: java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.

1 Details

User class threw exception: java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.

This stopped SparkContext was created at:

org.apache.spark.SparkContext.<init>(SparkContext.scala:76)
com.wm.bigdata.spark.etl.RentOrderDistributedEtl$.main(RentOrderDistributedEtl.scala:227)
com.wm.bigdata.spark.etl.RentOrderDistributedEtl.main(RentOrderDistributedEtl.scala)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:678)

The currently active SparkContext was created at:

(No active SparkContext.)

at org.apache.spark.SparkContext.assertNotStopped(SparkContext.scala:100)
at org.apache.spark.SparkContext$$anonfun$newAPIHadoopRDD$1.apply(SparkContext.scala:1186)
at org.apache.spark.SparkContext$$anonfun$newAPIHadoopRDD$1.apply(SparkContext.scala:1185)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.SparkContext.withScope(SparkContext.scala:699)
at org.apache.spark.SparkContext.newAPIHadoopRDD(SparkContext.scala:1185)
at org.apache.phoenix.spark.PhoenixRDD.<init>(PhoenixRDD.scala:49)
at org.apache.phoenix.spark.PhoenixRelation.buildScan(PhoenixRelation.scala:39)
at org.apache.spark.sql.execution.datasources.DataSourceStrategy$$anonfun$10.apply(DataSourceStrategy.scala:293)
at org.apache.spark.sql.execution.datasources.DataSourceStrategy$$anonfun$10.apply(DataSourceStrategy.scala:293)
at org.apache.spark.sql.execution.datasources.DataSourceStrategy$$anonfun$pruneFilterProject$1.apply(DataSourceStrategy.scala:326)
at org.apache.spark.sql.execution.datasources.DataSourceStrategy$$anonfun$pruneFilterProject$1.apply(DataSourceStrategy.scala:325)
at org.apache.spark.sql.execution.datasources.DataSourceStrategy.pruneFilterProjectRaw(DataSourceStrategy.scala:381)
at org.apache.spark.sql.execution.datasources.DataSourceStrategy.pruneFilterProject(DataSourceStrategy.scala:321)
at org.apache.spark.sql.execution.datasources.DataSourceStrategy.apply(DataSourceStrategy.scala:289)
at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:63)
at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:63)
at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:435)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:441)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93)
at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2$$anonfun$apply$2.apply(QueryPlanner.scala:78)
at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2$$anonfun$apply$2.apply(QueryPlanner.scala:75)
at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
at scala.collection.Iterator$class.foreach(Iterator.scala:891)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
at scala.collection.TraversableOnce$class.foldLeft(TraversableOnce.scala:157)
at scala.collection.AbstractIterator.foldLeft(Iterator.scala:1334)
at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2.apply(QueryPlanner.scala:75)
at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2.apply(QueryPlanner.scala:67)
at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:435)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:441)
at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93)
at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2$$anonfun$apply$2.apply(QueryPlanner.scala:78)
at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2$$anonfun$apply$2.apply(QueryPlanner.scala:75)
at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
at scala.collection.Iterator$class.foreach(Iterator.scala:891)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
at scala.collection.TraversableOnce$class.foldLeft(TraversableOnce.scala:157)
at scala.collection.AbstractIterator.foldLeft(Iterator.scala:1334)
at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2.apply(QueryPlanner.scala:75)
at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2.apply(QueryPlanner.scala:67)
at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:435)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:441)
at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93)
at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2$$anonfun$apply$2.apply(QueryPlanner.scala:78)
at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2$$anonfun$apply$2.apply(QueryPlanner.scala:75)
at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
at scala.collection.Iterator$class.foreach(Iterator.scala:891)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
at scala.collection.TraversableOnce$class.foldLeft(TraversableOnce.scala:157)
at scala.collection.AbstractIterator.foldLeft(Iterator.scala:1334)
at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2.apply(QueryPlanner.scala:75)
at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2.apply(QueryPlanner.scala:67)
at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:435)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:441)
at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93)
at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2$$anonfun$apply$2.apply(QueryPlanner.scala:78)
at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2$$anonfun$apply$2.apply(QueryPlanner.scala:75)
at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
at scala.collection.Iterator$class.foreach(Iterator.scala:891)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
at scala.collection.TraversableOnce$class.foldLeft(TraversableOnce.scala:157)
at scala.collection.AbstractIterator.foldLeft(Iterator.scala:1334)
at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2.apply(QueryPlanner.scala:75)
at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2.apply(QueryPlanner.scala:67)
at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:435)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:441)
at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93)
at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2$$anonfun$apply$2.apply(QueryPlanner.scala:78)
at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2$$anonfun$apply$2.apply(QueryPlanner.scala:75)
at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
at scala.collection.Iterator$class.foreach(Iterator.scala:891)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
at scala.collection.TraversableOnce$class.foldLeft(TraversableOnce.scala:157)
at scala.collection.AbstractIterator.foldLeft(Iterator.scala:1334)
at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2.apply(QueryPlanner.scala:75)
at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2.apply(QueryPlanner.scala:67)
at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:435)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:441)
at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93)
at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2$$anonfun$apply$2.apply(QueryPlanner.scala:78)
at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2$$anonfun$apply$2.apply(QueryPlanner.scala:75)
at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
at scala.collection.Iterator$class.foreach(Iterator.scala:891)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
at scala.collection.TraversableOnce$class.foldLeft(TraversableOnce.scala:157)
at scala.collection.AbstractIterator.foldLeft(Iterator.scala:1334)
at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2.apply(QueryPlanner.scala:75)
at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2.apply(QueryPlanner.scala:67)
at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:435)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:441)
at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93)
at org.apache.spark.sql.execution.QueryExecution.sparkPlan$lzycompute(QueryExecution.scala:72)
at org.apache.spark.sql.execution.QueryExecution.sparkPlan(QueryExecution.scala:68)
at org.apache.spark.sql.execution.QueryExecution.executedPlan$lzycompute(QueryExecution.scala:77)
at org.apache.spark.sql.execution.QueryExecution.executedPlan(QueryExecution.scala:77)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$toString$3.apply(QueryExecution.scala:207)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$toString$3.apply(QueryExecution.scala:207)
at org.apache.spark.sql.execution.QueryExecution.stringOrError(QueryExecution.scala:99)
at org.apache.spark.sql.execution.QueryExecution.toString(QueryExecution.scala:207)
at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:75)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
at org.apache.spark.sql.Dataset.withNewRDDExecutionId(Dataset.scala:3346)
at org.apache.spark.sql.Dataset.foreach(Dataset.scala:2716)
at com.wm.bigdata.spark.etl.RentOrderDistributedEtl$$anonfun$main$1$$anonfun$apply$mcVJ$sp$1.apply(RentOrderDistributedEtl.scala:284)
at com.wm.bigdata.spark.etl.RentOrderDistributedEtl$$anonfun$main$1$$anonfun$apply$mcVJ$sp$1.apply(RentOrderDistributedEtl.scala:263)
at scala.collection.immutable.List.foreach(List.scala:392)
at com.wm.bigdata.spark.etl.RentOrderDistributedEtl$$anonfun$main$1.apply$mcVJ$sp(RentOrderDistributedEtl.scala:263)
at com.wm.bigdata.spark.etl.RentOrderDistributedEtl$$anonfun$main$1.apply(RentOrderDistributedEtl.scala:262)
at com.wm.bigdata.spark.etl.RentOrderDistributedEtl$$anonfun$main$1.apply(RentOrderDistributedEtl.scala:262)
at scala.collection.Iterator$class.foreach(Iterator.scala:891)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at com.wm.bigdata.spark.etl.RentOrderDistributedEtl$.main(RentOrderDistributedEtl.scala:262)
at com.wm.bigdata.spark.etl.RentOrderDistributedEtl.main(RentOrderDistributedEtl.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:678)

2 Cause

I had written the resource-shutdown calls inside the task loop. The trace makes this visible: Dataset.foreach at RentOrderDistributedEtl.scala:284 runs from inside List.foreach at line 263, but by then the SparkContext created at line 227 had already been stopped by an earlier iteration. The offending calls were:

sqlContext.sparkSession.close()
sc.stop()

Check carefully where these shutdown calls sit: they belong at the very end of the job, after every loop has finished. Moving them there fixed the problem. A basic mistake.
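A minimal sketch of the wrong and corrected placement, under simplified assumptions (the object name, local master, `days` list, and the `spark.range` placeholder source are all hypothetical; the original job read from Phoenix on YARN):

import org.apache.spark.sql.SparkSession

object StopPlacementSketch {
  def main(args: Array[String]): Unit = {
    // Hypothetical local setup; the real job ran on YARN.
    val spark = SparkSession.builder()
      .appName("StopPlacementSketch")
      .master("local[*]")
      .getOrCreate()
    val sc = spark.sparkContext

    val days = List(20190501L, 20190502L) // hypothetical work items

    days.foreach { day =>
      // Placeholder read; stands in for the Phoenix scan in the real job.
      val df = spark.range(0, 10).toDF("id")
      println(s"$day processed ${df.count()} rows")

      // Bug pattern: stopping the context here kills it after the first
      // iteration, and the next iteration then throws
      // "Cannot call methods on a stopped SparkContext".
      // spark.close(); sc.stop()
    }

    // Fix: release resources exactly once, after the loop has finished.
    spark.close() // also stops the underlying SparkContext, so sc.stop() is redundant here
  }
}

With the shutdown inside the loop, only the first item is processed; with it after the loop, every iteration reuses the same context and the job shuts down cleanly at the end.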

Original post: https://www.cnblogs.com/QuestionsZhang/p/10840538.html

