Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 8 in stage 4.0 failed 4 times, most recent failure: Lost task 8.3 in stage 4.0 (TID 332, 172.16.43.200, executor 1): java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.sql.execution.aggregate.SortAggregateExec.aggregateExpressions of type scala.collection.Seq in instance of org.apache.spark.sql.execution.aggregate.SortAggregateExec
This error occurs when only the Spark cluster address is specified via setMaster, but the setJars parameter is not set, so the executors cannot load the application's classes. The fix is to set setJars, as follows:
System.setProperty("hadoop.home.dir", "E:\\winutils")
val conf = new SparkConf()
  .setAppName("DetailRatio")
  .setMaster("spark://172.xx.xx.xx:7077")
  // .setMaster("local")
  .setJars(List("E:\\vense_work\\venseData\\out\\artifacts\\DetailRatio_jar\\venseData.jar"))
  // .set("spark.submit.deployMode", "client")
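As an alternative to calling setJars in the driver, the application jar can be passed to spark-submit, which ships it to the executors automatically. A hedged sketch (the class name DetailRatio is assumed from the app name above; adjust the main class and path to your build):

```shell
# Submitting the jar this way distributes it to executors,
# so no setJars call is needed in the driver code.
# "DetailRatio" is an assumed main class name, not confirmed by the post.
spark-submit \
  --master spark://172.xx.xx.xx:7077 \
  --class DetailRatio \
  E:\vense_work\venseData\out\artifacts\DetailRatio_jar\venseData.jar
```

Running from an IDE against a remote cluster, however, requires setJars (or spark.jars), since no spark-submit step is involved.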
Original article: https://www.cnblogs.com/zhnagqi-dream/p/11813578.html
Date: 2024-10-06 03:27:51