This problem occurs because the Spark executors are missing the Hive dependency jars on their classpath. The parameter that adds them is:
--conf "spark.executor.extraClassPath=/opt/cloudera/parcels/CDH-5.3.1-1.cdh5.3.1.p0.5/lib/hive/lib/*"
Here is a complete example:
spark-submit --class com.simple.spark.Test \
--master yarn-client \
--num-executors 2 \
--driver-memory 600m \
--executor-memory 600m \
--executor-cores 1 \
--conf "spark.executor.extraClassPath=/opt/cloudera/parcels/CDH-5.3.1-1.cdh5.3.1.p0.5/lib/hive/lib/*" \
--jars /opt/cloudera/parcels/CDH-5.3.1-1.cdh5.3.1.p0.5/lib/hive/lib/hbase-common.jar test.jar
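Instead of passing the flag on every submission, the same property can be set once in spark-defaults.conf on the gateway host so that all jobs pick it up. A minimal sketch, assuming the same CDH 5.3.1 parcel path as above (adjust it to match your installed parcel version):

```
# spark-defaults.conf — add Hive jars to every executor's classpath
spark.executor.extraClassPath  /opt/cloudera/parcels/CDH-5.3.1-1.cdh5.3.1.p0.5/lib/hive/lib/*
```

If the driver also needs the Hive classes (for example in yarn-client mode, where the driver runs locally), the analogous `spark.driver.extraClassPath` property can be set the same way.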
Date: 2024-11-05 11:46:55