spark-submit error: Application application_1529650293575_0148 finished with failed status

Date: 2022-06-02

Preface

This post records an exception I hit when submitting a Spark program with spark-submit, as a reference for anyone who runs into it for the first time and does not know the cause or the fix.

1. Exception message

Exception in thread "main" org.apache.spark.SparkException: Application application_1529650293575_0148 finished with failed status
	at org.apache.spark.deploy.yarn.Client.run(Client.scala:1187)
	at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1233)
	at org.apache.spark.deploy.yarn.Client.main(Client.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:782)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

2. Cause

This exception appeared when submitting with spark-submit in yarn cluster mode; yarn client mode does not hit it. The cause: the code hard-codes the master as local (used for local testing), and I forgot to remove it before submitting. Note that the stack trace above only reports that the application failed; the underlying error is in the YARN container logs, which can be retrieved with yarn logs -applicationId <appId>.

3. Reproducing the exception

Code

package com.dkl.leanring.spark.exception

import org.apache.spark.sql.SparkSession

object YarnClusterDemo {

  def main(args: Array[String]): Unit = {

    // master("local") was hard-coded for local testing; forgetting to remove
    // it before spark-submit is what triggers the exception in cluster mode
    val spark = SparkSession.builder().appName("YarnClusterDemo").master("local").getOrCreate()
    val sc = spark.sparkContext

    val rdd = sc.parallelize(Seq(1, 2, 3))
    println(rdd.count)

    spark.stop()
  }
}

This code runs fine locally and prints 3. Package it, copy it to the cluster, and submit it with spark-submit to test.

4. yarn client

--master yarn defaults to client mode, so the following command:

spark-submit --master yarn --class com.dkl.leanring.spark.exception.YarnClusterDemo spark-scala_2.11-1.0.jar

is equivalent to:

spark-submit --master yarn --deploy-mode client --class com.dkl.leanring.spark.exception.YarnClusterDemo spark-scala_2.11-1.0.jar

The result is also printed normally here. Since the code specifies the master as local, the job in fact still runs in local mode. I have not dug into why client mode does not fail; presumably, in client mode the machine that runs spark-submit acts as the Driver, and the hard-coded setting then runs the job in local mode there (properties set in code take precedence over the command-line --master flag).
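This behavior follows from Spark's documented configuration precedence: properties set on SparkConf in application code beat spark-submit flags, which beat spark-defaults.conf. A minimal sketch of that resolution order (ConfPrecedence and effectiveMaster are illustrative names, not Spark API):

```scala
object ConfPrecedence {
  // Spark resolves each property by taking the first source that defines it:
  // code-level SparkConf first, then spark-submit flags, then spark-defaults.conf.
  def effectiveMaster(inCode: Option[String],
                      submitFlag: Option[String],
                      defaults: Option[String]): Option[String] =
    inCode.orElse(submitFlag).orElse(defaults)
}
```

With master("local") in the code and --master yarn on the command line, the master resolves to local, which is why the job silently runs locally in client mode.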

5. yarn cluster

The following command reproduces the exception:

spark-submit --master yarn --deploy-mode cluster --class com.dkl.leanring.spark.exception.YarnClusterDemo spark-scala_2.11-1.0.jar

So when actually submitting a program with spark-submit, remove the master setting from the code and specify it on the command line with --master instead:

val spark = SparkSession.builder().appName("YarnClusterDemo").getOrCreate()
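If you still want the same jar to run from the IDE without spark-submit, one possible pattern is to hard-code a local master only when none was supplied externally. This is a sketch; MasterHelper and localMasterFallback are hypothetical names, not Spark API:

```scala
object MasterHelper {
  // spark-submit sets the spark.master system property from --master,
  // so its presence tells us whether a master was supplied externally.
  // Return None to defer to it, or a local fallback for IDE runs.
  def localMasterFallback(props: Map[String, String]): Option[String] =
    if (props.contains("spark.master")) None else Some("local[*]")
}

// Usage sketch (assumes Spark on the classpath):
// val builder = SparkSession.builder().appName("YarnClusterDemo")
// val spark = MasterHelper.localMasterFallback(sys.props.toMap)
//   .fold(builder)(builder.master)
//   .getOrCreate()
```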

6. Screenshots

(The screenshots attached to the original post are not reproduced here.)

This article was written by 董可伦 (Dong Kelun) and published on 伦少的博客, licensed under CC Attribution-NonCommercial-NoDerivatives 3.0.

For non-commercial reproduction, please credit the author and source. For commercial reproduction, please contact the author.

Title: spark-submit error: Application application_1529650293575_0148 finished with failed status

Link: https://dongkelun.com/2018/07/06/sparkSubmitException1/

--------------------

Author contact:

QQ: 1412359494

WeChat: dongkelun

--------------------