org.apache.flink.client.program.ProgramInvocationException: The program execution failed: Job execution failed.


Solutions on the web (9)

via Stack Overflow by kadsank, 8 months ago
The program execution failed: Job execution failed.

via Apache's JIRA Issue Tracker by Renkai Ge, 1 year ago
The program execution failed: Job execution failed.

via Stack Overflow by Borja, 4 months ago
The program execution failed: Job execution failed.

via Stack Overflow
The program execution failed: Job execution failed.

via GitHub by karloscampus, 9 months ago
The program execution failed: Could not upload the jar files to the job manager.

via Stack Overflow
The program execution failed: Communication with JobManager failed: Job submission to the JobManager timed out. You may increase 'akka.client.timeout' in case the JobManager needs more time to configure and confirm the job submission.

via GitHub
The program execution failed: Couldn't retrieve the JobExecutionResult from the JobManager.

via Stack Overflow
The program execution failed: Cannot initialize task 'CHAIN DataSource (at main(JobSource.java:49) (es.mypackage.flink.Sources.MyInputFormat)) -> Map (Map at main(JobSource.java:53))': Configuring the InputFormat (es.mypackage.flink.Sources.MyInputFormat@1f81aa00) failed: null

via Stack Overflow
The program execution failed: Cannot initialize task 'CHAIN DataSource (at createInput(ExecutionEnvironment.java:553) (org.apache.flink.api.java.io.jdbc.JDBCInputFormat)) -> FlatMap (where: (=(abnor_flag, 0)), select: (abnor_flag))': Deserializing
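
One of the Stack Overflow entries above suggests increasing 'akka.client.timeout' when job submission to the JobManager times out. As a hedged example (the default is typically 60 s; the right value depends on how long the JobManager needs to acknowledge the submission), the timeout can be raised in the client's flink-conf.yaml before resubmitting:

	akka.client.timeout: 600 s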

Stack trace

org.apache.flink.client.program.ProgramInvocationException: The program execution failed: Job execution failed.
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:409)
	at org.apache.flink.client.program.StandaloneClusterClient.submitJob(StandaloneClusterClient.java:95)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:382)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:374)
	at org.apache.flink.streaming.api.environment.RemoteStreamEnvironment.executeRemotely(RemoteStreamEnvironment.java:209)
	at org.apache.flink.streaming.api.environment.RemoteStreamEnvironment.execute(RemoteStreamEnvironment.java:173)
	at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1429)
	at flink_ignite_sink_remote.main(flink_ignite_sink_remote.java:77)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$8.apply$mcV$sp(JobManager.scala:822)
	at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$8.apply(JobManager.scala:768)
	at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$8.apply(JobManager.scala:768)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
	at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)
	at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:401)
	at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.pollAndExecAll(ForkJoinPool.java:1253)
	at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1346)
	at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.ignite.sink.flink.IgniteSink$SinkContext$Holder
	at org.apache.ignite.sink.flink.IgniteSink$SinkContext.getStreamer(IgniteSink.java:201)
	at org.apache.ignite.sink.flink.IgniteSink$SinkContext.access$100(IgniteSink.java:175)
	at org.apache.ignite.sink.flink.IgniteSink.invoke(IgniteSink.java:165)
	at org.apache.flink.streaming.api.operators.StreamSink.processElement(StreamSink.java:39)
	at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:373)
	at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:358)
	at org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:346)
	at org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:329)
	at org.apache.flink.streaming.api.operators.TimestampedCollector.collect(TimestampedCollector.java:51)
	at flink_ignite_sink_remote$Splitter.flatMap(flink_ignite_sink_remote.java:177)
	at flink_ignite_sink_remote$Splitter.flatMap(flink_ignite_sink_remote.java:1)
	at org.apache.flink.streaming.api.operators.StreamFlatMap.processElement(StreamFlatMap.java:48)
	at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:373)
	at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:358)
	at org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:346)
	at org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:329)
	at org.apache.flink.streaming.api.operators.StreamSource$NonTimestampContext.collect(StreamSource.java:161)
	at org.apache.flink.streaming.connectors.kafka.internals.AbstractFetcher.emitRecord(AbstractFetcher.java:225)
	at org.apache.flink.streaming.connectors.kafka.internal.Kafka09Fetcher.run(Kafka09Fetcher.java:253)
	at java.lang.Thread.run(Thread.java:745)
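
The root cause in the trace is a failed static initialization of the Ignite sink's streamer holder (IgniteSink$SinkContext$Holder) on a task manager. Two common reasons are an Ignite XML configuration file that is not readable at the same path on every task manager node, and ignite-flink/ignite-core classes missing from the jar shipped with the job. The sketch below is a minimal, hedged example of the sink wiring against the older ignite-flink API visible in the trace; the host, cache name, and file paths are hypothetical placeholders, not values taken from the failing job.

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.ignite.sink.flink.IgniteSink;

import java.util.Collections;
import java.util.Map;

public class IgniteSinkSketch {
    public static void main(String[] args) throws Exception {
        // Remote submission, as in the trace above. The jar listed here must bundle
        // ignite-flink and ignite-core (or they must already be on the cluster
        // classpath); otherwise the sink's inner classes cannot be initialized on
        // the task managers. Host and jar path are hypothetical placeholders.
        StreamExecutionEnvironment env = StreamExecutionEnvironment.createRemoteEnvironment(
            "jobmanager-host", 6123, "/path/to/flink-ignite-job-fat.jar");

        // Cache name and Ignite config path are placeholders. The XML file has to be
        // readable at this path on every node that runs the sink; if the static holder
        // fails to initialize once, later records surface the NoClassDefFoundError
        // seen in the trace.
        IgniteSink<Map<String, String>> sink =
            new IgniteSink<>("myCache", "/opt/ignite/config/example-ignite.xml");
        sink.setAllowOverwrite(true);
        sink.start(); // documented usage: start the sink before submitting the job

        // Turn a toy source into Map records, which is what IgniteSink expects.
        DataStream<Map<String, String>> words = env
            .fromElements("a", "b", "c")
            .map(new MapFunction<String, Map<String, String>>() {
                @Override
                public Map<String, String> map(String word) {
                    return Collections.singletonMap(word, word);
                }
            });

        words.addSink(sink);

        env.execute("ignite-sink-sketch");
    }
}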
