When you submit a Spark application through SparkLauncher, the launch sometimes hangs with status RUNNING and finalStatus UNDEFINED. SparkLauncher runs the job through Java's ProcessBuilder, and after some digging it turns out this is usually caused by an I/O buffer filling up, which blocks the child process.
The official documentation (the java.lang.Process Javadoc) warns about exactly this:
By default, the created subprocess does not have its own terminal or console. All its standard I/O (i.e. stdin, stdout, stderr) operations will be redirected to the parent process, where they can be accessed via the streams obtained using the methods getOutputStream(), getInputStream(), and getErrorStream(). The parent process uses these streams to feed input to and get output from the subprocess. Because some native platforms only provide limited buffer size for standard input and output streams, failure to promptly write the input stream or read the output stream of the subprocess may cause the subprocess to block, or even deadlock.
So we have to start threads of our own to drain those buffers. The most important one is getErrorStream(): Spark's INFO logging generally goes to stderr, so that is the stream that fills up first. A sketch of the fix follows.
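Below is a minimal sketch of that fix. The jar path, main class, and master URL are hypothetical placeholders to substitute with your own; the point is launching through SparkLauncher and draining both streams on background threads so the child can never block on a full buffer.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;

import org.apache.spark.launcher.SparkLauncher;

public class SparkLauncherDemo {

    // Consume a child-process stream on its own thread so the OS buffer
    // behind it can never fill up and block the subprocess.
    private static Thread drain(InputStream in, String tag) {
        Thread t = new Thread(() -> {
            try (BufferedReader reader =
                     new BufferedReader(new InputStreamReader(in))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    System.out.println(tag + line);
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
        });
        t.setDaemon(true);
        t.start();
        return t;
    }

    public static void main(String[] args) throws Exception {
        // Jar path, main class, and master below are placeholders.
        Process spark = new SparkLauncher()
                .setAppResource("/path/to/your-app.jar")
                .setMainClass("com.example.YourSparkApp")
                .setMaster("yarn")
                .launch();

        // Spark writes most of its INFO logging to stderr, so draining
        // getErrorStream() is the critical part.
        Thread err = drain(spark.getErrorStream(), "[stderr] ");
        Thread out = drain(spark.getInputStream(), "[stdout] ");

        int exitCode = spark.waitFor();
        err.join();
        out.join();
        System.out.println("Spark exited with code " + exitCode);
    }
}

As an alternative, SparkLauncher also offers startApplication(), which returns a SparkAppHandle and reports state transitions through listeners, so you can track the job without parsing the child's output yourself.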