Fixing the cores=0 problem in Spark standalone deploy mode

When running a Spark program in Docker, the Docker logs print the following warning (the message is truncated in the original capture; the full text of this well-known Spark warning is restored here):

[Timer-0] o.a.spark.scheduler.TaskSchedulerImpl : Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
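This warning from org.apache.spark.scheduler.TaskSchedulerImpl generally means the driver's resource request cannot be satisfied: either no worker has registered with the standalone master at all, or the registered workers advertise zero (or too few) free cores, or too little memory, for what the application asks. Below is a minimal driver-side sketch, not the article's own fix: it assumes a standalone master reachable at spark://master:7077 (a hypothetical URL) and simply caps the cores and memory the application requests so that the request fits what the workers actually offer.

```scala
import org.apache.spark.sql.SparkSession

object CoresDemo {
  def main(args: Array[String]): Unit = {
    // Assumption: standalone master at spark://master:7077 (hypothetical host).
    val spark = SparkSession.builder()
      .master("spark://master:7077")
      .appName("cores-demo")
      // Ask for no more than the cluster can actually provide:
      // spark.cores.max caps the total cores this app may claim, and
      // spark.executor.memory must not exceed what a worker advertises.
      .config("spark.cores.max", "2")
      .config("spark.executor.memory", "512m")
      .getOrCreate()

    // Run a trivial job to verify that executors were actually granted.
    println(spark.sparkContext.parallelize(1 to 100).count())
    spark.stop()
  }
}
```

On the worker side, the resources a standalone worker advertises are controlled by the SPARK_WORKER_CORES and SPARK_WORKER_MEMORY environment variables (set in conf/spark-env.sh, or passed as environment variables to the Docker container). If the master's UI shows a worker registered with 0 cores, as the title describes, setting those variables explicitly is the usual remedy, since core auto-detection can misbehave inside containers.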