Fixing the "cores = 0" problem in Spark standalone deploy mode

While running a Spark program in Docker, the container log kept printing the following warning:

[Timer-0] o.a.spark.scheduler.TaskSchedulerImpl : Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
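This warning typically appears when the registered workers have no free resources to offer, for example when the cluster UI shows a worker with 0 cores, or when the application requests more cores or memory than any single worker advertises. One way to rule out the second cause is to explicitly cap what the application asks for. Below is a minimal sketch, not the article's own code: the master URL spark://spark-master:7077, the app name, and the specific core/memory values are all assumptions for illustration.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

object CoresDemo {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      // Hypothetical master hostname inside the Docker network.
      .setMaster("spark://spark-master:7077")
      .setAppName("cores-demo")
      // Cap the total cores this app may take, so the request
      // fits within what the standalone worker actually offers.
      .set("spark.cores.max", "2")
      // Keep the per-executor memory request within the worker's limit.
      .set("spark.executor.memory", "512m")

    val spark = SparkSession.builder().config(conf).getOrCreate()
    // ... job logic ...
    spark.stop()
  }
}
```

If the worker itself registers with 0 cores, the fix is on the worker side rather than in the application; a common approach is to set SPARK_WORKER_CORES (and SPARK_WORKER_MEMORY) explicitly in the worker's spark-env.sh instead of relying on auto-detection inside the container.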