Understanding the Spark RDD Abstraction Model in Depth

This article covers understanding the Spark RDD abstraction model in depth and writing RDD functions. Spark revolves around the concept of a resilient distributed dataset (RDD), which is an immutable, fault-tolerant, partitioned collection of elements that can be operated on in parallel.
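
To make those three properties concrete, here is a minimal Scala sketch using the standard SparkContext API. The application name, the `local[*]` master, and the choice of 4 partitions are illustrative assumptions rather than anything prescribed by the article.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object RDDModelDemo {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("rdd-model-demo").setMaster("local[*]")
    val sc   = new SparkContext(conf)

    // Partitioned: the collection is split into 4 partitions that can be
    // processed in parallel (across cluster executors, or local threads here).
    val numbers = sc.parallelize(1 to 100, numSlices = 4)
    println(s"partitions = ${numbers.getNumPartitions}")

    // Immutable: filter/map never modify `numbers`; each transformation
    // returns a new RDD that records its lineage back to the parent RDD.
    val evens   = numbers.filter(_ % 2 == 0)
    val squares = evens.map(n => n * n)

    // Fault-tolerant: if a partition of `squares` is lost, Spark recomputes
    // it from the lineage (parallelize -> filter -> map) rather than relying
    // on replicated copies of the data.
    val total = squares.reduce(_ + _)
    println(s"sum of squares of evens = $total")

    sc.stop()
  }
}
```

Note that `filter`, `map`, and other transformations are lazy: nothing is computed until an action such as `reduce` forces evaluation, at which point Spark schedules the work per partition.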