Spark OneToOneDependency: One-to-One Dependency


  • Represents a one-to-one dependency between partitions of the parent and child RDDs.
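
A minimal sketch of how this shows up in practice follows. It uses only standard Spark APIs (org.apache.spark.OneToOneDependency and RDD.dependencies); the object and variable names (OneToOneDependencyDemo, parent, child) are invented for illustration and are not part of the original program.

import org.apache.spark.{OneToOneDependency, SparkConf, SparkContext}

object OneToOneDependencyDemo {

  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("OneToOneDependencyDemo").setMaster("local[2]")
    val sc = new SparkContext(conf)

    // Parent RDD with two partitions, mirroring the sample input "a bc" / "a".
    val parent = sc.parallelize(Seq("a bc", "a"), 2)

    // map is a narrow transformation: partition i of the child reads only
    // partition i of the parent, so the child carries a OneToOneDependency.
    val child = parent.map(_.split(" ").length)

    child.dependencies.foreach {
      case d: OneToOneDependency[_] => println("OneToOneDependency on " + d.rdd)
      case d                        => println("Other dependency: " + d)
    }

    sc.stop()
  }
}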

More resources

Video demonstration (Bilibili)

<iframe src="//player.bilibili.com/player.html?aid=37442139&cid=65822237&page=1" scrolling="no" border="0" frameborder="no" framespacing="0" allowfullscreen="true"> </iframe>

Input data

a bc
a

Processing program (Scala)

package com.opensource.bigdata.spark.local.rdd.operation.dependency.narrow.n_02_RangeDependency

import com.opensource.bigdata.spark.local.rdd.operation.base.BaseScalaSparkContext

object Run1 extends BaseScalaSparkContext {

  def main(args: Array[String]): Unit = {
    val sc = pre()
    // textFile builds a HadoopRDD and then maps it to a MapPartitionsRDD,
    // which depends on the HadoopRDD through a OneToOneDependency.
    val rdd1 = sc.textFile("/opt/data/2/c.txt", 2)

    println(rdd1.collect().mkString("\n"))

    //rdd1.partitions(0).asInstanceOf[org.apache.spark.rdd.HadoopPartition]

    sc.stop()
  }

}
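
A rough way to confirm the dependency type from the program above, assuming the standard Spark implementation in which textFile wraps a HadoopRDD in a MapPartitionsRDD, is to print rdd1.dependencies before stopping the context:

    rdd1.dependencies.foreach(dep => println(dep.getClass.getSimpleName + " on " + dep.rdd))

The single dependency reported is expected to be a OneToOneDependency on the underlying HadoopRDD.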

Data processing diagram

[Figure: one-to-one dependency between partitions of the parent and child RDDs]
