Spark DataFrame learning notes [2]: removing duplicate columns produced by a DataFrame JOIN (resolving the "Reference '***' is ambiguous" error)

[1] Reposted section (source: http://blog.csdn.net/sparkexpert/article/details/52837269). Two DataFrames are created, the first as follows: val df = sc.parallelize(Array( ("one", "A", 1), ("one", "B", 2), ("two", "A", 3), ("two", "B", 4) )).toDF(
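The excerpt above is cut off mid-call, so the column names below are assumptions, not the original post's. The problem it describes is this: when two DataFrames share a key column name, joining on an expression keeps both copies, and any later reference to that column fails with `Reference '...' is ambiguous`. Passing the join keys as a `Seq` of column names instead makes Spark merge them into a single column. A minimal sketch:

```scala
import org.apache.spark.sql.SparkSession

object JoinDedup {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("join-dedup")
      .getOrCreate()
    import spark.implicits._

    // Column names "k", "v", "n1"/"n2" are illustrative assumptions.
    val df1 = Seq(("one", "A", 1), ("one", "B", 2),
                  ("two", "A", 3), ("two", "B", 4)).toDF("k", "v", "n1")
    val df2 = Seq(("one", "A", 5), ("two", "A", 6)).toDF("k", "v", "n2")

    // Joining on an expression keeps both copies of "k" and "v",
    // so selecting "k" afterwards would fail:
    // df1.join(df2, df1("k") === df2("k")).select("k")  // Reference 'k' is ambiguous

    // Passing the join keys as a Seq keeps a single copy of each key column:
    val joined = df1.join(df2, Seq("k", "v"))
    joined.select("k", "v", "n1", "n2").show()

    spark.stop()
  }
}
```

An alternative, when the keys must keep different names, is to rename one side first (e.g. `df2.withColumnRenamed("k", "k2")`) and drop the duplicate after the join.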