1. Euclidean distance
d(x,y) = √( (x[1]-y[1])^2 + (x[2]-y[2])^2 + … + (x[n]-y[n])^2 )
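As a minimal sketch of the formula above in plain Scala (no Spark dependency; the helper name `euclidean` is made up for illustration):

```scala
// Euclidean distance between two equal-length vectors:
// the square root of the sum of squared coordinate differences.
def euclidean(x: Array[Double], y: Array[Double]): Double = {
  require(x.length == y.length, "vectors must have the same dimension")
  math.sqrt(x.zip(y).map { case (a, b) => (a - b) * (a - b) }.sum)
}

// For example, the distance between (0, 0) and (3, 4) is 5.0.
println(euclidean(Array(0.0, 0.0), Array(3.0, 4.0))) // prints 5.0
```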
2. Squared Euclidean distance
Spark KMeans uses the squared Euclidean distance as its distance measure. The squared Euclidean distance is simply the Euclidean distance with the square root removed:
d(x,y) = (x[1]-y[1])^2 + (x[2]-y[2])^2 + … + (x[n]-y[n])^2
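Because the square root is monotone, the nearest center is the same under either measure, so skipping it inside K-means saves work without changing the result. A sketch (the helper name `squaredEuclidean` is made up for illustration):

```scala
// Squared Euclidean distance: the sum of squared coordinate
// differences, without the final square root.
def squaredEuclidean(x: Array[Double], y: Array[Double]): Double = {
  require(x.length == y.length, "vectors must have the same dimension")
  x.zip(y).map { case (a, b) => (a - b) * (a - b) }.sum
}

// (3, 4) vs (0, 0): 25.0, i.e. the Euclidean distance 5.0 squared.
println(squaredEuclidean(Array(3.0, 4.0), Array(0.0, 0.0))) // prints 25.0
```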
3. Sum of Squared Errors (SSE)
Spark KMeans uses the sum of squared errors as its evaluation metric.
Formula: ∑(actual - predicted)²
Note: this is exactly the sum of the squared Euclidean distances from each point to its cluster center.
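Putting the two ideas together, the SSE of a clustering is the sum, over all points, of the squared Euclidean distance to the assigned center. A toy computation (data and helper name are made up for illustration):

```scala
// Squared Euclidean distance between two equal-length vectors.
def squaredEuclidean(x: Array[Double], y: Array[Double]): Double =
  x.zip(y).map { case (a, b) => (a - b) * (a - b) }.sum

val center = Array(1.0, 1.0) // a single cluster center
val points = Seq(Array(1.0, 2.0), Array(2.0, 1.0), Array(0.0, 0.0))

// SSE: sum of each point's squared distance to its center.
val sse = points.map(p => squaredEuclidean(p, center)).sum
println(sse) // prints 4.0  (1.0 + 1.0 + 2.0)
```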
4. Relevant Spark code
Located at spark/mllib/src/main/scala/org/apache/spark/mllib/clustering/KMeansModel.scala:
/**
 * Return the K-means cost (sum of squared distances of points to
 * their nearest center) for this model on the given data.
 */
@Since("0.8.0")
def computeCost(data: RDD[Vector]): Double = {
  val bcCentersWithNorm = data.context.broadcast(clusterCentersWithNorm) // broadcast the cluster centers
  val cost = data.map(p =>
    distanceMeasureInstance.pointCost(bcCentersWithNorm.value, new VectorWithNorm(p)))
    .sum() // sum each point's distance to its nearest center
  bcCentersWithNorm.destroy()
  cost
}
Here distanceMeasureInstance is defined in spark/mllib/src/main/scala/org/apache/spark/mllib/clustering/DistanceMeasure.scala:
/**
 * @return the index of the closest center to the given point, as well as the cost.
 */
def findClosest(
    centers: Array[VectorWithNorm],
    point: VectorWithNorm): (Int, Double) = {
  var bestDistance = Double.PositiveInfinity
  var bestIndex = 0
  var i = 0
  while (i < centers.length) {
    val center = centers(i)
    val currentDistance = distance(center, point) // squared Euclidean distance
    if (currentDistance < bestDistance) {
      bestDistance = currentDistance
      bestIndex = i
    }
    i += 1
  }
  (bestIndex, bestDistance)
}

/**
 * @return the K-means cost of the given point against the given cluster centers.
 */
def pointCost(
    centers: Array[VectorWithNorm],
    point: VectorWithNorm): Double = {
  findClosest(centers, point)._2
}
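Stripped of Spark's VectorWithNorm wrapper, the nearest-center search above can be sketched in plain Scala (helper names are made up; Spark's real distance call additionally uses precomputed norms for a fast lower bound, which is omitted here):

```scala
// Squared Euclidean distance between two equal-length vectors.
def squaredEuclidean(x: Array[Double], y: Array[Double]): Double =
  x.zip(y).map { case (a, b) => (a - b) * (a - b) }.sum

// Index of the closest center and the squared distance to it.
def findClosest(centers: Array[Array[Double]], point: Array[Double]): (Int, Double) = {
  var bestDistance = Double.PositiveInfinity
  var bestIndex = 0
  var i = 0
  while (i < centers.length) {
    val d = squaredEuclidean(centers(i), point)
    if (d < bestDistance) {
      bestDistance = d
      bestIndex = i
    }
    i += 1
  }
  (bestIndex, bestDistance)
}

// A point's cost is just the distance component of findClosest.
def pointCost(centers: Array[Array[Double]], point: Array[Double]): Double =
  findClosest(centers, point)._2

val centers = Array(Array(0.0, 0.0), Array(10.0, 10.0))
println(findClosest(centers, Array(1.0, 1.0))) // prints (0,2.0)
```

Summing pointCost over every point reproduces the computeCost logic above, minus the RDD and broadcast machinery.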
Summary: Spark's SSE code reuses the same squared-Euclidean-distance computation it uses to find the nearest cluster center, which is why the SSE is exactly the sum of each point's squared Euclidean distance to its cluster center.