
O(1) 的小樂

k-means clustering

      In statistics and machine learning, k-means clustering is a method of cluster analysis which aims to partition n observations into k clusters in which each observation belongs to the cluster with the nearest mean. It is similar to the expectation-maximization algorithm for mixtures of Gaussians in that both attempt to find the centers of natural clusters in the data, and both employ an iterative refinement approach.

 

Description

Given a set of observations (x1, x2, …, xn), where each observation is a d-dimensional real vector, k-means clustering aims to partition the n observations into k sets (k < n) S = {S1, S2, …, Sk} so as to minimize the within-cluster sum of squares (WCSS):

\underset{\mathbf{S}}{\operatorname{arg\,min}} \sum_{i=1}^{k} \sum_{\mathbf{x}_j \in S_i} \left\| \mathbf{x}_j - \boldsymbol{\mu}_i \right\|^2

where μi is the mean of points in Si.
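To make the objective concrete, here is a small pure-Python sketch (my own illustrative helpers, not from the original post) that evaluates the WCSS of a given partition, with points represented as tuples:

```python
def mean(points):
    """Component-wise mean of a non-empty list of d-dimensional points."""
    d = len(points[0])
    return tuple(sum(p[i] for p in points) / len(points) for i in range(d))

def wcss(clusters):
    """Within-cluster sum of squares: for each cluster, sum the squared
    Euclidean distances from its points to the cluster mean."""
    total = 0.0
    for points in clusters:
        mu = mean(points)
        total += sum(sum((x - m) ** 2 for x, m in zip(p, mu)) for p in points)
    return total
```

For example, the partition `[[(0, 0), (0, 2)], [(10, 0)]]` has a first-cluster mean of (0, 1), so its WCSS is 1 + 1 + 0 = 2. k-means searches over partitions to make this quantity as small as possible.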

 

Algorithms

Regarding computational complexity, the k-means clustering problem is:

  • NP-hard in a general Euclidean space of dimension d, even for 2 clusters [4][5]
  • NP-hard for a general number of clusters k, even in the plane [6]
  • If k and d are fixed, the problem can be solved exactly in time O(n^{dk+1} log n), where n is the number of entities to be clustered [7]

Thus, a variety of heuristic algorithms are generally used.

 

Note that the general problem is NP-hard, which is why heuristic methods are what we usually reach for in practice.

Standard algorithm

The most common algorithm uses an iterative refinement technique.

Due to its ubiquity it is often called the k-means algorithm; it is also referred to as Lloyd's algorithm, particularly in the computer science community.

Given an initial set of k means m1(1),…,mk(1), which may be specified randomly or by some heuristic, the algorithm proceeds by alternating between two steps:[8]

Assignment step: Assign each observation to the cluster with the closest mean (i.e. partition the observations according to the Voronoi diagram generated by the means; the norm here is the 2-norm, i.e. Euclidean distance, which is exactly what makes the partition a Voronoi diagram).
S_i^{(t)} = \left\{ \mathbf x_j : \big\| \mathbf x_j - \mathbf m^{(t)}_i \big\| \leq \big\| \mathbf x_j - \mathbf m^{(t)}_{i^*} \big\| \text{ for all }i^*=1,\ldots,k \right\}
 
Update step: Recompute each mean as the centroid of the observations in its cluster.
\mathbf m^{(t+1)}_i = \frac{1}{|S^{(t)}_i|} \sum_{\mathbf x_j \in S^{(t)}_i} \mathbf x_j

The algorithm is deemed to have converged when the assignments no longer change.
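The two alternating steps can be sketched in a few dozen lines of Python. This is an illustrative toy implementation, not the original post's code; the function and variable names are my own, and the initial means are drawn uniformly at random from the data, which is just one of the possible heuristics mentioned above.

```python
import random

def kmeans(points, k, max_iter=100, seed=0):
    """Lloyd's algorithm sketch: alternate the assignment step and the
    update step until the assignments no longer change.
    `points` is a list of equal-length tuples."""
    rng = random.Random(seed)
    means = rng.sample(points, k)  # initial means: k random data points
    assignment = None
    for _ in range(max_iter):
        # Assignment step: each point joins the cluster of its nearest
        # mean (squared Euclidean distance, i.e. the 2-norm).
        new_assignment = [
            min(range(k),
                key=lambda i: sum((x - m) ** 2 for x, m in zip(p, means[i])))
            for p in points
        ]
        if new_assignment == assignment:
            break  # converged: assignments are stable
        assignment = new_assignment
        # Update step: each mean becomes the centroid of its cluster.
        for i in range(k):
            members = [p for p, a in zip(points, assignment) if a == i]
            if members:  # keep the old mean if a cluster empties out
                d = len(members[0])
                means[i] = tuple(sum(p[j] for p in members) / len(members)
                                 for j in range(d))
    return means, assignment
```

On two well-separated blobs of points, the returned means settle on the blob centroids and the assignment labels each blob as one cluster, regardless of which points were drawn as the initial means.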

 

The flow of the whole algorithm is simply the alternation of the two steps above.

 

As it is a heuristic algorithm, there is no guarantee that it will converge to the global optimum, and the result may depend on the initial clusters. As the algorithm is usually very fast, it is common to run it multiple times with different starting conditions. However, in the worst case, k-means can be very slow to converge: in particular it has been shown that there exist certain point sets, even in 2 dimensions, on which k-means takes exponential time, that is 2^{Ω(n)}, to converge [9][10]. These point sets do not seem to arise in practice: this is corroborated by the fact that the smoothed running time of k-means is polynomial [11].

In other words, the worst-case running time is 2^{Ω(n)}, but in practice k-means generally behaves like a polynomial-time algorithm.

The "assignment" step is also referred to as the expectation step and the "update" step as the maximization step, making this algorithm a variant of the generalized expectation-maximization algorithm.

Variations

  • The expectation-maximization algorithm (EM algorithm) maintains probabilistic assignments to clusters, instead of deterministic assignments, and multivariate Gaussian distributions instead of means.
  • k-means++ seeks to choose better starting clusters.
  • The filtering algorithm uses kd-trees to speed up each k-means step.[12]
  • Some methods attempt to speed up each k-means step using coresets[13] or the triangle inequality.[14]
  • Escape local optima by swapping points between clusters.[15]
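As one example from the list above, the k-means++ seeding idea can be sketched as follows. This is a hypothetical implementation of the seeding step only (the rest of k-means runs unchanged afterwards): the first center is uniform at random, and each later center is a data point sampled with probability proportional to its squared distance (D²) to the nearest center chosen so far.

```python
import random

def kmeanspp_init(points, k, seed=0):
    """k-means++ seeding sketch: D^2-weighted sampling of initial centers.
    `points` is a list of equal-length tuples."""
    rng = random.Random(seed)
    centers = [rng.choice(points)]  # first center: uniform at random
    while len(centers) < k:
        # Squared distance from each point to its nearest chosen center.
        d2 = [min(sum((x - c_x) ** 2 for x, c_x in zip(p, c))
                  for c in centers)
              for p in points]
        # Sample the next center proportionally to d2. Already-chosen
        # points have weight 0, so they are never picked again.
        centers.append(rng.choices(points, weights=d2, k=1)[0])
    return centers
```

Because far-away points get large weights, the chosen centers tend to be spread out, which in turn makes the subsequent Lloyd iterations less likely to start in a bad local optimum.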

Discussion

[Figure: Iris Flowers Clustering kMeans.svg — k-means clustering result for the Iris flower data set and actual species visualized using ELKI. Cluster means are marked using larger, semi-transparent symbols.]

[Figure: ClusterAnalysis Mouse.svg — k-means clustering and EM clustering on an artificial dataset ("mouse"). The tendency of k-means to produce equi-sized clusters leads to bad results, while EM benefits from the Gaussian distributions present in the data set.]

The two key features of k-means which make it efficient are often regarded as its biggest drawbacks:

A key limitation of k-means is its cluster model. The concept is based on spherical clusters that are separable in such a way that the mean value converges towards the cluster center. The clusters are expected to be of similar size, so that assignment to the nearest cluster center is the correct assignment. When, for example, k-means with k = 3 is applied to the well-known Iris flower data set, the result often fails to separate the three Iris species contained in the data set. With k = 2, the two visible clusters (one containing two species) will be discovered, whereas with k = 3 one of the two clusters will be split into two even parts. In fact, k = 2 is more appropriate for this data set, despite the data set containing 3 classes. As with any other clustering algorithm, the k-means result relies on the data set satisfying the assumptions made by the algorithm. It works very well on some data sets, while failing miserably on others.

The result of k-means can also be seen as the Voronoi cells of the cluster means. Since data is split halfway between cluster means, this can lead to suboptimal splits as can be seen in the "mouse" example. The Gaussian models used by the Expectation-maximization algorithm (which can be seen as a generalization of k-means) are more flexible here by having both variances and covariances. The EM result is thus able to accommodate clusters of variable size much better than k-means as well as correlated clusters (not in this example).

 

This post is a conceptual introduction; a follow-up will include code and a paper on optimizing k-means:

Fast Hierarchical Clustering Algorithm Using Locality-Sensitive Hashing

posted on 2010-10-19 18:57 by Sosi | Views (1606) | Comments (0) | Category: Courses
