
O(1) 的小樂


k-means clustering

      In statistics and machine learning, k-means clustering is a method of cluster analysis which aims to partition n observations into k clusters in which each observation belongs to the cluster with the nearest mean. It is similar to the expectation-maximization algorithm for mixtures of Gaussians in that both attempt to find the centers of natural clusters in the data, and both employ an iterative refinement approach.

 

Description

Given a set of observations (x1, x2, …, xn), where each observation is a d-dimensional real vector, k-means clustering aims to partition the n observations into k sets (k < n) S = {S1, S2, …, Sk} so as to minimize the within-cluster sum of squares (WCSS):

\underset{\mathbf{S}}{\operatorname{arg\,min}} \sum_{i=1}^{k} \sum_{\mathbf{x}_j \in S_i} \left\| \mathbf{x}_j - \boldsymbol{\mu}_i \right\|^2

where μi is the mean of points in Si.
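As a concrete illustration of the objective, the sketch below computes the WCSS for a given partition (the function name and the toy data are my own, not from the post):

```python
import numpy as np

def wcss(points, labels, centers):
    """Within-cluster sum of squares: total squared distance from
    each point to the mean of the cluster it is assigned to."""
    return sum(
        np.sum((points[labels == i] - c) ** 2)
        for i, c in enumerate(centers)
    )

# Two well-separated 1-D clusters with means 1.0 and 10.0.
pts = np.array([[0.0], [2.0], [9.0], [11.0]])
lab = np.array([0, 0, 1, 1])
ctr = np.array([[1.0], [10.0]])
print(wcss(pts, lab, ctr))  # (1 + 1) + (1 + 1) = 4.0
```

k-means searches over partitions S to make this quantity as small as possible; each μi in the formula corresponds to one row of `centers` here.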

 

Algorithms

Regarding computational complexity, the k-means clustering problem is:

  • NP-hard in general Euclidean space d even for 2 clusters [4][5]
  • NP-hard for a general number of clusters k even in the plane [6]
  • If k and d are fixed, the problem can be solved exactly in time O(n^{dk+1} log n), where n is the number of entities to be clustered [7]

Thus, a variety of heuristic algorithms are generally used.

 

Since the problem is NP-hard in general, heuristic methods are what is typically used in practice.

Standard algorithm

The most common algorithm uses an iterative refinement technique.

Due to its ubiquity it is often called the k-means algorithm; it is also referred to as Lloyd's algorithm, particularly in the computer science community.

Given an initial set of k means m1(1),…,mk(1), which may be specified randomly or by some heuristic, the algorithm proceeds by alternating between two steps:[8]

Assignment step: Assign each observation to the cluster with the closest mean, i.e. partition the observations according to the Voronoi diagram generated by the means (this divides the space into k Voronoi cells; the norm here is the 2-norm, i.e. Euclidean distance, which is what makes the partition a Voronoi diagram).
S_i^{(t)} = \left\{ \mathbf x_j : \big\| \mathbf x_j - \mathbf m^{(t)}_i \big\| \leq \big\| \mathbf x_j - \mathbf m^{(t)}_{i^*} \big\| \text{ for all }i^*=1,\ldots,k \right\}
 
Update step: Calculate the new means to be the centroid of the observations in the cluster.
\mathbf m^{(t+1)}_i = \frac{1}{|S^{(t)}_i|} \sum_{\mathbf x_j \in S^{(t)}_i} \mathbf x_j
That is, recompute each mean from the current members of its cluster.

The algorithm is deemed to have converged when the assignments no longer change.
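The two alternating steps can be sketched directly in code. This is a minimal illustration, not a production implementation; the function name and stopping policy (`max_iter` as a safety cap) are my own:

```python
import numpy as np

def k_means(points, init_means, max_iter=100):
    """Lloyd-style iteration: alternate the assignment and update
    steps until the assignments no longer change."""
    means = np.asarray(init_means, dtype=float)
    labels = None
    for _ in range(max_iter):
        # Assignment step: each point goes to its nearest mean
        # under Euclidean distance (the Voronoi partition).
        dists = np.linalg.norm(points[:, None, :] - means[None, :, :], axis=2)
        new_labels = dists.argmin(axis=1)
        if labels is not None and np.array_equal(new_labels, labels):
            break  # converged: assignments unchanged
        labels = new_labels
        # Update step: each mean becomes the centroid of its cluster.
        for i in range(len(means)):
            members = points[labels == i]
            if len(members) > 0:  # keep the old mean if a cluster empties
                means[i] = members.mean(axis=0)
    return means, labels

pts = np.array([[0.0, 0.0], [0.0, 1.0], [10.0, 0.0], [10.0, 1.0]])
means, labels = k_means(pts, [[0.0, 0.0], [10.0, 1.0]])
print(means)  # centroids [0, 0.5] and [10, 0.5]
```

Note the empty-cluster guard: the update formula divides by |S_i|, so a cluster that loses all its points needs special handling.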

 

The full algorithm simply alternates these two steps until convergence.

 

As it is a heuristic algorithm, there is no guarantee that it will converge to the global optimum, and the result may depend on the initial clusters. As the algorithm is usually very fast, it is common to run it multiple times with different starting conditions. However, in the worst case, k-means can be very slow to converge: in particular it has been shown that there exist certain point sets, even in 2 dimensions, on which k-means takes exponential time, that is 2^{Ω(n)}, to converge [9][10]. These point sets do not seem to arise in practice: this is corroborated by the fact that the smoothed running time of k-means is polynomial [11].

So the worst-case running time is 2^{Ω(n)}, but in practice k-means generally behaves like a polynomial-time algorithm.

The "assignment" step is also referred to as expectation step, the "update step" as maximization step, making this algorithm a variant of the generalized expectation-maximization algorithm.

Variations

  • The expectation-maximization algorithm (EM algorithm) maintains probabilistic assignments to clusters, instead of deterministic assignments, and multivariate Gaussian distributions instead of means.
  • k-means++ seeks to choose better starting clusters.
  • The filtering algorithm uses kd-trees to speed up each k-means step.[12]
  • Some methods attempt to speed up each k-means step using coresets[13] or the triangle inequality.[14]
  • Escape local optima by swapping points between clusters.[15]
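Of the variations above, k-means++ is the easiest to show concretely: the first center is chosen uniformly at random, and each subsequent center is sampled with probability proportional to its squared distance from the nearest center already chosen. A sketch (function name and toy data are my own):

```python
import numpy as np

def kmeans_pp_init(points, k, rng):
    """k-means++ seeding: spread the initial centers out by sampling
    proportionally to squared distance from the nearest chosen center."""
    centers = [points[rng.integers(len(points))]]
    for _ in range(k - 1):
        # Squared distance from each point to its nearest chosen center.
        d2 = np.min([np.sum((points - c) ** 2, axis=1) for c in centers],
                    axis=0)
        probs = d2 / d2.sum()
        centers.append(points[rng.choice(len(points), p=probs)])
    return np.array(centers)

rng = np.random.default_rng(42)
pts = np.vstack([rng.normal(0, 0.3, (30, 2)),
                 rng.normal(8, 0.3, (30, 2))])
centers = kmeans_pp_init(pts, 2, rng)
print(centers)  # very likely one seed in each of the two blobs
```

The resulting seeds then feed into the standard iteration; in expectation this gives provably better starting clusters than uniform random initialization.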

Discussion

[Figure: k-means clustering result for the Iris flower data set and actual species, visualized using ELKI. Cluster means are marked using larger, semi-transparent symbols.]

[Figure: k-means clustering and EM clustering on an artificial data set ("mouse"). The tendency of k-means to produce equally sized clusters leads to poor results, while EM benefits from the Gaussian distributions present in the data set.]

The two key features of k-means which make it efficient are often regarded as its biggest drawbacks:

A key limitation of k-means is its cluster model. The concept is based on spherical clusters that are separable in such a way that the mean value converges towards the cluster center. The clusters are also expected to be of similar size, so that assignment to the nearest cluster center is the correct assignment. When, for example, applying k-means with k = 3 to the well-known Iris flower data set, the result often fails to separate the three Iris species contained in the data set. With k = 2, the two visible clusters (one containing two species) will be discovered, whereas with k = 3 one of the two clusters will be split into two even parts. In fact, k = 2 is more appropriate for this data set, despite the data set containing 3 classes. As with any other clustering algorithm, the k-means result relies on the data set satisfying the assumptions made by the algorithm: it works very well on some data sets while failing miserably on others.

The result of k-means can also be seen as the Voronoi cells of the cluster means. Since data is split halfway between cluster means, this can lead to suboptimal splits as can be seen in the "mouse" example. The Gaussian models used by the Expectation-maximization algorithm (which can be seen as a generalization of k-means) are more flexible here by having both variances and covariances. The EM result is thus able to accommodate clusters of variable size much better than k-means as well as correlated clusters (not in this example).

 

This post is a conceptual introduction; code and a paper on optimizing k-means will follow:

Fast Hierarchical Clustering Algorithm Using Locality-Sensitive Hashing

posted on 2010-10-19 18:57 by Sosi · Reads (1606) · Comments (0) · Category: Courses
