Expectation-Maximization Algorithm (EM Algorithm)

     In statistics, an expectation-maximization (EM) algorithm is a method for finding maximum likelihood or maximum a posteriori (MAP) estimates of parameters in statistical models, where the model depends on unobserved latent variables. EM is an iterative method which alternates between performing an expectation (E) step, which computes the expectation of the log-likelihood evaluated using the current estimate for the latent variables, and a maximization (M) step, which computes parameters maximizing the expected log-likelihood found on the E step. These parameter estimates are then used to determine the distribution of the latent variables in the next E step.

 

  The EM algorithm provides a general framework for problems in which we need to estimate a set of parameters \boldsymbol\theta describing a probability distribution, given only the observable part of the data generated from that distribution.

  The EM algorithm is iterative and consists of two basic steps:

  E-step: expectation step

  Using the current parameter estimate, compute the expected log-likelihood over the latent variables.

  M-step: maximization step

  Find the parameters that maximize the expectation of the log-likelihood with respect to the latent variables Y. The new parameter estimate replaces the previous one, and the two steps are repeated in the next iteration.
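
As a rough illustration of the alternation just described, the following Python sketch shows the control flow of a generic EM loop. The callables e_step, m_step and log_likelihood, and the tolerance value, are illustrative assumptions on my part rather than part of any particular library or of this post.

    import numpy as np

    def run_em(X, theta_init, e_step, m_step, log_likelihood,
               max_iter=100, tol=1e-6):
        """Generic EM loop: alternate the E-step and the M-step until the
        observed-data log-likelihood stops improving."""
        theta = theta_init
        prev_ll = -np.inf
        for t in range(max_iter):
            # E-step: posterior over the latent variables under theta^(t)
            posterior = e_step(X, theta)
            # M-step: parameters maximizing the expected complete-data
            # log-likelihood Q(theta | theta^(t))
            theta = m_step(X, posterior)
            ll = log_likelihood(X, theta)
            if ll - prev_ll < tol:   # EM increases the likelihood monotonically
                break
            prev_ll = ll
        return theta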

 

 

Observed data: IID samples of the observed random variable X:

\mathbf{X} = \{x_1, x_2, \dots, x_n\}

Missing data: the values of the unobserved latent (hidden) variable Y:

\mathbf{Y} = \{y_1, y_2, \dots, y_n\}

Complete data: the data containing both the observed random variable X and the unobserved random variable Y, Z = (X, Y).

 

Likelihood function (several equivalent ways of writing it):

L(\boldsymbol\theta; \mathbf{X}) = p(\mathbf{X}|\boldsymbol\theta) = \sum_{\mathbf{Y}} p(\mathbf{X}, \mathbf{Y}|\boldsymbol\theta) = \sum_{\mathbf{Y}} p(\mathbf{X}|\mathbf{Y}, \boldsymbol\theta)\, p(\mathbf{Y}|\boldsymbol\theta)

The log-likelihood is:

\ell(\boldsymbol\theta; \mathbf{X}) = \log p(\mathbf{X}|\boldsymbol\theta) = \log \sum_{\mathbf{Y}} p(\mathbf{X}, \mathbf{Y}|\boldsymbol\theta)

E-step: using the current parameter estimate \boldsymbol\theta^{(t)}, compute the expectation of the complete-data log-likelihood with respect to the latent variables Y:

  Q(\boldsymbol\theta|\boldsymbol\theta^{(t)}) = \operatorname{E}_{\mathbf{Y}|\mathbf{X},\boldsymbol\theta^{(t)}}\left[\log p(\mathbf{X}, \mathbf{Y}|\boldsymbol\theta)\right] = \sum_{\mathbf{Y}} p(\mathbf{Y}|\mathbf{X}, \boldsymbol\theta^{(t)}) \log p(\mathbf{X}, \mathbf{Y}|\boldsymbol\theta)

This requires Bayes' rule for the posterior of the latent variables:

p(\mathbf{Y}|\mathbf{X}, \boldsymbol\theta^{(t)}) = \frac{p(\mathbf{X}|\mathbf{Y}, \boldsymbol\theta^{(t)})\, p(\mathbf{Y}|\boldsymbol\theta^{(t)})}{\sum_{\mathbf{Y}'} p(\mathbf{X}|\mathbf{Y}', \boldsymbol\theta^{(t)})\, p(\mathbf{Y}'|\boldsymbol\theta^{(t)})}

M-step: maximize this expectation to obtain a better estimate of the parameters:

\boldsymbol\theta^{(t+1)} = \underset{\boldsymbol\theta}{\operatorname{arg\,max}}\; Q(\boldsymbol\theta|\boldsymbol\theta^{(t)})
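
To make these two steps concrete, below is a minimal sketch (my own illustration, not from the course) of one EM iteration for a mixture of two one-dimensional Gaussians with fixed unit variances, where the latent variable y_i indicates which component generated x_i. The Bayes-rule posterior above becomes a per-point "responsibility", and maximizing Q reduces to weighted averages.

    import numpy as np

    def em_iteration(x, w, mu):
        """One EM iteration for a 2-component 1D Gaussian mixture with unit
        variances. w: mixing weights, mu: component means."""
        # E-step: p(y_i = k | x_i, theta^(t)) via Bayes' rule
        lik = np.exp(-0.5 * (x[:, None] - mu[None, :]) ** 2) / np.sqrt(2 * np.pi)
        joint = w[None, :] * lik                   # p(x_i | y_i = k) p(y_i = k)
        resp = joint / joint.sum(axis=1, keepdims=True)

        # M-step: maximizing Q gives weighted averages (soft counts)
        Nk = resp.sum(axis=0)                      # expected number of points per component
        w_new = Nk / len(x)
        mu_new = (resp * x[:, None]).sum(axis=0) / Nk
        return w_new, mu_new

    # tiny usage example with made-up data
    x = np.array([-2.1, -1.9, -2.3, 1.8, 2.2, 2.0])
    w, mu = np.array([0.5, 0.5]), np.array([-1.0, 1.0])
    for _ in range(20):
        w, mu = em_iteration(x, w, mu)
    print(w, mu)   # the means move toward roughly -2 and +2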

 

Wikipedia describes it as follows:

Given a statistical model consisting of a set \mathbf{X} of observed data, a set of unobserved latent data or missing values \mathbf{Y}, and a vector of unknown parameters \boldsymbol\theta, along with a likelihood function L(\boldsymbol\theta; \mathbf{X}, \mathbf{Y}) = p(\mathbf{X}, \mathbf{Y}|\boldsymbol\theta), the maximum likelihood estimate (MLE) of the unknown parameters is determined by the marginal likelihood of the observed data:

       L(\boldsymbol\theta; \mathbf{X}) = p(\mathbf{X}|\boldsymbol\theta) = \sum_{\mathbf{Y}} p(\mathbf{X}, \mathbf{Y}|\boldsymbol\theta)

However, this quantity is often intractable (for example, if \mathbf{Y} is a sequence of latent events, the number of possible values grows exponentially with the sequence length, making exact evaluation of the sum extremely difficult).

The EM algorithm seeks to find the MLE of the marginal likelihood by iteratively applying the following two steps:

Expectation step (E-step): Calculate the expected value of the log likelihood function, with respect to the conditional distribution of \mathbf{Y} given \mathbf{X} under the current estimate of the parameters \boldsymbol\theta^{(t)}:

       Q(\boldsymbol\theta|\boldsymbol\theta^{(t)}) = \operatorname{E}_{\mathbf{Y}|\mathbf{X},\boldsymbol\theta^{(t)}}\left[ \log L(\boldsymbol\theta; \mathbf{X}, \mathbf{Y}) \right]

Maximization step (M-step): Find the parameter that maximizes this quantity:
       \boldsymbol\theta^{(t+1)} = \underset{\boldsymbol\theta}{\operatorname{arg\,max}}\; Q(\boldsymbol\theta|\boldsymbol\theta^{(t)})

Note that in typical models to which EM is applied:

  1. The observed data points \mathbf{X} may be discrete (taking one of a fixed number of values, or taking values that must be integers) or continuous (taking a continuous range of real numbers, possibly infinite). There may in fact be a vector of observations associated with each data point.
  2. The missing values (aka latent variables) Y are discrete, drawn from a fixed number of values, and there is one latent variable per observed data point.
  3. The parameters are continuous, and are of two kinds: Parameters that are associated with all data points, and parameters associated with a particular value of a latent variable (i.e. associated with all data points whose corresponding latent variable has a particular value).

However, it is possible to apply EM to other sorts of models.

The motivation is as follows. If we know the value of the parameters \boldsymbol\theta, we can usually find the value of the latent variables Y by maximizing the log-likelihood over all possible values of Y, either simply by iterating over Y or through an algorithm such as the Viterbi algorithm for hidden Markov models. Conversely, if we know the value of the latent variables Y, we can find an estimate of the parameters \boldsymbol\theta fairly easily, typically by simply grouping the observed data points according to the value of the associated latent variable and averaging the values, or some function of the values, of the points in each group. This suggests an iterative algorithm, in the case where both \boldsymbol\theta and Y are unknown:

  1. First, initialize the parameters \boldsymbol\theta to some random values.
  2. Compute the best value for Y given these parameter values.
  3. Then, use the just-computed values of Y to compute a better estimate for the parameters \boldsymbol\theta. Parameters associated with a particular value of Y will use only those data points whose associated latent variable has that value.
  4. Finally, iterate until convergence.

The algorithm as just described will in fact work, and is commonly called hard EM. The K-means algorithm is an example of this class of algorithms.
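
As a concrete instance of this hard-EM scheme, here is a minimal K-means sketch (my own illustrative code, not from the post): the hard "E-step" assigns each point to its nearest centroid, and the hard "M-step" recomputes each centroid as the mean of the points assigned to it.

    import numpy as np

    def kmeans(X, k, n_iter=50, seed=0):
        """Hard EM / K-means for an (n, d) data array X: alternate hard
        assignments and mean updates."""
        rng = np.random.default_rng(seed)
        centers = X[rng.choice(len(X), size=k, replace=False)].astype(float)
        for _ in range(n_iter):
            # hard "E-step": index of the nearest centroid for every point
            dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
            labels = dists.argmin(axis=1)
            # hard "M-step": each centroid becomes the mean of its own points
            for j in range(k):
                if np.any(labels == j):
                    centers[j] = X[labels == j].mean(axis=0)
        return centers, labels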

However, we can do somewhat better: rather than making a hard choice for Y given the current parameter values and averaging only over the set of data points associated with a particular value of Y, we can instead determine the probability of each possible value of Y for each data point, and then use the probabilities associated with a particular value of Y to compute a weighted average over the entire set of data points. The resulting algorithm is commonly called soft EM, and is the type of algorithm normally associated with EM. The counts used to compute these weighted averages are called soft counts (as opposed to the hard counts used in a hard-EM-type algorithm such as K-means). The probabilities computed for Y are posterior probabilities and are what is computed in the E-step. The soft counts used to compute new parameter values are what is computed in the M-step.
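
The following small sketch contrasts the two kinds of counts using made-up responsibilities for four data points and two possible values of Y (the numbers are illustrative only):

    import numpy as np

    # posterior probabilities p(y_i = k | x_i, theta) from an E-step (made up)
    resp = np.array([[0.9, 0.1],
                     [0.8, 0.2],
                     [0.3, 0.7],
                     [0.1, 0.9]])
    x = np.array([1.0, 2.0, 5.0, 6.0])

    # soft EM: soft counts and weighted averages over *all* points
    soft_counts = resp.sum(axis=0)                       # [2.1, 1.9]
    soft_means = (resp * x[:, None]).sum(axis=0) / soft_counts

    # hard EM: each point is counted once, for its most probable value of Y
    hard = (resp.argmax(axis=1)[:, None] == np.arange(2)).astype(float)
    hard_counts = hard.sum(axis=0)                       # [2.0, 2.0]
    hard_means = (hard * x[:, None]).sum(axis=0) / hard_counts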

Summary:

EM is frequently used for data clustering in machine learning and computer vision.

EM converges to a local optimum, but it is not guaranteed to converge to the global optimum.

EM is sensitive to the initial values, so a good and fast initialization procedure is usually needed.
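
A standard way to mitigate both issues (a common heuristic, not something prescribed in this post) is to run EM from several random initializations and keep the run with the best final log-likelihood. The sketch below assumes the generic run_em helper from earlier in this post and a user-supplied init_fn that draws a random starting point.

    import numpy as np

    def em_with_restarts(X, init_fn, e_step, m_step, log_likelihood,
                         n_restarts=10, seed=0):
        """Run EM from several random starting points and keep the parameters
        with the highest observed-data log-likelihood."""
        rng = np.random.default_rng(seed)
        best_theta, best_ll = None, -np.inf
        for _ in range(n_restarts):
            # run_em is the generic EM loop sketched earlier in this post
            theta = run_em(X, init_fn(rng), e_step, m_step, log_likelihood)
            ll = log_likelihood(X, theta)
            if ll > best_ll:
                best_theta, best_ll = theta, ll
        return best_theta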

 

This is from my Machine Learning course; I will stop the summary here. The next task is a write-up of GM_EM: multivariate Gaussian density estimation.

posted on 2010-10-20 14:44 by Sosi, filed under: Courses
