
O(1) 的小樂


Kullback–Leibler divergence (KL divergence)

In probability theory and information theory, the Kullback–Leibler divergence[1][2][3] (also information divergence, information gain, relative entropy, or KLIC) is a non-symmetric measure of the difference between two probability distributions P and Q. The KL divergence measures the expected number of extra bits required to code samples from P when using a code based on Q, rather than a code based on P. Typically P represents the "true" distribution of data, observations, or a precisely calculated theoretical distribution, while Q represents a theory, model, description, or approximation of P.

Although it is often intuited as a distance metric, the KL divergence is not a true metric – for example, the KL from P to Q is not necessarily the same as the KL from Q to P.

KL divergence is a special case of a broader class of divergences called f-divergences. Originally introduced by Solomon Kullback and Richard Leibler in 1951 as the directed divergence between two distributions, it is not the same as a divergence in calculus. However, the KL divergence can be derived from the Bregman divergence.

 

 

Note that P usually denotes the data set (the data we already have) and Q the theoretical model. The physical meaning of the KL divergence is therefore the number of extra bits needed when samples from P are encoded with a code built for Q, compared with encoding them with a code built for P itself.
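This extra-bits reading can be checked directly: the KL divergence equals the cross-entropy H(P, Q) minus the entropy H(P). A minimal Python sketch (the two distributions below are made up for illustration):

```python
import math

P = [0.5, 0.25, 0.25]   # "true" distribution of the data
Q = [0.25, 0.25, 0.5]   # model/approximation used to build the code

# average code length (bits) when samples from P are encoded with a code built for Q
cross_entropy = -sum(p * math.log2(q) for p, q in zip(P, Q))
# average code length (bits) of the optimal code for P itself
entropy = -sum(p * math.log2(p) for p in P)

# KL divergence: expected number of extra bits per sample
kl = sum(p * math.log2(p / q) for p, q in zip(P, Q))
print(kl, cross_entropy - entropy)  # both are 0.25
```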

 

The KL divergence is sometimes called the KL distance, but it is not a distance in the strict sense: it does not satisfy the triangle inequality.

 

The KL divergence is also asymmetric. If a symmetric variant is desired, one can average the two directions:

Ds(p1, p2) = [D(p1, p2) + D(p2, p1)] / 2
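Numerically, the asymmetry and its symmetrized fix look like this (a small Python sketch with made-up distributions):

```python
import math

def kl(p, q):
    """Discrete KL divergence in bits; assumes q[i] > 0 wherever p[i] > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

P = [0.8, 0.2]
Q = [0.5, 0.5]

d_pq = kl(P, Q)              # ~0.2781 bits
d_qp = kl(Q, P)              # ~0.3219 bits, not the same
d_sym = (d_pq + d_qp) / 2    # symmetric variant Ds(P, Q)
```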

 

Below are the discrete and continuous definitions of the KL divergence:

D_{\mathrm{KL}}(P\|Q) = \sum_i P(i) \log \frac{P(i)}{Q(i)}

D_{\mathrm{KL}}(P\|Q) = \int_{-\infty}^{\infty} p(x) \log \frac{p(x)}{q(x)} \, dx

Note that p(x) and q(x) are the probability density functions of the two random variables P and Q, and that D(P||Q) is a single number, not a function; see the figure below.

 

[Figure KL-Gauss-Example.png: the area to be integrated when computing the KL divergence of two Gaussians.]

 

A very useful property of the KL divergence:

The Kullback–Leibler divergence is always non-negative,

D_{\mathrm{KL}}(P\|Q) \geq 0,

a result known as Gibbs' inequality, with DKL(P||Q) zero if and only if P = Q.
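A quick numerical check of Gibbs' inequality over random distributions (an illustrative Python sketch, not a proof):

```python
import math
import random

def kl(p, q):
    """Discrete KL divergence in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

random.seed(0)
divergences = []
for _ in range(1000):
    p = [random.random() for _ in range(5)]
    q = [random.random() for _ in range(5)]
    sp, sq = sum(p), sum(q)
    p = [x / sp for x in p]          # normalize to a probability vector
    q = [x / sq for x in q]
    divergences.append(kl(p, q))

print(min(divergences) >= 0)   # True: D_KL is never negative
print(kl(p, p))                # 0.0: zero exactly when P == Q
```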

 

When computing the KL divergence in practice, note that on sparse data sets the computation often runs into zero denominators (Q(i) = 0)!

 

 

The MATLAB function KLDIV computes the KL divergence between two distributions:

Description

KLDIV Kullback-Leibler or Jensen-Shannon divergence between two distributions.

KLDIV(X,P1,P2) returns the Kullback-Leibler divergence between two distributions specified over the M variable values in vector X. P1 is a length-M vector of probabilities representing distribution 1, and P2 is a length-M vector of probabilities representing distribution 2. Thus, the probability of value X(i) is P1(i) for distribution 1 and P2(i) for distribution 2. The Kullback-Leibler divergence is given by:

   KL(P1(x),P2(x)) = sum[P1(x).log(P1(x)/P2(x))]

If X contains duplicate values, there will be a warning message, and these values will be treated as distinct values. (I.e., the actual values do not enter into the computation, but the probabilities for the two duplicate values will be considered as probabilities corresponding to two unique values.) The elements of probability vectors P1 and P2 must each sum to 1 +/- .00001.

A "log of zero" warning will be thrown for zero-valued probabilities. Handle this however you wish. Adding 'eps' or some other small value to all probabilities seems reasonable. (Renormalize if necessary.)

KLDIV(X,P1,P2,'sym') returns a symmetric variant of the Kullback-Leibler divergence, given by [KL(P1,P2)+KL(P2,P1)]/2. See Johnson and Sinanovic (2001).

KLDIV(X,P1,P2,'js') returns the Jensen-Shannon divergence, given by [KL(P1,Q)+KL(P2,Q)]/2, where Q = (P1+P2)/2. See the Wikipedia article for "Kullback–Leibler divergence". This is equal to 1/2 the so-called "Jeffrey divergence." See Rubner et al. (2000).
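The 'js' option can be sketched in Python: the Jensen-Shannon divergence averages the KL of each distribution against their midpoint Q = (P1+P2)/2, which makes it symmetric and finite even when one distribution has zeros the other lacks (the distributions below are made up):

```python
import math

def kl(p, q):
    """Discrete KL divergence in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js(p1, p2):
    m = [(a + b) / 2 for a, b in zip(p1, p2)]  # midpoint distribution
    return (kl(p1, m) + kl(p2, m)) / 2

P1 = [0.5, 0.5, 0.0]
P2 = [0.0, 0.5, 0.5]
print(js(P1, P2))                # 0.5 bits, even though KL(P1||P2) is infinite
print(js(P1, P2) == js(P2, P1))  # True: symmetric
```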

EXAMPLE: Let the event set and probability vectors be as follows:
   X = [1 2 3 3 4]';
   P1 = ones(5,1)/5;
   P2 = [0 0 .5 .2 .3]' + eps;
Note that the event set here has duplicate values (two 3's). These will be treated as DISTINCT events by KLDIV. If you want these to be treated as the SAME event, you will need to collapse their probabilities together before running KLDIV. One way to do this is to use UNIQUE to find the set of unique events, and then iterate over that set, summing probabilities for each instance of each unique event. Here, we just leave the duplicate values to be treated independently (the default):
   KL = kldiv(X,P1,P2);
   KL =
        19.4899

Note also that we avoided the log-of-zero warning by adding 'eps' to all probability values in P2. We didn't need to renormalize because we're still within the sum-to-one tolerance.
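The example can be reproduced in Python (a sketch; the output 19.4899 indicates that KLDIV uses base-2 logarithms, and Python's float epsilon has the same value as MATLAB's eps):

```python
import math
import sys

eps = sys.float_info.epsilon            # 2.220446049250313e-16, same as MATLAB's eps
P1 = [0.2] * 5                          # ones(5,1)/5
P2 = [p + eps for p in [0, 0, 0.5, 0.2, 0.3]]

kl = sum(p1 * math.log2(p1 / p2) for p1, p2 in zip(P1, P2))
print(round(kl, 4))                     # 19.4899, matching the KLDIV output
```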

REFERENCES:
1) Cover, T.M. and J.A. Thomas. "Elements of Information Theory," Wiley, 1991.
2) Johnson, D.H. and S. Sinanovic. "Symmetrizing the Kullback-Leibler distance." IEEE Transactions on Information Theory (Submitted).
3) Rubner, Y., Tomasi, C., and Guibas, L. J., 2000. "The Earth Mover's distance as a metric for image retrieval." International Journal of Computer Vision, 40(2): 99-121.
4) Kullback–Leibler divergence. Wikipedia, The Free Encyclopedia. http://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence

posted on 2010-10-16 15:04 Sosi Reads(10034) Comments(2) Category: Taps in Research

評論

# re: Kullback–Leibler divergence (KL divergence) 2010-11-30 16:17 tintin0324

Blogger, my research area requires an understanding of the KL distance. I have a few questions I would like to ask; how can I contact you?

# re: Kullback–Leibler divergence (KL divergence) 2010-12-05 22:37 Sosi

@tintin0324
The KL distance itself is quite simple; it is defined just as above, and its meaning is as described. If you want to understand it in more depth, you can read the related references.