
O(1) 的小樂


Kullback–Leibler divergence (KL divergence)

In probability theory and information theory, the Kullback–Leibler divergence[1][2][3] (also information divergence, information gain, relative entropy, or KLIC) is a non-symmetric measure of the difference between two probability distributions P and Q. It measures the expected number of extra bits required to code samples from P when using a code based on Q, rather than a code based on P. Typically P represents the "true" distribution of data, observations, or a precisely calculated theoretical distribution, while Q represents a theory, model, description, or approximation of P.

Although it is often intuited as a distance metric, the KL divergence is not a true metric – for example, the KL from P to Q is not necessarily the same as the KL from Q to P.

KL divergence is a special case of a broader class of divergences called f-divergences. Originally introduced by Solomon Kullback and Richard Leibler in 1951 as the directed divergence between two distributions, it is not the same as a divergence in calculus. However, the KL divergence can be derived from the Bregman divergence.

Note that P usually denotes the data set we already have, while Q denotes the theoretical model. The physical meaning of the KL divergence is therefore the number of extra bits needed when encoding samples from P with a code based on Q, compared to encoding samples from P with a code based on P itself.
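This extra-bits reading can be checked numerically: the KL divergence equals the cross-entropy H(P, Q) minus the entropy H(P). A minimal sketch in Python with NumPy (the two distributions are made-up examples):

```python
import numpy as np

# Two made-up discrete distributions over the same 4 outcomes
P = np.array([0.5, 0.25, 0.125, 0.125])   # "true" data distribution
Q = np.array([0.25, 0.25, 0.25, 0.25])    # model / approximation of P

# Average code length in bits when coding samples from P ...
H_P  = -np.sum(P * np.log2(P))      # ... with an optimal code for P
H_PQ = -np.sum(P * np.log2(Q))      # ... with a code built for Q

kl = np.sum(P * np.log2(P / Q))     # D_KL(P || Q) in bits

# KL divergence is exactly the expected number of extra bits per sample
assert abs(kl - (H_PQ - H_P)) < 1e-12
print(kl)  # 0.25
```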

KL divergence is sometimes called the KL distance, but it is not a distance in the strict sense, since it does not satisfy the triangle inequality.

The KL divergence is also asymmetric; if a symmetric version is desired, it can of course be defined as

Ds(p1, p2) = [D(p1, p2) + D(p2, p1)] / 2
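The asymmetry, and the symmetrized variant Ds, can be illustrated with a short sketch (NumPy; the two distributions are arbitrary examples):

```python
import numpy as np

def kl(p, q):
    """Discrete KL divergence D(p || q), in nats."""
    return np.sum(p * np.log(p / q))

p1 = np.array([0.7, 0.2, 0.1])
p2 = np.array([0.1, 0.3, 0.6])

d12 = kl(p1, p2)   # D(p1 || p2)
d21 = kl(p2, p1)   # D(p2 || p1)
assert not np.isclose(d12, d21)   # not symmetric in general

ds = (d12 + d21) / 2              # Ds(p1, p2) == Ds(p2, p1) by construction
```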

Below are the discrete and continuous definitions of the KL divergence:

D_{\mathrm{KL}}(P\|Q) = \sum_i P(i) \log \frac{P(i)}{Q(i)}

D_{\mathrm{KL}}(P\|Q) = \int_{-\infty}^{\infty} p(x) \log \frac{p(x)}{q(x)} \, dx

Note that p(x) and q(x) are the PDFs of the two random variables P and Q, and that D(P||Q) is a single number, not a function; see the figure below.

[Figure: KL-Gauss-Example.png — the area that is integrated to obtain the KL divergence between two Gaussian densities]
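For the Gaussian case shown in the figure, the KL divergence has a well-known closed form, KL(N(μ1, σ1²) || N(μ2, σ2²)) = ln(σ2/σ1) + (σ1² + (μ1 − μ2)²)/(2σ2²) − 1/2, which a direct numerical integration of the definition confirms. A sketch with arbitrary example parameters:

```python
import numpy as np

mu1, s1 = 0.0, 1.0    # P = N(0, 1)
mu2, s2 = 1.0, 2.0    # Q = N(1, 4)

def normal_pdf(x, mu, s):
    return np.exp(-0.5 * ((x - mu) / s) ** 2) / (s * np.sqrt(2 * np.pi))

# Closed form for KL(N(mu1, s1^2) || N(mu2, s2^2)), in nats
closed = np.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

# Numerical integration of p(x) log(p(x)/q(x)) over a wide grid
x = np.linspace(-30.0, 30.0, 600001)
p = normal_pdf(x, mu1, s1)
q = normal_pdf(x, mu2, s2)
numeric = np.sum(p * np.log(p / q)) * (x[1] - x[0])

assert abs(closed - numeric) < 1e-6
```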

A very powerful property of the KL divergence:

The Kullback–Leibler divergence is always non-negative,

D_{\mathrm{KL}}(P\|Q) \geq 0,

a result known as Gibbs' inequality, with DKL(P||Q) zero if and only if P = Q.
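A quick randomized check of the non-negativity claim (a sketch; NumPy's Dirichlet sampler is used only to draw random points on the probability simplex):

```python
import numpy as np

rng = np.random.default_rng(0)
for _ in range(1000):
    # Two random strictly positive distributions over 8 outcomes
    p = rng.dirichlet(np.ones(8))
    q = rng.dirichlet(np.ones(8))
    kl = np.sum(p * np.log(p / q))
    assert kl >= 0.0                                   # Gibbs' inequality
    assert np.isclose(np.sum(p * np.log(p / p)), 0.0)  # D(P||P) = 0
```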

When computing the KL divergence, note that on sparse data sets the computation frequently runs into zero denominators, i.e. Q(i) = 0.
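On sparse data many Q(i) are zero and the ratio blows up; a common workaround, which the KLDIV documentation below also suggests, is additive smoothing: add a small constant to every probability and renormalize. A minimal sketch:

```python
import numpy as np

def kl_smoothed(p, q, eps=1e-10):
    """KL(p || q) with additive smoothing to avoid zero denominators."""
    p = (p + eps) / np.sum(p + eps)   # renormalize after smoothing
    q = (q + eps) / np.sum(q + eps)
    return np.sum(p * np.log(p / q))

# Sparse case: q assigns zero mass to two outcomes that p supports
p = np.array([0.25, 0.25, 0.25, 0.25])
q = np.array([0.5, 0.5, 0.0, 0.0])

kl = kl_smoothed(p, q)   # finite, though large (dominated by the zeros)
assert np.isfinite(kl) and kl > 0
```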

The MATLAB function KLDIV computes the KL divergence between two distributions:

Description

KLDIV Kullback-Leibler or Jensen-Shannon divergence between two distributions.

KLDIV(X,P1,P2) returns the Kullback-Leibler divergence between two distributions specified over the M variable values in vector X. P1 is a length-M vector of probabilities representing distribution 1, and P2 is a length-M vector of probabilities representing distribution 2. Thus, the probability of value X(i) is P1(i) for distribution 1 and P2(i) for distribution 2. The Kullback-Leibler divergence is given by:

   KL(P1(x),P2(x)) = sum[P1(x).log(P1(x)/P2(x))]

If X contains duplicate values, there will be a warning message, and these values will be treated as distinct values. (I.e., the actual values do not enter into the computation, but the probabilities for the two duplicate values will be considered as probabilities corresponding to two unique values.) The elements of probability vectors P1 and P2 must each sum to 1 +/- .00001.

A "log of zero" warning will be thrown for zero-valued probabilities. Handle this however you wish. Adding 'eps' or some other small value to all probabilities seems reasonable. (Renormalize if necessary.)

KLDIV(X,P1,P2,'sym') returns a symmetric variant of the Kullback-Leibler divergence, given by [KL(P1,P2)+KL(P2,P1)]/2. See Johnson and Sinanovic (2001).

KLDIV(X,P1,P2,'js') returns the Jensen-Shannon divergence, given by [KL(P1,Q)+KL(P2,Q)]/2, where Q = (P1+P2)/2. See the Wikipedia article for "Kullback–Leibler divergence". This is equal to 1/2 the so-called "Jeffrey divergence." See Rubner et al. (2000).
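The Jensen–Shannon variant described above is symmetric and always finite, because the mixture Q = (P1+P2)/2 is nonzero wherever either input is. A sketch in Python (base-2 logs, so the result is bounded by 1 bit):

```python
import numpy as np

def kl2(p, q):
    """KL(p || q) in bits; 0 * log(0/q) is treated as 0."""
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

def js(p1, p2):
    """Jensen-Shannon divergence: [KL(p1||m) + KL(p2||m)] / 2."""
    m = (p1 + p2) / 2
    return (kl2(p1, m) + kl2(p2, m)) / 2

# Distributions with disjoint support: KL would be infinite, JS is not
p1 = np.array([1.0, 0.0, 0.0])
p2 = np.array([0.0, 0.5, 0.5])

d = js(p1, p2)
assert np.isclose(d, js(p2, p1))   # symmetric
assert 0 <= d <= 1                 # bounded when using log2
```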

EXAMPLE: Let the event set and probability sets be as follow:
   X = [1 2 3 3 4]';
   P1 = ones(5,1)/5;
   P2 = [0 0 .5 .2 .3]' + eps;
Note that the event set here has duplicate values (two 3's). These will be treated as DISTINCT events by KLDIV. If you want these to be treated as the SAME event, you will need to collapse their probabilities together before running KLDIV. One way to do this is to use UNIQUE to find the set of unique events, and then iterate over that set, summing probabilities for each instance of each unique event. Here, we just leave the duplicate values to be treated independently (the default):
   KL = kldiv(X,P1,P2);
   KL =
        19.4899

Note also that we avoided the log-of-zero warning by adding 'eps' to all probability values in P2. We didn't need to renormalize because we're still within the sum-to-one tolerance.
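The quoted result can be reproduced outside MATLAB; note that it matches only with base-2 logarithms, so this KLDIV build evidently reports bits. A sketch in Python/NumPy, where `np.finfo(float).eps` plays the role of MATLAB's `eps`:

```python
import numpy as np

eps = np.finfo(float).eps   # same value as MATLAB's eps, 2^-52

# The two 3's in X are distinct events here, exactly as in the KLDIV default
P1 = np.ones(5) / 5
P2 = np.array([0.0, 0.0, 0.5, 0.2, 0.3]) + eps

KL = np.sum(P1 * np.log2(P1 / P2))   # base-2 logs reproduce the result
print(round(KL, 4))  # 19.4899
```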

REFERENCES:
1) Cover, T.M. and J.A. Thomas. "Elements of Information Theory," Wiley, 1991.
2) Johnson, D.H. and S. Sinanovic. "Symmetrizing the Kullback-Leibler distance." IEEE Transactions on Information Theory (Submitted).
3) Rubner, Y., Tomasi, C., and Guibas, L. J., 2000. "The Earth Mover's distance as a metric for image retrieval." International Journal of Computer Vision, 40(2): 99-121.
4) Kullback–Leibler divergence. Wikipedia, The Free Encyclopedia. http://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence

posted on 2010-10-16 15:04 by Sosi  Reads(10034)  Comments(2)  Category: Taps in Research

Comments

# re: Kullback–Leibler divergence (KL divergence) 2010-11-30 16:17 tintin0324

Blogger, my research direction requires an understanding of the KL distance. I have some questions I would like to ask; how can I contact you?

# re: Kullback–Leibler divergence (KL divergence) 2010-12-05 22:37 Sosi

@tintin0324
The KL distance itself is quite simple; it is defined just as above, and its meaning is as described. If you want a deeper understanding, you can read the related references above.
統計系統
青青草原综合久久大伊人导航_色综合久久天天综合_日日噜噜夜夜狠狠久久丁香五月_热久久这里只有精品
  • <ins id="pjuwb"></ins>
    <blockquote id="pjuwb"><pre id="pjuwb"></pre></blockquote>
    <noscript id="pjuwb"></noscript>
          <sup id="pjuwb"><pre id="pjuwb"></pre></sup>
            <dd id="pjuwb"></dd>
            <abbr id="pjuwb"></abbr>
            国产精品视频精品| 亚洲精品美女91| 好吊一区二区三区| 国产日韩欧美不卡| 国产日韩精品视频一区| 狠狠色狠狠色综合| 亚洲国产免费| 一区二区三区欧美在线| 亚洲午夜久久久| 午夜精品久久久久久久99水蜜桃 | 中日韩在线视频| 亚洲色图在线视频| 久久精品国产亚洲一区二区| 免费亚洲电影在线观看| 欧美日韩综合网| 今天的高清视频免费播放成人| 亚洲福利在线看| 亚洲综合不卡| 久久精品人人做人人综合| 欧美电影电视剧在线观看| 91久久精品国产91性色tv| 亚洲一区二区四区| 另类尿喷潮videofree| 欧美日韩国产一区二区三区| 国产曰批免费观看久久久| 99re66热这里只有精品4| 久久www成人_看片免费不卡| 91久久极品少妇xxxxⅹ软件| 午夜久久一区| 欧美精品一区二区三区蜜臀| 黑人一区二区| 亚洲欧美综合网| 亚洲黄页一区| 久久久欧美精品| 国产欧美日韩激情| 亚洲一区网站| 亚洲美女91| 免费欧美在线| 在线观看一区二区视频| 欧美伊人久久| 一本色道久久综合一区 | 免费观看一级特黄欧美大片| 亚洲手机视频| 欧美精品高清视频| 伊人成综合网伊人222| 亚洲欧美日本伦理| 日韩视频亚洲视频| 欧美黑人多人双交| 亚洲人成网在线播放| 媚黑女一区二区| 久久精品一区蜜桃臀影院| 国产欧美日韩不卡| 欧美在线观看视频一区二区三区| 亚洲精品视频在线| 欧美成人自拍视频| 亚洲人成小说网站色在线| 欧美成在线观看| 麻豆国产精品777777在线 | 午夜视频在线观看一区二区三区 | 久久一区二区三区四区五区| 亚洲午夜在线观看| 国产精品日本一区二区| 亚洲女性裸体视频| 亚洲一区二区三区成人在线视频精品| 欧美日韩国产综合久久| 一区二区三区回区在观看免费视频| 亚洲国产1区| 欧美日韩mv| 亚洲影院免费| 午夜在线精品| 久久精品91| 久久激情视频久久| 一区久久精品| 欧美成人免费大片| 欧美成人免费全部观看天天性色| 在线看视频不卡| 亚洲国产精品视频一区| 欧美精品一区二区三区一线天视频| 亚洲最新色图| 亚洲字幕一区二区| 韩国v欧美v日本v亚洲v| 免费在线日韩av| 欧美精品久久一区二区| 亚洲欧美制服另类日韩| 久久精品欧洲| 在线一区二区三区四区五区| 午夜精品久久久久久久久久久久久| 国产在线观看一区| 亚洲国产综合91精品麻豆| 欧美日韩一区二区三区四区在线观看| 欧美在线网站| 男女激情视频一区| 亚洲欧美激情一区| 久久久久久久久伊人| 一区二区三区久久网| 欧美一区二视频在线免费观看| 亚洲国产激情| 亚洲欧美日韩国产中文| 亚洲精品美女91| 亚洲欧美影院| 日韩亚洲欧美综合| 欧美中文在线免费| 亚洲午夜未删减在线观看| 久久久999成人| 亚洲自拍偷拍视频| 欧美jjzz| 久久久人成影片一区二区三区观看| 免费成人高清视频| 久久精品天堂| 国产精品久久久久久久久久直播 | 99在线|亚洲一区二区| 黄色亚洲网站| 在线亚洲一区二区| 亚洲精品影院| 久久蜜桃精品| 久久九九久精品国产免费直播| 欧美日韩国产成人| 91久久精品国产91久久性色tv | 黄色资源网久久资源365| 亚洲香蕉成视频在线观看| 亚洲美女在线视频| 久久这里有精品视频| 久久久人成影片一区二区三区观看| 国产精品久久久久9999| 亚洲另类在线视频| 亚洲精选视频免费看| 免费成人激情视频| 母乳一区在线观看| 亚洲第一狼人社区| 久久久久久伊人| 你懂的国产精品| 国语自产偷拍精品视频偷| 欧美在线观看视频在线| 国产亚洲欧美aaaa| 美女国内精品自产拍在线播放| 国产日本欧美一区二区| 亚洲欧美激情精品一区二区| 亚洲综合另类| 国产精品日本| 欧美一级精品大片| 久久综合中文| 亚洲国产美女精品久久久久∴| 另类天堂av| 亚洲国产日韩欧美综合久久| 亚洲精品乱码久久久久久| 欧美激情亚洲综合一区| 亚洲美女电影在线| 亚洲欧美制服另类日韩| 国产婷婷成人久久av免费高清| 性欧美1819sex性高清| 久久综合亚州| 亚洲精品女av网站| 欧美午夜一区二区三区免费大片 | 玖玖玖免费嫩草在线影院一区| 蜜臀va亚洲va欧美va天堂| 最新精品在线| 欧美午夜激情小视频| 亚洲欧美视频一区| 蜜臀av国产精品久久久久| 夜夜狂射影院欧美极品| 国产精品视频成人| 久久久综合网| 99综合在线| 麻豆精品一区二区综合av| 亚洲欧洲一区二区在线播放| 欧美日韩综合不卡| 久久久久久久久久久一区| 亚洲伦理网站| 久久久人成影片一区二区三区| 亚洲精选成人| 国产日韩在线亚洲字幕中文| 免费观看成人www动漫视频| 一本一本久久| 
免费看亚洲片| 午夜精品久久久| 亚洲精选中文字幕| 国产手机视频一区二区| 欧美日韩一区免费| 久久性天堂网| 午夜激情一区| 亚洲人成在线播放| 另类激情亚洲| 欧美一进一出视频| 亚洲视频一起| 亚洲三级免费| 一区一区视频| 国产欧美亚洲一区| 欧美色图天堂网| 欧美大片国产精品| 久久久久欧美精品| 欧美一级成年大片在线观看| 日韩一区二区福利| 亚洲国产欧美一区二区三区同亚洲| 久久精品男女| 午夜欧美视频| 午夜免费久久久久| 亚洲一区在线观看免费观看电影高清| 在线观看视频免费一区二区三区| 国产精品综合不卡av| 欧美日韩在线看|