
Kullback–Leibler divergence (KL divergence)

In probability theory and information theory, the Kullback–Leibler divergence[1][2][3] (also called information divergence, information gain, relative entropy, or KLIC) is a non-symmetric measure of the difference between two probability distributions P and Q. It measures the expected number of extra bits required to code samples from P when using a code based on Q rather than a code based on P. Typically P represents the "true" distribution of data, observations, or a precisely calculated theoretical distribution, while Q represents a theory, model, description, or approximation of P.

Although it is often intuited as a distance metric, the KL divergence is not a true metric; for example, the KL divergence from P to Q is not necessarily the same as the KL divergence from Q to P.

KL divergence is a special case of a broader class of divergences called f-divergences. Originally introduced by Solomon Kullback and Richard Leibler in 1951 as the directed divergence between two distributions, it is not the same as the notion of divergence in calculus. However, the KL divergence can be derived from the Bregman divergence.

 

 

Note that P usually denotes the data set we actually have, while Q denotes a theoretical result or model. The physical meaning of the KL divergence is thus the number of extra bits needed when samples from P are encoded with a code based on Q instead of a code based on P.

The KL divergence is sometimes called the KL distance, but it is not a distance in the strict sense: it does not satisfy the triangle inequality.

 

The KL divergence is also asymmetric. If a symmetric variant is desired, one can use

D_s(P\|Q) = \frac{1}{2}\left[ D_{\mathrm{KL}}(P\|Q) + D_{\mathrm{KL}}(Q\|P) \right]
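This symmetrized form is easy to compute directly. A minimal sketch in plain MATLAB (the probability vectors are illustrative, not from the original post):

   % Symmetrized KL divergence of two probability vectors with
   % strictly positive entries (so log never sees a zero).
   P = [0.4 0.4 0.2];
   Q = [0.2 0.5 0.3];
   kl_pq  = sum(P .* log(P ./ Q));   % D_KL(P||Q)
   kl_qp  = sum(Q .* log(Q ./ P));   % D_KL(Q||P)
   kl_sym = (kl_pq + kl_qp) / 2;     % symmetric variant D_s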

 

The discrete and continuous definitions of the KL divergence are:

D_{\mathrm{KL}}(P\|Q) = \sum_i P(i) \log \frac{P(i)}{Q(i)}

D_{\mathrm{KL}}(P\|Q) = \int_{-\infty}^{\infty} p(x) \log \frac{p(x)}{q(x)} \, dx

Note that p(x) and q(x) are the probability density functions of the two random variables, and that D_KL(P||Q) is a single number, not a function; see the figure below.

 

[Missing figures: "KL Area to be Integrated" (the region integrated in the continuous definition) and KL-Gauss-Example.png (an example with two Gaussian densities)]
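To make concrete that the discrete sum collapses to a single scalar, here is a small worked example in MATLAB (the distributions are illustrative only):

   % Discrete KL divergence in bits: each outcome contributes
   % P(i)*log2(P(i)/Q(i)), and the sum is one scalar value.
   P = [0.5 0.25 0.25];
   Q = [1/3 1/3 1/3];
   terms = P .* log2(P ./ Q);   % per-outcome contributions
   kl = sum(terms);             % D_KL(P||Q), about 0.085 bits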

 

A very powerful property of the KL divergence:

The Kullback–Leibler divergence is always non-negative,

D_{\mathrm{KL}}(P\|Q) \geq 0,

a result known as Gibbs' inequality, with D_KL(P||Q) zero if and only if P = Q.
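A quick numeric check of this property in MATLAB (values chosen only for illustration):

   % Gibbs' inequality: D_KL(P||Q) >= 0, equal to 0 only when P == Q.
   P = [0.5 0.5];
   kl_same = sum(P .* log(P ./ P));   % exactly 0, since log(1) = 0
   Q = [0.9 0.1];
   kl_diff = sum(P .* log(P ./ Q));   % positive, about 0.5108 nats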

 

When computing the KL divergence in practice, note that on sparse data sets the computation frequently runs into zero-valued probabilities in the denominator (and hence a log of zero).
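A common workaround, assuming a small perturbation of the probabilities is acceptable for the application, is to add a tiny constant and renormalize before taking logarithms:

   % Smoothing a sparse distribution so log(P./Q) stays finite.
   P = [0.7 0.3 0.0];
   Q = [0.5 0.0 0.5];               % Q(2) = 0 where P(2) > 0
   Ps = (P + eps) / sum(P + eps);   % shift by machine epsilon
   Qs = (Q + eps) / sum(Q + eps);   % ... and renormalize
   kl = sum(Ps .* log(Ps ./ Qs));   % finite, but sensitive to the shift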

 

 

The MATLAB function KLDIV computes the KL divergence between two distributions:

Description

KLDIV Kullback-Leibler or Jensen-Shannon divergence between two distributions.

KLDIV(X,P1,P2) returns the Kullback-Leibler divergence between two distributions specified over the M variable values in vector X. P1 is a length-M vector of probabilities representing distribution 1, and P2 is a length-M vector of probabilities representing distribution 2. Thus, the probability of value X(i) is P1(i) for distribution 1 and P2(i) for distribution 2. The Kullback-Leibler divergence is given by:

   KL(P1(x),P2(x)) = sum[P1(x).log(P1(x)/P2(x))]

If X contains duplicate values, there will be a warning message, and these values will be treated as distinct values. (I.e., the actual values do not enter into the computation, but the probabilities for the two duplicate values will be considered as probabilities corresponding to two unique values.) The elements of probability vectors P1 and P2 must each sum to 1 +/- .00001.

A "log of zero" warning will be thrown for zero-valued probabilities. Handle this however you wish. Adding 'eps' or some other small value to all probabilities seems reasonable. (Renormalize if necessary.)

KLDIV(X,P1,P2,'sym') returns a symmetric variant of the Kullback-Leibler divergence, given by [KL(P1,P2)+KL(P2,P1)]/2. See Johnson and Sinanovic (2001).

KLDIV(X,P1,P2,'js') returns the Jensen-Shannon divergence, given by [KL(P1,Q)+KL(P2,Q)]/2, where Q = (P1+P2)/2. See the Wikipedia article for "Kullback–Leibler divergence". This is equal to 1/2 the so-called "Jeffrey divergence." See Rubner et al. (2000).
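For reference, the 'js' option corresponds to the following direct computation, sketched here in plain MATLAB rather than through KLDIV itself (the vectors are illustrative):

   % Jensen-Shannon divergence: average KL to the midpoint Q.
   P1 = [0.6 0.4];
   P2 = [0.3 0.7];
   Q  = (P1 + P2) / 2;                 % midpoint distribution
   js = ( sum(P1 .* log(P1 ./ Q)) ...
        + sum(P2 .* log(P2 ./ Q)) ) / 2;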

EXAMPLE: Let the event set and probability sets be as follow:
   X = [1 2 3 3 4]';
   P1 = ones(5,1)/5;
   P2 = [0 0 .5 .2 .3]' + eps;
Note that the event set here has duplicate values (two 3's). These will be treated as DISTINCT events by KLDIV. If you want these to be treated as the SAME event, you will need to collapse their probabilities together before running KLDIV. One way to do this is to use UNIQUE to find the set of unique events, and then iterate over that set, summing probabilities for each instance of each unique event. Here, we just leave the duplicate values to be treated independently (the default):
   KL = kldiv(X,P1,P2);
   KL =
        19.4899

Note also that we avoided the log-of-zero warning by adding 'eps' to all probability values in P2. We didn't need to renormalize because we're still within the sum-to-one tolerance.

REFERENCES:
1) Cover, T.M. and J.A. Thomas. "Elements of Information Theory," Wiley, 1991.
2) Johnson, D.H. and S. Sinanovic. "Symmetrizing the Kullback-Leibler distance." IEEE Transactions on Information Theory (Submitted).
3) Rubner, Y., Tomasi, C., and Guibas, L. J., 2000. "The Earth Mover's distance as a metric for image retrieval." International Journal of Computer Vision, 40(2): 99-121.
4) "Kullback–Leibler divergence." Wikipedia, The Free Encyclopedia. http://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence

posted on 2010-10-16 15:04 by Sosi, 10051 reads, 2 comments. Category: Taps in Research

Comments

# re: Kullback–Leibler divergence (KL divergence) 2010-11-30 16:17 tintin0324

My research direction requires an understanding of the KL distance, and I have some questions I would like to ask. How can I contact you?

# re: Kullback–Leibler divergence (KL divergence) 2010-12-05 22:37 Sosi

@tintin0324
The KL distance itself is quite simple; it is defined exactly as above, and its meaning is as described. If you want to understand it in more depth, you can read the related references above.