            Standard random forest: randomly select K features, enumerate all observed values of each as candidate split points, and choose the best split.

            LINKS:https://en.wikipedia.org/wiki/Random_forest
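To make that split rule concrete, here is a minimal sketch (illustrative only, not any library's implementation) of the per-node search: draw K features at random, try every observed value of each as a threshold, and keep the split with the lowest weighted child variance (a regression criterion).

import numpy as np

def rf_best_split(X, y, K, rng):
    """Standard RF node split: K random features, enumerate all values, best split."""
    n, d = X.shape
    feats = rng.choice(d, size=K, replace=False)
    best_feat, best_thr, best_score = None, None, np.inf
    for f in feats:
        for thr in np.unique(X[:, f]):            # enumerate all observed values
            left, right = y[X[:, f] <= thr], y[X[:, f] > thr]
            if left.size == 0 or right.size == 0:
                continue
            score = (left.size * left.var() + right.size * right.var()) / n
            if score < best_score:
                best_feat, best_thr, best_score = f, thr, score
    return best_feat, best_thr

rng = np.random.default_rng(0)
X, y = rng.normal(size=(200, 5)), rng.normal(size=200)
print(rf_best_split(X, y, K=2, rng=rng))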


            Extremely randomized trees: randomly select K features, draw one random split value for each, and pick the best of those splits.
            An ensemble of extremely randomized trees uses all of the training data for every tree (no bootstrap sampling).

            LINKS:http://docs.opencv.org/2.4/modules/ml/doc/ertrees.html
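For contrast, the same sketch with the extra-trees rule described above: the same K random features, but only one randomly drawn threshold per feature, and the best of those K candidates is kept (again purely illustrative).

import numpy as np

def ert_split(X, y, K, rng):
    """Extra-trees node split: K random features, one random threshold each, best of those."""
    n, d = X.shape
    feats = rng.choice(d, size=K, replace=False)
    best_feat, best_thr, best_score = None, None, np.inf
    for f in feats:
        thr = rng.uniform(X[:, f].min(), X[:, f].max())   # one random cut point per feature
        left, right = y[X[:, f] <= thr], y[X[:, f] > thr]
        if left.size == 0 or right.size == 0:
            continue
        score = (left.size * left.var() + right.size * right.var()) / n
        if score < best_score:
            best_feat, best_thr, best_score = f, thr, score
    return best_feat, best_thr

rng = np.random.default_rng(1)
X, y = rng.normal(size=(200, 5)), rng.normal(size=200)
print(ert_split(X, y, K=2, rng=rng))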

            1. Extremely randomized trees don’t apply the bagging procedure to construct a set of the training samples for each tree. The same input training set is used to train all trees.
            2. Extremely randomized trees pick a node split very extremely (both a variable index and variable splitting value are chosen randomly), whereas Random Forest finds the best split (optimal one by variable index and variable splitting value) among a random subset of variables.

              Extremely randomized trees use all of the samples as the training set; extremely randomized trees randomly pick one feature and one value as the split criterion.
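In scikit-learn the two differences above show up directly in the defaults: RandomForestRegressor bootstraps the training samples (bootstrap=True) and searches the best threshold per candidate feature, while ExtraTreesRegressor trains every tree on the full set (bootstrap=False) and draws thresholds at random. A small usage sketch on synthetic data:

from sklearn.datasets import make_regression
from sklearn.ensemble import ExtraTreesRegressor, RandomForestRegressor

X, y = make_regression(n_samples=500, n_features=20, noise=0.5, random_state=0)

# Random Forest: bagging (bootstrap=True by default) + best split among max_features features.
rf = RandomForestRegressor(n_estimators=100, max_features=5, random_state=0).fit(X, y)

# Extra-Trees: no bagging (bootstrap=False by default) + one random threshold per candidate feature.
et = ExtraTreesRegressor(n_estimators=100, max_features=5, random_state=0).fit(X, y)

print(rf.score(X, y), et.score(X, y))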

              LINKS:http://scikit-learn.org/stable/modules/generated/sklearn.tree.ExtraTreeRegressor.html#sklearn.tree.ExtraTreeRegressor

              This class implements a meta estimator that fits a number of randomized decision trees (a.k.a. extra-trees) on various sub-samples of the dataset and uses averaging to improve the predictive accuracy and control over-fitting.

              Extra-trees differ from classic decision trees in the way they are built. When looking for the best split to separate the samples of a node into two groups, random splits are drawn for each of the max_features randomly selected features and the best split among those is chosen. When max_features is set to 1, this amounts to building a totally random decision tree.

              The extra-trees ensemble here uses bagging; it then selects several features and, for each feature, randomly picks one value as the split criterion to build the tree.
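The max_features=1 remark can be checked with the single-tree class from the link above; a short sketch (synthetic data, illustrative parameter choices) of a totally random tree:

from sklearn.datasets import make_regression
from sklearn.tree import ExtraTreeRegressor

X, y = make_regression(n_samples=300, n_features=10, noise=0.5, random_state=0)

# max_features=1: at each node a single feature and a single random threshold are drawn,
# so no split search happens at all -- a totally random decision tree.
tree = ExtraTreeRegressor(max_features=1, random_state=0).fit(X, y)
print(tree.get_depth(), tree.score(X, y))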

              One possible implementation:
                     bag the samples; randomly select n features and draw k random split values per feature; take the best of those candidates; build the tree.
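That combination (sample bagging on top of random-threshold splits) can be approximated with scikit-learn's existing pieces, e.g. by wrapping a single ExtraTreeRegressor in a BaggingRegressor so each tree sees a bootstrap sample while still using one random threshold per candidate feature; a hedged sketch with illustrative parameters:

from sklearn.datasets import make_regression
from sklearn.ensemble import BaggingRegressor
from sklearn.tree import ExtraTreeRegressor

X, y = make_regression(n_samples=500, n_features=20, noise=0.5, random_state=0)

# Bag the samples (bootstrap=True), and inside each bag grow an extra-tree:
# at every node max_features features are drawn, one random threshold per feature,
# and the best of those candidates is used for the split.
model = BaggingRegressor(
    ExtraTreeRegressor(max_features=5, random_state=0),  # base tree, passed positionally
    n_estimators=50,
    bootstrap=True,
    random_state=0,
).fit(X, y)

print(model.score(X, y))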

            posted on 2016-02-28 21:01 bigrabbit