Boosting
---------
.. image:: docs/pic/Boosting.PNG

**Boosting** is an ensemble learning meta-algorithm primarily for reducing bias, and also variance, in supervised learning, and a family of machine learning algorithms that convert weak learners to strong ones. Boosting is based on a question posed by `Michael Kearns <https://en.wikipedia.org/wiki/Michael_Kearns_(computer_scientist)>`__ and Leslie Valiant (1988, 1989): can a set of weak learners create a single strong learner? A weak learner is defined as a classifier that is only slightly correlated with the true classification (it can still label examples better than random guessing). In contrast, a strong learner is a classifier that is arbitrarily well-correlated with the true classification.
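The weak-to-strong conversion can be seen directly by comparing a single decision stump (a depth-1 tree, a classic weak learner) against a boosted ensemble of such stumps. A minimal sketch using scikit-learn's ``AdaBoostClassifier``, whose default base estimator is a depth-1 stump; the synthetic dataset and parameter values here are illustrative assumptions, not from the original text:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic binary classification problem (illustrative).
X, y = make_classification(n_samples=500, random_state=0)

# Weak learner: a single decision stump (depth-1 tree).
stump = DecisionTreeClassifier(max_depth=1)

# Strong learner: 100 stumps combined by boosting
# (AdaBoostClassifier uses depth-1 stumps by default).
boosted = AdaBoostClassifier(n_estimators=100, random_state=0)

stump_acc = cross_val_score(stump, X, y, cv=5).mean()
boost_acc = cross_val_score(boosted, X, y, cv=5).mean()
print(f"stump: {stump_acc:.3f}  boosted: {boost_acc:.3f}")
```

On typical runs the boosted ensemble is noticeably more accurate than any single stump, which is exactly the weak-to-strong effect described above.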

.. code:: python

    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.pipeline import Pipeline
    from sklearn import metrics
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.feature_extraction.text import TfidfTransformer