Get Your Hands Dirty With Scikit-Learn Now - Machine Learning Mastery
print(metrics.confusion_matrix(expected, predicted))
For more information see the API reference for Logistic Regression for details on configuring the algorithm parameters. Also see the Logistic Regression section of the user guide.
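Since the Logistic Regression recipe itself falls on the previous page, here is a brief, hedged sketch (not copied from the post) of how the classifier's parameters might be configured; the values of C and max_iter below are arbitrary illustrations.
# a minimal sketch, assuming the same iris workflow as the other recipes
from sklearn import datasets
from sklearn.linear_model import LogisticRegression

dataset = datasets.load_iris()
# C is the inverse regularization strength; max_iter bounds the solver iterations
model = LogisticRegression(C=0.5, max_iter=200)
model.fit(dataset.data, dataset.target)
print(model.score(dataset.data, dataset.target))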
Naive Bayes
Naive Bayes uses Bayes' Theorem to model the conditional relationship of each attribute to the class variable.
This recipe shows the fitting of a Naive Bayes model to the iris dataset.
# Gaussian Naive Bayes
from sklearn import datasets
from sklearn import metrics
from sklearn.naive_bayes import GaussianNB
# load the iris datasets
dataset = datasets.load_iris()
# fit a Naive Bayes model to the data
model = GaussianNB()
model.fit(dataset.data, dataset.target)
print(model)
# make predictions
expected = dataset.target
predicted = model.predict(dataset.data)
# summarize the fit of the model
print(metrics.classification_report(expected, predicted))
print(metrics.confusion_matrix(expected, predicted))
For more information see the API reference for the Gaussian Naive Bayes for details on configuring the algorithm parameters. Also see the Naive Bayes section of the user guide.
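As a hedged addition that is not part of the original recipe, the short sketch below shows how the fitted Gaussian Naive Bayes model can also report per-class probabilities; the choice to inspect only the first three rows is arbitrary.
# a minimal sketch extending the recipe above (an illustration, not from the post)
from sklearn import datasets
from sklearn.naive_bayes import GaussianNB

dataset = datasets.load_iris()
model = GaussianNB()
model.fit(dataset.data, dataset.target)
# predicted probability of each class for the first three instances
print(model.predict_proba(dataset.data[:3]))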
K-Nearest Neighbor
The k-Nearest Neighbor (kNN) method makes predictions by locating similar cases to a given data instance (using a similarity function) and returning the average or majority of the most similar data instances. The kNN algorithm can be used for classification or regression.
This recipe shows use of the kNN model to make predictions for the iris dataset.
# k-Nearest Neighbor
from sklearn import datasets
from sklearn import metrics
from sklearn.neighbors import KNeighborsClassifier
# load the iris datasets
dataset = datasets.load_iris()
# fit a k-nearest neighbor model to the data
model = KNeighborsClassifier()
model.fit(dataset.data, dataset.target)
print(model)
# make predictions
expected = dataset.target
predicted = model.predict(dataset.data)
# summarize the fit of the model
print(metrics.classification_report(expected, predicted))
print(metrics.confusion_matrix(expected, predicted))
For more information see the API reference for the k-Nearest Neighbor for details on configuring the algorithm parameters. Also see the k-Nearest Neighbor section of the user guide.
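As a hedged illustration of the parameter configuration mentioned above (not taken from the post), the sketch below varies the number of neighbors and the weighting scheme; the values shown are arbitrary.
# a minimal sketch, assuming the same iris data as the recipe above
from sklearn import datasets
from sklearn.neighbors import KNeighborsClassifier

dataset = datasets.load_iris()
# n_neighbors sets k; weights='distance' gives closer neighbors more influence
model = KNeighborsClassifier(n_neighbors=3, weights='distance')
model.fit(dataset.data, dataset.target)
print(model.score(dataset.data, dataset.target))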
Classification and Regression Trees
Classification and Regression Trees (CART) are constructed from a dataset by making splits that best separate the data for the classes or predictions being made. The CART algorithm can be used for classification or regression.
This recipe shows use of the CART model to make predictions for the iris dataset.
# Decision Tree Classifier
from sklearn import datasets
from sklearn import metrics
from sklearn.tree import DecisionTreeClassifier
# load the iris datasets
dataset = datasets.load_iris()
# fit a CART model to the data
model = DecisionTreeClassifier()
model.fit(dataset.data, dataset.target)
print(model)
# make predictions
expected = dataset.target
predicted = model.predict(dataset.data)
# summarize the fit of the model
print(metrics.classification_report(expected, predicted))
print(metrics.confusion_matrix(expected, predicted))
For more information see the API reference for CART for details on configuring the algorithm parameters. Also see the Decision Tree section of the user guide.
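As a hedged example of configuring the algorithm parameters referred to above (not from the original post), the sketch below constrains tree depth and changes the split criterion; the chosen values are arbitrary.
# a minimal sketch, assuming the iris workflow used throughout this post
from sklearn import datasets
from sklearn.tree import DecisionTreeClassifier

dataset = datasets.load_iris()
# max_depth limits how deep the tree can grow; criterion selects the split measure
model = DecisionTreeClassifier(max_depth=3, criterion='entropy')
model.fit(dataset.data, dataset.target)
print(model.score(dataset.data, dataset.target))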
Support Vector Machines
Support Vector Machines (SVM) are a method that uses points in a transformed problem space that best separate classes into two groups. Classification for multiple classes is supported by a one-vs-all method. SVM also supports regression by modeling the function with a minimum amount of allowable error.
This recipe shows use of the SVM model to make predictions for the iris dataset.
# Support Vector Machine
from sklearn import datasets
from sklearn import metrics
from sklearn.svm import SVC
# load the iris datasets
dataset = datasets.load_iris()
# fit a SVM model to the data
model = SVC()
model.fit(dataset.data, dataset.target)
print(model)
# make predictions
expected = dataset.target
predicted = model.predict(dataset.data)
# summarize the fit of the model
print(metrics.classification_report(expected, predicted))
print(metrics.confusion_matrix(expected, predicted))
For more information see the API reference for SVM for details on configuring the algorithm parameters. Also see the SVM section of the user guide.
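As a hedged illustration (not part of the post), the sketch below picks a linear kernel and a regularization value for SVC; both choices are arbitrary assumptions for demonstration.
# a minimal sketch, assuming the iris workflow used in the recipe above
from sklearn import datasets
from sklearn.svm import SVC

dataset = datasets.load_iris()
# kernel selects the transformation of the problem space; C trades margin width for errors
model = SVC(kernel='linear', C=1.0)
model.fit(dataset.data, dataset.target)
print(model.score(dataset.data, dataset.target))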
Summary
In this post you have seen 5 self-contained recipes demonstrating some of the most popular and powerful supervised classification algorithms.
Each example is less than 20 lines that you can copy and paste and start using scikit-learn, right now.
Stop reading and start practicing. Pick one recipe and run it, then start to play with the parameters and see what effect that has on the results.
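As one hedged way to play with the parameters (an illustration that assumes a recent scikit-learn with the model_selection module, not something from the post), the sketch below compares a few values of k for the nearest-neighbor recipe using cross-validation.
# a minimal sketch: how a single parameter changes cross-validated accuracy
from sklearn import datasets
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

dataset = datasets.load_iris()
for n_neighbors in (1, 5, 15):
    model = KNeighborsClassifier(n_neighbors=n_neighbors)
    scores = cross_val_score(model, dataset.data, dataset.target, cv=5)
    print(n_neighbors, scores.mean())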
Take The Next Step
Are you looking to get started or make the most of the scikit-learn library without getting bogged down with the mathematics and theory of the algorithms?
[Book cover: Jump-Start Scikit-Learn - Apply Machine Learning with Scikit-Learn Now]
In this 35-page PDF guide you will discover 35 standalone scikit-learn recipes that you can copy-paste into your project.
Recipes cover data handling, supervised learning algorithms, regularization, ensemble methods and advanced topics like feature selection, cross validation and parameter tuning.
If you want to get up and running with scikit-learn fast, this recipe book is for you!
About Jason Brownlee
Editor-in-Chief at MachineLearningMastery.com. Dr. Brownlee is a husband, father, professional programmer and a machine learning enthusiast. Learn more about him.
View all posts by Jason Brownlee.