Supervised Learning: Classification
In this course, we continue our study of supervised learning with another important application: solving classification problems. In the following lessons, you will learn about logistic regression, the k-nearest neighbor algorithm, naive Bayes, support vector machines, the perceptron and artificial neural networks, decision trees and random forests, and bagging and boosting methods. Each lesson begins with the principle behind the method. You are expected to fully understand how the core algorithm is implemented, and then learn to use scikit-learn to build your own models. This week's content is considerably larger than the first week's, so please give the lessons due attention.
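To illustrate what "understanding the implementation of a core algorithm" looks like before reaching for scikit-learn, here is a minimal sketch of one of the methods listed above, the k-nearest neighbor classifier, written from scratch with only the Python standard library (the function name and toy dataset are illustrative, not from the course material):

```python
from collections import Counter
import math

def knn_predict(train_X, train_y, query, k=3):
    """Classify `query` by majority vote among its k nearest training points."""
    # Compute the Euclidean distance from the query to every training point.
    dists = [(math.dist(query, x), label) for x, label in zip(train_X, train_y)]
    # Sort by distance and take the k closest neighbors.
    dists.sort(key=lambda pair: pair[0])
    # Majority vote over the neighbors' labels.
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy 2-D dataset: two small clusters labeled 0 and 1.
X = [(0.0, 0.0), (0.1, 0.2), (1.0, 1.0), (0.9, 1.1)]
y = [0, 0, 1, 1]
print(knn_predict(X, y, (0.05, 0.1)))  # query near the first cluster → 0
```

In the labs, the same model would typically be built with scikit-learn's `KNeighborsClassifier`, but working through a plain implementation like this first makes the library's behavior much easier to reason about.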

Lab: Logistic Regression

Lab: K-Nearest Neighbor Algorithm

Lab: Naive Bayes

Challenge: Implement and Plot the Gaussian Distribution Function

Lab: Support Vector Machines

Lab: Perceptron and Artificial Neural Network

Challenge: Train a Handwritten Digit Recognition Neural Network

Lab: Decision Tree

Lab: Bagging and Boosting Methods

Challenge: Quickly Select Models with Cross-Validation