scikit-learn Tutorials

scikit-learn offers a systematic approach to machine learning in Python. Our tutorials cover a range of ML algorithms, model selection, and evaluation techniques, suitable for both beginner and intermediate data scientists. With free labs and practical code examples, you'll get hands-on experience building ML models. Our data science playground lets you experiment with scikit-learn functions and datasets in real time.

Classifying Iris Using SVM

In this project, you will learn how to classify the iris dataset using a Support Vector Classifier (SVC) model. The iris dataset is a classic machine learning dataset that contains information about different species of irises, including their sepal length, sepal width, petal length, and petal width.
Python · scikit-learn
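
A minimal sketch of the workflow this project builds up to; the train/test split and hyperparameters below are illustrative choices, not the project's exact code:

    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    # Load the four iris measurements (X) and the species labels (y)
    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

    clf = SVC(kernel="rbf", C=1.0)      # illustrative hyperparameters
    clf.fit(X_train, y_train)
    print(clf.score(X_test, y_test))    # mean accuracy on the held-out split
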
Kernel Ridge Regression

In this lab, we will learn about Kernel Ridge Regression (KRR) and its implementation using the scikit-learn library in Python. KRR combines ridge regression with the kernel trick, learning a linear function in the space induced by the kernel; in the original input space this corresponds to a non-linear function, so KRR can handle non-linear relationships between input and output variables.
Machine Learning · scikit-learn
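
A small illustrative example of fitting KernelRidge on synthetic non-linear data; the data and kernel settings are assumptions for demonstration only:

    import numpy as np
    from sklearn.kernel_ridge import KernelRidge

    # Noisy sine wave: a non-linear relationship between x and y
    rng = np.random.RandomState(0)
    X = rng.uniform(0, 5, size=(100, 1))
    y = np.sin(X).ravel() + rng.normal(scale=0.1, size=100)

    krr = KernelRidge(kernel="rbf", alpha=1.0, gamma=0.5)  # illustrative settings
    krr.fit(X, y)
    y_pred = krr.predict(X)    # smooth non-linear fit to the noisy observations
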
Linear Models in Scikit-Learn

In this lab, we will explore linear models in scikit-learn. Linear models are a set of methods used for regression and classification tasks. They assume that the target variable is a linear combination of the features. These models are widely used in machine learning due to their simplicity and interpretability.
Machine Learning · scikit-learn
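
For a flavour of the API, here is a toy comparison of three linear models; the tiny dataset and regularization strengths are purely illustrative:

    import numpy as np
    from sklearn.linear_model import LinearRegression, Ridge, Lasso

    # A tiny one-feature dataset where y is roughly a linear function of x
    X = np.array([[0.0], [1.0], [2.0], [3.0]])
    y = np.array([0.1, 1.1, 1.9, 3.2])

    for model in (LinearRegression(), Ridge(alpha=0.5), Lasso(alpha=0.1)):
        model.fit(X, y)
        # Each model exposes its learned weights and intercept directly
        print(type(model).__name__, model.coef_, model.intercept_)
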
Discriminant Analysis Classifiers Explained

Linear and Quadratic Discriminant Analysis (LDA and QDA) are two classic classifiers used in machine learning. LDA uses a linear decision surface, while QDA uses a quadratic decision surface. These classifiers are popular because they have closed-form solutions, work well in practice, and have no hyperparameters to tune.
Machine Learning · scikit-learn
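
A brief sketch of fitting both classifiers on the iris data; the dataset choice is illustrative, not prescribed by the lab:

    from sklearn.datasets import load_iris
    from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                               QuadraticDiscriminantAnalysis)

    X, y = load_iris(return_X_y=True)
    lda = LinearDiscriminantAnalysis().fit(X, y)     # linear decision surface
    qda = QuadraticDiscriminantAnalysis().fit(X, y)  # quadratic decision surface
    print(lda.score(X, y), qda.score(X, y))          # training accuracy of each
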
Exploring Scikit-Learn Datasets and Estimators

In this lab, we will explore datasets and estimator objects in scikit-learn, a popular machine learning library in Python. We will learn about datasets, which are represented as 2D arrays, and how to preprocess them for scikit-learn. We will also explore the concept of estimator objects, which are used to learn from data and make predictions.
Machine Learning · scikit-learn
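
A minimal sketch of the dataset-plus-estimator pattern, using the digits dataset and an SVC as illustrative choices:

    from sklearn import datasets
    from sklearn.svm import SVC

    digits = datasets.load_digits()
    print(digits.data.shape)        # 2D array of shape (n_samples, n_features)

    estimator = SVC(gamma=0.001)    # an estimator object with its hyperparameters
    estimator.fit(digits.data[:-1], digits.target[:-1])   # learn from all but the last sample
    print(estimator.predict(digits.data[-1:]))            # predict on the held-out sample
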
Exploring Scikit-Learn SGD Classifiers

In this lab, we will explore Stochastic Gradient Descent (SGD), which is a powerful optimization algorithm commonly used in machine learning for solving large-scale and sparse problems. We will learn how to use the SGDClassifier and SGDRegressor classes from the scikit-learn library to train linear classifiers and regressors.
Machine Learning · scikit-learn
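
A short example of the SGDClassifier workflow on synthetic data; scaling features first is common practice with SGD, and the settings shown are illustrative:

    from sklearn.datasets import make_classification
    from sklearn.linear_model import SGDClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

    # Standardize features, then fit a linear classifier trained with SGD
    clf = make_pipeline(StandardScaler(),
                        SGDClassifier(loss="hinge", max_iter=1000, tol=1e-3))
    clf.fit(X, y)
    print(clf.score(X, y))
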
Working with Text Data

In this lab, we will explore how to work with text data using scikit-learn, a popular machine learning library in Python. We will learn how to load text data, preprocess it, extract features, train a model, and evaluate its performance.
Machine Learning · scikit-learn
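
A toy sketch of a text pipeline, with a made-up four-document corpus standing in for real data:

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline

    # Tiny illustrative corpus: 1 = spam-like, 0 = ham-like
    docs = ["free money now", "meeting at noon", "win a free prize", "project meeting notes"]
    labels = [1, 0, 1, 0]

    # Vectorize the text into TF-IDF features, then train a classifier on them
    model = make_pipeline(TfidfVectorizer(), MultinomialNB())
    model.fit(docs, labels)
    print(model.predict(["free prize meeting"]))
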
Ensemble Methods Exploration with Scikit-Learn

In this lab, we will explore ensemble methods using scikit-learn. Ensemble methods are machine learning techniques that combine multiple models to achieve better performance than a single model. We will specifically focus on two popular ensemble methods: Bagging and Random Forests.
Machine Learning · scikit-learn
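
A compact illustration comparing the two methods on synthetic data; the sample sizes and estimator counts are arbitrary illustrative values:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import BaggingClassifier, RandomForestClassifier

    X, y = make_classification(n_samples=500, random_state=0)

    bagging = BaggingClassifier(n_estimators=50, random_state=0)     # bags of decision trees
    forest = RandomForestClassifier(n_estimators=100, random_state=0)
    for model in (bagging, forest):
        model.fit(X, y)
        print(type(model).__name__, model.score(X, y))
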
Nonlinear Regression with Isotonic

In this lab, we will explore isotonic regression using scikit-learn. Isotonic regression is a technique that fits a non-decreasing function to one-dimensional data. It is useful when you have data that does not satisfy the assumption of linearity in a regression model.
Machine Learning · scikit-learn
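
A small sketch of fitting isotonic regression to noisy synthetic data with an increasing trend:

    import numpy as np
    from sklearn.isotonic import IsotonicRegression

    # Noisy observations of a non-linear but increasing function of x
    rng = np.random.RandomState(0)
    x = np.arange(50)
    y = 50.0 * np.log1p(x) + rng.normal(scale=5, size=50)

    iso = IsotonicRegression(out_of_bounds="clip")
    y_fit = iso.fit_transform(x, y)    # fitted values are non-decreasing in x
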
Multiclass and Multioutput Algorithms

In this lab, we will explore the functionality and usage of multiclass and multioutput algorithms in scikit-learn. Multiclass classification is a classification task where samples are assigned to one of more than two classes. Multioutput classification, on the other hand, predicts multiple properties for each sample.
Machine Learning · scikit-learn
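
A rough sketch contrasting the two settings, using a logistic regression base estimator and synthetic targets as illustrative assumptions:

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.multiclass import OneVsRestClassifier
    from sklearn.multioutput import MultiOutputClassifier

    X, y = make_classification(n_samples=200, n_classes=3, n_informative=5, random_state=0)

    # Multiclass: one label per sample, drawn from 3 classes, via one-vs-rest
    ovr = OneVsRestClassifier(LogisticRegression(max_iter=1000)).fit(X, y)

    # Multioutput: two target columns per sample (the class plus a binary property)
    Y = np.column_stack([y, (y == 2).astype(int)])
    multi = MultiOutputClassifier(LogisticRegression(max_iter=1000)).fit(X, Y)
    print(ovr.predict(X[:3]), multi.predict(X[:3]))
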
Supervised Learning with Scikit-Learn

In supervised learning, we want to learn the relationship between two datasets: the observed data X and an external variable y that we want to predict.
Machine Learning · scikit-learn
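
A minimal fit/predict sketch of that idea, using a nearest-neighbours classifier on iris as an illustrative example:

    from sklearn import datasets
    from sklearn.neighbors import KNeighborsClassifier

    iris_X, iris_y = datasets.load_iris(return_X_y=True)

    knn = KNeighborsClassifier()
    knn.fit(iris_X[:-10], iris_y[:-10])   # learn the mapping X -> y on most of the data
    print(knn.predict(iris_X[-10:]))      # predicted y for the held-out samples
    print(iris_y[-10:])                   # true y for comparison
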
Gaussian Process Regression and Classification

In this lab, we will explore Gaussian Processes (GP), a supervised learning method used for regression and probabilistic classification problems. Gaussian Processes are versatile: they can interpolate observations, provide probabilistic predictions, and work with different types of kernels. We will focus on Gaussian Process Regression (GPR) and Gaussian Process Classification (GPC) using the scikit-learn library.
Machine Learning · scikit-learn
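
A short GPR sketch on synthetic one-dimensional data; the RBF kernel and noise level are illustrative choices:

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    X = np.linspace(0, 5, 20).reshape(-1, 1)
    y = np.sin(X).ravel()

    gpr = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=1e-2)
    gpr.fit(X, y)
    mean, std = gpr.predict(X, return_std=True)  # probabilistic prediction: mean and std per point
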
Decision Tree Classification with Scikit-Learn

In this lab, we will learn how to use Decision Trees for classification using scikit-learn. Decision Trees are a non-parametric supervised learning method used for classification and regression. They are simple to understand and interpret, and can handle both numerical and categorical data.
Machine Learning · scikit-learn
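
A brief sketch of training a small tree on iris and printing its learned rules; the depth limit is chosen only for illustration:

    from sklearn.datasets import load_iris
    from sklearn.tree import DecisionTreeClassifier, export_text

    X, y = load_iris(return_X_y=True)
    tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
    print(export_text(tree))   # the learned if/else rules, which make the model easy to interpret
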
Model Selection: Choosing Estimators and Their Parameters

In machine learning, model selection is the process of choosing the best model for a given dataset. It involves selecting the appropriate estimator and tuning its parameters to achieve optimal performance. This tutorial will guide you through the process of model selection in scikit-learn.
Machine Learning · scikit-learn
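
As a sketch of the idea, a cross-validated grid search over SVC hyperparameters on the digits data; the grid values are illustrative, not recommendations:

    from sklearn.datasets import load_digits
    from sklearn.model_selection import GridSearchCV
    from sklearn.svm import SVC

    X, y = load_digits(return_X_y=True)
    param_grid = {"C": [0.1, 1, 10], "gamma": [1e-3, 1e-4]}

    # Try every parameter combination with 5-fold cross-validation
    search = GridSearchCV(SVC(), param_grid, cv=5)
    search.fit(X, y)
    print(search.best_params_, search.best_score_)
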
Semi-Supervised Learning Algorithms

In this lab, we will explore the concept of semi-supervised learning, which is a type of machine learning where some of the training data is labeled and some is unlabeled. Semi-supervised learning algorithms can leverage the unlabeled data to improve the model's performance and generalize better to new samples. This is particularly useful when we have a small amount of labeled data but a large amount of unlabeled data.
Machine Learning · scikit-learn
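
A toy sketch of the setting: most iris labels are hidden (marked -1, the unlabeled marker) and LabelSpreading infers them from the labeled ones; the proportion hidden is arbitrary:

    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.semi_supervised import LabelSpreading

    X, y = load_iris(return_X_y=True)
    rng = np.random.RandomState(0)

    y_partial = y.copy()
    y_partial[rng.rand(len(y)) < 0.7] = -1        # treat ~70% of samples as unlabeled
    model = LabelSpreading().fit(X, y_partial)
    print((model.transduction_ == y).mean())      # agreement with the true labels over all points
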
Neural Network Models

In this lab, we will learn about neural network models and how they can be used in supervised learning tasks. Neural networks are a popular type of machine learning algorithm that can learn non-linear patterns in data. They are often used for classification and regression tasks.
Machine Learning · scikit-learn
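
A minimal MLPClassifier sketch on a synthetic non-linear dataset; the layer sizes and iteration budget are illustrative:

    from sklearn.datasets import make_moons
    from sklearn.neural_network import MLPClassifier

    # Two interleaving half-moons: not separable by a linear boundary
    X, y = make_moons(n_samples=200, noise=0.2, random_state=0)

    mlp = MLPClassifier(hidden_layer_sizes=(20, 20), max_iter=2000, random_state=0)
    mlp.fit(X, y)
    print(mlp.score(X, y))   # the learned non-linear decision boundary fits the two moons
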
Implementing Stochastic Gradient Descent

Stochastic Gradient Descent (SGD) is a popular optimization algorithm used in machine learning. It is a variation of the gradient descent algorithm that uses a randomly selected subset of the training data at each iteration. This makes it computationally efficient and suitable for handling large datasets. In this lab, we will walk through the steps of implementing SGD in Python using scikit-learn.
Machine Learning · scikit-learn
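
A rough sketch of SGD's incremental character, streaming mini-batches through partial_fit; the batch size and synthetic data are illustrative assumptions:

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import SGDClassifier

    X, y = make_classification(n_samples=10_000, n_features=20, random_state=0)
    clf = SGDClassifier()
    classes = np.unique(y)   # must be declared up front for incremental learning

    # Update the model one mini-batch at a time instead of on the full dataset
    for start in range(0, len(X), 1000):
        batch = slice(start, start + 1000)
        clf.partial_fit(X[batch], y[batch], classes=classes)
    print(clf.score(X, y))
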
Naive Bayes Example

In this lab, we will go through an example of using Naive Bayes classifiers from the scikit-learn library in Python. Naive Bayes classifiers are a set of supervised learning algorithms that are commonly used for classification tasks. These classifiers are based on applying Bayes' theorem with the assumption of conditional independence between every pair of features given the value of the class variable.
Machine Learning · scikit-learn
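
A compact GaussianNB sketch on iris, in the spirit of the example this lab walks through; the dataset and split here are illustrative:

    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.naive_bayes import GaussianNB

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    gnb = GaussianNB().fit(X_train, y_train)
    # Count how many test samples the classifier gets wrong
    print((gnb.predict(X_test) != y_test).sum(), "mislabeled points out of", len(y_test))
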