2024 Python for Linear Regression in Machine Learning


Linear & Non-Linear Regression, Lasso & Ridge Regression, SHAP, LIME, Yellowbrick, Feature Selection & Outliers Removal

What you will learn

Analyze and visualize data using Linear Regression

Plot Linear Regression results to analyze them visually

Learn how to interpret and explain machine learning models

Do in-depth analysis of various forms of Linear and Non-Linear Regression

Use Yellowbrick, SHAP, and LIME to interpret and explain the predictions of machine learning models

Do feature selection and transformations to fine-tune machine learning models

The course covers results-oriented algorithms and data exploration techniques

Description

Unlock the power of machine learning with our comprehensive Python course on linear regression. Learn how to use Python to analyze data and build predictive models. This course is perfect for beginners with little or no programming experience and experienced Python developers looking to expand their skill set.

You’ll start with the basics of Python and work your way up to advanced techniques like linear regression, a powerful tool for predicting future outcomes from historical data. Along the way, you’ll gain hands-on experience with popular Python libraries such as NumPy, Pandas, and Matplotlib. We will also cover the important aspects of model optimization, interpretability, and feature selection. You will learn how to optimize your model to improve its performance, and how to interpret the model’s results to understand the underlying relationships in your data. We will also discuss feature selection techniques used to identify the most essential features that drive the predictions.

By the end of the course, you’ll have a solid understanding of how to use Python to build linear regression models and make accurate predictions. You’ll also be able to apply your new skills to a wide range of machine learning and data science projects. So, if you’re ready to take your Python skills to the next level and start using machine learning to analyze and predict real-world outcomes, this is the course for you!




What is covered in this course?

This course teaches you, step by step, how to code Linear Regression in Python. The Linear Regression model is one of the most widely used models in machine learning, and one of the simplest, yet there is plenty of depth to explore across 14+ hours of video.

Below are the contents of this course:

  • Section 1 - Introduction: This section gets you started with the setup. Download the resource files so you can code along.
  • Section 2 - Python Crash Course: This section introduces you to the basics of Python programming.
  • Section 3 - NumPy Introduction: This section is optional; you may skip it, but I recommend watching it if you are not comfortable with NumPy.
  • Section 4 - Pandas Introduction: This section introduces the basic concepts of Pandas and will help you keep up with the coding later in the course.
  • Section 5 - Matplotlib Introduction: Do not skip this section. We will use Matplotlib plots extensively in the coming sections; it builds the foundation for strong visualization of linear regression results.
  • Section 6 - Linear Regression Introduction: Here we kick-start our Linear Regression learning. You will learn the basics of linear regression and work through examples so that you understand how it works and how to analyze the results (a minimal code sketch follows this list).
  • Section 7 - Data Preprocessing for Linear Regression: This is the most important section. DO NOT SKIP IT. It builds the foundation of data preprocessing for linear regression and other linear machine learning models. You will learn which techniques can improve model performance and how to check whether your data satisfies the linear model assumptions.
  • Section 8 - Machine Learning Models Interpretability and Explainers: This section teaches you how to open up any machine learning model. You no longer need to treat models as black boxes; you will learn how to look inside and analyze each component of a model.
  • Section 9 - Linear Regression Model Optimization: This section builds heavily on the previous ones, so don't skip them. You will learn various techniques to improve model performance, including outlier removal and feature transformations.
  • Section 10 - Feature Selection for Linear Regression: This section teaches some of the best feature selection techniques. Feature selection reduces model complexity and the chance of overfitting, and can also speed up training, depending on how many features are selected and the type of model.
  • Section 11 - Ridge & Lasso Regression, ElasticNet, and Nonlinear Regression: This section covers various regression techniques and shows how to achieve the best accuracy using them.
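
As mentioned under Section 6, here is a minimal sketch of the basic workflow the course builds on: split the data, fit a linear model, and score it. It is illustrative only, not the course's exact code; the course works with the Boston Housing data, which recent scikit-learn releases no longer ship, so the California housing dataset stands in here.

```python
# Minimal linear regression workflow (illustrative sketch, not course code).
import numpy as np
from sklearn.datasets import fetch_california_housing
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

# Load features X and target y (median house value).
X, y = fetch_california_housing(return_X_y=True, as_frame=True)

# Hold out 20% of the rows for evaluation.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = LinearRegression()
model.fit(X_train, y_train)

# Evaluate on the held-out data.
y_pred = model.predict(X_test)
print("R^2 :", r2_score(y_test, y_pred))
print("RMSE:", np.sqrt(mean_squared_error(y_test, y_pred)))
```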

By the end of this course, you will be confident creating and analyzing Linear Regression models in Python. You’ll have a thorough understanding of how to use regression modeling to create predictive models and solve real-world business problems.

Language: English

Content

Introduction

Introduction
Resources Folder | DO NOT SKIP
Install Anaconda and Python 3 on Windows 10
Install Anaconda and Python 3 on Ubuntu Machine
Jupyter Notebook Shortcuts

Python Crash Course

Introduction
Data Types
Variable Assignment
String Assignment
List
Set
Tuple
Dictionary
Boolean and Comparison Operator
Logical Operator
If, Else, Elif
Loops in Python
Methods and Lambda Function

Numpy Introduction [Optional]

Introduction
Array
NaN and INF
Statistical Operations
Shape, Reshape, Ravel, Flatten
Sequence, Repetitions, and Random Numbers
Where(), ArgMax(), ArgMin()
File Read and Write
Concatenate and Sorting
Working with Dates

Pandas Introduction

Introduction
DataFrame and Series
File Reading and Writing
Info, Shape, Duplicated and Drop
Columns
NaN and Null Values
Imputation
Lambda Functions

Matplotlib Introduction

Introduction
Line Plot
Label for X-Axis and Y-Axis
Scatter Plot, Bar Plot, and Hist Plot
Box Plot
Subplot
xlim, ylim, xticks, and yticks
Pie Plot
Pie Plot Text Color
Nested Pie Plot
Labeling a Pie Plot
Bar Chart on Polar Axis
Line Plot on a Polar Axis
Scatter Plot on a Polar Axis
Integral in Calculus Plot as Area Under the Curve

Linear Regression Introduction

Linear Regression Introduction
Regression Examples
Types of Linear Regression
Assessing the performance of the model
Bias-Variance tradeoff
What is sklearn and train-test-split
Python Package Upgrade and Import
Load Boston Housing Dataset
Dataset Analysis
Exploratory Data Analysis- Pair Plot
Exploratory Data Analysis- Hist Plot
Exploratory Data Analysis- Heatmap
Train Test Split and Model Training
How to Evaluate the Regression Model Performance
Plot True House Price vs Predicted Price
Plotting Learning Curves Part 1
Plotting Learning Curves Part 2
Machine Learning Model Interpretability- Residuals Plot
Machine Learning Model Interpretability- Prediction Error Plot
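
The residuals and prediction-error lectures above use Yellowbrick's regression visualizers. Here is a rough sketch of how they are typically called, assuming the X_train/X_test split from the earlier sketch; it is not the course's exact code.

```python
# Yellowbrick regression diagnostics (sketch; assumes X_train, X_test,
# y_train, y_test already exist, e.g. from the earlier train/test split).
from sklearn.linear_model import LinearRegression
from yellowbrick.regressor import PredictionError, ResidualsPlot

# Residuals vs. predicted values: a random scatter around zero supports
# the linearity and equal-variance assumptions.
viz = ResidualsPlot(LinearRegression())
viz.fit(X_train, y_train)
viz.score(X_test, y_test)
viz.show()

# Predicted vs. true values, with the identity line for reference.
viz = PredictionError(LinearRegression())
viz.fit(X_train, y_train)
viz.score(X_test, y_test)
viz.show()
```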

Data Preprocessing for Linear Regression

Linear Model Assumption for Linear Regression
Definitions of Linear Model Assumptions
Load Boston Dataset
Create Reference Data
Check Linear Assumption for Boston Dataset Part 1
Check Linear Assumption for Boston Dataset Part 2
Log Transformation of Variables
Types of Variable Transformations
Reciprocal Transformation
sqrt and exp Transformation
Box-Cox Transformation
Yeo-Johnson Transformation
Check Variables Normality with Histogram
Check Variables Normality with Q-Q Plot
Variable Transformation for Normality
Check Variables Homoscedasticity
Variable Transformation for Homoscedasticity Part 1
Variable Transformation for Homoscedasticity Part 2
How to Check Multicollinearity
Normalization and Standardization Introduction
Normalization and Standardization Coding
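
A small sketch of the kind of preprocessing checks listed above: a Q-Q plot for normality, a Yeo-Johnson transformation, and variance inflation factors for multicollinearity. The DataFrame df and the column name "LSTAT" are placeholders, not the course's exact variables.

```python
# Preprocessing checks (sketch; df is a hypothetical DataFrame of numeric
# features and "LSTAT" an example column name).
import matplotlib.pyplot as plt
import pandas as pd
from scipy import stats
from sklearn.preprocessing import PowerTransformer
from statsmodels.stats.outliers_influence import variance_inflation_factor
from statsmodels.tools.tools import add_constant

# Q-Q plot: points lying close to the line suggest a roughly normal variable.
stats.probplot(df["LSTAT"], dist="norm", plot=plt)
plt.show()

# Yeo-Johnson transformation (handles zero/negative values, unlike Box-Cox).
pt = PowerTransformer(method="yeo-johnson")
df_transformed = pd.DataFrame(pt.fit_transform(df), columns=df.columns)

# Variance inflation factors: values above ~10 usually flag multicollinearity.
X_const = add_constant(df)
vif = pd.Series(
    [variance_inflation_factor(X_const.values, i) for i in range(X_const.shape[1])],
    index=X_const.columns,
)
print(vif.drop("const").sort_values(ascending=False))
```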

Machine Learning Models Interpretability and Explainer

Machine Learning Models Interpretability
Recap
Prediction Error Plot
Residuals Plot
Explain Machine Learning Models with LIME Part 1
Explain Machine Learning Models with LIME Part 2
Explain Machine Learning Models with LIME Part 3
Machine Learning Models Explainer Summary with SHAP
Explain Machine Learning Models with Dependence Plot
Explain Machine Learning Models with The Individual Force Plot
Explain Machine Learning Models with The Collective Force Plot
Explain Machine Learning Models with Shap Heatmap
Explain Machine Learning Models with SHAP Waterfall Plots
Explain Feature Importance in Machine Learning Models
Explain Feature Selection with SHAP
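
To show roughly how the SHAP and LIME lectures fit together, here is a hedged sketch that assumes a fitted linear model named model and pandas DataFrames X_train/X_test from the earlier examples; it is not the course's exact code.

```python
# Explaining a fitted regression model with SHAP and LIME (sketch; assumes a
# fitted `model` plus pandas DataFrames X_train / X_test from earlier).
import shap
from lime.lime_tabular import LimeTabularExplainer

# SHAP: global summary plus a per-prediction waterfall breakdown.
explainer = shap.Explainer(model, X_train)
shap_values = explainer(X_test)
shap.plots.beeswarm(shap_values)      # overall feature importance
shap.plots.waterfall(shap_values[0])  # explanation of the first prediction

# LIME: a local surrogate model fitted around one test instance.
lime_explainer = LimeTabularExplainer(
    X_train.values, feature_names=list(X_train.columns), mode="regression"
)
lime_exp = lime_explainer.explain_instance(
    X_test.values[0], model.predict, num_features=5
)
print(lime_exp.as_list())
```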

Linear Regression Model Optimization

Recap
Check Linear Model Assumptions on Selected Features
Check Linear Model Assumptions on Selected Features Part 2
Detect Outliers in Machine Learning Datasets
Outliers Visualization Plot
Outlier Detection for Normal Variables
Outlier Detection for Skewed Variables
Types of Outliers Removal Techniques
Outliers Removal by Using Feature-Engine
Model Evaluation After Removing the Outliers
Model Evaluation After Feature Transformations and Outliers Removal
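
The outlier lectures above rely on the Feature-engine library; as a simpler illustration of the same idea, here is a plain IQR filter. The DataFrame df and the column "CRIM" are placeholders, not the course's exact variables.

```python
# IQR-based outlier filter (illustrative only; the course uses Feature-engine
# for this step). df and the column "CRIM" are hypothetical placeholders.
q1, q3 = df["CRIM"].quantile([0.25, 0.75])
iqr = q3 - q1
lower, upper = q1 - 1.5 * iqr, q3 + 1.5 * iqr

# Keep only the rows whose value falls inside the IQR fences.
df_clean = df[df["CRIM"].between(lower, upper)]
print(f"Removed {len(df) - len(df_clean)} outlier rows")
```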

Feature Selection for Linear Regression

Introduction to Feature Selection
Introduction to A Python API for Intelligent Visual Discovery
Recap
Visualization of Data with LUX API
Select Most Correlated Features with House Price Part 1
Select Most Correlated Features with House Price Part 2
Model Performance Evaluation
Remove Correlated Input Features (Multicollinearity)
Recursive Feature Elimination (RFE) Introduction
Recursive Feature Elimination (RFE) Coding
Incremental RFE
Exhaustive Feature Selection (EFS)
Feature Selection by Linear Regression Coefficients
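
As a rough sketch of the RFE lectures above, here is scikit-learn's RFE wrapped around a linear model; the number of features to keep is arbitrary, and X_train/y_train are assumed from the earlier examples.

```python
# Recursive Feature Elimination with a linear model (sketch; assumes the
# X_train / y_train DataFrames from the earlier examples).
from sklearn.feature_selection import RFE
from sklearn.linear_model import LinearRegression

rfe = RFE(estimator=LinearRegression(), n_features_to_select=5)
rfe.fit(X_train, y_train)

# support_ is a boolean mask of kept features; ranking_ assigns 1 to kept ones.
print("Selected features:", list(X_train.columns[rfe.support_]))
print("Feature ranking  :", dict(zip(X_train.columns, rfe.ranking_)))
```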

Ridge & Lasso Regression, ElasticNet, and Nonlinear Regression

Introduction to Regularization
Recap
Ridge Regression
Lasso Regression
Elastic Net
Polynomial Regression
Polynomial Regression with Variable Transformations
Polynomial Regression with Feature Selection
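
To close, a small sketch comparing the regularized and polynomial models this section covers, again assuming the X_train/X_test split from the earlier examples; the alpha values are arbitrary illustrations, not tuned settings.

```python
# Regularized and polynomial regression side by side (sketch; assumes
# X_train, X_test, y_train, y_test from the earlier examples).
from sklearn.linear_model import ElasticNet, Lasso, LinearRegression, Ridge
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler

models = {
    "Ridge": Ridge(alpha=1.0),                          # L2 penalty shrinks coefficients
    "Lasso": Lasso(alpha=0.1),                          # L1 penalty can zero them out
    "ElasticNet": ElasticNet(alpha=0.1, l1_ratio=0.5),  # mix of L1 and L2
    "Polynomial": make_pipeline(                        # degree-2 nonlinear terms
        PolynomialFeatures(degree=2), StandardScaler(), LinearRegression()
    ),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name:>10} test R^2: {model.score(X_test, y_test):.3f}")
```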