## Love Data Science and Deep Learning but get confused by all the maths and formulas and just want a simple explanation and set of examples? Me too. So let’s correct that now, shall we?

The goal of this article is to provide a simple, explainable example of building PyTorch Deep Learning models covering Regression, Classification (Binary and Multi-Class), and Multi-Label Classification.

I found dozens, if not hundreds, of examples of each, but none that presented them all in a similar way, and most quickly descended into big maths formulas. Flitting and swapping between different styles and approaches confused me time after time, so I wanted something simple I could re-use.

I also wanted to keep them as bare-bones as possible, using the lovely, widely available libraries and modules, with as little bespoke 'my interpretation' code as possible.

As such, I did not want to get bogged down in EDA, Feature Engineering, and all the other stuff that is hugely important (I would say more so than the model); I wanted to keep this as simple as possible, just a framework for models.

With that in mind, we will be using the wonderful **make_regression**, **make_classification**, and **make_multilabel_classification** from **sklearn.datasets**, mimicking the state your data will be in once you have done all your EDA and Feature Engineering and are ready for your first baseline model. This means we will not be doing any Label Encoding, addressing Imbalance, etc.

I also wanted to stay away from maths completely. I will explain why we are doing what we are doing without symbols and formulas and algorithms.

This is not just to give you some code to cut and paste, but rather to show you some of the errors I faced along the way, resulting in (I hope) a useful set of functions and information.

I wrote this to give myself a starter notebook I can use for a wide array of purposes, and in doing so I hope it helps others too. So here we go.

**Prepare the Notebook**

First, load the relevant modules. Basically just **sklearn**, **torch**, **NumPy**, and **matplotlib**.

```python
from sklearn import metrics
from sklearn.model_selection import train_test_split
from sklearn.datasets import make_regression, make_classification, make_multilabel_classification
from sklearn.preprocessing import LabelEncoder, StandardScaler, MinMaxScaler

import torch
from torch.utils.data import Dataset, DataLoader
import torch.optim as torch_optim
import torch.nn as nn
import torch.nn.functional as F

import numpy...
```
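To show what those **sklearn.datasets** generators hand back, here is a minimal sketch (my own illustration, not code from the article) that creates each kind of toy dataset and converts one split to PyTorch tensors; the sample counts, feature counts, and `random_state` values are arbitrary choices for the example.

```python
from sklearn.datasets import make_regression, make_classification, make_multilabel_classification
from sklearn.model_selection import train_test_split
import torch

# Regression: 1000 samples, 20 features, a continuous target
X, y = make_regression(n_samples=1000, n_features=20, noise=0.1, random_state=42)

# Binary classification: the target is 0 or 1
Xc, yc = make_classification(n_samples=1000, n_features=20, n_classes=2, random_state=42)

# Multi-label classification: each sample can belong to several of 5 classes at once,
# so the target is a (1000, 5) matrix of 0/1 indicators
Xm, ym = make_multilabel_classification(n_samples=1000, n_features=20, n_classes=5,
                                        random_state=42)

# Split, then convert to float tensors ready for a Dataset / DataLoader
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
X_train = torch.tensor(X_train, dtype=torch.float32)
y_train = torch.tensor(y_train, dtype=torch.float32)
```

This is the shape of data the rest of the article assumes: clean numeric features and a target, with all the messy preparation already behind you.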

Continue reading: https://towardsdatascience.com/a-simple-maths-free-pytorch-model-framework-3eedfd738bd4?source=rss—-7f60cf5620c9—4

Source: towardsdatascience.com