import numpy as np
import pandas as pd
import plotly.express as px
import plotly.graph_objects as go
from plotly.subplots import make_subplots
In this notebook, we provide a very quick (shallow?) introduction to neural networks and deep learning. We review the basic challenge of binary classification with linear decision functions, and then show how features can be composed to express more complex decision surfaces. We then build a basic neural network to learn the feature functions, and ultimately construct more complex models for image classification.
We start by reviewing logistic regression. We construct a linearly separable data set and show how a logistic regression model fits this data.
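As a quick refresher (using $\theta$ for the parameter vector and $\sigma$ for the sigmoid, which is just our notation here), logistic regression models the probability of the positive class as

$$
\hat{P}\left(Y = 1 \mid x\right) = \sigma\left(x^T \theta\right) = \frac{1}{1 + \exp\left(-x^T \theta\right)},
$$

and predicts the positive class whenever this probability exceeds $1/2$, which corresponds to the linear decision boundary $x^T \theta = 0$.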
n = 50
np.random.seed(42)
# Generate 4*n points in 4 clusters centered at (3,3), (-3,3), (-3,-3), and (3,-3).
x = np.random.randn(4*n, 2) + 3*np.tile([[1,1], [-1,1], [-1,-1], [1,-1]], (n, 1))
# Label each point positive when X1 > 0, so the data is separable by the line X1 = 0.
y = x[:,0] > 0
# Combine features and labels into a DataFrame and shuffle the rows.
data = pd.DataFrame(np.hstack([x, y[:,np.newaxis]]), columns=["X1", "X2", "Y"]).sample(frac=1)
pos_ind = data["Y"]==1.0
pos_scatter = go.Scatter(x=data.loc[pos_ind,"X1"], y=data.loc[pos_ind,"X2"],
mode="markers", marker_symbol="cross", name="Pos")
neg_scatter = go.Scatter(x=data.loc[~pos_ind,"X1"], y=data.loc[~pos_ind,"X2"],
mode="markers", name="Neg")
go.Figure([pos_scatter, neg_scatter])
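To see how a logistic regression model fits this data, here is a minimal sketch using scikit-learn's LogisticRegression (scikit-learn is an assumption here; it is not imported above). It fits the model on the two features and overlays the predicted probability surface on the scatter plot, with the decision boundary at the 0.5 contour.

# Sketch: fit a logistic regression model with scikit-learn (assumed available).
from sklearn.linear_model import LogisticRegression

X = data[["X1", "X2"]].to_numpy()
Y = data["Y"].to_numpy()

model = LogisticRegression()
model.fit(X, Y)
print("training accuracy:", model.score(X, Y))

# Evaluate the predicted probability of the positive class over a grid and
# overlay the original scatter plot; the decision boundary is the 0.5 contour.
u = np.linspace(X[:, 0].min() - 1, X[:, 0].max() + 1, 100)
v = np.linspace(X[:, 1].min() - 1, X[:, 1].max() + 1, 100)
U, V = np.meshgrid(u, v)
proba = model.predict_proba(np.column_stack([U.ravel(), V.ravel()]))[:, 1].reshape(U.shape)
go.Figure([
    go.Contour(x=u, y=v, z=proba, colorscale="RdBu", opacity=0.5, showscale=False),
    pos_scatter, neg_scatter
])

Because the labels are a deterministic function of X1, the fitted boundary should be close to the vertical line X1 = 0 and the training accuracy should be near 1.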