Perceptron

perceptron.png

\begin{align} y = \begin{cases} 0 & (w_1x_1 + w_2x_2 \le \theta) \\ 1 & (w_1x_1 + w_2x_2 > \theta) \end{cases} \end{align}
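
To make the threshold rule concrete, here is a minimal sketch of a two-input perceptron. The weights w1 = w2 = 0.5 and threshold theta = 0.7 are illustrative assumptions, chosen so that the unit behaves as an AND gate.

def perceptron(x1, x2, w1=0.5, w2=0.5, theta=0.7):
    # Fire (output 1) only when the weighted sum exceeds the threshold.
    tmp = w1 * x1 + w2 * x2
    return 1 if tmp > theta else 0

# With these example weights the perceptron computes AND.
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print((x1, x2), '->', perceptron(x1, x2))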

How a Perceptron Learns

perceptron_vector.png

Classification rule: $\mathrm{sign}(W^{\top}X)$

updatedVector.png

updatedVector2.png
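
The figures above depict the classic perceptron learning rule: whenever $\mathrm{sign}(W^{\top}X)$ disagrees with a sample's label, the weight vector is pulled toward (or pushed away from) that sample. A minimal sketch, assuming labels in {-1, +1} and a learning rate of 1; the function name perceptron_train and the toy data are illustrative.

import numpy as np

def perceptron_train(X, y, epochs=10):
    # X: (n_samples, n_features), y: labels in {-1, +1}.
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            # A sample is misclassified when the sign of w.x disagrees with y.
            if np.sign(w @ xi) != yi:
                w += yi * xi  # move w toward (label +1) or away from (label -1) xi
    return w

# Toy linearly separable data:
X = np.array([[2.0, 1.0], [1.0, 3.0], [-1.0, -2.0], [-2.0, -1.0]])
y = np.array([1, 1, -1, -1])
print(perceptron_train(X, y))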

Perceptron with a Bias Term

bias_perceotron.png

\begin{align} y = h(b + w_1x_1 + w_2x_2) \end{align}

\begin{align} h(x) = \begin{cases} 0 & (x \le 0) \\ 1 & (x > 0) \end{cases} \end{align}
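
Renaming the threshold $\theta$ as a bias $b = -\theta$ moves it to the left-hand side, so the firing condition becomes $b + w_1x_1 + w_2x_2 > 0$. A minimal numpy sketch of this form, reusing the illustrative AND-gate weights from above:

import numpy as np

def perceptron_with_bias(x, w, b):
    # The bias b plays the role of -theta: fire when b + w.x > 0.
    return 1 if np.sum(w * x) + b > 0 else 0

w = np.array([0.5, 0.5])  # illustrative weights (same AND gate as before)
b = -0.7                  # bias = -theta
print(perceptron_with_bias(np.array([1, 1]), w, b))  # -> 1
print(perceptron_with_bias(np.array([1, 0]), w, b))  # -> 0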

Activation Function

Step Function

import numpy as np
import matplotlib.pyplot as plt

def step_function_np(x):
    # Elementwise comparison yields a boolean array; cast it to 0/1 integers.
    y = x > 0
    return y.astype(int)

x = np.arange(-5.0, 5.0, 0.1)
y = step_function_np(x)
plt.plot(x, y)
plt.ylim(-0.1, 1.1)  # small margin around the 0/1 outputs
plt.show()

stepFunction.png

Sigmoid Function

\begin{align} h(x) = \frac{1}{1+\exp(-x)} \end{align}

import numpy as np
import matplotlib.pyplot as plt

def sigmoid(x):
    # Smoothly squashes any real input into the open interval (0, 1).
    return 1 / (1 + np.exp(-x))

x = np.arange(-5.0, 5.0, 0.1)
y = sigmoid(x)
plt.plot(x, y)
plt.ylim(-0.1, 1.1)
plt.show()

sigmoid.png

ReLU Function

\begin{align} h(x) = \begin{cases} x & (x > 0) \\ 0 & (x \le 0) \end{cases} \end{align}

import numpy as np
import matplotlib.pyplot as plt

def relu(x):
    # Elementwise max(0, x): passes positive inputs through, zeroes the rest.
    return np.maximum(0, x)

x = np.arange(-5.0, 5.0, 0.1)
y = relu(x)
plt.plot(x, y)
plt.ylim(-1.0, 5.5)  # wide enough that the linear ramp is not clipped
plt.show()

ReLU.png

References

Lecture: CMU Introduction to Deep Learning
Code: Deep Learning from Scratch (밑바닥부터 시작하는 딥러닝)