Chapter 5: Autoencoders
Activity 8: Modeling Neurons with a ReLU Activation Function
Solution:
- Import numpy and matplotlib:

  ```python
  import numpy as np
  import matplotlib.pyplot as plt
  ```
- Allow LaTeX symbols to be used in labels (note that usetex=True requires a working LaTeX installation; skip this step to fall back on matplotlib's built-in mathtext):

  ```python
  plt.rc('text', usetex=True)
  ```
- Define the ReLU activation function as a Python function:

  ```python
  def relu(x):
      # Return x for positive inputs and 0 otherwise
      return np.max((0, x))
  ```
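  As a quick sanity check (an illustrative addition, not part of the original solution, and assuming the import and relu definition above), the function can be probed with a few scalar values:

  ```python
  # Negative inputs are clipped to 0; positive inputs pass through unchanged
  assert relu(-2.0) == 0.0
  assert relu(3.0) == 3.0
  ```

  Note that np.max((0, x)) only works for scalar x; np.maximum(0, x) is the vectorized equivalent for whole arrays.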
- Define the inputs (x) and tunable weights (theta) for the neuron. In this example, the inputs (x) will be 100 numbers linearly spaced between -5 and 5. Set theta = 1:

  ```python
  theta = 1
  x = np.linspace(-5, 5, 100)  # 100 evenly spaced values from -5 to 5
  x
  ```
The output is as follows:
Figure 5.35: Printing the inputs
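A minimal check of the array's endpoints and length (an illustrative addition, not in the original solution):

```python
# The array should span -5 to 5 in 100 steps
print(x[0], x[-1], len(x))  # -5.0 5.0 100
```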
- Compute the output (y):

  ```python
  y = [relu(_x * theta) for _x in x]  # apply the neuron to every input value
  ```
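  Equivalently, the loop can be avoided with NumPy's vectorized maximum; this is a stylistic alternative, not the book's listing:

  ```python
  # Vectorized equivalent of the list comprehension above
  y_vec = np.maximum(0, x * theta)
  assert np.allclose(y, y_vec)
  ```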
- Plot the output of the neuron versus the input:

  ```python
  fig = plt.figure(figsize=(10, 7))
  ax = fig.add_subplot(111)
  ax.plot(x, y)
  ax.set_xlabel('$x$', fontsize=22)
  ax.set_ylabel(r'$h(x\Theta)$', fontsize=22)
  # Move the left spine to x = 0 so the curve is centered on the origin
  ax.spines['left'].set_position(('data', 0))
  # The original listing is truncated here; a plausible completion hides the
  # remaining decorative spines and renders the figure
  ax.spines['top'].set_visible(False)
  ax.spines['right'].set_visible(False)
  plt.show()
  ```
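Since theta is the neuron's tunable weight, it is instructive to see how changing it rescales the ReLU response. The following sketch (an illustrative extension, not part of the original activity) overlays the output for a few example weights:

```python
# Overlay the neuron's output for several weight values (illustrative)
fig = plt.figure(figsize=(10, 7))
ax = fig.add_subplot(111)
for theta in (0.5, 1, 2):
    y_theta = [relu(_x * theta) for _x in x]
    ax.plot(x, y_theta, label=r'$\Theta = %s$' % theta)
ax.set_xlabel('$x$', fontsize=22)
ax.set_ylabel(r'$h(x\Theta)$', fontsize=22)
ax.legend(fontsize=16)
plt.show()
```

Larger weights steepen the positive branch of the curve, while the output remains 0 for all negative pre-activations.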