| Training Example | x1 | x2 | Class |
|------------------|----|----|-------|
| a.               | 0  | 1  | -1    |
| b.               | 2  | 0  | -1    |
| c.               | 1  | 1  | +1    |
Initial weights: w0 = -1.5, w1 = 0, w2 = 2
Apply the Perceptron Learning Rule to these training examples, starting from the weights above. In your answer, you should clearly indicate the new weight values at the end of each training step.
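A minimal sketch of one pass of the Perceptron Learning Rule on these examples. The learning rate of 1, the sign activation with the tie at 0 broken towards +1, and the constant input x0 = 1 (so that w0 acts as the bias) are assumptions, not given in the exercise:

```python
# Perceptron learning sketch. Assumptions (not stated in the exercise):
# learning rate eta = 1, activation sign(w.x) with sign(0) taken as +1,
# and a constant input x0 = 1 so that w0 acts as the bias weight.
examples = [
    ((0, 1), -1),  # a.
    ((2, 0), -1),  # b.
    ((1, 1), +1),  # c.
]
w = [-1.5, 0.0, 2.0]  # [w0, w1, w2] as given
eta = 1.0

def predict(w, x):
    s = w[0] + w[1] * x[0] + w[2] * x[1]
    return 1 if s >= 0 else -1

for x, target in examples:
    if predict(w, x) != target:
        # update: w <- w + eta * target * x, with x0 = 1 for the bias
        w[0] += eta * target
        w[1] += eta * target * x[0]
        w[2] += eta * target * x[1]
    print(f"after {x}: w = {w}")
```

Tracing this by hand is exactly what the question asks for; the code only serves to check your arithmetic.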
Construct by hand a Neural Network (or Multi-Layer Perceptron) that computes the XOR function of two inputs. Make sure the connections, weights and biases of your network are clearly visible.
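One standard construction, sketched here with threshold units: two hidden units compute OR and NAND of the inputs, and the output unit computes the AND of the two hidden units. These particular weights are one valid choice among many, not the required answer:

```python
def step(s):
    # Heaviside threshold unit: fires iff the weighted sum is non-negative
    return 1 if s >= 0 else 0

def xor_mlp(x1, x2):
    # Hidden unit 1: x1 OR x2   (weights 1, 1; bias -0.5)
    h1 = step(1 * x1 + 1 * x2 - 0.5)
    # Hidden unit 2: NAND(x1, x2)  (weights -1, -1; bias 1.5)
    h2 = step(-1 * x1 - 1 * x2 + 1.5)
    # Output unit: h1 AND h2   (weights 1, 1; bias -1.5)
    return step(1 * h1 + 1 * h2 - 1.5)
```

XOR is true exactly when at least one input is on (OR) but not both (NAND), which is why the AND of these two hidden units works.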
Challenge:
Can you construct a Neural Network that computes XOR using only one
hidden unit, but with shortcut connections from the two inputs
directly to the (single) output unit?
Hint: start with a network that computes the inclusive OR, and then try to think of how it could be modified.
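One way the hint plays out, sketched below (these particular weights are an assumption, one solution among several): the shortcut connections give the output an inclusive OR of the inputs, and the single hidden unit computes AND, feeding a strong negative weight into the output to suppress the (1, 1) case:

```python
def step(s):
    # Heaviside threshold unit: fires iff the weighted sum is non-negative
    return 1 if s >= 0 else 0

def xor_shortcut(x1, x2):
    # Single hidden unit: x1 AND x2   (weights 1, 1; bias -1.5)
    h = step(1 * x1 + 1 * x2 - 1.5)
    # Output unit: shortcut weights 1, 1 from the inputs (an inclusive OR),
    # plus weight -2 from the hidden AND unit; bias -0.5.
    # The -2 outweighs the two shortcut inputs, turning the output off
    # exactly when both inputs are on.
    return step(1 * x1 + 1 * x2 - 2 * h - 0.5)
```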
Assuming False=0 and True=1, explain how each of the following could be constructed:
Hint: in each case, first decide on the input-to-output or input-to-hidden weights, then determine the bias.
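As an illustration of the hint's procedure, here is a sketch for a two-input AND unit (AND is an assumed example here, since the list of functions to construct is not reproduced above): first fix the input weights, then solve inequalities for the bias.

```python
def step(s):
    # Heaviside threshold unit: fires iff the weighted sum is non-negative
    return 1 if s >= 0 else 0

# Step 1: choose input weights, e.g. w1 = w2 = 1.
# Step 2: determine the bias b so the unit fires only for (1, 1):
#   fire on (1, 1):        1 + 1 + b >= 0  =>  b >= -2
#   stay off on (0, 1):        1 + b <  0  =>  b <  -1
# so any -2 <= b < -1 works; b = -1.5 is a convenient choice.
def and_unit(x1, x2):
    return step(1 * x1 + 1 * x2 - 1.5)
```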