Chapter 4 – Become an Unconventional Innovator
1. Can the perceptron alone solve the XOR problem? (Yes | No)
The answer was no in 1969. A multilayer neural network, or some other mathematical process, is necessary to solve this problem. For the record, the same problem arises in electric circuits that carry signals in a single "feedforward" direction, and it was solved there long ago.
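This can be checked empirically. The following sketch (a hypothetical demonstration, not code from the chapter) brute-forces a grid of candidate lines w1*x1 + w2*x2 + b and verifies that no single threshold unit classifies all four XOR points correctly:

```python
def step(z):
    """Threshold activation of a single perceptron."""
    return 1 if z > 0 else 0

# The four XOR input/output pairs.
xor_data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def perceptron_solves_xor(w1, w2, b):
    return all(step(w1 * x1 + w2 * x2 + b) == y for (x1, x2), y in xor_data)

# Scan weights and bias from -5.0 to 5.0 in steps of 0.25.
grid = [i / 4 for i in range(-20, 21)]
found = any(perceptron_solves_xor(w1, w2, b)
            for w1 in grid for w2 in grid for b in grid)

print(found)  # False: no single decision line separates XOR
```

No matter how fine the grid, `found` stays `False`, because no straight line can put (0, 1) and (1, 0) on one side and (0, 0) and (1, 1) on the other.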
2. Is the XOR function linearly separable? (Yes | No)
The answer is no for a single neuron: no straight line can separate XOR's outputs. With a hidden layer of at least two neurons, however, the network transforms the inputs into a representation that is separable. That is a major problem to address in deep learning. If you cannot separate the features of a face, for example, in a picture, recognizing that face will prove difficult. Imagine a picture with one half of the face in shadow and the other half in bright sunlight. Since the eye and features of one half are in shadow, a poor deep learning program might only capture half of the face, separating the face in the wrong place with a poor edge detection function. Linear separability is thus a key aspect of machine learning.
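A two-neuron hidden layer solving XOR can be wired by hand. In this sketch (the weights are illustrative choices, not values from the book), hidden neuron `h1` acts as an OR gate, `h2` as an AND gate, and the output computes h1 AND (NOT h2), which is exactly XOR:

```python
def step(z):
    """Threshold activation."""
    return 1 if z > 0 else 0

def xor_net(x1, x2):
    h1 = step(x1 + x2 - 0.5)    # hidden neuron 1: OR gate
    h2 = step(x1 + x2 - 1.5)    # hidden neuron 2: AND gate
    return step(h1 - h2 - 0.5)  # output: h1 AND (NOT h2) = XOR

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, "->", xor_net(x1, x2))  # outputs 0, 1, 1, 0
```

The hidden layer is what makes the problem separable: each hidden neuron draws its own line, and the output neuron separates the resulting intermediate representation.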
3. One of the main goals of layers in a neural network is classification. (Yes | No)
The answer is yes. Once the data is identifiable with a given neural network architecture, predictions and many other functions become possible. The key to deep learning is to be able to transform data into pieces of information that will make sense.
4. Is deep learning the only way to classify data? (Yes | No)
The answer is no. You can classify data with an SQL query, artificial intelligence, machine learning, and standard source code. Deep learning becomes vital when many dimensions of classification are involved: first finding the edges of objects in a picture, then forms, and then determining what the object represents. To do this with millions of pictures is beyond the scope of standard programming or early AI and ML programs.
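When the classes are defined by a few explicit rules, plain source code is enough, and no learning is involved. A hypothetical example (the function and categories are invented for illustration):

```python
def classify_order(weight_kg, express):
    """Rule-based classification: no training, just explicit conditions."""
    if express:
        return "priority"
    if weight_kg > 20:
        return "freight"
    return "standard"

print(classify_order(3, express=True))    # priority
print(classify_order(35, express=False))  # freight
print(classify_order(3, express=False))   # standard
```

Deep learning earns its keep only when rules like these cannot be written down, such as recognizing objects across millions of pictures.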
5. A cost function shows the increase in the cost of a neural network. (Yes | No)
The answer is no. A cost function estimates how much the training costs you: at each epoch, it measures how far the system is from its goal. Running 100,000 epochs is more expensive than running 50,000 epochs, so this cost of training must be estimated at every step. Successful training thus makes the cost decrease as the epochs go by.
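The idea of a cost shrinking at every epoch can be sketched in a few lines. In this minimal, illustrative example (not code from the chapter), gradient descent fits y = w * x to samples of y = 2x, and the squared-error cost decreases at every step:

```python
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # samples of y = 2x

def cost(w):
    """Mean squared error: how far the model is from its goal."""
    return sum((w * x - y) ** 2 for x, y in data) / len(data)

w, lr = 0.0, 0.05  # initial weight and learning rate (arbitrary choices)
history = []
for epoch in range(20):
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad
    history.append(cost(w))

print(all(a >= b for a, b in zip(history, history[1:])))  # True: cost never increases
```

Plotting `history` for a real network gives the familiar downward training curve: the cost function itself does not decrease anything, but good training drives its value down.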
6. Can simple arithmetic be enough to optimize a cost function? (Yes | No)
The answer is yes. As long as you can tell whether your cost function is increasing or decreasing, and by roughly how much, anything that works is fine.
7. A feedforward network requires inputs, layers, and an output. (Yes | No)
The answer is yes. Without layers, there is no network.
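A minimal forward pass shows all three ingredients at once: inputs flow through one hidden layer to an output, with no feedback loops. The weights below are arbitrary illustrative values, not a trained network:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(inputs, hidden_w, hidden_b, out_w, out_b):
    """One feedforward pass: inputs -> hidden layer -> single output."""
    hidden = [sigmoid(sum(w * x for w, x in zip(ws, inputs)) + b)
              for ws, b in zip(hidden_w, hidden_b)]
    return sigmoid(sum(w * h for w, h in zip(out_w, hidden)) + out_b)

y = forward([1.0, 0.0],
            hidden_w=[[0.5, -0.3], [0.8, 0.2]], hidden_b=[0.1, -0.1],
            out_w=[1.2, -0.7], out_b=0.05)
print(0.0 < y < 1.0)  # True: the sigmoid keeps the output in (0, 1)
```

Remove the hidden layer and `forward` collapses into a single neuron: without layers, there is no network.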
8. A feedforward network always requires training with backpropagation. (Yes | No)
The answer is often yes in changing environments. Since the field is new, we tend to think that once the training is done, the work is done. If the datasets are very stable in a repetitive environment, such as recognizing the difference between various constant products in a shop, warehouse, or factory, then the neural network will do the classification it is designed for. If new products are introduced, then training can be initiated again.
9. In real-life applications, solutions are only found by respecting academic theory. (Yes | No)
The answer is no. Without academic research, deep learning would not even exist; without the ideas produced by universities, the methods used would be too simple to ever work well. On the other hand, researchers need real-life feedback. If we find new ways of doing the things they recommend, we should publish them to help global research. It's a two-way street.