Pain Point #5: Utilizing alternate sources for trained images
Sometimes there are simply not enough resources available to train a convolutional neural network from scratch. The shortfall may be computational (limited processing power) or data-related (too few labeled images). In situations like these, we can rely on other sources, such as pre-trained models, to help us classify our images.
Getting ready
The technique of using a pre-trained model as the starting point for classifying a different dataset is referred to as transfer learning. The advantage is that much of the computational work of training on images is outsourced to the pre-trained model, which has already learned useful features. Transfer learning has recently become a common extension of deep learning.
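As a rough illustration, the snippet below is a minimal sketch of how a pre-trained model can be reused as a feature extractor, assuming TensorFlow/Keras with the publicly available VGG16 ImageNet weights; the new classifier head and `num_classes` are placeholders for your own dataset.

```python
# A minimal sketch of transfer learning with a pre-trained convolutional base.
# Assumes TensorFlow/Keras is installed; VGG16/ImageNet is just one possible choice.
from tensorflow.keras.applications import VGG16
from tensorflow.keras import layers, models

# Load the convolutional base pre-trained on ImageNet, dropping its classifier head.
base_model = VGG16(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base_model.trainable = False  # freeze the pre-trained weights so only the new head trains

# Attach a small classifier head for the new dataset (num_classes is a placeholder).
num_classes = 10
model = models.Sequential([
    base_model,
    layers.GlobalAveragePooling2D(),
    layers.Dense(256, activation="relu"),
    layers.Dense(num_classes, activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
model.summary()
```

Freezing the base means most of the training cost has already been paid by whoever trained the original model; only the small head on top is fitted to the new images.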
How to do it...
This section explains how the process of transfer learning works.
- Collect a series of datasets or images that you are interested in classifying, just as you would with traditional machine learning or deep learning.
- Split the dataset into training and testing sets, such as 75/25 or 80/20 (a minimal split sketch follows this list)....
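The following is a minimal sketch of an 80/20 split, assuming scikit-learn is available and the images have already been loaded into NumPy arrays; `X`, `y`, and the array shapes are placeholder names used only for illustration.

```python
# A minimal sketch of an 80/20 train/test split for image classification data.
import numpy as np
from sklearn.model_selection import train_test_split

# Stand-in image data and labels; in practice these come from your collected dataset.
X = np.random.rand(1000, 224, 224, 3)    # 1,000 images of size 224x224x3
y = np.random.randint(0, 10, size=1000)  # integer class labels for 10 classes

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y  # stratify keeps class balance
)
print(X_train.shape, X_test.shape)
```

Stratifying on the labels keeps the class proportions roughly the same in both splits, which matters when some classes have few examples.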