If you’re a developer expecting deep, expert insights into machine learning and neural networks, this “Make Your Own Neural Network in Python” review is not for you.
But if you are a tech addict who likes reading and learning about new software, then this will most likely hit the spot. If you’d like to improve your general knowledge of the subject, you may find this reading experience useful as well.
The Special Spot for Neural Networks in Machine Learning
Alongside developments in the complex field of AI in recent years, machine learning (ML) has gone from an emerging technology to a mainstream one. It will become even more essential as the years go by.
In parallel, I’ve asked myself many questions about what machine learning is and what all the fuss is about – will computers truly be able to learn as humans do? Can they replicate learning from an example? Are we looking at a doomsday scenario where machines take over?
I was sure that this last scenario was unlikely to take place, but I was still curious about the essential differences between machine learning (ML) algorithms and conventional code.
Although conventional algorithms are not that simple, they can be packed into neat boxes for classification purposes.
How Machine Learning Algorithms Are Different from the Rest of Code
But a machine learning algorithm is one of a kind. It doesn’t require continuous instructions from the coder – instead, it teaches itself by learning from examples through trial and error.
And as the machine works through the input data, it improves by a kind of sieving process: it leaves the coarse mistakes behind and makes fewer and smaller mistakes as it moves toward the desired output.
This is, of course, an analogy and not an expert elaboration of how machines learn. It is an aspect that relates to neural networks, as well.
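To make the analogy a little more concrete, here is a toy sketch of my own (not code from the book): a single weight is nudged after every example in whichever direction shrinks its error, and the mistakes get smaller with each pass.

```python
# A minimal "learning from mistakes" sketch: guess a weight w so that
# w * x approximates y, and nudge w to shrink the error on each example.

def learn_weight(pairs, steps=200, lr=0.01):
    """Fit y ~ w * x by repeatedly reducing the squared error."""
    w = 0.0
    for _ in range(steps):
        for x, y in pairs:
            error = w * x - y       # how wrong the current guess is
            w -= lr * error * x     # nudge w in the direction that helps
    return w

examples = [(1, 2), (2, 4), (3, 6)]  # hidden rule: y = 2 * x
print(round(learn_weight(examples), 3))  # converges near 2.0
```

Nobody told the program the rule was “multiply by two”; it sieved its way there from examples alone, which is the essential difference from conventional code.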
With visual aids, sketches, and diagrams, the authors Michael Taylor and Mark Koning explain a substantial segment of machine learning – neural networks, a sub-segment of deep learning. If you don’t have a programming background (and I don’t), it won’t be easy to delve into the details.
Nevertheless, this is not only a coding book but also a text that ventures into the math and logic of neural networks, and it can be interesting for someone with a mathematical or data science background.
Neural Networks Architecture
Again, if you are not a Python developer and have no other experience creating algorithms, you will still find the book useful, though perhaps only partially.
However, you do need some knowledge of higher-level math (matrices and complex functions) and statistics (regression principles, for instance) to decipher the logic of neural networks explained in this text.
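To show why matrices come up at all, here is a hedged illustration of my own in plain Python (the book’s actual code will differ): passing inputs through one layer of a network is just a matrix–vector product followed by an activation function.

```python
# One layer's forward pass: weight matrix times input vector, then sigmoid.
import math

def layer_forward(weights, inputs):
    """Return the layer's outputs for a given input vector."""
    outputs = []
    for row in weights:  # one row of weights per output node
        total = sum(w * x for w, x in zip(row, inputs))
        outputs.append(1 / (1 + math.exp(-total)))  # sigmoid squashing
    return outputs

W = [[0.9, 0.3],   # weights into output node 1
     [0.2, 0.8]]   # weights into output node 2
print(layer_forward(W, [1.0, 0.5]))
```

In practice this loop is written as a single matrix multiplication, which is exactly why the book leans on matrix notation.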
Here is what I gained from this book:
- An understanding of the concentric circles around neural networks: they sit at the center, AI forms the outermost layer, and machine learning and deep learning are positioned in between.
- Key neural network terminology – nodes, synapses, connections, layers – and the ability to identify synonyms, such as node = neuron.
- Why deep learning is called “deep” and what hidden, input, and output layers are.
- What supervised, unsupervised, and semi-supervised machine learning are.
- The critical value of partial derivatives and why you need to understand them before picking up this book to build on that knowledge and make your own network.
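On that last point, here is a small example of my own (not taken from the book) of what a partial derivative buys you: training adjusts each weight using the partial derivative of the error with respect to that weight, and you can check the analytic formula against a numerical estimate.

```python
# Partial derivative of a squared error with respect to a single weight,
# verified against a finite-difference estimate.

def loss(w, x, y):
    """Squared error of a one-weight model: (w*x - y)**2."""
    return (w * x - y) ** 2

def analytic_grad(w, x, y):
    """d(loss)/dw worked out by hand: 2*(w*x - y)*x."""
    return 2 * (w * x - y) * x

def numeric_grad(w, x, y, h=1e-6):
    """Finite-difference estimate of the same derivative."""
    return (loss(w + h, x, y) - loss(w - h, x, y)) / (2 * h)

w, x, y = 0.5, 3.0, 2.0
print(analytic_grad(w, x, y))   # -3.0
print(numeric_grad(w, x, y))    # approximately -3.0
```

The two numbers agree, and that derivative is precisely the signal a network uses to decide which way to nudge each weight.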
Thanks to my never-ending interest in psychology, this point was intriguing for me. Neural networks are structured to mirror the neurons and synapses of the human brain, hence the similar terminology.
I haven’t seen many visual presentations of how neural networks work, but I like this one. Once I was done reading, it became clear that this field has only scratched the surface and that skilled experts have many new insights to uncover before we have a better grasp of it.
Now, will I be able to make my own neural network? Absolutely not. But will I read about technology development and innovations with a bit more confidence and acumen behind me? A definite yes.