## An amazing introduction to how Deep Learning works under the hood, and a small glance inside the black box of Artificial Neural Networks: Grokking Deep Learning!

The following is a review of the book *Grokking Deep Learning* by Andrew Trask.

## Review

We like to think of **Grokking Deep Learning** as the ‘*Hello World*’ of Deep Learning and Deep Artificial Neural Networks.

This beautifully crafted text teaches you how to build Deep Neural Networks from scratch using a first-principles approach: you code and understand the most basic building blocks of ANNs yourself, without relying on a pre-made framework like TensorFlow or Keras.

All of this with very little math, so it is perfect for those who are afraid of the scary monsters of Calculus or Linear Algebra and want to start learning about Deep Learning (sorry for the redundancy) in an easy, engaging, and practical manner.

We’ve said it before with books like Deep Learning from Scratch: this bottom-up, code-from-zero approach is great for understanding how the guts of these models function individually, and also how they are integrated to form these incredible learning and prediction systems.

Using Python and NumPy, you will build and train Artificial Neural Networks that can understand images, translate text, and even generate poetry! As we’ve said, this will lay out the core concepts of how ANNs work, so that you can feel perfectly comfortable using any Deep Learning framework afterwards.
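To give a flavor of this from-scratch style, here is a minimal sketch of our own (not code from the book) of a single neuron learning one weight with plain gradient descent in NumPy:

```python
import numpy as np

# Toy task: learn a weight w so that prediction = w * input matches the goal.
# The true relationship here is goal = 2 * input, so w should converge to 2.
inputs = np.array([0.5, 1.0, 1.5])
goals = np.array([1.0, 2.0, 3.0])

w = 0.0        # start with an uninformed weight
alpha = 0.1    # learning rate

for _ in range(200):
    preds = w * inputs                            # forward propagation
    grad = (2 * (preds - goals) * inputs).mean()  # d(mean squared error)/dw
    w -= alpha * grad                             # gradient descent step

print(round(w, 3))  # -> 2.0
```

That ten-line loop is the whole idea in miniature: predict, measure error, nudge the weight downhill. The book builds everything else, from multi-layer networks to backpropagation, out of steps like this one.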

Overall, Grokking Deep Learning is a perfect place to begin the path to building your own Deep Learning applications without burning your brain or simply copy-pasting lines of code from a premade framework. It will get you to understand everything that happens in a Deep Artificial Neural Network, so that you can later on best leverage the power, flexibility, and time-saving capabilities of the most popular frameworks like TensorFlow, Keras, or PyTorch.

## Contents of Grokking Deep Learning

The contents of the book are the following:

- Introducing deep learning: why you should learn it
- Fundamental concepts: how do machines learn?
- Introduction to neural prediction: forward propagation
- Introduction to neural learning: gradient descent
- Learning multiple weights at a time: generalizing gradient descent
- Building your first deep neural network: introduction to backpropagation
- How to picture neural networks: in your head and on paper
- Learning signal and ignoring noise: introduction to regularization and batching
- Modeling probabilities and nonlinearities: activation functions
- Neural learning about edges and corners: intro to convolutional neural networks
- Neural networks that understand language: king – man + woman == ?
- Neural networks that write like Shakespeare: recurrent layers for variable-length data
- Introducing automatic optimization: let’s build a deep learning framework
- Learning to write like Shakespeare: long short-term memory
- Deep learning on unseen data: introducing federated learning
- Where to go from here: a brief guide
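The famous ‘king – man + woman == ?’ question from the chapter on language can be illustrated with a hand-made toy example of our own (not data from the book), where each word vector encodes two made-up features, roughly ‘royalty’ and ‘gender’:

```python
import numpy as np

# Hand-crafted toy embeddings (our own illustration, not the book's data):
# dimension 0 ~ "royalty", dimension 1 ~ "gender".
vectors = {
    "king":  np.array([1.0, 1.0]),
    "queen": np.array([1.0, -1.0]),
    "man":   np.array([0.0, 1.0]),
    "woman": np.array([0.0, -1.0]),
    "boy":   np.array([0.2, 1.0]),
    "apple": np.array([-1.0, 0.0]),
}

# Vector arithmetic: start at "king", remove "man"-ness, add "woman"-ness.
target = vectors["king"] - vectors["man"] + vectors["woman"]

# The answer is the nearest remaining word by Euclidean distance.
candidates = {w: v for w, v in vectors.items() if w not in ("king", "man", "woman")}
answer = min(candidates, key=lambda w: np.linalg.norm(candidates[w] - target))
print(answer)  # -> queen
```

Real embeddings learned from text behave the same way, just in hundreds of dimensions whose ‘features’ are discovered by the network rather than written by hand.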

Throughout the book you will find code examples and snippets that are carefully explained, and that won’t leave you asking ‘*what the hell does that line do*’, as happens with other texts.

Also, there is an official GitHub repo with the notebooks and code used in the book. You can find it here: GitHub Repository of Grokking Deep Learning.

Lastly, the official website for the book can be found at the following link: Manning Publications: Grokking Deep Learning.

## About the book

**Author:** Andrew Trask, research scientist at DeepMind and PhD student at the University of Oxford. You can find his Twitter account at @iamtrask.

**Pages**: The book is 336 pages long. Despite covering a lot of content, it is not one of those textbooks that make your backpack feel like you are carrying stones.

**Publication:** January 2019

## Prerequisites

This book is for readers who have at least high-school-level math knowledge and are quite comfortable with programming in Python. If you don’t have either of those, don’t worry, we’ve got you covered.

First, check out the Deep Learning or Maths for Machine Learning sections of our Tutorials category to see which of the awesome resources out there fits you best.

Also, if you are not at an intermediate level of Python yet, take a look at our Python books or Python online courses categories to find free and paid material that can guide you on your path to improving your knowledge of this great programming language.

## Summary of Grokking Deep Learning

Grokking Deep Learning is the perfect place to begin your deep learning journey. By building the main components of Artificial Neural Networks from scratch, you will learn their under-the-hood details while appreciating the benefits a framework can provide, all with a very thin mathematical layer on top.

It is a wonderful discussion of the mechanics of what happens inside neural networks, written without any complex mathematical jibber-jabber, and digestible with only a little previous knowledge of Python programming and even less math. Bon appétit!

You can find the book on Amazon at the best price here.

For other texts from the same series, check out:

- Grokking Artificial Intelligence Algorithms: Understand and apply the core algorithms of deep learning and artificial intelligence in this friendly illustrated guide including exercises and examples
- Grokking Deep Reinforcement Learning
- Grokking Algorithms: An Illustrated Guide for Programmers and Other Curious People

Thank you so much for reading How to Learn Machine Learning. Have a wonderful day!

Some further resources from and about the book:

- Grokking Deep Learning GitHub repo.
- Andrew Trask’s personal website.

**Tags:** *Grokking Deep Learning, Deep Learning Illustrated, Deep Learning Book, Machine Learning Book.*
