Udacity Deep Learning Nanodegree Notes and Thoughts [Lesson 2]

Sachi Parikh
4 min read · Jun 29, 2018
My Code for the Perceptron Algorithm Quiz
Gradient Descent and Activation Functions

Lessons 2.1–2.7:

Course Content:

The course starts out by defining machine learning at an elementary level: separating two-dimensional data with a line of the form Wx + b, and later, separating data in 3, 4, or n dimensions. You see visually how data flows through a perceptron, and how you can combine perceptrons to separate data that is not linearly separable, like using the AND and NOT perceptrons to create the XOR perceptron.
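To make the idea concrete, here is a minimal sketch of that construction. The weights and biases below are one illustrative choice, not the course's exact values; note that XOR is built here by combining NAND (which AND and NOT give you) with OR and AND:

```python
import numpy as np

def perceptron(weights, bias):
    """Build a perceptron that fires when Wx + b >= 0 (step activation)."""
    def fire(x):
        return int(np.dot(weights, x) + bias >= 0)
    return fire

# Perceptrons for basic logic gates (one possible choice of W and b)
AND  = perceptron(np.array([1.0, 1.0]), -1.5)
OR   = perceptron(np.array([1.0, 1.0]), -0.5)
NAND = perceptron(np.array([-1.0, -1.0]), 1.5)  # NOT applied to AND

# XOR is not linearly separable, but combining perceptrons solves it:
def XOR(x):
    return AND(np.array([NAND(x), OR(x)]))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", XOR(np.array([a, b])))
```

Each gate on its own is just a line in the plane; XOR needs two of those lines working together, which is exactly why a single perceptron cannot learn it.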

We learned how Gradient Descent works, and how the activation function placed before the output layer determines whether the outputs are discrete or continuous. The most interesting part of this section was how you can combine multiple linear models into a model that classifies non-linear data, then combine those models into even more complicated ones inside the part of the neural net known as the hidden layer.
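The "combine linear models" trick can be sketched in a few lines: squash each linear model with a sigmoid, then feed those outputs into one more linear model plus sigmoid, which is exactly what a hidden layer does. The weights here are made up for illustration, not taken from the course:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def combined_model(x, y):
    """Two linear models, each squashed by a sigmoid, merged by a
    second linear model + sigmoid (a tiny one-hidden-layer network)."""
    model1 = sigmoid(5 * x - 2 * y - 1)   # first linear boundary
    model2 = sigmoid(-4 * x + 6 * y - 1)  # second linear boundary
    return sigmoid(1.5 * model1 + 1.5 * model2 - 1.0)

# Points inside both half-planes score higher than points outside both
print(combined_model(1.0, 1.0))
print(combined_model(0.0, 0.0))
```

Neither boundary alone can carve out the curved region the combined model produces; the non-linearity of the sigmoid between the layers is what makes the combination more expressive than any single line.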

Cross Entropy

Now, when it came time to code, something I especially liked about this lesson is that they explained how and why you use certain equations. So when I started coding the perceptrons, everything made complete sense, and I didn't have to go back and review the videos. They were able to teach higher-level concepts in a way I would never have understood just by reading documentation or an article.

Cross Entropy

Some of the more difficult concepts, like Gradient Descent and error functions, were a bit more challenging, and the instructors were aware of that. Instead of just giving you the equation and asking you to code it, they go through the math step by step. They ask you, the student, how you would help the neural net classify data with simple questions like: “What type of function turns a product into a sum?” or “What type of function turns all numbers positive?” Through this process, you build the algorithms yourself and gain a deeper understanding of how they work.
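Those two questions lead straight to cross-entropy: the logarithm turns a product of probabilities into a sum, and the negative sign makes every term positive. A minimal sketch (the example labels and probabilities are mine, not the course's):

```python
import numpy as np

def cross_entropy(y, p):
    """Cross-entropy for binary labels y and predicted probabilities p.
    -log turns the product of probabilities into a positive sum."""
    y, p = np.asarray(y, dtype=float), np.asarray(p, dtype=float)
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

# A confident, correct model has low cross-entropy...
print(cross_entropy([1, 0, 1], [0.9, 0.1, 0.8]))  # ≈ 0.43
# ...while a confident, wrong one has high cross-entropy.
print(cross_entropy([1, 0, 1], [0.1, 0.9, 0.2]))  # ≈ 6.21
```

Lower cross-entropy means the model assigns higher probability to what actually happened, which is exactly the quantity gradient descent then minimizes.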

The diagrams and animations in the videos are handy, especially if you have a hard time wrapping your head around why specific equations and concepts work the way they do. It's clear that they put a lot of time and effort into making the animations clear and understandable for students.

Error Function and Gradient Descent Math
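That step-by-step math boils down to one update rule. For a single sigmoid unit with cross-entropy error, the gradient simplifies so that each weight moves by the prediction error times the input; the toy data below is mine, chosen only to show the prediction improving:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_descent_step(x, y, weights, bias, learn_rate=0.1):
    """One gradient-descent update for a sigmoid unit with cross-entropy
    error: the gradient works out to (y_hat - y) * x, so we step against it."""
    y_hat = sigmoid(np.dot(weights, x) + bias)
    error = y - y_hat
    weights = weights + learn_rate * error * x
    bias = bias + learn_rate * error
    return weights, bias

# A few steps nudge the prediction toward the label (toy example)
w, b = np.array([0.0, 0.0]), 0.0
x, y = np.array([1.0, 2.0]), 1
for _ in range(20):
    w, b = gradient_descent_step(x, y, w, b)
print(sigmoid(np.dot(w, x) + b))  # well above the initial 0.5
```

The same rule, averaged over all training points and repeated many times, is the whole of the gradient descent algorithm the lesson builds up to.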

Thoughts:

From my experience with the course up through Lesson 2, I am extremely satisfied with the course material and the interactive way they teach the more math-intensive parts of deep learning. I also love how active the forums are: nearly every question you have has probably already been answered, and the Slack channels are always available for anything you want answered quickly. The projects are not that difficult if you keep up with the coursework and understand it well.

Combining Linear Models to Separate Non-Linear Data

Most of the time, if you can't understand something and there isn't an answer on the forums, just submit the code you have. The reviewers do a great job answering your questions and helping you improve your code for the next submission. They also provide extra links and documentation for specific concepts, based on the level of understanding they see in your code.

Different Types of Ways to Create Neural Networks

This course does require prior programming and Python knowledge: they expect you to code all the perceptrons, algorithms, and activation functions just from knowing how they operate. Without that experience, even the more straightforward coding exercises will be difficult. However, you don't need to be completely fluent in calculus, so don't let that scare you off. As a freshman in high school, I haven't learned calculus yet, but I was able to follow the gradient descent math well enough by watching a couple of videos here and there.

Overall, I am enjoying this course quite a lot and learning from Udacity has been an excellent experience for me so far. I am looking forward to the next lessons!
