Hyperparameter Tuning: Neural Networks 101

How you can improve the “learning” and “training” of neural networks through tuning hyperparameters

Neural-network icons created by Vectors Tank — Flaticon. https://www.flaticon.com/free-icons/neural

Background

In my previous post, we discussed how neural networks predict and learn from data. Two processes are responsible for this: the forward pass and the backward pass, also known as backpropagation. You can learn more about it here:

This post will dive into how we can optimise this “learning” and “training” process to increase the performance of our model. The areas we will cover are computational improvements and hyperparameter tuning, as well as how to implement them in PyTorch!

But, before all that good stuff, let’s quickly jog our memory about neural networks!

Quick Recap: What are Neural Networks?

Neural networks are large mathematical expressions that try to find the “right” function that can map a set of inputs to their corresponding outputs. An example of a neural network is depicted below:

A basic two-hidden-layer multilayer perceptron. Diagram by author.

Each hidden-layer neuron carries out the following computation:

The process carried out inside each neuron. Diagram by author.
  • Inputs: These are the features of our dataset.
  • Weights: Coefficients that scale the inputs. The goal of the algorithm is to find the most optimal coefficients through gradient descent.
  • Linear Weighted Sum: Sum up the products of the inputs and weights and add a bias/offset term, b (see the sketch after this list).
  • Hidden Layer: This is where multiple neurons are stored to learn patterns in the dataset. The superscript refers to the layer and the subscript to the number of the neuron in that layer.
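To make this concrete, below is a minimal PyTorch sketch of the computation described above. The feature values, the weights, the layer sizes, and the choice of a sigmoid activation are illustrative assumptions rather than values taken from the diagrams:

import torch
import torch.nn as nn

# The per-neuron computation: a linear weighted sum plus a bias term,
# followed by an activation (sigmoid is assumed here for illustration).
x = torch.tensor([0.5, -1.2, 3.0])   # inputs: the features of one sample
w = torch.tensor([0.1, 0.4, -0.7])   # weights: coefficients that scale the inputs
b = torch.tensor(0.2)                # bias/offset term, b

z = torch.dot(w, x) + b              # linear weighted sum
a = torch.sigmoid(z)                 # activation applied to the sum

# The same idea scaled up: a two-hidden-layer MLP like the one in the
# diagram above. The layer widths (4 neurons per hidden layer) are assumed.
model = nn.Sequential(
    nn.Linear(3, 4),   # input features -> first hidden layer
    nn.Sigmoid(),
    nn.Linear(4, 4),   # first hidden layer -> second hidden layer
    nn.Sigmoid(),
    nn.Linear(4, 1),   # second hidden layer -> output
)

output = model(x)

Note that nn.Linear bundles the weights and biases of every neuron in a layer into one matrix operation, so the per-neuron weighted sum never has to be written out by hand.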
