Believe in AdaBelief

Photo by Nadine Shaabana on Unsplash

Introduction

All types of neural networks, and many other machine learning algorithms, optimize their loss functions using gradient-based optimization algorithms. Several such algorithms, or optimizers, exist and are used to train models - RMSprop, Stochastic Gradient Descent (SGD), Adaptive Moment Estimation (Adam), and many more.

There are two primary metrics to look at while determining the efficacy of an optimizer:

  1. The speed of convergence, that is, how quickly the minimum of the loss function is reached.
  2. Generalization of the model, that is, how well the model performs on new, unseen data.

Adaptive algorithms like Adam have good convergence…
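To make the comparison concrete, here is a minimal NumPy sketch of the AdaBelief update rule from Zhuang et al., 2020, annotated at the one line where it differs from Adam. The function name and hyperparameter defaults are my own for illustration; the published algorithm also adds eps inside the s update and supports decoupled weight decay, both omitted here for clarity.

```python
import numpy as np

def adabelief_step(theta, grad, m, s, t, lr=1e-3,
                   beta1=0.9, beta2=0.999, eps=1e-8):
    # EMA of the gradient, exactly as in Adam
    m = beta1 * m + (1 - beta1) * grad
    # Adam tracks the EMA of grad**2; AdaBelief instead tracks how far the
    # gradient deviates from its EMA m -- the "belief" in the current gradient
    s = beta2 * s + (1 - beta2) * (grad - m) ** 2
    # Bias correction, as in Adam
    m_hat = m / (1 - beta1 ** t)
    s_hat = s / (1 - beta2 ** t)
    # Small s (gradient close to its EMA) -> confident, larger step;
    # large s (gradient far from its EMA) -> cautious, smaller step
    return theta - lr * m_hat / (np.sqrt(s_hat) + eps), m, s
```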


Photo by Kaleb Nimz on Unsplash

Introduction

We have all heard about Generative Adversarial Networks (GANs) and the amazing things that they can do. If you haven't, be sure to check out this incredibly interesting paper by Ian J. Goodfellow and co-authors that introduced GANs to the world: Generative Adversarial Networks.

One of the many applications of GANs is facial inpainting. In this post, we will be going over a rather interesting architecture discussed in an IEEE paper titled Face Inpainting via Nested Generative Adversarial Networks, which can be found here.
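For readers who want a concrete picture of the adversarial setup underlying any inpainting GAN, here is a minimal PyTorch sketch of the vanilla two-player training step from Goodfellow et al. The toy generator and discriminator below are hypothetical stand-ins; the nested architecture from the paper is considerably more elaborate.

```python
import torch
import torch.nn as nn

# Hypothetical toy networks: 64-dim noise in, flattened 28x28 images out
G = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 784), nn.Tanh())
D = nn.Sequential(nn.Linear(784, 128), nn.LeakyReLU(0.2), nn.Linear(128, 1))

opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

def train_step(real):  # real: (batch, 784) tensor of flattened images
    batch = real.size(0)
    z = torch.randn(batch, 64)

    # Discriminator step: push real images toward label 1, fakes toward 0
    fake = G(z).detach()  # detach so this step does not update G
    d_loss = bce(D(real), torch.ones(batch, 1)) + bce(D(fake), torch.zeros(batch, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator step: try to make D label the fakes as real (1)
    g_loss = bce(D(G(z)), torch.ones(batch, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    return d_loss.item(), g_loss.item()
```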

Note: Prior knowledge of deep learning concepts like Convolutional Neural Networks (CNNs) along with Generative Adversarial Networks (GANs)…

Kaustubh Mhaisekar

AI | Deep Learning | Data Science
