Gradient descent is an optimisation method for finding the minimum of a function. In deep learning, it is commonly used to update a neural network's weights, with the gradients computed via backpropagation.

In this post, I summarise the gradient descent optimisation algorithms used in popular deep learning frameworks (e.g. TensorFlow, Keras, PyTorch, Caffe). The aim is twofold: to make these algorithms easy to read and digest by using consistent nomenclature, since there aren't many such summaries out there, and to serve as a cheat sheet if you want to implement them from scratch.
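Before diving into the variants, it may help to see the basic update rule in code. The sketch below is a minimal, framework-free illustration (the function names and toy objective are mine, not from any library): it minimises f(w) = (w - 3)² by repeatedly stepping against the gradient, scaled by the learning rate.

```python
# Minimal sketch of vanilla gradient descent on a toy 1-D objective.
# Illustrative only: minimise f(w) = (w - 3)^2, whose gradient is 2 * (w - 3).

def gradient_descent(grad, w0, eta=0.1, steps=100):
    """Iterate w <- w - eta * grad(w) for a fixed number of steps."""
    w = w0
    for _ in range(steps):
        w = w - eta * grad(w)
    return w

w_min = gradient_descent(lambda w: 2 * (w - 3), w0=0.0)
print(w_min)  # converges towards the minimum at w = 3
```

In a real deep learning setting, `grad` would be the gradient of the loss with respect to the weights, as computed by backpropagation, and the variants summarised below differ mainly in how they transform this raw gradient before applying the update.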