My Deep Learning Workstation Setup

Lately, a lot of my friends have been asking about my deep learning workstation setup. In this post I am going to describe my hardware, OS, and the different packages that I use. In particular, judging by the questions, I found that most of the interest has been around managing different Python versions and libraries such as PyTorch and TensorFlow.

Reading Time: 9 minutes       Read more…

A Practical guide to Autoencoders

In a conventional neural network, one usually tries to predict a target vector $y$ from input vectors $x$. In an autoencoder network, one tries to predict $x$ from $x$. Learning a mapping from $x$ to $x$ is trivial if the network has no constraints, but once the network is constrained the learning process becomes far more interesting. In this article, we take a detailed look at the mathematics of different types of autoencoders (with different constraints), along with sample implementations using Keras with a TensorFlow back-end.
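For a taste of what the full post covers, here is a minimal sketch of an undercomplete autoencoder in Keras with a TensorFlow back-end. The 784-dimensional input, the 32-unit bottleneck, and the placeholder training data are assumptions chosen for illustration, not the exact example worked through in the article.

```python
# Minimal sketch of an undercomplete autoencoder in Keras.
# Assumed: flattened 784-dimensional inputs (e.g. MNIST) and a 32-unit bottleneck.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

input_dim, encoding_dim = 784, 32            # assumed sizes, for illustration only

inputs = keras.Input(shape=(input_dim,))
encoded = layers.Dense(encoding_dim, activation="relu")(inputs)    # encoder: compress x
decoded = layers.Dense(input_dim, activation="sigmoid")(encoded)   # decoder: reconstruct x

autoencoder = keras.Model(inputs, decoded)
autoencoder.compile(optimizer="adam", loss="binary_crossentropy")

# The network is trained to reproduce its own input: the target is x itself.
x_train = np.random.rand(1024, input_dim).astype("float32")        # placeholder data
autoencoder.fit(x_train, x_train, epochs=5, batch_size=128)
```

The narrow bottleneck is what makes the problem non-trivial: the network cannot simply copy its input and must learn a compressed representation instead.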

Reading Time: 22 minutes       Read more…

My Arch Linux Setup with GNOME 3

If you have been following this space, you know by now that I am very particular about my computers: their operating systems, looks, software, and so on. Before you get any wrong ideas, my love for Arch Linux is still going strong; however, I have moved to GNOME 3 as my desktop of choice. This post is an updated version of my previous post, with the latest configuration of my machine.

Reading Time: 19 minutes       Read more…

Understanding Boosted Trees Models

In the previous post, we learned about tree-based learning methods: the basics of tree-based models and the use of bagging to reduce variance. We also looked at one of the most famous learning algorithms based on the idea of bagging: random forests. In this post, we will look into the details of yet another type of tree-based learning algorithm: boosted trees.
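For a quick flavour of the difference between the two ideas, here is a small sketch that fits a bagging-based model (a random forest) and a boosted-trees model on the same synthetic data. The use of scikit-learn, the dataset, and the hyperparameters are illustrative assumptions, not the models developed in the post.

```python
# Sketch: bagging (random forest) vs. boosting (gradient-boosted trees).
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Bagging: many deep trees trained on bootstrap samples, predictions averaged.
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# Boosting: shallow trees added sequentially, each one fitting the errors of the ensemble so far.
gbt = GradientBoostingClassifier(n_estimators=200, max_depth=3, random_state=0).fit(X_tr, y_tr)

print("random forest accuracy:", rf.score(X_te, y_te))
print("boosted trees accuracy:", gbt.score(X_te, y_te))
```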

Reading Time: 26 minutes       Read more…