August 2017

A Comprehensive Look into Neural Artistic Style Transfer

August 18, 2017

This past year, I took Stanford’s CS 231n course on Convolutional Neural Networks. My final project for the course dealt with a super cool concept called neural style transfer, in which the style of a piece of artwork is transferred onto a photograph. Here’s a classic example: a picture of Hoover Tower at Stanford, in the style of The Starry Night:

Read More

Understanding Recurrent Neural Networks

August 13, 2017

In my last post, we used our micro-framework to learn about and create a Convolutional Neural Network. It was super cool, so check it out if you haven’t already. Now, in my final post for this tutorial series, we’ll similarly learn about and build Recurrent Neural Networks (RNNs). RNNs are neural networks that excel at time-dependent tasks, especially tasks that take a time series as input. An RNN processes the series one time step at a time, building up a semantic representation of the whole series as it goes.
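The step-by-step idea can be sketched in a few lines of NumPy. This is an illustrative vanilla RNN cell, not code from the posts or from Krikos; the names (`W_xh`, `hidden_size`, etc.) are my own:

```python
import numpy as np

# A minimal vanilla RNN cell (illustrative sketch; names are hypothetical).
rng = np.random.default_rng(0)
input_size, hidden_size, steps = 3, 5, 4

W_xh = rng.standard_normal((input_size, hidden_size)) * 0.1  # input-to-hidden
W_hh = rng.standard_normal((hidden_size, hidden_size)) * 0.1  # hidden-to-hidden
b_h = np.zeros(hidden_size)

series = rng.standard_normal((steps, input_size))  # one toy time series
h = np.zeros(hidden_size)  # hidden state: the running summary of the series

for x_t in series:  # process the series one time step at a time
    h = np.tanh(x_t @ W_xh + h @ W_hh + b_h)

print(h.shape)  # a fixed-size vector summarizing the whole series
```

Note that the same weights are reused at every step; only the hidden state `h` changes, which is what lets the network accumulate information across time.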

Read More

July 2017

Introducing Krikos - A Python ML Framework for Learning and Experimentation

July 23, 2017

I am pleased to announce that I have published my first Python library: Krikos!

Read More

Understanding Convolutional Neural Networks

July 22, 2017

In my last post, we learned about some advanced neural network topics and built them into our NN micro-framework. Now, we put that advanced framework to use to understand Convolutional Neural Networks (CNNs). CNNs are neural networks mostly employed on vision tasks, that is, problems that deal with images. The benefit of using a CNN over a fully-connected network on images is that CNNs preserve spatial relationships and can gain insight into the visual structure of the input image.
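To see what "preserving spatial relationships" means, here is a bare-bones 2D convolution in NumPy: a filter slides over the image, and the output is still a spatial grid rather than a flat vector. This is a sketch for intuition (no padding or stride), not the framework's implementation:

```python
import numpy as np

# Naive single-channel 2D convolution ("valid" padding), for illustration.
def conv2d(image, kernel):
    H, W = image.shape
    kH, kW = kernel.shape
    out = np.zeros((H - kH + 1, W - kW + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Each output pixel looks only at a local patch of the input.
            out[i, j] = np.sum(image[i:i + kH, j:j + kW] * kernel)
    return out

image = np.arange(25, dtype=float).reshape(5, 5)   # toy 5x5 "picture"
edge_kernel = np.array([[1., -1.], [1., -1.]])     # crude vertical-edge filter
features = conv2d(image, edge_kernel)
print(features.shape)  # (4, 4): the output is still a spatial grid
```

A fully-connected layer would flatten the 5x5 image into 25 unrelated numbers; the convolution instead keeps neighboring pixels next to each other in the output.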

Read More

Advanced Exploration into Neural Networks

July 21, 2017

In my last post, we built a modular neural network micro-framework that will help us learn more about NNs. Although we were going to start building a Convolutional Neural Network in this tutorial, I thought it would be useful to first cover some more advanced aspects of neural networks and build them into our framework. That way, when we start making CNNs and RNNs soon, we will already have more of the necessary infrastructure in place. By the end of this tutorial, we will have learned about activation functions, regularization, batch normalization, and dropout, and built these features into our micro-framework.
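As a taste of two of those topics, here are quick NumPy sketches of the ReLU activation and (inverted) dropout. These are illustrative one-liners with hypothetical names, not the framework's actual code:

```python
import numpy as np

def relu(x):
    # Activation: zero out negatives, pass positives through unchanged.
    return np.maximum(0.0, x)

def dropout(x, p=0.5, train=True, rng=np.random.default_rng(0)):
    # Inverted dropout: randomly zero units during training, scaling the
    # survivors by 1/(1-p) so the expected activation stays the same.
    if not train:
        return x  # at test time, dropout is a no-op
    mask = (rng.random(x.shape) >= p) / (1.0 - p)
    return x * mask

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))          # negatives become 0; positives are unchanged
print(dropout(x, 0.5))  # some entries zeroed, survivors scaled by 2
```

Batch normalization and regularization follow the same pattern in the framework: each is a small, self-contained transformation with a training-time and a test-time behavior.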

Read More

Building a Modular Neural Network Mini-Framework

July 7, 2017

My first Medium post introduced the concept of neural networks at a fairly high level. Now, as we dive deeper into neural networks, we need a more robust framework for experimentation and understanding. In this short tutorial, we will develop a modular micro-framework that we will leverage and continuously extend as we journey into more complex neural network architectures. Going through this exercise will also put you in a great state of mind to explore popular NN frameworks such as PyTorch and Keras.
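"Modular" here means each layer is a small object with a `forward` and a `backward` method, so layers can be stacked freely. The sketch below shows the general shape of such an interface; the class and attribute names are hypothetical, not the framework's real API:

```python
import numpy as np

class Layer:
    """Interface every layer implements (hypothetical sketch)."""
    def forward(self, x):
        raise NotImplementedError
    def backward(self, grad):
        raise NotImplementedError

class Dense(Layer):
    """A fully-connected layer: out = x @ W + b."""
    def __init__(self, in_dim, out_dim):
        self.W = np.random.randn(in_dim, out_dim) * 0.01
        self.b = np.zeros(out_dim)
    def forward(self, x):
        self.x = x  # cache the input for the backward pass
        return x @ self.W + self.b
    def backward(self, grad):
        self.dW = self.x.T @ grad       # gradient w.r.t. weights
        self.db = grad.sum(axis=0)      # gradient w.r.t. bias
        return grad @ self.W.T          # gradient to pass to the previous layer

layer = Dense(4, 2)
out = layer.forward(np.ones((3, 4)))  # batch of 3 inputs
print(out.shape)  # (3, 2)
```

Because every layer shares this interface, a whole network is just a list of layers: call `forward` left to right, then `backward` right to left.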

Read More

Hello, World!

July 6, 2017

Welcome to my blog! I’m so glad you’re here. :D

Read More