I'm a sophomore studying Computer Science at Stanford University. I am obsessed with Machine Learning and am involved in AI initiatives such as the Stanford AI Lab. I am also interested in CS education and have led several education initiatives, such as teaching workshops as a member of BASES and leading the creation of an AI elective series through CS + Social Good. I am fascinated by Philosophy and enjoy reading metaphysics papers when I have some time. In my free time, I can often be found working on code projects or bingeing House of Cards, FRIENDS, or How I Met Your Mother.

Download Resume


Arbitrary Neural Style
The Commuter Chicago
RNN Encoder-Decoder for Machine Translation
Iron Drive
Style Transfer App
Concepts iOS


Research Assistant
Stanford Artificial Intelligence Lab, Fall 2017 - Present

I recently began my position as a Research Assistant in Dr. Andrew Ng's group at the Stanford AI Lab (SAIL). I am working with Awni Hannun, a PhD student, on research into AI applications in healthcare. Specifically, we are using patient scans to generate natural-language "doctor's notes" that describe the diagnosis based on the image.

Machine Learning Intern
Nasdaq, Summer 2017

This past summer, I interned at Nasdaq on the Machine Learning team. The team undertook two projects, both of which I was involved in. The first used customer data to predict attrition; I prepared the learning target set and helped develop a Recurrent Neural Network for prediction. The second used trade history data to predict Nasdaq network latency. I spearheaded this initiative and designed the overall architecture of the neural network we developed. I was heavily involved in the network's development and training, and extended my internship by two weeks to take charge of handing the project off to Nasdaq.

Research Fellow
IDEO CoLab, January 2017

I spent two weeks in January at the IDEO CoLab, the innovation arm of IDEO, on a small team that researched applications of blockchain technology in society. We were specifically interested in using the trust established by the blockchain to create decentralized resource-sharing initiatives. I was the "tech person" on the team, providing insight on technical feasibility and leading prototype development. We went through three design sprints, iterating rapidly by building prototypes and gathering feedback. Our three final high-fidelity deliverables were handed off to IDEO.

Software Engineering Intern
State Farm, Summer 2016

During the summer of 2016, I was on the Research & Development team at State Farm Insurance. I was tasked with running a fully-trained neural network on an iPhone, which I accomplished using an open-source execution engine called DeepLearningKit. The final app classified images from the CIFAR-10 dataset and ran the neural network on the iPhone's Metal GPU; in effect, the user could download an entire neural network to their phone.
I was also the Dev Team Lead for the Summer Intern Website project, in charge of a team of 30 developers with differing levels of frontend proficiency. Using Agile methodologies such as sprints, stories, and a Kanban board, I led this diverse team to quickly and efficiently build a beautiful internal website in just 8 weeks.


Neural Artistic Style Transfer: A Comprehensive Look

Artists and Machine Intelligence

An article describing the current state of Neural Style Transfer, as well as my contributions to the space

A Soft Introduction to Neural Networks

Towards Data Science

An article that introduces the reader to neural networks by guiding them through building a simple NN

End-to-End Learning of One Objective Function to Represent Multiple Styles for Neural Style Transfer

Stanford CS231n

Final paper for my class project in "Convolutional Neural Networks", presenting research on neural style transfer


A Comprehensive Look into Neural Artistic Style Transfer

August 18, 2017

This past year, I took Stanford's CS 231n course on Convolutional Neural Networks. My final project for the course dealt with a super cool concept called neural style transfer, in which the style of a piece of artwork is transferred onto a photograph. Here's a classic example: a picture of Hoover Tower at Stanford, rendered in the style of The Starry Night:

Read More

Understanding Recurrent Neural Networks

August 13, 2017

In my last post, we used our micro-framework to learn about and create a Convolutional Neural Network. It was super cool, so check it out if you haven't already. Now, in the final post of this tutorial series, we'll similarly learn about and build Recurrent Neural Networks (RNNs). RNNs are neural networks that excel at time-dependent tasks, especially those that take a time series as input. An RNN processes the series serially, one time step at a time, building up a semantic representation of the whole series.
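The step-at-a-time idea can be sketched in a few lines of NumPy. This is a minimal, illustrative vanilla-RNN forward pass, not code from the post; the function name, weight names, and shapes are all assumptions for the sake of the example:

```python
import numpy as np

def rnn_forward(xs, Wxh, Whh, bh):
    """Process a time series one step at a time, carrying a hidden state."""
    h = np.zeros(Whh.shape[0])  # initial hidden state (all zeros)
    for x in xs:  # one time step of the series at a time
        # fold the current input into the running summary of the sequence
        h = np.tanh(Wxh @ x + Whh @ h + bh)
    return h  # final hidden state: a representation of the whole series

# toy usage: 5 time steps, input dimension 3, hidden dimension 4
rng = np.random.default_rng(0)
xs = rng.normal(size=(5, 3))
h = rnn_forward(xs, rng.normal(size=(4, 3)), rng.normal(size=(4, 4)), np.zeros(4))
```

The key point is that the same weights are reused at every step, so the network can handle series of any length while its hidden state accumulates context.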

Read More

Introducing Krikos - A Python ML Framework for Learning and Experimentation

July 23, 2017

I am pleased to announce that I have published my first Python library: Krikos!

Read More

See All