Projects and Presentations

Deep RL for Algorithmic Cryptocurrency Trading

[Code]

Custom deep Q-learning system for cryptocurrency/equity trading.
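
For context, a minimal sketch of the temporal-difference update at the core of deep Q-learning (illustrative PyTorch only; the state, action, and reward definitions used in the actual trading system are not shown here):

```python
import torch
import torch.nn as nn

# Illustrative Q-network: maps a market-state feature vector to Q-values
# for a small discrete action set (e.g., hold / buy / sell).
class QNetwork(nn.Module):
    def __init__(self, state_dim: int, n_actions: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, 128), nn.ReLU(),
            nn.Linear(128, n_actions),
        )

    def forward(self, state):
        return self.net(state)

def td_loss(q_net, target_net, batch, gamma=0.99):
    """One-step temporal-difference loss used in deep Q-learning."""
    states, actions, rewards, next_states, dones = batch
    q = q_net(states).gather(1, actions.unsqueeze(1)).squeeze(1)
    with torch.no_grad():
        next_q = target_net(next_states).max(dim=1).values
        target = rewards + gamma * next_q * (1.0 - dones)
    return nn.functional.mse_loss(q, target)
```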


Deep Reinforcement Learning Seminar

[Slides]

This talk, given in 2020 as a seminar for the CS496 Advanced Deep Learning course at Northwestern University, covered deep Q-learning and how it can be applied in quantitative finance.


Forecasting Cryptocurrency Price Movements with GNNs

[Code]

Novel graph neural network architecture and reward scheme for learning relational dynamics between cryptocurrency pairs.
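
As a rough sketch of the idea (a hypothetical message-passing layer, not the project's actual architecture), each currency pair can be treated as a node that exchanges messages with every other node:

```python
import torch
import torch.nn as nn

class PairMessagePassing(nn.Module):
    """One round of message passing over a fully connected asset graph.

    Each node holds a feature vector for one currency pair; messages are
    aggregated from all other nodes to capture cross-asset dynamics.
    """
    def __init__(self, dim: int):
        super().__init__()
        self.message = nn.Linear(2 * dim, dim)
        self.update = nn.GRUCell(dim, dim)

    def forward(self, h):                      # h: (n_assets, dim)
        n = h.size(0)
        src = h.unsqueeze(1).expand(n, n, -1)  # sender features
        dst = h.unsqueeze(0).expand(n, n, -1)  # receiver features
        msgs = torch.relu(self.message(torch.cat([src, dst], dim=-1)))
        agg = msgs.mean(dim=0)                 # aggregate incoming messages
        return self.update(agg, h)             # updated node states
```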


Graph Neural Networks Seminar

[Slides]

This talk, given in 2020 as a seminar for the CS496 Statistical Machine Learning course at Northwestern University, covered the operating principles, methodology and limitations of graph neural networks.


Azure Machine Learning: CPU Optimizations

[Intel Blog] [Microsoft Blog]

Microsoft Azure’s Data Science Virtual Machine is Azure’s most popular VM offering for machine learning workloads. This project extended that offering with ML frameworks optimized for execution on Intel CPUs.
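
As an example of what the optimization buys you, one can check from Python whether a framework build actually picks up Intel MKL / oneDNN acceleration; a quick sanity check along these lines, assuming a PyTorch installation:

```python
import torch

# Confirm the build was compiled against Intel MKL / oneDNN and that the
# oneDNN backend is available at runtime.
print(torch.backends.mkl.is_available())     # BLAS-level MKL support
print(torch.backends.mkldnn.is_available())  # oneDNN primitives
print(torch.__config__.show())               # full build configuration
```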


Optimized Machine Learning VMs on AWS

[Blog]

Developed orchestration software to ensure ML frameworks are built and configured optimally for the underlying hardware they run on. If you launch a machine learning instance on CPU, you can be confident the frameworks are built and tuned for that hardware.
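
As a simplified illustration of hardware-aware configuration (not the actual orchestration code), one might inspect the host CPU's instruction-set flags and set common threading knobs accordingly:

```python
import os
import multiprocessing

def cpu_flags(path="/proc/cpuinfo"):
    """Return the CPU feature flags reported by the Linux kernel."""
    with open(path) as f:
        for line in f:
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
    return set()

flags = cpu_flags()
has_avx512 = "avx512f" in flags

# Pin math-library threading to the detected (logical) core count and set a
# common Intel OpenMP tuning knob.
n_cores = multiprocessing.cpu_count()
os.environ["OMP_NUM_THREADS"] = str(n_cores)
os.environ["KMP_BLOCKTIME"] = "1"

print(f"AVX-512: {has_avx512}, threads: {n_cores}")
```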


Distributed Deep Learning: Convolutional Neural Networks for Brain Tumor Segmentation

[White Paper] [Code]

In collaboration with the AI team at General Electric’s Global Research Center, we developed a distributed deep learning system for brain tumor segmentation.
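
A bare-bones sketch of data-parallel training of this kind, shown here with Horovod as one representative framework (illustrative only; not the project's exact stack or model):

```python
import horovod.torch as hvd
import torch

hvd.init()  # one process per worker (e.g., one per GPU or CPU socket)

# Stand-in for the actual segmentation network (e.g., a 3D U-Net over
# multi-modal MRI volumes).
model = torch.nn.Conv3d(in_channels=4, out_channels=3, kernel_size=3, padding=1)

# Scale the learning rate with the number of workers, a common heuristic
# for synchronous data-parallel SGD.
optimizer = torch.optim.SGD(model.parameters(), lr=0.01 * hvd.size())

# Average gradients across all workers on every step.
optimizer = hvd.DistributedOptimizer(
    optimizer, named_parameters=model.named_parameters()
)

# Make sure every worker starts from identical weights and optimizer state.
hvd.broadcast_parameters(model.state_dict(), root_rank=0)
hvd.broadcast_optimizer_state(optimizer, root_rank=0)
```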


Accelerating Memory Bound Machine Learning Applications

[White Paper]

Memory-bound matrix-vector operations can substantially degrade training throughput in machine learning. In cooperation with Siemens AI Research, and using a memory-bound meta-learning model, we demonstrated that reducing the number of threads can yield throughput increases of up to 100x.
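
The effect is easy to reproduce in miniature: for small, memory-bound kernels, extra threads mostly add synchronization overhead. A toy benchmark (not the paper's workload) comparing thread counts on repeated matrix-vector products:

```python
import time
import torch

# Small matrix-vector product, repeated many times; threading overhead can
# dominate the useful work at this scale.
A = torch.randn(256, 256)
x = torch.randn(256)

def bench(n_threads: int, reps: int = 10000) -> float:
    torch.set_num_threads(n_threads)
    start = time.perf_counter()
    for _ in range(reps):
        A @ x
    return (time.perf_counter() - start) / reps

for n in (1, 2, 4, 8, 16):
    print(f"{n:>2} threads: {bench(n) * 1e6:.1f} us per matvec")
```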


Biomedical Image Segmentation: Upsampling vs. Transposed Convolution

[Blog]

The U-Net semantic segmentation architecture is designed much like an autoencoder, with a contracting encoder path and an expanding decoder path. This work investigated the effects of upsampling in the decoder via standard nearest-neighbor interpolation versus transposed convolution (also called fractionally strided convolution).
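
In PyTorch terms, the two decoder-stage options being compared look roughly like this (channel counts are illustrative):

```python
import torch.nn as nn

channels = 64  # example channel count for one decoder stage

# Option 1: parameter-free nearest-neighbor upsampling, followed by a
# convolution to mix channels.
upsample_block = nn.Sequential(
    nn.Upsample(scale_factor=2, mode="nearest"),
    nn.Conv2d(channels, channels // 2, kernel_size=3, padding=1),
)

# Option 2: transposed (fractionally strided) convolution, which learns
# its own upsampling kernel.
transposed_block = nn.ConvTranspose2d(channels, channels // 2, kernel_size=2, stride=2)
```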


Thorn: ML Scientist (Volunteer)

[Project]

Thorn uses machine learning to identify trafficked children in online ads and collaborates with federal law enforcement to rescue them.

While volunteering as an ML Scientist, I built machine learning tools for face identification and age detection, though what I’m most proud of is a method for scraping large amounts of labeled data from the web (for free!). This method enabled us to grow our training set by five orders of magnitude and increase the accuracy and confidence of our age detector.