Jacob Kelly

I'm an undergrad in Computer Science, Math, and Stats at the University of Toronto. I'm currently a Machine Learning Research Intern at Deep Genomics, and I'm very fortunate to work with David Duvenaud at the Vector Institute. I'm interested in energy-based models, latent variable models, neural ODEs, variational inference, and genomics. My research goal is to combine machine learning with new sources of data to develop tools for the diagnosis and treatment of patients.

Previously, I did computational biology research with Benjamin Haibe-Kains at the Princess Margaret Cancer Centre. My hobbies include running, rock climbing, and reading. I'm always open to meeting new people; feel free to get in touch if you'd like to chat.

Publications

No MCMC for Me: Amortized Sampling for Fast and Stable Training of Energy-Based Models

Energy-Based Models (EBMs) present a flexible and appealing way to represent uncertainty. In this work, we present a simple method for training EBMs at scale which uses an entropy-regularized generator to amortize the MCMC sampling typically used in EBM training. We apply our estimator to the recently proposed Joint Energy Model (JEM), where we match the original performance with faster and more stable training. This allows us to extend JEM models to semi-supervised classification on tabular data from a variety of continuous domains. (A rough, generic sketch of generator-amortized EBM training appears below.)

Will Grathwohl*, Jacob Kelly*, Milad Hashemi, Mohammad Norouzi, Kevin Swersky, David Duvenaud

Preprint, in submission.

pdf | code | bibtex
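
For readers unfamiliar with the setup, here is a rough, generic sketch of EBM training with generator-amortized negative samples; it is not the paper's estimator, and the toy data, network sizes, and crude pairwise-distance stand-in for the entropy regularizer are all hypothetical.

import torch
import torch.nn as nn

# Toy 2-D setup: an energy network and a generator that proposes negatives.
energy = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))
generator = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 2))
opt_e = torch.optim.Adam(energy.parameters(), lr=1e-3)
opt_g = torch.optim.Adam(generator.parameters(), lr=1e-3)

for step in range(1000):
    data = torch.randn(128, 2) + torch.tensor([2.0, 0.0])  # toy data

    # The generator proposes negative samples, amortizing the MCMC sampler.
    x_fake = generator(torch.randn(128, 2))

    # The energy is trained contrastively: low on data, high on generated samples.
    loss_e = energy(data).mean() - energy(x_fake.detach()).mean()
    opt_e.zero_grad(); loss_e.backward(); opt_e.step()

    # The generator chases low energy; a crude pairwise-distance term stands in
    # for the entropy regularizer that discourages collapse.
    x_fake = generator(torch.randn(128, 2))
    entropy_proxy = torch.cdist(x_fake, x_fake).mean()
    loss_g = energy(x_fake).mean() - 0.1 * entropy_proxy
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()

The entropy regularizer is what keeps an amortized sampler from collapsing onto a single low-energy mode.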

Learning Differential Equations that are Easy to Solve

Neural ODEs become expensive to solve numerically as training progresses. We introduce a differentiable surrogate for the time cost of standard numerical solvers, using higher-order derivatives of solution trajectories. These derivatives are efficient to compute with Taylor-mode automatic differentiation. Optimizing this additional objective trades model performance against the time cost of solving the learned dynamics. (A small illustrative sketch of computing such higher-order derivatives appears after the publication list.)

Jacob Kelly*, Jesse Bettencourt*, Matthew James Johnson, David Duvenaud

Neural Information Processing Systems (NeurIPS), 2020

pdf | code | bibtex

*equal contribution
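
As background for the Neural ODE paper above: for an autonomous system dx/dt = f(x), the k-th time derivative of a solution trajectory depends only on the current state, so it can be obtained by repeatedly differentiating through f. The sketch below does this with plain jax.jvp for clarity; the paper instead computes these quantities with Taylor-mode automatic differentiation (jax.experimental.jet), which avoids the redundant work of nested derivatives. The dynamics function here is a toy placeholder.

import jax
import jax.numpy as jnp

def f(x):
    return jnp.sin(x) - 0.5 * x  # toy dynamics: x'(t) = f(x(t))

def time_derivatives(f, x, order=3):
    """Return [x', x'', ...] of the trajectory through state x, up to `order`."""
    derivs = [f(x)]
    g = f
    for _ in range(order - 1):
        # If g(y) gives the k-th time derivative as a function of the state,
        # then d/dt g(x(t)) = J_g(x) x'(t) = jvp(g, x, f(x)) gives the (k+1)-th.
        prev_g = g
        g = lambda y, prev_g=prev_g: jax.jvp(prev_g, (y,), (f(y),))[1]
        derivs.append(g(x))
    return derivs

print(time_derivatives(f, jnp.array(1.0)))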

Teaching

I have been a Teaching Assistant for the following:

Projects

Elastic Net Regression to Predict Drug Response from Gene Expression

We develop drug-specific models to detect biomarkers in gene expression data for predicting the area under the drug dose-response curve (AUC) of cell lines for acute myeloid leukemia (AML). We apply our model to two pharmacogenomic datasets: one consisting of data from immortalized cell lines, the other of ex vivo primary cells from patients. We use linear regression with elastic net regularization as our model and compare methods for feature selection. (A minimal sketch of this setup appears below.)

Jacob Kelly, Arvind Mer, Sisira Nair, Hassan Mahmoud, Benjamin Haibe-Kains

poster
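
A minimal sketch of this kind of pipeline, assuming scikit-learn and a small random matrix standing in for the real expression data; the dataset shapes and hyperparameter grid below are made up for illustration.

import numpy as np
from sklearn.linear_model import ElasticNetCV

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 500))  # cell lines x genes (expression values)
y = X[:, :5] @ rng.normal(size=5) + rng.normal(scale=0.1, size=200)  # AUC per cell line

# Elastic net: cross-validate the L1/L2 mixing ratio and the penalty strength.
model = ElasticNetCV(l1_ratio=[0.1, 0.5, 0.9], cv=5).fit(X, y)

# Genes with nonzero coefficients are the candidate biomarkers for this drug.
biomarkers = np.flatnonzero(model.coef_)
print(len(biomarkers), "genes selected")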

Genomic Sequencing

To learn about genomic sequencing, I implemented the Boyer-Moore algorithm for exact sequence alignment. I also implemented the Z algorithm to efficiently compute the query-string indices needed by Boyer-Moore. (A small generic implementation of the Z algorithm appears below.)

Jacob Kelly

code
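
For reference, a small generic implementation of the Z algorithm (a textbook version, not necessarily identical to the linked code): z[i] is the length of the longest substring starting at position i that matches a prefix of the string, computed in linear time by reusing the rightmost window already known to match a prefix.

def z_array(s):
    n = len(s)
    z = [0] * n
    z[0] = n
    l, r = 0, 0  # [l, r) is the rightmost window known to match a prefix
    for i in range(1, n):
        if i < r:
            z[i] = min(r - i, z[i - l])  # reuse comparisons from the window
        while i + z[i] < n and s[z[i]] == s[i + z[i]]:
            z[i] += 1                    # extend the match explicitly
        if i + z[i] > r:
            l, r = i, i + z[i]
    return z

print(z_array("ACGTACGACGT"))  # [11, 0, 0, 0, 3, 0, 0, 4, 0, 0, 0]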

ICLR 2019 Reproducibility Challenge

We implemented the ICLR 2019 submission (later accepted) "Initialized Equilibrium Propagation for Backprop-Free Training" by O'Connor et al. as part of the ICLR 2019 Reproducibility Challenge.

Matthieu Chan Chee, Jad Ghalayini, Jacob Kelly, Winnie Xu

code

DeepSort

I implemented a sequence-to-sequence model that learns to sort numbers. I used an encoder-decoder LSTM architecture with the "pointer" attention mechanism of Vinyals et al. (Pointer Networks) to take advantage of the fact that the model's output is a permutation of its input. (A toy sketch of pointer attention appears below.)

Jacob Kelly

blog | code
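
A toy numpy sketch of the pointer attention step from Vinyals et al.: the decoder state scores every encoder state, and a softmax over those scores "points" at an input position instead of emitting a token from a fixed vocabulary. The shapes and random weights below are illustrative, not the project's actual model.

import numpy as np

def pointer_attention(enc_states, dec_state, W1, W2, v):
    # enc_states: (seq_len, hidden), dec_state: (hidden,)
    scores = np.tanh(enc_states @ W1 + dec_state @ W2) @ v  # one score per input position
    probs = np.exp(scores - scores.max())
    return probs / probs.sum()  # distribution over input positions (the "pointer")

rng = np.random.default_rng(0)
hidden, seq_len = 8, 5
enc = rng.normal(size=(seq_len, hidden))
dec = rng.normal(size=hidden)
W1, W2 = rng.normal(size=(hidden, hidden)), rng.normal(size=(hidden, hidden))
v = rng.normal(size=hidden)
print(pointer_attention(enc, dec, W1, W2, v))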

Pocket Guide

We developed an Android app for indoor navigation through SickKids hospital using Bluetooth signals from Estimote Beacons. We use SPFA (the Shortest Path Faster Algorithm) on a map of beacon locations to efficiently generate the shortest path from the user's current location to their destination. We estimate the current location from the signal strengths of nearby beacons and track it over time with an exponentially weighted average. Developed in summer 2016 at Cossette. (A rough sketch of these two pieces appears below.)

Bruno Almeida, Jacob Kelly, Gabriel Yeung

slides | code
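
A rough sketch of the two ingredients mentioned above, with a made-up beacon graph and smoothing factor standing in for the real map and tuning: SPFA is Bellman-Ford driven by a queue, and the exponentially weighted average smooths successive position estimates.

from collections import deque

def spfa(graph, source):
    """Shortest Path Faster Algorithm: Bellman-Ford relaxations driven by a queue."""
    dist = {node: float("inf") for node in graph}
    dist[source] = 0.0
    queue, in_queue = deque([source]), {source}
    while queue:
        u = queue.popleft()
        in_queue.discard(u)
        for v, w in graph[u]:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
                if v not in in_queue:
                    queue.append(v)
                    in_queue.add(v)
    return dist

def ewma(previous, observed, alpha=0.3):
    """Exponentially weighted average used to smooth successive position estimates."""
    return alpha * observed + (1 - alpha) * previous

beacon_map = {
    "lobby":    [("elevator", 2.0), ("cafe", 5.0)],
    "elevator": [("lobby", 2.0), ("clinic", 3.0)],
    "cafe":     [("lobby", 5.0), ("clinic", 1.5)],
    "clinic":   [("elevator", 3.0), ("cafe", 1.5)],
}
print(spfa(beacon_map, "lobby"))
print(ewma(previous=4.2, observed=5.0))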

Some project author lists are in alphabetical order.