I'm currently a research assistant in the UW SAMPL research group, working on programming languages (PL) and machine learning (ML) systems research. I received my B.S. in Computer Science from the Allen School at UW in June 2020, under the supervision of Zachary Tatlock. My senior thesis on simulating DTR can be found under the "Publications and Preprints" section.
Broadly speaking, my research lies in (and around) the intersection of programming languages, machine learning, and systems. I enjoy synthesizing new techniques using ideas from each area, with an emphasis on improving large systems. Below you can find some of the research projects I have contributed to, along with more specific interests.
DTR. Dynamic Tensor Rematerialization (DTR) is a runtime technique for reducing peak memory requirements when training deep learning models. DTR is a dynamic "checkpointing" method that frees intermediate results and recomputes them on demand, trading extra compute for less memory. Unlike existing checkpointing methods, which require offline planning, DTR operates entirely within the runtime, enabling checkpointing for arbitrarily dynamic models. Check out our preprint for more details.
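To give a flavor of the idea, here is a toy sketch of dynamic rematerialization. It is illustrative only and not DTR's actual implementation: the `Tensor`/`Runtime` names are made up, and eviction here is plain least-recently-used, whereas DTR's real heuristic weighs staleness, size, and recompute cost.

```python
# Toy sketch of dynamic rematerialization (NOT the real DTR heuristic:
# we evict the least-recently-used tensor for simplicity).

class Tensor:
    """A lazily-rematerializable value with a closure to rebuild it."""
    def __init__(self, runtime, compute, size):
        self.runtime = runtime
        self.compute = compute   # closure that (re)computes the value
        self.size = size         # abstract memory units
        self.value = None        # None means "evicted"

    def get(self):
        if self.value is None:   # evicted: rematerialize on demand
            self.runtime.materialize(self)
        self.runtime.touch(self)
        return self.value


class Runtime:
    """Tracks resident tensors and evicts under a fixed memory budget."""
    def __init__(self, budget):
        self.budget = budget
        self.resident = []       # LRU order: oldest first
        self.used = 0

    def materialize(self, t):
        # Evict until the new tensor fits (or nothing is left to evict).
        while self.used + t.size > self.budget and self.resident:
            victim = self.resident.pop(0)
            victim.value = None          # freed; can be recomputed later
            self.used -= victim.size
        t.value = t.compute()            # may recursively rematerialize inputs
        self.resident.append(t)
        self.used += t.size

    def touch(self, t):
        # Move t to the most-recently-used position.
        if t in self.resident:
            self.resident.remove(t)
            self.resident.append(t)

    def op(self, size, fn, *args):
        # Record how to recompute the result from its (lazy) inputs.
        t = Tensor(self, lambda: fn(*(a.get() for a in args)), size)
        self.materialize(t)
        return t
```

For example, with a budget of 2 units, computing a chain `a -> b -> c` of unit-sized tensors evicts `a` to make room for `c`; a later `a.get()` transparently reruns `a`'s closure, which is the whole trick.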
Program synthesis. Forthcoming.
Publications and Preprints
Dynamic Tensor Rematerialization
Marisa Kirisame,† Steven Lyubomirsky,† Altan Haan,† Jennifer Brennan, Mike He, Jared Roesch, Tianqi Chen, Zachary Tatlock.
arXiv preprint, 2020.
Simulating Dynamic Tensor Rematerialization*
Altan Haan, supervised by Zachary Tatlock.
Honors thesis, 2020.