Synthropy Labs
I hear that you can make yourself sound cooler by making a website titled “Your Name
Labs”. This is mine.
At Synthropy Labs, our mission is to make ML training slightly less painful through open source tools and resources.
Who am I?
I’m Emily Shepperd, an ex-Google engineer and ex-physicist who quit that life to work on machine learning in 2021. I’m a software engineer at heart, but also a mathematician and an ML researcher. Trans rights are human rights.
Contact
Research we’ve been involved in
- Tree Attention: Topology-aware Decoding for Long-Context Attention on GPU clusters (https://arxiv.org/abs/2408.04093)
- For this paper, I consulted with Zyphra and provided advice and assistance with the implementation in JAX.
Open source contributions
- jaxtorch
- A JAX-based library for training neural networks, intended to be as close to a drop-in replacement for PyTorch as possible.
- flash_attn_jax
- A JAX binding for Tri Dao’s Flash Attention v2.
- clip-jaxtorch
- A straightforward JAX implementation of OpenAI’s CLIP model.
- gpt-2
- Mostly of historical interest at this point: I released the first fine-tuning scripts for GPT-2 in 2019.