Tri Dao

tri [at] tridao (dot) me

Assistant Professor of Computer Science at Princeton University, leading the Dao AI Lab.
Co-founder & Chief Scientist of Together AI.

CV (updated 01/2026)

Previously: PhD, Department of Computer Science, Stanford University

Research Interests

Machine learning and systems, with a focus on efficient training and inference:

  • Hardware-aware algorithms.
  • Sequence models with long-range memory.

Current PhD Students

Selected Honors and Awards

  • Schmidt Sciences AI2050 Fellowship, 2025.
  • Google ML and Systems Junior Faculty Awards, 2025.
  • Google Research Scholar, 2025.
  • Conference on Machine Learning and Systems (MLSys), Outstanding Paper Honorable Mention, 2025.
  • Conference on Language Modeling (COLM), Outstanding Paper, 2024.
  • International Conference on Machine Learning (ICML), Outstanding Paper runner-up, 2022.

latest posts

selected publications

  1. Marconi: Prefix Caching for the Era of Hybrid LLMs
    Rui Pan, Zhuang Wang, Zhen Jia, and 5 more authors
    In Machine Learning and Systems (MLSys), 2025
  2. FlashAttention-3: Fast and Accurate Attention with Asynchrony and Low-precision
    Jay Shah*, Ganesh Bikshandi*, Ying Zhang, and 3 more authors
    In Advances in Neural Information Processing Systems (NeurIPS), 2024
  3. Transformers are SSMs: Generalized Models and Efficient Algorithms Through Structured State Space Duality
    Tri Dao* and Albert Gu*
    In International Conference on Machine Learning (ICML), 2024
  4. Mamba: Linear-Time Sequence Modeling with Selective State Spaces
    Albert Gu* and Tri Dao*
    In Conference on Language Modeling (COLM), 2024
  5. FlashAttention: Fast and Memory-Efficient Exact Attention with IO-Awareness
    Tri Dao, Daniel Y. Fu, Stefano Ermon, and 2 more authors
    In Advances in Neural Information Processing Systems (NeurIPS), 2022
  6. Monarch: Expressive Structured Matrices for Efficient and Accurate Training
    Tri Dao, Beidi Chen, Nimit Sohoni, and 7 more authors
    In International Conference on Machine Learning (ICML), 2022