news

Mar 04, 2026 Excited to share that our work, “Fed-SB: A Silver Bullet for Extreme Communication Efficiency and Performance in (Private) Federated LoRA Fine-Tuning”, has been accepted at TMLR 2026! 🚀
Jan 20, 2026 Excited to share that our work, “ABBA-Adapters: Efficient and Expressive Fine-Tuning of Foundation Models”, has been accepted at ICLR 2026! 🚀
Jun 26, 2025 Excited to share our latest work, “‘What’s Up, Doc?’: Analyzing How Users Seek Health Information in Large-Scale Conversational AI Datasets”, now available on arXiv! 🚀
May 21, 2025 🚀 Just dropped: “ABBA-Adapters: Efficient and Expressive Fine-Tuning of Foundation Models” is now on arXiv!
Mar 13, 2025 Excited to share our latest work, “Fed-SB: A Silver Bullet for Extreme Communication Efficiency and Performance in (Private) Federated LoRA Fine-Tuning”, now available on arXiv! 🚀