Anke Tang

Ph.D. Student. Machine Learning. AI Research.


School of Computer Science

Wuhan University, China

I am currently pursuing my Ph.D. degree at the School of Computer Science, Wuhan University, under the supervision of Prof. Yong Luo and Assoc. Prof. Li Shen. I received my Bachelor's degree from the School of Physics and Technology, Wuhan University, in 2020. My research interests include machine learning, transfer learning, and multi-task learning.

news

Sep 20, 2025 Three of our papers have been accepted to NeurIPS 2025, with two on continual model merging.

selected publications

  1. TPAMI
    Zero-Shot Sparse Mixture of Low-Rank Experts Construction From Pre-Trained Foundation Models
    Anke Tang, Li Shen, Yong Luo, and 5 more authors
    IEEE Transactions on Pattern Analysis and Machine Intelligence, 2025
  2. IJCV
    Data-adaptive weight-ensembling for multi-task model fusion
    Anke Tang, Li Shen, Yong Luo, and 4 more authors
    International Journal of Computer Vision, 2025
  3. NeurIPS
    Merging models on the fly without retraining: A sequential approach to scalable continual model merging
    Anke Tang, Enneng Yang, Li Shen, and 4 more authors
    In the Thirty-Ninth Annual Conference on Neural Information Processing Systems (NeurIPS), 2025
  4. NeurIPS
    Continual Model Merging without Data: Dual Projections for Balancing Stability and Plasticity
    Enneng Yang, Anke Tang, Li Shen, and 4 more authors
    In the Thirty-Ninth Annual Conference on Neural Information Processing Systems (NeurIPS), 2025
  5. ICML
    Targeted Low-rank Refinement: Enhancing Sparse Language Models with Precision
    Li Shen, Anke Tang, Yong Luo, and 3 more authors
    In the Forty-Second International Conference on Machine Learning (ICML), 2025
  6. ICML
    Modeling Multi-Task Model Merging as Adaptive Projective Gradient Descent
    Yongxian Wei, Anke Tang, Li Shen, and 3 more authors
    In the Forty-Second International Conference on Machine Learning (ICML), 2025
  7. ICLR
    Mitigating the Backdoor Effect for Multi-Task Model Merging via Safety-Aware Subspace
    Jinluan Yang, Anke Tang, Didi Zhu, and 3 more authors
    In the 13th International Conference on Learning Representations (ICLR), 2025
  8. NMI
    Learning from models beyond fine-tuning
    Hongling Zheng, Li Shen, Anke Tang, and 5 more authors
    Nature Machine Intelligence, 2025
  9. ICML
    Merging Multi-Task Models via Weight-Ensembling Mixture of Experts
    Anke Tang, Li Shen, Yong Luo, and 3 more authors
    In the 41st International Conference on Machine Learning (ICML), 2024
  10. ICLR
    Parameter efficient multi-task model fusion with partial linearization
    Anke Tang, Li Shen, Yong Luo, and 5 more authors
    In the 12th International Conference on Learning Representations (ICLR), 2024
  11. IJCAI
    Improving Heterogeneous Model Reuse by Density Estimation
    Anke Tang, Yong Luo, Han Hu, and 5 more authors
    In the Thirty-Second International Joint Conference on Artificial Intelligence (IJCAI), 2023