publications

2025

  1. L4DC
    Multi-Agent Stochastic Bandits Robust to Adversarial Corruptions
    Fatemeh Ghaffari, Xuchuang Wang, Jinhang Zuo, and Mohammad Hajiesmaili
    In 7th Annual Learning for Dynamics & Control Conference, 2025
  2. ICLR
    Stochastic Bandits Robust to Adversarial Attacks
    Xuchuang Wang, Maoli Liu, Jinhang Zuo, Xutong Liu, John C.S. Lui, and Mohammad Hajiesmaili
    In International Conference on Learning Representations, 2025
  3. SIGMETRICS
    Asynchronous Multi-Agent Bandits: Fully Distributed vs. Leader-Coordinated Algorithms
    Xuchuang Wang*, Yu-Zhen Janice Chen*, Lin Yang, Xutong Liu, Mohammad Hajiesmaili, Don Towsley, and John C.S. Lui
    In ACM International Conference on Measurement and Modeling of Computer Systems, 2025
  4. SIGMETRICS
    Combinatorial Logistic Bandits
    Xutong Liu, Xiangxiang Dai, Xuchuang Wang, Mohammad Hajiesmaili, and John C.S. Lui
    In ACM International Conference on Measurement and Modeling of Computer Systems, 2025
  5. AAAI
    Best Arm Identification with Quantum Oracles
    Xuchuang Wang, Yu-Zhen Janice Chen, Matheus Andrade, Jonathan Allcock, Mohammad Hajiesmaili, John C.S. Lui, and Don Towsley
    In The 39th Annual AAAI Conference on Artificial Intelligence, 2025
  6. AAAI
    Heterogeneous Multi-Agent Bandits with Parsimonious Hints
    Amirmahdi Mirfakhar, Xuchuang Wang, Jinhang Zuo, Yair Zick, and Mohammad Hajiesmaili
    In The 39th Annual AAAI Conference on Artificial Intelligence, 2025
  7. INFOCOM
    Learning Best Paths in Quantum Networks
    Xuchuang Wang, Maoli Liu, Xutong Liu, Zhuohua Li, Mohammad Hajiesmaili, John C.S. Lui, and Don Towsley
    In Proceedings of the IEEE Conference on Computer Communications, 2025

2024

  1. ICML
    Combinatorial Multivariant Multi-Armed Bandits with Applications to Episodic Reinforcement Learning and Beyond
    Xutong Liu, Siwei Wang, Jinhang Zuo, Han Zhong, Xuchuang Wang, Zhiyong Wang, Shuai Li, Mohammad Hajiesmaili, John C.S. Lui, and Wei Chen
    In Forty-first International Conference on Machine Learning, 2024

2023

  1. NeurIPS
    Multi-Fidelity Multi-Armed Bandits Revisited
    Xuchuang Wang, Qingyun Wu, Wei Chen, and John C.S. Lui
    In Advances in Neural Information Processing Systems, 2023
  2. TMC
    Analyzing Queueing Problems via Bandits with Linear Reward and Nonlinear Workload Fairness
    Xuchuang Wang, Hong Xie, and John C.S. Lui
    IEEE Transactions on Mobile Computing, 2023
  3. PEVA
    Optimizing Recommendations under Abandonment Risks: Models and Algorithms
    Xuchuang Wang, Hong Xie, Pinghui Wang, and John C.S. Lui
    Performance Evaluation, 2023
  4. UAI
    Exploration for Free: How Does Reward Heterogeneity Improve Regret in Cooperative Multi-agent Bandits?
    Xuchuang Wang, Lin Yang, Yu-Zhen Janice Chen, Xutong Liu, Mohammad Hajiesmaili, Don Towsley, and John C.S. Lui
    In The 39th Conference on Uncertainty in Artificial Intelligence, 2023
  5. ICLR
    Achieve Near-Optimal Individual Regret & Low Communications in Multi-Agent Bandits
    Xuchuang Wang, Lin Yang, Yu-Zhen Janice Chen, Xutong Liu, Mohammad Hajiesmaili, Don Towsley, and John C.S. Lui
    In International Conference on Learning Representations, 2023
  6. AISTATS
    On-Demand Communication for Asynchronous Multi-Agent Bandits
    Yu-Zhen Janice Chen, Lin Yang, Xuchuang Wang, Xutong Liu, Mohammad Hajiesmaili, John C.S. Lui, and Don Towsley
    In The 26th International Conference on Artificial Intelligence and Statistics, 2023

2022

  1. IJCAI
    Multi-Player Multi-Armed Bandits with Finite Shareable Resources Arms: Learning Algorithms & Applications
    Xuchuang Wang, Hong Xie, and John C.S. Lui
    In Proceedings of the 31st International Joint Conference on Artificial Intelligence, IJCAI-22, 2022
  2. ICML
    Multiple-Play Stochastic Bandits with Shareable Finite-Capacity Arms
    Xuchuang Wang, Hong Xie, and John C.S. Lui
    In Proceedings of the 39th International Conference on Machine Learning, 2022