• Feb. 2023. “Hybrid Gate-Pulse Model for Variational Quantum Algorithms” accepted to DAC 2023.
  • Jan. 2023. Received the Best Pitch Award at the MIT Microsystems Annual Research Conference.
  • Dec. 2022. Quantum Antivirus accepted to HOST 2023.
  • Nov. 2022. Received 1st Place in the ACM Student Research Competition (SRC).
  • Nov. 2022. Ranked 1st of 150 in the memory-size track and 3rd of 150 overall in the ACM/IEEE TinyML Design Contest.
  • Aug. 2022. Received the NSF Athena AI Institute Best Poster Award (ranked #1).
  • Aug. 2022. “Graph Transformer for Quantum Circuit Reliability Prediction” will appear at ICCAD 2022.
  • Jun. 2022. Variational Quantum Pulse Learning accepted to QCE 2022.
  • Jun. 2022. You are welcome to attend the Quantum Computer Systems Lecture Series.
  • May. 2022. DeepVS accepted to ACM-BCB 2022.
  • Apr. 2022. Quantum Antivirus accepted to NEHWS 2022.
  • Mar. 2022. RaM-SAR accepted to VLSI 2022.
  • Feb. 2022. QOC accepted to DAC 2022.
  • Feb. 2022. QuantumNAT accepted to DAC 2022.
  • Feb. 2022. A survey on efficient deep learning accepted to TODAES.
  • Oct. 2021. Check out our QNN library TorchQuantum.
  • Oct. 2021. “QuantumNAS: Noise-Adaptive Search for Robust Quantum Circuits” accepted to HPCA 2022.
  • Oct. 2021. “RoQNN: Noise-Aware Training for Robust Quantum Neural Networks” preprint available.
  • Aug. 2021. “QuantumNAS: Noise-Adaptive Search for Robust Quantum Circuits” preprint available.
  • Jul. 2021. “PointAcc: Efficient Point Cloud Accelerator” accepted to MICRO 2021.
  • Jun. 2021. Received the Qualcomm Innovation Fellowship.
  • Apr. 2021. Selected as one of the Global Top 100 Chinese Rising Stars in AI by Baidu.
  • Feb. 2021. The SpAtten project was selected as an MIT homepage spotlight.
  • Jan. 2021. Received the Baidu Scholarship.
  • Dec. 2020. Received the Analog Devices Outstanding Student Designer Award.
  • Nov. 2020. Selected as an Nvidia Graduate Fellowship Finalist.
  • Oct. 2020. “SpAtten: Efficient Sparse Attention Architecture with Cascade Token and Head Pruning” accepted to HPCA 2021.
  • Aug. 2020. Finished a great summer internship at Nvidia Research. Thanks to my incredible mentors!
  • Jul. 2020. Received the DAC Young Fellowship and Best Research Video Award.
  • Jun. 2020. One co-authored paper accepted to ECCV 2020.
  • May. 2020. HAT is open-sourced!
  • Apr. 2020. “MicroNet for Efficient Language Modeling” accepted to Journal of Machine Learning Research 2020.
  • Apr. 2020. “HAT: Hardware-Aware Transformers for Efficient Natural Language Processing” accepted to ACL 2020.
  • Mar. 2020. “APQ: Joint Search for Network Architecture, Pruning and Quantization Policy” accepted to CVPR 2020.
  • Feb. 2020. I gave a talk at Qualcomm Research Center on “GCN-RL Circuit Designer: Transferable Transistor Sizing With Graph Neural Networks and Reinforcement Learning”.
  • Feb. 2020. I gave a talk in HPCA 2020 on “SpArch: Efficient Architecture for Sparse Matrix Multiplication”.
  • Feb. 2020. “GCN-RL Circuit Designer: Transferable Transistor Sizing with Graph Neural Networks and Reinforcement Learning” accepted to DAC 2020.
  • Dec. 2019. I gave a talk on Efficient Language Modeling at the NeurIPS 2019 MicroNet Challenge.
  • Nov. 2019. “SpArch: Efficient Architecture for Sparse Matrix Multiplication” accepted to HPCA 2020.
  • Nov. 2019. I won the NeurIPS 2019 MicroNet Challenge; code open-sourced.