- I'm currently working on MoE (Mixture-of-Experts) and Multilingual LLMs at Nanjing University
- Now located in Shanghai
- How to reach me: [email protected]
- Ask me about anything interesting
- You can see me at https://zjwang21.github.io/
- Breaking the Representation Bottleneck of Chinese Characters: Neural Machine Translation with Stroke Sequence Modeling
- MoE-LPR: Multilingual Extension of Large Language Models through Mixture-of-Experts with Language Priors Routing
- Investigating and Scaling up Code-Switching for Multilingual Language Model Pre-Training