Xiaozhe Ren
Noah's Ark Lab, Huawei Technologies
Verified email at huawei.com
Title
Cited by
Year
NEZHA: Neural Contextualized Representation for Chinese Language Understanding
J Wei, X Ren, X Li, W Huang, Y Liao, Y Wang, J Lin, X Jiang, X Chen, ...
arXiv preprint arXiv:1909.00204, 2019
Cited by 30 · 2019
PanGu-α: Large-scale Autoregressive Pretrained Chinese Language Models with Auto-parallel Computation
W Zeng, X Ren, T Su, H Wang, Y Liao, Z Wang, X Jiang, ZZ Yang, K Wang, ...
arXiv preprint arXiv:2104.12369, 2021
Cited by 22 · 2021
SparseBERT: Rethinking the Importance Analysis in Self-attention
H Shi, J Gao, X Ren, H Xu, X Liang, Z Li, JT Kwok
arXiv preprint arXiv:2102.12871, 2021
Cited by 7 · 2021
AutoBERT-Zero: Evolving BERT Backbone from Scratch
J Gao, H Xu, X Ren, PLH Yu, X Liang, X Jiang, Z Li
arXiv preprint arXiv:2107.07445, 2021
Cited by 2 · 2021
Large-Scale Deep Learning Optimizations: A Comprehensive Survey
X He, F Xue, X Ren, Y You
arXiv preprint arXiv:2111.00856, 2021
2021
EfficientBERT: Progressively Searching Multilayer Perceptron via Warm-up Knowledge Distillation
C Dong, G Wang, H Xu, J Peng, X Ren, X Liang
arXiv preprint arXiv:2109.07222, 2021
2021
NumGPT: Improving Numeracy Ability of Generative Pre-trained Models
Z Jin, X Jiang, X Wang, Q Liu, Y Wang, X Ren, H Qu
arXiv preprint arXiv:2109.03137, 2021
2021