Ziqing Yang
iFLYTEK Research
Verified email at iflytek.com
Title
Cited by
Year
Pre-training with whole word masking for Chinese BERT
Y Cui, W Che, T Liu, B Qin, Z Yang
IEEE/ACM Transactions on Audio, Speech, and Language Processing 29, 3504-3514, 2021
Cited by 630 · 2021
TextBrewer: an open-source knowledge distillation toolkit for natural language processing
Z Yang, Y Cui, Z Chen, W Che, T Liu, S Wang, G Hu
arXiv preprint arXiv:2002.12620, 2020
Cited by 27 · 2020
On the evaporation of solar dark matter: spin-independent effective operators
ZL Liang, YL Wu, ZQ Yang, YF Zhou
Journal of Cosmology and Astroparticle Physics 2016 (09), 018, 2016
Cited by 20 · 2016
Pre-training with whole word masking for Chinese BERT
Y Cui, W Che, T Liu, B Qin, Z Yang, S Wang, G Hu
arXiv preprint arXiv:1906.08101, 2019
Cited by 19
Benchmarking robustness of machine reading comprehension models
C Si, Z Yang, Y Cui, W Ma, T Liu, S Wang
arXiv preprint arXiv:2004.14004, 2020
Cited by 18 · 2020
Improving machine reading comprehension via adversarial training
Z Yang, Y Cui, W Che, T Liu, S Wang, G Hu
arXiv preprint arXiv:1911.03614, 2019
Cited by 16 · 2019
PERT: pre-training BERT with permuted language model
Y Cui, Z Yang, T Liu
arXiv preprint arXiv:2203.06906, 2022
Cited by 14 · 2022
The leptophilic dark matter in the Sun: the minimum testable mass
ZL Liang, YL Tang, ZQ Yang
Journal of Cosmology and Astroparticle Physics 2018 (10), 035, 2018
Cited by 10 · 2018
A sentence cloze dataset for Chinese machine reading comprehension
Y Cui, T Liu, Z Yang, Z Chen, W Ma, W Che, S Wang, G Hu
arXiv preprint arXiv:2004.03116, 2020
Cited by 9 · 2020
Critical behaviors and universality classes of percolation phase transitions on two-dimensional square lattice
Y Zhu, ZQ Yang, X Zhang, XS Chen
Communications in Theoretical Physics 64 (2), 231, 2015
Cited by 8 · 2015
Pre-Training with Whole Word Masking for Chinese BERT
Y Cui, W Che, T Liu, B Qin, Z Yang, S Wang, G Hu
arXiv preprint arXiv:1906.08101, 2019
Cited by 5 · 2019
CINO: A Chinese Minority Pre-trained Language Model
Z Yang, Z Xu, Y Cui, B Wang, M Lin, D Wu, Z Chen
arXiv preprint arXiv:2202.13558, 2022
Cited by 4 · 2022
Interactive gated decoder for machine reading comprehension
Y Cui, W Che, Z Yang, T Liu, B Qin, S Wang, G Hu
Transactions on Asian and Low-Resource Language Information Processing 21 (4 …, 2022
Cited by 4 · 2022
Criticality of networks with long-range connections
ZQ Yang, MX Liu, XS Chen
Science China Physics, Mechanics, and Astronomy 60 (2), 20521, 2017
Cited by 2 · 2017
HIT at SemEval-2022 Task 2: Pre-trained language model for idioms detection
Z Chu, Z Yang, Y Cui, Z Chen, M Liu
arXiv preprint arXiv:2204.06145, 2022
Cited by 1 · 2022
TextPruner: A model pruning toolkit for pre-trained language models
Z Yang, Y Cui, Z Chen
arXiv preprint arXiv:2203.15996, 2022
Cited by 1 · 2022
Cross-lingual text classification with multilingual distillation and zero-shot-aware training
Z Yang, Y Cui, Z Chen, S Wang
arXiv preprint arXiv:2202.13654, 2022
Cited by 1 · 2022
Adversarial training for machine reading comprehension with virtual embeddings
Z Yang, Y Cui, C Si, W Che, T Liu, S Wang, G Hu
arXiv preprint arXiv:2106.04437, 2021
Cited by 1 · 2021
Bilingual Alignment Pre-Training for Zero-Shot Cross-Lingual Transfer
Z Yang, W Ma, Y Cui, J Ye, W Che, S Wang
arXiv preprint arXiv:2106.01732, 2021
Cited by 1 · 2021
Gradient-based Intra-attention Pruning on Pre-trained Language Models
Z Yang, Y Cui, X Yao, S Wang
arXiv preprint arXiv:2212.07634, 2022
2022
Articles 1–20