Yichun Yin
Huawei Noah's Ark Lab
TinyBERT: Distilling BERT for Natural Language Understanding
X Jiao, Y Yin, L Shang, X Jiang, X Chen, L Li, F Wang, Q Liu
Findings of EMNLP 2020 (most influential paper of EMNLP 2020), 2019
Unsupervised word and dependency path embeddings for aspect term extraction
Y Yin, F Wei, L Dong, K Xu, M Zhang, M Zhou
IJCAI 2016, 2016
Document-level multi-aspect sentiment classification as machine comprehension
Y Yin, Y Song, M Zhang
EMNLP 2017, 2044-2054, 2017
TernaryBERT: Distillation-aware Ultra-low Bit BERT
W Zhang, L Hou, Y Yin, L Shang, X Chen, X Jiang, Q Liu
EMNLP 2020, 2020
Dialog State Tracking with Reinforced Data Augmentation
Y Yin, L Shang, X Jiang, X Chen, Q Liu
AAAI 2020, 2019
NNEMBs at SemEval-2017 Task 4: Neural Twitter sentiment classification: a simple ensemble method with different embeddings
Y Yin, Y Song, M Zhang
Proceedings of the 11th International Workshop on Semantic Evaluation …, 2017
Socialized Word Embeddings
Z Zeng, Y Yin, Y Song, M Zhang
IJCAI, 3915-3921, 2017
Splusplus: a feature-rich two-stage classifier for sentiment analysis of tweets
L Dong, F Wei, Y Yin, M Zhou, K Xu
Proceedings of the 9th International Workshop on Semantic Evaluation …, 2015
Generate & Rank: A Multi-task Framework for Math Word Problems
J Shen, Y Yin, L Li, L Shang, X Jiang, M Zhang, Q Liu
EMNLP 2021 findings, 2021
AutoTinyBERT: Automatic Hyper-parameter Optimization for Efficient Pre-trained Language Models
Y Yin, C Chen, L Shang, X Jiang, X Chen, Q Liu
ACL 2021, 2021
Improving Task-Agnostic BERT Distillation with Layer Mapping Search
X Jiao, H Chang, Y Yin, L Shang, X Jiang, X Chen, L Li, F Wang, Q Liu
Neurocomputing 2021, 2020
PoD: Positional Dependency-Based Word Embedding for Aspect Term Extraction
Y Yin, C Wang, M Zhang
COLING 2020, 2019
bert2BERT: Towards Reusable Pretrained Language Models
C Chen, Y Yin, L Shang, X Jiang, Y Qin, F Wang, Z Wang, X Chen, Z Liu, ...
ACL 2022, 2021
Extract then Distill: Efficient and Effective Task-Agnostic BERT Distillation
C Chen, Y Yin, L Shang, Z Wang, X Jiang, X Chen, Q Liu
ICANN 2021, 2021
Text processing model training method, and text processing method and apparatus
Y Yin, L Shang, X Jiang, X Chen
US Patent App. 17/682,145, 2022
LightMBERT: A Simple Yet Effective Method for Multilingual BERT Distillation
X Jiao, Y Yin, L Shang, X Jiang, X Chen, L Li, F Wang, Q Liu
arXiv preprint arXiv:2103.06418, 2021
More Chinese women needed to hold up half the computing sky
M Zhang, Y Yin
Proceedings of the ACM Turing Celebration Conference-China, 1-4, 2019
Integrating Regular Expressions with Neural Networks via DFA
S Li, Q Liu, X Jiang, Y Yin, C Sun, B Liu, Z Ji, L Shang
arXiv preprint arXiv:2109.02882, 2021
The Solution of Huawei Cloud & Noah’s Ark Lab to the NLPCC-2020 Challenge: Light Pre-Training Chinese Language Model for NLP Task
Y Zhang, J Yu, K Wang, Y Yin, C Chen, Q Liu
CCF International Conference on Natural Language Processing and Chinese …, 2020
Y Yin, M Zhang
Journal of Chinese Information Processing (中文信息学报) 32 (11), 112-116, 2018