Xiaonan Li
Verified email at fudan.edu.cn - Homepage
Title / Cited by / Year
FLAT: Chinese NER using flat-lattice transformer
X Li, H Yan, X Qiu, X Huang
ACL 2020, 2020
Cited by 395
TENER: Adapting Transformer Encoder for Named Entity Recognition
H Yan, B Deng, X Li, X Qiu
arXiv preprint arXiv:1911.04474, 2019
Cited by 334
Learning Sparse Sharing Architectures for Multiple Tasks
T Sun, Y Shao, X Li, P Liu, H Yan, X Qiu, X Huang
AAAI 2020, 2019
Cited by 127
Does It Make Sense? And Why? A Pilot Study for Sense Making and Explanation
C Wang, S Liang, Y Zhang, X Li, T Gao
ACL 2019, 2019
Cited by 98
Backdoor attacks on pre-trained models by layerwise weight poisoning
L Li, D Song, X Li, J Zeng, R Ma, X Qiu
EMNLP 2021, 2021
Cited by 79
Finding support examples for in-context learning
X Li, X Qiu
EMNLP 2023 (findings), 2023
Cited by 48*
Unified Demonstration Retriever for In-Context Learning
X Li, K Lv, H Yan, T Lin, W Zhu, Y Ni, G Xie, X Wang, X Qiu
ACL 2023, 2023
Cited by 37
CodeRetriever: Large-scale Contrastive Pre-training for Code Search
X Li, Y Gong, Y Shen, X Qiu, H Zhang, B Yao, W Qi, D Jiang, W Chen, ...
EMNLP 2022, 2022
Cited by 26*
Accelerating BERT Inference for Sequence Labeling via Early-Exit
X Li, Y Shao, T Sun, H Yan, X Qiu, X Huang
ACL 2021, 2021
Cited by 26
An embarrassingly easy but strong baseline for nested named entity recognition
H Yan, Y Sun, X Li, X Qiu
ACL 2023, 2022
Cited by 20
BERT for Monolingual and Cross-Lingual Reverse Dictionary
H Yan, X Li, X Qiu
EMNLP 2020 (findings), 2020
Cited by 20
Soft-Labeled Contrastive Pre-training for Function-level Code Representation
X Li, D Guo, Y Gong, Y Lin, Y Shen, X Qiu, D Jiang, W Chen, N Duan
EMNLP 2022 (findings), 2022
Cited by 12
MoT: Memory-of-Thought Enables ChatGPT to Self-Improve
X Li, X Qiu
EMNLP 2023, 2023
Cited by 9*
LLatrieval: LLM-Verified Retrieval for Verifiable Generation
X Li, C Zhu, L Li, Z Yin, T Sun, X Qiu
NAACL 2024, 2023
Cited by 5
UTC-IE: A unified token-pair classification architecture for information extraction
H Yan, Y Sun, X Li, Y Zhou, X Huang, X Qiu
Cited by 2