Xiaolong Huang
Verified email at microsoft.com
Title · Cited by · Year
Text embeddings by weakly-supervised contrastive pre-training
L Wang, N Yang, X Huang, B Jiao, L Yang, D Jiang, R Majumder, F Wei
arXiv preprint arXiv:2212.03533, 2022
Cited by 156 · 2022
Simlm: Pre-training with representation bottleneck for dense passage retrieval
L Wang, N Yang, X Huang, B Jiao, L Yang, D Jiang, R Majumder, F Wei
arXiv preprint arXiv:2207.02578, 2022
Cited by 59 · 2022
Improving text embeddings with large language models
L Wang, N Yang, X Huang, L Yang, R Majumder, F Wei
arXiv preprint arXiv:2401.00368, 2023
Cited by 34 · 2023
Lexmae: Lexicon-bottlenecked pretraining for large-scale retrieval
T Shen, X Geng, C Tao, C Xu, X Huang, B Jiao, L Yang, D Jiang
arXiv preprint arXiv:2208.14754, 2022
Cited by 28 · 2022
Large Search Model: Redefining Search Stack in the Era of LLMs
L Wang, N Yang, X Huang, L Yang, R Majumder, F Wei
ACM SIGIR Forum 57 (2), 1-16, 2024
Cited by 3 · 2024
Effective and efficient query-aware snippet extraction for web search
J Yi, F Wu, C Wu, X Huang, B Jiao, G Sun, X Xie
arXiv preprint arXiv:2210.08809, 2022
Cited by 3 · 2022
Multilingual e5 text embeddings: A technical report
L Wang, N Yang, X Huang, L Yang, R Majumder, F Wei
arXiv preprint arXiv:2402.05672, 2024
Cited by 2 · 2024