Siyu Ren
Verified email at sjtu.edu.cn - Homepage
Title
Cited by
Year
Multi-turn response selection using dialogue dependency relations
Q Jia, Y Liu, S Ren, KQ Zhu, H Tang
EMNLP 2020, 2020
34
2020
Knowledge-driven distractor generation for cloze-style multiple choice questions
S Ren, KQ Zhu
Proceedings of the AAAI conference on artificial intelligence 35 (5), 4339-4347, 2021
28
2021
Taxonomy of Abstractive Dialogue Summarization: Scenarios, Approaches, and Future Directions
Q Jia, Y Liu, S Ren, KQ Zhu
ACM Computing Surveys 56 (3), 1-38, 2023
5
2023
Symbol-LLM: Towards foundational symbol-centric interface for large language models
F Xu, Z Wu, Q Sun, S Ren, F Yuan, S Yuan, Q Lin, Y Qiao, J Liu
arXiv preprint arXiv:2311.09278, 2023
4
2023
Leaner and Faster: Two-Stage Model Compression for Lightweight Text-Image Retrieval
S Ren, KQ Zhu
NAACL 2022, 2022
4
2022
Zero-shot faithfulness evaluation for text summarization with foundation language model
Q Jia, S Ren, Y Liu, KQ Zhu
arXiv preprint arXiv:2310.11648, 2023
3
2023
Context compression for auto-regressive transformers with sentinel tokens
S Ren, Q Jia, KQ Zhu
arXiv preprint arXiv:2310.08152, 2023
3
2023
Low-rank prune-and-factorize for language model compression
S Ren, KQ Zhu
arXiv preprint arXiv:2306.14152, 2023
1
2023
Pruning Pre-trained Language Models with Principled Importance and Self-regularization
S Ren, KQ Zhu
arXiv preprint arXiv:2305.12394, 2023
1
2023
On the Efficacy of Eviction Policy for Key-Value Constrained Generative Language Model Inference
S Ren, KQ Zhu
arXiv preprint arXiv:2402.06262, 2024
2024
EMO: Earth Mover Distance Optimization for Auto-Regressive Language Modeling
S Ren, Z Wu, KQ Zhu
arXiv preprint arXiv:2310.04691, 2023
2023
Context Compression for Auto-regressive Transformers with Sentinel Tokens
S Ren, Q Jia, KQ Zhu
EMNLP 2023, 2023
2023
Combating Short Circuit Behavior in Natural Language Reasoning: Crossover and Mutation Operations for Enhanced Robustness
S Huang, S Ren, KQ Zhu
2023
Specializing Pre-trained Language Models for Better Relational Reasoning via Network Pruning
S Ren, KQ Zhu
Findings of the Association for Computational Linguistics: NAACL 2022, 2195-2207, 2022
2022
Articles 1–14