Anh Tuan Nguyen
Verified email at microsoft.com
Title · Cited by · Year
BERTweet: A pre-trained language model for English Tweets
DQ Nguyen, T Vu, AT Nguyen
Proceedings of Empirical Methods in Natural Language Processing - EMNLP 2020 …, 2020
Cited by 823, 2020
PhoBERT: Pre-trained language models for Vietnamese
DQ Nguyen, AT Nguyen
Findings of Empirical Methods in Natural Language Processing - EMNLP 2020, 2020
Cited by 348, 2020
CodeT: Code generation with generated tests
B Chen, F Zhang, A Nguyen, D Zan, Z Lin, JG Lou, W Chen
International Conference on Learning Representations - ICLR 2023, 2022
Cited by 154, 2022
A Pilot Study of Text-to-SQL Semantic Parsing for Vietnamese
AT Nguyen, MH Dao, DQ Nguyen
Findings of Empirical Methods in Natural Language Processing - EMNLP 2020, 2020
Cited by 46, 2020
Phi-2: The surprising power of small language models
M Javaheripi, S Bubeck, M Abdin, J Aneja, S Bubeck, CCT Mendes, ...
Microsoft Research Blog, 2023
Cited by 31, 2023
Meet in the Middle: A New Pre-training Paradigm
A Nguyen, N Karampatziakis, W Chen
Conference on Neural Information Processing Systems - NeurIPS 2023, 2023
Cited by 9, 2023
ViDeBERTa: A powerful pre-trained language model for Vietnamese
CD Tran, NH Pham, A Nguyen, TS Hy, T Vu
European Chapter of the Association for Computational Linguistics - EACL 2023, 2023
Cited by 5, 2023
Phi-3 Technical Report: A Highly Capable Language Model Locally on Your Phone
M Abdin, SA Jacobs, AA Awan, J Aneja, A Awadallah, H Awadalla, ...
arXiv preprint arXiv:2404.14219, 2024
2024
TinyGSM: achieving 80% on GSM8k with one billion parameters
B Liu, S Bubeck, R Eldan, J Kulkarni, Y Li, A Nguyen, R Ward, Y Zhang
The 3rd Workshop on Mathematical Reasoning and AI at NeurIPS'23, 2023
2023
Neural Multigrid Memory For Computational Fluid Dynamics
DM Nguyen, MC Vu, TA Nguyen, T Huynh, NT Nguyen, TS Hy
arXiv preprint arXiv:2306.12545, 2023
2023
Articles 1–10