Ben Peters
Instituto de Telecomunicações
Verified email at uw.edu
Title · Cited by · Year
Sparse sequence-to-sequence models
B Peters, V Niculae, AFT Martins
arXiv preprint arXiv:1905.05702, 2019
99 · 2019
Massively multilingual neural grapheme-to-phoneme conversion
B Peters, J Dehdari, J van Genabith
arXiv preprint arXiv:1708.01464, 2017
39 · 2017
One-size-fits-all multilingual models
B Peters, AFT Martins
Proceedings of the 17th SIGMORPHON Workshop on Computational Research in …, 2020
17 · 2020
Interpretable structure induction via sparse attention
B Peters, V Niculae, AFT Martins
Proceedings of the 2018 EMNLP Workshop BlackboxNLP: Analyzing and …, 2018
14 · 2018
IT–IST at the SIGMORPHON 2019 shared task: Sparse two-headed models for inflection
B Peters, AFT Martins
Proceedings of the 16th Workshop on Computational Research in Phonetics …, 2019
12 · 2019
Smoothing and shrinking the sparse Seq2Seq search space
B Peters, AFT Martins
arXiv preprint arXiv:2103.10291, 2021
8 · 2021
Beyond characters: Subword-level morpheme segmentation
B Peters, AFT Martins
Proceedings of the 19th SIGMORPHON Workshop on Computational Research in …, 2022
1 · 2022
DeepSPIN: Deep Structured Prediction for Natural Language Processing
AFT Martins, B Peters, C Zerva, C Lyu, G Correia, M Treviso, P Martins, ...
Proceedings of the 23rd Annual Conference of the European Association for …, 2022
1 · 2022