Graham Neubig
Associate Professor of Computer Science, Carnegie Mellon University
Verified email at cs.cmu.edu - Homepage
Title
Cited by
Year
A Syntactic Neural Model for General-Purpose Code Generation
P Yin, G Neubig
ACL 2017, 2017
Cited by 506 (2017)
Are Sixteen Heads Really Better than One?
P Michel, O Levy, G Neubig
NeurIPS 2019, 2019
Cited by 443 (2019)
XTREME: A massively multilingual multi-task benchmark for evaluating cross-lingual generalization
J Hu, S Ruder, A Siddhant, G Neubig, O Firat, M Johnson
ICML 2020, 2020
Cited by 424 (2020)
Dynet: The dynamic neural network toolkit
G Neubig, C Dyer, Y Goldberg, A Matthews, W Ammar, A Anastasopoulos, ...
arXiv preprint arXiv:1701.03980, 2017
Cited by 408* (2017)
Pre-train, prompt, and predict: A systematic survey of prompting methods in natural language processing
P Liu, W Yuan, J Fu, Z Jiang, H Hayashi, G Neubig
arXiv preprint arXiv:2107.13586, 2021
Cited by 340 (2021)
How can we know what language models know?
Z Jiang, FF Xu, J Araki, G Neubig
TACL 8, 423-438, 2020
Cited by 314 (2020)
Pointwise prediction for robust, adaptable Japanese morphological analysis
G Neubig, Y Nakata, S Mori
ACL 2011, 529-533, 2011
Cited by 307 (2011)
When and Why are Pre-trained Word Embeddings Useful for Neural Machine Translation?
Y Qi, DS Sachan, M Felix, SJ Padmanabhan, G Neubig
NAACL 2018, 2018
Cited by 269 (2018)
Learning to generate pseudo-code from source code using statistical machine translation (t)
Y Oda, H Fudaba, G Neubig, H Hata, S Sakti, T Toda, S Nakamura
ASE 2015, 574-584, 2015
Cited by 254 (2015)
Lagging Inference Networks and Posterior Collapse in Variational Autoencoders
J He, D Spokoyny, G Neubig, T Berg-Kirkpatrick
ICLR 2019, 2019
Cited by 233 (2019)
Controllable Invariance through Adversarial Feature Learning
Q Xie, Z Dai, Y Du, E Hovy, G Neubig
NIPS 2017, 2017
Cited by 223 (2017)
Stress Test Evaluation for Natural Language Inference
A Naik, A Ravichander, N Sadeh, C Rose, G Neubig
COLING 2018, 2018
Cited by 210 (2018)
Incorporating discrete translation lexicons into neural machine translation
P Arthur, G Neubig, S Nakamura
EMNLP 2016, 2016
Cited by 189 (2016)
Neural machine translation and sequence-to-sequence models: A tutorial
G Neubig
arXiv preprint arXiv:1703.01619, 2017
Cited by 185 (2017)
Controlling output length in neural encoder-decoders
Y Kikuchi, G Neubig, R Sasano, H Takamura, M Okumura
EMNLP 2016, 2016
Cited by 183 (2016)
TaBERT: Pretraining for Joint Understanding of Textual and Tabular Data
P Yin, G Neubig, W Yih, S Riedel
ACL 2020, 2020
Cited by 178 (2020)
Stack-Pointer Networks for Dependency Parsing
X Ma, Z Hu, J Liu, N Peng, G Neubig, E Hovy
ACL 2018, 2018
Cited by 160 (2018)
Learning to translate in real-time with neural machine translation
J Gu, G Neubig, K Cho, VOK Li
EACL 2017, 2016
Cited by 158 (2016)
Competence-based Curriculum Learning for Neural Machine Translation
EA Platanios, O Stretcu, G Neubig, B Poczos, TM Mitchell
NAACL 2019, 2019
Cited by 154 (2019)
What Do Recurrent Neural Network Grammars Learn About Syntax?
A Kuncoro, M Ballesteros, L Kong, C Dyer, G Neubig, NA Smith
EACL 2017, 2017
Cited by 144 (2017)
Articles 1–20