


#Autoregressive #BERT #nlp
Unsupervised representation learning has been highly successful in the domain of natural language processing [7, 19, 24, 25, 10]. Typically, these methods first pretrain neural networks on large-scale unlabeled text corpora, and then finetune the models or representations on downstream tasks.
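A minimal sketch of the pretrain-then-finetune workflow described above, assuming the Hugging Face transformers library; the "xlnet-base-cased" checkpoint, the two-label task, and the toy sentences are illustrative assumptions, not details from the paper.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Step 1: load a model already pretrained on large unlabeled corpora.
# (Checkpoint name is an assumption for illustration.)
tokenizer = AutoTokenizer.from_pretrained("xlnet-base-cased")
model = AutoModelForSequenceClassification.from_pretrained(
    "xlnet-base-cased", num_labels=2
)

# Step 2: finetune on a labeled downstream task.
# (Tiny hypothetical sentiment dataset, for illustration only.)
texts = ["great movie", "terrible plot"]
labels = torch.tensor([1, 0])
batch = tokenizer(texts, padding=True, return_tensors="pt")

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
outputs = model(**batch, labels=labels)  # supervised loss on the downstream task
outputs.loss.backward()
optimizer.step()
```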

Source: XLNet: Generalized Autoregressive Pretraining for Language Understanding (PDF), p. 1


