This is a PyTorch version of fairseq, a sequence-to-sequence learning toolkit from Facebook AI Research. The original authors of this reimplementation are (in no particular order) Sergey Edunov, Myle ...
**Add a new model, len_pre_transformer.py. It takes advantage of the transformer's encoder and drops the decoder entirely. To predict sentence length, it stacks three fully connected layers above ...
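A minimal PyTorch sketch of the idea described above: an encoder-only transformer whose pooled output feeds a stack of three fully connected layers that classify the target sentence length. All class and parameter names here are illustrative assumptions, not the actual `len_pre_transformer.py` API.

```python
import torch
import torch.nn as nn

class LengthPredictor(nn.Module):
    """Hypothetical sketch: transformer encoder (no decoder) plus a
    three-layer fully connected head predicting sentence length."""

    def __init__(self, vocab_size=1000, d_model=64, nhead=4,
                 num_layers=2, max_len=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model, nhead, dim_feedforward=128, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        # three stacked fully connected layers on top of the encoder
        self.length_head = nn.Sequential(
            nn.Linear(d_model, d_model), nn.ReLU(),
            nn.Linear(d_model, d_model), nn.ReLU(),
            nn.Linear(d_model, max_len),  # logits over candidate lengths
        )

    def forward(self, tokens):
        h = self.encoder(self.embed(tokens))  # (batch, seq, d_model)
        pooled = h.mean(dim=1)                # mean-pool over positions
        return self.length_head(pooled)       # (batch, max_len) logits

model = LengthPredictor()
tokens = torch.randint(0, 1000, (2, 10))      # batch of 2 toy sentences
logits = model(tokens)
print(logits.shape)                           # torch.Size([2, 128])
```

Training would treat length prediction as classification over `max_len` buckets (cross-entropy on the logits); a regression head with a single output unit would be an equally plausible reading of the description.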