
finetune-transformer-lm (public): 2139 stars, 499 forks, 28 issues

Commits

List of commits on branch master.
a69b5c43b0452462890bca8ff92fb75dee9290cf (Verified)
Merge pull request #35 from christopherhesse/update-readme
cberner committed 6 years ago

c3f1dc4e67a0700e869cfd67ff0ac417f8784181 (Unverified)
update README with repo status
christopherhesse committed 6 years ago

ae7e86b55254339eba3929f786b04e82602cd51d (Unverified)
remove unused ema code
committed 6 years ago

bd1cf7d678926041e6d19193cab7e5cd8ce2fce6 (Unverified)
updated readme
committed 6 years ago

f7c13308472857da7317a8b6628e9e628d9f6a73 (Unverified)
ROCStories demo
committed 6 years ago

1eca5683bd98178c890aabcc0c4230289be256bd (Verified)
Initial commit
Newmu committed 6 years ago

README


Status: Archive (code is provided as-is, no updates expected)

finetune-transformer-lm

Code and model for the paper "Improving Language Understanding by Generative Pre-Training"

Currently this code implements the ROCStories Cloze Test result reported in the paper by running:

python train.py --dataset rocstories --desc rocstories --submit --analysis --data_dir [path to data here]
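
For example, assuming the downloaded ROCStories CSV files have been placed in a local directory named data/ (this path is only an illustrative placeholder, not something defined by the repository), the invocation would look like:

# data/ is a hypothetical location for the ROCStories files
python train.py --dataset rocstories --desc rocstories --submit --analysis --data_dir data/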

Note: The code is currently non-deterministic due to various GPU ops. The median accuracy of 10 runs with this codebase (using default hyperparameters) is 85.8% - slightly lower than the reported single run of 86.5% from the paper.

The ROCStories dataset can be downloaded from the associated website.