bert-base-uncased fine-tuned on CoLA (the Corpus of Linguistic Acceptability, from the GLUE benchmark).
Base model: bert-base-uncased
Dataset: CoLA (GLUE)
Training hyperparameters: batch size 32, learning rate 2e-5.
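
A minimal fine-tuning sketch using the Hugging Face transformers and datasets libraries, with the batch size and learning rate above; the epoch count, max sequence length, and output directory are assumptions not stated in this card.

```python
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # CoLA: acceptable vs. unacceptable
)

# CoLA: single-sentence acceptability judgments from GLUE.
raw = load_dataset("glue", "cola")

def tokenize(batch):
    # max_length=128 is an assumption, not taken from the card
    return tokenizer(batch["sentence"], truncation=True, max_length=128)

tokenized = raw.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="bert-base-uncased-cola",  # assumed name
    per_device_train_batch_size=32,       # batch size from the card
    learning_rate=2e-5,                   # learning rate from the card
    num_train_epochs=3,                   # assumption: not stated in the card
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
    tokenizer=tokenizer,  # enables dynamic padding via the default collator
)
trainer.train()
```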
Evaluation result: Matthews correlation (matthews_corr) = 0.6295.
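
A sketch of how the reported metric can be recomputed, assuming the `trainer` and `tokenized` objects from the sketch above; the card does not state which split the score was measured on, so the validation split here is an assumption, and the exact value will vary with seed and epoch count.

```python
from sklearn.metrics import matthews_corrcoef

# Predict on the CoLA validation split and score with Matthews correlation.
preds = trainer.predict(tokenized["validation"])
pred_labels = preds.predictions.argmax(axis=-1)
print(matthews_corrcoef(preds.label_ids, pred_labels))
```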