reporting error while trying to directly train Gtransformer on finetune datasets from scratch · Issue #5 · tencent-ailab/grover · GitHub
Hi,
I find your work really interesting and want to run the code myself. I want to see the performance of the GTransformer trained from scratch (without pretraining) on BBBP, so I removed the checkpoint path input:
python main.py finetune --data_path exampledata/finetune/bbbp.csv \
    --features_path exampledata/finetune/bbbp.npz \
    --save_dir model/finetune/bbbp/ \
    --dataset_type classification \
    --split_type scaffold_balanced \
    --ensemble_size 1 \
    --num_folds 3 \
    --no_features_scaling \
    --ffn_hidden_size 200 \
    --batch_size 32 \
    --epochs 10 \
    --init_lr 0.00015
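For context, training from scratch amounts to skipping the checkpoint restore and starting from random weights. Below is a minimal, hypothetical sketch of how a finetuning entry point could branch on whether a checkpoint path is supplied; the names (init_model, build_model, load_checkpoint) are illustrative and are not GROVER's actual API.

```python
from typing import Optional


def build_model(hidden_size: int) -> dict:
    """Stand-in for constructing a randomly initialized model."""
    return {"hidden_size": hidden_size, "pretrained": False}


def load_checkpoint(path: str) -> dict:
    """Stand-in for restoring pretrained weights from disk."""
    return {"hidden_size": 200, "pretrained": True, "path": path}


def init_model(checkpoint_path: Optional[str], hidden_size: int = 200) -> dict:
    # When no checkpoint path is given, fall back to random
    # initialization instead of failing.
    if checkpoint_path is None:
        return build_model(hidden_size)
    return load_checkpoint(checkpoint_path)


# Training from scratch: no checkpoint supplied.
scratch = init_model(None)
print(scratch["pretrained"])  # False
```

If the real script assumes a checkpoint path is always present, removing the flag would surface as an error at the point where the missing path is first dereferenced; an explicit fallback like the one above avoids that.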
However, it reports an error: