Hyperparameter Optimization
The chemprop.hyperparameter_optimization module runs hyperparameter optimization on Chemprop models using Bayesian optimization.
- chemprop.hyperparameter_optimization.chemprop_hyperopt() → None [source]
Runs hyperparameter optimization for a Chemprop model.
This is the entry point for the command-line command chemprop_hyperopt.
- chemprop.hyperparameter_optimization.hyperopt(args: HyperoptArgs) → None [source]
Runs hyperparameter optimization on a Chemprop model.
Hyperparameter optimization tunes the following parameters:
hidden_size: The hidden size of the neural network layers, selected from {300, 400, …, 2400}.
depth: The number of message-passing iterations, selected from {2, 3, 4, 5, 6}.
dropout: The dropout probability, selected from {0.0, 0.05, …, 0.4}.
ffn_num_layers: The number of feed-forward layers after message passing, selected from {1, 2, 3}.
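The four discrete candidate sets above can be written out explicitly. The sketch below enumerates them and draws one configuration uniformly at random; this is only an illustration of the search space — Chemprop itself selects configurations with Bayesian optimization, not random sampling, and the names SEARCH_SPACE and sample_config are hypothetical.

```python
import random

# Candidate sets matching the search space described above.
# (Illustrative only: Chemprop uses Bayesian optimization over this space.)
SEARCH_SPACE = {
    "hidden_size": list(range(300, 2500, 100)),          # {300, 400, ..., 2400}
    "depth": [2, 3, 4, 5, 6],
    "dropout": [round(0.05 * i, 2) for i in range(9)],   # {0.0, 0.05, ..., 0.4}
    "ffn_num_layers": [1, 2, 3],
}

def sample_config(rng: random.Random) -> dict:
    """Draw one hyperparameter configuration uniformly at random."""
    return {name: rng.choice(values) for name, values in SEARCH_SPACE.items()}

config = sample_config(random.Random(0))
print(config)
```

Each sampled configuration is one point in a grid of 22 × 5 × 9 × 3 = 2970 possible combinations, which is why a guided search is preferable to exhaustive evaluation.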
The best set of hyperparameters is saved as a JSON file to args.config_save_path.
- Parameters:
args – A HyperoptArgs object containing arguments for hyperparameter optimization in addition to all arguments needed for training.
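Since the best hyperparameters are saved as JSON, the resulting config file can be read back with standard tooling. The sketch below writes and reloads such a file; the keys mirror the four optimized parameters listed above, but the specific values and the file name are made up for illustration — the real file is whatever Chemprop writes to args.config_save_path.

```python
import json
import tempfile
from pathlib import Path

# Illustrative best-hyperparameter values (not real optimization results);
# the keys correspond to the four parameters tuned by hyperopt().
best_hyperparams = {
    "hidden_size": 1200,
    "depth": 4,
    "dropout": 0.1,
    "ffn_num_layers": 2,
}

# Stand-in for args.config_save_path.
config_save_path = Path(tempfile.gettempdir()) / "chemprop_config.json"
config_save_path.write_text(json.dumps(best_hyperparams, indent=4))

# Reload the saved config, e.g. to inspect it or feed it back into training.
loaded = json.loads(config_save_path.read_text())
print(loaded)
```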