chemprop.nn.loss
Module Contents
Classes

LossFunction | Base class for loss functions; applies task weights and reduces to a scalar loss.
MSELoss | Mean squared error loss.
BoundedMSELoss | MSE loss that does not penalize predictions already satisfying inequality ("<" / ">") targets.
MVELoss | Calculate the loss using Eq. 9 from [nix1994].
EvidentialLoss | Calculate the loss using Eqs. 8, 9, and 10 from [amini2020].
BCELoss | Binary cross-entropy loss.
CrossEntropyLoss | Cross-entropy loss for multiclass classification.
MccMixin | Calculate a soft Matthews correlation coefficient ([mccWiki]) loss based on the implementation of [mccSklearn].
BinaryMCCLoss | Soft MCC loss for binary classification.
MulticlassMCCLoss | Soft MCC loss for multiclass classification.
DirichletMixin | Uses the loss function from [sensoy2018] based on the implementation at [sensoyGithub].
BinaryDirichletLoss | Dirichlet evidential loss for binary classification.
MulticlassDirichletLoss | Dirichlet evidential loss for multiclass classification.
SIDLoss | Spectral information divergence loss for spectra targets.
WassersteinLoss | Wasserstein (earth mover's distance) loss for spectra targets.
Attributes

LossFunctionRegistry
- class chemprop.nn.loss.LossFunction(task_weights=1.0)
Bases: torch.nn.Module
Base class for all neural network modules.
Your models should also subclass this class.
Modules can also contain other Modules, allowing them to be nested in a tree structure. You can assign the submodules as regular attributes:

    import torch.nn as nn
    import torch.nn.functional as F

    class Model(nn.Module):
        def __init__(self):
            super().__init__()
            self.conv1 = nn.Conv2d(1, 20, 5)
            self.conv2 = nn.Conv2d(20, 20, 5)

        def forward(self, x):
            x = F.relu(self.conv1(x))
            return F.relu(self.conv2(x))

Submodules assigned in this way will be registered, and their parameters will be converted too when you call to(), etc.
Note: As per the example above, an __init__() call to the parent class must be made before assignment on the child.
- Variables:
training (bool) – Boolean representing whether this module is in training or evaluation mode.
- Parameters:
task_weights (numpy.typing.ArrayLike)
- forward(preds, targets, mask, weights, lt_mask, gt_mask)
Calculate the mean loss function value given predicted and target values.
- Parameters:
preds (Tensor) – a tensor of shape b x (t * s) (regression), b x t (binary classification), or b x t x c (multiclass classification) containing the predictions, where b is the batch size, t is the number of tasks to predict, s is the number of targets to predict for each task, and c is the number of classes.
targets (Tensor) – a float tensor of shape b x t containing the target values
mask (Tensor) – a boolean tensor of shape b x t indicating whether the given prediction should be included in the loss calculation
weights (Tensor) – a tensor of shape b or b x 1 containing the per-sample weight
lt_mask (Tensor) – a boolean tensor of shape b x t indicating whether the target is a "less than" bound
gt_mask (Tensor) – a boolean tensor of shape b x t indicating whether the target is a "greater than" bound
- Returns:
a scalar containing the fully reduced loss
- Return type:
Tensor
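A minimal usage sketch, built only from the shapes documented above (MSELoss stands in as a concrete subclass, and all tensor values are illustrative):

    import torch
    from chemprop.nn.loss import MSELoss

    b, t = 4, 2  # batch size, number of tasks
    criterion = MSELoss(task_weights=torch.ones(t))

    preds = torch.randn(b, t)                      # regression predictions (s = 1)
    targets = torch.randn(b, t)                    # target values
    mask = torch.ones(b, t, dtype=torch.bool)      # include every prediction
    weights = torch.ones(b, 1)                     # uniform per-sample weights
    lt_mask = torch.zeros(b, t, dtype=torch.bool)  # no "less than" targets
    gt_mask = torch.zeros(b, t, dtype=torch.bool)  # no "greater than" targets

    loss = criterion(preds, targets, mask, weights, lt_mask, gt_mask)
    print(loss.shape)  # torch.Size([]) -- a scalar, ready for loss.backward()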
- chemprop.nn.loss.LossFunctionRegistry
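LossFunctionRegistry maps string aliases to the LossFunction subclasses in this module, so a loss can be selected from configuration. A small sketch, assuming dict-like access (the "mse" alias is an assumption; print the keys to see what is actually registered):

    from chemprop.nn.loss import LossFunctionRegistry

    print(sorted(LossFunctionRegistry.keys()))  # inspect the registered aliases
    loss_cls = LossFunctionRegistry["mse"]      # look up a class by alias (assumed key)
    criterion = loss_cls(task_weights=1.0)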
- class chemprop.nn.loss.MSELoss(task_weights=1.0)
Bases: LossFunction
- Parameters:
task_weights (numpy.typing.ArrayLike)
- class chemprop.nn.loss.BoundedMSELoss(task_weights=1.0)
Bases: MSELoss
- Parameters:
task_weights (numpy.typing.ArrayLike)
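The bounded variant is useful when some targets are inequality bounds (e.g., an assay value reported only as "< 1"). A sketch of the clamping idea, written as a standalone function (an illustration of the technique, not the class's verbatim implementation):

    import torch

    def bounded_mse(preds, targets, lt_mask, gt_mask):
        # If the target is a "less than" bound and the prediction already
        # falls below it, snap the prediction to the target so no loss accrues.
        preds = torch.where((preds < targets) & lt_mask, targets, preds)
        # Likewise for "greater than" bounds that are already satisfied.
        preds = torch.where((preds > targets) & gt_mask, targets, preds)
        return (preds - targets) ** 2  # element-wise squared error, reduced by the caller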
- class chemprop.nn.loss.MVELoss(task_weights=1.0)
Bases: LossFunction
Calculate the loss using Eq. 9 from [nix1994].
References
[nix1994] Nix, D. A.; Weigend, A. S. “Estimating the mean and variance of the target probability distribution.” Proceedings of the 1994 IEEE International Conference on Neural Networks, 1994. https://doi.org/10.1109/icnn.1994.374138
- Parameters:
task_weights (numpy.typing.ArrayLike)
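Eq. 9 of [nix1994] is the Gaussian negative log-likelihood, with the network predicting both a mean μ and a variance σ² for each target. Reconstructed from the reference (additive constants may differ in the implementation):

    \mathcal{L}(y;\mu,\sigma^2) = \tfrac{1}{2}\ln\bigl(2\pi\sigma^2\bigr) + \frac{(y-\mu)^2}{2\sigma^2}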
- class chemprop.nn.loss.EvidentialLoss(task_weights=None, v_kl=0.2, eps=1e-08)
Bases: LossFunction
Calculate the loss using Eqs. 8, 9, and 10 from [amini2020].
References
[amini2020] Amini, A.; Schwarting, W.; Soleimany, A.; Rus, D. “Deep Evidential Regression.” Advances in Neural Information Processing Systems, 2020, Vol. 33. https://proceedings.neurips.cc/paper_files/paper/2020/file/aab085461de182608ee9f607f3f7d18f-Paper.pdf
[soleimany2021] Soleimany, A. P.; Amini, A.; Goldman, S.; Rus, D.; Bhatia, S. N.; Coley, C. W. “Evidential Deep Learning for Guided Molecular Property Prediction and Discovery.” ACS Cent. Sci. 2021, 7, 8, 1356–1367. https://doi.org/10.1021/acscentsci.1c00546
- Parameters:
task_weights (torch.Tensor | None)
v_kl (float)
eps (float)
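For reference, Eqs. 8–10 of [amini2020] place a Normal-Inverse-Gamma prior with parameters (γ, ν, α, β) over the predicted Gaussian and combine its negative log-likelihood with an evidence-weighted regularizer. Reconstructed from the cited paper (interpreting v_kl as the trade-off coefficient λ and eps as a numerical-stability guard):

    \Omega = 2\beta(1+\nu)

    \mathcal{L}^{\mathrm{NLL}} = \tfrac{1}{2}\ln\frac{\pi}{\nu} - \alpha\ln\Omega
        + \left(\alpha+\tfrac{1}{2}\right)\ln\bigl((y-\gamma)^2\nu + \Omega\bigr)
        + \ln\frac{\Gamma(\alpha)}{\Gamma(\alpha+\frac{1}{2})}

    \mathcal{L}^{\mathrm{R}} = |y-\gamma|\,(2\nu+\alpha)

    \mathcal{L} = \mathcal{L}^{\mathrm{NLL}} + \lambda\,\mathcal{L}^{\mathrm{R}}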
- class chemprop.nn.loss.BCELoss(task_weights=1.0)
Bases: LossFunction
- Parameters:
task_weights (numpy.typing.ArrayLike)
- class chemprop.nn.loss.CrossEntropyLoss(task_weights=1.0)
Bases: LossFunction
- Parameters:
task_weights (numpy.typing.ArrayLike)
- class chemprop.nn.loss.MccMixin
Calculate a soft Matthews correlation coefficient ([mccWiki]) loss for multiclass classification based on the implementation of [mccSklearn]
References
[mccWiki] “Phi coefficient.” Wikipedia. https://en.wikipedia.org/wiki/Phi_coefficient
[mccSklearn] sklearn.metrics.matthews_corrcoef, scikit-learn documentation. https://scikit-learn.org/stable/modules/generated/sklearn.metrics.matthews_corrcoef.html
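A sketch of the "soft" MCC idea for the binary case (an illustration of the technique, not the mixin's verbatim code): fuzzy confusion-matrix counts are accumulated from predicted probabilities rather than hard labels, so the loss 1 − MCC stays differentiable:

    import torch

    def soft_binary_mcc_loss(probs, targets, eps=1e-8):
        # Fuzzy confusion-matrix entries: probabilities replace hard 0/1 predictions.
        tp = (probs * targets).sum()
        fp = (probs * (1 - targets)).sum()
        tn = ((1 - probs) * (1 - targets)).sum()
        fn = ((1 - probs) * targets).sum()
        mcc = (tp * tn - fp * fn) / torch.sqrt(
            (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn) + eps
        )
        return 1 - mcc  # perfect correlation gives zero loss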
- class chemprop.nn.loss.BinaryMCCLoss(task_weights=1.0)
Bases: LossFunction, MccMixin
- Parameters:
task_weights (numpy.typing.ArrayLike)
- class chemprop.nn.loss.MulticlassMCCLoss(task_weights=1.0)
Bases: LossFunction, MccMixin
- Parameters:
task_weights (numpy.typing.ArrayLike)
- class chemprop.nn.loss.DirichletMixin(task_weights=None, v_kl=0.2)
Uses the loss function from [sensoy2018] based on the implementation at [sensoyGithub]
References
[sensoy2018] Sensoy, M.; Kaplan, L.; Kandemir, M. “Evidential deep learning to quantify classification uncertainty.” NeurIPS, 2018, 31. https://doi.org/10.48550/arXiv.1806.01768
- Parameters:
task_weights (torch.Tensor | None)
v_kl (float)
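In [sensoy2018], the network outputs non-negative evidence that parameterizes a Dirichlet distribution with parameters α over the class probabilities. The loss is the expected sum of squared errors under that Dirichlet plus a KL regularizer toward the uniform Dirichlet on the "misleading" evidence. Reconstructed from the cited paper (interpreting v_kl as the KL weight λ):

    \mathcal{L}_i = \sum_{k=1}^{K}\left[(y_{ik}-\hat{p}_{ik})^2
        + \frac{\hat{p}_{ik}(1-\hat{p}_{ik})}{S_i+1}\right]
        + \lambda\,\mathrm{KL}\!\left(\mathrm{Dir}(\tilde{\boldsymbol\alpha}_i)\,\Vert\,\mathrm{Dir}(\mathbf{1})\right)

where \hat{p}_{ik} = \alpha_{ik}/S_i, S_i = \sum_k \alpha_{ik}, and \tilde{\boldsymbol\alpha}_i = \mathbf{y}_i + (1-\mathbf{y}_i)\odot\boldsymbol\alpha_i removes the evidence assigned to the true class.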
- class chemprop.nn.loss.BinaryDirichletLoss(task_weights=None, v_kl=0.2)
Bases: DirichletMixin, LossFunction
Uses the loss function from [sensoy2018] based on the implementation at [sensoyGithub]
- Parameters:
task_weights (torch.Tensor | None)
v_kl (float)
- class chemprop.nn.loss.MulticlassDirichletLoss(task_weights=None, v_kl=0.2)
Bases: DirichletMixin, LossFunction
Uses the loss function from [sensoy2018] based on the implementation at [sensoyGithub]
- Parameters:
task_weights (torch.Tensor | None)
v_kl (float)
- class chemprop.nn.loss.SIDLoss(task_weights=None, threshold=None)
Bases: LossFunction
- Parameters:
task_weights (torch.Tensor | None)
threshold (float | None)
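SID here is, by a reasonable reading of the name, the spectral information divergence used for spectra targets: both spectra are normalized to unit sum and a symmetric KL-style divergence is taken bin-wise. A sketch under that assumption (the threshold argument presumably clamps small values for numerical stability):

    import torch

    def sid_loss(preds, targets, threshold=1e-8):
        # Clamp to keep the logarithms finite, then normalize each
        # spectrum so it can be treated as a probability distribution.
        preds = preds.clamp(min=threshold)
        preds = preds / preds.sum(dim=1, keepdim=True)
        targets = targets.clamp(min=threshold)
        targets = targets / targets.sum(dim=1, keepdim=True)
        # Symmetric information divergence, summed over spectral bins.
        return (
            preds * torch.log(preds / targets)
            + targets * torch.log(targets / preds)
        ).sum(dim=1)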
- class chemprop.nn.loss.WassersteinLoss(task_weights=None, threshold=None)
Bases: LossFunction
- Parameters:
task_weights (torch.Tensor | None)
threshold (float | None)
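For 1-D spectra, the Wasserstein (earth mover's) distance reduces to the absolute difference of cumulative sums. A minimal sketch under that assumption (not necessarily the class's exact normalization):

    import torch

    def wasserstein_loss(preds, targets):
        # Normalize both spectra to unit mass, then compare their CDFs.
        preds = preds / preds.sum(dim=1, keepdim=True)
        targets = targets / targets.sum(dim=1, keepdim=True)
        return torch.abs(
            torch.cumsum(targets, dim=1) - torch.cumsum(preds, dim=1)
        ).sum(dim=1)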