chemprop.nn.metrics#
Module Contents#
Classes#
Attributes#
- class chemprop.nn.metrics.Metric(task_weights=1.0)[source]#
Bases: chemprop.nn.loss.LossFunction
- Parameters:
task_weights (ArrayLike = 1.0) –
Important: Ignored. Maintained for compatibility with LossFunction.
- minimize: bool = True#
- forward(preds, targets, mask, weights, lt_mask, gt_mask)[source]#
Calculate the mean loss function value given predicted and target values
- Parameters:
preds (Tensor) – a tensor of shape b x (t * s) (regression), b x t (binary classification), or b x t x c (multiclass classification) containing the predictions, where b is the batch size, t is the number of tasks to predict, s is the number of targets to predict for each task, and c is the number of classes.
targets (Tensor) – a float tensor of shape b x t containing the target values
mask (Tensor) – a boolean tensor of shape b x t indicating whether the given prediction should be included in the loss calculation
weights (Tensor) – a tensor of shape b or b x 1 containing the per-sample weight
lt_mask (Tensor) – a boolean tensor of shape b x t indicating whether the target is a less-than bound (used only by bounded loss functions)
gt_mask (Tensor) – a boolean tensor of shape b x t indicating whether the target is a greater-than bound (used only by bounded loss functions)
- Returns:
a scalar containing the fully reduced loss
- Return type:
Tensor
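The reduction described above can be illustrated with a minimal plain-Python sketch (not the actual torch implementation): average a per-entry error over the entries where mask is True, weighting each sample. Squared error is used here purely as a stand-in loss.

```python
def masked_weighted_mean_loss(preds, targets, mask, weights):
    """Toy reduction mirroring the docstring above: average the per-entry
    squared error over entries where mask is True, weighting each sample.

    preds, targets, mask: b x t nested lists; weights: a length-b list.
    """
    num, den = 0.0, 0.0
    for p_row, t_row, m_row, w in zip(preds, targets, mask, weights):
        for p, t, m in zip(p_row, t_row, m_row):
            if m:
                num += w * (p - t) ** 2
                den += w
    return num / den

loss = masked_weighted_mean_loss(
    preds=[[1.0, 2.0], [3.0, 0.0]],
    targets=[[1.0, 4.0], [2.0, 0.0]],
    mask=[[True, True], [True, False]],  # last entry excluded from the mean
    weights=[1.0, 1.0],
)
print(loss)
```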
- chemprop.nn.metrics.MetricRegistry#
- class chemprop.nn.metrics.MAEMetric(task_weights=1.0)[source]#
Bases: Metric
- Parameters:
task_weights (ArrayLike = 1.0) –
Important: Ignored. Maintained for compatibility with LossFunction.
- class chemprop.nn.metrics.MSEMetric(task_weights=1.0)[source]#
Bases: chemprop.nn.loss.MSELoss, Metric
Base class for all neural network modules.
Your models should also subclass this class.
Modules can also contain other Modules, allowing them to be nested in a tree structure. You can assign the submodules as regular attributes:

import torch.nn as nn
import torch.nn.functional as F

class Model(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 20, 5)
        self.conv2 = nn.Conv2d(20, 20, 5)

    def forward(self, x):
        x = F.relu(self.conv1(x))
        return F.relu(self.conv2(x))

Submodules assigned in this way will be registered, and their parameters will be converted too when you call to(), etc.
Note: As per the example above, an __init__() call to the parent class must be made before assignment on the child.
- Variables:
training (bool) – a boolean representing whether this module is in training or evaluation mode.
- Parameters:
task_weights (numpy.typing.ArrayLike)
- class chemprop.nn.metrics.RMSEMetric(task_weights=1.0)[source]#
Bases: MSEMetric
- Parameters:
task_weights (numpy.typing.ArrayLike)
- forward(preds, targets, mask, weights, lt_mask, gt_mask)[source]#
Calculate the mean loss function value given predicted and target values
- Parameters:
preds (Tensor) – a tensor of shape b x (t * s) (regression), b x t (binary classification), or b x t x c (multiclass classification) containing the predictions, where b is the batch size, t is the number of tasks to predict, s is the number of targets to predict for each task, and c is the number of classes.
targets (Tensor) – a float tensor of shape b x t containing the target values
mask (Tensor) – a boolean tensor of shape b x t indicating whether the given prediction should be included in the loss calculation
weights (Tensor) – a tensor of shape b or b x 1 containing the per-sample weight
lt_mask (Tensor)
gt_mask (Tensor)
- Returns:
a scalar containing the fully reduced loss
- Return type:
Tensor
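RMSE is the square root of the masked mean squared error. A plain-Python sketch of the unweighted case (not the chemprop implementation, which operates on batched tensors):

```python
import math

def rmse(preds, targets, mask):
    # Mean squared error over unmasked entries, then the square root.
    se = [(p - t) ** 2 for p, t, m in zip(preds, targets, mask) if m]
    return math.sqrt(sum(se) / len(se))

# Third entry is masked out, so only errors 0 and 4 contribute: sqrt(2).
print(rmse([1.0, 2.0, 5.0], [1.0, 4.0, 0.0], [True, True, False]))
```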
- class chemprop.nn.metrics.BoundedMAEMetric(task_weights=1.0)[source]#
Bases: MAEMetric, BoundedMixin
- Parameters:
task_weights (ArrayLike = 1.0) –
Important: Ignored. Maintained for compatibility with LossFunction.
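The BoundedMixin behavior can be sketched as follows. This is an assumption about its semantics, not the actual chemprop code: when a target is an inequality bound (per lt_mask/gt_mask) and the prediction already satisfies it, the prediction is snapped to the bound so it contributes no error.

```python
def apply_bounds(preds, targets, lt_mask, gt_mask):
    """Hypothetical sketch: replace a prediction with the bound itself when
    it already satisfies a '<' or '>' target, zeroing its error term."""
    out = []
    for p, t, lt, gt in zip(preds, targets, lt_mask, gt_mask):
        if (lt and p < t) or (gt and p > t):
            p = t
        out.append(p)
    return out

# A '<5' target satisfied by pred 3.0 incurs no error; pred 7.0 keeps its error.
print(apply_bounds([3.0, 7.0], [5.0, 5.0], [True, True], [False, False]))  # [5.0, 7.0]
```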
- class chemprop.nn.metrics.BoundedMSEMetric(task_weights=1.0)[source]#
Bases: MSEMetric, BoundedMixin
- Parameters:
task_weights (numpy.typing.ArrayLike)
- class chemprop.nn.metrics.BoundedRMSEMetric(task_weights=1.0)[source]#
Bases: RMSEMetric, BoundedMixin
- Parameters:
task_weights (numpy.typing.ArrayLike)
- class chemprop.nn.metrics.R2Metric(task_weights=1.0)[source]#
Bases: Metric
- Parameters:
task_weights (ArrayLike = 1.0) –
Important: Ignored. Maintained for compatibility with LossFunction.
- minimize = False#
- forward(preds, targets, mask, *args, **kwargs)[source]#
Calculate the mean loss function value given predicted and target values
- Parameters:
preds (Tensor) – a tensor of shape b x (t * s) (regression), b x t (binary classification), or b x t x c (multiclass classification) containing the predictions, where b is the batch size, t is the number of tasks to predict, s is the number of targets to predict for each task, and c is the number of classes.
targets (Tensor) – a float tensor of shape b x t containing the target values
mask (Tensor) – a boolean tensor of shape b x t indicating whether the given prediction should be included in the loss calculation
weights (Tensor) – a tensor of shape b or b x 1 containing the per-sample weight
lt_mask (Tensor)
gt_mask (Tensor)
- Returns:
a scalar containing the fully reduced loss
- Return type:
Tensor
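Note minimize = False: a higher R² is better. The coefficient of determination can be sketched in plain Python (the chemprop metric computes this per task on tensors; this single-task sketch only shows the formula):

```python
def r2_score(preds, targets):
    """R^2 = 1 - SS_res / SS_tot for a single task."""
    mean_t = sum(targets) / len(targets)
    ss_res = sum((t - p) ** 2 for p, t in zip(preds, targets))
    ss_tot = sum((t - mean_t) ** 2 for t in targets)
    return 1.0 - ss_res / ss_tot

print(r2_score([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))  # 1.0 for perfect predictions
```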
- class chemprop.nn.metrics.BinaryAUROCMetric(task_weights=1.0)[source]#
Bases: Metric
- Parameters:
task_weights (ArrayLike = 1.0) –
Important: Ignored. Maintained for compatibility with LossFunction.
- minimize = False#
- forward(preds, targets, mask, *args, **kwargs)[source]#
Calculate the mean loss function value given predicted and target values
- Parameters:
preds (Tensor) – a tensor of shape b x (t * s) (regression), b x t (binary classification), or b x t x c (multiclass classification) containing the predictions, where b is the batch size, t is the number of tasks to predict, s is the number of targets to predict for each task, and c is the number of classes.
targets (Tensor) – a float tensor of shape b x t containing the target values
mask (Tensor) – a boolean tensor of shape b x t indicating whether the given prediction should be included in the loss calculation
weights (Tensor) – a tensor of shape b or b x 1 containing the per-sample weight
lt_mask (Tensor)
gt_mask (Tensor)
- Returns:
a scalar containing the fully reduced loss
- Return type:
Tensor
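AUROC (also maximized, not minimized) equals the probability that a randomly chosen positive scores higher than a randomly chosen negative. A plain-Python sketch via the Mann-Whitney pairwise statistic, not the torch implementation:

```python
def binary_auroc(preds, targets):
    """AUROC as the fraction of (positive, negative) pairs where the
    positive example scores higher; ties count half."""
    pos = [p for p, t in zip(preds, targets) if t == 1]
    neg = [p for p, t in zip(preds, targets) if t == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

print(binary_auroc([0.9, 0.8, 0.3, 0.1], [1, 1, 0, 0]))  # 1.0: positives all outrank negatives
```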
- class chemprop.nn.metrics.BinaryAUPRCMetric(task_weights=1.0)[source]#
Bases: Metric
- Parameters:
task_weights (ArrayLike = 1.0) –
Important: Ignored. Maintained for compatibility with LossFunction.
- minimize = False#
- forward(preds, targets, *args, **kwargs)[source]#
Calculate the mean loss function value given predicted and target values
- Parameters:
preds (Tensor) – a tensor of shape b x (t * s) (regression), b x t (binary classification), or b x t x c (multiclass classification) containing the predictions, where b is the batch size, t is the number of tasks to predict, s is the number of targets to predict for each task, and c is the number of classes.
targets (Tensor) – a float tensor of shape b x t containing the target values
mask (Tensor) – a boolean tensor of shape b x t indicating whether the given prediction should be included in the loss calculation
weights (Tensor) – a tensor of shape b or b x 1 containing the per-sample weight
lt_mask (Tensor)
gt_mask (Tensor)
- Returns:
a scalar containing the fully reduced loss
- Return type:
Tensor
- class chemprop.nn.metrics.BinaryAccuracyMetric(task_weights=1.0)[source]#
Bases: Metric, ThresholdedMixin
- Parameters:
task_weights (ArrayLike = 1.0) –
Important: Ignored. Maintained for compatibility with LossFunction.
- minimize = False#
- forward(preds, targets, mask, *args, **kwargs)[source]#
Calculate the mean loss function value given predicted and target values
- Parameters:
preds (Tensor) – a tensor of shape b x (t * s) (regression), b x t (binary classification), or b x t x c (multiclass classification) containing the predictions, where b is the batch size, t is the number of tasks to predict, s is the number of targets to predict for each task, and c is the number of classes.
targets (Tensor) – a float tensor of shape b x t containing the target values
mask (Tensor) – a boolean tensor of shape b x t indicating whether the given prediction should be included in the loss calculation
weights (Tensor) – a tensor of shape b or b x 1 containing the per-sample weight
lt_mask (Tensor)
gt_mask (Tensor)
- Returns:
a scalar containing the fully reduced loss
- Return type:
Tensor
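The ThresholdedMixin base suggests predicted probabilities are binarized at a threshold before comparison with the targets. A plain-Python sketch, assuming a default threshold of 0.5 (an assumption, not taken from this page):

```python
def binary_accuracy(preds, targets, threshold=0.5):
    # Binarize probabilities at the threshold, then count exact matches.
    hits = sum((p >= threshold) == bool(t) for p, t in zip(preds, targets))
    return hits / len(targets)

print(binary_accuracy([0.9, 0.4, 0.6, 0.2], [1, 0, 0, 0]))  # 0.75: one false positive
```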
- class chemprop.nn.metrics.BinaryF1Metric(task_weights=1.0)[source]#
Bases: Metric, ThresholdedMixin
- Parameters:
task_weights (ArrayLike = 1.0) –
Important: Ignored. Maintained for compatibility with LossFunction.
- minimize = False#
- forward(preds, targets, mask, *args, **kwargs)[source]#
Calculate the mean loss function value given predicted and target values
- Parameters:
preds (Tensor) – a tensor of shape b x (t * s) (regression), b x t (binary classification), or b x t x c (multiclass classification) containing the predictions, where b is the batch size, t is the number of tasks to predict, s is the number of targets to predict for each task, and c is the number of classes.
targets (Tensor) – a float tensor of shape b x t containing the target values
mask (Tensor) – a boolean tensor of shape b x t indicating whether the given prediction should be included in the loss calculation
weights (Tensor) – a tensor of shape b or b x 1 containing the per-sample weight
lt_mask (Tensor)
gt_mask (Tensor)
- Returns:
a scalar containing the fully reduced loss
- Return type:
Tensor
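F1 is the harmonic mean of precision and recall after thresholding. A plain-Python sketch (again assuming a 0.5 default threshold, which is not stated on this page):

```python
def binary_f1(preds, targets, threshold=0.5):
    """F1 = 2*TP / (2*TP + FP + FN) after binarizing at the threshold."""
    yhat = [p >= threshold for p in preds]
    tp = sum(h and t for h, t in zip(yhat, targets))
    fp = sum(h and not t for h, t in zip(yhat, targets))
    fn = sum(not h and t for h, t in zip(yhat, targets))
    return 2 * tp / (2 * tp + fp + fn)

print(binary_f1([0.9, 0.8, 0.3], [1, 0, 1]))  # tp=1, fp=1, fn=1 -> 0.5
```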
- class chemprop.nn.metrics.BCEMetric(task_weights=1.0)[source]#
Bases: chemprop.nn.loss.BCELoss, Metric
- Parameters:
task_weights (numpy.typing.ArrayLike)
- class chemprop.nn.metrics.CrossEntropyMetric(task_weights=1.0)[source]#
Bases: chemprop.nn.loss.CrossEntropyLoss, Metric
- Parameters:
task_weights (numpy.typing.ArrayLike)
- class chemprop.nn.metrics.BinaryMCCMetric(task_weights=1.0)[source]#
Bases: chemprop.nn.loss.BinaryMCCLoss, Metric
- Parameters:
task_weights (numpy.typing.ArrayLike)
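The Matthews correlation coefficient summarizes the full binary confusion matrix in one value in [-1, 1]. A plain-Python sketch from hard 0/1 predictions (the chemprop loss works on probabilities and tensors; this only shows the formula):

```python
import math

def binary_mcc(yhat, y):
    """MCC = (TP*TN - FP*FN) / sqrt((TP+FP)(TP+FN)(TN+FP)(TN+FN))."""
    tp = sum(h == 1 and t == 1 for h, t in zip(yhat, y))
    tn = sum(h == 0 and t == 0 for h, t in zip(yhat, y))
    fp = sum(h == 1 and t == 0 for h, t in zip(yhat, y))
    fn = sum(h == 0 and t == 1 for h, t in zip(yhat, y))
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0

print(binary_mcc([1, 1, 0, 0], [1, 1, 0, 0]))  # 1.0 for perfect agreement
```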
- class chemprop.nn.metrics.MulticlassMCCMetric(task_weights=1.0)[source]#
Bases: chemprop.nn.loss.MulticlassMCCLoss, Metric
- Parameters:
task_weights (numpy.typing.ArrayLike)
- class chemprop.nn.metrics.SIDMetric(task_weights=None, threshold=None)[source]#
Bases: chemprop.nn.loss.SIDLoss, Metric
- Parameters:
task_weights (torch.Tensor | None)
threshold (float | None)
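Spectral information divergence compares two spectra as probability distributions. The following plain-Python sketch shows the symmetric KL-style form on normalized spectra; the normalization step and the role of the threshold parameter here are assumptions, not taken from the chemprop implementation:

```python
import math

def sid(model_spectrum, target_spectrum):
    """Symmetric divergence between two spectra, each normalized to sum to 1.
    Assumes strictly positive intensities so the logs are defined."""
    sp, st = sum(model_spectrum), sum(target_spectrum)
    p = [x / sp for x in model_spectrum]
    q = [x / st for x in target_spectrum]
    return sum(pi * math.log(pi / qi) + qi * math.log(qi / pi)
               for pi, qi in zip(p, q))

print(sid([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))  # 0.0 for identical spectra
```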
- class chemprop.nn.metrics.WassersteinMetric(task_weights=None, threshold=None)[source]#
Bases: chemprop.nn.loss.WassersteinLoss, Metric
- Parameters:
task_weights (torch.Tensor | None)
threshold (float | None)
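For one-dimensional distributions on a shared grid, the Wasserstein-1 (earth mover's) distance reduces to the area between the two cumulative distributions. A plain-Python sketch under the assumption of normalized spectra on a common grid with unit bin spacing:

```python
from itertools import accumulate

def wasserstein_1d(p, q):
    """Earth mover's distance between two histograms on the same grid:
    the sum of absolute differences of their cumulative sums."""
    cp, cq = list(accumulate(p)), list(accumulate(q))
    return sum(abs(a - b) for a, b in zip(cp, cq))

print(wasserstein_1d([1.0, 0.0], [0.0, 1.0]))  # 1.0: all mass moves one bin
```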