differential
Contains classes for configuring differential privacy on models.
Classes
DPModellerConfig
class DPModellerConfig( max_epsilon: float, max_grad_norm: Union[float, List[float]] = 1.0, alphas: List[float] = <factory>, target_delta: float = 1e-06, loss_reduction: Literal['mean', 'sum'] = 'mean', auto_fix: bool = True,)
Modeller configuration options for Differential Privacy.
Arguments
max_epsilon
: The maximum epsilon value to use.

max_grad_norm
: The maximum gradient norm to use. Defaults to 1.0.

alphas
: The alphas to use. Defaults to floats from 1.1 to 63.0 (inclusive): increments of 0.1 up to 11.0, followed by increments of 1.0 up to 63.0. Note that none of the alphas should be equal to 1.

target_delta
: The target delta to use. Defaults to 1e-6.

loss_reduction
: The loss reduction to use. Available options are "mean" and "sum". Defaults to "mean".

auto_fix
: Whether to automatically fix the model if it is not DP-compliant. Currently, this just converts all BatchNorm layers to GroupNorm. Defaults to True.
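As a usage sketch (assuming the import path shown under Ancestors below), the following constructs a DPModellerConfig. The `default_alphas` expression is just one way to reproduce the documented default, not necessarily the library's internal factory; omit the argument to use the built-in default.

```python
from bitfount.federated.privacy.differential import DPModellerConfig

# One way to reproduce the documented default alphas:
# 1.1 to 11.0 in steps of 0.1, then 12.0 to 63.0 in steps of 1.0.
# Note that no value equals 1, as required.
default_alphas = [round(1.0 + x / 10, 1) for x in range(1, 101)] + [
    float(x) for x in range(12, 64)
]

config = DPModellerConfig(
    max_epsilon=3.0,          # privacy budget ceiling for the task
    max_grad_norm=1.0,        # per-sample gradient clipping norm
    alphas=default_alphas,    # optional; defaults via the factory
    target_delta=1e-6,
    loss_reduction="mean",    # "mean" or "sum"
    auto_fix=True,            # convert BatchNorm layers to GroupNorm if needed
)
```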
Raises
ValueError
: If loss_reduction is not one of "mean" or "sum".
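For illustration, the check can be triggered as below. The exact error message is not specified in these docs, so only the exception type is assumed; a static type checker would also flag the bad literal.

```python
from bitfount.federated.privacy.differential import DPModellerConfig

try:
    # "median" is not an allowed loss_reduction value.
    DPModellerConfig(max_epsilon=3.0, loss_reduction="median")
except ValueError as err:
    print(f"Rejected invalid loss_reduction: {err}")
```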
info
max_epsilon and target_delta are also set by the Pods involved in the task and take precedence over the values supplied here.
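To make that precedence concrete, here is a hypothetical sketch. The min() clamping is an assumption about how the caps combine, not the library's documented resolution logic; DPPodConfig is described in full below.

```python
from bitfount.federated.privacy.differential import (
    DPModellerConfig,
    DPPodConfig,
)

modeller = DPModellerConfig(max_epsilon=10.0, target_delta=1e-5)
pod = DPPodConfig(max_epsilon=3.0, max_target_delta=1e-6)

# ASSUMPTION: the Pod's caps win whenever the modeller asks for more.
effective_epsilon = min(modeller.max_epsilon, pod.max_epsilon)      # 3.0
effective_delta = min(modeller.target_delta, pod.max_target_delta)  # 1e-6
```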
Variables
- static alphas : List[float]
- static auto_fix : bool
- static fields_dict
- static loss_reduction : Literal['mean', 'sum']
- static max_epsilon : float
- static max_grad_norm : Union[float, List[float]]
- static nested_fields
- static target_delta : float
DPPodConfig
class DPPodConfig( max_epsilon: float, max_target_delta: float = 1e-06, fields_dict: _StrAnyDict = <factory>,)
Pod configuration options for Differential Privacy.
Primarily used as caps and bounds for what options may be set by the modeller.
Arguments
max_epsilon
: The maximum epsilon value to use.

max_target_delta
: The maximum target delta to use. Defaults to 1e-6.
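A minimal construction sketch, using the same assumed import path as above; fields_dict is left to its default factory.

```python
from bitfount.federated.privacy.differential import DPPodConfig

# Caps that bound whatever the modeller requests for this Pod.
pod_config = DPPodConfig(
    max_epsilon=3.0,        # hard ceiling on the privacy budget
    max_target_delta=1e-6,  # hard ceiling on the target delta
)
```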
Ancestors
- bitfount.federated.privacy.differential._BaseDPConfig
Variables
- static fields_dict : Dict[str, Any]
- static max_epsilon : float
- static max_target_delta : float