art.defences.transformer

Module implementing transformer-based defences against adversarial attacks.

Transformer

class art.defences.transformer.Transformer(classifier: Classifier)

Abstract base class for transformation defences.

abstract __call__(x: numpy.ndarray, transformed_classifier: Classifier) → Classifier

Perform the transformation defence and return a more robust classifier.

Parameters
  • x (ndarray) – Dataset for training the transformed classifier.

  • transformed_classifier – A classifier to be transformed for increased robustness.

Returns

The transformed classifier.

__init__(classifier: Classifier) → None

Create a transformation object.

Parameters

classifier – A trained classifier.

abstract fit(x: numpy.ndarray, y: Optional[numpy.ndarray] = None, **kwargs) → None

Fit the parameters of the transformer if it has any.

Parameters
  • x (ndarray) – Training set to fit the transformer.

  • y – Labels for the training set.

  • kwargs – Other parameters.

get_classifier() → Classifier

Get the internal classifier.

Returns

The internal classifier.

property is_fitted

Return the state of the transformation object.

Returns

True if the transformation model has been fitted (if this applies).

set_params(**kwargs) → None

Take in a dictionary of parameters and apply checks before saving them as attributes.
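The methods above define the lifecycle of a transformation defence: construct with a trained classifier, fit any transformer parameters, call the defence on data together with a second classifier, and retrieve the internal classifier. A minimal sketch of that call pattern, using hypothetical plain-Python stand-ins (in practice both classifiers are ART Classifier instances; none of these class names are part of the ART API):

```python
import numpy as np

class DummyClassifier:
    """Hypothetical placeholder mimicking a trained classifier."""
    def predict(self, x):
        # Uniform probabilities over two classes, one row per sample.
        return np.full((x.shape[0], 2), 0.5)

class IdentityTransformer:
    """Illustrative transformer following the documented interface."""
    def __init__(self, classifier):
        self._classifier = classifier
        self._is_fitted = False

    def fit(self, x, y=None, **kwargs):
        # A real defence would learn transformer parameters from x here.
        self._is_fitted = True

    def __call__(self, x, transformed_classifier):
        # A real defence would train transformed_classifier on x here.
        return transformed_classifier

    def get_classifier(self):
        return self._classifier

    @property
    def is_fitted(self):
        return self._is_fitted

x_train = np.zeros((4, 3))
defence = IdentityTransformer(DummyClassifier())
defence.fit(x_train)
robust = defence(x_train, DummyClassifier())
```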

Defensive Distillation

class art.defences.transformer.DefensiveDistillation(classifier: Classifier, batch_size: int = 128, nb_epochs: int = 10)

Implement the defensive distillation mechanism.

__call__(x: numpy.ndarray, transformed_classifier: Classifier) → Classifier

Perform the defensive distillation defence mechanism and return a more robust classifier.

Parameters
  • x (ndarray) – Dataset for training the transformed classifier.

  • transformed_classifier – A classifier to be transformed for increased robustness. Note that the objective loss function used for fitting the input transformed_classifier must support soft labels, i.e. probability labels.

Returns

The transformed classifier.
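The soft-label requirement above can be made concrete: distillation trains the new classifier on the teacher's predicted probability vectors rather than on one-hot labels, so the training loss must accept targets that are full probability distributions. A small numpy sketch of such a soft cross-entropy loss (the function name is illustrative, not part of the ART API):

```python
import numpy as np

def soft_cross_entropy(y_prob_teacher, y_prob_student, eps=1e-12):
    """Per-sample cross-entropy between two probability distributions.

    Unlike a hard-label loss, the target y_prob_teacher may be any
    probability vector, e.g. the teacher's softmax output.
    """
    return -np.sum(y_prob_teacher * np.log(y_prob_student + eps), axis=1)

# The teacher produces soft labels; a hard-label loss could not consume these.
teacher_probs = np.array([[0.7, 0.2, 0.1],
                          [0.1, 0.8, 0.1]])
student_probs = np.array([[0.6, 0.3, 0.1],
                          [0.2, 0.7, 0.1]])
losses = soft_cross_entropy(teacher_probs, student_probs)
```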

__init__(classifier: Classifier, batch_size: int = 128, nb_epochs: int = 10) → None

Create an instance of the defensive distillation defence.

Parameters
  • classifier – A trained classifier.

  • batch_size (int) – Size of batches.

  • nb_epochs (int) – Number of epochs to use for training.

fit(x: numpy.ndarray, y: Optional[numpy.ndarray] = None, **kwargs) → None

Defensive distillation has no parameters to fit; this method does nothing.
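Since fit() is a no-op, all the work happens in __call__: the trained classifier predicts probability labels on x, and the transformed classifier is then fitted on those soft labels. A self-contained numpy illustration of that mechanism (all names are hypothetical stand-ins; the real defence operates on ART Classifier objects and their fit/predict methods):

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Hypothetical linear "classifiers" standing in for ART Classifier objects.
rng = np.random.default_rng(0)
x = rng.normal(size=(32, 4))          # dataset passed to __call__
teacher_w = rng.normal(size=(4, 3))   # weights of the trained classifier

# Step 1: the trained classifier produces soft (probability) labels on x.
soft_labels = softmax(x @ teacher_w)

# Step 2: fit the transformed classifier on those soft labels, here by
# gradient descent on a soft cross-entropy loss for a linear student.
student_w = np.zeros((4, 3))
for _ in range(200):
    probs = softmax(x @ student_w)
    grad = x.T @ (probs - soft_labels) / x.shape[0]
    student_w -= 0.5 * grad

# The student now approximates the teacher's probability outputs.
student_probs = softmax(x @ student_w)
```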