trojai.modelgen.architectures package

Submodules

trojai.modelgen.architectures.cifar10_architectures module

class trojai.modelgen.architectures.cifar10_architectures.AlexNet(num_classes=10)[source]

Bases: torch.nn.Module

Modified AlexNet for CIFAR-10. From: https://github.com/icpm/pytorch-cifar10/blob/master/models/AlexNet.py

forward(x)[source]
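
A minimal usage sketch for the CIFAR-10 AlexNet variant, assuming a standard CIFAR-10 input of shape 3x32x32; the batch size and dummy tensor below are illustrative only.

import torch
from trojai.modelgen.architectures.cifar10_architectures import AlexNet

# Instantiate the modified AlexNet for CIFAR-10 (10 output classes by default).
model = AlexNet(num_classes=10)
model.eval()

# Dummy CIFAR-10-shaped batch: 4 RGB images of 32x32 pixels (illustrative only).
x = torch.randn(4, 3, 32, 32)
with torch.no_grad():
    out = model(x)
print(out.shape)  # expected: torch.Size([4, 10])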
class trojai.modelgen.architectures.cifar10_architectures.Bottleneck(in_planes, growth_rate)[source]

Bases: torch.nn.Module

Bottleneck module in the DenseNet architecture. See: https://arxiv.org/abs/1608.06993

forward(x)[source]
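
A short sketch of the Bottleneck block used in isolation; the channel arithmetic in the comments is an assumption based on the standard DenseNet bottleneck, which concatenates growth_rate new feature maps onto its input.

import torch
from trojai.modelgen.architectures.cifar10_architectures import Bottleneck

# A bottleneck block fed 24 input channels with a growth rate of 12 (illustrative values).
block = Bottleneck(in_planes=24, growth_rate=12)

x = torch.randn(1, 24, 32, 32)
y = block(x)
# In the standard DenseNet bottleneck, the output concatenates growth_rate new
# channels onto the input, so y is expected to have 24 + 12 = 36 channels.
print(y.shape)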
class trojai.modelgen.architectures.cifar10_architectures.DenseNet(block, num_block, growth_rate=12, reduction=0.5, num_classes=10)[source]

Bases: torch.nn.Module

DenseNet for CIFAR-10. From: https://github.com/icpm/pytorch-cifar10/blob/master/models/DenseNet.py

forward(x)[source]
trojai.modelgen.architectures.cifar10_architectures.DenseNet121()[source]
trojai.modelgen.architectures.cifar10_architectures.DenseNet161()[source]
trojai.modelgen.architectures.cifar10_architectures.DenseNet169()[source]
trojai.modelgen.architectures.cifar10_architectures.DenseNet201()[source]
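
A usage sketch for the convenience constructors; DenseNet121() is shown, and DenseNet161/169/201 are called the same way. The 3x32x32 input shape assumes CIFAR-10 images.

import torch
from trojai.modelgen.architectures.cifar10_architectures import DenseNet121

# Build the DenseNet-121 variant; DenseNet161/169/201 are constructed the same way.
model = DenseNet121()
model.eval()

# Dummy CIFAR-10 batch: 2 RGB images of 32x32 pixels.
x = torch.randn(2, 3, 32, 32)
with torch.no_grad():
    logits = model(x)
print(logits.shape)  # expected: torch.Size([2, 10])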
class trojai.modelgen.architectures.cifar10_architectures.Transition(in_planes, out_planes)[source]

Bases: torch.nn.Module

Transition module in the DenseNet architecture. See: https://arxiv.org/abs/1608.06993

forward(x)[source]
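
A brief sketch of the Transition block on its own; the shapes in the comments assume the standard DenseNet transition layer, which projects to out_planes channels and average-pools the spatial dimensions by a factor of two.

import torch
from trojai.modelgen.architectures.cifar10_architectures import Transition

# Reduce 24 feature maps to 12 between dense blocks (illustrative values).
trans = Transition(in_planes=24, out_planes=12)

x = torch.randn(1, 24, 32, 32)
y = trans(x)
# The standard DenseNet transition projects to out_planes channels and halves the
# spatial resolution, so y is expected to have shape (1, 12, 16, 16).
print(y.shape)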
trojai.modelgen.architectures.cifar10_architectures.densenet_cifar()[source]
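
A quick sketch comparing the compact CIFAR DenseNet returned by densenet_cifar() against DenseNet121() by parameter count; the exact block configuration of densenet_cifar() is not documented here, so only the call pattern is shown.

import torch
from trojai.modelgen.architectures.cifar10_architectures import DenseNet121, densenet_cifar

# densenet_cifar() returns a compact DenseNet intended for CIFAR-sized inputs.
small = densenet_cifar()
large = DenseNet121()

def num_params(model: torch.nn.Module) -> int:
    # Count all trainable and non-trainable parameters.
    return sum(p.numel() for p in model.parameters())

# The compact variant is expected to have far fewer parameters than DenseNet-121.
print(num_params(small), num_params(large))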

trojai.modelgen.architectures.mnist_architectures module

class trojai.modelgen.architectures.mnist_architectures.BadNetExample[source]

Bases: torch.nn.Module

MNIST network from the BadNets paper.

Input - 1x28x28
C1 - 1x28x28 (5x5 kernel) -> 16x24x24
ReLU
S2 - 16x24x24 (2x2 kernel, stride 2) Subsampling -> 16x12x12
C3 - 16x12x12 (5x5 kernel) -> 32x8x8
ReLU
S4 - 32x8x8 (2x2 kernel, stride 2) Subsampling -> 32x4x4
F6 - 512 -> 512
tanh
F7 - 512 -> 10 Softmax (Output)

forward(img)[source]
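
A minimal usage sketch for the BadNets MNIST network, following the layer description above; the input is a dummy batch of 1x28x28 grayscale images.

import torch
from trojai.modelgen.architectures.mnist_architectures import BadNetExample

model = BadNetExample()
model.eval()

# Dummy MNIST-shaped batch: 8 grayscale images of 28x28 pixels.
img = torch.randn(8, 1, 28, 28)
with torch.no_grad():
    out = model(img)
print(out.shape)  # expected: torch.Size([8, 10])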
class trojai.modelgen.architectures.mnist_architectures.ModdedLeNet5Net(channels=1)[source]

Bases: torch.nn.Module

A modified LeNet architecture that seems to be easier to embed backdoors in than the network from the original BadNets paper.

Input - (1 or 3)x28x28
C1 - 6@28x28 (5x5 kernel)
ReLU
S2 - 6@14x14 (2x2 kernel, stride 2) Subsampling
C3 - 16@10x10 (5x5 kernel)
ReLU
S4 - 16@5x5 (2x2 kernel, stride 2) Subsampling
C5 - 120@1x1 (5x5 kernel)
F6 - 84
ReLU
F7 - 10 (Output)

forward(img)[source]
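
A usage sketch for ModdedLeNet5Net; the channels argument selects between single-channel and three-channel 28x28 inputs, per the description above.

import torch
from trojai.modelgen.architectures.mnist_architectures import ModdedLeNet5Net

# Single-channel variant for MNIST-style 1x28x28 inputs (the default).
gray_model = ModdedLeNet5Net(channels=1)
out_gray = gray_model(torch.randn(4, 1, 28, 28))
print(out_gray.shape)  # expected: torch.Size([4, 10])

# Three-channel variant for 3x28x28 inputs.
rgb_model = ModdedLeNet5Net(channels=3)
out_rgb = rgb_model(torch.randn(4, 3, 28, 28))
print(out_rgb.shape)  # expected: torch.Size([4, 10])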

Module contents