Table 7: Hyper-parameter settings for all baselines, with columns Method, Hyper-parameters, CIFAR-10 (PreActResNet-18), and ImageNet-100 (ResNet-18).
Source publication
Adversarial training is considered an imperative component for safely deploying neural network-based applications in the real world. To achieve stronger robustness, existing methods primarily focus on generating strong attacks by increasing the number of update steps, regularizing the models with a smoothed loss function, and injecting...
Context in source publication
... is compared to both non-iterative (FBF, GAT, and NuAT) and iterative (PGD, TRADES, and AWP) methods. The hyper-parameter settings of each baseline are listed in Table 7. Since the evaluation results on ImageNet come from previous works (Sriramanan et al. 2020, 2021), the table only includes the parameters reported in those works (unknown parameters are denoted with −). ...
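The distinction above between non-iterative and iterative baselines centers on how the attack is generated: iterative methods such as PGD take multiple signed-gradient steps projected back into an L∞ ball. As an illustration only — not the paper's or any baseline's exact implementation — here is a minimal numpy sketch of a PGD-style attack, demonstrated on a toy quadratic loss with an analytic gradient (the function and parameter names are hypothetical):

```python
import numpy as np

def pgd_attack(x, grad_fn, eps=0.03, alpha=0.01, steps=10, seed=0):
    """Sketch of an L-infinity PGD attack: ascend the loss by signed
    gradient steps, projecting back into the eps-ball around x."""
    rng = np.random.default_rng(seed)
    x_adv = x + rng.uniform(-eps, eps, size=x.shape)  # random start
    for _ in range(steps):
        g = grad_fn(x_adv)                        # gradient of loss w.r.t. input
        x_adv = x_adv + alpha * np.sign(g)        # signed ascent step
        x_adv = np.clip(x_adv, x - eps, x + eps)  # project into the eps-ball
    return x_adv

# Toy example: loss(x) = 0.5 * ||x - t||^2, so grad(x) = x - t.
t = np.zeros(4)
x = np.full(4, 0.5)
x_adv = pgd_attack(x, grad_fn=lambda z: z - t, eps=0.1, alpha=0.05, steps=20)
```

In real adversarial training, `grad_fn` would be the input gradient of the classifier's loss (e.g. via automatic differentiation), and the number of steps is exactly the knob the iterative baselines above tune for attack strength.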