Publication | Open Access
Fully Test-time Adaptation by Entropy Minimization
Year: 2020 · Citations: 25 · References: 45
Topics: Mathematical Programming, Convolutional Neural Network, Engineering, Machine Learning, Entropy Minimization, Image Analysis, Data Science, Pattern Recognition, Normalization Statistics, Test Entropy Minimization, Video Transformer, Neural Scaling Law, Data Augmentation, Machine Vision, Computer Science, Adaptive Algorithm, Statistical Learning Theory, Signal Processing, Computer Vision, Stochastic Optimization, Domain Adaptation, Statistical Inference, Corrupted ImageNet
A model must adapt itself to generalize to new and different data during testing. In this setting of fully test-time adaptation the model has only the test data and its own parameters. We propose to adapt by test entropy minimization (tent): we optimize the model for confidence as measured by the entropy of its predictions. Our method estimates normalization statistics and optimizes channel-wise affine transformations to update online on each batch. Tent reduces generalization error for image classification on corrupted ImageNet and CIFAR-10/100 and reaches a new state-of-the-art error on ImageNet-C. Tent handles source-free domain adaptation on digit recognition from SVHN to MNIST/MNIST-M/USPS, on semantic segmentation from GTA to Cityscapes, and on the VisDA-C benchmark. These results are achieved in one epoch of test-time optimization without altering training.
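The core idea in the abstract — estimate normalization statistics from the test batch, then minimize the entropy of the model's own predictions by updating only channel-wise affine (scale and shift) parameters — can be sketched in a few lines. The following is a minimal NumPy illustration, not the authors' implementation: the toy "model" (a frozen random linear classifier over batch-normalized features), the dimensions, and the learning rate are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
N, C, K = 64, 8, 10                    # batch size, channels, classes (illustrative)
x = rng.normal(size=(N, C)) * 3 + 1    # toy "shifted" test-time features
W = rng.normal(size=(C, K))            # frozen classifier weights (not updated)

# channel-wise affine parameters: the only things adapted, as in the abstract
gamma, beta = np.ones(C), np.zeros(C)

def forward(gamma, beta):
    # normalization statistics estimated from the test batch itself
    mu, sigma = x.mean(0), x.std(0) + 1e-5
    xhat = (x - mu) / sigma
    z = (gamma * xhat + beta) @ W      # logits
    z = z - z.max(1, keepdims=True)    # numerically stable softmax
    p = np.exp(z); p /= p.sum(1, keepdims=True)
    H = -(p * np.log(p + 1e-12)).sum(1)  # per-sample prediction entropy
    return xhat, p, H

xhat, p, H = forward(gamma, beta)
before = H.mean()

# analytic gradient of mean entropy w.r.t. logits: dH/dz = -p * (log p + H)
gz = -(p * (np.log(p + 1e-12) + H[:, None])) / N
gf = gz @ W.T                          # backprop through the frozen classifier
ggamma, gbeta = (gf * xhat).sum(0), gf.sum(0)

lr = 0.05                              # assumed step size
gamma -= lr * ggamma                   # one online entropy-minimization step
beta -= lr * gbeta

after = forward(gamma, beta)[2].mean()
# mean prediction entropy should drop after the update
```

In the paper's setting this update runs online, once per test batch, with no access to source data or labels; only the entropy of the model's own predictions drives the adaptation.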