Publication | Closed Access
Which Model to Transfer? Finding the Needle in the Growing Haystack
Citations: 13
References: 30
Year: 2022
Keywords: Artificial Intelligence, Few-shot Learning, Engineering, Machine Learning, Modeling Method, Growing Haystack, TensorFlow Hub, Data Science, High Regret, Multi-task Learning, Robot Learning, Technology Transfer, Large AI Model, Economics, Machine Vision, Knowledge Transfer, Vision Language Model, Computer Science, Deep Learning, Model Transformation, Computer Vision, Business, Knowledge Management, Transfer Learning, Model Building, Technology, Model Analysis, Data Modeling
Transfer learning has recently been popularized as a data-efficient alternative to training models from scratch, in particular for computer vision tasks, where it provides a remarkably solid baseline. The emergence of rich model repositories, such as TensorFlow Hub, enables practitioners and researchers to unleash the potential of these models across a wide range of downstream tasks. As these repositories keep growing exponentially, efficiently selecting a good model for the task at hand becomes paramount. We provide a formalization of this problem through a familiar notion of regret and introduce the predominant strategies, namely task-agnostic (e.g., ranking models by their ImageNet performance) and task-aware search strategies (such as linear or kNN evaluation). We conduct a large-scale empirical study and show that both task-agnostic and task-aware methods can yield high regret. We then propose a simple and computationally efficient hybrid search strategy which outperforms the existing approaches. We highlight the practical benefits of the proposed solution on a set of 19 diverse vision tasks.
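To make the abstract's terminology concrete, below is a minimal sketch, not the paper's actual method, of a hybrid model-search strategy and the regret it is evaluated by: a task-agnostic stage shortlists candidates by upstream (e.g. ImageNet) accuracy, then a task-aware kNN proxy on frozen embeddings re-ranks the shortlist. The `imagenet_score` field, the `embed` callable, and the shortlist size are illustrative assumptions, not quantities from the paper.

```python
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def knn_proxy_score(embed, X, y, k=5):
    """Task-aware proxy: kNN accuracy on frozen features of the downstream task."""
    feats = embed(X)  # (n_samples, dim) embeddings from a frozen pre-trained model
    return cross_val_score(KNeighborsClassifier(n_neighbors=k), feats, y, cv=3).mean()

def hybrid_select(models, X, y, shortlist_size=5):
    """Hybrid strategy sketch: task-agnostic shortlist, then task-aware re-ranking.

    `models` is a list of dicts with assumed keys:
      'name'           -- identifier (e.g. a TensorFlow Hub handle)
      'imagenet_score' -- task-agnostic upstream accuracy
      'embed'          -- callable mapping raw inputs to frozen features
    """
    # Stage 1 (task-agnostic): keep the top models by upstream accuracy.
    shortlist = sorted(models, key=lambda m: m['imagenet_score'],
                       reverse=True)[:shortlist_size]
    # Stage 2 (task-aware): score only the shortlist with the cheap kNN proxy.
    scored = [(knn_proxy_score(m['embed'], X, y), m) for m in shortlist]
    return max(scored, key=lambda s: s[0])[1]

def regret(selected_score, best_score):
    """Regret of a search strategy: downstream-accuracy gap between the model it
    picks and the best model in the repository (both after full fine-tuning)."""
    return best_score - selected_score
```

The two-stage design reflects the trade-off the abstract describes: purely task-agnostic ranking is cheap but can pick a model that transfers poorly, while evaluating a proxy on every model is expensive; restricting the task-aware proxy to a short task-agnostic shortlist keeps cost low while bounding regret.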