Volume 16, No. 2

SHiFT: An Efficient, Flexible Search Engine for Transfer Learning

Authors:
Cedric Renggli, Xiaozhe Yao, Luka Kolar, Luka Rimanic, Ana Klimovic, Ce Zhang

Abstract

Transfer learning can be seen as a data- and compute-efficient alternative to training models from scratch. The emergence of rich model repositories, such as TensorFlow Hub, enables practitioners and researchers to unleash the potential of these models across a wide range of downstream tasks. As these repositories keep growing exponentially, efficiently selecting a good model for the task at hand becomes paramount. However, a single generic search strategy (e.g., taking the model with the highest linear classifier accuracy) does not lead to optimal model selection for diverse downstream tasks. In fact, using hybrid or mixed strategies can often be beneficial. Therefore, we propose SHiFT, the first downstream task-aware, flexible, and efficient model search engine for transfer learning. Users interface with SHiFT using the SHiFT-QL query language, which gives them the flexibility to customize their search criteria. We optimize SHiFT-QL queries using a cost-based decision maker and evaluate them on a wide range of tasks. Motivated by the iterative nature of machine learning development, we further support efficient incremental execution of our queries, which requires a special implementation when used jointly with our optimizations.
