Volume 15, No. 12
Transformers for Tabular Data Representation: A Tutorial on Models and Applications
Abstract
In the last few years, the natural language processing community has witnessed advances in neural representations of free text with transformer-based language models (LMs). Given the importance of the knowledge available in relational tables, recent research efforts extend LMs by developing neural representations for tabular data. In this tutorial, we present these proposals with two main goals. First, we introduce to a database audience the potential and the limitations of current models. Second, we demonstrate the large variety of data applications that benefit from the transformer architecture. The tutorial aims to encourage database researchers to engage with and contribute to this new direction, and to empower practitioners with a new set of tools for applications involving text and tabular data.