
# RATransformers 🐭


RATransformers, short for Relation-Aware Transformers, is a package built on top of transformers 🤗 that enables the training/fine-tuning of models with extra relation-aware input features.

## Example - Encoding a table in TableQA (Question Answering on Tabular Data)

[Notebook Link]

This example shows that passing the table to the model as plain text, with no additional structural information, yields a poor representation.

With RATransformers 🐭 you are able to encode the table in a more structured way by passing specific relations within the input. RATransformers 🐭 also allows you to pass further features related with each input word/token.
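To make the idea of "specific relations within the input" concrete, here is a minimal, self-contained sketch of how a token-level relation matrix for a tiny table could be built. This is illustrative only and is **not** the ratransformers API; the function and variable names (`build_relation_matrix`, `column_of`, `row_of`) are hypothetical, but the relation kinds match the ones used in the Usage section below.

```python
# Hypothetical sketch (NOT the ratransformers API): build an NxN matrix
# where entry (i, j) holds a relation id (0 = no relation) between
# input token i and input token j.

def build_relation_matrix(tokens, column_of, row_of, relation_ids):
    n = len(tokens)
    matrix = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            # a cell token relates to the token of its column header
            if column_of[i] is not None and column_of[i] == j:
                matrix[i][j] = relation_ids['is_value_of_column']
            # two cell tokens from the same table row relate to each other
            elif row_of[i] is not None and row_of[i] == row_of[j]:
                matrix[i][j] = relation_ids['is_from_same_row']
    return matrix

# Tiny table: header tokens "city", "pop" and one row "Lisbon", "505526"
tokens = ['city', 'pop', 'Lisbon', '505526']
column_of = [None, None, 0, 1]   # index of each cell's header token
row_of = [None, None, 0, 0]      # row index of each cell token
rel_ids = {'is_value_of_column': 1, 'is_from_same_row': 2}

matrix = build_relation_matrix(tokens, column_of, row_of, rel_ids)
```

A matrix like this, with one id per token pair, is the kind of structured relational signal that plain text flattening throws away.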

Check out more examples [here].

## Installation

Install directly from PyPI:

```bash
pip install ratransformers
```

## Usage

```python
from ratransformers import RATransformer
from transformers import AutoModelForSequenceClassification


ratransformer = RATransformer(
    "nielsr/tapex-large-finetuned-tabfact",                     # the 🤗 model you want to load
    relation_kinds=['is_value_of_column', 'is_from_same_row'],  # the relations you want to model in the input
    model_cls=AutoModelForSequenceClassification,               # the model class
    pretrained_tokenizer_name_or_path='facebook/bart-large'     # the tokenizer to load (in case it differs from the model's)
)
model = ratransformer.model
tokenizer = ratransformer.tokenizer
```

With only these steps your RATransformer 🐭 is ready to be trained.

More implementation details in the examples here.

## How does it work?

We modify the self-attention layers of the transformer model as explained in Section 3 of the RAT-SQL paper.
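For intuition, the relation-aware self-attention of RAT-SQL adds a learned relation embedding to the keys and values for each token pair. The single-head NumPy sketch below illustrates the mechanism under that assumption; all names and shapes are illustrative and this is not the package's actual implementation.

```python
import numpy as np

# Hedged sketch of relation-aware self-attention (RAT-SQL, Section 3):
# scores and values are shifted by per-pair relation embeddings.

def relation_aware_attention(x, Wq, Wk, Wv, rel_k, rel_v):
    """x: (n, d) token states; rel_k, rel_v: (n, n, d) relation embeddings
    looked up from the relation-id matrix of the input pair (i, j)."""
    n, d = x.shape
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    # score(i, j) = q_i . (k_j + rel_k[i, j]) / sqrt(d)
    scores = (q[:, None, :] * (k[None, :, :] + rel_k)).sum(-1) / np.sqrt(d)
    weights = np.exp(scores - scores.max(-1, keepdims=True))
    weights /= weights.sum(-1, keepdims=True)  # row-wise softmax
    # output(i) = sum_j weights[i, j] * (v_j + rel_v[i, j])
    return (weights[:, :, None] * (v[None, :, :] + rel_v)).sum(1)

rng = np.random.default_rng(0)
n, d = 4, 8
x = rng.normal(size=(n, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
rel_k = 0.1 * rng.normal(size=(n, n, d))
rel_v = 0.1 * rng.normal(size=(n, n, d))
out = relation_aware_attention(x, Wq, Wk, Wv, rel_k, rel_v)
```

With `rel_k` and `rel_v` set to zero this reduces to standard scaled dot-product self-attention, which is why the modification can be dropped into a pretrained model.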

## Supported Models

Currently, we support a limited number of transformer models.

Want another model? Feel free to open an Issue or create a Pull Request and let's get started 🚀