Releases · ant-research/StructuredLM_RTDT
Official code for the paper "Augmenting Transformers with Recursively Composed Multi-grained Representations". In this work, we combine a composition model with bidirectional Transformers and make the two jointly pre-trainable.
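To make the description above concrete, here is a minimal sketch of the general idea, not the repository's actual implementation: a hypothetical binary composition cell merges adjacent span vectors bottom-up, and a standard bidirectional Transformer encoder then attends over the tokens together with the composed spans. All names here (ComposeCell, ComposedTransformer, dim, heads) are invented for illustration.

```python
# Hypothetical sketch: bottom-up span composition feeding a
# bidirectional Transformer with multi-grained representations.
import torch
import torch.nn as nn


class ComposeCell(nn.Module):
    """Merges two child span vectors into one parent span vector."""

    def __init__(self, dim: int):
        super().__init__()
        self.merge = nn.Sequential(nn.Linear(2 * dim, dim), nn.GELU())

    def forward(self, left: torch.Tensor, right: torch.Tensor) -> torch.Tensor:
        return self.merge(torch.cat([left, right], dim=-1))


class ComposedTransformer(nn.Module):
    """Composes adjacent spans level by level, then runs a bidirectional
    Transformer over tokens plus all composed spans (multi-grained input)."""

    def __init__(self, vocab: int, dim: int = 256, heads: int = 4, layers: int = 2):
        super().__init__()
        self.embed = nn.Embedding(vocab, dim)
        self.compose = ComposeCell(dim)
        layer = nn.TransformerEncoderLayer(dim, heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, layers)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        # token_ids: (batch, seq_len)
        tokens = self.embed(token_ids)           # leaf (token) representations
        level = list(tokens.unbind(dim=1))       # one vector per token
        spans = []
        while len(level) > 1:                    # merge adjacent neighbours
            level = [self.compose(level[i], level[i + 1])
                     for i in range(len(level) - 1)]
            spans.append(torch.stack(level, dim=1))
        # Concatenate tokens with every composed level: multi-grained sequence.
        multi_grained = torch.cat([tokens] + spans, dim=1)
        return self.encoder(multi_grained)


# Usage: encode a toy batch of two 5-token sequences.
model = ComposedTransformer(vocab=1000)
out = model(torch.randint(0, 1000, (2, 5)))
print(out.shape)  # (2, 15, 256): 5 tokens + 4 + 3 + 2 + 1 composed spans
```

Note that this greedy pyramid composes every adjacent pair at each level purely for illustration; per the paper titles, the actual models induce binary trees differentiably, which this sketch does not attempt.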
self-interpretable classification

Fast-R2D2
The paper version and the corresponding model pretrained on WikiText-103.

r2d2
Code for the paper "R2D2: Recursive Transformer based on Differentiable Tree for Interpretable Hierarchical Language Modeling".