🏡 Fast & easy transfer learning for NLP. Harvesting language models for the industry. Focus on Question Answering.
Updated Dec 20, 2023 · Python
Simple XLNet implementation with a PyTorch wrapper
This shows how to fine-tune the BERT language model and use pytorch-transformers for text classification
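A hedged sketch of what such a fine-tuning loop looks like. To keep the example runnable without model downloads, a small random linear encoder stands in for the pretrained model; with the `transformers` library you would load a real encoder (e.g. via `BertModel.from_pretrained`) in its place. All sizes and names here are illustrative.

```python
import torch
import torch.nn as nn

class ToyClassifier(nn.Module):
    """Classification head over a pooled encoder output.

    The `encoder` below is a hypothetical stand-in for a pretrained
    BERT encoder's pooled [CLS] representation.
    """
    def __init__(self, input_dim=32, hidden_size=768, num_labels=2):
        super().__init__()
        self.encoder = nn.Linear(input_dim, hidden_size)  # toy stand-in encoder
        self.dropout = nn.Dropout(0.1)
        self.classifier = nn.Linear(hidden_size, num_labels)

    def forward(self, features):
        pooled = torch.tanh(self.encoder(features))
        return self.classifier(self.dropout(pooled))

torch.manual_seed(0)
model = ToyClassifier()
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
loss_fn = nn.CrossEntropyLoss()

features = torch.randn(8, 32)       # a batch of 8 "documents"
labels = torch.randint(0, 2, (8,))  # binary class labels

model.train()
logits = model(features)            # shape: (8, num_labels)
loss = loss_fn(logits, labels)
loss.backward()                     # backprop through head and encoder
optimizer.step()
optimizer.zero_grad()
```

The loop body (forward, loss, backward, step) is the same shape whether the encoder is this toy module or a full pretrained transformer; only the encoder and tokenization change.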
Code for recognizing netizen sentiment during the COVID-19 epidemic, including LSTM, BERT, XLNet, and RoBERTa models; best F1 score of 0.725. Deployed on Google Colab.
Determine the polarity of Amazon Fine Food Reviews using ULMFiT, BERT, XLNet, and RoBERTa
PyTorch implementation of Deep-Learning Architectures
BERT (Bidirectional Encoder Representations from Transformers) is a transformer-based method of learning language representations. It is a bidirectional transformer pre-trained on a large corpus using a combination of two tasks: masked language modeling and next-sentence prediction.
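The masked-language-modeling objective mentioned above corrupts the input with the 80/10/10 rule from the BERT paper: about 15% of tokens are selected for prediction, and of those, 80% are replaced by `[MASK]`, 10% by a random vocabulary token, and 10% are left unchanged. A minimal sketch of that corruption step, with an illustrative toy vocabulary rather than a real tokenizer:

```python
import random

def mask_tokens(tokens, vocab, mask_token="[MASK]", mask_prob=0.15, rng=None):
    """Apply BERT-style MLM corruption; returns (corrupted, targets)."""
    rng = rng or random.Random()
    corrupted = list(tokens)
    targets = [None] * len(tokens)  # original token at each masked position
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:  # select ~15% of positions
            targets[i] = tok
            roll = rng.random()
            if roll < 0.8:
                corrupted[i] = mask_token         # 80%: replace with [MASK]
            elif roll < 0.9:
                corrupted[i] = rng.choice(vocab)  # 10%: random vocab token
            # remaining 10%: keep the original token
    return corrupted, targets

vocab = ["the", "cat", "sat", "on", "mat", "dog", "ran"]
sentence = ["the", "cat", "sat", "on", "the", "mat"]
corrupted, targets = mask_tokens(sentence, vocab, rng=random.Random(42))
```

The model is then trained to predict the entries of `targets` at the selected positions given the corrupted sequence; next-sentence prediction is a separate binary classification over sentence pairs.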
This GitHub repository presents our solution to Touché 2023 Task 4: Human Value Detection, a multilabel text classification task. We fine-tuned transformer architectures like BERT, RoBERTa, and XLNet to classify whether or not a given argument draws on a human value category.
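In a multilabel setup like the one described, each category gets an independent sigmoid decision rather than a single softmax over classes, so an argument can draw on zero, one, or several values at once. A minimal sketch of the decision step; the category names, logits, and threshold below are made up for illustration:

```python
import math

# Illustrative category names; Touché Task 4 uses its own taxonomy.
CATEGORIES = ["Security", "Achievement", "Benevolence", "Universalism"]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def predict_labels(logits, threshold=0.5):
    """Each category is an independent yes/no decision on its own logit."""
    return [name for name, z in zip(CATEGORIES, logits) if sigmoid(z) >= threshold]

print(predict_labels([2.0, -1.5, 0.3, -0.2]))  # → ['Security', 'Benevolence']
```

Training pairs the same per-category logits with a binary cross-entropy loss (e.g. `BCEWithLogitsLoss` in PyTorch) against a multi-hot label vector.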
This is a tool for evaluating the contextual similarity of documents. It captures the contextual meaning of a document and compares it with the contextual meanings of the other documents stored in its dataset.
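The usual mechanism behind such a comparison: each document is mapped to a fixed-size embedding vector (in practice, e.g. mean-pooled transformer outputs) and documents are ranked by cosine similarity to the query. A minimal sketch with made-up vectors standing in for real encoder outputs:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical 3-d embeddings; real ones would come from an encoder model.
query = [0.2, 0.7, 0.1]
dataset = {"doc_a": [0.2, 0.7, 0.1], "doc_b": [0.9, -0.1, 0.3]}

best = max(dataset, key=lambda k: cosine_similarity(query, dataset[k]))
print(best)  # → 'doc_a' (same direction as the query, cosine = 1.0)
```

Cosine similarity is preferred over raw dot products here because it ignores vector magnitude, which varies with document length.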
R&D for datasets for book genres
Identifying complaints on social media using the Transformer-XL-based XLNet model