Tokyo Institute of Technology, Japan
Popular repositories
Cutout (Public, forked from uoguelph-mlrg/Cutout)
2.56%, 15.20%, and 1.30% test error on CIFAR-10, CIFAR-100, and SVHN. https://arxiv.org/abs/1708.04552
Python
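The forked Cutout repository implements the augmentation described in the linked paper: zeroing out a random square patch of each training image. A minimal sketch of that core idea, assuming a NumPy H×W×C image array (the function name and parameters here are illustrative, not the repository's actual API), could look like:

```python
import numpy as np

def cutout(image, mask_size=16, rng=None):
    """Zero out a random square patch of an HxWxC image (Cutout augmentation).

    The patch is centered at a uniformly random pixel, so near the image
    border the visible masked region may be smaller than mask_size x mask_size,
    matching the behavior described in the paper.
    """
    rng = rng or np.random.default_rng()
    h, w = image.shape[:2]
    cy, cx = int(rng.integers(h)), int(rng.integers(w))
    half = mask_size // 2
    y1, y2 = max(0, cy - half), min(h, cy + half)
    x1, x2 = max(0, cx - half), min(w, cx + half)
    out = image.copy()
    out[y1:y2, x1:x2] = 0  # mask the patch with zeros
    return out
```

Applied per sample during training (not at test time), this acts as a simple regularizer that forces the network to rely on a wider spatial context.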
Megatron-DeepSpeed (Public, forked from microsoft/Megatron-DeepSpeed)
Ongoing research training transformer language models at scale, including: BERT & GPT-2
Python
RedPajama-Data (Public, forked from togethercomputer/RedPajama-Data)
The RedPajama-Data repository contains code for preparing large datasets for training large language models.
Python