
# QD-ELC-BERT

My own implementation of the BabyLM ELC-BERT, as described in the 2023 paper "Not all layers are equally as important: Every Layer Counts BERT" by Charpentier and Samuel.
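
The core idea of ELC-BERT is that each transformer layer reads from a learned weighted combination of all earlier layers' outputs (with the embedding output counting as layer zero), rather than only from the layer directly below it. Below is a minimal PyTorch sketch of that mixing step, assuming scalar per-layer weights normalized with a softmax and a near-identity initialization; it is an illustration of the idea, not the code in this repository, and the paper's exact normalization and initialization differ in detail.

```python
import torch
import torch.nn as nn


class ELCWeightedSum(nn.Module):
    """Mix all earlier layer outputs with learned scalar weights.

    Hypothetical sketch of the ELC-BERT layer-mixing idea, not this
    repository's actual module; the softmax normalization and the
    initialization are simplifying assumptions.
    """

    def __init__(self, num_previous: int):
        super().__init__()
        # One learnable scalar per earlier representation
        # (the embedding output counts as "layer 0").
        self.weights = nn.Parameter(torch.zeros(num_previous))
        with torch.no_grad():
            # Start close to a vanilla residual stack by favouring the
            # immediately preceding layer (assumed initialization).
            self.weights[-1] = 1.0

    def forward(self, previous_outputs: list[torch.Tensor]) -> torch.Tensor:
        # previous_outputs: [embeddings, layer_1, ..., layer_{i-1}],
        # each of shape (batch, seq_len, hidden_size).
        probs = torch.softmax(self.weights, dim=0)
        stacked = torch.stack(previous_outputs, dim=0)       # (L, B, T, H)
        return torch.einsum("l,lbth->bth", probs, stacked)   # (B, T, H)


# Example: the input to the third transformer layer as a mix of the
# embedding output and the first two layer outputs (toy shapes).
mix = ELCWeightedSum(num_previous=3)
hidden_states = [torch.randn(2, 16, 768) for _ in range(3)]
layer_input = mix(hidden_states)   # shape (2, 16, 768)
```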

# Bibliography

@inproceedings{georges-gabriel-charpentier-samuel-2023-layers,
    title = "Not all layers are equally as important: Every Layer Counts {BERT}",
    author = "Georges Gabriel Charpentier, Lucas  and
      Samuel, David",
    editor = "Warstadt, Alex  and
      Mueller, Aaron  and
      Choshen, Leshem  and
      Wilcox, Ethan  and
      Zhuang, Chengxu  and
      Ciro, Juan  and
      Mosquera, Rafael  and
      Paranjabe, Bhargavi  and
      Williams, Adina  and
      Linzen, Tal  and
      Cotterell, Ryan",
    booktitle = "Proceedings of the BabyLM Challenge at the 27th Conference on Computational Natural Language Learning",
    month = dec,
    year = "2023",
    address = "Singapore",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2023.conll-babylm.20",
    doi = "10.18653/v1/2023.conll-babylm.20",
    pages = "238--252",
}
