This repository hosts the dataset associated with our paper titled “When Context Leads but Parametric Memory Follows in Large Language Models”. Our work has been accepted for presentation at the 2024 Conference on Empirical Methods in Natural Language Processing (EMNLP).
- wikiatomic.csv: Contains all 10,000 atomic sentences, separated by topic (~200 topics, 50 atomic sentences each); see the loading sketch below.
- subdatasets for each model: This folder contains 9 CSV files, one per LLM; each file contains the questions used in our paper and the corresponding responses from that model.
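
As a minimal sketch, the atomic sentences can be loaded with pandas. The file name `wikiatomic.csv` comes from this repository, but the column names referenced below (e.g., `topic`) are assumptions and may differ from the actual header in the released file.

```python
# Minimal sketch for loading wikiatomic.csv; column names are assumptions.
import pandas as pd

df = pd.read_csv("wikiatomic.csv")
print(df.shape)             # expect roughly 10,000 rows
print(df.columns.tolist())  # inspect the actual column names

# Group atomic sentences by topic, assuming a "topic" column exists.
if "topic" in df.columns:
    print(df.groupby("topic").size().head())
```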
We thank the anonymous reviewers as well as the members of the PortNLP lab for their insightful comments that helped improve this paper.
Please cite our work as follows:
@misc{tao2024contextleadsparametricmemory,
      title={When Context Leads but Parametric Memory Follows in Large Language Models},
      author={Yufei Tao and Adam Hiatt and Erik Haake and Antonie J. Jetter and Ameeta Agrawal},
      year={2024},
      eprint={2409.08435},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2409.08435},
}