StoryAnalogy: Deriving Story-level Analogies from Large Language Models to Unlock Analogical Understanding
This repo contains the dataset and code for the EMNLP'23 paper *StoryAnalogy: Deriving Story-level Analogies from Large Language Models to Unlock Analogical Understanding*.
We recommend using Hugging Face's `datasets` library to load the story analogy dataset:
```python
from datasets import load_dataset

dataset = load_dataset("JoeyCheng/story_analogy")
```
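The exact split and column names are defined by the dataset card on the Hub, so the sketch below queries them from the loaded `DatasetDict` instead of hard-coding them:

```python
# Sketch: inspect the dataset after loading. Split and column names are
# whatever the Hub provides, so we look them up rather than assume them.
from datasets import load_dataset

dataset = load_dataset("JoeyCheng/story_analogy")
print(dataset)                  # DatasetDict: available splits and their columns
split = next(iter(dataset))     # take the first available split
print(dataset[split][0])        # print one example to see its fields
```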
The multiple choice subset can be found at `src/data/storyanalogy_multiple_choice.json`.
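Since this subset ships as a plain JSON file, a standard `json.load` is enough to read it; the per-entry schema is not documented here, so the sketch below only loads the file and reports its size:

```python
# Sketch: load the multiple-choice subset shipped with this repo.
# The structure of each entry is not documented here, so we just load
# the file and inspect the top-level container.
import json

with open("src/data/storyanalogy_multiple_choice.json", "r", encoding="utf-8") as f:
    mc_data = json.load(f)

print(type(mc_data), len(mc_data))  # container type and number of entries
```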
If you have any questions related to the code or the paper, please feel free to email us at [email protected].
If you use this dataset or code, please cite our paper:
```bibtex
@inproceedings{jiayang2023storyanalogy,
  title={StoryAnalogy: Deriving Story-level Analogies from Large Language Models to Unlock Analogical Understanding},
  author={Jiayang, Cheng and Qiu, Lin and Chan, Tsz and Fang, Tianqing and Wang, Weiqi and Chan, Chunkit and Ru, Dongyu and Guo, Qipeng and Zhang, Hongming and Song, Yangqiu and others},
  booktitle={Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing},
  pages={11518--11537},
  year={2023}
}
```