A set of document-level language models with contextual information. Please refer to our ICLR 2016 submission for more technical details.
You need the Boost C++ libraries (>= 1.56) to save and load the word vocabulary and trained models.
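For example, on Debian/Ubuntu systems Boost can be installed from the package manager; the package name and header path below are assumptions for those systems, and any installation that provides Boost >= 1.56 with the serialization library will do:

```
# Install Boost from the system package manager (Debian/Ubuntu package name assumed)
sudo apt-get install libboost-all-dev

# Check the installed version; Boost 1.56 reports BOOST_LIB_VERSION "1_56"
grep BOOST_LIB_VERSION /usr/include/boost/version.hpp
```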
First, fetch the cnn library into the same folder, then follow its instructions to install the additional libraries it requires and compile cnn.
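As a rough sketch of that step (the repository URL, the Eigen dependency, and the CMake options below are assumptions based on the cnn project's own documentation; defer to its README if they differ):

```
# Run from the top of this repository so cnn/ ends up in the same folder (assumed layout, URL assumed)
git clone https://github.com/clab/cnn.git cnn

# Build cnn following its own instructions; it is typically configured with CMake
# and needs the Eigen headers (adjust the Eigen path for your machine)
cd cnn
mkdir build && cd build
cmake .. -DEIGEN3_INCLUDE_DIR=/path/to/eigen
make -j4
```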
To compile all DCLMs, run

```
make
```
It will produce three executable files: main-dclm, dam, and baseline.
- baseline refers to the RNNLM model with contextual information
- dam refers to the attentional DCLM
- main-dclm includes the other four models: context-to-output DCLM, context-to-context DCLM, RNNLM without sentence boundaries, and the Hierarchical RNNLM.