Hi, first of all, thank you very much for the repository! :)
I want to retrain the model using a larger number of keyphrases and output longer keyphrases in general.
To achieve this I:
increase max_phrase_words from 5 to 10 in scripts/config.py
increase the max_gram parameter from 5 to 10 in my model (in bertkpe/networks/)
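For context, the number of phrase candidates enumerated per document grows roughly linearly with max_gram, so some extra memory use is expected. A quick back-of-the-envelope sketch (the token count of 512 is an assumption, not taken from the repo):

```python
# Rough count of n-gram candidates per document: for T tokens,
# phrases of length 1..k give sum_{n=1..k} (T - n + 1) candidates.
def candidate_count(tokens: int, max_gram: int) -> int:
    return sum(tokens - n + 1 for n in range(1, max_gram + 1))

T = 512  # typical BERT input length (assumption)
print(candidate_count(T, 5))   # 2550
print(candidate_count(T, 10))  # 5075
```

So going from 5 to 10 should roughly double the candidate memory, not exhaust it outright.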
However, any value larger than 5 makes me run out of memory during the step
"start preparing (train) features for bert2joint (bert) ..."
I also notice that with the value set to 6, I run out of memory at a much later point in the preparation step than with a higher value such as 10, even though the operation is performed in batches. I suspect there is a memory leak in one of the data loader functions.
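To help localize this, one could instrument the feature-preparation loop with Python's built-in tracemalloc and watch whether memory grows monotonically across batches. A minimal sketch — prepare_batch and batches are placeholders, not functions from this repo:

```python
import tracemalloc

def process_batches(batches, prepare_batch, report_every=100):
    """Run feature preparation batch by batch, reporting traced memory.

    A steadily climbing 'current' figure across batches would support
    the leak hypothesis; a flat curve would point at per-batch size instead.
    """
    tracemalloc.start()
    features = []
    for i, batch in enumerate(batches):
        features.extend(prepare_batch(batch))
        if i % report_every == 0:
            current, peak = tracemalloc.get_traced_memory()
            print(f"batch {i}: current={current / 1e6:.1f} MB, "
                  f"peak={peak / 1e6:.1f} MB")
    tracemalloc.stop()
    return features
```

If the leak hypothesis holds, the "current" number should keep climbing even after identical-size batches are processed.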