Q195: Summarization of short Persian documents using data-driven solutions
Thesis > Central Library of Shahrood University > Computer Engineering > MSc > 2021
Authors:
Abstract: Persian is a low-resource language, so abstractive summarization is a considerable challenge given the lack of a suitable dataset. We gathered a dataset appropriate for fine-tuning a pretrained transformer model, then used the mT5 pretrained model to train an abstractive title-generation model on the introduced dataset. The model achieved a ROUGE-1 score of 46.32 and produces coherent outputs. These results are competitive with state-of-the-art work on similar English tasks.
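Below is a minimal, illustrative sketch of how such mT5 fine-tuning for Persian abstractive title/summary generation might look using the Hugging Face transformers library. This is not the thesis code: the checkpoint (google/mt5-small), hyperparameters, data files, and column names ("document", "title") are assumptions made for illustration only.

    # Sketch: fine-tune mT5 on a Persian document -> title dataset (assumed CSV layout).
    from transformers import (
        MT5ForConditionalGeneration,
        MT5Tokenizer,
        Seq2SeqTrainer,
        Seq2SeqTrainingArguments,
        DataCollatorForSeq2Seq,
    )
    from datasets import load_dataset

    model_name = "google/mt5-small"  # assumed checkpoint, not necessarily the one used in the thesis
    tokenizer = MT5Tokenizer.from_pretrained(model_name)
    model = MT5ForConditionalGeneration.from_pretrained(model_name)

    # Placeholder dataset with "document" (short Persian text) and "title" columns.
    dataset = load_dataset("csv", data_files={"train": "train.csv", "validation": "dev.csv"})

    def preprocess(batch):
        # Encode source documents and target titles/summaries.
        inputs = tokenizer(batch["document"], max_length=512, truncation=True)
        targets = tokenizer(text_target=batch["title"], max_length=64, truncation=True)
        inputs["labels"] = targets["input_ids"]
        return inputs

    tokenized = dataset.map(
        preprocess, batched=True, remove_columns=dataset["train"].column_names
    )

    args = Seq2SeqTrainingArguments(
        output_dir="mt5-persian-title",      # illustrative values throughout
        per_device_train_batch_size=8,
        learning_rate=5e-4,
        num_train_epochs=3,
        predict_with_generate=True,
    )

    trainer = Seq2SeqTrainer(
        model=model,
        args=args,
        train_dataset=tokenized["train"],
        eval_dataset=tokenized["validation"],
        data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
    )
    trainer.train()

After training, summaries or titles would be produced with model.generate on tokenized input documents and scored with a ROUGE implementation, which is how a figure such as the reported ROUGE-1 of 46.32 is typically computed.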
Keywords:
#Summarization #Deep learning #Deep neural networks #Transformers #T5 #MT5 #Farsi abstractive text summarization #Farsi title generation
Keeping place: Central Library of Shahrood University