
A Hybrid Model for Extractive Summarization: Leveraging Graph Entropy to Improve Large Language Model Performance


Date

2025

Journal Title

Journal ISSN

Volume Title

Publisher

Elsevier

Abstract

Extractive text summarization models condense large texts by selecting key sentences rather than generating new ones. Recently, studies have used large language models (LLMs) for effective summarization, but limitations such as cost and running time make high performance difficult to achieve with LLMs alone. This study introduces a hybrid model that combines graph entropy with LLMs to improve summarization accuracy and time efficiency. First, the text is represented as a graph, with each sentence as a node. Using Karci Entropy (KE) to measure the information carried by each sentence, the model selects the most valuable sentences, which are then processed by LLMs such as BERT, RoBERTa, and XLNet to create summaries of 400 words, 200 words, and 3 sentences. Experiments on the DUC 2002 and CNN/Daily Mail datasets show significant gains in both accuracy and processing speed, highlighting the effectiveness of the proposed model.
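The abstract describes a two-stage pipeline: a graph-based, entropy-driven sentence selection step followed by LLM processing of the selected sentences. The Python sketch below illustrates only the first stage under stated assumptions; the TF-IDF cosine similarity measure, the edge threshold, the `top_k` cut, and the Shannon-style degree-distribution score standing in for Karci Entropy are illustrative choices, not the authors' exact formulation.

```python
# Minimal sketch of the graph-entropy sentence-selection stage (assumptions noted above).
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity


def select_key_sentences(sentences, top_k=10, edge_threshold=0.1):
    """Rank sentences by a graph-entropy-style score and keep the top_k."""
    # 1. Represent the text as a graph: each sentence is a node, and edges
    #    connect sentences whose similarity exceeds a threshold (assumed value).
    tfidf = TfidfVectorizer().fit_transform(sentences)
    sim = cosine_similarity(tfidf)
    np.fill_diagonal(sim, 0.0)
    adj = np.where(sim >= edge_threshold, sim, 0.0)

    # 2. Score each node. The paper uses Karci Entropy (KE); here a
    #    degree-distribution Shannon-style contribution is used as a placeholder.
    degrees = adj.sum(axis=1)
    total = degrees.sum() or 1.0
    p = degrees / total
    scores = np.where(p > 0, -p * np.log(p), 0.0)

    # 3. Keep the highest-scoring sentences in their original order; in the
    #    paper these are then passed to an LLM (BERT, RoBERTa, XLNet) to
    #    produce the 400-word, 200-word, and 3-sentence summaries.
    ranked = np.argsort(scores)[::-1][:top_k]
    return [sentences[i] for i in sorted(ranked)]
```

A caller would split the document into sentences (e.g. with any sentence tokenizer), pass the list to `select_key_sentences`, and feed the returned subset to the downstream summarizer; the function and parameter names here are hypothetical.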

Description

Keywords

Extractive Summarization, Large Language Models, Graph Theory, Karcı Entropy

Turkish CoHE Thesis Center URL

WoS Q

Q1

Scopus Q

Q1

Source

Volume

16

Issue

5

Start Page

End Page