Content creation and setup of the YYÜ GCRIS Basic database is ongoing, carried out by Research Ecosystems (https://www.researchecosystems.com). During this process, the data you see may be incomplete.
 

A Hybrid Model for Extractive Summarization: Leveraging Graph Entropy To Improve Large Language Model Performance

dc.authorscopusid 57200138639
dc.authorwosid Uckan, Taner/Izp-9705-2023
dc.contributor.author Uckan, Taner
dc.date.accessioned 2025-05-10T17:29:45Z
dc.date.available 2025-05-10T17:29:45Z
dc.date.issued 2025
dc.department T.C. Van Yüzüncü Yıl Üniversitesi en_US
dc.department-temp [Uckan, Taner] Van Yuzuncu Yil Univ, Dept Comp Engn, TR-65000 Van, Turkiye en_US
dc.description.abstract Extractive text summarization models focus on condensing large texts by selecting key sentences rather than generating new ones. Recently, studies have utilized large language models (LLMs) for effective summarization solutions. However, limitations such as the cost and time of using LLMs make achieving high performance challenging. This study introduces a hybrid model that combines graph entropy with LLMs to improve summarization accuracy and time efficiency. Initially, the text is represented as a graph, with each sentence as a node. Using Karcı Entropy (KE) to measure each sentence's information content, the model selects the most valuable sentences, which are then processed by LLMs such as BERT, RoBERTa, and XLNet to create summaries of 400 words, 200 words, and 3 sentences. Testing on the DUC2002 and CNN/Daily Mail datasets shows significant gains in both accuracy and processing speed, highlighting the proposed model's effectiveness. en_US
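The abstract describes a two-stage pipeline: build a sentence graph, score each sentence's information content with an entropy measure, and hand only the top-scoring sentences to an LLM. A minimal sketch of the selection stage is below; it uses Jaccard word overlap for edge weights and Shannon entropy as a stand-in for the paper's Karcı Entropy, so the similarity measure, entropy formula, and all function names are illustrative assumptions rather than the authors' exact method.

```python
import math

def overlap(s1, s2):
    # Jaccard word overlap as an illustrative sentence-similarity measure
    w1, w2 = set(s1.lower().split()), set(s2.lower().split())
    if not w1 or not w2:
        return 0.0
    return len(w1 & w2) / len(w1 | w2)

def node_entropy(weights):
    # Shannon entropy of a node's normalized edge-weight distribution
    # (a stand-in for Karcı Entropy, whose exact form is not given here)
    total = sum(weights)
    if total == 0:
        return 0.0
    return -sum((w / total) * math.log2(w / total) for w in weights if w > 0)

def select_sentences(sentences, k=3):
    # Build a fully connected sentence graph and score each node,
    # then keep the k highest-entropy sentences in original order.
    n = len(sentences)
    scores = []
    for i in range(n):
        weights = [overlap(sentences[i], sentences[j])
                   for j in range(n) if j != i]
        scores.append((node_entropy(weights), i))
    top = sorted(i for _, i in sorted(scores, reverse=True)[:k])
    return [sentences[i] for i in top]

doc = [
    "Graph entropy measures the information carried by each sentence.",
    "The cat sat quietly.",
    "Selected sentences are then passed to an LLM such as BERT.",
    "Entropy-based selection reduces the text the LLM must process.",
]
print(select_sentences(doc, k=2))
```

In the full model, the sentences returned here would be passed to BERT, RoBERTa, or XLNet to produce the final 400-word, 200-word, or 3-sentence summary; pre-filtering with entropy is what cuts the LLM's input size and processing time.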
dc.description.woscitationindex Science Citation Index Expanded
dc.identifier.doi 10.1016/j.asej.2025.103348
dc.identifier.issn 2090-4479
dc.identifier.issn 2090-4495
dc.identifier.issue 5 en_US
dc.identifier.scopus 2-s2.0-105001402059
dc.identifier.scopusquality Q1
dc.identifier.uri https://doi.org/10.1016/j.asej.2025.103348
dc.identifier.uri https://hdl.handle.net/20.500.14720/12448
dc.identifier.volume 16 en_US
dc.identifier.wos WOS:001459360200001
dc.identifier.wosquality Q1
dc.institutionauthor Uckan, Taner
dc.language.iso en en_US
dc.publisher Elsevier en_US
dc.relation.publicationcategory Article - International Peer-Reviewed Journal - Institutional Faculty Member en_US
dc.rights info:eu-repo/semantics/closedAccess en_US
dc.subject Extractive Summarization en_US
dc.subject Large Language Models en_US
dc.subject Graph Theory en_US
dc.subject Karcı Entropy en_US
dc.title A Hybrid Model for Extractive Summarization: Leveraging Graph Entropy To Improve Large Language Model Performance en_US
dc.type Article en_US
