Keyword search (4,163 papers available)

"Attention" Keyword-tagged Publications:

Title Authors PubMed ID
1 Tuned to walk: cue type, beat perception, and gait dynamics during rhythmic stimulation in aging Parker A; Dalla Bella S; Penhune VB; Young L; Grenet D; Li KZH; 41661338
PSYCHOLOGY
2 Towards user-centered interactive medical image segmentation in VR with an assistive AI agent Spiegler P; Harirpoush A; Xiao Y; 41509996
ENCS
3 Attention-Fusion-Based Two-Stream Vision Transformer for Heart Sound Classification Ranipa K; Zhu WP; Swamy MNS; 41155032
ENCS
4 Lung Nodule Malignancy Classification Integrating Deep and Radiomic Features in a Three-Way Attention-Based Fusion Module Khademi S; Heidarian S; Afshar P; Mohammadi A; Sidiqi A; Nguyen ET; Ganeshan B; Oikonomou A; 41150036
ENCS
5 Reduced Eye Blinking During Sentence Listening Reflects Increased Cognitive Load in Challenging Auditory Conditions Coupal P; Zhang Y; Deroche M; 40910460
PSYCHOLOGY
6 A novel span and syntax enhanced large language model based framework for fine-grained sentiment analysis Zou H; Wang Y; Huang A; 40876298
ENCS
7 Joint enhancement of automatic chest x-ray diagnosis and radiological gaze prediction with multistage cooperative learning Qiu Z; Rivaz H; Xiao Y; 40665596
ENCS
8 Deformable detection transformers for domain adaptable ultrasound localization microscopy with robustness to point spread function variations Gharamaleki SK; Helfield B; Rivaz H; 40640235
PHYSICS
9 SAVE: Self-Attention on Visual Embedding for Zero-Shot Generic Object Counting Zgaren A; Bouachir W; Bouguila N; 39997554
ENCS
10 Association between aggression and ADHD polygenic scores and school-age aggression: the mediating role of preschool externalizing behaviors and adverse experiences Bouliane M; Boivin M; Kretschmer T; Lafreniere B; Paquin S; Tremblay R; Côté S; Gouin JP; Andlauer TFM; Petitclerc A; Ouellet-Morin I; 39907790
PSYCHOLOGY
11 NREM sleep brain networks modulate cognitive recovery from sleep deprivation Lee K; Wang Y; Cross NE; Jegou A; Razavipour F; Pomares FB; Perrault AA; Nguyen A; Aydin Ü; Uji M; Abdallah C; Anticevic A; Frauscher B; Benali H; Dang-Vu TT; Grova C; 39005401
PERFORM
12 The Algorithms of Mindfulness Johannes Bruder 35103028
CONCORDIA
13 Neural substrates of appetitive and aversive prediction error Iordanova MD; Yau JO; McDannald MA; Corbit LH; 33453307
CSBN
14 Predicting Interpersonal Outcomes From Information Processing Tasks Using Personally Relevant and Generic Stimuli: A Methodology Study Serravalle L; Tsekova V; Ellenbogen MA; 33071861
CRDH
15 Synergistic effects of cognitive training and physical exercise on dual-task performance in older adults Bherer L; Gagnon C; Langeard A; Lussier M; Desjardins-Crépeau L; Berryman N; Bosquet L; Vu TTM; Fraser S; Li KZH; Kramer AF; 32803232
PERFORM
16 Prefrontal Cortex and Multiparity in Lactation Opala EA; Verlezza S; Long H; Rusu D; Woodside B; Walker CD; 31437474
CSBN
17 Gating of the neuroendocrine stress responses by stressor salience in early lactating female rats is independent of infralimbic cortex activation and plasticity Hillerer KM; Woodside B; Parkinson E; Long H; Verlezza S; Walker CD; 29397787
CSBN
18 Dehydroepiandrosterone impacts working memory by shaping cortico-hippocampal structural covariance during development Nguyen TV; Wu M; Lew J; Albaugh MD; Botteron KN; Hudziak JJ; Fonov VS; Collins DL; Campbell BC; Booij L; Herba C; Monnier P; Ducharme S; McCracken JT; 28946055
PSYCHOLOGY
19 Limited Benefits of Heterogeneous Dual-Task Training on Transfer Effects in Older Adults Lussier M; Brouillard P; Bherer L; 26603017
PERFORM
20 Specific transfer effects following variable priority dual-task training in older adults Lussier M; Bugaiska A; Bherer L; 27372514
PERFORM

 

Title:A novel span and syntax enhanced large language model based framework for fine-grained sentiment analysis
Authors: Zou H; Wang Y; Huang A
Link:https://pubmed.ncbi.nlm.nih.gov/40876298/
DOI:10.1016/j.neunet.2025.108012
Publication:Neural networks : the official journal of the International Neural Network Society
Keywords: Fine-grained sentiment analysis; Large language model; Natural language processing; Span-aware attention; Syntax-aware transformer
PMID: 40876298 Category: Date Added: 2025-08-29
Dept Affiliation: ENCS
1 School of Computer Science and Engineering, Nanjing University of Science and Technology, Xiaolingwei Street No.200, Nanjing, 210094, Jiangsu, China; Department of Computer Science and Software Engineering, Concordia University, 2155 Guy Street, Montreal, H3H 2L9, Quebec, Canada. Electronic address: haochen.zou@mail.concordia.ca.
2 School of Computer Science and Engineering, Nanjing University of Science and Technology, Xiaolingwei Street No.200, Nanjing, 210094, Jiangsu, China. Electronic address: yongliwang@njust.edu.cn.
3 School of Computer Science and Engineering, Nanjing University of Science and Technology, Xiaolingwei Street No.200, Nanjing, 210094, Jiangsu, China. Electronic address: anqihuang@njust.edu.cn.

Description:

Fine-grained aspect-based sentiment analysis requires language models to identify aspect entities and the corresponding sentiment information in the input text. Transformer-based pre-trained large language models have demonstrated remarkable performance on a range of challenging natural language processing tasks. However, large language models face limitations in explicitly modelling syntactic relationships and in capturing local nuances between terms in the text, which constrains their capability in fine-grained aspect-based sentiment analysis. We propose a novel span and syntax enhanced joint learning framework built on a recent large language model. The framework incorporates three key components, a span-aware attention mechanism, a contextual Transformer, and a syntax-aware Transformer, which operate in parallel to generate span-aware, contextual, and syntax-aware features, respectively. The three feature streams are dynamically fused in a feature aggregation module, yielding a combined representation for aspect entity recognition and sentiment classification. To the best of our knowledge, this study is the first to comprehensively leverage span-aware, contextual, and syntax-aware characteristics to augment large language models for the fine-grained aspect-based sentiment analysis task. Experimental results on publicly available benchmark datasets validate the effectiveness of the architecture against state-of-the-art baselines.
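The dynamic fusion step described in the abstract, three parallel feature streams combined by learned weights, could look something like the following minimal numpy sketch. This is an illustrative reconstruction only, not the paper's actual method: the function and parameter names (`dynamic_fusion`, `gate_w`) and the softmax-gated weighted sum are assumptions about how "dynamically fused" might be realized.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - np.max(x))
    return e / e.sum()

def dynamic_fusion(span_feat, ctx_feat, syn_feat, gate_w):
    """Fuse three feature streams of shape (d,) into one (d,) vector.

    gate_w (shape (d,)) is a hypothetical learned gating vector that
    scores each stream; the softmax of the scores gives the mixing
    weights for the weighted sum.
    """
    feats = np.stack([span_feat, ctx_feat, syn_feat])  # (3, d)
    scores = feats @ gate_w                            # (3,) one score per stream
    weights = softmax(scores)                          # (3,) sum to 1
    fused = weights @ feats                            # (d,) weighted combination
    return fused, weights

# usage with random toy features
rng = np.random.default_rng(0)
d = 8
span_f, ctx_f, syn_f, gate = rng.normal(size=(4, d))
fused, weights = dynamic_fusion(span_f, ctx_f, syn_f, gate)
```

In the paper's framework the fused representation would then feed the aspect-entity recognition and sentiment-classification heads; here the gate is a fixed vector purely for illustration, whereas a real implementation would learn it end to end.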





BookR developed by Sriram Narayanan
for the Concordia University School of Health
Copyright © 2011-2026