Publication Search

Query:

Scholar name: 王一蕾

注意力融合机制和拓扑关系挖掘的异构图神经网络 (Heterogeneous Graph Neural Network Based on an Attention Fusion Mechanism and Topological Relation Mining)
Journal article | 2025, 53(1), 1-9 | 福州大学学报(自然科学版)

Abstract:

To address the shortcomings of heterogeneous graph neural network models whose reliance on meta-paths and complex aggregation operations leads to restricted meta-paths and high cost, this paper proposes a heterogeneous graph neural network based on an attention fusion mechanism and topological relation mining (FTHGNN). The model first applies a lightweight attention fusion mechanism that combines global relation information with local node information, achieving more effective message aggregation at low space and time cost. It then replaces meta-path methods with a topological relation mining approach that requires no prior knowledge, mining higher-order neighbor relations on the graph, and introduces contrastive learning to capture higher-order semantic information. Finally, extensive experiments on four widely used real-world heterogeneous graph datasets verify that FTHGNN is simple yet efficient and outperforms the vast majority of existing models in classification accuracy.
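The fusion step the abstract describes can be pictured with a minimal sketch: a learned gate scores each node's local representation against a global relation summary and takes a convex combination of the two. This is an assumed simplification in PyTorch, not the authors' FTHGNN implementation; the module name, shapes, and gating formula are illustrative.

```python
import torch
import torch.nn as nn

class AttentionFusion(nn.Module):
    """Toy gate that fuses local node features with global relation features."""
    def __init__(self, dim: int):
        super().__init__()
        self.score = nn.Linear(2 * dim, 1)  # scores each (local, global) pair

    def forward(self, local: torch.Tensor, global_: torch.Tensor) -> torch.Tensor:
        # local, global_: (num_nodes, dim)
        alpha = torch.sigmoid(self.score(torch.cat([local, global_], dim=-1)))
        return alpha * local + (1 - alpha) * global_  # convex combination per node

fusion = AttentionFusion(dim=64)
fused = fusion(torch.randn(10, 64), torch.randn(10, 64))
print(fused.shape)  # torch.Size([10, 64])
```

A single linear scorer like this is what keeps the fusion "lightweight": it adds O(dim) parameters rather than a full multi-head attention stack.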

Keywords:

graph neural network; contrastive learning; heterogeneous graph; attention mechanism

Cite:


GB/T 7714: 陈金杰, 王一蕾, 傅仰耿. 注意力融合机制和拓扑关系挖掘的异构图神经网络[J]. 福州大学学报(自然科学版), 2025, 53(1): 1-9.
MLA: 陈金杰 et al. "注意力融合机制和拓扑关系挖掘的异构图神经网络." 福州大学学报(自然科学版) 53.1 (2025): 1-9.
APA: 陈金杰, 王一蕾, 傅仰耿. (2025). 注意力融合机制和拓扑关系挖掘的异构图神经网络. 福州大学学报(自然科学版), 53(1), 1-9.

面向中文小样本命名实体识别的BERT优化方法 (A BERT Optimization Method for Chinese Few-shot Named Entity Recognition)
Journal article | 2025, 46(3), 602-611 | 小型微型计算机系统

Abstract:

To address the problems and challenges of Chinese few-shot named entity recognition (NER), this paper proposes a BERT optimization method for Chinese few-shot NER comprising two improvements. First, to counter the limited semantic perception of the pre-trained language model BERT when training samples are scarce, ProConBERT, a BERT pre-training strategy based on prompt learning and contrastive learning, is proposed. In the prompt learning stage, a mask-filling template is designed to train BERT to predict the Chinese label word corresponding to each token. In the contrastive learning stage, a guiding template trains BERT to learn the similarities and differences between each token and the label words. Second, to handle the complexity caused by the absence of explicit word boundaries in Chinese, the first Transformer layer of BERT is modified and a feature fusion module with a hybrid weight guider is designed to integrate lexicon information into BERT's lower layers. Finally, experimental results verify the effectiveness and superiority of the proposed method on Chinese few-shot NER tasks. Combined with a conditional random field (CRF) layer, the method achieves the best performance on four sampled Chinese NER datasets. In particular, in three few-shot settings on the Weibo dataset, the model reaches F1 scores of 63.78%, 66.27%, and 70.90%, improving the average F1 score over other methods by 16.28%, 14.30%, and 11.20%, respectively. Moreover, applying ProConBERT to several BERT-based Chinese NER models further improves entity recognition performance.
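As a rough illustration of the mask-filling template idea, the sketch below builds a cloze prompt and reads BERT's prediction at the [MASK] position, restricted to a small set of candidate label words. The template wording, the single-character label words, and the `bert-base-chinese` checkpoint are assumptions for demonstration, not the paper's ProConBERT setup (which pre-trains BERT on such templates rather than using them zero-shot).

```python
import torch
from transformers import BertForMaskedLM, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
model = BertForMaskedLM.from_pretrained("bert-base-chinese")

sentence, token = "王小明在北京上学", "北"
prompt = f"{sentence}。{token}字的实体类型是[MASK]。"  # assumed cloze template
inputs = tokenizer(prompt, return_tensors="pt")
mask_pos = (inputs.input_ids[0] == tokenizer.mask_token_id).nonzero().item()

label_words = ["人", "地", "机", "无"]  # assumed single-character label words
label_ids = tokenizer.convert_tokens_to_ids(label_words)
with torch.no_grad():
    logits = model(**inputs).logits[0, mask_pos]
print(label_words[logits[label_ids].argmax().item()])  # predicted label word
```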

Keywords:

BERT model; Chinese few-shot named entity recognition; contrastive learning; prompt learning; feature fusion; pre-training

Cite:


GB/T 7714: 杨三和, 赖沛超, 傅仰耿, et al. 面向中文小样本命名实体识别的BERT优化方法[J]. 小型微型计算机系统, 2025, 46(3): 602-611.
MLA: 杨三和 et al. "面向中文小样本命名实体识别的BERT优化方法." 小型微型计算机系统 46.3 (2025): 602-611.
APA: 杨三和, 赖沛超, 傅仰耿, 王一蕾, 叶飞扬, 张林. (2025). 面向中文小样本命名实体识别的BERT优化方法. 小型微型计算机系统, 46(3), 602-611.


FE-CFNER: Feature Enhancement-based approach for Chinese Few-shot Named Entity Recognition (SCIE)
Journal article | 2024, 90 | COMPUTER SPEECH AND LANGUAGE

Abstract:

Although significant progress has been made in Chinese Named Entity Recognition (NER) methods based on deep learning, their performance often falls short in few-shot scenarios. Feature enhancement is considered a promising way to address Chinese few-shot NER, but traditional feature fusion methods tend to lose important information and absorb irrelevant information. Incorporating BERT also benefits entity recognition, yet its performance is limited when training data is insufficient. To tackle these challenges, this paper proposes a Feature Enhancement-based approach for Chinese Few-shot NER called FE-CFNER. FE-CFNER designs a double cross neural network that minimizes information loss through two interacting rounds of feature crossing. Additionally, adaptive weights and a top-k mechanism are introduced to sparsify attention distributions, enabling the model to prioritize information relevant to entities while excluding irrelevant information. To further enhance the quality of BERT embeddings, FE-CFNER employs a contrastive template for contrastive-learning pre-training of BERT, strengthening BERT's semantic understanding. We evaluate the proposed method on four sampled Chinese NER datasets: Weibo, Resume, Taobao, and Youku. Experimental results validate the effectiveness and superiority of FE-CFNER on Chinese few-shot NER tasks.
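The top-k sparsification the abstract mentions is easy to state concretely: keep only the k largest attention logits per query, mask the rest to negative infinity, and renormalize so attention mass concentrates on the most relevant positions. A minimal sketch under assumed shapes; FE-CFNER's adaptive-weight variant is not reproduced here.

```python
import torch
import torch.nn.functional as F

def topk_attention(q, k, v, top_k: int):
    """Scaled dot-product attention keeping only the top_k logits per query."""
    scores = q @ k.transpose(-2, -1) / (q.size(-1) ** 0.5)   # (batch, seq, seq)
    kth_best = scores.topk(top_k, dim=-1).values[..., -1:]   # k-th largest logit
    scores = scores.masked_fill(scores < kth_best, float("-inf"))
    return F.softmax(scores, dim=-1) @ v

q = k = v = torch.randn(2, 8, 16)
print(topk_attention(q, k, v, top_k=3).shape)  # torch.Size([2, 8, 16])
```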

Keywords:

Chinese Named Entity Recognition; contrastive learning pre-training; feature enhancement; few-shot learning

Cite:


GB/T 7714: Yang, Sanhe, Lai, Peichao, Fang, Ruixiong, et al. FE-CFNER: Feature Enhancement-based approach for Chinese Few-shot Named Entity Recognition[J]. COMPUTER SPEECH AND LANGUAGE, 2024, 90.
MLA: Yang, Sanhe, et al. "FE-CFNER: Feature Enhancement-based approach for Chinese Few-shot Named Entity Recognition." COMPUTER SPEECH AND LANGUAGE 90 (2024).
APA: Yang, Sanhe, Lai, Peichao, Fang, Ruixiong, Fu, Yanggeng, Ye, Feiyang, Wang, Yilei. (2024). FE-CFNER: Feature Enhancement-based approach for Chinese Few-shot Named Entity Recognition. COMPUTER SPEECH AND LANGUAGE, 90.
Quantum-inspired Neural Network Based on Stochastic Liouville-von Neumann Equation for Sentiment Classification (CPCI-S)
Conference paper | 2024 | 2024 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN 2024

Abstract:

Quantum-inspired models have shown enhanced capabilities in various language tasks, including question answering and sentiment analysis. However, current complex-valued models focus primarily on sentence embedding, overlooking the significance of the quantum evolution process and the extra time cost incurred by complex-valued computation. In this work, we present a novel quantum-inspired neural network, SSS-QNN, which integrates the Stochastic Liouville-von Neumann Equation (SLE) to simulate the evolution process and a complex-valued simple recurrent unit (SRU) to reduce the time cost, giving the model physical meaning and thus enhancing interpretability. We conduct comprehensive experiments on both sentence-level and document-level sentiment classification datasets. Compared to traditional models, large language models, and quantum-inspired models, SSS-QNN demonstrates competitive performance in accuracy and time cost. Additional ablation tests verify the effectiveness of the proposed modules.
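PyTorch's native complex tensors are enough to sketch the flavor of a complex-valued recurrent unit like the one the abstract mentions. The cell below is a toy stand-in, not the authors' SLE-based SSS-QNN; the modulus-based squashing nonlinearity is an assumption.

```python
import torch
import torch.nn as nn

class ComplexRecurrentCell(nn.Module):
    """Toy complex-valued recurrent update h' = squash(x W + h U)."""
    def __init__(self, dim: int):
        super().__init__()
        self.w = nn.Parameter(0.1 * torch.randn(dim, dim, dtype=torch.cfloat))
        self.u = nn.Parameter(0.1 * torch.randn(dim, dim, dtype=torch.cfloat))

    def forward(self, x: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
        z = x @ self.w + h @ self.u        # complex matrix products
        return z / (1 + z.abs())           # keeps the modulus bounded below 1

cell = ComplexRecurrentCell(dim=8)
h = torch.zeros(2, 8, dtype=torch.cfloat)
for _ in range(5):                          # unroll a short toy sequence
    h = cell(torch.randn(2, 8, dtype=torch.cfloat), h)
print(h.abs().max().item() < 1.0)           # True
```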

Keywords:

deep learning; quantum-inspired neural network; quantum theory; sentiment classification

Cite:


GB/T 7714: Yan, Kehuan, Lai, Peichao, Lyu, Qingwei, et al. Quantum-inspired Neural Network Based on Stochastic Liouville-von Neumann Equation for Sentiment Classification[C]. 2024 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN 2024, 2024.
MLA: Yan, Kehuan, et al. "Quantum-inspired Neural Network Based on Stochastic Liouville-von Neumann Equation for Sentiment Classification." 2024 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN 2024 (2024).
APA: Yan, Kehuan, Lai, Peichao, Lyu, Qingwei, Wang, Yilei. (2024). Quantum-inspired Neural Network Based on Stochastic Liouville-von Neumann Equation for Sentiment Classification. 2024 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN 2024.
NCSE: Neighbor Contrastive Learning for Unsupervised Sentence Embeddings (CPCI-S)
Conference paper | 2024 | 2024 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN 2024

Abstract:

Unsupervised sentence embedding methods based on contrastive learning have gained attention for effectively representing sentences in natural language processing. Retrieving additional samples via a nearest-neighbor approach can enhance a model's ability to learn relevant semantics and distinguish sentences. However, previous related research mainly retrieved neighboring samples within a single batch or across the global range, so the model may fail to capture effective semantic information or may incur excessive time cost. Furthermore, previous methods use retrieved neighbor samples as hard negatives. We argue that nearest-neighbor samples contain relevant semantic information, and treating them as hard negatives risks losing valuable semantic knowledge. In this work, we introduce Neighbor Contrastive learning for unsupervised Sentence Embeddings (NCSE), which combines contrastive learning with the nearest-neighbor approach. Specifically, we create a candidate set that stores sentence embeddings across multiple batches. Retrieving from the candidate set ensures sufficient samples, making it easier for the model to learn relevant semantics. Using retrieved nearest-neighbor samples as positives, and applying a self-attention mechanism to aggregate each sample with its neighbors, encourages the model to learn relevant semantics from multiple neighbors. Experiments on the semantic textual similarity task demonstrate our method's effectiveness for sentence embedding learning.
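A minimal sketch of the cross-batch candidate set, assuming a fixed-capacity FIFO bank and cosine-similarity retrieval; the class name, eviction policy, and retrieval details are illustrative, not NCSE's actual design.

```python
import torch
import torch.nn.functional as F

class CandidateSet:
    """FIFO bank of sentence embeddings shared across batches."""
    def __init__(self, dim: int, capacity: int = 1024):
        self.bank = torch.empty(0, dim)
        self.capacity = capacity

    def add(self, emb: torch.Tensor) -> None:
        self.bank = torch.cat([self.bank, emb.detach()])[-self.capacity:]

    def neighbors(self, emb: torch.Tensor, k: int = 4) -> torch.Tensor:
        sim = F.normalize(emb, dim=-1) @ F.normalize(self.bank, dim=-1).T
        idx = sim.topk(k, dim=-1).indices   # (batch, k) nearest candidates
        return self.bank[idx]               # (batch, k, dim) used as positives

bank = CandidateSet(dim=32)
bank.add(torch.randn(64, 32))               # embeddings from earlier batches
positives = bank.neighbors(torch.randn(8, 32), k=4)
print(positives.shape)  # torch.Size([8, 4, 32])
```

Caching embeddings from multiple batches is what gives the retrieval step "sufficient samples" without paying the cost of a global nearest-neighbor search.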

Keywords:

contrastive learning; sentence embedding; unsupervised learning

Cite:


GB/T 7714: Zhang, Zhengfeng, Lai, Peichao, Wang, Ruiqing, et al. NCSE: Neighbor Contrastive Learning for Unsupervised Sentence Embeddings[C]. 2024 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN 2024, 2024.
MLA: Zhang, Zhengfeng, et al. "NCSE: Neighbor Contrastive Learning for Unsupervised Sentence Embeddings." 2024 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN 2024 (2024).
APA: Zhang, Zhengfeng, Lai, Peichao, Wang, Ruiqing, Ye, Feiyang, Wang, Yilei. (2024). NCSE: Neighbor Contrastive Learning for Unsupervised Sentence Embeddings. 2024 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN 2024.
Quantum-inspired Language Model with Lindblad Master Equation and Interference Measurement for Sentiment Analysis (EI)
Conference paper | 2024, 1, 2112-2121 | 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, NAACL 2024

Abstract:

Quantum-inspired models have demonstrated superior performance in many downstream language tasks, such as question answering and sentiment analysis. However, recent models primarily focus on embedding and measurement operations, overlooking the significance of the quantum evolution process. In this work, we present a novel quantum-inspired neural network, LI-QiLM, which integrates the Lindblad Master Equation (LME) to model the evolution process and interferometry to model the measurement process, providing more physical meaning and strengthening interpretability. We conduct comprehensive experiments on six commonly used sentiment analysis datasets. Compared to traditional neural networks, transformer-based pre-trained models, and quantum-inspired models such as CICWE-QNN and ComplexQNN, the proposed method demonstrates superior performance in accuracy and F1-score. Additional ablation tests verify the effectiveness of LME and interferometry.
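For readers unfamiliar with the physics the model borrows, the Lindblad master equation named in the title has the standard textbook form below (stated here for context only, with H the Hamiltonian, ρ the density matrix, γ_k the decay rates, and L_k the jump operators):

```latex
\frac{d\rho}{dt} = -\frac{i}{\hbar}\,[H, \rho]
  + \sum_k \gamma_k \Big( L_k \rho L_k^{\dagger}
  - \tfrac{1}{2}\big\{ L_k^{\dagger} L_k,\; \rho \big\} \Big)
```

The commutator term describes closed-system (unitary) evolution; the dissipator sum is what lets the model capture open-system effects beyond a pure Liouville-von Neumann evolution.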

Keywords:

Computational linguistics; Interferometry; Sentiment analysis

Cite:


GB/T 7714: Yan, Kehuan, Lai, Peichao, Wang, Yilei. Quantum-inspired Language Model with Lindblad Master Equation and Interference Measurement for Sentiment Analysis[C]. 2024: 2112-2121.
MLA: Yan, Kehuan, et al. "Quantum-inspired Language Model with Lindblad Master Equation and Interference Measurement for Sentiment Analysis." (2024): 2112-2121.
APA: Yan, Kehuan, Lai, Peichao, Wang, Yilei. (2024). Quantum-inspired Language Model with Lindblad Master Equation and Interference Measurement for Sentiment Analysis, 2112-2121.

CogNLG: Cognitive graph for KG-to-text generation (SCIE)
Journal article | 2023, 41(1) | EXPERT SYSTEMS

Abstract:

Knowledge graphs (KGs) have been widely adopted in natural language generation (NLG) tasks: a KG can help models generate controllable text and achieve better performance. However, most existing approaches still lack explainability and scalability in large-scale knowledge reasoning. In this work, we propose a novel CogNLG framework for KG-to-text generation tasks. CogNLG is based on the dual-process theory in cognitive science and consists of two systems: one acts as the analytic system for knowledge extraction, and the other as the perceptual system for text generation from the extracted knowledge. During text generation, CogNLG provides a visible and explainable reasoning path. Our framework shows excellent performance on all datasets and achieves a BLEU score of 36.7, an increase of 6.7 over the best competitor.
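The two-system division of labor is easy to picture with a toy pipeline: one function plays the analytic system (pull facts from the KG, leaving a visible reasoning path) and another plays the perceptual system (verbalize them). This is purely schematic; CogNLG's actual systems are learned neural models, and the triple format here is an assumption.

```python
# Toy stand-in for the dual-process flow; both "systems" are plain functions.
def analytic_system(kg, entity):
    """System 1: extract facts about the entity; the list is the reasoning path."""
    return [(s, r, o) for (s, r, o) in kg if s == entity]

def perceptual_system(facts):
    """System 2: verbalize the extracted facts into text."""
    return " ".join(f"{s} {r} {o}." for (s, r, o) in facts)

kg = [("Paris", "is the capital of", "France"),
      ("Paris", "lies on", "the Seine"),
      ("Berlin", "is the capital of", "Germany")]
print(perceptual_system(analytic_system(kg, "Paris")))
# Paris is the capital of France. Paris lies on the Seine.
```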

Keywords:

cognitive graph; KG-to-text; natural language generation

Cite:


GB/T 7714: Lai, Peichao, Ye, Feiyang, Fu, Yanggeng, et al. CogNLG: Cognitive graph for KG-to-text generation[J]. EXPERT SYSTEMS, 2023, 41(1).
MLA: Lai, Peichao, et al. "CogNLG: Cognitive graph for KG-to-text generation." EXPERT SYSTEMS 41.1 (2023).
APA: Lai, Peichao, Ye, Feiyang, Fu, Yanggeng, Chen, Zhiwei, Wu, Yingjie, Wang, Yilei, et al. (2023). CogNLG: Cognitive graph for KG-to-text generation. EXPERT SYSTEMS, 41(1).
M-Sim: Multi-level Semantic Inference Model for Chinese short answer scoring in low-resource scenarios (SCIE)
Journal article | 2023, 84 | COMPUTER SPEECH AND LANGUAGE

Abstract:

Short answer scoring is a significant task in natural language processing. On datasets containing numerous explicit or implicit symbols and quantization entities, existing approaches continue to perform poorly. Additionally, most relevant datasets contain only few-shot samples, reducing model efficacy in low-resource scenarios. To solve these issues, we propose a Multi-level Semantic Inference Model (M-Sim), which obtains features at multiple scales to fully account for the explicit or implicit entity information contained in the data. We then design prompt-based data augmentation to construct simulated datasets, which effectively enhances model performance in low-resource scenarios. Our M-Sim outperforms the best competitor models by an average of 1.48 percent in F1 score, and the data augmentation increases the performance of all approaches by an average of 0.036 in correlation coefficient score.
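To make "features at multiple scales" concrete, here is a deliberately simple sketch that compares two answers with hashed character n-gram features at three granularities and averages the cosine similarities. The hashing trick and the choice of n-gram sizes are assumptions for illustration; M-Sim itself uses learned neural features, not count vectors.

```python
import torch
import torch.nn.functional as F

def ngram_vector(text: str, n: int, dim: int = 128) -> torch.Tensor:
    """Hashed bag-of-character-n-grams feature vector."""
    v = torch.zeros(dim)
    for i in range(len(text) - n + 1):
        v[hash(text[i:i + n]) % dim] += 1.0
    return v

def multi_level_similarity(a: str, b: str) -> float:
    sims = [F.cosine_similarity(ngram_vector(a, n), ngram_vector(b, n), dim=0)
            for n in (1, 2, 3)]          # character, bigram, trigram levels
    return torch.stack(sims).mean().item()

print(round(multi_level_similarity("速度等于路程除以时间", "路程除以时间就是速度"), 3))
```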

Keywords:

Few-shot learning; Short answer scoring; Text similarity

Cite:


GB/T 7714: Lai, Peichao, Ye, Feiyang, Fu, Yanggeng, et al. M-Sim: Multi-level Semantic Inference Model for Chinese short answer scoring in low-resource scenarios[J]. COMPUTER SPEECH AND LANGUAGE, 2023, 84.
MLA: Lai, Peichao, et al. "M-Sim: Multi-level Semantic Inference Model for Chinese short answer scoring in low-resource scenarios." COMPUTER SPEECH AND LANGUAGE 84 (2023).
APA: Lai, Peichao, Ye, Feiyang, Fu, Yanggeng, Chen, Zhiwei, Wu, Yingjie, Wang, Yilei. (2023). M-Sim: Multi-level Semantic Inference Model for Chinese short answer scoring in low-resource scenarios. COMPUTER SPEECH AND LANGUAGE, 84.
Chinese Medical Named Entity Recognition Using External Knowledge (CPCI-S)
Conference paper | 2022, 13630, 359-371 | PRICAI 2022: TRENDS IN ARTIFICIAL INTELLIGENCE, PT II
WoS CC Cited Count: 2

Abstract:

The Chinese medical named entity recognition (NER) task usually lacks sufficient annotated data and involves many medical professional terms and abbreviations, making the task more difficult. In addition, compared with English NER, Chinese NER is more challenging because it lacks standard delimiters for determining named entity boundaries, so Chinese NER must perform word segmentation. In this paper, inspired by lexicon-based BERT, we propose a novel method for the Chinese medical NER task. We also design a template-based strategy to enrich word information and improve the model's ability to distinguish medical professional terms and abbreviations. Our method enhances word segmentation accuracy by introducing an external medical lexicon. To verify its effectiveness, we carry out experiments on three medical datasets; our method improves the F1 score by 0.92%, 1.18%, and 1.55% over the baselines.
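The external-lexicon step can be sketched as plain maximum-window dictionary matching: for every character position, collect the lexicon words that cover it, which a model could then fuse into its lower layers. The two-entry lexicon and function name below are made-up examples, not the paper's resources.

```python
def lexicon_matches(sentence: str, lexicon: set, max_len: int = 6):
    """Map each character index to the lexicon words covering it."""
    covering = {i: [] for i in range(len(sentence))}
    for i in range(len(sentence)):
        for j in range(i + 1, min(i + max_len, len(sentence)) + 1):
            word = sentence[i:j]
            if word in lexicon:
                for pos in range(i, j):
                    covering[pos].append(word)
    return covering

medical_lexicon = {"阿司匹林", "头痛"}   # illustrative external medical lexicon
hits = lexicon_matches("服用阿司匹林治头痛", medical_lexicon)
print(hits[2])  # ['阿司匹林'] -- this character is covered by a dictionary word
```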

Keywords:

Chinese medical NER; External knowledge; Prompt

Cite:


GB/T 7714: Zhang, Lin, Lai, Peichao, Ye, Feiyang, et al. Chinese Medical Named Entity Recognition Using External Knowledge[C]. PRICAI 2022: TRENDS IN ARTIFICIAL INTELLIGENCE, PT II, 2022, 13630: 359-371.
MLA: Zhang, Lin, et al. "Chinese Medical Named Entity Recognition Using External Knowledge." PRICAI 2022: TRENDS IN ARTIFICIAL INTELLIGENCE, PT II 13630 (2022): 359-371.
APA: Zhang, Lin, Lai, Peichao, Ye, Feiyang, Fang, Ruixiong, Wang, Ruiqing, Li, Jiayong, et al. (2022). Chinese Medical Named Entity Recognition Using External Knowledge. PRICAI 2022: TRENDS IN ARTIFICIAL INTELLIGENCE, PT II, 13630, 359-371.