Publication Search

Query:

Scholar name: 王一蕾 (Wang Yilei)

注意力融合机制和拓扑关系挖掘的异构图神经网络 (Heterogeneous graph neural network based on an attention fusion mechanism and topological relation mining)
Journal article | 2025, 53(1), 1-9 | 福州大学学报(自然科学版) (Journal of Fuzhou University, Natural Science Edition)

Abstract :

Heterogeneous graph neural network models that depend on meta-paths and complex aggregation operations suffer from restricted meta-path coverage and high cost. To address this, we propose a heterogeneous graph neural network based on an attention fusion mechanism and topological relation mining (FTHGNN). The model first applies a lightweight attention fusion mechanism that merges global relation information with local node information, achieving more effective message aggregation at low cost in time and space. It then replaces meta-paths with a topological relation mining method that requires no prior knowledge to discover higher-order neighbor relations on the graph, and introduces contrastive learning to capture higher-order semantic information. Finally, extensive experiments on four widely used real-world heterogeneous graph datasets verify that FTHGNN is simple yet efficient, surpassing the vast majority of existing models in classification accuracy.
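The fusion of global relation information with local node information described in the abstract can be sketched as a convex combination of each node's two views, weighted by an attention score. This is a minimal NumPy illustration only; the function names and the single shared attention vector `w` are assumptions for exposition, not the paper's actual architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def fuse(local_h, global_h, w):
    # Score each node's local and global view with a shared attention
    # vector w, then mix the two views with the resulting weights.
    scores = np.stack([local_h @ w, global_h @ w], axis=1)  # (N, 2)
    alpha = softmax(scores, axis=1)                         # (N, 2), rows sum to 1
    return alpha[:, :1] * local_h + alpha[:, 1:] * global_h

rng = np.random.default_rng(0)
local_h = rng.normal(size=(5, 8))   # per-node local features
global_h = rng.normal(size=(5, 8))  # per-node global relation features
fused = fuse(local_h, global_h, rng.normal(size=8))
print(fused.shape)  # (5, 8)
```

Because the weights are a softmax, each fused entry stays between the corresponding local and global values, which is what makes the aggregation cheap relative to multi-head meta-path aggregators.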

Keyword :

Graph neural networks; contrastive learning; heterogeneous graphs; attention mechanism

Cite:

Copy a citation from the list below or export it to your reference manager.

GB/T 7714 陈金杰 , 王一蕾 , 傅仰耿 . 注意力融合机制和拓扑关系挖掘的异构图神经网络 [J]. | 福州大学学报(自然科学版) , 2025 , 53 (1) : 1-9 .
MLA 陈金杰 等. "注意力融合机制和拓扑关系挖掘的异构图神经网络" . | 福州大学学报(自然科学版) 53 . 1 (2025) : 1-9 .
APA 陈金杰 , 王一蕾 , 傅仰耿 . 注意力融合机制和拓扑关系挖掘的异构图神经网络 . | 福州大学学报(自然科学版) , 2025 , 53 (1) , 1-9 .


Span-Based Chinese Few-Shot NER with Contrastive and Prompt Learning EI
Conference paper | 2025, 15360 LNAI, 43-55 | 13th CCF International Conference on Natural Language Processing and Chinese Computing, NLPCC 2024

Abstract :

For Chinese Named Entity Recognition (NER) tasks, achieving strong performance with few training samples remains a challenge. Previous work primarily enhances NER models by incorporating additional knowledge to construct entity features, neglecting the semantic information of entity labels and entity boundary information. Moreover, conventional methods typically treat NER as a sequence labeling task, which makes them inadequate for nested entities. We propose a new span-based approach that uses contrastive learning and prompt learning to address these problems. By pulling similar entities closer together, pushing dissimilar entities further apart, and leveraging entity label information, we effectively improve model performance in few-shot scenarios. Experimental results demonstrate significant performance improvements on a sampled Chinese nested medical dataset and several flat datasets, offering new insight into few-shot NER tasks. © The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2025.
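The key property of a span-based formulation, as opposed to sequence labeling, is that candidate entities are enumerated as (start, end) spans, so overlapping spans can carry different labels and nested entities become representable. A minimal sketch of the span enumeration step (the function name and `max_len` cap are illustrative assumptions):

```python
def enumerate_spans(tokens, max_len=4):
    """All candidate (start, end) spans up to max_len tokens.
    Overlapping spans are kept, so nested entities are representable."""
    spans = []
    for i in range(len(tokens)):
        for j in range(i, min(i + max_len, len(tokens))):
            spans.append((i, j))
    return spans

spans = enumerate_spans(["福", "州", "大", "学"], max_len=2)
print(spans)  # [(0, 0), (0, 1), (1, 1), (1, 2), (2, 2), (2, 3), (3, 3)]
```

Each span is then scored independently by a classifier, so spans (0, 1) and (1, 2), which a BIO tagger could never label simultaneously, can both receive entity labels.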

Keyword :

Adversarial machine learning; contrastive learning; federated learning; zero-shot learning

Cite:


GB/T 7714 Ye, Feiyang , Lai, Peichao , Yang, Sanhe et al. Span-Based Chinese Few-Shot NER with Contrastive and Prompt Learning [C] . 2025 : 43-55 .
MLA Ye, Feiyang et al. "Span-Based Chinese Few-Shot NER with Contrastive and Prompt Learning" . (2025) : 43-55 .
APA Ye, Feiyang , Lai, Peichao , Yang, Sanhe , Zhang, Zhengfeng , Wang, Yilei . Span-Based Chinese Few-Shot NER with Contrastive and Prompt Learning . (2025) : 43-55 .

Version :

Span-Based Chinese Few-Shot NER with Contrastive and Prompt Learning CPCI-S
Journal article | 2025, 15360, 43-55 | NATURAL LANGUAGE PROCESSING AND CHINESE COMPUTING, PT II, NLPCC 2024
Span-Based Chinese Few-Shot NER with Contrastive and Prompt Learning Scopus
Other | 2025, 15360 LNAI, 43-55 | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
面向中文小样本命名实体识别的BERT优化方法 (A BERT optimization method for Chinese few-shot named entity recognition)
Journal article | 2025, 46(3), 602-611 | 小型微型计算机系统 (Journal of Chinese Computer Systems)

Abstract :

To address the problems and challenges of Chinese few-shot named entity recognition (NER), we propose a BERT optimization method for Chinese few-shot NER with two improvements. First, because insufficient training samples limit the semantic perception ability of the pre-trained language model BERT, we propose ProConBERT, a BERT pre-training strategy based on prompt learning and contrastive learning. In the prompt-learning stage, a mask-filling template is designed to train BERT to predict the Chinese label word for each token. In the contrastive-learning stage, a guiding template trains BERT to learn the similarities and differences between each token and the label words. Second, to handle the complexity caused by the lack of explicit word boundaries in Chinese, we modify the first Transformer layer of BERT and design a feature-fusion module with a hybrid weight guide that integrates lexicon information into BERT's bottom layer. Finally, experimental results validate the effectiveness and superiority of the proposed method on Chinese few-shot NER. Combined with a conditional random field (CRF) layer, the method achieves the best performance on four sampled Chinese NER datasets. In particular, in three few-shot settings on the Weibo dataset, the model reaches F1 scores of 63.78%, 66.27%, and 70.90%, improving the average F1 by 16.28%, 14.30%, and 11.20% over other methods. Moreover, applying ProConBERT to several BERT-based Chinese NER models further improves entity recognition.
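The mask-filling prompt stage can be illustrated with plain strings: a template places a [MASK] where the Chinese label word should go, and predictions are decoded back to entity tags through a label-word vocabulary. The template wording and the label-word mapping below are hypothetical examples, not the paper's actual templates.

```python
# Hypothetical label-word vocabulary: Chinese label words -> entity tags.
LABEL_WORDS = {"人名": "PER", "地名": "LOC", "机构": "ORG", "非实体": "O"}

def fill_template(token: str) -> str:
    """Mask-filling prompt: the model is trained to predict the
    correct Chinese label word at the [MASK] position for `token`."""
    return f"{token} 属于 [MASK] 类别。"

def decode(predicted_label_word: str) -> str:
    """Map a predicted label word back to an entity tag; unknown
    words fall back to the non-entity tag."""
    return LABEL_WORDS.get(predicted_label_word, "O")

print(fill_template("福州"))  # 福州 属于 [MASK] 类别。
print(decode("地名"))         # LOC
```

Because the label words are ordinary vocabulary items, the masked-language-model head needs no new parameters, which is what makes this style of prompting attractive when training data is scarce.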

Keyword :

BERT model; Chinese few-shot named entity recognition; contrastive learning; prompt learning; feature fusion; pre-training

Cite:


GB/T 7714 杨三和 , 赖沛超 , 傅仰耿 et al. 面向中文小样本命名实体识别的BERT优化方法 [J]. | 小型微型计算机系统 , 2025 , 46 (3) : 602-611 .
MLA 杨三和 et al. "面向中文小样本命名实体识别的BERT优化方法" . | 小型微型计算机系统 46 . 3 (2025) : 602-611 .
APA 杨三和 , 赖沛超 , 傅仰耿 , 王一蕾 , 叶飞扬 , 张林 . 面向中文小样本命名实体识别的BERT优化方法 . | 小型微型计算机系统 , 2025 , 46 (3) , 602-611 .


Quantum-inspired Language Model with Lindblad Master Equation and Interference Measurement for Sentiment Analysis EI
Conference paper | 2024, 1, 2112-2121 | 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, NAACL 2024

Abstract :

Quantum-inspired models have demonstrated superior performance in many downstream language tasks, such as question answering and sentiment analysis. However, recent models primarily focus on embedding and measurement operations, overlooking the significance of the quantum evolution process. In this work, we present a novel quantum-inspired neural network, LI-QiLM, which integrates the Lindblad Master Equation (LME) into the evolution process and interferometry into the measurement process, adding physical meaning that strengthens interpretability. We conduct comprehensive experiments on six sentiment analysis datasets. Compared to traditional neural networks, transformer-based pre-trained models, and quantum-inspired models such as CICWE-QNN and ComplexQNN, the proposed method achieves superior accuracy and F1-score on six commonly used sentiment analysis datasets. Additional ablation tests verify the effectiveness of the LME and interferometry components. © 2024 Association for Computational Linguistics.
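The Lindblad Master Equation governs the evolution of a density matrix ρ as dρ/dt = -i[H, ρ] + γ(LρL† - ½{L†L, ρ}). A single Euler integration step on a toy qubit shows the structure (this is a generic textbook LME integrator, not the LI-QiLM network; the choice of H, L, and step size is illustrative):

```python
import numpy as np

def lindblad_step(rho, H, L, gamma, dt):
    # dρ/dt = -i[H, ρ] + γ (L ρ L† − ½ {L†L, ρ}); one explicit Euler step.
    comm = H @ rho - rho @ H
    Ld = L.conj().T
    diss = L @ rho @ Ld - 0.5 * (Ld @ L @ rho + rho @ Ld @ L)
    return rho + dt * (-1j * comm + gamma * diss)

# Toy qubit: Pauli-Z Hamiltonian, lowering operator as the dissipator.
H = np.array([[1, 0], [0, -1]], dtype=complex)
L = np.array([[0, 1], [0, 0]], dtype=complex)
rho = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)  # |+><+| state
for _ in range(100):
    rho = lindblad_step(rho, H, L, gamma=0.1, dt=0.01)
print(np.trace(rho).real)  # stays ~1.0: the evolution is trace-preserving
```

Both terms of the generator are traceless, so the trace of ρ is conserved and ρ stays Hermitian; these are the physical constraints that the "evolution process" contributes to the model.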

Keyword :

Computational linguistics; interferometry; sentiment analysis

Cite:


GB/T 7714 Yan, Kehuan , Lai, Peichao , Wang, Yilei . Quantum-inspired Language Model with Lindblad Master Equation and Interference Measurement for Sentiment Analysis [C] . 2024 : 2112-2121 .
MLA Yan, Kehuan et al. "Quantum-inspired Language Model with Lindblad Master Equation and Interference Measurement for Sentiment Analysis" . (2024) : 2112-2121 .
APA Yan, Kehuan , Lai, Peichao , Wang, Yilei . Quantum-inspired Language Model with Lindblad Master Equation and Interference Measurement for Sentiment Analysis . (2024) : 2112-2121 .

Version :

Quantum-inspired Language Model with Lindblad Master Equation and Interference Measurement for Sentiment Analysis Scopus
Other | 2024, 1, 2112-2121

FE-CFNER: Feature Enhancement-based approach for Chinese Few-shot Named Entity Recognition SCIE
Journal article | 2024, 90 | COMPUTER SPEECH AND LANGUAGE

Abstract :

Although significant progress has been made in Chinese Named Entity Recognition (NER) methods based on deep learning, their performance often falls short in few-shot scenarios. Feature enhancement is considered a promising way to address Chinese few-shot NER, but traditional feature fusion methods tend to lose important information and absorb irrelevant information. Likewise, incorporating BERT improves entity recognition, yet its performance is limited when training data is insufficient. To tackle these challenges, this paper proposes FE-CFNER, a feature enhancement-based approach for Chinese few-shot NER. FE-CFNER designs a double cross neural network that minimizes information loss through two rounds of feature crossing. Additionally, adaptive weights and a top-k mechanism sparsify the attention distributions, enabling the model to prioritize information relevant to entities while excluding irrelevant information. To further enhance the quality of BERT embeddings, FE-CFNER pre-trains BERT with contrastive learning on a contrastive template, strengthening BERT's semantic understanding. We evaluate the method on four sampled Chinese NER datasets: Weibo, Resume, Taobao, and Youku. Experimental results validate the effectiveness and superiority of FE-CFNER on Chinese few-shot NER tasks.
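The top-k sparsification of an attention distribution can be shown in a few lines: keep only the k largest scores per row and mask the rest to -inf before the softmax, so the excluded positions receive exactly zero weight. A minimal NumPy sketch (the function name is an assumption; FE-CFNER additionally learns adaptive weights, which are omitted here):

```python
import numpy as np

def topk_sparse_attention(scores, k):
    # Keep only the k largest scores per row; everything else is
    # masked to -inf so its softmax weight is exactly zero.
    masked = np.full_like(scores, -np.inf)
    idx = np.argsort(scores, axis=-1)[:, -k:]
    np.put_along_axis(masked, idx,
                      np.take_along_axis(scores, idx, axis=-1), axis=-1)
    e = np.exp(masked - masked.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

attn = topk_sparse_attention(np.array([[1.0, 3.0, 0.5, 2.0]]), k=2)
print(attn)  # only positions with scores 3.0 and 2.0 get weight
```

The surviving weights still sum to one per row, so the mechanism only redistributes attention away from (presumed) irrelevant positions rather than rescaling it.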

Keyword :

Chinese named entity recognition; contrastive learning pre-training; feature enhancement; few-shot learning

Cite:


GB/T 7714 Yang, Sanhe , Lai, Peichao , Fang, Ruixiong et al. FE-CFNER: Feature Enhancement-based approach for Chinese Few-shot Named Entity Recognition [J]. | COMPUTER SPEECH AND LANGUAGE , 2024 , 90 .
MLA Yang, Sanhe et al. "FE-CFNER: Feature Enhancement-based approach for Chinese Few-shot Named Entity Recognition" . | COMPUTER SPEECH AND LANGUAGE 90 (2024) .
APA Yang, Sanhe , Lai, Peichao , Fang, Ruixiong , Fu, Yanggeng , Ye, Feiyang , Wang, Yilei . FE-CFNER: Feature Enhancement-based approach for Chinese Few-shot Named Entity Recognition . | COMPUTER SPEECH AND LANGUAGE , 2024 , 90 .

Version :

FE-CFNER: Feature Enhancement-based approach for Chinese Few-shot Named Entity Recognition EI
Journal article | 2025, 90 | Computer Speech and Language
FE-CFNER: Feature Enhancement-based approach for Chinese Few-shot Named Entity Recognition Scopus
Journal article | 2025, 90 | Computer Speech and Language
NCSE: Neighbor Contrastive Learning for Unsupervised Sentence Embeddings CPCI-S
Journal article | 2024 | 2024 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN 2024

Abstract :

Unsupervised sentence embedding methods based on contrastive learning have gained attention for effectively representing sentences in natural language processing. Retrieving additional samples via a nearest-neighbor approach can enhance the model's ability to learn relevant semantics and distinguish sentences. However, previous research mainly retrieves neighboring samples either within a single batch or globally, so the model may fail to capture useful semantic information or may incur excessive time cost. Furthermore, previous methods use retrieved neighbor samples as hard negatives. We argue that nearest-neighbor samples contain relevant semantic information, and treating them as hard negatives risks losing valuable semantic knowledge. In this work, we introduce Neighbor Contrastive learning for unsupervised Sentence Embeddings (NCSE), which combines contrastive learning with a nearest-neighbor approach. Specifically, we maintain a candidate set that stores sentence embeddings across multiple batches; retrieving from this set guarantees sufficient samples, making it easier for the model to learn relevant semantics. Using the retrieved nearest neighbors as positives and aggregating each sample with its neighbors via self-attention encourages the model to learn relevant semantics from multiple neighbors. Experiments on the semantic textual similarity task demonstrate the method's effectiveness for sentence embedding learning.
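The cross-batch candidate set sits between the single-batch and global extremes: a bounded FIFO buffer of recent embeddings from which nearest neighbors are retrieved by cosine similarity. A minimal sketch under assumed names (`CandidateSet`, its capacity, and the toy dimensions are all illustrative):

```python
import numpy as np
from collections import deque

class CandidateSet:
    """Bounded FIFO store of sentence embeddings across recent batches;
    old batches are evicted automatically once capacity is reached."""
    def __init__(self, capacity):
        self.buf = deque(maxlen=capacity)

    def add(self, embs):
        for e in embs:                       # store unit vectors so the
            self.buf.append(e / np.linalg.norm(e))  # dot product is cosine

    def nearest(self, query, k):
        cands = np.stack(self.buf)
        sims = cands @ (query / np.linalg.norm(query))
        return cands[np.argsort(sims)[-k:]]  # k most similar embeddings

rng = np.random.default_rng(1)
cs = CandidateSet(capacity=64)
for _ in range(4):                           # four simulated batches of 16
    cs.add(rng.normal(size=(16, 8)))
neighbors = cs.nearest(rng.normal(size=8), k=5)
print(neighbors.shape)  # (5, 8)
```

Bounding the buffer keeps retrieval cost constant per step, while spanning multiple batches gives a larger pool than in-batch retrieval; the retrieved neighbors would then serve as positives rather than hard negatives.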

Keyword :

Contrastive learning; sentence embedding; unsupervised learning

Cite:


GB/T 7714 Zhang, Zhengfeng , Lai, Peichao , Wang, Ruiqing et al. NCSE: Neighbor Contrastive Learning for Unsupervised Sentence Embeddings [J]. | 2024 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN 2024 , 2024 .
MLA Zhang, Zhengfeng et al. "NCSE: Neighbor Contrastive Learning for Unsupervised Sentence Embeddings" . | 2024 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN 2024 (2024) .
APA Zhang, Zhengfeng , Lai, Peichao , Wang, Ruiqing , Ye, Feiyang , Wang, Yilei . NCSE: Neighbor Contrastive Learning for Unsupervised Sentence Embeddings . | 2024 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN 2024 , 2024 .

Version :

NCSE: Neighbor Contrastive Learning for Unsupervised Sentence Embeddings Scopus
Other | 2024 | Proceedings of the International Joint Conference on Neural Networks
NCSE: Neighbor Contrastive Learning for Unsupervised Sentence Embeddings EI
Conference paper | 2024
Quantum-inspired Neural Network Based on Stochastic Liouville-von Neumann Equation for Sentiment Classification CPCI-S
Journal article | 2024 | 2024 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN 2024

Abstract :

Quantum-inspired models have shown enhanced capabilities in various language tasks, including question answering and sentiment analysis. However, current complex-valued models primarily focus on sentence embedding, overlooking the significance of the quantum evolution process and the extra time cost incurred by complex-valued computation. In this work, we present a novel quantum-inspired neural network, SSS-QNN, which integrates the Stochastic Liouville-von Neumann Equation (SLE) to simulate the evolution process and a complex-valued simple recurrent unit (SRU) to reduce the time cost, giving the model physical meaning and thus enhancing interpretability. We conduct comprehensive experiments on both sentence-level and document-level sentiment classification datasets. Compared to traditional models, large language models, and quantum-inspired models, SSS-QNN demonstrates competitive accuracy and time cost. Additional ablation tests verify the effectiveness of the proposed modules.

Keyword :

Deep learning; quantum-inspired neural network; quantum theory; sentiment classification

Cite:


GB/T 7714 Yan, Kehuan , Lai, Peichao , Lyu, Qingwei et al. Quantum-inspired Neural Network Based on Stochastic Liouville-von Neumann Equation for Sentiment Classification [J]. | 2024 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN 2024 , 2024 .
MLA Yan, Kehuan et al. "Quantum-inspired Neural Network Based on Stochastic Liouville-von Neumann Equation for Sentiment Classification" . | 2024 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN 2024 (2024) .
APA Yan, Kehuan , Lai, Peichao , Lyu, Qingwei , Wang, Yilei . Quantum-inspired Neural Network Based on Stochastic Liouville-von Neumann Equation for Sentiment Classification . | 2024 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN 2024 , 2024 .

Version :

Quantum-inspired Neural Network Based on Stochastic Liouville-von Neumann Equation for Sentiment Classification EI
Conference paper | 2024
Quantum-inspired Neural Network Based on Stochastic Liouville-von Neumann Equation for Sentiment Classification Scopus
Other | 2024 | Proceedings of the International Joint Conference on Neural Networks
Intelligent Experimental Teaching Auxiliary Platform Based on BERT Scopus
Other | 2023, 1812 CCIS, 245-258

Abstract :

Traditional experimental teaching platforms suffer from poor scalability, difficult experiment evaluation, and a lack of teaching data analysis and collaborative sharing. This paper therefore designs BERTDS, an interactive and scalable intelligent experimental teaching auxiliary platform based on deep learning algorithms. The platform provides a wide range of functions, such as publishing experimental resources, online Q&A, cloud storage sharing, automatic evaluation, similarity detection, and assessment and assignment management. The paper first introduces the design ideas and overall architecture of the platform, which builds on the deep learning BERT framework; it then describes the organization module that supports a variety of experiment schemes, the automated evaluation engine, and the distributed server deployment scheme; finally, it demonstrates the feasibility and effectiveness of the platform through analysis of real usage data and user feedback. © 2023, The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

Keyword :

Automatic evaluation; BERT; experiment platform; intelligent teaching system

Cite:


GB/T 7714 Liu, W. , Wang, Y. , Fu, Y. et al. Intelligent Experimental Teaching Auxiliary Platform Based on BERT [unknown].
MLA Liu, W. et al. "Intelligent Experimental Teaching Auxiliary Platform Based on BERT" [unknown].
APA Liu, W. , Wang, Y. , Fu, Y. , Ye, F. , Lai, P. . Intelligent Experimental Teaching Auxiliary Platform Based on BERT [unknown].

Version :

Intelligent Experimental Teaching Auxiliary Platform Based on BERT EI
Conference paper | 2023, 1812 CCIS, 245-258
面向稀疏数据场景的生成对抗网络推荐算法 (A generative adversarial network recommendation algorithm for sparse-data scenarios) PKU
Journal article | 2023, 51(4), 467-474 | 福州大学学报(自然科学版) (Journal of Fuzhou University, Natural Science Edition)

Abstract :

We propose an improved collaborative filtering model based on generative adversarial networks (CFGAN). An enhanced permutation attention mechanism strengthens its ability to focus on features in sparse datasets, and the influence of items a user may interact with on the recommendation results is taken into account. In addition, semantic friend features extracted from user feedback by a collaborative user social network are embedded into CFGAN to personalize negative sampling, further improving recommendation quality in sparse-data scenarios.
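Personalized negative sampling of the kind described above can be sketched as: draw negatives only from items the user never touched, and additionally exclude items the user's friends interacted with, since those are plausibly positive for the user. Everything here (function name, data layout, the exclusion rule itself) is a simplified assumption for illustration, not the paper's exact sampler.

```python
import numpy as np

def sample_negatives(user, interactions, friend_items, n_items, k, rng):
    """Draw k negative items for `user`: never items the user has
    interacted with, and never items the user's friends liked."""
    seen = interactions[user] | friend_items[user]
    pool = [i for i in range(n_items) if i not in seen]
    return rng.choice(pool, size=k, replace=False)

rng = np.random.default_rng(0)
interactions = {0: {1, 2}}   # user 0 interacted with items 1, 2
friend_items = {0: {3}}      # user 0's friends liked item 3
negs = sample_negatives(0, interactions, friend_items,
                        n_items=10, k=4, rng=rng)
print(sorted(negs.tolist()))  # four items drawn from {0, 4, 5, 6, 7, 8, 9}
```

Filtering out friends' items reduces the chance of training the GAN discriminator on false negatives, which matters most exactly when the user's own interaction data is sparse.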

Keyword :

Personalized recommendation; collaborative user social network; data sparsity; generative adversarial networks; permutation attention

Cite:


GB/T 7714 陈文婷 , 陈学勤 , 王伟津 et al. 面向稀疏数据场景的生成对抗网络推荐算法 [J]. | 福州大学学报(自然科学版) , 2023 , 51 (4) : 467-474 .
MLA 陈文婷 et al. "面向稀疏数据场景的生成对抗网络推荐算法" . | 福州大学学报(自然科学版) 51 . 4 (2023) : 467-474 .
APA 陈文婷 , 陈学勤 , 王伟津 , 蔡毅津 , 王一蕾 . 面向稀疏数据场景的生成对抗网络推荐算法 . | 福州大学学报(自然科学版) , 2023 , 51 (4) , 467-474 .

Version :

面向稀疏数据场景的生成对抗网络推荐算法 PKU
Journal article | 2023, 51(04), 467-474 | 福州大学学报(自然科学版)