Publication Search Results
Impacts of memory-based and non-memory-based adoption in social contagion [SCIE]
Journal Article | 2025, 35(3) | CHAOS

Abstract:

In information diffusion within social networks, whether individuals adopt information often depends on the current and past information they receive. Some individuals adopt based on current information (i.e., no memory), while others rely on past information (i.e., with memory). Previous studies mainly focused on irreversible processes, such as the classic susceptible-infected and susceptible-infected-recovered threshold models, with less attention to reversible processes like the susceptible-infected-susceptible model. In this paper, we propose a susceptible-adopted-susceptible threshold model to study the competition between these two types of nodes and its impact on information diffusion. We also examine how memory length and differences in the adoption thresholds affect the diffusion process. First, we develop homogeneous and heterogeneous mean-field theories that accurately predict simulation results. Numerical simulations reveal that when node adoption thresholds are equal, increasing memory length raises the propagation threshold, thereby suppressing diffusion. When the adoption thresholds of the two node types differ, such as non-memory nodes having a lower threshold than memory-based nodes, increasing the memory length of the latter has little effect on the propagation threshold of the former. However, when the adoption threshold of the non-memory nodes is much higher than that of the memory-based nodes, increasing the memory length of the latter significantly suppresses the propagation threshold of the non-memory nodes. In heterogeneous networks, we find that as the degree of heterogeneity increases, the outbreak size of epidemic diffusion becomes smaller, while the propagation threshold also decreases. This work offers deeper insights into the impact of memory-based and non-memory-based adoption in social contagion.
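The dynamics described above can be made concrete with a small agent-based sketch. The code below is a minimal illustration, not the authors' implementation: on a random graph, non-memory nodes compare the current fraction of adopted neighbors against their threshold, while memory-based nodes average that exposure over their last L steps; the recovery rule and all parameter values are assumptions chosen for illustration.

```python
# Minimal sketch of a susceptible-adopted-susceptible (SAS) threshold model
# with memory-based and non-memory-based nodes. Update rules and parameter
# values are illustrative assumptions, not the paper's exact specification.
import random
import networkx as nx

def sas_step(G, state, memory, node_type, theta, L, mu):
    """One synchronous update; state maps node -> 0 (susceptible) / 1 (adopted)."""
    new_state = state.copy()
    for v in G:
        # current exposure: fraction of neighbors in the adopted state
        frac = sum(state[u] for u in G[v]) / max(G.degree(v), 1)
        memory[v] = (memory[v] + [frac])[-L:]   # keep only the last L exposures
        # non-memory nodes react to the current exposure; memory-based nodes
        # react to the average exposure over their memory window
        signal = frac if node_type[v] == "no_memory" else sum(memory[v]) / len(memory[v])
        if state[v] == 0 and signal >= theta[node_type[v]]:
            new_state[v] = 1                    # adopt once exposure crosses the threshold
        elif state[v] == 1 and random.random() < mu:
            new_state[v] = 0                    # reversible: adopted -> susceptible
    return new_state

G = nx.erdos_renyi_graph(1000, 0.01)
node_type = {v: random.choice(["memory", "no_memory"]) for v in G}
state = {v: int(random.random() < 0.05) for v in G}      # seed 5% adopters
memory = {v: [] for v in G}
theta = {"memory": 0.2, "no_memory": 0.2}                # equal adoption thresholds
for _ in range(200):
    state = sas_step(G, state, memory, node_type, theta, L=5, mu=0.1)
print("final adopted fraction:", sum(state.values()) / len(state))
```

Sweeping L with the thresholds equal reproduces the qualitative effect reported in the abstract: longer memory smooths exposure fluctuations, so adoption takes off only at stronger seeding.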

Cite:


GB/T 7714: Lin, Zhao-Hua, Zhuo, Linhai, Ding, Wangbin, et al. Impacts of memory-based and non-memory-based adoption in social contagion [J]. CHAOS, 2025, 35(3).
MLA: Lin, Zhao-Hua, et al. "Impacts of memory-based and non-memory-based adoption in social contagion." CHAOS 35.3 (2025).
APA: Lin, Zhao-Hua, Zhuo, Linhai, Ding, Wangbin, Wang, Xinhui, & Han, Lilei. (2025). Impacts of memory-based and non-memory-based adoption in social contagion. CHAOS, 35(3).

Versions:

Scopus | Journal Article | 2025, 35(3) | Chaos
Unified View Empirical Study for Large Pretrained Model on Cross-Domain Few-Shot Learning [EI, SCIE]
Journal Article | 2024, 20(9) | ACM Transactions on Multimedia Computing, Communications and Applications

Abstract:

The challenge of cross-domain few-shot learning (CD-FSL) stems from the substantial distribution disparities between target and source domain images, necessitating a model with robust generalization capabilities. In this work, we posit that large-scale pretrained models are pivotal in addressing the CD-FSL task owing to their exceptional representational and generalization prowess. To our knowledge, no existing research comprehensively investigates the utility of large-scale pretrained models in the CD-FSL context. Addressing this gap, our study presents an exhaustive empirical assessment of the Contrastive Language-Image Pre-Training model within the CD-FSL task. We undertake a comparison spanning six dimensions: base model, transfer module, classifier, loss, data augmentation, and training schedule. Furthermore, we establish a straightforward baseline model, E-base, based on our empirical analysis, underscoring the importance of our investigation. Experimental results substantiate the efficacy of our model, yielding a mean gain of 1.2% in 5-way 5-shot evaluations on the BSCD dataset.
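As a concrete illustration of the evaluation protocol named in the abstract, the sketch below scores one 5-way 5-shot episode on frozen CLIP image features with a nearest-prototype cosine classifier. The episode construction and the choice of classifier are assumptions here, not the paper's E-base model; the study itself compares many base models, transfer modules, classifiers, losses, augmentations, and schedules.

```python
# Minimal sketch: one 5-way 5-shot episode on frozen CLIP features with a
# nearest-prototype (cosine) classifier. An illustrative baseline only,
# not the paper's E-base model.
import torch
import clip  # OpenAI CLIP: pip install git+https://github.com/openai/CLIP.git

device = "cuda" if torch.cuda.is_available() else "cpu"
model, preprocess = clip.load("ViT-B/32", device=device)

@torch.no_grad()
def episode_accuracy(support_imgs, support_labels, query_imgs, query_labels, n_way=5):
    """Images are preprocessed tensors [N, 3, H, W]; labels are int tensors in 0..n_way-1."""
    s = model.encode_image(support_imgs.to(device)).float()
    q = model.encode_image(query_imgs.to(device)).float()
    s = s / s.norm(dim=-1, keepdim=True)        # L2-normalize embeddings
    q = q / q.norm(dim=-1, keepdim=True)
    # class prototype = mean embedding of that class's 5 support shots
    protos = torch.stack([s[support_labels == c].mean(0) for c in range(n_way)])
    protos = protos / protos.norm(dim=-1, keepdim=True)
    preds = (q @ protos.T).argmax(dim=-1)       # cosine similarity to each prototype
    return (preds.cpu() == query_labels).float().mean().item()
```

Averaging this accuracy over many sampled episodes is the standard way such 5-way 5-shot numbers are reported; the 1.2% mean gain cited above is relative to baselines under that protocol.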

Keywords:

Adversarial machine learning; Contrastive learning; Zero-shot learning; CLIP; Cross-domain few-shot learning; Unified study

Cite:


GB/T 7714: Zhuo, Linhai, Fu, Yuqian, Chen, Jingjing, et al. Unified View Empirical Study for Large Pretrained Model on Cross-Domain Few-Shot Learning [J]. ACM Transactions on Multimedia Computing, Communications and Applications, 2024, 20(9).
MLA: Zhuo, Linhai, et al. "Unified View Empirical Study for Large Pretrained Model on Cross-Domain Few-Shot Learning." ACM Transactions on Multimedia Computing, Communications and Applications 20.9 (2024).
APA: Zhuo, Linhai, Fu, Yuqian, Chen, Jingjing, Cao, Yixin, & Jiang, Yu-Gang. (2024). Unified View Empirical Study for Large Pretrained Model on Cross-Domain Few-Shot Learning. ACM Transactions on Multimedia Computing, Communications and Applications, 20(9).

Versions:

Scopus | Journal Article | 2024, 20(9) | ACM Transactions on Multimedia Computing, Communications and Applications
