Abstract:
In the era of big data, classifying online tourism resource information can facilitate the matching of user needs with tourism resources and improve the efficiency of tourism resource integration. However, most research in this field has concentrated on simple single-level, single-label classification. In this paper, a Hierarchical Label-Aware Tourism-Informed Dual Graph Attention Network (HLT-DGAT) is proposed for the complex multi-level, multi-label classification posed by online textual information about Chinese tourism resources. The model integrates domain knowledge into a pre-trained language model and employs attention mechanisms to transform text representations into label-based representations. It then applies a dual Graph Attention Network (GAT), with one component capturing vertical information and the other capturing horizontal information within the label hierarchy. The model's performance is validated on two commonly used public datasets as well as on a manually curated Chinese tourism resource dataset consisting of online textual overviews of Chinese tourism resources rated 3A and above. Experimental results indicate that HLT-DGAT outperforms baselines on both threshold-based and area-under-curve evaluation metrics. Specifically, AU(PRC) reaches 64.5% on the Chinese tourism resource dataset with enforced leaf nodes, 3% higher than the best corresponding baseline result. Furthermore, ablation studies show that (1) integrating domain knowledge, (2) combining local information, (3) modeling label dependencies within the same level of the label hierarchy, and (4) merging dynamic reconstruction each enhance overall model performance.
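The sketch below illustrates the dual-GAT idea described in the abstract: one graph attention component over vertical (parent-child) label edges and one over horizontal (same-level) label edges, fused for per-label prediction. It is a minimal, hypothetical PyTorch illustration, not the authors' implementation; the class names (LabelGAT, DualGAT), the adjacency-matrix encoding of the hierarchy, and the concatenation-based fusion are all assumptions for illustration only.

```python
# Minimal sketch of a dual GAT over a label hierarchy (hypothetical, not the paper's code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class LabelGAT(nn.Module):
    """Single-head graph attention over a fixed label graph given as a 0/1 adjacency."""
    def __init__(self, dim):
        super().__init__()
        self.W = nn.Linear(dim, dim, bias=False)
        self.a = nn.Linear(2 * dim, 1, bias=False)

    def forward(self, h, adj):
        # h: (L, dim) label-based representations; adj: (L, L) adjacency (self-loops assumed)
        z = self.W(h)
        L = z.size(0)
        pairs = torch.cat([z.unsqueeze(1).expand(L, L, -1),
                           z.unsqueeze(0).expand(L, L, -1)], dim=-1)
        e = F.leaky_relu(self.a(pairs)).squeeze(-1)   # (L, L) attention logits
        e = e.masked_fill(adj == 0, float("-inf"))    # attend only along graph edges
        alpha = torch.nan_to_num(torch.softmax(e, dim=-1))  # isolated rows -> 0
        return F.elu(alpha @ z)

class DualGAT(nn.Module):
    """Fuse a 'vertical' GAT (parent-child edges) with a 'horizontal' GAT (same-level edges)."""
    def __init__(self, dim):
        super().__init__()
        self.vertical = LabelGAT(dim)
        self.horizontal = LabelGAT(dim)
        self.classifier = nn.Linear(2 * dim, 1)

    def forward(self, label_repr, vert_adj, horiz_adj):
        hv = self.vertical(label_repr, vert_adj)
        hh = self.horizontal(label_repr, horiz_adj)
        logits = self.classifier(torch.cat([hv, hh], dim=-1)).squeeze(-1)
        return torch.sigmoid(logits)                  # per-label probabilities

if __name__ == "__main__":
    # Toy usage: 6 labels; label 1 is a child of label 0, labels 1 and 2 share a level.
    L, dim = 6, 32
    label_repr = torch.randn(L, dim)
    vert_adj = torch.eye(L)
    vert_adj[0, 1] = vert_adj[1, 0] = 1.0
    horiz_adj = torch.eye(L)
    horiz_adj[1, 2] = horiz_adj[2, 1] = 1.0
    probs = DualGAT(dim)(label_repr, vert_adj, horiz_adj)
    print(probs.shape)  # torch.Size([6])
```

In this sketch the two components share input label representations but learn separate attention weights, so hierarchy-level (vertical) and sibling (horizontal) dependencies are modeled independently before fusion.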
Source: INFORMATION PROCESSING & MANAGEMENT
ISSN: 0306-4573
Year: 2024
Volume: 62
Issue: 1
Impact Factor: 7.400 (JCR@2023)
CAS Journal Grade: 1
Cited Count:
WoS CC Cited Count: 1
SCOPUS Cited Count: 2
ESI Highly Cited Papers on the List: 0