BDCC, Vol. 8, Pages 43: Knowledge-Enhanced Prompt Learning for Few-Shot Text Classification
Big Data and Cognitive Computing doi: 10.3390/bdcc8040043
Authors: Jinshuo Liu, Lu Yang
Classification methods based on fine-tuning pre-trained language models often require a large number of labeled samples; few-shot text classification has therefore attracted considerable attention. Prompt learning is an effective method for addressing few-shot text classification in low-resource settings. The essence of prompt tuning is to insert tokens into the input, thereby converting a text classification task into a masked language modeling problem. However, constructing appropriate prompt templates and verbalizers remains challenging: manual prompts often require expert knowledge, while automatic prompt construction is time-consuming. In addition, the extensive knowledge contained in entities and relations should not be ignored. To address these issues, we propose structured knowledge prompt tuning (SKPT), a knowledge-enhanced prompt tuning approach. Specifically, SKPT comprises three components: a prompt template, a prompt verbalizer, and training strategies. First, we insert virtual tokens into the prompt template based on open triples to introduce external knowledge. Second, we use an improved knowledgeable verbalizer to expand and filter the label words. Finally, we apply structured knowledge constraints during the training phase to optimize the model. Extensive experiments on few-shot text classification tasks under different settings demonstrate the effectiveness of our model.
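The core mechanism the abstract describes — a prompt template with a [MASK] slot plus a verbalizer that maps each class to several label words — can be sketched in a few lines. This is an illustrative toy, not the authors' SKPT implementation: the template, the verbalizer entries, and the `toy_mask_probs` scorer (which stands in for a real masked-language-model head) are all hypothetical.

```python
# Toy sketch of prompt-based classification via masked language modeling.
# A real system would query an MLM (e.g., BERT) for [MASK] probabilities;
# here a crude keyword scorer stands in so the example is self-contained.

TEMPLATE = "{text} This topic is about [MASK]."

# Verbalizer: each class label expands to several label words,
# a simplified stand-in for the paper's knowledgeable verbalizer.
VERBALIZER = {
    "sports": ["sports", "game", "team"],
    "tech": ["technology", "software", "computer"],
}

def toy_mask_probs(prompt, candidates):
    """Stand-in for an MLM head: assigns each candidate word for the
    [MASK] slot a score based on crude prefix overlap with the prompt."""
    text = prompt.lower()
    raw = {w: 1.0 if w[:4] in text else 0.01 for w in candidates}
    total = sum(raw.values())
    return {w: v / total for w, v in raw.items()}

def classify(text):
    """Fill the template, score all label words at [MASK], then
    aggregate per class through the verbalizer."""
    prompt = TEMPLATE.format(text=text)
    words = [w for ws in VERBALIZER.values() for w in ws]
    probs = toy_mask_probs(prompt, words)
    scores = {label: sum(probs[w] for w in ws)
              for label, ws in VERBALIZER.items()}
    return max(scores, key=scores.get)
```

With a real MLM, `toy_mask_probs` would be replaced by the model's predicted distribution over the vocabulary at the [MASK] position; SKPT additionally injects virtual tokens derived from open triples into `TEMPLATE` and constrains training with structured knowledge.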