Abstract
| Original language | English |
|---|---|
| Number of pages | 26 |
| Journal | IEEE Transactions on Pattern Analysis and Machine Intelligence |
| Early online date | 26 Jan 2026 |
| DOIs | |
| Publication status | E-pub ahead of print - 26 Jan 2026 |
Bibliographical note
Publisher Copyright: © 1979-2012 IEEE.
Funding
The research described in this article has been supported by a research grant entitled “Medical Text Feature Representations based on Pre-trained Language Models” (871238); the Faculty Research Grants (DB24A4 and SDS24A8) and the Direct Grant (DR25E8) of Lingnan University, Hong Kong; and two grants from the Research Grants Council of the Hong Kong Special Administrative Region, China (R1015-23 and UGC/FDS16/E17/23). (Corresponding author: Haoran Xie.)

Lingling Xu is with the School of Science and Technology, Hong Kong Metropolitan University, Hong Kong, and also with the School of Data Science, Lingnan University, Hong Kong (email: [email protected]).
Keywords
- Parameter-efficient fine-tuning
- pretrained language model
- large language model
- memory usage
Fingerprint
Dive into the research topics of 'Parameter-Efficient Fine-Tuning Methods for Pretrained Language Models: A Critical Review and Assessment'. Together they form a unique fingerprint.
- Automatic Weight Learning at Data-level and Task-level for Multitask Learning with the Application for Implicit Sentiment Analysis
  XIE, H. (PI)
  1/01/25 → 31/12/26
  Project: Grant Research
- Pretraining Language Model for Financial News Analysis
  XIE, H. (PI)
  1/01/25 → 31/12/26
  Project: Grant Research
- Integrating ChatGPT with Search Engine, Recommender System and Online Advertising to Enhance User Experience on Online Service Platforms (LU Part)
  QIN, S. J. (CoPI), ZHAO, X. (PI), KING, I. (CoPI), LI, Q. (CoPI), LI, Y. D. (CoPI) & XU, J. (CoPI)
  Research Grants Council (Hong Kong, China)
  1/06/24 → 30/11/27
  Project: Grant Research