Abstract: Knowledge distillation has achieved considerable success in multi-class single-label learning. However, its direct application to multi-label learning proves ...
Copyright: © 2026 Elsevier Ltd. All rights are reserved, including those for text and data mining, AI training, and similar technologies.
Abstract: As a prominent research topic, multi-view multi-label classification (MvMlC) aims to assign multiple labels to each sample by integrating information from multiple views. However, in ...