Abstract: |
Robust Partial Multi-Label Learning with EM-Enhanced Sample Purification and Distilled Label Consensus:
In multi-label learning, noisy labels significantly impair model performance. To address this, we propose a novel framework that integrates Expectation-Maximization (EM)-based sample purification with a teacher-student distillation mechanism. The EM procedure iteratively refines label confidence scores by leveraging feature regularization and neighborhood consistency, effectively identifying reliable samples. A teacher model trained on these clean samples then generates soft labels for the noisy instances, which in turn supervise a student model. Comprehensive analyses of computational complexity and PAC-learnability substantiate the method's theoretical robustness. The approach thus mitigates the impact of label noise and provides a comprehensive solution for enhancing multi-label classification tasks.
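The two stages outlined in the abstract can be sketched in NumPy. This is an illustrative sketch only: the function names are hypothetical, Euclidean k-nearest neighbors stands in for the paper's neighborhood-consistency term, and the distillation step simply produces temperature-softened per-label targets rather than the full teacher-student training loop.

```python
import numpy as np

def em_label_confidence(features, candidate_labels, n_iters=10, k=3):
    """EM-style refinement of label confidence via neighborhood consistency.

    features:         (n, d) feature matrix.
    candidate_labels: (n, L) binary candidate-label matrix (each row nonzero).
    Returns a (n, L) confidence matrix, rows normalized over candidate labels.
    """
    conf = candidate_labels / candidate_labels.sum(axis=1, keepdims=True)
    # k-nearest neighbors in feature space (Euclidean distance, self excluded)
    dist = np.linalg.norm(features[:, None] - features[None, :], axis=-1)
    np.fill_diagonal(dist, np.inf)
    nbrs = np.argsort(dist, axis=1)[:, :k]
    for _ in range(n_iters):
        # E-step: smooth each sample's confidences toward its neighbors'
        smoothed = 0.5 * conf + 0.5 * conf[nbrs].mean(axis=1)
        # M-step: mask to the candidate set and re-normalize per sample
        conf = smoothed * candidate_labels
        conf /= conf.sum(axis=1, keepdims=True)
    return conf

def distill_soft_labels(teacher_logits, temperature=2.0):
    """Temperature-softened per-label sigmoid targets for the student model."""
    return 1.0 / (1.0 + np.exp(-teacher_logits / temperature))
```

High-confidence rows of `em_label_confidence` would mark the "clean" subset used to train the teacher, whose logits are then passed through `distill_soft_labels` to supervise the student on the remaining noisy instances.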
Efficient time-frequency dynamic graphical neural networks for multivariate time series analysis:
Multivariate time series analysis has many vital applications in healthcare, IoT, finance, and other fields. However, owing to the specific nature of this data, challenges remain in modeling dependencies within multivariate time series, robustness to noise and generalization, interpretability, and computational complexity. Recent research has proposed solutions to these challenges; however, existing work still does not adequately address the extraction of dynamic features over time or the learning of latent causal associations between dimensions in multivariate time-series data.