About
I am currently working in the area of Database Engines and Distributed Systems.
Previously, I received my MS degree from the School of Computer Science and Engineering, Sun Yat-sen University. Working in the iSEE Lab, I researched challenging problems in Machine Learning and Computer Vision (e.g., knowledge preservation, data bias, and few-shot learning) derived from practical applications in Medical Image Analysis. I focused on Continual Learning (also known as Incremental Learning), exploring ways to train neural networks continually.
My other interests include Digital Image Processing and Compiler Theory.
Email: chh.zhong[at]outlook[dot]com
Publications:
- 8/2021 Discriminative Distillation to Reduce Class Confusion in Continual Learning (PRCV 2022): We argue that class confusion may also play a role in degrading classification performance during continual learning: high similarity between new classes and any previously learned classes can cause the classifier to misrecognize those old classes, even when their knowledge is not forgotten.
- 6/2021 Continual Learning with Bayesian Model Based on a Fixed Pre-trained Feature Extractor (MICCAI 2021): We propose a Bayesian generative model for continual learning built on a fixed pre-trained feature extractor.
- 4/2021 Preserving Earlier Knowledge in Continual Learning with the Help of All Previous Feature Extractors: A fusion mechanism that incorporates all previously learned feature extractors into the model to reduce catastrophic forgetting. Feature extractor pruning is also applied to keep the whole model from growing rapidly in size.
- 6/2020 Continual Learning of New Diseases with Dual Distillation and Ensemble Strategy (MICCAI 2020): We propose a new continual learning framework that alleviates catastrophic forgetting by simultaneously distilling both old knowledge and recently learned new knowledge.
- 11/2019 Ensemble Convolutional Neural Networks for Cell Classification in Microscopic Images (ISBI 2019 Workshop): An ensemble of state-of-the-art CNNs that ranked 5th in the ISBI 2019 C-NMC Challenge.
Projects:
- Continual-Learning-Reproduce: A general framework for implementing Class Incremental (or Continual) Learning (CIL) methods. Researchers can easily reproduce results from various studies and develop their own algorithms on top of it.
- Time-Tracking-Mode: A simpler option that records working time locally in pure Elisp, without any third-party integrations.
Awards and Honors:
- Outstanding Graduate Award (top 2%), South China Normal University, 2019.
- Fifth place (out of 450 teams), ISBI C-NMC Challenge: Classification in Cancer Cell Imaging, 2019.
- Grand Prize (top 0.1%), Asia and Pacific Mathematical Contest in Modeling, 2018.