
📂 Paper Summaries

Lingsgz (blogger) · 1 year ago
Shi, Guangyuan, et al. "Overcoming catastrophic forgetting in incremental few-shot learning by finding flat minima." Advances in Neural Information Processing Systems 34 (2021): 6747-6761. Code available. The paper studies the few-shot setting in continual learning. It finds that, on this few-shot problem, a simple baseline trained only on the base classes outperforms state-of-the-art methods, which shows how severe catastrophic forgetting is. The paper then proposes a method that searches for flat minima of the base-class training objective, so that later fine-tuning on new classes within the flat region causes less forgetting.
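In the spirit of the summary above, one way to bias base-class training toward flat minima is to minimize the loss averaged over small random perturbations of the parameters, so sharp minima are penalized. The sketch below is not the paper's exact algorithm, only a minimal PyTorch illustration of that perturbation-averaging idea; `flat_minima_step`, `radius`, and `n_samples` are assumed names.

```python
import torch

def flat_minima_step(model, criterion, optimizer, x, y, radius=0.01, n_samples=2):
    """One training step that averages gradients over small random parameter shifts."""
    optimizer.zero_grad()
    loss = criterion(model(x), y)            # loss/gradient at the current parameters
    loss.backward()
    params = [p for p in model.parameters() if p.requires_grad]
    for _ in range(n_samples):
        noise = [torch.empty_like(p).uniform_(-radius, radius) for p in params]
        with torch.no_grad():
            for p, n in zip(params, noise):
                p.add_(n)                    # move to a nearby point in parameter space
        criterion(model(x), y).backward()    # gradients accumulate on the same tensors
        with torch.no_grad():
            for p, n in zip(params, noise):
                p.sub_(n)                    # restore the original parameters
    with torch.no_grad():
        for p in params:                     # average over all 1 + n_samples evaluations
            if p.grad is not None:
                p.grad.div_(n_samples + 1)
    optimizer.step()
    return loss.item()
```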
Lingsgz (blogger) · 2 years ago
Kornblith, Simon, et al. "Similarity of neural network representations revisited." International Conference on Machine Learning. PMLR, 2019. No official open-source release; the code here is a reproduction by others, and implementations can also be found through other papers. The paper mainly introduces CKA (centered kernel alignment), a similarity measure used to identify correspondences between the hidden layers of neural networks trained from different random initializations and with different widths. It verifies that wider networks learn representations that are more similar to one another.
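For reference, the linear form of CKA compares two activation matrices X and Y (examples × features) via ||YᵀX||²_F / (||XᵀX||_F · ||YᵀY||_F) after centering each feature. A minimal NumPy sketch, with purely illustrative example data:

```python
import numpy as np

def linear_cka(X, Y):
    """Linear CKA between activation matrices of shape (n_examples, n_features)."""
    X = X - X.mean(axis=0, keepdims=True)   # center each feature
    Y = Y - Y.mean(axis=0, keepdims=True)
    # ||Y^T X||_F^2 / (||X^T X||_F * ||Y^T Y||_F)
    numerator = np.linalg.norm(Y.T @ X, "fro") ** 2
    return numerator / (np.linalg.norm(X.T @ X, "fro") * np.linalg.norm(Y.T @ Y, "fro"))

rng = np.random.default_rng(0)
acts = rng.standard_normal((1000, 64))                     # activations of a 64-unit layer
print(linear_cka(acts, 2.0 * acts))                        # ~1.0: invariant to isotropic scaling
print(linear_cka(acts, rng.standard_normal((1000, 32))))   # low: unrelated representations
```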