Hermione
Tag: Knowledge Distillation

  • Knowledge Squeezed Adversarial Network Compression — 04-15

  • Training Shallow and Thin Networks for Acceleration via Knowledge Distillation with Conditional Adversarial Networks — 04-03

  • Improved Knowledge Distillation via Teacher Assistant — 04-03

  • MEAL: Multi-Model Ensemble via Adversarial Learning — 04-02

  • Zero-Shot Knowledge Distillation in Deep Networks — 04-01

  • Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation — 03-31

© 2020 张倩倩