Work
[Re] Distilling Knowledge via Knowledge Review (Preprint arXiv:2205.11246 [cs.CV])
Pranjal Gulati, Apoorva Verma, Sarthak Gupta
- Submitted to the ML Reproducibility Challenge 2021 organized by Papers with Code, reproducing the CVPR’21 paper Distilling Knowledge via Knowledge Review
- Worked on a knowledge distillation technique for computer vision that uses a residual learning framework to train a single student layer with multiple teacher layers (see the sketch below)
- Beyond reproducing the reported results, performed ablation studies and additional experiments to gain further insights
- Links: arXiv, code
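A minimal PyTorch sketch of the residual review idea: the deepest student feature seeds a residual stream that is progressively fused with shallower student features and matched against the teacher at every level. The `fuse_convs` modules and per-level MSE stand in for the paper's ABF fusion and HCL loss; they are assumptions for illustration, not the authors' exact code:

```python
import torch
import torch.nn.functional as F

def review_kd_loss(student_feats, teacher_feats, fuse_convs):
    """student_feats, teacher_feats: lists of feature maps ordered shallow to
    deep, assumed to have matching channel counts at each level.
    fuse_convs: hypothetical 1x1 convs merging the residual stream with the
    next (shallower) student feature."""
    # Seed the residual stream with the deepest student feature.
    fused = student_feats[-1]
    loss = F.mse_loss(fused, teacher_feats[-1])
    # Walk back toward shallow layers, so each fused student feature is
    # supervised by the teacher feature at that level.
    for i in range(len(student_feats) - 2, -1, -1):
        up = F.interpolate(fused, size=student_feats[i].shape[-2:], mode="nearest")
        fused = fuse_convs[i](torch.cat([student_feats[i], up], dim=1))
        loss = loss + F.mse_loss(fused, teacher_feats[i])
    return loss
```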
Model Extraction Attack for Video Classification
- Our team of eight developed an efficient strategy for extracting video classification models in both black-box and grey-box settings
- Extracted Video Swin Transformer and MoViNet models trained on the Kinetics datasets, obtaining competitive results on this novel task
- Used conditional video generators, adversarial crafting, and knowledge distillation techniques (sketched below)
- Won the gold medal for this high-prep event at Inter IIT Tech Meet 10.0, competing amongst 22 IITs
- Links: report, code
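A minimal sketch of the black-box extraction loop: synthetic clips from a conditional generator query the victim, and the student is distilled from the returned soft labels. The `generator` and the probability-only `victim` interface are assumptions for illustration, not the competition code:

```python
import torch
import torch.nn.functional as F

def extraction_step(student, victim, generator, optimizer,
                    batch_size=8, num_classes=400):
    # Synthesize class-conditional query videos of shape (B, C, T, H, W).
    labels = torch.randint(0, num_classes, (batch_size,))
    videos = generator(labels)
    # Query the black-box victim; only its output probabilities are observed.
    with torch.no_grad():
        teacher_probs = victim(videos)
    # Distill: match the student's log-probabilities to the victim's outputs.
    student_logits = student(videos)
    loss = F.kl_div(F.log_softmax(student_logits, dim=1), teacher_probs,
                    reduction="batchmean")
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```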
Chess for Nerds
- Motivation: Lack of terminal games to play on the go.
- Terminal-based chess; more features under development.
- Links: code