KroneckerBERT: Significant Compression of Pre-trained Language Models Through Kronecker Decomposition and Knowledge Distillation

Marzieh Tahaei | Ella Charlaix | Vahid Nia | Ali Ghodsi | Mehdi Rezagholizadeh

Paper Details:

Month: July
Year: 2022
Location: Seattle, United States
Venue: NAACL
