Knowledge-Distillation-NLP

public · 19 stars · 6 forks · 1 issue

Commits

List of commits on branch master.
f5341de9654bc9c19455f72d1e8f1d4e5c5622ad: fix logits and update api (xxv44586, 4 years ago)
15cf481b32f98bd4b65060643c79ae3ba8705dc0: change email (xxv44586, 4 years ago)
d0bfa82419d3161d12507da07dd1b3d1c04dc8e1: change style (xxv44586, 4 years ago)
9f7fa31833dd887c8ed61ca8bfd101f4c7cfc2a1: update readme (xxv44586, 4 years ago)
ef944a925162d9ee4bc8109e333894518877fbe5: add fastbert (xxv44586, 4 years ago)
16254e9e7be01f1adfdeb3a201c3b09313075dc1: edit readme (xxv44586, 4 years ago)

README

Knowledge Distillation

Knowledge distillation (a.k.a. the Teacher-Student model) uses a small model (the Student) to learn the knowledge contained in a large model (the Teacher), aiming to preserve as much of the large model's performance as possible while shrinking the parameter count for deployment, speeding up inference, and reducing compute usage.
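
As a rough illustration of the idea, the sketch below shows a common distillation objective: a temperature-softened KL term that pulls the student's logits toward the teacher's, mixed with the usual cross-entropy on the ground-truth labels. This is a generic PyTorch sketch for intuition only; the function name and the T and alpha hyperparameters are illustrative assumptions and do not reflect this repository's actual implementation.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Hinton-style distillation loss: soft-target KL plus hard-label cross-entropy."""
    # Soft targets: match the teacher's temperature-softened distribution.
    # kl_div expects log-probabilities as input and probabilities as target.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # scale by T^2 to keep gradient magnitudes comparable
    # Hard targets: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Example shapes: a batch of 8 examples with 3 classes.
student_logits = torch.randn(8, 3, requires_grad=True)
with torch.no_grad():                      # teacher is frozen during distillation
    teacher_logits = torch.randn(8, 3)
labels = torch.randint(0, 3, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()                            # gradients flow only into the student
```

In practice the teacher runs in inference mode to produce teacher_logits for each batch, and only the student's parameters are updated.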

Directory Structure