This repository contains several algorithms for NLP model compression. The corresponding papers are listed below; further algorithms will be added in the near future.
- Large Language Models are Reasoning Teachers (ACL 2023)
- Revisiting Intermediate Layer Distillation for Compressing Language Models: An Overfitting Perspective (EACL 2023 Findings)
This work was supported by Institute of Information & communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) (No. 2021-0-00907, Development of Adaptive and Lightweight Edge-Collaborative Analysis Technology for Enabling Proactively Immediate Response and Rapid Learning).
- CR-ILD: Repository of "Revisiting Intermediate Layer Distillation for Compressing Language Models: An Overfitting Perspective", accepted at EACL 2023 Findings. The algorithm uses intermediate layer distillation to compress BERT models (a minimal sketch of such a loss appears after this list).
- Reasoning_Teacher_ACL2023: Repository of "Large Language Models are Reasoning Teachers", accepted at ACL 2023. For compression, the paper presents a method that fine-tunes small student models on chain-of-thought (CoT) reasoning generated by large language models (see the second sketch below).
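As a rough illustration of the intermediate layer distillation idea, the sketch below matches selected student hidden states to teacher hidden states with an MSE loss. The layer mapping, tensor shapes, and function name are assumptions made for the example, not the exact CR-ILD objective; see the CR-ILD repository for the actual implementation.

```python
import torch
import torch.nn.functional as F

def intermediate_layer_distillation_loss(student_hidden, teacher_hidden, layer_map):
    """Toy ILD objective: pull chosen student layers toward teacher layers.

    student_hidden / teacher_hidden: lists of [batch, seq_len, hidden] tensors,
    one per transformer layer. layer_map pairs a student layer index with the
    teacher layer it should imitate (the uniform mapping below is an assumption).
    """
    loss = 0.0
    for s_idx, t_idx in layer_map:
        loss = loss + F.mse_loss(student_hidden[s_idx], teacher_hidden[t_idx])
    return loss / len(layer_map)

# Toy example: a 4-layer student imitating every third layer of a 12-layer teacher.
student_hidden = [torch.randn(2, 16, 768) for _ in range(4)]
teacher_hidden = [torch.randn(2, 16, 768) for _ in range(12)]
layer_map = [(0, 2), (1, 5), (2, 8), (3, 11)]
print(intermediate_layer_distillation_loss(student_hidden, teacher_hidden, layer_map))
```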
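Similarly, the Reasoning Teacher approach fine-tunes a small student on reasoning chains produced by a large teacher model. The snippet below only sketches how such a training sample might be assembled; the prompt template and field names are hypothetical, and the repository defines the actual data format.

```python
def build_finetune_cot_sample(question: str, teacher_rationale: str, answer: str) -> dict:
    """Assemble one fine-tuning example from a teacher-generated CoT rationale.

    The prompt/completion template here is a hypothetical illustration; the
    Reasoning_Teacher_ACL2023 repository specifies the exact format it uses.
    """
    prompt = f"Q: {question}\nA:"
    completion = f" {teacher_rationale} Therefore, the answer is {answer}."
    return {"prompt": prompt, "completion": completion}

# Example with a rationale that a large teacher model might have generated.
sample = build_finetune_cot_sample(
    question="If there are 3 cars and each car has 4 wheels, how many wheels are there?",
    teacher_rationale="Each car has 4 wheels, so 3 cars have 3 * 4 = 12 wheels.",
    answer="12",
)
print(sample)
```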
- Navigate to the desired repository by clicking on the provided link.
- Follow the instructions provided in each repository for installation, usage, and other details.