This repository investigates the effects of partially freezing layers while fine-tuning a distilled multilingual model (DistilBERT) on the Universal Dependencies dataset. The study focuses on Part-of-Speech (PoS) tagging and examines how layer freezing affects both model performance and training efficiency.
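As a rough illustration of the technique, the sketch below freezes the embeddings and the lower transformer blocks of a DistilBERT token-classification model using Hugging Face `transformers`. The checkpoint name, the number of frozen blocks (`n_frozen=3`), the learning rate, and the 17-tag label count (the UD universal PoS inventory) are illustrative assumptions, not settings taken from this repository's experiments.

```python
# Minimal sketch of partial layer freezing for DistilBERT PoS tagging.
# Assumes Hugging Face `transformers` and PyTorch; hyperparameters are
# illustrative, not the repository's actual configuration.

import torch
from transformers import AutoModelForTokenClassification

NUM_UPOS_TAGS = 17  # Universal Dependencies defines 17 universal PoS tags

model = AutoModelForTokenClassification.from_pretrained(
    "distilbert-base-multilingual-cased",  # assumed multilingual checkpoint
    num_labels=NUM_UPOS_TAGS,
)


def freeze_layers(model, n_frozen: int) -> None:
    """Freeze the embeddings and the first `n_frozen` transformer blocks."""
    for param in model.distilbert.embeddings.parameters():
        param.requires_grad = False
    for block in model.distilbert.transformer.layer[:n_frozen]:
        for param in block.parameters():
            param.requires_grad = False


freeze_layers(model, n_frozen=3)  # DistilBERT has 6 transformer blocks

# Pass only the trainable (unfrozen) parameters to the optimizer.
optimizer = torch.optim.AdamW(
    (p for p in model.parameters() if p.requires_grad), lr=5e-5
)
```

Freezing the lower blocks keeps the model's general multilingual representations intact while gradients flow only through the upper blocks and the classification head, which is what trades a possible drop in tagging accuracy against reduced training cost.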