UD-LinguisticStudy

This repository investigates the effect of partially freezing layers while fine-tuning a distilled multilingual model (DistilBERT) on the Universal Dependencies dataset. The study focuses on Part-of-Speech (PoS) tagging and explores how layer freezing affects both model performance and training efficiency in this linguistic analysis setting.
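
As a rough illustration of the layer-freezing setup, the sketch below shows one common way to freeze the lower transformer blocks of multilingual DistilBERT with Hugging Face `transformers` before fine-tuning it for token classification. The checkpoint name, the number of frozen blocks, and the label count are illustrative assumptions, not the repository's exact configuration.

```python
# Minimal sketch (not the repository's training script): partially freeze
# multilingual DistilBERT before fine-tuning it for PoS tagging.
from transformers import AutoModelForTokenClassification, AutoTokenizer

NUM_UPOS_TAGS = 17          # size of the Universal Dependencies UPOS tag set
NUM_FROZEN_LAYERS = 3       # assumption: freeze the first 3 of 6 blocks

model_name = "distilbert-base-multilingual-cased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForTokenClassification.from_pretrained(
    model_name, num_labels=NUM_UPOS_TAGS
)

# Freeze the embeddings and the lowest transformer blocks so that only the
# upper blocks and the token-classification head receive gradient updates.
for param in model.distilbert.embeddings.parameters():
    param.requires_grad = False
for block in model.distilbert.transformer.layer[:NUM_FROZEN_LAYERS]:
    for param in block.parameters():
        param.requires_grad = False

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(f"Trainable parameters: {trainable:,} / {total:,}")
```

Freezing the embeddings and lower blocks shrinks the set of trainable parameters, which is exactly the performance-versus-efficiency trade-off the study examines.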

Data
