
During LoRA fine-tuning, if the embedding layer is slightly longer than the tokenizer's vocabulary, does setting `resize_vocab=True` reuse the previously unused part of the embedding layer, or does it enlarge the embedding layer? #4807
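The behavior of the `resize_vocab` flag itself is not confirmed in this thread, but flags like it typically call Hugging Face transformers' `resize_token_embeddings`, which copies the overlapping rows into a fresh table of exactly the requested number of rows (newly added rows are freshly initialized, surplus rows are dropped). The sketch below, in plain Python with hypothetical sizes, illustrates that copy-then-resize semantics; it is an illustration of the transformers behavior under that assumption, not LLaMA-Factory's confirmed implementation.

```python
import random

def resize_token_embeddings(table, new_num_tokens):
    """Sketch of transformers-style resize_token_embeddings:
    build a fresh table of exactly new_num_tokens rows, copy the
    overlapping rows from the old table, and randomly initialize
    any extra rows. Rows here are plain lists of floats."""
    dim = len(table[0])
    # Copy the rows that fit into the new size (truncates if shrinking).
    new_table = [row[:] for row in table[:new_num_tokens]]
    # Initialize any newly added rows (growing case).
    while len(new_table) < new_num_tokens:
        new_table.append([random.gauss(0.0, 0.02) for _ in range(dim)])
    return new_table

# Hypothetical sizes: embedding has 32064 rows, tokenizer vocab is 32000.
old = [[float(i)] * 4 for i in range(32064)]
new = resize_token_embeddings(old, 32000)
print(len(new))  # 32000: the table is set to the requested size exactly
```

Under these semantics, resizing to `len(tokenizer)` would not "reuse" the spare tail rows as-is; the table is simply rebuilt at the requested length, so a table already larger than the vocabulary would be truncated rather than grown.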

Unanswered
CloudyDory asked this question in Q&A
Replies: 0
