Finetune Huggingface t5 for text2text generation

About T5:

  • T5, or Text-to-Text Transfer Transformer, developed by Google, is a Transformer-based architecture that uses a text-to-text approach.

Goal:

We have multiple short, simple notes like the ones below and want to combine them into a single fluent, human-like sentence.

Input:

  • James Elliot leaving the company was a loss.
  • The firm drew comfort from Shrenick Shah's experience and involvement in the strategy since its inception in 2012.

Target output:

  • Elliot's departure was a loss, but we draw comfort from Shah's experience and involvement in the strategy since its 2012 inception.

Predicted output from t5:

  • Elliot's departure from the firm was a loss, and the firm drew comfort from Shah's experience and involvement in the strategy since its inception in 2012.
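A prediction like the one above can be reproduced with a plain generation call against the fine-tuned checkpoint. This is a minimal sketch using the `transformers` library; the checkpoint path `./t5_finetuned` and the generation settings (beam size, max length) are illustrative assumptions, not values taken from this repo.

```python
from transformers import T5ForConditionalGeneration, T5Tokenizer

# Assumed location of the fine-tuned checkpoint (illustrative).
model_dir = "./t5_finetuned"
tokenizer = T5Tokenizer.from_pretrained(model_dir)
model = T5ForConditionalGeneration.from_pretrained(model_dir)

# Concatenate the small notes into one input string.
notes = (
    "James Elliot leaving the company was a loss. "
    "The firm drew comfort from Shrenick Shah's experience and involvement "
    "in the strategy since its inception in 2012."
)

# Tokenize and generate a single combined sentence.
inputs = tokenizer(notes, return_tensors="pt", truncation=True, max_length=512)
output_ids = model.generate(
    **inputs,
    max_length=64,
    num_beams=4,        # beam search; illustrative setting
    early_stopping=True,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```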

Implementation details:

  • Fine-tuned a pretrained Hugging Face T5 model on our dataset (a minimal training sketch follows this list)
  • Split the dataset into train and test sets
  • Trained for 12 epochs
  • Post-processed the generated text with a separate paraphraser
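The sketch below shows how such a fine-tuning run can be set up with the Hugging Face `Seq2SeqTrainer`. It assumes the notes and target sentences live in a two-column CSV (`source`, `target`); the file name, column names, base checkpoint, batch size, and learning rate are illustrative assumptions, only the 12 epochs come from the list above.

```python
import pandas as pd
from datasets import Dataset
from transformers import (
    T5ForConditionalGeneration,
    T5Tokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainingArguments,
    Seq2SeqTrainer,
)

model_name = "t5-base"  # assumed base checkpoint
tokenizer = T5Tokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

# Assumed data layout: CSV with 'source' (concatenated notes) and 'target' columns.
df = pd.read_csv("notes.csv")
dataset = Dataset.from_pandas(df).train_test_split(test_size=0.1)

def preprocess(batch):
    # Tokenize inputs and targets; targets become the decoder labels.
    model_inputs = tokenizer(batch["source"], max_length=512, truncation=True)
    labels = tokenizer(text_target=batch["target"], max_length=64, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = dataset.map(
    preprocess, batched=True, remove_columns=dataset["train"].column_names
)

args = Seq2SeqTrainingArguments(
    output_dir="./t5_finetuned",
    num_train_epochs=12,             # matches the 12 epochs mentioned above
    per_device_train_batch_size=8,   # illustrative
    learning_rate=3e-4,              # illustrative
    logging_steps=50,
    save_strategy="epoch",
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["test"],
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)

trainer.train()
trainer.save_model("./t5_finetuned")
```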
