# model sanitizer

  1. generate a batch of images unconditionally (no prompt)
  2. detect the class(es) we want to scrub
  3. learn a textual-inversion (TI) token or a LoRA on the flagged images
  4. subtract the learned token from the null-token embedding, or use it as a negative embedding
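The last step can be sketched with plain embedding arithmetic. A minimal NumPy sketch, assuming a learned concept embedding (all names and the toy dimension are hypothetical); one simple reading of "subtract from null token" is projecting the concept direction out of the unconditional embedding:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # toy embedding dimension (real text encoders use 768+)

# Hypothetical learned vectors: the null (unconditional) token embedding
# and a concept direction learned via TI/LoRA on the flagged images.
null_embed = rng.normal(size=d)
concept_embed = rng.normal(size=d)

# Project the concept direction out of the null-token embedding,
# scaled by a strength factor alpha (alpha=1 removes it entirely).
alpha = 1.0
unit = concept_embed / np.linalg.norm(concept_embed)
scrubbed_null = null_embed - alpha * (null_embed @ unit) * unit

# The scrubbed embedding has (numerically) no component along
# the concept direction.
print(abs(scrubbed_null @ unit))
```

Using `concept_embed` directly as a negative embedding (the classifier-free-guidance negative branch) is the other option from step 4; same vector, different slot in the sampler.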

alternatively, try "ROME"-style editing (Rank-One Model Editing)?
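For context, the core of ROME is a rank-one update to one weight matrix so that a chosen key vector maps to a new value vector. A simplified sketch (full ROME preconditions the key with a covariance matrix; here the identity is used for brevity, and all names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out = 6, 4

# W plays the role of the MLP projection ROME edits.
W = rng.normal(size=(d_out, d_in))

# k_star: key vector for the concept; v_star: desired new output.
k_star = rng.normal(size=d_in)
v_star = rng.normal(size=d_out)

# Rank-one edit: after the update, W_edited @ k_star == v_star exactly,
# while directions orthogonal to k_star are untouched.
delta = np.outer(v_star - W @ k_star, k_star) / (k_star @ k_star)
W_edited = W + delta

print(np.allclose(W_edited @ k_star, v_star))  # True
```

Whether this transfers cleanly from language-model MLPs to a diffusion model's text encoder or cross-attention layers is an open question, hence the question mark above.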