
Runtime Error in Stable Diffusion v2.1 Embedding #4030

@suzukimain

Description


Describe the bug

I encountered a runtime error when trying to load the EasyNegative textual inversion embedding into Stable Diffusion v2.1. The same code runs without error on Stable Diffusion v1.5.

This is a link to the notebook:
https://colab.research.google.com/github/suzukimain/diffusers_in_Colab/blob/test1/Test.ipynb

Reproduction

!pip install torch==2.0.1+cu118 diffusers==0.16.1 transformers==4.29.2 accelerate==0.19.0 scipy==1.10.1 safetensors==0.3.1 ftfy==6.1.1 regex==2022.10.31 tqdm==4.65.0 sentencepiece==0.1.99 pysbd==0.3.4 xformers huggingface_hub sacremoses -q
!git clone https://github.com/huggingface/diffusers.git
import torch
import diffusers
import transformers
import accelerate
import scipy
import safetensors
import ftfy
import regex
import tqdm
import xformers
import sentencepiece
import pysbd
import huggingface_hub
import sacremoses
from IPython.display import display, Markdown
from diffusers import StableDiffusionPipeline, DPMSolverMultistepScheduler

model_id = "stabilityai/stable-diffusion-2-1"
#model_id = "runwayml/stable-diffusion-v1-5"
pipe = StableDiffusionPipeline.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    use_safetensors=True,
)
pipe.scheduler = DPMSolverMultistepScheduler.from_config(pipe.scheduler.config)

pipe.load_textual_inversion(
    "sayakpaul/EasyNegative-test",
    weight_name="EasyNegative.safetensors",
    token="EasyNegative",
)
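
For reference, the embedding file can be inspected before calling load_textual_inversion. The sketch below is a minimal check, assuming the same Hub repo and filename as above and the emb_params key reported in the log below; it compares the embedding's width against the width the pipeline's text encoder expects, which is where the mismatch occurs.

from huggingface_hub import hf_hub_download
from safetensors.torch import load_file

# Download the embedding file and read the tensor stored under "emb_params"
path = hf_hub_download("sayakpaul/EasyNegative-test", "EasyNegative.safetensors")
emb = load_file(path)["emb_params"]

# Width of the embedding vectors vs. width of the text encoder's token embeddings
print(emb.shape[-1])                                               # 768
print(pipe.text_encoder.get_input_embeddings().weight.shape[-1])   # 1024 for v2.1, 768 for v1.5

The 768 here matches the "Tensor sizes" in the traceback below, so the load only succeeds when the two widths agree, as they do for v1.5.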

Logs

The loaded token: emb_params is overwritten by the passed token EasyNegative.
╭─────────────────────────────── Traceback (most recent call last) ────────────────────────────────╮
│ in <cell line: 10>:10                                                                            │
│                                                                                                  │
│ /usr/local/lib/python3.10/dist-packages/diffusers/loaders.py:659 in load_textual_inversion       │
│                                                                                                  │
│    656 │   │   # resize token embeddings and set new embeddings                                  │
│    657 │   │   self.text_encoder.resize_token_embeddings(len(self.tokenizer))                    │
│    658 │   │   for token_id, embedding in zip(token_ids, embeddings):                            │
│ ❱  659 │   │   │   self.text_encoder.get_input_embeddings().weight.data[token_id] = embedding    │
│    660 │   │                                                                                     │
│    661 │   │   logger.info(f"Loaded textual inversion embedding for {token}.")                   │
│    662                                                                                           │
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
RuntimeError: The expanded size of the tensor (1024) must match the existing size (768) at non-singleton dimension 
0.  Target sizes: [1024].  Tensor sizes: [768]
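
The sizes in the error correspond to the two checkpoints' text encoders: Stable Diffusion v1.x uses a CLIP text encoder with a 768-dimensional hidden size, while v2.x ships an OpenCLIP-based text encoder with a 1024-dimensional hidden size, so a 768-wide textual inversion vector cannot be written into v2.1's embedding table. A quick way to confirm the two widths, assuming the model ids from the reproduction and network access to the Hub:

from transformers import CLIPTextModel

# Both checkpoints keep their text encoder in the "text_encoder" subfolder
enc_v21 = CLIPTextModel.from_pretrained("stabilityai/stable-diffusion-2-1", subfolder="text_encoder")
enc_v15 = CLIPTextModel.from_pretrained("runwayml/stable-diffusion-v1-5", subfolder="text_encoder")

print(enc_v21.config.hidden_size)  # 1024 -- the "expanded size" in the error
print(enc_v15.config.hidden_size)  # 768  -- the embedding's own width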

System Info

torch==2.0.1+cu118 diffusers==0.16.1 transformers==4.29.2 accelerate==0.19.0 scipy==1.10.1 safetensors==0.3.1 ftfy==6.1.1 regex==2022.10.31 tqdm==4.65.0 sentencepiece==0.1.99 pysbd==0.3.4 xformers

Who can help?

No response
