
About demo problem #10

Open
journey-zhuang opened this issue Jul 7, 2024 · 4 comments

Comments

@journey-zhuang

Hello, I ran into the following problem when trying to run the demo:

RuntimeError: Error(s) in loading state_dict for ModuleList:
size mismatch for 0.proj.weight: copying a param with shape torch.Size([1536, 1024]) from checkpoint, the shape in current model is torch.Size([2048, 1024]).
size mismatch for 0.proj.bias: copying a param with shape torch.Size([1536]) from checkpoint, the shape in current model is torch.Size([2048]).
size mismatch for 0.norm.weight: copying a param with shape torch.Size([768]) from checkpoint, the shape in current model is torch.Size([1024]).
size mismatch for 0.norm.bias: copying a param with shape torch.Size([768]) from checkpoint, the shape in current model is torch.Size([1024]).
size mismatch for 1.proj.weight: copying a param with shape torch.Size([1536, 1024]) from checkpoint, the shape in current model is torch.Size([2048, 1024]).
size mismatch for 1.proj.bias: copying a param with shape torch.Size([1536]) from checkpoint, the shape in current model is torch.Size([2048]).
size mismatch for 1.norm.weight: copying a param with shape torch.Size([768]) from checkpoint, the shape in current model is torch.Size([1024]).
size mismatch for 1.norm.bias: copying a param with shape torch.Size([768]) from checkpoint, the shape in current model is torch.Size([1024]).
size mismatch for 2.proj.weight: copying a param with shape torch.Size([1536, 1024]) from checkpoint, the shape in current model is torch.Size([2048, 1024]).
size mismatch for 2.proj.bias: copying a param with shape torch.Size([1536]) from checkpoint, the shape in current model is torch.Size([2048]).
size mismatch for 2.norm.weight: copying a param with shape torch.Size([768]) from checkpoint, the shape in current model is torch.Size([1024]).
size mismatch for 2.norm.bias: copying a param with shape torch.Size([768]) from checkpoint, the shape in current model is torch.Size([1024]).

@Jeoyal
Contributor

Jeoyal commented Jul 8, 2024

Hey @journey-zhuang , thank you for your interest in our work.
Could you provide more details about the error, specifying whether it occurred during the execution of the python script or while using the online demo?
In the meantime, we will run a comprehensive check on our side; your response will help us pinpoint the error.

@journey-zhuang
Author

OK, it occurred during the execution of the Python script.
[Screenshot 2024-07-08 17:57:26]

@styleshot

styleshot commented Jul 8, 2024

It looks like an error occurred while loading the parameters of style_image_proj_modules.
The style_image_proj_modules are implemented as a ModuleList containing three projections. Each is initialized using the code below:

    ImageProjModel(
        cross_attention_dim=self.pipe.unet.config.cross_attention_dim,
        clip_embeddings_dim=self.style_aware_encoder.projection_dim,
        clip_extra_context_tokens=2,
    )

Our StyleShot is based on SD v1.5, which requires that the cross_attention_dim parameter match the value of 768 specified in the SD v1.5 configuration file.
However, it appears that your code uses a cross_attention_dim of 1024. Please confirm that your base diffusion model is indeed SD v1.5 and adjust cross_attention_dim to the correct value of 768.
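For illustration, here is a minimal, self-contained sketch of why the reported shapes appear and why rebuilding with cross_attention_dim=768 fixes the load. Note this is not the actual StyleShot implementation; the ImageProjModel below is a hypothetical stand-in whose proj and norm layers are shaped to match the error message.

```python
# Minimal sketch (hypothetical stand-in, not the real StyleShot code).
import torch.nn as nn

class ImageProjModel(nn.Module):
    def __init__(self, cross_attention_dim, clip_embeddings_dim,
                 clip_extra_context_tokens):
        super().__init__()
        # proj.weight has shape
        # [clip_extra_context_tokens * cross_attention_dim, clip_embeddings_dim]
        self.proj = nn.Linear(clip_embeddings_dim,
                              clip_extra_context_tokens * cross_attention_dim)
        # norm.weight / norm.bias have shape [cross_attention_dim]
        self.norm = nn.LayerNorm(cross_attention_dim)

# Checkpoint trained against SD v1.5 (cross_attention_dim=768):
# proj.weight is [2*768, 1024] = [1536, 1024], norm.weight is [768].
ckpt = nn.ModuleList(
    [ImageProjModel(768, 1024, 2) for _ in range(3)]
).state_dict()

# Loading it into modules built for SD 2.1 (cross_attention_dim=1024, so
# proj.weight is [2048, 1024] and norm.weight is [1024]) raises exactly the
# "size mismatch" RuntimeError shown in the issue:
err = ""
try:
    sd21 = nn.ModuleList([ImageProjModel(1024, 1024, 2) for _ in range(3)])
    sd21.load_state_dict(ckpt)
except RuntimeError as e:
    err = str(e)
print("size mismatch" in err)  # True

# Rebuilding with cross_attention_dim=768 loads cleanly:
sd15 = nn.ModuleList([ImageProjModel(768, 1024, 2) for _ in range(3)])
sd15.load_state_dict(ckpt)
```

With SD v1.5 as the base model, pipe.unet.config.cross_attention_dim is 768, so the checkpoint and the freshly built modules agree and load_state_dict succeeds.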

@journey-zhuang
Author

Oh, I accidentally used SD 2.1. My mistake; thanks to the authors for the prompt reply!
