
Ksampler Error #4985

Open
Orenji-Tangerine opened this issue Sep 19, 2024 · 2 comments
Labels
Potential Bug (User is reporting a bug. This should be tested.)

Comments

@Orenji-Tangerine

Expected Behavior

[screenshot attached in the original issue]
This only started happening after commit ad66f7c. I am not sure whether it is a ComfyUI_InstantID error or a ComfyUI error, but it occurs only after that commit. The console errors are below. I have also reproduced the same error in an environment with only a few nodes installed, which I reported on ComfyUI_InstantID (the traceback there is shorter): cubiq/ComfyUI_InstantID#218

Actual Behavior

RuntimeError: expected mat1 and mat2 to have the same dtype, but got: struct c10::Half != float
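
A minimal, self-contained PyTorch sketch (illustration only, not ComfyUI code; the shapes are made up) of the same class of failure, where a float32 activation ("mat1") is multiplied against a float16 linear weight ("mat2"):

import torch
import torch.nn.functional as F

# Hypothetical shapes, chosen only for illustration.
context = torch.randn(2, 77, 768)                    # float32 activation ("mat1")
weight = torch.randn(320, 768, dtype=torch.float16)  # float16 layer weight ("mat2")

try:
    F.linear(context, weight)
except RuntimeError as err:
    # On CUDA this reads "expected mat1 and mat2 to have the same dtype";
    # the exact wording can vary by device and PyTorch version.
    print(err)

# Casting either operand so the dtypes match avoids the error
# (assuming the device/build supports fp16 matmul, e.g. CUDA):
# out = F.linear(context.to(weight.dtype), weight)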

Steps to Reproduce

Just run a workflow that uses ComfyUI_InstantID.

Debug Logs

!!! Exception during processing !!! expected mat1 and mat2 to have the same dtype, but got: struct c10::Half != float
Traceback (most recent call last):
  File "A:\ComfyUI\execution.py", line 323, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "A:\ComfyUI\execution.py", line 198, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "A:\ComfyUI\execution.py", line 169, in _map_node_over_list
    process_inputs(input_dict, i)
  File "A:\ComfyUI\execution.py", line 158, in process_inputs
    results.append(getattr(obj, func)(**inputs))
  File "A:\ComfyUI\nodes.py", line 1430, in sample
    return common_ksampler(model, seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=denoise)
  File "A:\ComfyUI\nodes.py", line 1397, in common_ksampler
    samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,
  File "A:\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\sample_error_enhancer.py", line 22, in informative_sample
    raise e
  File "A:\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\sample_error_enhancer.py", line 9, in informative_sample
    return original_sample(*args, **kwargs)  # This code helps interpret error messages that occur within exceptions but does not have any impact on other operations.
  File "A:\ComfyUI\custom_nodes\ComfyUI-AnimateDiff-Evolved\animatediff\sampling.py", line 420, in motion_sample
    return orig_comfy_sample(model, noise, *args, **kwargs)
  File "A:\ComfyUI\custom_nodes\ComfyUI-Advanced-ControlNet\adv_control\sampling.py", line 116, in acn_sample
    return orig_comfy_sample(model, *args, **kwargs)
  File "A:\ComfyUI\custom_nodes\ComfyUI-Advanced-ControlNet\adv_control\utils.py", line 117, in uncond_multiplier_check_cn_sample
    return orig_comfy_sample(model, *args, **kwargs)
  File "A:\ComfyUI\comfy\sample.py", line 43, in sample
    samples = sampler.sample(noise, positive, negative, cfg=cfg, latent_image=latent_image, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, denoise_mask=noise_mask, sigmas=sigmas, callback=callback, disable_pbar=disable_pbar, seed=seed)
  File "A:\ComfyUI\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 104, in KSampler_sample
    return orig_fn(*args, **kwargs)
  File "A:\ComfyUI\comfy\samplers.py", line 829, in sample
    return sample(self.model, noise, positive, negative, cfg, self.device, sampler, sigmas, self.model_options, latent_image=latent_image, denoise_mask=denoise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)
  File "A:\ComfyUI\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 122, in sample
    return orig_fn(*args, **kwargs)
  File "A:\ComfyUI\comfy\samplers.py", line 729, in sample
    return cfg_guider.sample(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
  File "A:\ComfyUI\comfy\samplers.py", line 716, in sample
    output = self.inner_sample(noise, latent_image, device, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
  File "A:\ComfyUI\comfy\samplers.py", line 695, in inner_sample
    samples = sampler.sample(self, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
  File "A:\ComfyUI\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 87, in KSAMPLER_sample
    return orig_fn(*args, **kwargs)
  File "A:\ComfyUI\comfy\samplers.py", line 600, in sample
    samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)
  File "A:\ComfyUI\python\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "A:\ComfyUI\comfy\k_diffusion\sampling.py", line 612, in sample_dpmpp_sde
    denoised = model(x, sigmas[i] * s_in, **extra_args)
  File "A:\ComfyUI\comfy\samplers.py", line 299, in __call__
    out = self.inner_model(x, sigma, model_options=model_options, seed=seed)
  File "A:\ComfyUI\comfy\samplers.py", line 682, in __call__
    return self.predict_noise(*args, **kwargs)
  File "A:\ComfyUI\comfy\samplers.py", line 685, in predict_noise
    return sampling_function(self.inner_model, x, timestep, self.conds.get("negative", None), self.conds.get("positive", None), self.cfg, model_options=model_options, seed=seed)
  File "A:\ComfyUI\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 162, in sampling_function
    out = orig_fn(*args, **kwargs)
  File "A:\ComfyUI\comfy\samplers.py", line 279, in sampling_function
    out = calc_cond_batch(model, conds, x, timestep, model_options)
  File "A:\ComfyUI\comfy\samplers.py", line 202, in calc_cond_batch
    c['control'] = control.get_control(input_x, timestep_, c, len(cond_or_uncond))
  File "A:\ComfyUI\comfy\controlnet.py", line 253, in get_control
    control = self.control_model(x=x_noisy.to(dtype), hint=self.cond_hint, timesteps=timestep.to(dtype), context=context.to(dtype), **extra)
  File "A:\ComfyUI\python\lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "A:\ComfyUI\python\lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
  File "A:\ComfyUI\comfy\cldm\cldm.py", line 430, in forward
    h = module(h, emb, context)
  File "A:\ComfyUI\python\lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "A:\ComfyUI\python\lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
  File "A:\ComfyUI\comfy\ldm\modules\diffusionmodules\openaimodel.py", line 60, in forward
    return forward_timestep_embed(self, *args, **kwargs)
  File "A:\ComfyUI\comfy\ldm\modules\diffusionmodules\openaimodel.py", line 44, in forward_timestep_embed
    x = layer(x, context, transformer_options)
  File "A:\ComfyUI\python\lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "A:\ComfyUI\python\lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
  File "A:\ComfyUI\comfy\ldm\modules\attention.py", line 694, in forward
    x = block(x, context=context[i], transformer_options=transformer_options)
  File "A:\ComfyUI\python\lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "A:\ComfyUI\python\lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
  File "A:\ComfyUI\custom_nodes\ComfyUI-Easy-Use\py\layer_diffuse\attension_sharing.py", line 252, in forward
    return func(self, x, context, transformer_options)
  File "A:\ComfyUI\comfy\ldm\modules\attention.py", line 621, in forward
    n = self.attn2(n, context=context_attn2, value=value_attn2)
  File "A:\ComfyUI\python\lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "A:\ComfyUI\python\lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
  File "A:\ComfyUI\comfy\ldm\modules\attention.py", line 467, in forward
    k = self.to_k(context)
  File "A:\ComfyUI\python\lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "A:\ComfyUI\python\lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
  File "A:\ComfyUI\comfy\ops.py", line 78, in forward
    return super().forward(*args, **kwargs)
  File "A:\ComfyUI\python\lib\site-packages\torch\nn\modules\linear.py", line 114, in forward
    return F.linear(input, self.weight, self.bias)
RuntimeError: expected mat1 and mat2 to have the same dtype, but got: struct c10::Half != float
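
The failing frame is F.linear inside the to_k projection of attn2 (comfy/ldm/modules/attention.py via comfy/ops.py), which suggests the cross-attention context reaching the half-precision ControlNet is still float32 while the layer weights are float16. Below is a generic sketch of the kind of cast that reconciles the two dtypes; linear_matching_weight_dtype is a hypothetical helper for illustration, not ComfyUI's actual code and not a confirmed fix for this issue:

import torch
import torch.nn as nn

def linear_matching_weight_dtype(layer: nn.Linear, x: torch.Tensor) -> torch.Tensor:
    # Cast the activation to the layer's weight dtype before the matmul.
    # Illustration only; where the real cast belongs (ControlNet forward,
    # the ops layer, or the InstantID patch) is exactly what this issue is about.
    if x.dtype != layer.weight.dtype:
        x = x.to(layer.weight.dtype)
    return layer(x)

# A float16 layer fed a float32 input would otherwise raise the mat1/mat2 dtype error.
# fp16 matmul is assumed to be supported (the normal case on CUDA; CPU needs a recent PyTorch).
device = "cuda" if torch.cuda.is_available() else "cpu"
layer = nn.Linear(768, 320).to(device=device, dtype=torch.float16)
context = torch.randn(2, 77, 768, device=device)      # float32 input
out = linear_matching_weight_dtype(layer, context)    # out.dtype == torch.float16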

Other

No response

@Orenji-Tangerine added the Potential Bug label on Sep 19, 2024
@vneznaikin

same today

@hushfdop

same
