Use NNlib.bias_act!
#2327
base: master
```diff
@@ -196,10 +196,9 @@ ChainRulesCore.@non_differentiable conv_dims(::Any, ::Any)
 function (c::Conv)(x::AbstractArray)
     _conv_size_check(c, x)
-    σ = NNlib.fast_act(c.σ, x)
     cdims = conv_dims(c, x)
     xT = _match_eltype(c, x)
-    σ.(conv(xT, c.weight, cdims) .+ conv_reshape_bias(c))
+    NNlib.bias_act!(c.σ, conv(xT, c.weight, cdims), conv_reshape_bias(c))
 end

 _channels_in(l::Conv) = size(l.weight, ndims(l.weight)-1) * l.groups
```

Review discussion:

GPUCompiler doesn't like this when […]

Thanks for digging. Error is on […] where ComposedFunction comes from here: […] Agree it's odd that Dense doesn't hit the same.

I can replicate this issue with just CUDA.jl and NNlib, so we should consider adding some GPU tests for […]

Have a theory now based on more testing. Edit: confirmed with Cthulhu. Not sure what the best course of action here would be. Do we rely heavily on the […]

Could always override […]

This might be a good PR to test the new benchmarking tool too.

Good point. Allowing this is precisely why […]

Unfortunately, it looks like this error still persists :(

Rebased to see how it worked with Enzyme etc, but still didn't get around to fixing this error. Can save a lot of memory but haven't seen much of a speedup out of it.
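For context on what the change buys: `NNlib.bias_act!(σ, x, b)` computes `σ.(x .+ b)` while writing the result back into `x`, so the intermediate array that the broadcast version allocates for `x .+ b` is avoided. A minimal sketch of the intended equivalence (illustrative values, not the actual NNlib implementation):

```julia
using NNlib  # provides bias_act! and relu

x = randn(Float32, 4, 3)
b = randn(Float32, 4)

# Fused path: mutates its second argument in place.
y_fused = NNlib.bias_act!(relu, copy(x), b)

# Unfused path: allocates a temporary for x .+ b, then broadcasts relu.
y_unfused = relu.(x .+ b)

# The two should agree elementwise: y_fused ≈ y_unfused
```

In the layer code above the bias comes from `conv_reshape_bias(c)`, which reshapes `c.bias` so it broadcasts against the convolution output; the `b` here is a stand-in for that.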
```diff
@@ -332,10 +331,9 @@ ChainRulesCore.@non_differentiable conv_transpose_dims(::Any, ::Any)
 function (c::ConvTranspose)(x::AbstractArray)
     _conv_size_check(c, x)
-    σ = NNlib.fast_act(c.σ, x)
     cdims = conv_transpose_dims(c, x)
     xT = _match_eltype(c, x)
-    σ.(∇conv_data(xT, c.weight, cdims) .+ conv_reshape_bias(c))
+    NNlib.bias_act!(c.σ, ∇conv_data(xT, c.weight, cdims), conv_reshape_bias(c))
 end

 function Base.show(io::IO, l::ConvTranspose)
```
```diff
@@ -474,10 +472,9 @@ ChainRulesCore.@non_differentiable crosscor_dims(::Any, ::Any)
 function (c::CrossCor)(x::AbstractArray)
     _conv_size_check(c, x)
-    σ = NNlib.fast_act(c.σ, x)
     cdims = crosscor_dims(c, x)
     xT = _match_eltype(c, x)
-    σ.(crosscor(xT, c.weight, cdims) .+ conv_reshape_bias(c))
+    NNlib.bias_act!(c.σ, crosscor(xT, c.weight, cdims), conv_reshape_bias(c))
 end

 function Base.show(io::IO, l::CrossCor)
```
```diff
@@ -246,7 +246,7 @@ function _norm_layer_forward(
     β = reshape(l.β, affine_shape)

     scale = γ ./ sqrt.(σ² .+ eps)
-    bias = -scale .* μ .+ β
+    bias = .-scale .* μ .+ β
     l.λ.(scale .* x .+ bias)
 end
```

Comment on lines 248 to 251:

Unrelated change, but surely a typo? I considered using […]

If anything I would've expected it on the line below (248).

Yes that's what I meant, sorry. But while there, I spotted the missing dot.