@Rockdeldiablo I've now added the following changes to make things work without having to overload the `update` method or change it in MLJFlux:
The `train` method now returns `la, optimiser_state, history`, where `la` is the `Laplace` object. This way, the object does not need to be stored as a field of the struct, and the problem with `update` is avoided.
To facilitate this change, calling a Laplace object on an array, `(la::AbstractLaplace)(X::AbstractArray)`, now simply calls the underlying neural network on the data. In other words, it returns the generic predictions, not LA predictions.
The `fitresult` method was also adjusted for the classification case.
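To illustrate the interface change, here is a minimal sketch in Julia. The `model`, `X`, and `y` bindings are hypothetical placeholders; only the return signature of `train` and the call behavior of `(la::AbstractLaplace)(X)` come from the description above.

```julia
# Hedged sketch, not a definitive implementation.
# `train` now returns the Laplace object alongside the optimiser state
# and training history, instead of storing `la` on the model struct:
la, optimiser_state, history = train(model, X, y)

# Calling the Laplace object on an array now just forwards to the
# underlying neural network, i.e. generic predictions, not LA predictions:
ŷ = la(X)
```

Returning `la` from `train` keeps the model struct stateless with respect to the fitted Laplace object, which is what avoids the `update` overloading problem.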
Now that tests are passing, there are a few more things to do (possibly in a new issue + PR) if you like:

- Add a (short) tutorial to the documentation.
- Double-check if the code in `src/mlj_flux.jl` can be streamlined further (e.g. do we actually still need to overload `MLJFlux.build`?).
For now, feel free to focus on the other PR; just ping me and @MojiFarmanbar when you come back to this one. I need to move on to other things for now.
Originally posted by @pat-alt in #92 (comment)