Remaining issue on the MLJ interface #101
Comments
I have been trying to cross-check the various branches, and I think there is a change that now has to be implemented.

I will work on the docs in a pull request separate from the coding part, if you don't mind.
- The `train` method now returns `la, optimiser_state, history`, where `la` is the `Laplace` object. This way, the object does not need to be stored as a field of the struct, and the problem with `update` is avoided.
- `(la::AbstractLaplace)(X::AbstractArray)` now simply calls the underlying neural network on the data. In other words, it returns the generic predictions, not LA predictions.
- The `fitresult` method was also adjusted for the classification case.

Now that tests are passing, there are a few more things to do (possibly in a new issue + PR) if you like:

- `src/mlj_flux.jl` can be streamlined further (e.g. do we actually still need to overload `MLJFlux.build`?).

For now, feel free to focus on the other PR; just ping me and @MojiFarmanbar when you come back to this one. I need to move on to other things for now.
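The two interface changes above can be sketched as follows. This is a hypothetical illustration of the described behaviour, not the actual LaplaceRedux.jl source; the exact signatures of `train` and the functor may differ.

```julia
# Hedged sketch (pseudocode-style): names and signatures are assumptions
# based on the discussion above, not copied from the package.

# `train` now returns the `Laplace` object alongside the optimiser state
# and the training history, instead of storing `la` as a struct field:
la, optimiser_state, history = train(model, X, y)

# Calling the `Laplace` object on data forwards to the underlying neural
# network, i.e. it yields plain network output, not LA predictions:
ŷ = la(X)
```

Returning `la` from `train` rather than mutating a struct field is what sidesteps the `update` problem: each call produces a fresh, self-contained result that MLJ can thread through its fit/update cycle.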
Originally posted by @pat-alt in #92 (comment)