
Predicting using MLJ interface is slow #123

Merged
merged 4 commits into from
Sep 12, 2024

Conversation

@pat-alt (Member) commented Sep 12, 2024

@pasq-cat I think we should indeed just go with the direct MLJ interface #121, but for now I would still merge this because it addresses the compute times.

@pat-alt pat-alt linked an issue Sep 12, 2024 that may be closed by this pull request

codecov bot commented Sep 12, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 96.48%. Comparing base (78c846e) to head (9873c55).
Report is 5 commits behind head on main.

Additional details and impacted files
@@            Coverage Diff             @@
##             main     #123      +/-   ##
==========================================
- Coverage   96.65%   96.48%   -0.17%     
==========================================
  Files          22       22              
  Lines         658      655       -3     
==========================================
- Hits          636      632       -4     
- Misses         22       23       +1     


@pasq-cat (Contributor) left a comment:

nvm this message, I was wrong

@pat-alt (Member, Author) commented Sep 12, 2024

> shouldn't the same be done in function MLJFlux.predict(model::LaplaceRegression, fitresult, Xnew)?

Yes, I've done that.

@pat-alt pat-alt merged commit 7a0a783 into main Sep 12, 2024
8 checks passed
@pasq-cat (Contributor) commented Sep 12, 2024

btw I was sure that Laplace accepted vectors as input. Has anything been changed, or was I wrong the whole time? The documentation says AbstractArray, but I distinctly remember this requirement...

@pat-alt (Member, Author) commented Sep 12, 2024

> btw I was sure that Laplace accepted vectors as input. [...] The documentation says AbstractArray [...]

hmm, not sure where it says that: https://juliatrustworthyai.github.io/LaplaceRedux.jl/stable/reference/#LaplaceRedux.predict-Tuple{LaplaceRedux.AbstractLaplace,%20AbstractArray}
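For context, the method linked above dispatches on `AbstractArray`, and a Julia `Vector` is itself an `AbstractArray`, so "accepts vectors" and "the documentation says AbstractArray" are not in conflict. A minimal sketch of what that means for input shapes (the fitted object `la` and the `predict` calls are illustrative and commented out; only the type relationships are shown):

```julia
# The documented signature dispatches on AbstractArray:
#   predict(la::AbstractLaplace, X::AbstractArray)
# A Vector is an AbstractArray, so a single input can be passed as a
# plain vector; a batch of n inputs is a d×n matrix (Julia is column-major).
x_single = randn(3)        # one 3-dimensional input (a Vector <: AbstractArray)
X_batch  = randn(3, 10)    # ten 3-dimensional inputs, one per column

# Both would hit the same AbstractArray method on a fitted object `la`:
# predict(la, x_single)    # hypothetical call, `la` not constructed here
# predict(la, X_batch)
```

This is only a sketch of Julia's dispatch on the abstract type; see the linked reference for the actual method behavior.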

@pasq-cat (Contributor) commented Sep 12, 2024

i know, i know, it must have been something I misunderstood in the first days. Oh well.

Development

Successfully merging this pull request may close these issues.

Predict is slow
2 participants