Support moving average-based PCA dimensionality estimation #51
Comments
I chatted with the other tedana devs and we're planning to extract the MA-PCA code from tedana into its own small package, so that other tools (in particular rapidtide) will be able to use the methods without adding tedana as a dependency. We're just waiting to hear back from the GIFT devs (from whom we got a MATLAB-based MA-PCA implementation that we then converted to Python) about licensing and whatnot. Hopefully we'll have a separate package available for rapidtide to use soon!

Cool!

We finally released …

Yes, I would be!

The MA-PCA approach works by downsampling imaging data in image-space (i.e., as 4D arrays of shape X x Y x Z x T), so would it be possible to use nibabel Nifti1Image objects in the relevant functions?
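Since MA-PCA operates on image-space 4D arrays while PCA itself works on 2D (voxels × time) matrices, accepting nibabel `Nifti1Image` objects mostly means plumbing between the two layouts. A minimal NumPy sketch of that plumbing, assuming the 4D array comes from something like `Nifti1Image.get_fdata()` (the helper names here are illustrative, not an actual tedana or rapidtide API):

```python
import numpy as np

# Hypothetical plumbing for an MA-PCA entry point that accepts image-space
# data: a 4D array of shape (X, Y, Z, T). With nibabel, such an array would
# typically come from Nifti1Image.get_fdata().
def to_voxels_by_time(data_4d, mask):
    """Flatten a 4D (X, Y, Z, T) array to (n_masked_voxels, T)."""
    return data_4d[mask]  # boolean mask over the three spatial axes

def from_voxels_by_time(data_2d, mask):
    """Scatter a (n_masked_voxels, T) array back onto the 4D image grid."""
    out = np.zeros(mask.shape + (data_2d.shape[1],))
    out[mask] = data_2d
    return out

rng = np.random.default_rng(0)
data = rng.standard_normal((4, 5, 6, 10))   # toy X x Y x Z x T volume
mask = data.std(axis=-1) > 0                # trivially all-True mask here
flat = to_voxels_by_time(data, mask)        # shape (120, 10)
restored = from_voxels_by_time(flat, mask)
assert np.allclose(restored, data)
```

The round trip is lossless inside the mask, so a wrapper could unpack an image, run the estimation on the 2D matrix, and repack results without the caller ever leaving `Nifti1Image` objects.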
**Is your feature request related to a problem? Please describe.**

Using MA-PCA would serve as an alternative to the buggy MLE-PCA in `scikit-learn`.

**Describe the solution you'd like**

`tedana` implements MA-PCA for dimensionality reduction. We can incorporate this approach into `rapidtide`, whether by adding `tedana` as a dependency or (more likely) by copying the code over directly (with attribution, of course), since `tedana` has a fairly permissive license.

**Describe alternatives you've considered**

Currently @bbfrederick has implemented a workaround for failed MLE-PCA that chooses the components which explain 80% of the variance.

**Additional context**

This stems from 1308d59#r44963454.
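The 80%-variance fallback mentioned above can be sketched in a few lines of NumPy: compute the explained-variance ratios from the singular values and keep the smallest number of components whose cumulative sum reaches the threshold. This is an illustrative sketch, not rapidtide's actual implementation:

```python
import numpy as np

# Sketch of the fallback described above: when MLE-PCA fails, keep the
# smallest number of components whose cumulative explained variance
# reaches 80%. Function and variable names are illustrative.
def n_components_for_variance(data, threshold=0.80):
    """data: (n_samples, n_features) array; returns a component count."""
    centered = data - data.mean(axis=0)
    s = np.linalg.svd(centered, compute_uv=False)  # singular values
    var_ratio = s**2 / np.sum(s**2)                # explained-variance ratios
    cumulative = np.cumsum(var_ratio)
    # Index of the first cumulative value >= threshold, converted to a count
    return int(np.searchsorted(cumulative, threshold) + 1)

rng = np.random.default_rng(1)
# Rank-5 toy data: all variance lives in five directions
data = rng.standard_normal((200, 5)) @ rng.standard_normal((5, 50))
k = n_components_for_variance(data, 0.80)
assert 1 <= k <= 5
```

Note that `scikit-learn` exposes the same behavior directly: passing a float in (0, 1) as `n_components` to `sklearn.decomposition.PCA` selects the number of components needed to explain that fraction of variance.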