Add logL as model output by adelavega · Pull Request #230 · poldracklab/fitlins
Add logL as model output #230

Merged: adelavega merged 6 commits into master from enh/logl on Apr 9, 2020
Conversation

adelavega (Collaborator)

• Also output log-likelihood (logL) from model

pep8speaks commented Apr 6, 2020

Hello @adelavega, Thank you for updating!

Cheers! There are no style issues detected in this Pull Request. 🍻 To test for issues locally, pip install flake8 and then run flake8 fitlins.

Comment last updated at 2020-04-08 22:12:04 UTC

adelavega (Collaborator, Author)

@effigies @jdkent I'm testing out adding log-likelihood to the fitlins outputs.

Unfortunately, it was not available via _get_voxelwise_model_attribute, because logL is not an attribute of SimpleRegressionResults, and that's what's being checked when the function validates its arguments (it should be checking RegressionResults, which is the type of object stored when minimize_memory is False). It's also not an attribute but a method of the result object.

nistats is archived, so I can't get that fix upstream; the custom function to retrieve it will have to live here for now.
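
For reference, a minimal sketch of what such a helper can look like (assuming the nistats FirstLevelModel internals labels_, results_, and masker_, and a model fit with minimize_memory=False; this mirrors the idea, not necessarily the exact code in this PR):

import numpy as np


def _get_voxelwise_logL(flm):
    """Collect voxelwise log-likelihood from a fitted FirstLevelModel.

    Assumes minimize_memory=False so that full RegressionResults
    objects (whose logL is computed lazily on first access) are
    stored in flm.results_.
    """
    output = []
    for labels, results in zip(flm.labels_, flm.results_):
        voxelwise_logL = np.zeros((1, len(labels)))
        for label_ in results:
            label_mask = labels == label_
            # logL is a lazily computed attribute on RegressionResults
            voxelwise_logL[:, label_mask] = results[label_].logL
        # Project the per-voxel values back into input nifti space
        output.append(flm.masker_.inverse_transform(voxelwise_logL))
    return output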

adelavega changed the title from Add logL as model output to WIP: Add logL as model output on Apr 7, 2020
adelavega changed the title from WIP: Add logL as model output to Add logL as model output on Apr 7, 2020
adelavega (Collaborator, Author)

Okay, great: the maps look good.

The only other point is what the key should be. logL would match nistats, but we could go with logLikelihood for more clarity.

jdkent (Contributor) commented Apr 8, 2020

Thanks for investigating this @adelavega, and sorry it was more painful to implement than anticipated.

> Unfortunately, it was not available via _get_voxelwise_model_attribute, because logL is not an attribute of SimpleRegressionResults

I don't think _get_voxelwise_model_attribute would work at all with SimpleRegressionResults, since the function raises an error if minimize_memory is True (which means the model is using SimpleRegressionResults instead of RegressionResults to store output).

Your point still stands that logL is also not an attribute directly on RegressionResults:

>>> dict(vars(RegressionResults)).keys()
dict_keys(['__module__', '__doc__', '__init__', 'wdesign', 'wY', 'wresid', 'resid', 'residuals', 'norm_resid', 'normalized_residuals', 'predicted', 'SSE', 'r_square', 'MSE'])

But logL is an attribute of LikelihoodModelResults (which RegressionResults inherits from):

>>> dict(vars(LikelihoodModelResults)).keys()
dict_keys(['__module__', '__doc__', '__init__', 'df_resid', 'logL', 't', 'vcov', 'Tcontrast', 'Fcontrast', 'conf_int', '__dict__', '__weakref__'])
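
As an aside, vars() only lists a class's own __dict__, so inherited names never appear there; dir() does walk the inheritance chain (an illustrative check, standard Python behavior):

>>> 'logL' in vars(RegressionResults)
False
>>> 'logL' in dir(RegressionResults)  # dir() walks the MRO
True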

I did not do a diligent check to confirm that attributes from the parent class (LikelihoodModelResults) carry over to the subclass (RegressionResults), likely because of how @setattr_on_read works.

> It's also not an attribute but a method of the result object.

logL is a method on SimpleRegressionResults, but it should be an attribute on RegressionResults (once calculated), unless my understanding of @setattr_on_read is wrong.
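
For context, here is a minimal illustration of the lazy-attribute pattern that @setattr_on_read implements (an illustrative reimplementation, not the nipy/nistats source):

class setattr_on_read:
    """Illustrative descriptor: compute a value on first access, then
    cache it as a plain instance attribute so later reads skip
    recomputation."""
    def __init__(self, func):
        self.func = func

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        value = self.func(obj)
        # Shadow the descriptor with the computed value on the instance
        setattr(obj, self.func.__name__, value)
        return value


class Results:
    @setattr_on_read
    def logL(self):
        print('computing logL once')
        return -123.4


r = Results()
r.logL  # prints 'computing logL once'; computed and cached
r.logL  # plain attribute read now; no recomputation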

So I would change:

all_attributes = dict(vars(RegressionResults)).keys()

to:

all_attributes = set(dict(vars(LikelihoodModelResults)).keys()).union(dict(vars(RegressionResults)).keys())

so that any attribute RegressionResults has directly or inherits is valid.
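
For reference, an equivalent way to gather every non-dunder attribute across the whole inheritance chain (a sketch, not part of this diff) is to walk the MRO:

# Covers RegressionResults, LikelihoodModelResults, and any deeper
# ancestors without naming each parent class explicitly.
all_attributes = {
    name
    for klass in RegressionResults.__mro__
    for name in vars(klass)
    if not name.startswith('__')
}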

Finally, I also like logLikelihood for more clarity.


The inline review comments below are attached to the new helper in the diff:

# Manually compute log-likelihood
def _get_voxelwise_logL(flm):
Contributor:
I don't know if it would be easier, but you could also patch the FirstLevelModel class with a more correct _get_voxelwise_model_attribute definition and use FirstLevelModelPatch here. The main benefit is that if this change is pulled into nilearn/nistats, you can simply delete the patched class and go back to the "normal" one, without changing how you solve the problem (as you would with a custom _get_voxelwise_logL function).

I may be overthinking this, since your solution works, is also good, and is ultimately fewer lines of code.

import numpy as np

from nistats.first_level_model import FirstLevelModel
from nistats.model import LikelihoodModelResults
from nistats.regression import RegressionResults


class FirstLevelModelPatch(FirstLevelModel):
    def _get_voxelwise_model_attribute(self, attribute,
                                       result_as_time_series):
        """Transform RegressionResults instances within a dictionary
        (whose keys represent the autoregressive coefficient under the 'ar1'
        noise model or only 0.0 under 'ols' noise_model and values are the
        RegressionResults instances) into input nifti space.
        Parameters
        ----------
        attribute : str
            an attribute of a RegressionResults instance.
            possible values include: resid, norm_resid, predicted,
            SSE, r_square, MSE.
        result_as_time_series : bool
            whether the RegressionResult attribute has a value
            per timepoint of the input nifti image.
        Returns
        -------
        output : list
            a list of Nifti1Image(s)
        """
        # check if valid attribute is being accessed.
        # CHANGING THIS LINE TO BE BETTER: include attributes inherited
        # from LikelihoodModelResults as well
        all_attributes = set(dict(vars(LikelihoodModelResults)).keys()).union(
            dict(vars(RegressionResults)).keys())
        possible_attributes = [prop
                               for prop in all_attributes
                               if '__' not in prop
                               ]
        if attribute not in possible_attributes:
            msg = ("attribute must be one of: "
                   "{attr}".format(attr=possible_attributes)
                   )
            raise ValueError(msg)

        if self.minimize_memory:
            raise ValueError(
                'To access voxelwise attributes like '
                'R-squared, residuals, and predictions, '
                'the `FirstLevelModel`-object needs to store '
                'these attributes. '
                'To do so, set `minimize_memory` to `False` '
                'when initializing the `FirstLevelModel`-object.')

        if self.labels_ is None or self.results_ is None:
            raise ValueError('The model has not been fit yet')

        output = []

        for design_matrix, labels, results in zip(self.design_matrices_,
                                                  self.labels_,
                                                  self.results_
                                                  ):
            if result_as_time_series:
                voxelwise_attribute = np.zeros((design_matrix.shape[0],
                                                len(labels))
                                               )
            else:
                voxelwise_attribute = np.zeros((1, len(labels)))

            for label_ in results:
                label_mask = labels == label_
                voxelwise_attribute[:, label_mask] = getattr(results[label_],
                                                             attribute)

            output.append(self.masker_.inverse_transform(voxelwise_attribute))

        return output
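
Usage would then be a drop-in swap (a hypothetical call; run_imgs and design_matrices stand for the usual fit inputs):

# Same API as FirstLevelModel, but 'logL' now passes the attribute check
flm = FirstLevelModelPatch(t_r=2.0, minimize_memory=False)
flm.fit(run_imgs, design_matrices=design_matrices)
logl_imgs = flm._get_voxelwise_model_attribute(
    'logL', result_as_time_series=False)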

Collaborator:
I think subclassing things from nistats is probably going to be more effort than it's worth at this point, but it's worth thinking about making these contributions upstream to nilearn post-0.7.0.

@adelavega Can you define this function at the file level, rather than nested inside this scope?

Collaborator (Author):
Yeah, I can do that. I think that's easiest for now.

Contributor:
Sounds good to me, I can reference an issue in nilearn.

codecov-io

Codecov Report

Merging #230 into master will increase coverage by 0.25%.
The diff coverage is 100.00%.


@@            Coverage Diff             @@
##           master     #230      +/-   ##
==========================================
+ Coverage   77.62%   77.87%   +0.25%     
==========================================
  Files          18       18              
  Lines        1059     1071      +12     
  Branches      189      190       +1     
==========================================
+ Hits          822      834      +12     
  Misses        149      149              
  Partials       88       88              
Flag     Coverage Δ
#ds003   77.87% <100.00%> (+0.25%) ⬆️

Impacted Files                  Coverage Δ
fitlins/workflows/base.py       64.13% <ø> (ø)
fitlins/interfaces/nistats.py   85.05% <100.00%> (+1.10%) ⬆️

Continue to review the full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update abe803d...bc0c52d.

adelavega (Collaborator, Author)

Looks like the tests need to be re-run; random installation failure.

effigies (Collaborator) commented Apr 8, 2020

Tests passing. Happy with this?

adelavega (Collaborator, Author)

Let me test locally once more.

adelavega (Collaborator, Author)

Actually, let's hold off: @tyarkoni is checking whether this is the log-likelihood we want.

adelavega (Collaborator, Author)

OK, it looks good.

adelavega merged commit ab47bf2 into master on Apr 9, 2020
effigies deleted the enh/logl branch on March 30, 2022