Nwb integration #1419

Conversation
Commits:
- Also solves an error related to generateCore
- added comments to test
- …into nwb-integration
- SpikeEventSeries is also found through searchFor, but in the current nwb file there is no true ElectricalSeries in the SpikeEventSeries.
- fixed with lfp instead of ElectricalSeries
- …into nwb-integration
- In case someone calls it directly.
- also when data is stored in a DataPipe instead of a DataStub.
- …diff path lengths error
The code changes look good, although I have not been able to test them yet (I still have to download the files and dependencies). I have renamed the test script to inspect_issueXXX, which will not be executed automatically. All test_xxx scripts are executed, and in this case the script would still fail, since the data is not yet on the compute cluster and the paths have not been updated. I will first merge and then update the test script.
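For reference, a minimal sketch of the test-script convention being referred to, assuming FieldTrip's usual WALLTIME/MEM/DEPENDENCY header directives; the function name, resource values, and data path below are placeholders, not the final script:

```matlab
function test_nwb

% WALLTIME 00:30:00
% MEM 8gb
% DEPENDENCY ft_read_header ft_read_data ft_read_spike

% Scripts named test_* are executed automatically by the test batch;
% scripts named inspect_* are only run by hand.
filename = dccnpath('/home/common/matlab/fieldtrip/data/test/...');  % placeholder path on the DCCN cluster
```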
Regarding your questions:
Please give me some time to update the test script and try it, first on my local MacBook and then on the Linux compute cluster. I cannot test on Windows.
…ary, see fieldtrip#1419
I moved the test script to invalid.
Jens Klinzing (@jensklinzing), Tara van Viegen (@TaravanViegen) and I wrote code to implement loading of NWB data (both LFP and spikes) with FieldTrip.
https://github.com/jens-k/fieldtrip/tree/nwb-integration
The code is tested with matNWB release 2.2.5:
https://github.com/NeurodataWithoutBorders/matnwb/releases/tag/v2.2.5.0
It is assumed that the user is familiar with the basics of MatNWB (see the MatNWB GitHub repository) and that the system is set up so the data could also be loaded with MatNWB directly: MatNWB must be on the path, generateCore must have been run, and the correct schema must be installed (see NWB schemas).
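For example, a minimal setup sketch along those lines; the paths are placeholders, and generateCore and nwbRead are the MatNWB calls mentioned above:

```matlab
% Make MatNWB available, regenerate the API classes from the core schema,
% and read one of the example files directly with MatNWB.
addpath('/path/to/matnwb');   % MatNWB 2.2.5 checkout (placeholder path)
generateCore();               % writes the +types classes into the MatNWB folder
nwb = nwbRead('/path/to/sub-YutaMouse20_ses-YutaMouse20-140324_behavior+ecephys.nwb');
```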
We want to include a test_nwb.m script (a draft is currently in the test folder; it should also be added as markdown here: fieldtrip/website#249). To adapt this into a proper test script we have a few questions:
1. We believe some NWB datasets should be transferred to the test data folder on the DCCN server. Our test datasets are quite large (~10 GB). Is this an issue? Should we create smaller ones?
2. How should we deal with the MatNWB toolbox that has to be loaded? Will there be one stable version on the server, or should it automatically be pulled from https://github.com/NeurodataWithoutBorders/matnwb?
3. MatNWB also creates folders when it is initialized and when loading a file. Can we use the system's tempdir to create these files, and are they automatically deleted afterwards? (See the sketch below this list.)
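To illustrate question 3: a sketch of generating the MatNWB classes under the system temporary directory. MATLAB does not delete files in tempdir automatically, so the script would have to clean up after itself; the 'savedir' option of generateCore is an assumption here and may depend on the MatNWB version:

```matlab
% Sketch only: generate the MatNWB classes into a temporary folder and remove it
% afterwards. tempdir is not cleaned automatically, hence the onCleanup handler.
outdir = fullfile(tempdir, 'matnwb_generated');
if ~exist(outdir, 'dir'), mkdir(outdir); end
cleaner = onCleanup(@() rmdir(outdir, 's'));   % removes the folder when cleaner goes out of scope
generateCore('savedir', outdir);               % 'savedir' option assumed; check the MatNWB version in use
addpath(outdir);
```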
The code was tested with the following example datasets, which are freely available online (a loading sketch follows the list):

From https://gui.dandiarchive.org/#/file-browser/folder/5e6eb2b776569eb93f451f8d:
- sub-YutaMouse20_ses-YutaMouse20-140324_behavior+ecephys.nwb
- sub-YutaMouse41_ses-YutaMouse41-150831_behavior+ecephys.nwb

From https://osf.io/hv7ja/ or https://gui.dandiarchive.org/#/dandiset/5e6ff4140fe0ec33e6b3d5d2:
- P9HMH_NOID5.nwb
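A hedged loading sketch with one of these files, assuming the reader added in this PR is reachable through FieldTrip's generic ft_read_* functions (the exact format handling is what the PR implements, so treat the call pattern as illustrative; the filename is a placeholder):

```matlab
% Illustrative only: read LFP and spike data from one of the example files
% through FieldTrip's generic reading functions.
filename = 'sub-YutaMouse20_ses-YutaMouse20-140324_behavior+ecephys.nwb';
hdr   = ft_read_header(filename);   % channel names, sampling rate, etc.
dat   = ft_read_data(filename);     % continuous (LFP) data as channels x samples
spike = ft_read_spike(filename);    % spike timestamps per unit
```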
See discussion of NWB-fieldtrip integration here: #721 (comment)
Cheers,
Steffen.