Describe the bug
Storing the Hilbert HFO statistic array is cumbersome, especially for a gigantic dataset.
For example, the sickkids E6 dataset produces a statistic array that is too large to allocate.
Extracting parameters from /home/adam2392/hdd3/sickkids/sub-E6/ses-preresection/ieeg/sub-E6_ses-preresection_task-pre_acq-ecog_run-01_ieeg.vhdr...
Setting channel info structure...
Reading channel info from /home/adam2392/hdd3/sickkids/sub-E6/ses-preresection/ieeg/sub-E6_ses-preresection_task-pre_acq-ecog_run-01_channels.tsv.
Reading electrode coords from /home/adam2392/hdd3/sickkids/sub-E6/ses-preresection/ieeg/sub-E6_ses-preresection_acq-ecog_space-fs_electrodes.tsv.
<ipython-input-11-4873fb17ff74>:4: RuntimeWarning: Did not find any events.tsv associated with sub-E6_ses-preresection_task-pre_acq-ecog_run-01.
The search_str was "/home/adam2392/hdd3/sickkids/sub-E6/**/sub-E6_ses-preresection*events.tsv"
raw = read_raw_bids(fpath, verbose=verbose)
<ipython-input-11-4873fb17ff74>:4: RuntimeWarning: Defaulting coordinate frame to unknown from coordinate system input Other
raw = read_raw_bids(fpath, verbose=verbose)
<ipython-input-11-4873fb17ff74>:4: RuntimeWarning: There are channels without locations (n/a) that are not marked as bad: ['C1', 'C2', 'C3', 'C4', 'C5', 'C6', 'C7', 'C8', 'C9', 'C10', 'C11', 'C12', 'C13', 'C14', 'C16', 'C17', 'C18', 'C19', 'C20', 'C21', 'C22', 'C23', 'C24', 'C25', 'C26', 'C27', 'C28', 'C29', 'C30', 'C31', 'C32', 'C33', 'C34', 'C35', 'C36', 'C37', 'C38', 'C39', 'C40', 'C41', 'C42', 'C43', 'C44', 'C45', 'C46', 'C47', 'C48', 'C49', 'C50', 'C51', 'C52', 'C53', 'C54', 'C55', 'C56', 'C57', 'C58', 'C59', 'C60', 'C61', 'C62', 'C63', 'C64', '1D1', '1D2', '1D3', '1D4', '1D5', '1D6', '2D1', '2D2', '2D3', '2D4', '2D5', '2D6', '3D1', '3D2', '3D3', '3D4', '3D5', '3D6', 'C83', 'C84', 'C85', 'C86', 'C87', 'C88', 'C89', 'C90', 'C91', 'C92', 'C93', 'C94', 'C95', 'C96', 'C97', 'C98', 'C99', 'C100', 'C101', 'C102', 'C103', 'C104', 'C105', 'C106', 'C107', 'C108', 'C109', 'C110', 'C111', 'C112', 'C113', 'C114', 'C115', 'C116', 'C117', 'C118', 'C119', 'C120', 'C121', 'C122', 'C123']
raw = read_raw_bids(fpath, verbose=verbose)
<ipython-input-11-4873fb17ff74>:4: RuntimeWarning: Fiducial point nasion not found, assuming identity unknown to head transformation
raw = read_raw_bids(fpath, verbose=verbose)
---------------------------------------------------------------------------
MemoryError Traceback (most recent call last)
<ipython-input-11-4873fb17ff74> in <module>
27
28 # run HFO detection
---> 29 detector.fit(raw)
30
31 # extract the resulting annotations dataframe
~/Documents/mne-hfo/mne_hfo/base.py in fit(self, X, y)
350
351 chs_hfos = {}
--> 352 self.hfo_event_arr_ = self._create_empty_event_arr()
353 if self.n_jobs == 1:
354 for idx in tqdm(range(self.n_chs)):
~/Documents/mne-hfo/mne_hfo/detect.py in _create_empty_event_arr(self)
108 n_windows = self.n_times
109 n_bands = len(self.freq_cutoffs) - 1
--> 110 hfo_event_arr = np.empty((self.n_chs, n_bands, n_windows))
111 return hfo_event_arr
112
MemoryError: Unable to allocate 121. GiB for an array with shape (72, 61, 3704064) and data type float64
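For reference, the failing allocation is the full-resolution float64 statistic array for the entire recording, and its size matches the shape in the error (this arithmetic is just a sanity check, not part of the traceback):

# n_chs * n_bands * n_times * 8 bytes per float64, converted to GiB
72 * 61 * 3704064 * 8 / 2**30  # ~121.2 GiB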
Expected results
What HilbertDetector should be able to do is:
i) determine the available RAM on the machine
ii) determine the window chunks to apply (maybe we hard-code it to at most 1 GB?), based on the number of channels and the number of frequency bins -> get the number of windows that amounts to roughly 1 GB for the array
iii) run a loop over those windows to compute HFO events and store the intermediate results to temporary disk storage
iv) combine them into an HFO annotations dataframe (note that the edges of each window chunk need to be handled smartly); a rough sketch of this chunked approach follows after this list
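A minimal sketch of what this could look like, under stated assumptions: compute_chunk_bounds, fit_in_chunks, and detect_chunk are illustrative names that do not exist in mne-hfo, and psutil is an assumed extra dependency used only to query available RAM.

import tempfile

import numpy as np
import psutil  # assumed dependency, only used to query available RAM


def compute_chunk_bounds(n_chs, n_bands, n_times, max_bytes=1e9,
                         overlap=0, dtype=np.float64):
    # steps i) + ii): size each chunk so the (n_chs, n_bands, chunk_len)
    # block stays under max_bytes and under half of the available RAM
    bytes_per_sample = n_chs * n_bands * np.dtype(dtype).itemsize
    budget = min(max_bytes, 0.5 * psutil.virtual_memory().available)
    chunk_len = max(overlap + 1, int(budget // bytes_per_sample))
    bounds, start = [], 0
    while start < n_times:
        stop = min(start + chunk_len, n_times)
        bounds.append((start, stop))
        # step back by `overlap` samples so an event straddling a chunk
        # boundary is seen in full by at least one chunk
        start = stop if stop == n_times else stop - overlap
    return bounds


def fit_in_chunks(detect_chunk, data, n_bands, max_bytes=1e9, overlap=0):
    # steps iii) + iv): loop over chunks, spill each chunk's statistic
    # array to disk, and collect events with absolute sample indices
    n_chs, n_times = data.shape
    events = []
    for start, stop in compute_chunk_bounds(n_chs, n_bands, n_times,
                                            max_bytes, overlap):
        stat_arr, chunk_events = detect_chunk(data[:, start:stop])
        with tempfile.NamedTemporaryFile(suffix=".npy", delete=False) as f:
            np.save(f, stat_arr)  # keep the big array on disk, not in RAM
        events.extend((ch, start + onset, start + offset)
                      for ch, onset, offset in chunk_events)
    # naive edge handling: drop exact duplicates created by the overlap;
    # a real implementation would merge overlapping detections instead
    return sorted(set(events))

Here detect_chunk stands in for whatever per-chunk Hilbert computation the detector already performs internally; the per-chunk .npy files could later be read back with np.load(..., mmap_mode='r') if the full statistic array is still needed.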