Can't load larger Image file #87
Thanks for asking about this! Yeah, memory shouldn't be an issue here because of memmap (and your computer's huge amount of RAM!). Are you trying to load the
This should import it to
I tried this but it doesn't work:
But it still creates the image5d.npy file even if it can't load it afterwards (there's no .yml file). I think it's an issue with the z-t dimensions. Sometimes (I can't find out when) ImageJ saves the slices as timepoints, so I have to re-transform the image, but I can't do this now because it takes too much time (at least 2 days); I'll try it again afterwards. In the meantime I cut a small 100 vx³ block out of the dataset to check the cell detection in isolation. But the learning curve for the GUI and functions is so steep that I can't handle it now, so I will do the cell count using ClearMap2 as I did before. Time is running out and there's so much left to do.
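(As a sketch of the z/t mix-up described above, assuming the axes merely ended up swapped during import: NumPy can exchange the two axes as a zero-copy view, so even a huge or memory-mapped volume isn't duplicated. The shapes below are hypothetical toy values.)

```python
import numpy as np

# Hypothetical volume whose z slices were saved as timepoints,
# giving shape (z, t, y, x) = (10, 1, 4, 4) instead of (t, z, y, x).
img = np.zeros((10, 1, 4, 4), dtype=np.uint8)

# np.swapaxes returns a view: no data is copied, even for
# memory-mapped arrays, so this is cheap regardless of size.
fixed = np.swapaxes(img, 0, 1)
print(fixed.shape)  # (1, 10, 4, 4)
```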
Thanks for posting this. Yeah I agree that it looks like something to do with the z- and t-axes. Is the shape of your image z = 2581, y = 7578, x = 5735? The image is being interpreted that way, though I am not sure why it triggers the invalid z index in Bio-Formats. Did your smaller block load ok?
I completely understand. Makes good sense to use what you're familiar with. Always open to feedback on how to flatten the learning curve whenever you get the chance. Thanks for all your feedback thus far! By the way, I am working on an update to load TIF files directly, without requiring the import step and without requiring the whole image to be loaded. I hope this will help the setup/learning curve at least a little.
Yes, it did, and the detection seems OK, but I can't validate it because of GUI issues. Also, none of the settings do anything for me: "Scale detections" just says 0 and I can't change it, and in the ROI Editor the Filter, Bordered, Seg, Grid, and MIP options have no effect. The detection itself seems OK, but I couldn't find out how to plot the detected cells onto the large top images (I can only see them in the smaller images below in the ROI Editor view). I don't know if this means anything, but ClearMap found 2400 cells while mag only found 62; it's most likely a threshold issue.
That sounds like a good idea. For me, the most important challenge is better handling of larger datasets. Imaris can display a dataset of over 500 GB in 3D in less than 10 s with its proprietary IMS format and scale it according to the zoom level; that's the benchmark (and the file size is only 50% of the TIF). Imaris can export the same dataset as TIF files in 2 hours, which takes ImageJ at least 4 days. I don't think the problem can be solved using TIFs, because as far as I know it is very difficult or impossible to load this format in parallel and piecewise (similar to memmap). Maybe you know napari and how they handle the problem (https://napari.org/tutorials/applications/dask.html)?
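(The lazy, chunked access that napari's Dask tutorial describes can be sketched roughly as follows; this is a toy in-memory stand-in, not the actual napari pipeline, and the shapes and chunk sizes are made up.)

```python
import numpy as np
import dask.array as da

# Stand-in for a huge on-disk volume; in practice this would wrap a
# memory-mapped or lazily-read source rather than an in-memory array.
arr = np.random.rand(128, 64, 64)
lazy = da.from_array(arr, chunks=(32, 64, 64))

# Nothing is computed yet: slicing and reductions stay lazy, so only
# the chunks a computation actually touches are ever processed.
mip = lazy.max(axis=0)      # lazy maximum-intensity projection
result = mip.compute()      # chunks are evaluated (in parallel) here
print(result.shape)         # (64, 64)
```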
Thanks for your feedback, as always!
Yeah I realize this is confusing. The slider scales the sizes of blobs only in the 3D viewer and not in the ROI Editor.
Currently, the blobs are only shown in the smaller images. But good to know they could be useful to see in the overview images as well! Will note this as a possible future feature.
I agree that handling large files on commodity hardware is key. Imaris has definitely made admirable strides! I am trying a simpler approach using the Tifffile library, which has memmap support for at least some types of TIFs. I've opened a PR (#90) to memory-map these types of TIFs, such as those output by ImageJ and BigStitcher, to bypass the import step at least in these cases. It's not nearly as advanced as pyramidal images or using Dask (thanks for the link, by the way!) but can hopefully serve as a drop-in replacement for importing these files.
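(A minimal sketch of that Tifffile memmap approach, assuming the TIF stores its image data contiguously, as ImageJ-format TIFs do; the file name and shape here are toy values, not the PR's actual code.)

```python
import os
import tempfile
import numpy as np
import tifffile

# Write a small ImageJ-format stack as a stand-in; imagej=True keeps the
# pixel data contiguous, which is what makes memory-mapping possible.
path = os.path.join(tempfile.mkdtemp(), "demo.tif")
tifffile.imwrite(path, np.zeros((8, 64, 64), dtype=np.uint16), imagej=True)

# tifffile.memmap returns a NumPy memmap view onto the file's pixel data,
# so opening it does not read the whole image into RAM.
vol = tifffile.memmap(path)
print(vol.shape, vol.dtype)     # (8, 64, 64) uint16
plane = np.asarray(vol[3])      # only this plane's bytes are paged in
```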
As indicated in #87, the "Scale detections" control label is confusing since it does not operate on the ROI Editor but only on the 3D viewer. Rename this control and add a tooltip to clarify this behavior.
I tried to load my example dataset (280 GB) into mag using the roi parameter, but it won't load. I also tried loading it through the GUI; it says "loading image" but does nothing at all (I checked the CPU and read/write load for my system, and the process is on hold). I've got 512 GB of RAM in the system, so that shouldn't be an issue (and mag uses memmap anyway).
I will now cut it down and try again to find out the maximum possible file size.
I already transformed it into the npy format:
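(For reference, once a volume is saved as .npy, it can be reopened as a memory map so only the regions actually accessed are read from disk; the file name and 5D shape below are hypothetical toy values.)

```python
import os
import tempfile
import numpy as np

# Toy stand-in for an imported image5d.npy file.
path = os.path.join(tempfile.mkdtemp(), "image5d_demo.npy")
np.save(path, np.arange(2 * 3 * 4 * 4, dtype=np.uint16).reshape(1, 2, 3, 4, 4))

# mmap_mode="r" opens the array without reading it all into RAM;
# pages are loaded lazily as regions are touched.
img5d = np.load(path, mmap_mode="r")
print(img5d.shape)                      # (1, 2, 3, 4, 4)
roi = np.array(img5d[0, 0, 1, :2, :2])  # reads just this sub-block
```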