I thought I could use this library for querying extremely large BigWig files, but despite giving Python 300 GB of RAM, it crashes anyway. (A sketch of the failing pattern is below.)
Does pyBigWig keep all of the data in memory at any point during operation?
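For reference, this is roughly the pattern in question, a minimal sketch only (the file name is hypothetical; the actual file is a ~165 GB bigBed), based on the issue title's mention of listing entries on one human chromosome:

```python
import pyBigWig

# Hypothetical file name; the real file is a ~165 GB bigBed.
bb = pyBigWig.open("huge_annotations.bb")

# Listing every entry on one human chromosome in a single call --
# this is the step that exhausts ~300 GB of RAM.
chrom_len = bb.chroms("chr1")
entries = bb.entries("chr1", 0, chrom_len)  # returns the whole result list at once
bb.close()
```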
balwierz changed the title from "pyBigWig.open() runs out of memory on 300 GB ram and 165 GB BigBed file" to "Out of memory listing entries on one human chromosome on a machine with 300 GB ram and 165 GB BigBed file" on Apr 22, 2024
It'll need to keep those results in memory to output them. I wonder what the utility of such a large bigBed file is, since I suspect you'd be better served by storing some sort of summarization of the underlying data.
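One way to keep the working set small is to query in fixed-size windows rather than asking for a whole chromosome's entries at once. This is only a sketch under the assumption that per-window results are acceptable (the helper name, window size, and file path are made up for illustration):

```python
import pyBigWig

def iter_entries(path, chrom, window=1_000_000):
    """Yield bigBed entries window by window instead of as one huge list.

    Only one window's worth of results is held in memory at a time.
    Entries spanning a window boundary may be reported by more than one
    window, so deduplicate on (start, end) if that matters.
    """
    bb = pyBigWig.open(path)
    try:
        length = bb.chroms(chrom)
        for start in range(0, length, window):
            end = min(start + window, length)
            hits = bb.entries(chrom, start, end)  # entries overlapping this window
            if hits:
                for entry_start, entry_end, rest in hits:
                    yield entry_start, entry_end, rest
    finally:
        bb.close()

# Hypothetical usage:
# for start, end, rest in iter_entries("huge_annotations.bb", "chr1"):
#     process(start, end, rest)
```

Whether this helps depends on how the entries are distributed; if a single window still contains an enormous number of entries, the window size would need to shrink accordingly.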