add hardware guide for bioimage analysis #47
I'm removing the sentence about remote coding because the mentioned series is not linked yet. We can add the reference back once these tutorials are written.
Removed accordingly
I'm curious: have you tried installing Python on a non-Mac ARM computer, e.g. one with a Snapdragon CPU? I'd like to know how compatible the Python ecosystem is with these machines. Also, which operating systems run on non-Mac ARM computers?
So far I have tested several ARM platforms, e.g. the Raspberry Pi and NVIDIA SBCs; both run Ubuntu.
On the Linux side, software is slowly catching up with pre-compiled libraries. With VSCode and miniforge, the environment is quite mature.
miniforge has no pre-compiled version for ARM Windows, so I would not recommend that platform at all.
Some imaging libraries require OpenGL, while SoCs from Broadcom (Raspberry Pi) natively run OpenGL ES 2 and emulate OpenGL through the Mesa drivers shipped with Debian. High-performance rendering such as 3D data plots is therefore heavily bottlenecked.
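As a rough illustration (my addition, not part of this PR), one could run a quick check on such an ARM board to see which architecture Python reports and whether typical imaging packages have usable pre-compiled builds; the package list below is only an illustrative assumption, adjust it to your own environment:

```python
import importlib
import platform

# Report the interpreter's view of the machine: 'aarch64' on ARM Linux,
# 'arm64' on Apple Silicon, 'x86_64'/'AMD64' on Intel/AMD systems.
print("machine:", platform.machine())
print("system:", platform.system(), platform.release())
print("python:", platform.python_version())

# Try importing a few packages typical for bioimage analysis (illustrative
# selection) to see whether pre-compiled ARM wheels/conda packages were found.
for name in ("numpy", "scipy", "skimage", "napari"):
    try:
        module = importlib.import_module(name)
        print(f"{name}: OK ({getattr(module, '__version__', 'unknown version')})")
    except ImportError as error:
        print(f"{name}: not available ({error})")
```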
OK, then how likely is it that someone who aims at analysing images uses a Raspberry Pi or an NVIDIA SBC?
The not-so-computational reader might wonder what "edge computing" is.
removed the term
What do you think about the difference between x86 and x64?
x64 (in full, x86-64) is the 64-bit version of the x86 instruction set; x86 is the broader family name.
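As a small illustrative check (my addition, not from the guide), this is how the distinction shows up from within Python:

```python
import platform
import sys

# True on a 64-bit Python build (x64 / x86-64, or 64-bit ARM), False on 32-bit.
print("64-bit interpreter:", sys.maxsize > 2**32)

# The machine string names the CPU family: 'x86_64' or 'AMD64' for x64,
# 'i386'/'i686' for 32-bit x86, 'aarch64'/'arm64' for ARM.
print("machine:", platform.machine())
```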
Can you complete this sentence? What's your concern?
It may make sense to mention deep learning (DL) here. Many DL applications require a GPU.
How much experience do you have with those? I don't think I have ever worked with an NPU. Is it maybe too early to mention this technology in an introductory document for beginners?
Before we recommend ARM (it sounds a bit like we do), I would like to make sure that Python runs well on this architecture (see comment above).
Have you tried ARM-based servers? Where are those accessible?
This sentence contains multiple terms non-computational folks may not know (SSH, VSCode, 2FA, sockets)
Looking at this table, I find many terms that must be confusing for beginners. Also, I'm wondering what a reader should conclude from the table. I see three options:
What is an IC vendor? Why is it necessary to refer to this particular architecture (Meteor Lake)?
Maybe my knowledge is not up to date, but as far as I know, the only vendor shipping SoCs to end users at the moment is Apple. Correct?
This section appears very complicated for non-computational folks. Is it necessary to explain this to people who want to buy a computer for image processing?
Is the ARM-based NVIDIA CPU something normal people can buy?
I think common Intel laptop processors come with efficiency cores, e.g.: https://www.intel.com/content/www/us/en/products/sku/232153/intel-core-i51335u-processor-12m-cache-up-to-4-60-ghz/specifications.html
I'm not sure SoC users are the primary target audience. I suspect most imaging scientists at "rich" institutes have a workstation with an NVIDIA GPU, while less wealthy image analysts may do their work on cheaper laptops, perhaps with gaming GPUs.
NPUs are mainly for inference. In my analysis experience, cell detection nowadays relies about 80% on AI-based segmentation (mainly Cellpose).
I don't think many bioimage analysts will train their own cell detection model. That's why I think the NPU will play a significant role in the upcoming years and deserves at least as much mention as the GPU.
From 2024 on, all new laptop CPUs are SoCs, most of them with an embedded NPU. The only difference is whether the machine additionally has a dedicated CUDA-capable GPU.
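As a small, hedged illustration (my addition, assuming a recent PyTorch, 1.12 or newer): since Cellpose runs on PyTorch, one can check which accelerator would actually be used for inference on a given laptop; NPUs are generally not exposed through this API, which is part of the point discussed here.

```python
import torch

# Probe the accelerators PyTorch can use for inference. A discrete NVIDIA
# GPU shows up via CUDA, an Apple-Silicon SoC via the Metal Performance
# Shaders (MPS) backend; NPUs are typically not visible here and need
# vendor-specific runtimes instead.
if torch.cuda.is_available():
    device = torch.device("cuda")
elif torch.backends.mps.is_available():
    device = torch.device("mps")
else:
    device = torch.device("cpu")

print("inference would run on:", device)
```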
Again, I'm not sure that specifically designed circuits and NPUs are available to common bioimage analysts.
They are on all newly released laptops, from Intel, Apple, AMD and NVIDIA.
Apple has shipped an NPU since the M1, and NVIDIA has had Tensor Cores since Volta (V100 / GeForce RTX 20 series).
Tensor Cores are enabled by default in TensorFlow (https://docs.nvidia.com/deeplearning/frameworks/tensorflow-user-guide/index.html#tf_disable_tensor_op_math).
Regarding beginners: if someone cannot afford the high price of NVIDIA devices, they should still get a guide on NPU-accelerated inference. Apple users will need the Metal plugin for TensorFlow: https://developer.apple.com/metal/tensorflow-plugin/
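As an illustrative check (my addition, not part of the comment), one can verify whether TensorFlow detects an accelerator at all, which is where a CUDA-enabled build on NVIDIA hardware or the tensorflow-metal plugin on Apple Silicon makes the difference:

```python
import tensorflow as tf

# On an NVIDIA machine with a CUDA-enabled TensorFlow build, the GPU list is
# non-empty and Tensor Cores are used automatically for eligible ops.
# On Apple Silicon, the GPU only appears after installing tensorflow-metal.
print("TensorFlow:", tf.__version__)
print("GPUs:", tf.config.list_physical_devices("GPU"))
print("CPUs:", tf.config.list_physical_devices("CPU"))
```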
The link to https://github.com/jackyko1991/sCMOS-Denoise/blob/main/notebooks/camera_calibration.ipynb does not work for me.
link removed
How about mentioning AMD and Intel GPUs in the table above?
Added a comparison between GPU vendors.
I'm wondering if the document might fit better in an advanced section, e.g. the GPU-acceleration section. We should certainly not introduce hardware aspects before anything else.
moved to a separate hardware section