Bool support in TRTorch #358
Replies: 4 comments 5 replies
-
Relevant issue for the sake of completeness: #341
-
Actually, I think that typecasting inside the Boolean operators seems like a bad idea. I tried checking the type of the tensor and converting it to float or int in the "mul" converter, but I have not been able to implement this so far.
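A minimal sketch of what such a check-and-cast could look like, written against the raw TensorRT C++ API rather than TRTorch's actual converter interface. The helper and function names here are hypothetical, and whether an identity layer actually performs the Bool to INT32 conversion depends on the TensorRT version in use:

```cpp
#include "NvInfer.h"

// Hypothetical helper: if `in` is a Bool tensor, insert an identity layer that
// requests an INT32 output so downstream layers (e.g. an elementwise product)
// see a supported type. Support for Bool -> INT32 conversion via the identity
// layer varies by TensorRT version.
nvinfer1::ITensor* castToInt32IfBool(nvinfer1::INetworkDefinition* net, nvinfer1::ITensor* in) {
  if (in->getType() != nvinfer1::DataType::kBOOL) {
    return in;  // nothing to do for non-bool inputs
  }
  auto* id = net->addIdentity(*in);
  id->setOutputType(0, nvinfer1::DataType::kINT32);
  return id->getOutput(0);
}

// Sketch of how a "mul"-style converter could use the helper before building
// the elementwise PROD layer.
nvinfer1::ITensor* convertMul(nvinfer1::INetworkDefinition* net, nvinfer1::ITensor* a, nvinfer1::ITensor* b) {
  a = castToInt32IfBool(net, a);
  b = castToInt32IfBool(net, b);
  auto* prod = net->addElementWise(*a, *b, nvinfer1::ElementWiseOperation::kPROD);
  return prod->getOutput(0);
}
```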
-
Even a basic test case like this would fail with the above error.
This is due to a TRT limitation. Previously, with the lowering of aten::to, this wouldn't be a problem because we used to ignore the cast, so any layer consuming FP16 inputs in the PyTorch graph would have been assumed to consume FP32 inputs in the TRT graph (without the --fp16 flag). The solution is to use
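(The concrete suggestion above is cut off.) As background on the precision point only: in the TensorRT C++ builder API, FP16 kernels are opt-in at engine build time, which is roughly what the --fp16 flag corresponds to; without it the engine is built for FP32. A minimal sketch, assuming the standard builder/config API:

```cpp
#include "NvInfer.h"

// Minimal sketch: FP16 execution in TensorRT is opt-in when the engine is
// built. Without this flag the network is built for FP32, regardless of the
// tensor types appearing in the original PyTorch graph.
void enableFp16(nvinfer1::IBuilder* builder, nvinfer1::IBuilderConfig* config) {
  if (builder->platformHasFastFp16()) {
    config->setFlag(nvinfer1::BuilderFlag::kFP16);
  }
}
```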
-
We have 3 overlapping PRs regarding bool support for certain operators. I wanted to unify the discussion here so we can establish what the best approach is.
For context, as I understand it, there are some operators that can use booleans as masks. There is a Bool type in TensorRT, but I gather the operations that support it are quite limited, which is why a couple of the PRs cast these bools to int. Please correct me if I am wrong, and add detail if there are particular use cases that are important to consider.
I think we need to establish the set of operators that might need this sort of support in the bool case and make sure we have test coverage for that set. There are also some issues around making sure we cast to a type that depends on the input type of the network.
I just want to get everyone on the same page before committing to a solution here.
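For discussion, one way the "cast depending on the input type" point could be expressed, as a hypothetical sketch against the raw TensorRT C++ API (not taken from any of the open PRs): instead of hard-coding Bool to INT32, the target type is taken from the other operand, so a bool mask follows the precision of the data it is applied to.

```cpp
#include "NvInfer.h"

// Hypothetical sketch: cast a bool tensor to match the type of a reference
// tensor (e.g. the other operand, which ultimately reflects the network's
// input precision), rather than always casting to INT32.
nvinfer1::ITensor* castBoolLike(nvinfer1::INetworkDefinition* net,
                                nvinfer1::ITensor* maybe_bool,
                                nvinfer1::ITensor* reference) {
  if (maybe_bool->getType() != nvinfer1::DataType::kBOOL) {
    return maybe_bool;  // already a numeric type, leave it alone
  }
  auto* id = net->addIdentity(*maybe_bool);
  id->setOutputType(0, reference->getType());  // e.g. kFLOAT or kHALF
  return id->getOutput(0);
}
```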