When I use `torch.split`, I get this error:

```
File "/opt/conda/lib/python3.11/site-packages/torch_tensorrt/dynamo/conversion/impl/split.py", line 37, in split
    assert input.shape[dim] != -1, "Can't chunk on dynamic shape dimension!"
```
To Reproduce
Steps to reproduce the behavior:
```python
import torch
import torch.nn as nn


class Split(nn.Module):
    def forward(self, x: torch.Tensor, y: torch.Tensor):
        values = [x, y]
        length_per_key = [v.numel() for v in values]
        values = torch.cat(values, dim=-1)
        return torch.split(values, length_per_key)


a = torch.randn(66093).cuda()
b = torch.randn(50).cuda()
model = Split().cuda()
res = model(a, b)
print(res[0].shape)

batch = torch.export.Dim("batch", min=1, max=1000000)
batch2 = torch.export.Dim("batch2", min=1, max=1000000)
dynamic_shapes = {"x": {0: batch}, "y": {0: batch2}}

from torch.fx import symbolic_trace

model = symbolic_trace(model)

import torch_tensorrt

exp_program = torch.export.export(model, (a, b), dynamic_shapes=dynamic_shapes)
trt_gm = torch_tensorrt.dynamo.compile(
    exp_program,
    [a, b],
    min_block_size=1,
    allow_shape_tensors=True,
    assume_dynamic_shape_support=True,
)

# Run inference
print(trt_gm.code)
```
Currently, dynamic shapes are not supported in the case of split. I think the issue arose when we tried to chunk on a dynamic shape dimension, which led to pytorch/pytorch#134663, and supporting chunk without decomposing it to split ran into TensorRT's dynamic inner-loop limitations. I need to revisit this for split, since the earlier implementation was focused mainly on chunk; I will post updates in this issue.
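For context on the chunk-to-split decomposition mentioned above, here is a minimal sketch of how a split with known section lengths can be expressed as successive `narrow` (slice) calls. The helper name `split_via_narrow` is hypothetical and is only an illustration of the decomposition idea, not the torch_tensorrt converter implementation:

```python
import torch


def split_via_narrow(t: torch.Tensor, lengths, dim: int = -1):
    # Hypothetical helper: emulate torch.split with fixed section lengths
    # by taking consecutive narrow() slices along `dim`.
    dim = dim % t.dim()
    outs, offset = [], 0
    for n in lengths:
        outs.append(t.narrow(dim, offset, n))
        offset += n
    return tuple(outs)


x = torch.arange(10.0)
a, b = split_via_narrow(x, [7, 3])

# The slices match torch.split's output for the same lengths.
ra, rb = torch.split(x, [7, 3])
assert torch.equal(a, ra) and torch.equal(b, rb)
```

Each slice here has a start offset and length that must be known at conversion time, which is why a dynamic dimension (reported as `-1`) trips the assertion in the converter.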