[BUG] pad_sequence with batch_size=1 does not encapsulate non-tensor data into an iterable #1168
Describe the bug
I use tensordict.pad_sequence as a collate_fn. For batch_sizes greater than 1, it works perfectly. However, during validation/testing, if I have a residual batch with a single element in it, the non-tensor elements do not conform to the batch size.

To Reproduce
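A minimal sketch of a reproduction, since the original snippet is not captured above; the "tokens" key and the sample values are illustrative assumptions, while the "name" entry and the t1 variable match the expected-behavior note below:

```python
import torch
from tensordict import TensorDict, NonTensorData, pad_sequence

# Two variable-length samples, each carrying a non-tensor "name" entry.
a = TensorDict(
    {"tokens": torch.arange(3), "name": NonTensorData(data="a", batch_size=[])},
    batch_size=[],
)
b = TensorDict(
    {"tokens": torch.arange(5), "name": NonTensorData(data="b", batch_size=[])},
    batch_size=[],
)

# Batch size > 1: the non-tensor entries are gathered into a NonTensorStack, as expected.
t2 = pad_sequence([a, b])
print(type(t2.get("name")))  # NonTensorStack

# Batch size == 1 (residual validation/test batch): the entry stays a NonTensorData.
t1 = pad_sequence([a])
print(type(t1.get("name")))  # NonTensorData, but a NonTensorStack is expected
```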
Expected behavior
t1.name should be a NonTensorStack, not NonTensorData.
System info
Describe the characteristics of your environment:
Reason and Possible fixes
tensordict/tensordict/functional.py, line 219 (commit aeff837):
Instead of torch.stack, I think NonTensorStack.from_list should be used.
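A hedged sketch of the idea, assuming NonTensorStack.from_list accepts a plain list of NonTensorData entries as suggested above; the helper name and surrounding logic are illustrative, not the actual code in functional.py:

```python
import torch
from tensordict import NonTensorData, NonTensorStack

def _stack_entries(values):
    # Illustrative helper: build a NonTensorStack explicitly for non-tensor
    # entries so that a one-element batch still yields a stack, instead of
    # letting torch.stack collapse it back into a single NonTensorData.
    if values and all(isinstance(v, NonTensorData) for v in values):
        return NonTensorStack.from_list(values)
    return torch.stack(values)
```

Presumably torch.stack keeps identical non-tensor entries as a single NonTensorData, which is trivially the case for a one-element list, whereas from_list would preserve the stack type regardless of length.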
Checklist