We should think about our current way of using NumpyBatch and BatchKeys.
The NumpyBatch works well with the torch dataloader, which automatically converts the samples to torch tensors and so feeds the model. However, it is harder to probe when we want to check that the data is as expected.
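For context, here is a minimal sketch of the collation behaviour we currently rely on; the keys are illustrative, not the real BatchKey members:

```python
# Sketch: the torch dataloader's default collate turns a list of
# dict-of-numpy samples into a dict of stacked torch tensors.
import numpy as np
from torch.utils.data import default_collate

sample = {
    "gsp": np.random.rand(12).astype(np.float32),       # illustrative key
    "nwp": np.random.rand(2, 8, 8).astype(np.float32),  # illustrative key
}

batch = default_collate([sample, sample])  # a "batch" of 2 samples
print(type(batch["gsp"]), batch["gsp"].shape)  # torch.Tensor, torch.Size([2, 12])
```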
BatchKeys have been quite inflexible in the past; for this reason they are not used in the summation model, and they are hard to add to or remove from without corrupting already saved batches.
We should investigate whether there is a better, but still simple, way of packaging batches of data.
Yeah, I wonder if we could still use BatchKeys, but in the Batch use their values rather than the Key objects themselves. I think this means we could change the BatchKeys without running into those problems as we iterate on them. It would mean changing `batch[BatchKey.gsp] = x` to `batch[BatchKey.gsp.name] = x`, or something like that; see the sketch below.
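A minimal sketch of the idea, assuming a BatchKey-style enum (the members shown are illustrative, not the real BatchKey):

```python
# Sketch: key the batch dict on the enum member's name (a plain string)
# instead of the enum member itself.
from enum import Enum

import numpy as np


class BatchKey(Enum):
    gsp = "gsp"  # illustrative member
    nwp = "nwp"  # illustrative member


x = np.zeros(12)
batch = {}
# Instead of keying on the enum member itself:
#   batch[BatchKey.gsp] = x
# key on its name, so the stored batch only contains plain strings and
# survives later changes to the enum definition:
batch[BatchKey.gsp.name] = x
assert "gsp" in batch  # saved batches are keyed by plain strings
```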
On a slightly different note, something fun we could do too: having a Sample/Batch object could be useful. There we could add functions like save, load and plot to help standardise things; a rough sketch is below.
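A hedged sketch of what such a wrapper could look like; the class and its methods are hypothetical, not existing API:

```python
# Sketch: a thin Batch wrapper with standardised save/load/plot helpers.
from __future__ import annotations

import matplotlib.pyplot as plt
import numpy as np


class Batch:
    """Thin wrapper around a dict of numpy arrays."""

    def __init__(self, data: dict[str, np.ndarray]):
        self.data = data

    def save(self, path: str) -> None:
        # np.savez stores each array under its dict key
        np.savez(path, **self.data)

    @classmethod
    def load(cls, path: str) -> Batch:
        with np.load(path) as f:
            return cls({k: f[k] for k in f.files})

    def plot(self, key: str, ax=None):
        # Quick-look plot of one array, for checking the data is as expected
        ax = ax or plt.gca()
        ax.plot(np.asarray(self.data[key]).ravel())
        ax.set_title(key)
        return ax
```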