Hi, I loaded the pretrained model you provided and found that the latent layer is VAELatentLayer. According to the code in CellPLM/latent/autoencoders.py, this appears to be an ordinary VAE encoder, not the Gaussian mixture prior mentioned in your paper. Could you please explain this? Thanks a lot.
PRETRAIN_VERSION = '20231027_85M'
DEVICE = 'cuda:0'

from CellPLM.pipeline import load_pretrain

load_pretrain(pretrain_prefix=PRETRAIN_VERSION,  # specify the pretrain checkpoint to load
              overwrite_config={},
              pretrain_directory='../ckpt')
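To make the distinction in my question concrete, here is a minimal NumPy sketch (function names are my own, purely illustrative) contrasting sampling under an ordinary VAE's single standard-normal prior with sampling under a Gaussian-mixture prior, as I understand the paper to describe:

```python
import numpy as np

rng = np.random.default_rng(0)

def vae_sample(mu, logvar):
    # Ordinary VAE: one Gaussian prior; reparameterized sample z = mu + sigma * eps.
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * logvar) * eps

def gmvae_sample(pi, mus, logvars):
    # Gaussian-mixture prior: first draw a component k ~ Categorical(pi),
    # then sample from that component's Gaussian.
    k = rng.choice(len(pi), p=pi)
    eps = rng.standard_normal(mus[k].shape)
    return k, mus[k] + np.exp(0.5 * logvars[k]) * eps

# Toy 4-dimensional latent space with 3 mixture components.
mu, logvar = np.zeros(4), np.zeros(4)
z_vae = vae_sample(mu, logvar)

pi = np.array([0.2, 0.5, 0.3])
mus = np.stack([np.full(4, c) for c in (-2.0, 0.0, 2.0)])
logvars = np.zeros((3, 4))
k, z_gm = gmvae_sample(pi, mus, logvars)
```

From reading autoencoders.py, the loaded checkpoint seems to do only the first of these, which is what prompted my question.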