Are pretrained models available? #40
-
I would love to use this module within liGAN instead of depending on the full gnina platform. Have the Caffe model weights been converted to PyTorch somewhere publicly available?
Replies: 9 comments 4 replies
-
Unfortunately, I never converted the original Caffe model weights to PyTorch. However, I would be very interested in doing so (or incorporating PRs that allow it). It would be a great feature, and it would serve as a double check that everything works as expected. Is there a recommended way of doing this Caffe-to-PyTorch conversion?

PS: For easier integration, I could push the package to PyPI and conda-forge, if that is of interest.
-
I actually have these weights files for all of the built-in models easily available. The built-in model weights were pulled directly from the source code of GNINA and converted for use with PyTorch. I have created PR #34 to add these weights files.
-
I'll add that there is now a
-
That's fantastic, thanks @drewnutt! I'll incorporate PR #34 and start adding the functionality to use them easily (including model ensembles, which are currently not implemented).
-
Awesome! Thanks, @drewnutt.
-
I opened #35 as a follow-up. I'll link relevant PRs there. Please feel free to comment with specific needs and missing features from your point of view.
-
@mattragoza you can now easily load the pre-trained models:

```python
from gninatorch.gnina import load_gnina_model

model = load_gnina_model(MODEL_NAME)
```

where `MODEL_NAME` is the name of one of the supported built-in models.
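For illustration, a minimal inference sketch, assuming the loaded model is a standard `torch.nn.Module`; the model name `"crossdock_default2018"`, the 28-channel input, and the 48³ grid are assumptions based on GNINA's usual grid settings, not something confirmed here:

```python
import torch

from gninatorch.gnina import load_gnina_model

# Example model name only; the input channel count and grid size below
# are assumptions based on GNINA's usual settings.
model = load_gnina_model("crossdock_default2018")
model.eval()

# Assumed voxel grid layout: (batch, channels, D, H, W).
dummy_grid = torch.zeros(1, 28, 48, 48, 48)
with torch.no_grad():
    output = model(dummy_grid)
```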
-
An ensemble of models is now also available:

```python
from gninatorch.gnina import load_gnina_models

ensemble = load_gnina_models([MODEL_NAME_1, MODEL_NAME_2, ...])
```

The ensemble of models returns the combined predictions of the individual models. Still no `dense` models, though.
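Along the same lines, a hedged sketch of evaluating an ensemble, assuming the returned ensemble behaves like a single `torch.nn.Module`; the model names and the input shape are the same kind of assumptions as in the sketch above:

```python
import torch

from gninatorch.gnina import load_gnina_models

# The model names are placeholders for whichever built-in models you combine.
ensemble = load_gnina_models(["redock_default2018", "crossdock_default2018"])
ensemble.eval()

dummy_grid = torch.zeros(1, 28, 48, 48, 48)  # assumed (batch, channels, D, H, W)
with torch.no_grad():
    ensemble_output = ensemble(dummy_grid)
```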
-
@mattragoza Thanks once again to @drewnutt, who manually converted the weights of the `dense` model (see #43): the `dense_` models and the `dense_ensemble` are now supported. The same goes for GNINA's `default` model ensemble. See PR #44 for details. Related PRs: #38, #36, #34.

You can now easily load GNINA's pre-trained models as follows:

```python
from gninatorch import gnina

model, ensemble = gnina.setup_gnina_model("MODEL")
```

where `MODEL` corresponds to the string you would use as the `--cnn` argument in GNINA, while `ensemble` is a boolean flag that tells you whether `model` is a single model (`ensemble == False`) or an ensemble of models (`ensemble == True`).
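To make the flag concrete, a small sketch; the strings `"redock_default2018"` and `"default"` are illustrative guesses at valid `--cnn` names, and the expected flag values in the comments are assumptions:

```python
from gninatorch import gnina

# Illustrative model strings; pass whatever you would give to GNINA's --cnn.
model, ensemble = gnina.setup_gnina_model("redock_default2018")
print(type(model).__name__, "| ensemble:", ensemble)  # presumably False (single model)

model, ensemble = gnina.setup_gnina_model("default")
print(type(model).__name__, "| ensemble:", ensemble)  # presumably True (model ensemble)
```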