Create a model configuration by users #1758
Replies: 6 comments
-
I have tried the following code in my project:

```python
from timm.models import register_model
from timm.models.swin_transformer import _create_swin_transformer


@register_model
def swin_base_patch4_window6_192(pretrained=False, **kwargs):
    """ Swin-B @ 192x192
    """
    model_kwargs = dict(
        patch_size=4, window_size=6, embed_dim=128, depths=(2, 2, 18, 2), num_heads=(4, 8, 16, 32), **kwargs)
    return _create_swin_transformer('swin_base_patch4_window6_192', pretrained=pretrained, **model_kwargs)
```

However, it seems to be missing some essential parts like the default …
-
@LuoXin-s for dataset / pretrained related details you need to set up default pretrained cfgs for the new model, as these can vary with each dataset the model is trained on.
-
@rwightman I added the code below:

```python
from timm.models.swin_transformer import _cfg
from timm.models._registry import generate_default_cfgs

generate_default_cfgs({
    'swin_base_patch4_window6_192': _cfg()
})
```

But this does not seem to work properly. I only want to define the model's hyperparameters; no pretrained weights will be loaded. Could you please provide me a simple code snippet? Thanks!
-
@LuoXin-s not sure what the actual goal is. If it's not to share pretrained weights or add a new def for others, but just to train that model for research, you can skip the default cfg generation and just embed the pretrained cfg in the entry fn. For the more permanent approach, the output of generate_default_cfgs needs to be assigned to a module-level `default_cfgs` var in the same module where the @register_model decorated entry fn exists (like all other models).
-
@rwightman Thanks for your kind instructions. One further problem is that I rely on … Is …
-
@LuoXin-s moving out of issues since this is not a bug... you need to register the model & pretrained configs properly to use models.get_pretrained_cfg(). As mentioned, do it in the same file, following an existing model as an example (https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer_hybrid.py#L102-L202). I don't know why you'd create a model and then destroy it by doing …

There are helpers that will take the pretrained config from the model (or via argument / dict overrides), merge them all, and output a final dict that can be passed via kwargs to downstream create loader/transform fns, i.e. https://github.com/huggingface/pytorch-image-models/blob/main/timm/data/config.py#L101
-
Hello, I am trying to use a SwinTransformer model with the configuration 'swin_base_patch4_window6_192'. Unfortunately, the timm library does not provide this configuration. I am consistently using the timm.create_model interface and would like to define my own model configuration named 'swin_base_patch4_window6_192' in my project, so that I can create the model using the timm.create_model interface. Could you please provide guidance on how I can define this custom configuration in my project and use it with the timm.create_model function?