How To Create A List Of Modulelists
Is it okay to create a Python list of PyTorch ModuleLists? For example, I want to have a few Conv1d modules in one layer and then another layer with different Conv1d modules. In each layer I need …
Solution 1:
You need to register all sub-modules of your net properly so that PyTorch has access to their parameters, buffers, etc. This only happens if you use the proper containers. If you store sub-modules in a plain Python list, PyTorch has no idea they are there and they will simply be ignored.
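For instance, here is a minimal comparison (channel counts and kernel sizes are made up for illustration) between a module that keeps its convolutions in a plain Python list and one that keeps them in an nn.ModuleList:

```python
import torch.nn as nn

class WithPlainList(nn.Module):
    def __init__(self):
        super().__init__()
        # Plain Python list: the Conv1d modules are NOT registered as sub-modules.
        self.convs = [nn.Conv1d(8, 8, 3) for _ in range(2)]

class WithModuleList(nn.Module):
    def __init__(self):
        super().__init__()
        # nn.ModuleList: the Conv1d modules ARE registered as sub-modules.
        self.convs = nn.ModuleList([nn.Conv1d(8, 8, 3) for _ in range(2)])

print(len(list(WithPlainList().parameters())))   # 0 -- PyTorch sees nothing
print(len(list(WithModuleList().parameters())))  # 4 -- weight + bias per conv
```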
So, if you use a plain Python list to store the sub-modules, then when you call, for instance, model.cuda(), the parameters of the sub-modules in that list will not be transferred to the GPU but will remain on the CPU. Likewise, if you call model.parameters() to pass all trainable parameters to an optimizer, PyTorch will not detect the parameters of those sub-modules, and the optimizer will not "see" them.
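For the scenario in the question (several Conv1d modules per layer, and several such layers), the outer "list of ModuleLists" can itself be an nn.ModuleList so that every sub-module is registered. Here is a minimal sketch; the channel counts, kernel sizes, and the rule for combining the conv outputs inside each layer are all illustrative assumptions, not part of the original question:

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        # Outer container: an nn.ModuleList of nn.ModuleLists, so every Conv1d
        # is registered and visible to .parameters(), .cuda(), .state_dict(), ...
        self.layers = nn.ModuleList([
            nn.ModuleList([nn.Conv1d(16, 16, k, padding=k // 2) for k in (3, 5)]),  # layer 1
            nn.ModuleList([nn.Conv1d(16, 16, k, padding=k // 2) for k in (3, 7)]),  # layer 2
        ])

    def forward(self, x):
        # Illustrative combination rule (an assumption): sum the outputs of the
        # convs in each inner layer, then feed the result to the next layer.
        for layer in self.layers:
            x = sum(conv(x) for conv in layer)
        return x

model = Net()
print(len(list(model.parameters())))   # 8 -- all conv weights and biases are found
out = model(torch.randn(2, 16, 32))    # (batch, channels, length)
print(out.shape)                       # torch.Size([2, 16, 32])
```

Because everything lives inside nn.ModuleList containers, model.cuda(), model.parameters(), and model.state_dict() all see every Conv1d in every layer.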