Dropout implementation for ensemble #3962
Unanswered · johanenforcer asked this question in Q&A · 1 comment · 7 replies
-

johanenforcer asked:

How would I go about implementing dropout in an efficient manner? I am aware of the earlier post discussing overfitting and dropout, but I wanted to explore these matters further. The point is not to reduce overfitting, but rather to use dropout itself to create an ensemble of model networks for an active learning query strategy, by sampling different model variants. That said, is it fine to apply dropout straight after the backbone has finished, given that I am retrieving activation volumes from three different conv layers in the head section of the network? Is regular dropout such as `self.drop = nn.Dropout2d(0.1)` fine to use, or should there be other considerations, for example using DropBlock instead on the activation volume from the backbone? Thanks in advance!
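For the ensemble idea described above (Monte Carlo dropout), the usual trick is to keep the dropout modules stochastic at inference while the rest of the network stays in eval mode, then run several forward passes. A minimal sketch, assuming a generic PyTorch model that already contains a dropout layer; the helper names are illustrative, not YOLOv5 API:

```python
import torch
import torch.nn as nn

def enable_mc_dropout(model: nn.Module) -> None:
    # eval() freezes BatchNorm statistics; only the dropout modules are
    # flipped back to train mode so they keep sampling random masks.
    model.eval()
    for m in model.modules():
        if isinstance(m, (nn.Dropout, nn.Dropout2d)):
            m.train()

@torch.no_grad()
def mc_ensemble_passes(model: nn.Module, x: torch.Tensor, n: int = 10) -> list:
    # Each pass samples a different dropout mask, giving an implicit
    # ensemble of sub-networks that share the same weights.
    enable_mc_dropout(model)
    return [model(x) for _ in range(n)]
```

The spread across the sampled outputs (e.g. the variance of predicted confidences) can then serve as the disagreement signal for the active learning query.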
-

@johanenforcer yes, you can add a dropout layer by modifying the modules directly (i.e. in models/common.py) or by simply adding a line to your model.yaml file. If you modify the model.yaml file, though, you need to ensure that all layers after the modified layer have their input indices updated accordingly, i.e. probably adding +1 to their values.
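A minimal sketch of the first option, assuming a YOLOv5-style models/common.py: a thin wrapper around nn.Dropout2d that preserves the channel count, so the yaml parser can place it between existing layers. The class name ChannelDropout and the yaml line are illustrative assumptions; the exact registration depends on your fork's parse_model():

```python
import torch.nn as nn

class ChannelDropout(nn.Module):
    # Illustrative block for models/common.py: output shape equals input
    # shape, so inserting it into model.yaml only shifts the 'from'
    # indices of the layers that follow it.
    def __init__(self, p=0.1):
        super().__init__()
        self.drop = nn.Dropout2d(p)

    def forward(self, x):
        return self.drop(x)

# Hypothetical model.yaml entry placed right after the backbone; remember
# to add +1 to the input indices of every subsequent layer:
#   [-1, 1, ChannelDropout, [0.1]]
```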