Releases · lucidrains/DALLE2-pytorch
1.6.0
bet on the new self-conditioning technique out of Geoffrey Hinton's group
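For context, a minimal sketch of what self-conditioning looks like in a diffusion training step (following the Chen, Zhang & Hinton analog bits paper), not the repo's actual code; the `unet(x_noisy, t, self_cond)` signature and the `noise_fn` forward-diffusion helper are hypothetical:

```python
import torch
import torch.nn.functional as F

def self_conditioned_loss(unet, x_start, t, noise_fn):
    # hypothetical signatures: `unet(x_noisy, t, self_cond)` predicts x_start,
    # `noise_fn(x_start, t, noise)` applies the forward diffusion q(x_t | x_0)
    noise = torch.randn_like(x_start)
    x_noisy = noise_fn(x_start, t, noise)

    self_cond = None
    # half the time, run the network once without self-conditioning and feed
    # its (detached) x0 estimate back in as an extra input
    if torch.rand(1).item() < 0.5:
        with torch.no_grad():
            self_cond = unet(x_noisy, t, self_cond=None).detach()

    pred = unet(x_noisy, t, self_cond=self_cond)
    return F.mse_loss(pred, x_start)
```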
1.5.0
add gradient checkpointing for all resnet blocks
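Gradient checkpointing trades compute for memory by recomputing a block's activations during the backward pass instead of storing them. A rough sketch with `torch.utils.checkpoint`, assuming a toy resnet block rather than the repo's own:

```python
import torch
from torch.utils.checkpoint import checkpoint

class CheckpointedResnetBlock(torch.nn.Module):
    # toy resnet block; `dim` assumed divisible by 8 for the group norm
    def __init__(self, dim):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.GroupNorm(8, dim),
            torch.nn.SiLU(),
            torch.nn.Conv2d(dim, dim, 3, padding=1),
        )

    def forward(self, x):
        if self.training and x.requires_grad:
            # recompute activations of `self.net` during backprop instead of storing them
            # (use_reentrant=False needs a reasonably recent PyTorch)
            return x + checkpoint(self.net, x, use_reentrant=False)
        return x + self.net(x)
```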
1.4.6
enforce clip-anytorch version
1.4.5
make open clip available for use with dalle2 pytorch
1.4.4
quick fix for linear attention
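For background, generic (non-causal) linear attention avoids materializing the n x n attention matrix by exploiting associativity of the matrix products. This is a sketch of the general technique, not the repo's exact formulation:

```python
import torch
import torch.nn.functional as F

def linear_attention(q, k, v, eps=1e-6):
    # q, k, v assumed to be (batch, heads, seq, dim_head)
    # an elu+1 feature map keeps entries positive, so attention can be computed
    # as q' @ (k'^T v) in O(n) memory rather than forming the full n x n matrix
    q, k = F.elu(q) + 1, F.elu(k) + 1
    context = torch.einsum('b h n d, b h n e -> b h d e', k, v)
    normalizer = torch.einsum('b h n d, b h d -> b h n', q, k.sum(dim=2))
    out = torch.einsum('b h n d, b h d e -> b h n e', q, context)
    return out / (normalizer.unsqueeze(-1) + eps)
```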
1.4.3
add cosine sim for self attention as well, as a setting
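Cosine-sim attention l2-normalizes queries and keys so their dot products are bounded, then applies a temperature before the softmax. An illustrative sketch, assuming (batch, heads, seq, dim_head) tensors; the fixed scale value here is an assumption (it is often a learned parameter):

```python
import torch
import torch.nn.functional as F

def cosine_sim_attention(q, k, v, scale=10.0):
    # l2-normalize queries and keys so similarities lie in [-1, 1],
    # then scale by a temperature before the softmax
    q, k = map(lambda t: F.normalize(t, dim=-1), (q, k))
    sim = torch.einsum('b h i d, b h j d -> b h i j', q, k) * scale
    attn = sim.softmax(dim=-1)
    return torch.einsum('b h i j, b h j d -> b h i d', attn, v)
```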
1.4.2
change up epsilon in layernorm in the case of using fp16, thanks to @Vel…
1.4.1
change up epsilon in layernorm in the case of using fp16, thanks to @Vel…
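The idea behind the two notes above is to use a larger layernorm epsilon under half precision, since the usual 1e-5 is too small to be reliable in fp16. An illustrative layer, not the repo's exact implementation; the 1e-3 fp16 epsilon is an assumption:

```python
import torch

class StableLayerNorm(torch.nn.Module):
    # picks a larger epsilon when the input is fp16, since very small eps values
    # can be lost to half-precision rounding; the 1e-3 value is an assumption
    def __init__(self, dim, eps_fp32=1e-5, eps_fp16=1e-3):
        super().__init__()
        self.eps_fp32 = eps_fp32
        self.eps_fp16 = eps_fp16
        self.g = torch.nn.Parameter(torch.ones(dim))

    def forward(self, x):
        eps = self.eps_fp16 if x.dtype == torch.float16 else self.eps_fp32
        var = torch.var(x, dim=-1, unbiased=False, keepdim=True)
        mean = torch.mean(x, dim=-1, keepdim=True)
        return (x - mean) * (var + eps).rsqrt() * self.g
```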
1.4.0
allow for cosine sim cross attention, modify linear attention in atte…
1.2.2
make sure entire readme runs without errors