diff --git a/README.md b/README.md
index be0c8de..d1616c6 100644
--- a/README.md
+++ b/README.md
@@ -4,7 +4,7 @@
 [📄 Research Paper]•[📖 Documentation & Examples]
@@ -14,7 +14,7 @@
 * **What is the ConFIG method?**
-​ The conFIG method is a generic method for optimization problems involving **multiple loss terms** (e.g., Multi-task Learning, Continuous Learning, and Physics Informed Neural Networks). It prevents the optimization from getting stuck into a local minimum of a specific loss term due to the conflict between losses. On the contrary, it leads the optimization to the **shared minimal of all losses** by providing a **conflict-free update direction.**
+​ The ConFIG method is a generic method for optimization problems involving **multiple loss terms** (e.g., Multi-task Learning, Continual Learning, and Physics-Informed Neural Networks). It prevents the optimization from getting stuck in a local minimum of a specific loss term due to conflicts between the losses. Instead, it leads the optimization to the **shared minimum of all losses** by providing a **conflict-free update direction.**
@@ -35,7 +35,7 @@
Then the dot product between $\boldsymbol{g}_{ConFIG}$ and each loss-specific gradient is always positive and equal, i.e., $`\boldsymbol{g}_{i}^{\top}\boldsymbol{g}_{ConFIG}=\boldsymbol{g}_{j}^{\top}\boldsymbol{g}_{ConFIG}> 0 \quad \forall i,j \in [1,m]`$​.
-* **Is the ConFIG Computationally expensive?**
+* **Is the ConFIG method computationally expensive?**
 ​ Like many other gradient-based methods, ConFIG needs to calculate each loss's gradient in every optimization iteration, which can become computationally expensive as the number of losses grows. However, we also introduce a **momentum-based method** that can reduce the computational cost **to close to, or even below, that of a standard optimization procedure** with only a slight degradation in accuracy. This momentum-based approach can also be applied to other gradient-based methods. A short sketch of the conflict-free update follows this hunk.
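To make the property quoted in this hunk concrete, here is a minimal, self-contained sketch of how a conflict-free direction can be obtained via a pseudoinverse over the unit gradients. The helper name `config_direction` is hypothetical, this is not the library's implementation, and the exact magnitude rule in the paper may differ:

```python
import torch

def config_direction(grads: list[torch.Tensor]) -> torch.Tensor:
    # Hypothetical sketch, not the library's implementation.
    # Stack unit gradients into an (m, n) matrix U.
    U = torch.stack([g / (g.norm() + 1e-12) for g in grads])
    # Least-squares solution k of U k = 1: k has identical, positive
    # alignment with every unit gradient (exact when the unit gradients
    # are linearly independent), i.e. it is conflict-free.
    k = torch.linalg.pinv(U) @ torch.ones(U.shape[0])
    direction = k / (k.norm() + 1e-12)
    # One plausible rescaling: the sum of the projections of all
    # gradients onto the conflict-free direction.
    return sum(torch.dot(g, direction) for g in grads) * direction

# Tiny check with two partially conflicting gradients:
g1 = torch.tensor([1.0, 0.0])
g2 = torch.tensor([-0.5, 1.0])    # g1·g2 < 0: the raw gradients conflict
g = config_direction([g1, g2])
assert torch.dot(g1, g) > 0 and torch.dot(g2, g) > 0  # conflict-free update
```

Normalizing the rows of `U` makes the solved direction independent of the individual gradient magnitudes, which is what keeps one large loss gradient from dominating the update.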
diff --git a/docs/assets/config_white.png b/docs/assets/config_white.png
new file mode 100644
index 0000000..7684ca7
Binary files /dev/null and b/docs/assets/config_white.png differ
diff --git a/docs/assets/config_white.svg b/docs/assets/config_white.svg
index 9d3cb3f..3c897d9 100644
--- a/docs/assets/config_white.svg
+++ b/docs/assets/config_white.svg
@@ -6,9 +6,9 @@
version="1.1"
id="svg171"
sodipodi:docname="config_white.svg"
- inkscape:version="1.2.2 (732a01da63, 2022-12-09)"
+ inkscape:version="1.3.2 (1:1.3.2+202311252150+091e20ef0f)"
xml:space="preserve"
- inkscape:export-filename="config.png"
+ inkscape:export-filename="config_white.png"
inkscape:export-xdpi="96"
inkscape:export-ydpi="96"
xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
@@ -27,14 +27,14 @@
inkscape:pagecheckerboard="0"
inkscape:deskcolor="#d1d1d1"
inkscape:document-units="pt"
- inkscape:zoom="1.2245469"
- inkscape:cx="331.95952"
- inkscape:cy="12.249429"
- inkscape:window-width="2560"
- inkscape:window-height="1494"
- inkscape:window-x="-11"
- inkscape:window-y="-11"
- inkscape:window-maximized="1"
+ inkscape:zoom="2.4490938"
+ inkscape:cx="115.55294"
+ inkscape:cy="91.666559"
+ inkscape:window-width="1464"
+ inkscape:window-height="773"
+ inkscape:window-x="1472"
+ inkscape:window-y="449"
+ inkscape:window-maximized="0"
inkscape:current-layer="svg171"
showgrid="false" />
[ 📄 Research Paper ]•[ GitHub Repository ]
@@ -20,7 +20,7 @@ hide:
* **What is the ConFIG method?**
-​ The conFIG method is a generic method for optimization problems involving **multiple loss terms** (e.g., Multi-task Learning, Continuous Learning, and Physics Informed Neural Networks). It prevents the optimization from getting stuck into a local minimum of a specific loss term due to the conflict between losses. On the contrary, it leads the optimization to the **shared minimal of all losses** by providing a **conflict-free update direction.**
+​ The ConFIG method is a generic method for optimization problems involving **multiple loss terms** (e.g., Multi-task Learning, Continual Learning, and Physics-Informed Neural Networks). It prevents the optimization from getting stuck in a local minimum of a specific loss term due to conflicts between the losses. Instead, it leads the optimization to the **shared minimum of all losses** by providing a **conflict-free update direction.**
@@ -41,7 +41,7 @@ $$
Then the dot product between $\mathbf{g}_{ConFIG}$ and each loss-specific gradient is always positive and equal, i.e., $\mathbf{g}_{i}^{\top}\mathbf{g}_{ConFIG}=\mathbf{g}_{j}^{\top}\mathbf{g}_{ConFIG} > 0 \quad \forall i,j \in [1,m]$​.
-* **Is the ConFIG Computationally expensive?**
+* **Is the ConFIG method computationally expensive?**
 ​ Like many other gradient-based methods, ConFIG needs to calculate each loss's gradient in every optimization iteration, which can become computationally expensive as the number of losses grows. However, we also introduce a **momentum-based method** that can reduce the computational cost **to close to, or even below, that of a standard optimization procedure** with only a slight degradation in accuracy. This momentum-based approach can also be applied to other gradient-based methods. A sketch of the momentum idea follows this hunk.
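The momentum-based method mentioned above is what brings the cost close to a standard optimization step. A heavily hedged sketch of the underlying idea, assuming an exponential-moving-average (EMA) estimate of each loss's gradient that is refreshed one loss at a time: each iteration then needs a single backward pass instead of m. The class name and details are hypothetical, not the library's API:

```python
import torch

class MomentumGradEstimator:
    """Hypothetical sketch of the momentum idea (not the library's API):
    keep an EMA of each loss's gradient and refresh one per iteration."""

    def __init__(self, num_losses: int, dim: int, beta: float = 0.9):
        self.beta = beta
        self.ema = torch.zeros(num_losses, dim)
        self.step = 0

    def estimate(self, compute_grad) -> list[torch.Tensor]:
        # Round-robin: recompute only one loss gradient this iteration,
        # so one backward pass replaces the m passes of plain ConFIG.
        i = self.step % self.ema.shape[0]
        g = compute_grad(i)  # caller backpropagates loss i only
        self.ema[i] = self.beta * self.ema[i] + (1 - self.beta) * g
        self.step += 1
        # The (slightly stale) EMA estimates stand in for the true
        # gradients when forming the conflict-free update.
        return [self.ema[j].clone() for j in range(self.ema.shape[0])]
```

The EMA estimates would then be combined exactly as in the sketch after the README hunk; their staleness is a plausible source of the slight accuracy degradation mentioned above.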
diff --git a/mkdocs.yml b/mkdocs.yml
index a08bdc8..79d5ab9 100644
--- a/mkdocs.yml
+++ b/mkdocs.yml
@@ -15,6 +15,7 @@ theme:
- toc.integrate # Table of contents is integrated on the left; does not appear separately on the right.
- header.autohide # header disappears as you scroll
- navigation.top
+ - navigation.footer
palette:
- scheme: default
primary: brown
@@ -30,7 +31,7 @@ theme:
name: Switch to light mode
icon:
repo: fontawesome/brands/github # GitHub logo in top right
- logo: assets/config_white.svg
+ logo: assets/config_white.png
favicon: assets/config_colorful.svg
extra:
-Towards Conflict-free Training for everything!
+Towards Conflict-free Training for Everything and Everyone!