Commit

Deploying to gh-pages from @ 0cdff78 🚀
wbenoit26 committed Oct 1, 2024
1 parent 4fd7ef0 commit f348766
Showing 8 changed files with 200 additions and 120 deletions.
6 changes: 3 additions & 3 deletions ml4gw.dataloading.html
Original file line number Diff line number Diff line change
@@ -191,7 +191,7 @@ <h2>Submodules<a class="headerlink" href="#submodules" title="Permalink to this
<dd><p>Sample a single batch of multichannel timeseries</p>
<dl class="field-list simple">
<dt class="field-odd">Return type<span class="colon">:</span></dt>
-<dd class="field-odd"><p><span class="sphinx_autodoc_typehints-type"><code class="xref py py-class docutils literal notranslate"><span class="pre">Tensor</span></code></span></p>
+<dd class="field-odd"><p><span class="sphinx_autodoc_typehints-type"><code class="xref py py-class docutils literal notranslate"><span class="pre">Float[Tensor,</span> <span class="pre">'batch</span> <span class="pre">num_ifos</span> <span class="pre">time']</span></code></span></p>
</dd>
</dl>
</dd></dl>
@@ -224,13 +224,13 @@ <h2>Submodules<a class="headerlink" href="#submodules" title="Permalink to this
<dl class="field-list simple">
<dt class="field-odd">Parameters<span class="colon">:</span></dt>
<dd class="field-odd"><ul class="simple">
-<li><p><strong>X</strong> (<span class="sphinx_autodoc_typehints-type"><code class="xref py py-class docutils literal notranslate"><span class="pre">Tensor</span></code></span>) -- Timeseries data to be iterated through. Should have
+<li><p><strong>X</strong> (<span class="sphinx_autodoc_typehints-type"><code class="xref py py-class docutils literal notranslate"><span class="pre">Float[Tensor,</span> <span class="pre">'channels</span> <span class="pre">time']</span></code></span>) -- Timeseries data to be iterated through. Should have
shape <cite>(num_channels, length * sample_rate)</cite>. Windows
will be sampled from the time (1st) dimension for all
channels along the channel (0th) dimension.</p></li>
<li><p><strong>kernel_size</strong> (<span class="sphinx_autodoc_typehints-type"><code class="xref py py-class docutils literal notranslate"><span class="pre">int</span></code></span>) -- The length of the windows to sample from <cite>X</cite> in units
of samples.</p></li>
-<li><p><strong>y</strong> (<span class="sphinx_autodoc_typehints-type"><code class="xref py py-data docutils literal notranslate"><span class="pre">Optional</span></code>[<code class="xref py py-class docutils literal notranslate"><span class="pre">Tensor</span></code>]</span>) -- Target timeseries to be iterated through. If specified,
+<li><p><strong>y</strong> (<span class="sphinx_autodoc_typehints-type"><code class="xref py py-data docutils literal notranslate"><span class="pre">Optional</span></code>[<code class="xref py py-class docutils literal notranslate"><span class="pre">Float[Tensor,</span> <span class="pre">'</span> <span class="pre">time']</span></code>]</span>) -- Target timeseries to be iterated through. If specified,
should be a single channel and have shape
<cite>(length * sample_rate,)</cite>. If left as <cite>None</cite>, only windows
sampled from <cite>X</cite> will be returned during iteration.
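The parameters above describe drawing fixed-length windows from a `(channels, time)` array, with an optional aligned single-channel target. A pure-Python sketch of that indexing logic, using nested lists in place of torch Tensors (the function name and batching are invented for illustration, not the ml4gw API):

```python
import random

def sample_windows(X, kernel_size, y=None, batch_size=2):
    """Draw random windows of length `kernel_size` from the time
    (1st) dimension of a (channels, time) nested list, keeping all
    channels along the 0th dimension. Hypothetical stand-in for the
    ml4gw dataloader, which operates on torch Tensors."""
    length = len(X[0])
    starts = [random.randrange(length - kernel_size + 1) for _ in range(batch_size)]
    batch = [[row[s:s + kernel_size] for row in X] for s in starts]
    if y is None:
        return batch
    # Targets are sliced at the same offsets so they stay aligned with X.
    targets = [y[s:s + kernel_size] for s in starts]
    return batch, targets

# Two channels, ten samples each.
X = [[float(i) for i in range(10)], [float(-i) for i in range(10)]]
windows = sample_windows(X, kernel_size=4, batch_size=3)
```

Each element of `windows` has shape `(channels, kernel_size)`, matching the documented sampling along the channel (0th) and time (1st) dimensions.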
126 changes: 66 additions & 60 deletions ml4gw.html

Large diffs are not rendered by default.

71 changes: 58 additions & 13 deletions ml4gw.nn.autoencoder.html
@@ -126,18 +126,29 @@ <h2>Submodules<a class="headerlink" href="#submodules" title="Permalink to this
<dl class="py method">
<dt class="sig sig-object py" id="ml4gw.nn.autoencoder.base.Autoencoder.decode">
<span class="sig-name descname"><span class="pre">decode</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="o"><span class="pre">*</span></span><span class="n"><span class="pre">X</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">states</span></span><span class="o"><span class="pre">=</span></span><span class="default_value"><span class="pre">None</span></span></em><span class="sig-paren">)</span><a class="headerlink" href="#ml4gw.nn.autoencoder.base.Autoencoder.decode" title="Permalink to this definition"></a></dt>
-<dd></dd></dl>
+<dd><dl class="field-list simple">
+<dt class="field-odd">Return type<span class="colon">:</span></dt>
+<dd class="field-odd"><p><span class="sphinx_autodoc_typehints-type"><code class="xref py py-class docutils literal notranslate"><span class="pre">Tensor</span></code></span></p>
+</dd>
+</dl>
+</dd></dl>

<dl class="py method">
<dt class="sig sig-object py" id="ml4gw.nn.autoencoder.base.Autoencoder.encode">
<span class="sig-name descname"><span class="pre">encode</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="o"><span class="pre">*</span></span><span class="n"><span class="pre">X</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">return_states</span></span><span class="o"><span class="pre">=</span></span><span class="default_value"><span class="pre">False</span></span></em><span class="sig-paren">)</span><a class="headerlink" href="#ml4gw.nn.autoencoder.base.Autoencoder.encode" title="Permalink to this definition"></a></dt>
-<dd></dd></dl>
+<dd><dl class="field-list simple">
+<dt class="field-odd">Return type<span class="colon">:</span></dt>
+<dd class="field-odd"><p><span class="sphinx_autodoc_typehints-type"><code class="xref py py-data docutils literal notranslate"><span class="pre">Union</span></code>[<code class="xref py py-class docutils literal notranslate"><span class="pre">Tensor</span></code>, <code class="xref py py-data docutils literal notranslate"><span class="pre">Tuple</span></code>[<code class="xref py py-class docutils literal notranslate"><span class="pre">Tensor</span></code>, <code class="xref py py-class docutils literal notranslate"><span class="pre">Sequence</span></code>]]</span></p>
+</dd>
+</dl>
+</dd></dl>
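Per the `Union[Tensor, Tuple[Tensor, Sequence]]` annotation above, `encode` returns either the latent alone or a `(latent, states)` pair when `return_states=True`. A toy pure-Python sketch of that calling convention; the pairwise-averaging "encoder" here is invented purely for illustration:

```python
def encode(x, return_states=False):
    """Toy encoder: average adjacent samples to halve the length.
    Hypothetical stand-in showing the two possible return shapes."""
    latent = [(a + b) / 2 for a, b in zip(x[::2], x[1::2])]
    if return_states:
        # Intermediate states (e.g. pre-pooling activations) are kept
        # so a decoder can use them for skip connections.
        return latent, [list(x)]
    return latent

z = encode([1.0, 3.0, 5.0, 7.0])
z2, states = encode([1.0, 3.0, 5.0, 7.0], return_states=True)
```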

<dl class="py method">
<dt class="sig sig-object py" id="ml4gw.nn.autoencoder.base.Autoencoder.forward">
<span class="sig-name descname"><span class="pre">forward</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="o"><span class="pre">*</span></span><span class="n"><span class="pre">X</span></span></em><span class="sig-paren">)</span><a class="headerlink" href="#ml4gw.nn.autoencoder.base.Autoencoder.forward" title="Permalink to this definition"></a></dt>
<dd><p>Define the computation performed at every call.</p>
-<p>Should be overridden by all subclasses.</p>
+<p>Should be overridden by all subclasses.
+:rtype: <span class="sphinx_autodoc_typehints-type"><code class="xref py py-class docutils literal notranslate"><span class="pre">Tensor</span></code></span></p>
<div class="admonition note">
<p class="admonition-title">Note</p>
<p>Although the recipe for forward pass needs to be defined within
@@ -159,12 +170,22 @@ <h2>Submodules<a class="headerlink" href="#submodules" title="Permalink to this
<dl class="py method">
<dt class="sig sig-object py" id="ml4gw.nn.autoencoder.convolutional.ConvBlock.decode">
<span class="sig-name descname"><span class="pre">decode</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">X</span></span></em><span class="sig-paren">)</span><a class="headerlink" href="#ml4gw.nn.autoencoder.convolutional.ConvBlock.decode" title="Permalink to this definition"></a></dt>
-<dd></dd></dl>
+<dd><dl class="field-list simple">
+<dt class="field-odd">Return type<span class="colon">:</span></dt>
+<dd class="field-odd"><p><span class="sphinx_autodoc_typehints-type"><code class="xref py py-class docutils literal notranslate"><span class="pre">Tensor</span></code></span></p>
+</dd>
+</dl>
+</dd></dl>

<dl class="py method">
<dt class="sig sig-object py" id="ml4gw.nn.autoencoder.convolutional.ConvBlock.encode">
<span class="sig-name descname"><span class="pre">encode</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">X</span></span></em><span class="sig-paren">)</span><a class="headerlink" href="#ml4gw.nn.autoencoder.convolutional.ConvBlock.encode" title="Permalink to this definition"></a></dt>
-<dd></dd></dl>
+<dd><dl class="field-list simple">
+<dt class="field-odd">Return type<span class="colon">:</span></dt>
+<dd class="field-odd"><p><span class="sphinx_autodoc_typehints-type"><code class="xref py py-class docutils literal notranslate"><span class="pre">Tensor</span></code></span></p>
+</dd>
+</dl>
+</dd></dl>

</dd></dl>

@@ -184,13 +205,19 @@ <h2>Submodules<a class="headerlink" href="#submodules" title="Permalink to this
<dl class="py method">
<dt class="sig sig-object py" id="ml4gw.nn.autoencoder.convolutional.ConvolutionalAutoencoder.decode">
<span class="sig-name descname"><span class="pre">decode</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="o"><span class="pre">*</span></span><span class="n"><span class="pre">X</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">states</span></span><span class="o"><span class="pre">=</span></span><span class="default_value"><span class="pre">None</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">input_size</span></span><span class="o"><span class="pre">=</span></span><span class="default_value"><span class="pre">None</span></span></em><span class="sig-paren">)</span><a class="headerlink" href="#ml4gw.nn.autoencoder.convolutional.ConvolutionalAutoencoder.decode" title="Permalink to this definition"></a></dt>
-<dd></dd></dl>
+<dd><dl class="field-list simple">
+<dt class="field-odd">Return type<span class="colon">:</span></dt>
+<dd class="field-odd"><p><span class="sphinx_autodoc_typehints-type"><code class="xref py py-class docutils literal notranslate"><span class="pre">Tensor</span></code></span></p>
+</dd>
+</dl>
+</dd></dl>

<dl class="py method">
<dt class="sig sig-object py" id="ml4gw.nn.autoencoder.convolutional.ConvolutionalAutoencoder.forward">
<span class="sig-name descname"><span class="pre">forward</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">X</span></span></em><span class="sig-paren">)</span><a class="headerlink" href="#ml4gw.nn.autoencoder.convolutional.ConvolutionalAutoencoder.forward" title="Permalink to this definition"></a></dt>
<dd><p>Define the computation performed at every call.</p>
-<p>Should be overridden by all subclasses.</p>
+<p>Should be overridden by all subclasses.
+:rtype: <span class="sphinx_autodoc_typehints-type"><code class="xref py py-class docutils literal notranslate"><span class="pre">Tensor</span></code></span></p>
<div class="admonition note">
<p class="admonition-title">Note</p>
<p>Although the recipe for forward pass needs to be defined within
@@ -213,7 +240,8 @@ <h2>Submodules<a class="headerlink" href="#submodules" title="Permalink to this
<dt class="sig sig-object py" id="ml4gw.nn.autoencoder.skip_connection.AddSkipConnect.forward">
<span class="sig-name descname"><span class="pre">forward</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">X</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">state</span></span></em><span class="sig-paren">)</span><a class="headerlink" href="#ml4gw.nn.autoencoder.skip_connection.AddSkipConnect.forward" title="Permalink to this definition"></a></dt>
<dd><p>Define the computation performed at every call.</p>
-<p>Should be overridden by all subclasses.</p>
+<p>Should be overridden by all subclasses.
+:rtype: <span class="sphinx_autodoc_typehints-type"><code class="xref py py-class docutils literal notranslate"><span class="pre">Tensor</span></code></span></p>
<div class="admonition note">
<p class="admonition-title">Note</p>
<p>Although the recipe for forward pass needs to be defined within
@@ -233,7 +261,8 @@ <h2>Submodules<a class="headerlink" href="#submodules" title="Permalink to this
<dt class="sig sig-object py" id="ml4gw.nn.autoencoder.skip_connection.ConcatSkipConnect.forward">
<span class="sig-name descname"><span class="pre">forward</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">X</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">state</span></span></em><span class="sig-paren">)</span><a class="headerlink" href="#ml4gw.nn.autoencoder.skip_connection.ConcatSkipConnect.forward" title="Permalink to this definition"></a></dt>
<dd><p>Define the computation performed at every call.</p>
-<p>Should be overridden by all subclasses.</p>
+<p>Should be overridden by all subclasses.
+:rtype: <span class="sphinx_autodoc_typehints-type"><code class="xref py py-class docutils literal notranslate"><span class="pre">Tensor</span></code></span></p>
<div class="admonition note">
<p class="admonition-title">Note</p>
<p>Although the recipe for forward pass needs to be defined within
@@ -246,7 +275,12 @@ <h2>Submodules<a class="headerlink" href="#submodules" title="Permalink to this
<dl class="py method">
<dt class="sig sig-object py" id="ml4gw.nn.autoencoder.skip_connection.ConcatSkipConnect.get_out_channels">
<span class="sig-name descname"><span class="pre">get_out_channels</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">in_channels</span></span></em><span class="sig-paren">)</span><a class="headerlink" href="#ml4gw.nn.autoencoder.skip_connection.ConcatSkipConnect.get_out_channels" title="Permalink to this definition"></a></dt>
-<dd></dd></dl>
+<dd><dl class="field-list simple">
+<dt class="field-odd">Return type<span class="colon">:</span></dt>
+<dd class="field-odd"><p><span class="sphinx_autodoc_typehints-type"><code class="xref py py-class docutils literal notranslate"><span class="pre">int</span></code></span></p>
+</dd>
+</dl>
+</dd></dl>

</dd></dl>

@@ -258,7 +292,8 @@ <h2>Submodules<a class="headerlink" href="#submodules" title="Permalink to this
<dt class="sig sig-object py" id="ml4gw.nn.autoencoder.skip_connection.SkipConnection.forward">
<span class="sig-name descname"><span class="pre">forward</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">X</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">state</span></span></em><span class="sig-paren">)</span><a class="headerlink" href="#ml4gw.nn.autoencoder.skip_connection.SkipConnection.forward" title="Permalink to this definition"></a></dt>
<dd><p>Define the computation performed at every call.</p>
-<p>Should be overridden by all subclasses.</p>
+<p>Should be overridden by all subclasses.
+:rtype: <span class="sphinx_autodoc_typehints-type"><code class="xref py py-class docutils literal notranslate"><span class="pre">Tensor</span></code></span></p>
<div class="admonition note">
<p class="admonition-title">Note</p>
<p>Although the recipe for forward pass needs to be defined within
@@ -271,7 +306,12 @@ <h2>Submodules<a class="headerlink" href="#submodules" title="Permalink to this
<dl class="py method">
<dt class="sig sig-object py" id="ml4gw.nn.autoencoder.skip_connection.SkipConnection.get_out_channels">
<span class="sig-name descname"><span class="pre">get_out_channels</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">in_channels</span></span></em><span class="sig-paren">)</span><a class="headerlink" href="#ml4gw.nn.autoencoder.skip_connection.SkipConnection.get_out_channels" title="Permalink to this definition"></a></dt>
-<dd></dd></dl>
+<dd><dl class="field-list simple">
+<dt class="field-odd">Return type<span class="colon">:</span></dt>
+<dd class="field-odd"><p><span class="sphinx_autodoc_typehints-type"><code class="xref py py-class docutils literal notranslate"><span class="pre">int</span></code></span></p>
+</dd>
+</dl>
+</dd></dl>

</dd></dl>

@@ -281,7 +321,12 @@ <h2>Submodules<a class="headerlink" href="#submodules" title="Permalink to this
<dl class="py function">
<dt class="sig sig-object py" id="ml4gw.nn.autoencoder.utils.match_size">
<span class="sig-prename descclassname"><span class="pre">ml4gw.nn.autoencoder.utils.</span></span><span class="sig-name descname"><span class="pre">match_size</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">X</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">target_size</span></span></em><span class="sig-paren">)</span><a class="headerlink" href="#ml4gw.nn.autoencoder.utils.match_size" title="Permalink to this definition"></a></dt>
-<dd></dd></dl>
+<dd><dl class="field-list simple">
+<dt class="field-odd">Return type<span class="colon">:</span></dt>
+<dd class="field-odd"><p><span class="sphinx_autodoc_typehints-type"><code class="xref py py-class docutils literal notranslate"><span class="pre">Tensor</span></code></span></p>
+</dd>
+</dl>
+</dd></dl>

</section>
<section id="module-ml4gw.nn.autoencoder">
3 changes: 2 additions & 1 deletion ml4gw.nn.html
@@ -241,7 +241,8 @@ <h2>Submodules<a class="headerlink" href="#submodules" title="Permalink to this
<dt class="sig sig-object py" id="ml4gw.nn.norm.GroupNorm1D.forward">
<span class="sig-name descname"><span class="pre">forward</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">x</span></span></em><span class="sig-paren">)</span><a class="headerlink" href="#ml4gw.nn.norm.GroupNorm1D.forward" title="Permalink to this definition"></a></dt>
<dd><p>Define the computation performed at every call.</p>
-<p>Should be overridden by all subclasses.</p>
+<p>Should be overridden by all subclasses.
+:rtype: <span class="sphinx_autodoc_typehints-type"><code class="xref py py-class docutils literal notranslate"><span class="pre">Float[Tensor,</span> <span class="pre">'batch</span> <span class="pre">channel</span> <span class="pre">length']</span></code></span></p>
<div class="admonition note">
<p class="admonition-title">Note</p>
<p>Although the recipe for forward pass needs to be defined within
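`GroupNorm1D.forward` maps a `(batch, channel, length)` tensor to the same shape. A pure-Python sketch of group normalization over a single `(channels, length)` example, to show how statistics are computed per group of channels (a hypothetical stand-in without the learnable scale and shift, not the ml4gw implementation):

```python
import math

def group_norm_1d(x, num_groups, eps=1e-5):
    """Normalize a (channels, length) nested list: each group of
    channels is standardized by its own mean and variance."""
    channels = len(x)
    per_group = channels // num_groups
    out = []
    for g in range(num_groups):
        rows = x[g * per_group:(g + 1) * per_group]
        vals = [v for row in rows for v in row]
        mean = sum(vals) / len(vals)
        var = sum((v - mean) ** 2 for v in vals) / len(vals)
        std = math.sqrt(var + eps)
        out.extend([[(v - mean) / std for v in row] for row in rows])
    return out

y = group_norm_1d([[1.0, 2.0], [3.0, 4.0]], num_groups=1)
```

With `num_groups=1` this reduces to layer-style normalization over all channels; with `num_groups == channels` it normalizes each channel independently.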