
Retether to Keras v3.6.0 #1473

Merged · 40 commits · Oct 17, 2024

Commits
ccf5649
checkin new tethers: `op_associative_scan`, `op_searchsorted`, `optim…
t-kalinowski Oct 7, 2024
9bd0ca8
update install_keras: default to jax-gpu, tf-cpu.
t-kalinowski Oct 7, 2024
58bb2ba
doc tweaks
t-kalinowski Oct 7, 2024
1dd4df1
update `get_file()`
t-kalinowski Oct 7, 2024
862a4c5
new internal `RandomInitializer` class
t-kalinowski Oct 7, 2024
f14383f
doc improvements
t-kalinowski Oct 7, 2024
734db40
Doc fixes
t-kalinowski Oct 11, 2024
ff73f53
new arg: `op_normalize(epsilon =)`
t-kalinowski Oct 11, 2024
29b7c66
doc fixes
t-kalinowski Oct 11, 2024
3c92d2f
doc fixes
t-kalinowski Oct 11, 2024
59fcec4
redocument
t-kalinowski Oct 11, 2024
8db931e
example fixes/updates; redocument
t-kalinowski Oct 11, 2024
82a986a
doc improvements
t-kalinowski Oct 11, 2024
416e661
new arg: `export_savedmodel(verbose = )`
t-kalinowski Oct 11, 2024
8254549
internal changes
t-kalinowski Oct 11, 2024
a52e149
fixes for `export_savedmodel()` + redocument
t-kalinowski Oct 11, 2024
60ca3e3
internal changes to random image layers
t-kalinowski Oct 15, 2024
7aab1c0
fix `layer_random_rotation` signature+docs
t-kalinowski Oct 15, 2024
9194671
more `layer_random_*` doc updates
t-kalinowski Oct 15, 2024
46a39e5
redocument
t-kalinowski Oct 15, 2024
8596d89
add 15 new ops
t-kalinowski Oct 15, 2024
592f345
redocument
t-kalinowski Oct 15, 2024
6fbc98d
checkin new op tethers
t-kalinowski Oct 15, 2024
b19c8c1
add `get_state_tree()`
t-kalinowski Oct 15, 2024
3e255e2
add `set_state_tree()`
t-kalinowski Oct 15, 2024
c40d6a8
add `layer_auto_contrast()`
t-kalinowski Oct 15, 2024
dc2872e
redocument
t-kalinowski Oct 15, 2024
aa7b6e2
redocument
t-kalinowski Oct 15, 2024
02a5696
internal changes
t-kalinowski Oct 15, 2024
252e5be
add `layer_pipeline()`
t-kalinowski Oct 15, 2024
20f5fd1
doc chunk tweak
t-kalinowski Oct 15, 2024
b83caaf
add `layer_solarization()`
t-kalinowski Oct 15, 2024
d57550b
redocument
t-kalinowski Oct 15, 2024
304d4e0
checkin new tethers
t-kalinowski Oct 15, 2024
db8579e
attempt to wrap `KerasFileEditor`
t-kalinowski Oct 16, 2024
e7d2dce
tether `keras.utils.Config`
t-kalinowski Oct 16, 2024
f82b26d
redocument and fixes
t-kalinowski Oct 16, 2024
4f41351
workaround weird knitr+roxygen2 bug
t-kalinowski Oct 16, 2024
e819ebb
new tethers
t-kalinowski Oct 16, 2024
c70bcba
added NEWS
t-kalinowski Oct 17, 2024
Files changed
3 changes: 2 additions & 1 deletion .tether/man/activation_elu.txt
@@ -3,7 +3,7 @@ keras.activations.elu(x, alpha=1.0)
 __doc__
 Exponential Linear Unit.
 
-The exponential linear unit (ELU) with `alpha > 0` is define as:
+The exponential linear unit (ELU) with `alpha > 0` is defined as:
 
 - `x` if `x > 0`
 - alpha * `exp(x) - 1` if `x < 0`
@@ -23,3 +23,4 @@ Args:
 Reference:
 
 - [Clevert et al., 2016](https://arxiv.org/abs/1511.07289)
+
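
The ELU formula above is easy to check numerically. A minimal sketch, assuming only NumPy; `elu` here is a standalone reimplementation for illustration, not the Keras op:

```python
import numpy as np

def elu(x, alpha=1.0):
    # As defined above: x if x > 0, alpha * (exp(x) - 1) otherwise.
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

print(elu(np.array([-2.0, 0.0, 3.0])))  # ~[-0.8647, 0.0, 3.0]
```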
2 changes: 1 addition & 1 deletion .tether/man/application_densenet121.txt
@@ -53,7 +53,7 @@ Args:
         be applied.
     classes: optional number of classes to classify images
         into, only to be specified if `include_top` is `True`, and
-        if no `weights` argument is specified.
+        if no `weights` argument is specified. Defaults to 1000.
     classifier_activation: A `str` or callable.
         The activation function to use
         on the "top" layer. Ignored unless `include_top=True`. Set
2 changes: 1 addition & 1 deletion .tether/man/application_densenet169.txt
@@ -53,7 +53,7 @@ Args:
         be applied.
     classes: optional number of classes to classify images
         into, only to be specified if `include_top` is `True`, and
-        if no `weights` argument is specified.
+        if no `weights` argument is specified. Defaults to 1000.
     classifier_activation: A `str` or callable.
         The activation function to use
         on the "top" layer. Ignored unless `include_top=True`. Set
2 changes: 1 addition & 1 deletion .tether/man/application_densenet201.txt
@@ -53,7 +53,7 @@ Args:
         be applied.
     classes: optional number of classes to classify images
         into, only to be specified if `include_top` is `True`, and
-        if no `weights` argument is specified.
+        if no `weights` argument is specified. Defaults to 1000.
     classifier_activation: A `str` or callable.
         The activation function to use
         on the "top" layer. Ignored unless `include_top=True`. Set
2 changes: 1 addition & 1 deletion .tether/man/application_resnet101.txt
@@ -53,7 +53,7 @@ Args:
         - `max` means that global max pooling will be applied.
     classes: optional number of classes to classify images into, only to be
         specified if `include_top` is `True`, and if no `weights` argument is
-        specified.
+        specified. Defaults to `1000`.
     classifier_activation: A `str` or callable. The activation function to
         use on the "top" layer. Ignored unless `include_top=True`. Set
         `classifier_activation=None` to return the logits of the "top" layer.
2 changes: 1 addition & 1 deletion .tether/man/application_resnet152.txt
@@ -53,7 +53,7 @@ Args:
         - `max` means that global max pooling will be applied.
     classes: optional number of classes to classify images into, only to be
         specified if `include_top` is `True`, and if no `weights` argument is
-        specified.
+        specified. Defaults to `1000`.
     classifier_activation: A `str` or callable. The activation function to
         use on the "top" layer. Ignored unless `include_top=True`. Set
         `classifier_activation=None` to return the logits of the "top" layer.
2 changes: 1 addition & 1 deletion .tether/man/application_resnet50.txt
@@ -53,7 +53,7 @@ Args:
         - `max` means that global max pooling will be applied.
     classes: optional number of classes to classify images into, only to be
         specified if `include_top` is `True`, and if no `weights` argument is
-        specified.
+        specified. Defaults to `1000`.
     classifier_activation: A `str` or callable. The activation function to
         use on the "top" layer. Ignored unless `include_top=True`. Set
         `classifier_activation=None` to return the logits of the "top" layer.
2 changes: 1 addition & 1 deletion .tether/man/application_xception.txt
@@ -59,7 +59,7 @@ Args:
         be applied.
     classes: optional number of classes to classify images
         into, only to be specified if `include_top` is `True`, and
-        if no `weights` argument is specified.
+        if no `weights` argument is specified. Defaults to `1000`.
     classifier_activation: A `str` or callable. The activation function to
         use on the "top" layer. Ignored unless `include_top=True`. Set
         `classifier_activation=None` to return the logits of the "top"
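
The `classes` default clarified across these application diffs can be overridden whenever `weights=None`. A sketch, assuming the Python keras package (no pretrained weights are downloaded):

```python
import keras

# `classes` defaults to 1000 (ImageNet); with weights=None it may be changed.
model = keras.applications.Xception(weights=None, classes=10)
print(model.output_shape)  # (None, 10)
```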
2 changes: 1 addition & 1 deletion .tether/man/clone_model.txt
@@ -87,7 +87,7 @@ def clone_function(layer):
     config["seed"] = 1337
     return layer.__class__.from_config(config)
 
-new_model = clone_model(model)
+new_model = clone_model(model, clone_function=clone_function)
 ```
 
 Using a `call_function` to add a `Dropout` layer after each `Dense` layer
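
The corrected line matters: without passing `clone_function=`, the docstring's seed-rewriting function was silently unused. A self-contained sketch of the corrected pattern (toy model for illustration):

```python
import keras

model = keras.Sequential([keras.layers.Dense(4), keras.layers.Dropout(0.5)])
model.build((None, 8))

def clone_function(layer):
    config = layer.get_config()
    if "seed" in config:
        config["seed"] = 1337  # pin every layer seed in the clone
    return layer.__class__.from_config(config)

# clone_function must be passed explicitly, as the fixed example shows.
new_model = keras.models.clone_model(model, clone_function=clone_function)
```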
@@ -2,7 +2,8 @@ __signature__
 keras.Model.export(
   self,
   filepath,
-  format='tf_saved_model'
+  format='tf_saved_model',
+  verbose=True
 )
 __doc__
 Create a TF SavedModel artifact for inference.
@@ -22,6 +23,7 @@ entirely standalone.
 Args:
     filepath: `str` or `pathlib.Path` object. Path where to save
         the artifact.
+    verbose: whether to print all the variables of the exported model.
 
 Example:
 
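
A short sketch of the new `verbose` argument in use, assuming the TensorFlow backend is available (the path is illustrative):

```python
import keras

model = keras.Sequential([keras.layers.Dense(1)])
model.build((None, 4))

# verbose=True (the default) prints the exported model's variables;
# pass verbose=False for a quiet export.
model.export("/tmp/my_saved_model", format="tf_saved_model", verbose=False)
```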
15 changes: 10 additions & 5 deletions .tether/man/get_file.txt
@@ -35,14 +35,18 @@ path_to_downloaded_file = get_file(
 ```
 
 Args:
-    fname: Name of the file. If an absolute path, e.g. `"/path/to/file.txt"`
-        is specified, the file will be saved at that location.
+    fname: If the target is a single file, this is your desired
+        local name for the file.
+        If `None`, the name of the file at `origin` will be used.
+        If downloading and extracting a directory archive,
+        the provided `fname` will be used as extraction directory
+        name (only if it doesn't have an extension).
     origin: Original URL of the file.
     untar: Deprecated in favor of `extract` argument.
-        boolean, whether the file should be decompressed
+        Boolean, whether the file is a tar archive that should
+        be extracted.
     md5_hash: Deprecated in favor of `file_hash` argument.
-        md5 hash of the file for verification
+        md5 hash of the file for file integrity verification.
     file_hash: The expected hash string of the file after download.
         The sha256 and md5 hash algorithms are both supported.
     cache_subdir: Subdirectory under the Keras cache dir where the file is
@@ -51,7 +55,8 @@ Args:
     hash_algorithm: Select the hash algorithm to verify the file.
         options are `"md5'`, `"sha256'`, and `"auto'`.
         The default 'auto' detects the hash algorithm in use.
-    extract: True tries extracting the file as an Archive, like tar or zip.
+    extract: If `True`, extracts the archive. Only applicable to compressed
+        archive files like tar or zip.
     archive_format: Archive format to try for extracting the file.
         Options are `"auto'`, `"tar'`, `"zip'`, and `None`.
         `"tar"` includes tar, tar.gz, and tar.bz files.
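
The reworded `fname`/`extract` semantics, in a sketch (the URL is a placeholder, not a real dataset):

```python
from keras.utils import get_file

# fname=None: keep the file's name from `origin`.
# extract=True: tar/zip archives are unpacked after download.
path = get_file(
    fname=None,
    origin="https://example.com/data/archive.tar.gz",  # placeholder URL
    extract=True,
)
print(path)
```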
67 changes: 67 additions & 0 deletions .tether/man/get_state_tree.txt
@@ -0,0 +1,67 @@
+__signature__
+keras.Model.get_state_tree(self, value_format='backend_tensor')
+__doc__
+Retrieves tree-like structure of model variables.
+
+This method allows retrieval of different model variables (trainable,
+non-trainable, optimizer, and metrics). The variables are returned in a
+nested dictionary format, where the keys correspond to the variable
+names and the values are the nested representations of the variables.
+
+Returns:
+    dict: A dictionary containing the nested representations of the
+        requested variables. The keys are the variable names, and the
+        values are the corresponding nested dictionaries.
+    value_format: One of `"backend_tensor"`, `"numpy_array"`.
+        The kind of array to return as the leaves of the nested
+        state tree.
+
+Example:
+
+```python
+model = keras.Sequential([
+    keras.Input(shape=(1,), name="my_input"),
+    keras.layers.Dense(1, activation="sigmoid", name="my_dense"),
+], name="my_sequential")
+model.compile(optimizer="adam", loss="mse", metrics=["mae"])
+model.fit(np.array([[1.0]]), np.array([[1.0]]))
+state_tree = model.get_state_tree()
+```
+
+The `state_tree` dictionary returned looks like:
+
+```
+{
+    'metrics_variables': {
+        'loss': {
+            'count': ...,
+            'total': ...,
+        },
+        'mean_absolute_error': {
+            'count': ...,
+            'total': ...,
+        }
+    },
+    'trainable_variables': {
+        'my_sequential': {
+            'my_dense': {
+                'bias': ...,
+                'kernel': ...,
+            }
+        }
+    },
+    'non_trainable_variables': {},
+    'optimizer_variables': {
+        'adam': {
+            'iteration': ...,
+            'learning_rate': ...,
+            'my_sequential_my_dense_bias_momentum': ...,
+            'my_sequential_my_dense_bias_velocity': ...,
+            'my_sequential_my_dense_kernel_momentum': ...,
+            'my_sequential_my_dense_kernel_velocity': ...,
+        }
+    }
+}
+}
+```
+
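
Commit 3e255e2 adds the matching setter, `set_state_tree()`, so the nested dict can round-trip. A sketch based on the docstring's toy model:

```python
import numpy as np
import keras

model = keras.Sequential([
    keras.Input(shape=(1,), name="my_input"),
    keras.layers.Dense(1, activation="sigmoid", name="my_dense"),
], name="my_sequential")
model.compile(optimizer="adam", loss="mse", metrics=["mae"])
model.fit(np.array([[1.0]]), np.array([[1.0]]), verbose=0)

# Snapshot all variables as nested numpy arrays, then restore them later.
state_tree = model.get_state_tree(value_format="numpy_array")
model.set_state_tree(state_tree)
```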
2 changes: 2 additions & 0 deletions .tether/man/initializer_glorot_normal.txt
@@ -37,6 +37,7 @@ class GlorotNormal(VarianceScaling)
 | Method resolution order:
 |     GlorotNormal
 |     VarianceScaling
+|     RandomInitializer
 |     keras.src.initializers.initializer.Initializer
 |     builtins.object
 |
@@ -51,3 +52,4 @@ class GlorotNormal(VarianceScaling)
 | Returns:
 |     A JSON-serializable Python dict.
 |
+
2 changes: 2 additions & 0 deletions .tether/man/initializer_glorot_uniform.txt
@@ -36,6 +36,7 @@ class GlorotUniform(VarianceScaling)
 | Method resolution order:
 |     GlorotUniform
 |     VarianceScaling
+|     RandomInitializer
 |     keras.src.initializers.initializer.Initializer
 |     builtins.object
 |
@@ -50,3 +51,4 @@ class GlorotUniform(VarianceScaling)
 | Returns:
 |     A JSON-serializable Python dict.
 |
+
2 changes: 2 additions & 0 deletions .tether/man/initializer_he_normal.txt
@@ -36,6 +36,7 @@ class HeNormal(VarianceScaling)
 | Method resolution order:
 |     HeNormal
 |     VarianceScaling
+|     RandomInitializer
 |     keras.src.initializers.initializer.Initializer
 |     builtins.object
 |
@@ -50,3 +51,4 @@ class HeNormal(VarianceScaling)
 | Returns:
 |     A JSON-serializable Python dict.
 |
+
2 changes: 2 additions & 0 deletions .tether/man/initializer_he_uniform.txt
@@ -36,6 +36,7 @@ class HeUniform(VarianceScaling)
 | Method resolution order:
 |     HeUniform
 |     VarianceScaling
+|     RandomInitializer
 |     keras.src.initializers.initializer.Initializer
 |     builtins.object
 |
@@ -50,3 +51,4 @@ class HeUniform(VarianceScaling)
 | Returns:
 |     A JSON-serializable Python dict.
 |
+
2 changes: 2 additions & 0 deletions .tether/man/initializer_lecun_normal.txt
@@ -40,6 +40,7 @@ class LecunNormal(VarianceScaling)
 | Method resolution order:
 |     LecunNormal
 |     VarianceScaling
+|     RandomInitializer
 |     keras.src.initializers.initializer.Initializer
 |     builtins.object
 |
@@ -54,3 +55,4 @@ class LecunNormal(VarianceScaling)
 | Returns:
 |     A JSON-serializable Python dict.
 |
+
2 changes: 2 additions & 0 deletions .tether/man/initializer_lecun_uniform.txt
@@ -36,6 +36,7 @@ class LecunUniform(VarianceScaling)
 | Method resolution order:
 |     LecunUniform
 |     VarianceScaling
+|     RandomInitializer
 |     keras.src.initializers.initializer.Initializer
 |     builtins.object
 |
@@ -50,3 +51,4 @@ class LecunUniform(VarianceScaling)
 | Returns:
 |     A JSON-serializable Python dict.
 |
+
4 changes: 3 additions & 1 deletion .tether/man/initializer_orthogonal.txt
@@ -1,6 +1,6 @@
 Help on class OrthogonalInitializer in module keras.src.initializers.random_initializers:
 
-class OrthogonalInitializer(keras.src.initializers.initializer.Initializer)
+class OrthogonalInitializer(RandomInitializer)
 | OrthogonalInitializer(gain=1.0, seed=None)
 |
 | Initializer that generates an orthogonal matrix.
@@ -37,6 +37,7 @@ class OrthogonalInitializer(keras.src.initializers.initializer.Initializer)
 |
 | Method resolution order:
 |     OrthogonalInitializer
+|     RandomInitializer
 |     keras.src.initializers.initializer.Initializer
 |     builtins.object
 |
@@ -66,3 +67,4 @@ class OrthogonalInitializer(keras.src.initializers.initializer.Initializer)
 | Returns:
 |     A JSON-serializable Python dict.
 |
+
4 changes: 3 additions & 1 deletion .tether/man/initializer_random_normal.txt
@@ -1,6 +1,6 @@
 Help on class RandomNormal in module keras.src.initializers.random_initializers:
 
-class RandomNormal(keras.src.initializers.initializer.Initializer)
+class RandomNormal(RandomInitializer)
 | RandomNormal(mean=0.0, stddev=0.05, seed=None)
 |
 | Random normal initializer.
@@ -33,6 +33,7 @@ class RandomNormal(keras.src.initializers.initializer.Initializer)
 |
 | Method resolution order:
 |     RandomNormal
+|     RandomInitializer
 |     keras.src.initializers.initializer.Initializer
 |     builtins.object
 |
@@ -63,3 +64,4 @@ class RandomNormal(keras.src.initializers.initializer.Initializer)
 | Returns:
 |     A JSON-serializable Python dict.
 |
+
4 changes: 3 additions & 1 deletion .tether/man/initializer_random_uniform.txt
@@ -1,6 +1,6 @@
 Help on class RandomUniform in module keras.src.initializers.random_initializers:
 
-class RandomUniform(keras.src.initializers.initializer.Initializer)
+class RandomUniform(RandomInitializer)
 | RandomUniform(minval=-0.05, maxval=0.05, seed=None)
 |
 | Random uniform initializer.
@@ -33,6 +33,7 @@ class RandomUniform(keras.src.initializers.initializer.Initializer)
 |
 | Method resolution order:
 |     RandomUniform
+|     RandomInitializer
 |     keras.src.initializers.initializer.Initializer
 |     builtins.object
 |
@@ -63,3 +64,4 @@ class RandomUniform(keras.src.initializers.initializer.Initializer)
 | Returns:
 |     A JSON-serializable Python dict.
 |
+
4 changes: 3 additions & 1 deletion .tether/man/initializer_truncated_normal.txt
@@ -1,6 +1,6 @@
 Help on class TruncatedNormal in module keras.src.initializers.random_initializers:
 
-class TruncatedNormal(keras.src.initializers.initializer.Initializer)
+class TruncatedNormal(RandomInitializer)
 | TruncatedNormal(mean=0.0, stddev=0.05, seed=None)
 |
 | Initializer that generates a truncated normal distribution.
@@ -36,6 +36,7 @@ class TruncatedNormal(keras.src.initializers.initializer.Initializer)
 |
 | Method resolution order:
 |     TruncatedNormal
+|     RandomInitializer
 |     keras.src.initializers.initializer.Initializer
 |     builtins.object
 |
@@ -66,3 +67,4 @@ class TruncatedNormal(keras.src.initializers.initializer.Initializer)
 | Returns:
 |     A JSON-serializable Python dict.
 |
+
4 changes: 3 additions & 1 deletion .tether/man/initializer_variance_scaling.txt
@@ -1,6 +1,6 @@
 Help on class VarianceScaling in module keras.src.initializers.random_initializers:
 
-class VarianceScaling(keras.src.initializers.initializer.Initializer)
+class VarianceScaling(RandomInitializer)
 | VarianceScaling(scale=1.0, mode='fan_in', distribution='truncated_normal', seed=None)
 |
 | Initializer that adapts its scale to the shape of its input tensors.
@@ -45,6 +45,7 @@ class VarianceScaling(keras.src.initializers.initializer.Initializer)
 |
 | Method resolution order:
 |     VarianceScaling
+|     RandomInitializer
 |     keras.src.initializers.initializer.Initializer
 |     builtins.object
 |
@@ -76,3 +77,4 @@ class VarianceScaling(keras.src.initializers.initializer.Initializer)
 | Returns:
 |     A JSON-serializable Python dict.
 |
+
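
All of the initializer diffs above reflect one upstream refactor: the random initializers now inherit from a shared `RandomInitializer` base class (see commit 862a4c5), which centralizes seed handling. Call-site behavior is unchanged; a quick reproducibility sketch:

```python
import keras

make = lambda: keras.initializers.VarianceScaling(scale=1.0, mode="fan_in", seed=42)
a = make()(shape=(2, 2))
b = make()(shape=(2, 2))
# Same seed, same values, regardless of the new base class:
print(keras.ops.all(keras.ops.equal(a, b)))  # True
```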