
Adding additional signals (e.g. for computed_field) #337

Open
tlambert03 opened this issue Nov 8, 2024 · 2 comments · May be fixed by #340

Comments

@tlambert03
Member

from @sjdemartini in #334:

Side note/ask: It would be nice if the @computed_field could be added to the model.events SignalGroup even without a setter, so that at least we could manually model.events.my_computed_field.emit() too.

see also, discussion starting here: #334 (comment)

@tlambert03
Member Author

So @sjdemartini, am I correct that the gist of this is essentially that you'd just like a mechanism to add signals in addition to those derived from fields? Or is it more specifically that computed_fields should be treated no differently from other fields on the model, and therefore receive an event (even if it's not clear how they would be emitted)?

I can imagine a few solutions:

  1. we just go ahead and create events for all fields, including computed_fields, and don't worry about the issue I mentioned in Adding Pydantic v2 @computed_field decorator prevents class creation with dependencies #334 (comment)
  2. you could just manually add a SignalInstance to the Model.events group in your __init__ method. That's more of an end-user workaround
  3. We could add something to the model_config like extra_events: {'event_name': (int, str)}. This would not magically add computed_fields, but would allow complete freedom to define additional events.

I don't love any of these, since they all feel slightly unexpected to me. I might be happiest with number 2, but only if you @sjdemartini also feel it's a satisfying and natural solution. Would you perhaps be able to give me a quick toy example, expanding on your PrivateAttr example, that shows exactly how you might emit the signal?
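For illustration, workaround (2) could be wired up roughly like this. Note this is a minimal stand-in sketch, not psygnal's actual API: `MiniSignal` and the hand-rolled `events` namespace are hypothetical placeholders for `SignalInstance` and the generated `SignalGroup`, used here only so the attach-and-emit pattern is visible end to end.

```python
from types import SimpleNamespace


class MiniSignal:
    """Tiny hypothetical stand-in for psygnal.SignalInstance."""

    def __init__(self) -> None:
        self._slots = []

    def connect(self, slot) -> None:
        self._slots.append(slot)

    def emit(self, *args) -> None:
        for slot in self._slots:
            slot(*args)


class MyModel:
    def __init__(self) -> None:
        self._items_dict: dict[str, int] = {}
        # psygnal's EventedModel builds `events` from model fields;
        # here we hand-roll the namespace and bolt on an extra signal.
        self.events = SimpleNamespace(item_names=MiniSignal())

    @property
    def item_names(self) -> list[str]:
        return list(self._items_dict)

    def add_item(self, name: str, value: int) -> None:
        self._items_dict[name] = value
        # manual emission, since nothing recomputes this automatically
        self.events.item_names.emit(self.item_names)


model = MyModel()
seen: list[list[str]] = []
model.events.item_names.connect(seen.append)
model.add_item("a", 1)
# seen is now [["a"]]
```

The same pattern would apply with a real SignalInstance attached to the model's events group at init time, assuming the group tolerates extra instance attributes.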

@sjdemartini
Contributor

tl;dr: "custom" events with approaches like (1), (2), or (3) seem unnecessary if:
a) computed_fields without setters could still have automatic event emission based on field_dependencies (this seems uncontroversially worth supporting, but please let me know if there's some reason it shouldn't be)
b) PrivateAttrs could be used as field_dependencies


So @sjdemartini, am I correct that the gist of this is essentially that you'd just like a mechanism to add signals in addition to those derived from fields? Or is it more specifically that computed_fields should be treated no differently from other fields on the model, and therefore receive an event (even if it's not clear how they would be emitted)?

@tlambert03 It's more the latter. I don't currently have a need to add miscellaneous custom events, but I do have @computed_fields that are serialized for clients, and for which I want to be able to publish events (or, even more ideally, have them published automatically when the @computed_field would be updated based on its dependencies).

Here's a toy example which perhaps illustrates the use-case:

from psygnal import EventedModel
from pydantic import computed_field, PrivateAttr

class MyModel(EventedModel):
    _items_dict: dict[str, int] = PrivateAttr(default_factory=dict)

    @computed_field
    @property
    def item_names(self) -> list[str]:
        return list(self._items_dict.keys())

    @computed_field
    @property
    def item_sum(self) -> int:
        return sum(self._items_dict.values())

    def add_item(self, name: str, value: int) -> None:
        if name in self._items_dict:
            raise ValueError(f"Name {name} already exists!")
        # Do other validation here...
        self._items_dict = {**self._items_dict, name: value}
        # --> Emit change events here manually, if Psygnal can't handle it automatically

        # This could alternatively use the following instead for better perf, though the above could work more
        # "automatically" for psygnal event emission if PrivateAttr were supported as a computed dependency:
        # self._items_dict[name] = value


    # Ideally the following would work
    class Config:
        field_dependencies = {
            "item_names": ["_items_dict"],
            "item_sum": ["_items_dict"],
        }

In my case, I'm serializing my Pydantic model for a web client and sending updates based on Psygnal events (via WebSocket). I want the client to be able to see and use the computed_fields, but I don't want to expose the underlying data structures (like _items_dict above) to consuming Python or client-side code, since those structures should be abstracted away. The Python server-side code uses higher-level methods (like add_item above). So it would be nice if a consumer could subscribe to "all" events (with model.events.connect) and be notified of changes to the computed_fields resulting from methods like add_item.

As with the last example you gave in #334 (comment), it would be great to set up configuration like field_dependencies = {"a_squared": ["a"]} without needing a setter on a_squared. You can see how, for an example like the one above, a setter for one of these computed_fields doesn't really make sense, yet event emission based on the dependency is entirely predictable.
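To make the "entirely predictable" part concrete, here is a minimal sketch of the emission logic a setter-less field_dependencies entry would imply: on each assignment to a dependency, recompute the dependents and emit only when the value actually changed. `DepModel` and its `emitted` list are hypothetical stand-ins for the evented machinery, not psygnal internals.

```python
class DepModel:
    # declared dependency map, mirroring field_dependencies = {"a_squared": ["a"]}
    field_dependencies = {"a_squared": ["a"]}

    def __init__(self, a: int) -> None:
        self._a = a
        self.emitted: list[tuple[str, int]] = []  # stand-in for signal emission

    @property
    def a(self) -> int:
        return self._a

    @a.setter
    def a(self, value: int) -> None:
        # snapshot dependents of "a" before the change
        dependents = [f for f, deps in self.field_dependencies.items() if "a" in deps]
        before = {f: getattr(self, f) for f in dependents}
        self._a = value
        for f in dependents:
            new = getattr(self, f)
            if new != before[f]:  # emit only on a real value change
                self.emitted.append((f, new))

    @property
    def a_squared(self) -> int:  # read-only computed field, no setter needed
        return self.a ** 2


m = DepModel(2)
m.a = 3
# m.emitted == [("a_squared", 9)]
m.a = -3
# no new emission: (-3) ** 2 == 9, so a_squared is unchanged
```

Nothing here requires a setter on a_squared: the dependency map alone is enough to decide when to emit.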

This may be a separate discussion, but is there a specific reason that PrivateAttrs shouldn't be usable as dependencies in psygnal? From the consumer side (the user of the evented model), they don't need to know (and can't affect) what the underlying dependencies of any given field are, so it seems like the person defining the class should be able to declare a computed field as dependent on internal/private attributes. (Perhaps this is very cumbersome to support in Psygnal's internals given Pydantic's API, so if you have an alternative suggestion, please let me know.)


For what it's worth, my current workaround has been to define custom signals as ClassVar Signals on my Pydantic model class (though those don't get added to obj.events, of course, so they can't be subscribed to that way). Something like the following:

from typing import ClassVar
from psygnal import EventedModel, Signal
from pydantic import ConfigDict, PrivateAttr, computed_field, field_serializer

class MyModel(EventedModel):
    item_names_changed: ClassVar[Signal] = Signal(list[str])
    ...
    def add_item(self, name: str, value: int) -> None:
        ...
        self.item_names_changed.emit(self.item_names)

The consumer then needs to subscribe to both (model.events.connect(...) and model.item_names_changed.connect(...)).


you could just manually add a SignalInstance to the Model.events group in your __init__ method. That's more of an end-user workaround

I don't have a deep enough familiarity with the APIs for events and SignalInstances to know how this would look, but it sounds decent. I'm also not familiar with how to manually emit a "change" event for a signal that's part of the signal group (ideally both for the field and for the overall group, which I assume is how it works). A minor detail is that I'd probably prefer to override model_post_init instead of __init__ if possible, so Pydantic can set up its automatic constructor and we get the benefits of IDE autocomplete, improved static type checking, etc. for all the model fields during construction.
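The two-level emission described above (per-field plus whole-group) could be sketched like this. `MiniSignal` and `EventsGroup` are illustrative stand-ins, not psygnal classes, and the relay behavior is an assumption about how a real SignalGroup aggregates its child signals.

```python
class MiniSignal:
    """Tiny hypothetical stand-in for a psygnal signal instance."""

    def __init__(self) -> None:
        self._slots = []

    def connect(self, slot) -> None:
        self._slots.append(slot)

    def emit(self, *args) -> None:
        for slot in self._slots:
            slot(*args)


class EventsGroup:
    """Per-field signals plus one aggregate relay, SignalGroup-style."""

    def __init__(self, *field_names: str) -> None:
        self.all = MiniSignal()  # group-level relay
        for name in field_names:
            setattr(self, name, MiniSignal())

    def emit_change(self, field: str, new_value) -> None:
        getattr(self, field).emit(new_value)  # field-level notification
        self.all.emit(field, new_value)       # group-level notification


events = EventsGroup("item_names", "item_sum")
field_seen, group_seen = [], []
events.item_names.connect(field_seen.append)
events.all.connect(lambda f, v: group_seen.append((f, v)))
events.emit_change("item_names", ["a"])
# field_seen == [["a"]]; group_seen == [("item_names", ["a"])]
```

A subscriber can then pick either granularity: one specific field's signal, or the aggregate relay for "all" changes.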

@tlambert03 tlambert03 linked a pull request Nov 21, 2024 that will close this issue