Hi @bking124, River does support dynamic feature sets. As you may know, River models take dictionaries as inputs, so a model learns from or predicts on whatever features are provided (exploiting the sparse nature of dicts). There is a short mention of this on the introduction page of the docs; I'm not sure there is more detailed content elsewhere. Just expect to occasionally run into cold-start issues when the feature distribution changes in a way that makes the patterns the model has already learned out of date. The model will naturally adapt to the new nature of the data after some time.
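To make the mechanism concrete, here is a minimal, dependency-free sketch of how a dict-based linear learner handles an evolving feature space. `DictLogisticRegression` is a hypothetical toy class written for illustration, not River's implementation; River's own `linear_model.LogisticRegression` applies the same idea to dict inputs.

```python
import math
from collections import defaultdict

class DictLogisticRegression:
    """Toy online logistic regression over dict inputs.

    Weights live in a defaultdict, so a feature enters the model the
    first time it is seen, and absent features simply contribute 0 --
    the sparsity trick that lets dict-based models cope with features
    appearing and disappearing over time.
    """

    def __init__(self, lr=0.1):
        self.lr = lr
        self.weights = defaultdict(float)
        self.bias = 0.0

    def predict_proba_one(self, x):
        # Only the features present in x are consulted.
        z = self.bias + sum(self.weights[f] * v for f, v in x.items())
        return 1.0 / (1.0 + math.exp(-z))

    def learn_one(self, x, y):
        # The gradient step touches only the features present in x.
        error = self.predict_proba_one(x) - y
        self.bias -= self.lr * error
        for f, v in x.items():
            self.weights[f] -= self.lr * error * v

model = DictLogisticRegression()

# Early stream: features "a" and "b".
for _ in range(100):
    model.learn_one({"a": 1.0, "b": 0.5}, 1)
    model.learn_one({"a": -1.0, "b": -0.5}, 0)

# Later the feature space evolves: "a" drops away, "c" appears.
# The model keeps learning without any schema change.
for _ in range(100):
    model.learn_one({"b": 0.5, "c": 2.0}, 1)
    model.learn_one({"b": -0.5, "c": -2.0}, 0)

print(sorted(model.weights))  # all features ever seen: ['a', 'b', 'c']
```

The cold-start caveat from the reply shows up here too: right after "c" first appears, its weight is still 0, so predictions lean on the stale "a"/"b" weights until enough of the new stream has been seen.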
I have recently stumbled upon River and am still learning about all of its features, so please excuse me if I have missed an easy answer to my question in the documentation.
From my understanding, the models in River can handle all sorts of concept drift, but they generally assume that the feature set stays the same size. However, I am interested in handling situations where new features may pop up and old features may drop away over time. In recent literature this seems to be called 'feature evolvable streams' or 'varying feature spaces'. Some recent research I've found:
So my question is: are River models capable of handling such streams? I found an old discussion (#625) on a similar topic; the consensus seemed to be that most models would ignore any new features observed at test/inference time, while missing features would be handled in different ways depending on the algorithm. However, that thread is 2.5+ years old, and River has likely changed significantly since then, so I thought it was worth raising the question again.