Diminishing weights #852

Answered by gbolmier
numersoz asked this question in Q&A
Feb 19, 2022 · 1 comment · 4 replies

Hi @numersoz 👋

Online learning algorithms naturally "forget" past observations over time as they learn from new ones. If you want to speed up that forgetting, you can use the sample_weight argument of your model's learn_one method. Here is a simple example that adds an increment of 0.01 to each new sample's weight:

# `model` is any river estimator whose learn_one accepts sample_weight,
# and `X_y` is any stream of (features, target) pairs
sample_weight, increment = 1, 0.01
for x, y in X_y:
    y_pred = model.predict_one(x)  # predict before learning (progressive validation)

    # each new sample gets a slightly larger weight than the previous one
    sample_weight += increment
    model.learn_one(x, y, sample_weight=sample_weight)
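
To make this concrete, here is a minimal, self-contained sketch of the same idea. The model and dataset choices (tree.HoeffdingTreeRegressor, datasets.TrumpApproval) are purely illustrative, and it assumes the estimator's learn_one accepts a sample_weight keyword as described above (some estimators name this argument differently, e.g. w, so check the signature you are using):

from river import datasets, metrics, tree

model = tree.HoeffdingTreeRegressor()  # illustrative choice, see note above
metric = metrics.MAE()

sample_weight, increment = 1, 0.01
for x, y in datasets.TrumpApproval():
    # predict before learning so the metric reflects out-of-sample performance
    y_pred = model.predict_one(x)
    metric.update(y, y_pred)

    # newer samples get progressively larger weights, speeding up forgetting
    sample_weight += increment
    model.learn_one(x, y, sample_weight=sample_weight)

print(metric)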

The drawback here is that you have to persist the last value of sample_weight and the value of increment in order to resume training later on. It could be nice to do that with a scheduler. I think the learn_one methods…
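
Until something like that exists, a workaround is to keep the weight state next to the model and persist both together, for instance with pickle. This is only a sketch under that assumption; the WeightedTrainer class and its attributes are made up for illustration and are not part of river's API:

import pickle

from river import tree

class WeightedTrainer:
    """Bundle a river model with the running sample_weight and increment
    so that all three survive a save/load round trip."""

    def __init__(self, model, sample_weight=1.0, increment=0.01):
        self.model = model
        self.sample_weight = sample_weight
        self.increment = increment

    def learn_one(self, x, y):
        self.sample_weight += self.increment
        # assumes the wrapped estimator's learn_one accepts sample_weight, as above
        self.model.learn_one(x, y, sample_weight=self.sample_weight)

    def predict_one(self, x):
        return self.model.predict_one(x)

trainer = WeightedTrainer(tree.HoeffdingTreeRegressor())
# ... train for a while, then save model + weight state in one go ...
with open("trainer.pkl", "wb") as f:
    pickle.dump(trainer, f)

# later: reload and keep incrementing from where training left off
with open("trainer.pkl", "rb") as f:
    trainer = pickle.load(f)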

Replies: 1 comment · 4 replies (from @numersoz, @MaxHalford, @nocluebutalotofit, @gbolmier)
Answer selected by MaxHalford
Category: Q&A · Labels: none · 4 participants