Update common.py #1608
base: master
Conversation
A suggestion for a fix. This is not clean, but if it works for more people, I can clean it up.
```diff
@@ -60,7 +60,7 @@ def prepare_setup_entities(hass, config_entry, platform):


 async def async_setup_entry(
-    domain, entity_class, flow_schema, hass, config_entry, async_add_entities
+    domain, entity_class, flow_schema, hass, config_entry, async_add_entities
```
Remove the whitespace-only changes.
```diff
@@ -183,9 +184,10 @@ def async_connect(self):


     async def _make_connection(self):
         """Subscribe localtuya entity events."""
-        self.info("Trying to connect to %s...", self._dev_config_entry[CONF_HOST])
+        self.warning("Trying to connect to %s...", self._dev_config_entry[CONF_HOST])
```
It's better to change the log level in the configuration rather than making a code change here.
Please advise how; I'd be happy to know.
In `configuration.yaml`:

```yaml
logger:
  default: warning
  logs:
    custom_components.localtuya: debug
    custom_components.localtuya.pytuya: debug
```
```diff
+            self.info(f"Connect to {self._dev_config_entry[CONF_HOST]} but not sure connection established")
+            self._make_connection_local_key = self._make_connection_local_key + 1
+            if self._make_connection_local_key >= 5:
+                self._make_connection_local_key = 0
+                self.info("Trying to receive a new key...")
+                await self.update_partial_local_key()
```
It seems a bit strange to do this at connection time - maybe move this to the disconnect handler? That way you're tracking / reacquiring the key after 5 failures (disconnects), instead of after 5 successful connections.
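The reviewer's suggestion could be sketched roughly as below. This is a hypothetical stand-in class, not the actual localtuya `TuyaDevice`; the name `update_partial_local_key` is borrowed from the PR, and the counter/reset logic is an assumption about how a disconnect handler might track consecutive failures:

```python
import asyncio


class DeviceStub:
    """Hypothetical sketch: reacquire the local key after 5 disconnects,
    instead of counting successful connections as the PR does."""

    def __init__(self):
        self._disconnect_count = 0
        self.key_refreshes = 0  # for illustration only

    async def update_partial_local_key(self):
        # In localtuya this would fetch a fresh key from the cloud API;
        # here we just count the calls.
        self.key_refreshes += 1

    async def disconnected(self):
        """Disconnect handler: after 5 consecutive failures, try a new key."""
        self._disconnect_count += 1
        if self._disconnect_count >= 5:
            self._disconnect_count = 0
            await self.update_partial_local_key()

    async def connected(self):
        """A successful connection resets the failure counter."""
        self._disconnect_count = 0


async def main():
    dev = DeviceStub()
    for _ in range(5):
        await dev.disconnected()
    return dev.key_refreshes


print(asyncio.run(main()))  # → 1: key reacquired once after 5 disconnects
```

The point of the counter reset in `connected()` is that only *consecutive* failures trigger a key refresh, which matches the "5 failures (disconnects)" framing above.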
I tried to debug the code and this looked like a place that fits the requirement.
As written in the post, I am fairly new to HA/localtuya. I solved a problem I had in TuyaSmart and saw that a similar issue exists in localtuya: when a device is reconfigured in TuyaSmart it gets a new key, and to acquire a new key in localtuya you currently need to restart HA.
Anyway, I will try your suggestion, or if you have managed to try it and it solves the issue, please let me know :)
Thanks!