
Integration of an external BLE proxy into the Tesla template #14620

Closed · wants to merge 4 commits

Conversation

@wimaha (Contributor) commented Jun 29, 2024

Adds the possibility to define a BLE proxy and use it instead of the Tesla Fleet API to control the vehicle.

vehicles:
  - name: tesla
    type: template
    template: tesla
    accessToken: TOKEN
    refreshToken: TOKEN
    bleHost: IP
    blePort: 8080

The BLE proxy has to use the same API scheme as the Fleet API.
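To illustrate what "same API scheme" means in practice, here is a minimal sketch: since the proxy mirrors the Fleet API's command paths, switching between them only changes the base URL. The helper name `buildCommandURL` is hypothetical (not evcc's actual code); `bleHost`/`blePort` follow the config keys above, and the Fleet API base URL shown is the North America regional endpoint.

```go
package main

import "fmt"

// buildCommandURL is a hypothetical helper: if a BLE host is configured,
// route vehicle commands to the local proxy, otherwise to the Fleet API.
// The path format mirrors the Fleet API's vehicle command endpoints.
func buildCommandURL(bleHost string, blePort int, vin, command string) string {
	if bleHost != "" {
		// local BLE proxy, same paths as the Fleet API
		return fmt.Sprintf("http://%s:%d/api/1/vehicles/%s/command/%s", bleHost, blePort, vin, command)
	}
	return fmt.Sprintf("https://fleet-api.prd.na.vn.cloud.tesla.com/api/1/vehicles/%s/command/%s", vin, command)
}

func main() {
	fmt.Println(buildCommandURL("192.168.1.50", 8080, "5YJ3E1EA7JF000000", "charge_start"))
}
```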

  • Use BLE proxy if defined
  • Don't fetch current when using BLE proxy (because of API limits)
  • Built locally and tested the control part
  • Updated docs
  • Maybe: add more explanation in the docs?

Helps with #14252 and #14226.

wimaha added 4 commits June 29, 2024 10:04
-	if v.current >= 6 {
-		// assume match above 6A to save API requests
+	if v.current >= 6 || v.ble {
+		// assume match above 6A to save API requests OR if bleVehicle is active
 		return float64(v.current), nil
Contributor commented:

I would be cautious with that as, IMHO, it will bring back issue #13007 where going below 5A is ignored by the Tesla the first time. This feedback loop allows evcc to detect the current mismatch and send the command again.

Contributor Author commented:

We don't have this issue with BLE. I've used it for multiple days now, and I always charge with less than 5A.

Contributor commented:

Ok, good to read.

I'm also using your BLE proxy with TeslaMate, and I did face this issue, however. Maybe I'm a special case, as I also charge a non-Tesla vehicle on my TWC3 ;-)

Anyway I fixed it by fetching the actual current from TeslaMate. I added that in my custom vehicle definition and then evcc started to detect and automatically fix the current mismatch:

GetMaxCurrent:
    source: mqtt
    topic: teslamate/cars/2/charger_actual_current
    timeout: 8h
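The mechanism the comment describes, where evcc reads back the actual current and re-sends the command on mismatch, can be sketched as below. This is an illustration of the feedback loop, not evcc's actual implementation; the function and parameter names are invented, with `actual` standing in for a reading such as the TeslaMate MQTT topic above.

```go
package main

import "fmt"

// ensureCurrent sketches the mismatch feedback loop: request a current,
// read back the actual charger current, and re-send once on mismatch
// (the vehicle may ignore the first command below 5A, see #13007).
func ensureCurrent(requested int64, actual func() int64, send func(int64)) bool {
	send(requested)
	if actual() != requested {
		// vehicle ignored the change - retry once
		send(requested)
	}
	return actual() == requested
}

func main() {
	state := int64(6)
	send := func(a int64) { state = a }
	fmt.Println(ensureCurrent(5, func() int64 { return state }, send))
}
```

Note that skipping this read-back entirely (as this PR does for BLE) trades the safety net against Fleet API rate limits; with a local source like MQTT the read-back costs no API calls.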

Contributor Author commented:

That's annoying. However, if we query the Tesla server every time there is a change in charging speed, we will hit the API limit. Therefore, I think this query must be removed if we want to avoid the API limits.

@FraBoCH (Contributor) commented Jun 29, 2024

It's funny: I proposed almost the same PR (#14616) yesterday, as I wanted to push progress on that topic. @andig, you can really choose the one you prefer, no hard feelings on my side, just happy to see things progressing!

@andig marked this pull request as draft June 29, 2024 11:34
@wimaha (Contributor Author) commented Jun 29, 2024

I'm closing this because I think #14616 is more advanced.

@wimaha closed this Jun 29, 2024