💅 Final polishing, and better understanding of how the demo works #92

Closed
2 of 7 tasks
shankari opened this issue Nov 18, 2024 · 27 comments · Fixed by #97
Comments

@shankari
Collaborator

shankari commented Nov 18, 2024

I think the demo is in pretty good shape for the kind of high-level, superficial exploration that we can do in 15 mins. But there are still some inconsistencies I found that I would like to take the time to fix. Some of these may just be my lack of understanding (e.g. #90), but then I would like to deepen my understanding to be ready for questions during the office hours.

I will keep a summary in the description and the details below.

  • Change labels on demo to "user specified" and "MIDAS grid integration"; added dots to the chart
  • why is there a gap between the max current (32A) and the current limit in the default charging profile (48A)?
    • because one of them is the default profile (specified in libocpp config) and the other is the hardware limit
    • I guess theoretically they can be different, but maybe they shouldn't be different by default? Discuss with the community, make them the same for this demo.
  • where is the value in the gauge coming from, and why is it so high? We are seeing something like 10kW, but 240 x 32 = 7680, 240 x 40 = 9600 and 240 x 48 = 11520. Maybe this is 240 x 48, but we should verify first.
  • why do we not use charging curves when the departure time is not set?
  • why do we specify an SA Schedule in Watts instead of Amps, given that this is AC charging?
  • can we make more of a case for complex SA schedules to highlight the impact of departure time?
  • have we verified the "you can't get enough energy" use case?
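The gauge arithmetic in the list above can be sanity-checked with a quick sketch (the voltage and current limits are just the values quoted in this issue, not authoritative config):

```python
# Back-of-the-envelope check of the candidate explanations for the gauge value.
def ac_power_w(voltage_v, current_a):
    # Single-figure AC power estimate: P = V * I
    return voltage_v * current_a

for amps in (32, 40, 48):
    print(f"240 V x {amps} A = {ac_power_w(240, amps)} W")
# prints 7680 W, 9600 W and 11520 W, bracketing the ~10 kW seen on the gauge
```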
@shankari
Collaborator Author

wrt max current, it is set via node_red using everest_external/nodered/#/cmd/set_max_current

which invokes

    // external input to charger: update max_current and new validUntil
    bool set_max_current(float ampere, std::chrono::time_point<date::utc_clock> validUntil);

which checks that the requested current is within bounds

    if (c >= 0.0 and c <= CHARGER_ABSOLUTE_MAX_CURRENT) {

It then sets it on the BSP and signals:

            bsp->set_overcurrent_limit(c);
            signal_max_current(c);

This is the absolute max, so it is clearly not the source of the 48

    static constexpr float CHARGER_ABSOLUTE_MAX_CURRENT{1000.};

and it is clearly published from

    mod->charger->signal_max_current.connect([this](float c) {
        mod->mqtt.publish(fmt::format("everest_external/nodered/{}/state/max_current", mod->config.connector_id), c);
        limits.uuid = mod->info.id;

so where is it invoked from?

It is initially invoked in the giant EvseManager::ready

    //  start with a limit of 0 amps. We will get a budget from EnergyManager that is locally limited by hw
    //  caps.
    charger->set_max_current(0.0F, date::utc_clock::now() + std::chrono::seconds(120));

and the energy manager uses https://github.com/EVerest/everest-core/blob/828072742f816d74d44d55fdf13d01c8fbecd449/modules/EvseManager/energy_grid/energyImpl.cpp#L353

            if (value.limits_root_side.value().ac_max_current_A.has_value()) {
                limit = value.limits_root_side.value().ac_max_current_A.value();
            }

...


                float a = value.limits_root_side.value().total_power_W.value() / mod->config.ac_nominal_voltage /
                          mod->ac_nr_phases_active;

But the only energy limit that I see configured is

  grid_connection_point:
    module: EnergyNode
    config_module:
      fuse_limit_A: 40.0
      phase_count: 3

Need to add some logs here to see what is going on.
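As a side note, the watts-to-amps conversion in the energyImpl snippet above can be rendered as a minimal Python sketch (the 230 V nominal voltage and 3 active phases are assumed illustration values, not config truth):

```python
def watts_limit_to_amps(total_power_w, ac_nominal_voltage, nr_phases_active):
    # Mirrors the EvseManager conversion above: a total-power (W) limit
    # becomes a per-phase current (A) limit.
    return total_power_w / ac_nominal_voltage / nr_phases_active

# e.g. a 33120 W limit on a 230 V, 3-phase connection:
print(watts_limit_to_amps(33120, 230.0, 3))  # 48.0 A
```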

@shankari
Collaborator Author

In parallel, let's figure out where the default composite schedule comes from. The composite schedules that we use come from

std::vector<CompositeSchedule> ChargePoint::get_all_composite_schedules(const int32_t duration_s,
                                                                        const ChargingRateUnitEnum& unit) {

which calls

        auto schedule = this->smart_charging_handler->calculate_composite_schedule(

Bingo! Here's where the defaults are set
https://github.com/EVerest/libocpp/blob/925e9cd3049faf6d31e496e33aad619501f5c3d9/lib/ocpp/v201/smart_charging.cpp#L632

    const auto default_amps_limit =
        this->device_model->get_optional_value<int>(ControllerComponentVariables::CompositeScheduleDefaultLimitAmps)
            .value_or(DEFAULT_LIMIT_AMPS);
    const auto default_watts_limit =
        this->device_model->get_optional_value<int>(ControllerComponentVariables::CompositeScheduleDefaultLimitWatts)
            .value_or(DEFAULT_LIMIT_WATTS);

and looking at the /ext/dist/share/everest/modules/OCPP201/component_config/standardized/SmartChargingCtrlr.json, we do indeed see 48 A and 33120 W. This was added in EVerest/libocpp@8d74ff5. I looked through the commit, and the related PR, and there was no explanation for the defaults.
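The value_or() pattern above is just "use the configured value if present, else the compiled-in default"; a sketch, with the device model modeled as a plain dict purely for illustration:

```python
DEFAULT_LIMIT_AMPS = 48       # compiled-in fallbacks; the shipped JSON config
DEFAULT_LIMIT_WATTS = 33120   # happens to carry the same 48 A / 33120 W values

def composite_default_limit(device_model, unit):
    # Mirrors get_optional_value(...).value_or(DEFAULT_...)
    if unit == "A":
        return device_model.get("CompositeScheduleDefaultLimitAmps", DEFAULT_LIMIT_AMPS)
    return device_model.get("CompositeScheduleDefaultLimitWatts", DEFAULT_LIMIT_WATTS)

print(composite_default_limit({}, "A"))                                         # 48 (fallback)
print(composite_default_limit({"CompositeScheduleDefaultLimitAmps": 16}, "A"))  # 16 (configured)
```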

@shankari
Collaborator Author

shankari commented Nov 18, 2024

Going back to the other limits, they are indeed set from the fuse_limit_A

modules/EnergyNode/energy_grid/energyImpl.cpp

    local_schedule.limits_to_root.ac_max_phase_count = mod->config.phase_count;
    local_schedule.limits_to_root.ac_max_current_A = mod->config.fuse_limit_A;
    local_schedule.limits_to_leaves.ac_max_phase_count = mod->config.phase_count;
    local_schedule.limits_to_leaves.ac_max_current_A = mod->config.fuse_limit_A;

or

                if (!e.limits_to_root.ac_max_current_A.has_value() ||
                    e.limits_to_root.ac_max_current_A.value() > mod->config.fuse_limit_A)
                    e.limits_to_root.ac_max_current_A = mod->config.fuse_limit_A;

                if (!e.limits_to_root.ac_max_phase_count.has_value() ||
                    e.limits_to_root.ac_max_phase_count.value() > mod->config.phase_count)
                    e.limits_to_root.ac_max_phase_count = mod->config.phase_count;

The default config has it at 32, but this variable is set to 40 in our config

This has been true since the very first commit

Let's add some logs and figure out what is going on...
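For reference, the clamping above boils down to "an unset or larger limit is replaced by the fuse limit"; a minimal sketch:

```python
def clamp_to_fuse(ac_max_current_a, fuse_limit_a):
    # EnergyNode behavior: if the incoming limit is unset, or exceeds the
    # fuse limit, the fuse limit wins; smaller limits pass through.
    if ac_max_current_a is None or ac_max_current_a > fuse_limit_a:
        return fuse_limit_a
    return ac_max_current_a

print(clamp_to_fuse(None, 40.0))  # 40.0 (unset -> fuse limit)
print(clamp_to_fuse(48.0, 40.0))  # 40.0 (dialed down)
print(clamp_to_fuse(32.0, 40.0))  # 32.0 (lower limits win)
```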

@shankari
Collaborator Author

Rooting around within the energyImpl, I also see another location from which we can get the max current: the hardware capabilities.

    entry_import.limits_to_root.ac_max_current_A = hw_caps.max_current_A_import;
    entry_import.limits_to_root.ac_min_current_A = hw_caps.min_current_A_import;

Let's see where they are coming from. Bingo! They come from the hardware (aka the BSP), and in our simulator, the BSP is the JSYetiSimulator, which does have it defined as 32.

Note also that the API publishes these values once a minute

Let's get some more logs, and then make sure that we set all the locations to 32 and see if everything is more consistent then!

@shankari
Collaborator Author

shankari commented Nov 18, 2024

Logs confirm that the limits are coming from the hardware capabilities. I am not seeing the call to the fuse API happen yet

2024-11-18 07:25:14.773429 [INFO] evse_manager_2:  :: Clearing import request schedule by setting max current from hw_caps = 32
2024-11-18 07:25:14.773564 [INFO] evse_manager_2:  :: Clearing export request schedule by setting max current from hw_caps = 32
2024-11-18 07:25:14.773658 [INFO] evse_manager_2:  :: Clearing import request schedule by setting max current from hw_caps = 32
2024-11-18 07:25:14.773744 [INFO] evse_manager_2:  :: Clearing export request schedule by setting max current from hw_caps = 32
2024-11-18 07:25:14.774441 [INFO] evse_manager_2:  :: Clearing import request schedule by setting max current from hw_caps = 32
2024-11-18 07:25:14.774556 [INFO] evse_manager_2:  :: Clearing export request schedule by setting max current from hw_caps = 32

@shankari
Collaborator Author

Double-checked this morning with a fresh pair of eyes, and can confirm that:

  • we do see the value from hw_capabilities
2024-11-18 15:05:35.427388 [INFO] evse_manager_1:  :: Handle enforce limits with ac_max_current_A = 32
2024-11-18 15:05:35.481733 [INFO] evse_manager_1:  :: Clearing import request schedule by setting max current from hw_caps = 32
2024-11-18 15:05:35.481918 [INFO] evse_manager_1:  :: Clearing export request schedule by setting max current from hw_caps = 32
2024-11-18 15:05:36.483674 [INFO] evse_manager_1:  :: Clearing import request schedule by setting max current from hw_caps = 32
2024-11-18 15:05:36.483872 [INFO] evse_manager_1:  :: Clearing export request schedule by setting max current from hw_caps = 32
  • we don't see any updates from the fuse (even after plugging in)

  • it is still mismatched with the charge profile.

2024-11-18 15:09:28.420556 [INFO] ocpp:OCPP201     :: {
    "chargingRateUnit": "A",
    "chargingSchedulePeriod": [
        {
            "limit": 48.0,
            "numberPhases": 3,
            "startPeriod": 0
        }
    ],
    "duration": 600,
    "evseId": 0,
    "scheduleStart": "2024-11-18T15:09:28.000Z"
}
2024-11-18 15:09:28.421033 [INFO] ocpp:OCPP201     :: {
    "chargingRateUnit": "A",
    "chargingSchedulePeriod": [
        {
            "limit": 48.0,
            "numberPhases": 3,
            "startPeriod": 0
        }
    ],
    "duration": 600,
    "evseId": 1,
    "scheduleStart": "2024-11-18T15:09:28.000Z"
}

Maybe that is OK - the CSMS thinks that the station can have 48 A although the station has a hardware limit of 32, and then the limit should be 32. But shouldn't that 32 be reflected as an external constraint then? Need to discuss with the community @the-bay-kay and @Abby-Wheelis.

When plugged in, the kW also seems to reflect the 48 (240 * 48 = 11.5 kW), which is wrong since we can only provide 32 A at the hardware level. Going to investigate that next...


@the-bay-kay

why do we not use charging curves when the departure time is not set?

It seems there are a few different issues occurring in the powercurve generation. For one, we're not generating any of the previews, because the exec module is failing to run. The first error we get is:

/bin/bash: line 1: /usr/src/node-red/python_environments/everest/bin/python3: No such file or directory

So, we can assume our venv isn't configured correctly. As a sanity check, we attempt to run with just python3...

python3: can't open file '/bin/scripts/preview_curve_4_nodered.py': [Errno 2] No such file or directory

... and we fail to find the script entirely. Running find / -name preview_curve_4_nodered.py in the node-red container shows that our script hasn't been copied over. The script is present in the repository, but the steps to copy it over are absent. Originally, I implemented this as setup within the Node-RED Dockerfile -- would it be better if we set this up in a separate script, as we now do with the manager patches here?

@the-bay-kay

When plugged in, the kW also seems to reflect the 48 (240 * 48 = 11.5kw) which is wrong since we can only provide 32A at the hardware level. Going to investigate that next...

Will update this comment as I find more details, but wanted to link to my findings on the PowerMeter issues here before I forget (this thread was originally concerned with piping through the ChargingProfileSchedules, but the findings concerning JSYetiSimulator's PWM implementation should remain relevant).

@shankari
Collaborator Author

Ah, maybe I missed that while porting over the changes. Doing it in the Dockerfile is fine; there were just soooo many patches for the manager that I thought it would be tidier to put them into scripts. But the nodered Dockerfile is fairly simple now.

@the-bay-kay

the-bay-kay commented Nov 19, 2024

Some updates on the Node-RED changes!

  • I've edited the Dockerfile to include the script setup, and confirmed it's working
  • The Node-RED flows have been updated such that the powercurve preview graphs now better reflect our non-DT behavior.
    • The last change I will need to make here is to update the "actively charging" curve, s.t. our graph reflects grid clamps.

Video of existing changes included in the cut below -- I'll upload my existing changes and link to the fork in this comment. The fork can be found here -- this builds Node-RED locally; I can upload an image if that'd be preferred! We shouldn't need to change Node-RED beyond the script setup.

Changes to Node-RED Demo
compressed_SECCSchedule.mp4

@shankari shankari added this to the CharIN demo prep milestone Nov 19, 2024
@shankari
Collaborator Author

So I finally figured out the value in the gauge. And it is in fact tied back to the 48 and some confusion around naming.

Starting with basic electrical engineering: AC, unlike DC, has phases. In the US, we typically use 3-phase charging. The total power is drawn across the three phases. Most chargers in the US apparently use 40-48 amps on a 240 V line, giving us around 11520 W, or 11.5 kW. While supplying 48 A of current, the total current is split between three phases, giving us ~16 A per phase. That is the L1, L2, L3 current/voltage that we see in the powermeter output logs.

The demo/simulator seems to have some inconsistencies between values. Going forward, we need to decide:

  • what can we do for the demo?
  • what the settings should be going forward? (deferring this to @Abby-Wheelis)

The inconsistencies are:

2024-11-18 16:44:11.562515 [INFO] car_simulator_1  :: {
  cmd: 'iso_draw_power_regulated',
  args: [ 16, 3 ],
  exec: [Function (anonymous)]
}

So those are internally consistent and make sense.

The 32A comes from a different location - the hardware capabilities of the charger, defined at
https://github.com/EVerest/everest-core/blob/828072742f816d74d44d55fdf13d01c8fbecd449/modules/simulation/JsYetiSimulator/index.js#L1437

It appears, though, that this capability is not specified at the phase level, but is combined. But the slider on the UI is at the phase level. That makes the slider confusing. We see it at 32A, but nothing starts happening to the power drawn gauge until we get down below 16A.

So how can we fix this for the demo?

  • We could make all the numbers be consistent, which may or may not be needed. I do think it is a bad idea to combine a charger that is only capable of supporting 32A on a grid tie that can support 48, but I am not enough of an electrical engineer to know why. Or if that represents 32A per phase, it seems like overkill since we are only drawing 16A per phase.
  • We need to have the display and the slider matched - either they are both combined or both per-phase. I would suggest changing the display to be per-phase since that seems to be easier to think about and consistent with the way the power draw ISO message is sent. And then we don't need to figure out how to ensure that the values selected are a multiple of 3.

We should then go over all this and understand which limit is being specified where, and unify them for readability.

Finally, the simulation settings for the power meter are defined in modules/simulation/JsYetiSimulator/index.js.
The voltage there is 230 V, not 240, which is how we get 230 × 48 ≈ 11 kW.

  mod.simdata_setting = {
    cp_voltage: 12.0,
    pp_resistor: 220.1,
    impedance: 500.0,
    rcd_current: 0.1,
    voltages: { L1: 230.0, L2: 230.0, L3: 230.0 },
    currents: {
      L1: 0.0, L2: 0.0, L3: 0.0, N: 0.0,
    },
    frequencies: { L1: 50.0, L2: 50.0, L3: 50.0 },
  };
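Given those per-phase settings, the gauge value falls out of a simple sum of V × I over the phases (a sketch using the simdata voltages above and the ~16 A per-phase draw seen in the logs):

```python
voltages = {"L1": 230.0, "L2": 230.0, "L3": 230.0}  # from simdata_setting above

def total_power_w(voltages_by_phase, per_phase_current_a):
    # Sum V * I across all phases, matching the per-phase simdata layout.
    return sum(v * per_phase_current_a for v in voltages_by_phase.values())

print(total_power_w(voltages, 16.0))  # 11040.0 W, i.e. the ~11 kW on the gauge
```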

Note also that we can see the delivered power in greater detail in the "Debug" screen; see screenshots below.

Power gauge without charging (note fake voltage "noise") and with charging (note current around 16 A) - screenshots attached.

@shankari
Collaborator Author

I also double-checked, and the Yeti reference hardware has a cap of 16A.

./modules/YetiDriver/board_support/evse_board_supportImpl.cpp:        caps.max_current_A_import = 16;
./modules/YetiDriver/board_support/evse_board_supportImpl.cpp:        caps.max_current_A_export = 16;

That's another option 😄

@shankari
Collaborator Author

I changed the value in the simulator and verified that it is getting invoked by adding logs, but the max limit is not changing.
I am first going to cheat by setting the limit as part of the node-red SIL, and then will ask the community/figure out where it is coming from.

@shankari
Collaborator Author

Hardcoding also does not work. However, I do notice

{
            std::scoped_lock lock(hw_caps_mutex);
            hw_capabilities = c;
            // Maybe override with user setting for this EVSE
            if (config.max_current_import_A < hw_capabilities.max_current_A_import) {
                hw_capabilities.max_current_A_import = config.max_current_import_A;
            }
            if (config.max_current_export_A < hw_capabilities.max_current_A_export) {
                hw_capabilities.max_current_A_export = config.max_current_export_A;
            }
        }

This is indeed the reason

2024-11-19 07:48:08.647155 [WARN] evse_manager_1: module::EvseManager::init()::<lambda(types::evse_board_support::HardwareCapabilities)> :: Received new capability {
    "connector_type": "IEC62196Type2Cable",
    "max_current_A_export": 16.0,
    "max_current_A_import": 48.0,
    "max_phase_count_export": 3,
    "max_phase_count_import": 3,
    "min_current_A_export": 0.0,
    "min_current_A_import": 6.0,
    "min_phase_count_export": 1,
    "min_phase_count_import": 1,
    "supports_changing_phases_during_charging": true
}comparing to config 32

Let's try to set the config as well!
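The override above is effectively a min() between the hardware capability and the per-EVSE config, which explains why hardcoding 48 in the simulator alone was not enough; a sketch:

```python
def effective_max_import_a(hw_max_import_a, config_max_import_a):
    # EvseManager behavior: the user config can only lower the hardware
    # capability, never raise it.
    return min(hw_max_import_a, config_max_import_a)

print(effective_max_import_a(48.0, 32.0))  # 32.0: the config default clamps the new hw value
```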

@shankari
Collaborator Author

shankari commented Nov 19, 2024

Ah it is set to 32 by default. Let's override...

  max_current_import_A:
    description: User configurable current limit for this EVSE in Ampere
    type: number
    default: 32

@shankari
Collaborator Author

Now we see a limit of 48, but it is then dialed down to 40, presumably by the fuse value. Setting that to 48 as well.

2024-11-19 08:02:43.451230 [INFO] evse_manager_1:  :: Clearing import request schedule by setting max current from hw_caps = 48
2024-11-19 08:02:43.451392 [INFO] evse_manager_1:  :: Clearing export request schedule by setting max current from hw_caps = 48
2024-11-19 08:02:43.497242 [INFO] evse_manager_1:  :: Handle enforce limits with ac_max_current_A = 40

@shankari
Collaborator Author

Great! That worked. I am now going to fix the limit setting slider, and then pull everything and generate a new release.
I can verify that:

  • both the limit slider and the charging profile expect a per-phase max current.
  • if set as an external limit (via the slider), the limit gets reset by the composite schedule

I think that the easiest option at this point is to reset everything to either 16 or 32. Let's go with 16 to be consistent with the uWMC.

shankari pushed a commit to US-JOET/everest-demo that referenced this issue Nov 19, 2024
It looks like the limits applied to the JsYetiSimulator are
on a per-phase basis, although limits applied elsewhere are
not. This makes things very confusing. We will fix this by
configuring everything to a per-phase limit of 16A

Please see
EVerest#92 (comment)
to
EVerest#92 (comment)

Locations changed:
- config
- JsYetiSimulator
- SmartCharging OCPP defaults

Also fix the disable_iso_tls patch to not have a starting `/`

Add a new patch file to enable limit logging but don't enable
it by default. This may help future efforts to debug this.

Signed-off-by: Shankari <[email protected]>
shankari pushed a commit to US-JOET/everest-demo that referenced this issue Nov 19, 2024
Consistent with
EVerest#92 (comment)

History starts at:
EVerest#92 (comment)

Also show dots in the chart
Also convert the time range to/from local time so it makes
sense

Signed-off-by: Shankari <[email protected]>
@shankari
Collaborator Author

wrt #92 (comment)

@the-bay-kay can you pull my node-red changes, merge them and submit a PR with your changes? I think we can be GtG then!

@the-bay-kay

...can you pull my node-red changes, merge them and submit a PR with your changes? I think we can be GtG then!

Awesome! I'll leave docker-compose.ocpp201.yaml set up to build Node-RED locally, and then you can create a new image from that if you'd like. (Speaking of, I think there may be a missing image -- .env was set to .22, but we're only up to .21! Easy enough fix : ))

There were no merge conflicts, so I'm assuming it should be smooth sailing -- I'll create the PR once I know I've run some tests and confirm things are working!

@shankari
Collaborator Author

@the-bay-kay great! I think it is particularly important to ensure that, when we specify a power delivery curve as part of the power delivery req/res, that the charging "gauge" reflects that. If it doesn't, you may need to take a look at modules/EvseManager/energy_grid/energyImpl.cpp and modules/simulation/JsYetiSimulator/index.js (in particular, drawPower) to see what is going on.

I think this is really and truly the last piece for the demo. I don't plan to investigate the other issues in the time left.

@shankari
Collaborator Author

Awesome! I'll leave docker-compose.ocpp201.yaml set up to build Node-RED locally, and then you can create a new image from that if you'd like.

No need to do this; all images (except the manager) are built by the CI and pushed. We should be using pre-built images, and only pre-built images in the demo.

(Speaking of, I think there may be a missing image -- .env was set to .22, but we're only up to .21! Easy enough fix : ))

The manager is not built by the CI due to lack of resources. So there will always be a ~ 30 min lag while I build and push it locally. Just try again in a bit....

@shankari
Collaborator Author

@the-bay-kay Just checking in on where this is. Would be great if you could check in what you have so I can do a dry-run today.

@the-bay-kay

the-bay-kay commented Nov 19, 2024

...just checking in on where this is. would be great if you could check in what you have so I can do a dry-run today

I'm done with the Node-RED cleanup -- there were some trailing issues with how a fresh clone behaved, such as the graph defaults, graph editing behavior, and setting our default simulation. Feel free to check out the existing changes in the draft PR here, or just by pulling my fork here (you'll need to tweak docker-compose.ocpp201.yaml to build node-red locally). Let me fix the sign-off issue on the merge commit before pushing my latest changes (I remember signing off, but must have been mistaken... oops). Pushed! I committed with -S and typed in my credentials, but the DCO check is still complaining... I must be doing something wrong there, but we can clean that up after I fix the powermeter.

As for the powermeter debugging -- it seems that our local limits are set up within energyImpl.cpp here, with the get_local_energy_limits being defined here. It seems that the ISO15118 Charging Profile is never stored within EvseManager's external_local_energy_limits, so I believe I'll need to update the schedule via update_local_energy_limit or an equivalent function. We get and check the ISO Schedule in iso_server.cpp here, so I need to figure out how to pass that along to the EvseManager!

@shankari
Collaborator Author

shankari commented Nov 19, 2024

@the-bay-kay Let's get the node-red part checked in and pushed, and we can have a short conversation around what exactly is not working and what we need to do about it. I thought we had tested everything before...

the-bay-kay pushed a commit to the-bay-kay/everest-demo that referenced this issue Nov 19, 2024

the-bay-kay pushed a commit to the-bay-kay/everest-demo that referenced this issue Nov 19, 2024
@the-bay-kay

the-bay-kay commented Nov 19, 2024

Let's get the node-red part checked-in and pushed...

Those changes are pushed and checked in! As for getting the ChargingProfiles to reflect on the powermeter -- my plan is to store the ChargingProfile in the conn->ctx->evse_v2g_data, similar to the SASchedule (struct defined here) -- the context is shared across EvseV2G. I'm still figuring out how we get these values to EvseManager (where we can set the local limits) -- EvseManager connects to EvseV2G via the iso15118_charger.yaml interface defined here... I see where departure_time is defined (EvseManager subscribes to it here), but see nothing similar for ChargingProfiles. Here is the charging profile type we'll need. So, it seems I'll need to pipe that through and add it to the interface as well...

I've added the ChargingProfile to the v2g context, and am attempting to store it in powerDeliveryRequest -- since we can't directly assign the profile value within the request to this struct, it looks like we'll need to iterate through the list and use populate_physical_value_float to populate it...

@shankari
Collaborator Author

@the-bay-kay can you clarify what problem you are trying to fix? The Charging Profile is reflected in the power drawn. After my changes to drop the limit down to 16A, changing the charging profile does change the power gauge. Also, although I can build the node-red changes locally, I am not able to build them remotely.

I really don't like docker commit because, as you saw, it makes it hard to track the changes required. And since I don't really buy into the "user should see the curve" ethos, I am going to try to make some other changes which are hopefully easier as well.

@shankari
Collaborator Author

Couple of issues with the current implementation:

  • the EVSE side is still "taking departure time into account", lowering the pmax if the departure time is way in the future. But the whole point of the charge curves is that the EV chooses the curve, potentially with feedback from the user. So let's first pull back the change to lower the pmax in response to the departure time.
  • I also noticed that the curve that is generated has a max of EAmount, not PMax. The curve generation algorithm doesn't even take pmax into account. This means that even if the pmax is 11, we say that our curve starts at 60 kW. That is clearly wrong, but I don't know enough control theory to figure out how to fix it. I stared at the Wikipedia page for LQR and at the python-control library for a couple of hours, and I think that a solution that is not as wrong is to convert xd=np.array([[EAmount]]). I am not sure that is right, but we can discuss with DJ later. And we are not creating a reference implementation of the car simulator anyway.
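Not a control-theory fix, but a defensive post-processing clamp would at least keep the generated curve from exceeding pmax (a sketch; the curve values and the 11 kW pmax are hypothetical):

```python
def clamp_curve_to_pmax(curve_kw, pmax_kw):
    # Element-wise cap: whatever the generator produces, the charging curve
    # should never exceed the negotiated pmax.
    return [min(p, pmax_kw) for p in curve_kw]

generated = [60.0, 45.0, 20.0, 5.0]          # hypothetical curve starting at EAmount
print(clamp_curve_to_pmax(generated, 11.0))  # [11.0, 11.0, 11.0, 5.0]
```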

shankari pushed a commit to US-JOET/everest-demo that referenced this issue Nov 20, 2024
- Do not throttle the SASchedule from the EV based on departure time, only
  transmit the composite pmax schedule. The EV will derate based on
  departure time
- Show the values in the current power delivery request instead of the
  progress towards the eamount so that we can see the impact of curtailing the
  pmax
- Pass in the pmax to the power curve computation algo, although it is
  currently a NOP

This fixes: EVerest#92 (comment)

Signed-off-by: Shankari <[email protected]>