We know that Radio::updateTransceiverState() gets called after each event of the Radio, e.g. initialization, start/end of transmission and start/end of reception. In Radio::updateTransceiverState(), RadioMedium::listenOnMedium is called to determine whether the radio is busy or idle. RadioMedium::listenOnMedium takes all ongoing transmissions into account to compute the noise level. But when using the various RadioMedium.*Filter options (e.g. macAddressFilter), the radios don't necessarily process all the transmissions that are happening on the RadioMedium. This could lead to, for example, the following situations:
The radio is in RECEPTION_STATE_IDLE and a new transmission occurs, but due to some filter configuration the radio doesn't receive the signal. No event occurs in the radio module, Radio::updateTransceiverState() is not called, and the radio stays in RECEPTION_STATE_IDLE. But the new transmission could have introduced enough noise on the RadioMedium that the radio should have become busy.
Vice versa: the radio is in RECEPTION_STATE_BUSY due to some series of events. An ongoing transmission finishes on the RadioMedium, but the radio was never aware of that transmission due to some filter configuration. Radio::updateTransceiverState() is not called and the radio remains in RECEPTION_STATE_BUSY. But the end of that transmission (it could even be the last one) could have reduced the noise on the RadioMedium enough that the radio should have become idle.
If you think this is possible, I can work on a reproduction and fix for this.
This is a known limitation of the radio medium filters. When you enable a filter, you lose some precision in the simulation, so basically you trade accuracy for speed. For example, when a radio doesn't get notified about a signal reception start/end, it cannot update its listening state. In order to update the listening state, there would have to be an event at that moment, which of course could just as well be the received frame. You could argue that updating the reception state could be done more cheaply than doing the complete processing of a signal start/end. I can't tell if that's the case; I didn't do any such measurements. For some physical layers, the inaccuracy of the reception state (e.g. no BUSY state) may not be a problem.
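For readers following along: filters of this kind are enabled per-medium in omnetpp.ini. A sketch (the parameter names are taken from INET's RadioMedium module as I remember them; treat them as assumptions and double-check against your INET version's NED documentation):

```ini
# Sketch: enabling radio medium filters in omnetpp.ini.
# Faster simulation, at the cost of the accuracy issues discussed here.
*.radioMedium.rangeFilter = "communicationRange"  # drop signals beyond communication range
*.radioMedium.macAddressFilter = true             # deliver only frames addressed to the radio
```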
Nice to know that it is a known issue. I think it would be great if some documentation could be added (e.g. in the Transmission Medium - User guide) about these limitations.
As for the cost of the complete processing of a signal start/end, I did some measurements some time ago using rangeFilter and a wifi adhoc simulation with 1000 uniformly randomly placed hosts. As you can see from the plot below, as the constraint area becomes bigger (and thus the average distance between hosts becomes larger), the rangeFilter comes into effect and we get really nice performance speedups.
And if we take a look at the flamegraph of one plain simulation (i.e. with rangeFilter disabled), we see that the processing of signal starts/ends dominates the computation cost.
The measurements can definitely be more comprehensive and rigorous, but I think this already is enough to argue that filters are a good thing and that complete processing of a signal start/end is computationally expensive.
And, while it is true that in some cases the inaccuracy of the physical layer is acceptable, I think that e.g. for Wifi, or any other MAC protocol that performs listen-before-talk, this is undesirable behavior. And, given the speedup that filters can bring, I think it is worth making them more "correct", so that they can be used in more cases.
As you say, the question is how much cheaper it is to compute only the reception state instead of running the whole signal start/end processing. I think at least we can avoid inet::physicallayer::Radio::startReception > inet::physicallayer::RadioMedium::isReceptionAttempted, which we can observe to be pretty expensive.
Hi, when studying the inet codebase, I noticed something that I think could lead to a bug as described in the title.
inet/src/inet/physicallayer/wireless/common/radio/packetlevel/Radio.cc, line 554 (commit dc1fe50)