Length fields are not being checked #314
I propose adding some simple length and bit checks to the unpack methods, as in the examples below. Do you want this kind of implementation? @diraol @beraldoleal?
def unpack(self, buff, offset=0):
    # For fixed-value fields: the buffer slice must match the packed value.
    if buff[offset:offset + self._length] != self.pack():
        raise UnpackException
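
For clarity, a self-contained sketch of how such a check behaves on a fixed-value field (FixedField and the local UnpackException are illustrative stand-ins, not classes from this codebase):

import struct

class UnpackException(Exception):
    """Stand-in for the project's unpack error."""

class FixedField:
    """Illustrative one-byte field with a constant expected value."""

    _length = 1

    def __init__(self, value=0):
        self._value = value

    def pack(self):
        return struct.pack('!B', self._value)

    def unpack(self, buff, offset=0):
        # Reject buffers whose bytes differ from the packed value.
        if buff[offset:offset + self._length] != self.pack():
            raise UnpackException('unexpected bytes in fixed-value field')

Here FixedField(0).unpack(b'\x00') passes, while FixedField(0).unpack(b'\x01') raises.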
Current implementation:

def unpack(self, buff, offset=0):
    begin = offset
    hexas = []
    while begin < offset + 8:
        number = struct.unpack("!B", buff[begin:begin + 1])[0]
        hexas.append("%.2x" % number)
        begin += 1
    self._value = ':'.join(hexas)

Proposition:

def unpack(self, buff, offset=0):
    hexa_values = struct.unpack("!8B", buff[offset:offset + 8])
    self._value = ':'.join('{:02x}'.format(number)
                           for number in hexa_values)

Should I continue making these kinds of changes, or is this not what you have in mind?
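
A possible refinement (a sketch, assuming UnpackException is the error expected here): let the struct call itself enforce the length, since struct.unpack raises struct.error when the slice holds fewer than 8 bytes.

def unpack(self, buff, offset=0):
    # struct.error is raised if fewer than 8 bytes remain, so short
    # buffers fail loudly instead of being silently misread.
    try:
        hexa_values = struct.unpack("!8B", buff[offset:offset + 8])
    except struct.error as exception:
        raise UnpackException(str(exception))
    self._value = ':'.join('{:02x}'.format(number)
                           for number in hexa_values)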
I agree, we must add these validations. However, we already have an issue open for a broader discussion; please see #96. We need to find a more generic implementation. I'm removing the milestone for now, but we certainly need to think about this soon.
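
As one possible direction for that generic solution (a sketch only; get_size and UnpackException are assumptions about the codebase's API, not confirmed here), a shared base class could guard every unpack with a size check:

def unpack(self, buff, offset=0):
    # Generic guard: compare the size this struct declares with the
    # bytes actually available before unpacking any attribute.
    declared = self.get_size()  # assumed helper returning the struct size
    available = len(buff) - offset
    if available < declared:
        raise UnpackException('%d bytes needed, only %d available'
                              % (declared, available))
    # ...then unpack each attribute from buff[offset:offset + declared]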
(Update: I realized the buffer is not being sliced before the attributes are unpacked, so this approach may not work very well...)
hdiogenes changed the title from "length fields are not beeing checked." to "Length fields are not being checked" on Jun 10, 2020.
(pg 50) - "OpenFlow is a simple binary protocol, therefore invalid OpenFlow messages
generated by OpenFlow implementations will in general result in the message
having the wrong length or some fields having invalid values (see 7.1.3). It
is recommended to monitor those conditions to detect non-compliant
implementations."
Length fields should be computed locally, or checked for validity before the message is sent.
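
To make that recommendation concrete, a minimal receive-side sketch (the function name and ValueError are illustrative; only the 8-byte OpenFlow header layout comes from the spec):

import struct

def check_message_length(buff):
    # OpenFlow header: version (1 byte), type (1 byte),
    # length (2 bytes, big-endian), xid (4 bytes) -- 8 bytes total.
    if len(buff) < 8:
        raise ValueError('buffer shorter than the 8-byte OpenFlow header')
    version, msg_type, length, xid = struct.unpack('!BBHI', buff[:8])
    if length != len(buff):
        raise ValueError('length field says %d bytes, buffer has %d'
                         % (length, len(buff)))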