AVRO Record types without fields can't be consumed #305
Comments
Can you clarify the purpose of a record without fields? Perhaps you'd be better off just sending a boolean primitive instead if you simply want some message?
How so? The docs say "
For forward compatibility, I find it easier to have a record without fields and then add (optional) fields to it later than to replace a boolean with a record.
Yes, the "fields" array must exist, but it is allowed to be empty. At least that is my interpretation, and the official AVRO library will also accept and generate classes for such a schema.
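For illustration, here is a minimal sketch (the "Ping" record name is made up) showing that the official Java AVRO library parses a record whose "fields" array is empty, and that a later version can add an optional field:

```java
import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;

public class EmptyRecordSchema {
    public static void main(String[] args) {
        // A record whose "fields" array is present but empty -- the parser accepts it.
        String v1 = "{\"type\":\"record\",\"name\":\"Ping\",\"fields\":[]}";
        Schema parsed = new Schema.Parser().parse(v1);
        System.out.println(parsed.getFields().isEmpty());   // true

        // The same shape built programmatically with SchemaBuilder.
        Schema built = SchemaBuilder.record("Ping").fields().endRecord();
        System.out.println(built);                           // prints the schema JSON

        // A later version can add an optional field with a default, so data
        // written with the empty schema can still be read with the new one.
        String v2 = "{\"type\":\"record\",\"name\":\"Ping\",\"fields\":"
                  + "[{\"name\":\"reason\",\"type\":[\"null\",\"string\"],\"default\":null}]}";
        Schema evolved = new Schema.Parser().parse(v2);
        System.out.println(evolved.getFields().size());      // 1
    }
}
```

When old data is read with the evolved schema, the missing field is filled in from its default, which is the forward-compatibility property described above.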
I'd find it useful too, as we have empty record types as well. Other Avro-aware software works well with them, so even if one finds the spec ambiguous, they are allowed de facto.
It looks like a fix for this was merged into avro-c a while ago (apache/avro#159) and is available in newer releases. Would we be able to just update the version being used here? (Line 147 in 7a61200)
We generally build this using the bootstrap script, but I'm not sure whether this would fix the problem for everyone (we haven't tested it ourselves yet either) or whether there is somewhere else that would need to be updated.
When consuming a topic that contains an AVRO record type that has no fields, the following error is produced:
% ERROR: Failed to format message in event-rule-engine-eventTriggerStateStore-changelog [2] at offset 18734304: Avro/Schema-registry message deserialization: Record type must have at least one field : terminating
The AVRO specification does not forbid such types: the "fields" array is required, but nothing says it has to be non-empty. You can create them using the official Java AVRO library, and the Confluent Java AVRO serdes can also serialize and deserialize them without issues.
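As a point of comparison, here is a sketch (using the same made-up "Ping" schema, not the code path kafkacat uses) showing that the plain Java library round-trips such a record without complaint; the Avro binary body of an empty record is simply zero bytes:

```java
import java.io.ByteArrayOutputStream;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.EncoderFactory;

public class EmptyRecordRoundTrip {
    public static void main(String[] args) throws Exception {
        Schema schema = new Schema.Parser()
                .parse("{\"type\":\"record\",\"name\":\"Ping\",\"fields\":[]}");
        GenericRecord record = new GenericData.Record(schema);

        // Serialize: a record with no fields encodes to an empty payload.
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
        new GenericDatumWriter<GenericRecord>(schema).write(record, encoder);
        encoder.flush();
        System.out.println(out.size());                       // 0

        // Deserialize from the (empty) payload without errors.
        BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(out.toByteArray(), null);
        GenericRecord decoded = new GenericDatumReader<GenericRecord>(schema).read(null, decoder);
        System.out.println(decoded.getSchema().getName());    // Ping
    }
}
```

As far as I understand, the Confluent serializer only prepends its usual magic byte and schema-ID framing to that (empty) body, so the schema reference is still carried with each message.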
Would it be possible to remove this check, https://github.com/confluentinc/avro-c-packaging/blob/master/src/schema.c#L922, so that kafkacat can also consume this type of record?