Bug: failed to extract payload schema: cannot resolve field type \"jsonb\" #229
Comments
@WORMrus Thanks for reaching out to us. :) So, looking at your logs, I see this:
The connector is reading from an existing publication slot, which we can also see in the logs. I'd probably do the following:
A table with just a primary key, a bigint, and a varchar should work; we've been using that often (on the Conduit Platform too).
@hariso Thank you for getting back to me so quickly. Indeed, it seems that the current state was related to the original table and its jsonb column. I did see the warnings but was not sure what to make of them, as I am not familiar with replication slots/publications as a feature of PG. As advised, I've deleted the slot. With that out of the way, I've once again removed the slot and the publication, deleted the container, and added a new jsonb column (not null, '{}'::jsonb as default). I was able to start a new container. However, it threw an error when I inserted a new row (the JSON there was just {}):
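For reference, the cleanup and the column change described above would look roughly like this in psql. The slot, publication, table, and column names here are placeholders, not values taken from this report; check `pg_replication_slots` and `pg_publication` for the real names.

```sql
-- Drop the stale logical replication slot left behind by the connector
-- ('conduit_slot' is a placeholder; find the real name with:
--  SELECT slot_name FROM pg_replication_slots;).
SELECT pg_drop_replication_slot('conduit_slot');

-- Drop the matching publication ('conduit_pub' is likewise a placeholder;
-- list existing ones with: SELECT pubname FROM pg_publication;).
DROP PUBLICATION IF EXISTS conduit_pub;

-- Re-add a jsonb column with a non-null empty-object default,
-- as described in the comment above ('outbox'/'payload' are placeholders).
ALTER TABLE outbox
    ADD COLUMN payload jsonb NOT NULL DEFAULT '{}'::jsonb;
```

Note that a replication slot can only be dropped while no consumer (here, the connector) is attached to it.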
So it seems that something is off with jsonb columns.
Bug description
I am trying to start a pipeline that would read rows of my outbox table into Kafka. The actual table contains about 10 columns, one of which is jsonb; I originally attributed the problem I am experiencing to that column.
However, during my troubleshooting I've created a table that has just a bigint id and a varchar field and tried reading from it, with the same results. I've also dropped the Kafka destination and replaced it with a file one, even though the destination plugin had not reported any issues.
I am certain that the connector reaches the correct table, as I originally copied it from the real one but without any constraints: the connector complained about the missing primary key. Adding it to the test table got me back to the original issue.
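The minimal test table described above (bigint primary key plus a varchar field) could be created like this; the table and column names are mine, not the reporter's:

```sql
-- Minimal repro table: bigint primary key + varchar, no jsonb at all.
CREATE TABLE outbox_test (
    id   bigint GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
    note varchar(255)
);

-- A row for the pipeline to pick up:
INSERT INTO outbox_test (note) VALUES ('hello');
```

Per the report, even this jsonb-free table reproduces the failure, which is what makes the schema-extraction error surprising.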
Conduit is running in Docker, the info endpoint returns:
{ "version": "v0.12.3", "os": "linux", "arch": "amd64" }
The connector is of version v0.10.1
Postgres is "PostgreSQL 14.3, compiled by Visual C++ build 1914, 64-bit"
My pipeline file is as follows:
Here are the full logs as displayed when starting a brand new container:
Steps to reproduce
docker run -it -p 8080:8080 -v path\to\pipe.yaml:/app/pipelines/pipeline.yaml conduit.docker.scarf.sh/conduitio/conduit
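The reporter's pipeline file is not reproduced in this thread. As a rough sketch only, a Conduit pipeline matching the description (Postgres source, file destination) could look like the following; the ids, connection URL, table name, and output path are placeholders, and the setting names should be verified against the connector documentation for v0.10.x:

```yaml
# Hypothetical pipeline.yaml for this setup -- NOT the reporter's actual file.
version: 2.2
pipelines:
  - id: outbox-to-file
    status: running
    connectors:
      - id: pg-source
        type: source
        plugin: builtin:postgres
        settings:
          url: postgres://user:pass@localhost:5432/mydb
          tables: outbox_test
      - id: file-destination
        type: destination
        plugin: builtin:file
        settings:
          path: /tmp/out.txt
```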
Version
Conduit v0.12.3 connector v0.10.1