
Error: "Data incompatible with tuples format. Each message should be a list of length 2." #17

Open
darkon12 opened this issue Dec 2, 2024 · 9 comments

Comments

@darkon12

darkon12 commented Dec 2, 2024

What should I do?

@unitycoder

Using Windows + Anaconda, same issue here.
I added some print statements in the Gradio method for debugging; it looks like it is not receiving all the data from the web form.

chatgipity™ made a workaround for me; at least it runs now and creates a mesh:
a modified chatbot.py (use a code compare/merge tool to see the modified parts; the debug logs could be removed)
https://gist.github.com/unitycoder/ec21b35526bb78caac1fa002afc1d833

But of course, I'm interested to know whether there are better fixes. A rough sketch of the idea follows.
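For context (my guess at what is going on, not necessarily what the gist does): Gradio's default "tuples" chatbot format expects every history entry to be a two-element [user_message, assistant_message] pair, so anything else in the history triggers this error. A workaround could normalize the history before handing it to the Chatbot, roughly like this:

```python
# Hypothetical sketch of the kind of normalization a patched chatbot.py might
# do; the linked gist may differ. Gradio's "tuples" format requires every
# history entry to be a 2-element [user, assistant] list.
def normalize_history(history):
    fixed = []
    for entry in history:
        if isinstance(entry, (list, tuple)) and len(entry) == 2:
            # Already a [user, assistant] pair.
            fixed.append(list(entry))
        elif isinstance(entry, dict):
            # OpenAI-style {"role": ..., "content": ...} messages.
            if entry.get("role") == "user":
                fixed.append([entry.get("content", ""), None])
            elif fixed and fixed[-1][1] is None:
                fixed[-1][1] = entry.get("content", "")
        else:
            # Fall back to treating the entry as a lone user message.
            fixed.append([str(entry), None])
    return fixed
```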

@darkon12
Author

darkon12 commented Dec 3, 2024

@unitycoder

I replaced the existing chatbot.py with that code (I think it was inside the Gradio package folder; you can see the file location in the console error messages).

@gazonk

gazonk commented Dec 3, 2024

Thanks for the info. Something that seems to work is to set the chatbot's message type to "messages" in app.py.
Here is the diff:

# Gradio block
-chatbot=gr.Chatbot(height=450, placeholder=PLACEHOLDER, label='Gradio ChatInterface')
+chatbot=gr.Chatbot(height=450, placeholder=PLACEHOLDER, label='Gradio ChatInterface', type="messages")
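To show that change in context, here is a stripped-down sketch (my assumption of the surrounding code, not the project's actual app.py; the real placeholder and inference function are omitted). With type="messages", Gradio expects OpenAI-style role/content dicts instead of [user, assistant] pairs, which avoids the "tuples format" error:

```python
# Minimal sketch, assuming a recent Gradio where Chatbot supports
# type="messages"; not the project's actual app.py.
import gradio as gr

def respond(message, history):
    # With type="messages", `history` arrives as a list of
    # {"role": ..., "content": ...} dicts; returning a plain string works.
    return f"Echo: {message}"

chatbot = gr.Chatbot(height=450, label="Gradio ChatInterface", type="messages")
demo = gr.ChatInterface(fn=respond, chatbot=chatbot)

if __name__ == "__main__":
    demo.launch()
```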

@gazonk

gazonk commented Dec 3, 2024

I have also made some modifications to avoid the error related to the padding token.
It would be an exaggeration to say that I know exactly what I'm doing :-), but some spurious messages about inferring the attention mask have disappeared :-). I'm interested in knowing what I'm doing wrong.
app.py.txt
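In case it helps, the padding-token part is roughly the usual fix for causal LMs that ship without a pad token (sketch below with a hypothetical model id; the attached app.py.txt may differ):

```python
# Hedged sketch of the common pad-token / attention-mask fix with Hugging Face
# transformers; the model id is a placeholder, not the project's actual model.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-3.2-1B-Instruct"  # hypothetical model id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Reuse the EOS token as padding so tokenization can build an attention mask.
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token
model.config.pad_token_id = tokenizer.pad_token_id

inputs = tokenizer("Hello!", return_tensors="pt", padding=True)
outputs = model.generate(
    **inputs,  # includes input_ids and attention_mask
    max_new_tokens=32,
    pad_token_id=tokenizer.pad_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```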

@darkon12
Author

darkon12 commented Dec 3, 2024

I have also made some modifications to avoid the error related to the padding token. It would be an exaggeration to say that I know exactly what I'm doing :-), but some spurious messages about inferring the attention mask have disappeared :-). I'm interested in knowing what I'm doing wrong. app.py.txt

Uhm, no. The only thing your mod does is switch to a generic "error" message.
Your app.py.txt gets an OOM (at least in Colab).
I'll try again later.

@gazonk

gazonk commented Dec 3, 2024

I probably don't get the OOM because I'm on a MacBook M2 with 96 GB of RAM.

@darkon12
Author

darkon12 commented Dec 3, 2024

I probably don't get the OOM because I'm on a MacBook M2 with 96 GB of RAM.

Wow, bragging at its finest ;)

@gazonk

gazonk commented Dec 3, 2024

I wouldn't brag about that ;-); it just explains why it works for me.
