invoke_model_with_response_stream generates lexical error: invalid char in json text, but points to blank #839
Comments
Hi @tonisama, can you provide a code example please, to help me reproduce the error?
Sure, but you'll need Bedrock credentials to use it.
This error is coming from lines 4 to 17 in commit 45ece0e.
I am not sure what is going on regarding the parsing of the body. However, I believe this method is not fully supported yet: Paws will need to develop a streaming method to parse the data. I will have to see how botocore does it and go from there. In the meantime is
As paws is currently built off
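For reference, botocore handles this by decoding the AWS event-stream binary framing rather than handing the raw body to a JSON parser. The sketch below is a simplified illustration of that framing (synthetic message, empty header block, CRC32 checks as in the wire format), not botocore's actual implementation; real messages also carry headers such as `:event-type` and `:content-type`. It shows why feeding the whole stream body to a JSON parser produces an "invalid char in json text" error: the JSON payload is wrapped in a binary prelude.

```python
import json
import struct
import zlib

def encode_event(payload: bytes) -> bytes:
    """Build a minimal event-stream message with an empty header block."""
    headers = b""
    # prelude (12 bytes) + headers + payload + trailing message CRC (4 bytes)
    total_length = 12 + len(headers) + len(payload) + 4
    prelude = struct.pack(">II", total_length, len(headers))
    prelude_crc = struct.pack(">I", zlib.crc32(prelude))
    body = prelude + prelude_crc + headers + payload
    message_crc = struct.pack(">I", zlib.crc32(body))
    return body + message_crc

def decode_event(message: bytes) -> bytes:
    """Extract and verify the payload from a single event-stream message."""
    total_length, headers_length = struct.unpack(">II", message[:8])
    (prelude_crc,) = struct.unpack(">I", message[8:12])
    assert zlib.crc32(message[:8]) == prelude_crc, "prelude CRC mismatch"
    payload = message[12 + headers_length : total_length - 4]
    (message_crc,) = struct.unpack(">I", message[total_length - 4 : total_length])
    assert zlib.crc32(message[: total_length - 4]) == message_crc, "message CRC mismatch"
    return payload

# A JSON chunk wrapped in the binary framing: the message as a whole is not
# valid JSON (it starts with binary length fields), but the extracted payload is.
raw = encode_event(b'{"completion": "Hello"}')
print(json.loads(decode_event(raw)))  # {'completion': 'Hello'}
```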
Thanks! `invoke_model` works just fine.
The `invoke_model_with_response_stream` function generates an error:

```
Error in parse_con(txt, bigint_as_char) :
lexical error: invalid char in json text.
                  (right here) ------^
```
The error occurs when parsing the response. The `(right here)` marker usually points to the offending character, but here it points to nothing. I replicated the same script in Python using the boto3 library and it works perfectly. In R, the non-streaming `invoke_model` function also works perfectly.
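For comparison, the Python pattern that works looks roughly like the following. This is a hedged sketch: the model ID, prompt body, and `"completion"` payload key are illustrative (they follow the Anthropic-style Bedrock response shape and may differ per model), but `invoke_model_with_response_stream` is boto3's real API, and it yields events whose `chunk["bytes"]` hold small JSON fragments rather than one JSON document.

```python
import json

def collect_stream(event_stream) -> str:
    """Concatenate the text fragments from a Bedrock response stream.

    Each event is a dict like {"chunk": {"bytes": b'{"completion": "..."}'}}.
    The payload key ("completion" here) depends on the model family.
    """
    parts = []
    for event in event_stream:
        chunk = event.get("chunk")
        if chunk:
            parts.append(json.loads(chunk["bytes"])["completion"])
    return "".join(parts)

# With real credentials this would be driven by boto3, e.g.:
#   import boto3
#   client = boto3.client("bedrock-runtime")
#   response = client.invoke_model_with_response_stream(
#       modelId="anthropic.claude-v2",  # illustrative model ID
#       body=json.dumps({"prompt": "\n\nHuman: Hi\n\nAssistant:",
#                        "max_tokens_to_sample": 200}),
#   )
#   text = collect_stream(response["body"])

# Offline demonstration with fake events in the same shape:
fake_events = [
    {"chunk": {"bytes": b'{"completion": "Hello"}'}},
    {"chunk": {"bytes": b'{"completion": ", world"}'}},
]
print(collect_stream(fake_events))  # Hello, world
```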
To ensure I wasn't making an error in the prompt encoding, I copied the hex representation from Python into R and converted it back to a string, which still gave the same error.
Unfortunately, due to the error, R doesn't record the response, so I'm unable to view any server-side error messages.