How can we access the prompt given to the LLM? #350
Answered by jxnl · Chasearmer asked this question in Q&A
As the title suggests, how can we access the prompt that is generated from the Pydantic response model and passed to the LLM? I would be very interested in seeing what that looks like, especially for assessing the clarity of prompts created from nested Pydantic models.
Answered by jxnl · Jan 17, 2024
Making some improvements to the docs: https://jxnl.github.io/instructor/concepts/logging/
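For context, here is a minimal sketch of the approach the linked logging docs describe: enabling DEBUG logging surfaces the request instructor sends (messages plus the tool/function schema), and since that schema is derived from the Pydantic model's JSON schema, you can also print `model_json_schema()` directly to see how a nested model is represented. The model names and fields below are illustrative, not from this thread.

```python
import logging

import instructor
from openai import OpenAI
from pydantic import BaseModel

# DEBUG logging makes instructor and the OpenAI client log the full
# request, including the rendered messages and the tool/function
# schema built from the response model.
logging.basicConfig(level=logging.DEBUG)


class Address(BaseModel):
    street: str
    city: str


class User(BaseModel):
    name: str
    address: Address  # nested model, to see how nesting is rendered


# The tool definition is derived from the Pydantic JSON schema, so you
# can inspect it without making an API call at all.
print(User.model_json_schema())

client = instructor.patch(OpenAI())

user = client.chat.completions.create(
    model="gpt-3.5-turbo",
    response_model=User,
    messages=[
        {"role": "user", "content": "Jason lives at 1 Main St, Springfield."},
    ],
)
```

Note that `model_json_schema()` shows the structured-output side of the prompt; the DEBUG logs are what show the complete payload actually sent to the LLM.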
Answer selected by jxnl