Why we will finally support Jinja #1009
jxnl announced in Announcements
-
Cool proposal, would be great to have. Suggestion:
-
@jxnl Is this a replacement for descriptions in pydantic models?
-
Instructor Proposal: Integrating Jinja Templating
As the creator of Instructor, I've always aimed to keep our product development streamlined and avoid unnecessary complexity. However, I'm now convinced that it's time to incorporate better templating into our data structure, specifically by integrating Jinja.
This decision serves multiple purposes:
Why Jinja is the Right Choice
- Formatting Capabilities
- Validation
- Versioning and Logging
By integrating Jinja into Instructor, we're not just adding a feature; we're enhancing our ability to handle complex formatting, improve validation processes, and streamline our versioning and logging capabilities. This addition will significantly boost the power and flexibility of Instructor, making it an even more robust tool for our users.
Enhancing Formatting Capabilities
In Instructor, we propose implementing a new `context` keyword in our create methods. This addition will allow users to render the prompt using a provided context, leveraging Jinja's templating capabilities. Here's how it would work:
1. The user passes a `context` dictionary to the create method.
2. The prompt template is rendered with that context.
3. The rendered prompt becomes the `content` field of the message.
This approach keeps the prompt template separate from its variables while the call site stays simple.
Let's look at an example to illustrate this feature:
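A minimal sketch of the rendering step, assuming Jinja2 as the template engine. The `render_messages` helper is hypothetical; it only illustrates what the proposed `context` keyword would do internally before the messages are sent to the model:

```python
from jinja2 import Template


def render_messages(messages: list[dict], context: dict) -> list[dict]:
    """Hypothetical helper mirroring the proposed `context` keyword:
    treat each message's content as a Jinja template and render it
    with the provided variables."""
    return [
        {**message, "content": Template(message["content"]).render(**context)}
        for message in messages
    ]


messages = [
    {"role": "user", "content": "Extract the user from: {{ data }}"}
]
rendered = render_messages(messages, {"data": "John Doe is thirty years old"})
print(rendered[0]["content"])  # Extract the user from: John Doe is thirty years old
```

The template string stays constant across calls, while only the `context` dictionary changes per request.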
Validation
Let's consider a scenario where we redact words from text. By using ValidationInfo to access the context, and passing the same context to both the validator and the template, we can implement a system for handling sensitive information. Here's an example demonstrating this concept using Pydantic validators:
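A sketch of the validator side, using Pydantic's existing `ValidationInfo.context` mechanism. The `Response` model, the `no_redacted_words` validator, and the `redact_words` key are illustrative names, not part of the proposal; in Instructor, the same `context` dict passed to create would be forwarded to validation:

```python
from pydantic import BaseModel, ValidationError, ValidationInfo, field_validator


class Response(BaseModel):
    text: str

    @field_validator("text")
    @classmethod
    def no_redacted_words(cls, v: str, info: ValidationInfo) -> str:
        # The same context dict used to render the template is
        # available here via `info.context`.
        context = info.context or {}
        for word in context.get("redact_words", []):
            if word in v:
                raise ValueError(f"response contains redacted word: {word!r}")
        return v


context = {"redact_words": ["classified"]}

# Passes: no redacted words present.
Response.model_validate({"text": "all clear"}, context=context)

# Fails: the validator sees the redacted word via the shared context.
try:
    Response.model_validate({"text": "this is classified"}, context=context)
except ValidationError as e:
    print("rejected:", e.errors()[0]["msg"])
```

Because the validator and the template read from one context, the redaction list cannot drift out of sync between prompting and validation.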
Better Versioning and Logging
With the separation of prompt templates and variables, we gain several advantages:
Version Control: We can now version the templates and retrieve the appropriate one for a given prompt. This allows for better management of template history, diffing and comparison.
Enhanced Logging: The separation facilitates structured logging, enabling easier debugging and integration with various logging sinks, databases, and observability tools like OpenTelemetry.
Security: Sensitive information in variables can be handled separately from the templates, allowing for better access control and data protection.
This separation of concerns adheres to best practices in software design, resulting in a more maintainable, scalable, and robust system for managing prompts and their associated data.
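One way the logging benefit could look in practice. The record fields and the hash-based `template_id` are illustrative choices, not part of the proposal:

```python
import hashlib
import json

# Because the template and its variables are separate, each can be
# logged (and versioned) as its own structured field.
template = "Extract the user from: {{ data }}"
variables = {"data": "John Doe is thirty years old"}

log_record = {
    # A content hash is one simple way to identify a template version
    # for diffing and retrieval.
    "template_id": hashlib.sha256(template.encode()).hexdigest()[:12],
    "template": template,
    "variables": variables,
}
print(json.dumps(log_record))
```

A structured record like this can be shipped to a database or an observability backend, and sensitive variables can be routed to a different sink than the templates.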
Side effect of Context also being Pydantic Models
Since contexts are just Python objects, we can use Pydantic models to validate them and to control how they are rendered, so even secret information can be rendered dynamically.
Consider using SecretStr to pass sensitive information to the LLM.
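A small sketch with Pydantic's `SecretStr`, assuming Jinja2 rendering; the `Context` model and its field names are illustrative:

```python
from jinja2 import Template
from pydantic import BaseModel, SecretStr


class Context(BaseModel):
    name: str
    api_key: SecretStr


ctx = Context(name="Alice", api_key=SecretStr("sk-very-secret"))

# By default SecretStr renders masked, so the secret cannot leak into
# a logged prompt by accident.
masked = Template("{{ name }} key: {{ api_key }}").render(
    name=ctx.name, api_key=ctx.api_key
)
assert "sk-very-secret" not in masked

# The value is exposed only where the template asks for it explicitly.
revealed = Template("key: {{ api_key.get_secret_value() }}").render(
    api_key=ctx.api_key
)
assert revealed == "key: sk-very-secret"
```

The model controls the rendering behavior, so the default is safe and exposure is an explicit, auditable choice in the template.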
This approach lets sensitive values stay masked in logs and stored templates while remaining available to the prompt when explicitly requested.