## What's New in v1.2.2

### Added
- **New Model Support**: The toolkit now supports the `llama-3-sonar-small-32k-chat`, `llama-3-sonar-large-32k-chat`, `llama-3-sonar-small-32k-online`, and `llama-3-sonar-large-32k-online` models.
### Changed
- **Updated Default Models**: The default models in `config.py` have been updated to `llama-3-sonar-large-32k-chat` for chat functionality and `llama-3-sonar-large-32k-online` for search functionality (see the sketch below).
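For reference, here is a minimal sketch of what the updated defaults might look like. The constant names below are assumptions for illustration only and may not match the actual names used in `config.py`.

```python
# config.py -- hypothetical sketch of the v1.2.2 default models.
# The real setting names in the toolkit's config.py may differ.
DEFAULT_CHAT_MODEL = "llama-3-sonar-large-32k-chat"      # default for chat functionality
DEFAULT_SEARCH_MODEL = "llama-3-sonar-large-32k-online"  # default for search functionality
```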
### Removed
- **Deprecated Model Removal**: Older models such as `sonar-small-chat`, `sonar-medium-chat`, `sonar-small-online`, and `sonar-medium-online` have been removed from the toolkit. Support for `codellama-70b-instruct`, `mistral-7b-instruct`, and `mixtral-8x22b-instruct` has also been discontinued, in line with Perplexity Labs' current model offerings.
*Note: According to Perplexity Labs, any existing implementations that rely on the older models must be updated to the new `llama-3-sonar` models within the next two weeks to maintain functionality.*
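In most cases the migration amounts to swapping the model name passed to the API. The sketch below assumes direct calls to Perplexity's chat completions endpoint via the `requests` library; if you call the API through this toolkit's wrapper, update the model argument you pass to it instead.

```python
import requests

# Hypothetical migration example: the only required change is the model name.
OLD_MODEL = "sonar-medium-online"             # removed in v1.2.2
NEW_MODEL = "llama-3-sonar-large-32k-online"  # current replacement

response = requests.post(
    "https://api.perplexity.ai/chat/completions",
    headers={"Authorization": "Bearer YOUR_API_KEY"},  # placeholder API key
    json={
        "model": NEW_MODEL,  # previously OLD_MODEL
        "messages": [{"role": "user", "content": "Summarize today's AI news."}],
    },
    timeout=30,
)
print(response.json()["choices"][0]["message"]["content"])
```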
### How to Upgrade
- Existing users can upgrade to v1.2.2 by pulling the latest changes from our GitHub repository and updating their environment to reflect the new dependencies and configurations.
- We recommend reviewing the updated documentation to familiarize yourself with the new models and their capabilities.
We thank you for your continued support and are excited to see how the new features will be used in your AI projects!
Full Changelog: v1.2.1...v1.2.2