feat: Release 0.0.10 Demos and LangChain Integration (#15)
* chore(internal): rebase

* feat(api): OpenAPI spec update via Stainless API (#12)

* release: 0.1.0

* Mirascope demo

* Update pyproject.toml

* Simple Discord Example

* feat: LangChain Integration and Simple Demos

* style: fix linting errors

---------

Co-authored-by: stainless-app[bot] <142633134+stainless-app[bot]@users.noreply.github.com>
Stainless Bot and stainless-app[bot] committed May 23, 2024
1 parent 05a39a1 commit 1c6f477
Showing 28 changed files with 5,817 additions and 1 deletion.
2 changes: 1 addition & 1 deletion .release-please-manifest.json
@@ -1,3 +1,3 @@
 {
-  ".": "0.0.9"
+  ".": "0.0.10"
 }
27 changes: 27 additions & 0 deletions examples/cli/mirascope/README.md
@@ -0,0 +1,27 @@
# Simple CLI Chatbot with Mirascope

This is a quick demo that shows how to create a chatbot with Mirascope, using
Honcho as the storage engine.

It uses the command line as an interface and GPT-4o as the underlying
model. Follow the steps below to set up the demo.

1. Install the dependencies with `poetry`

```bash
poetry shell
poetry install
```

2. Add your OpenAI API key to a `.env` file

```bash
echo "OPENAI_API_KEY=<YOUR_API_KEY>" > .env
```

3. Run the demo from your poetry shell

```bash
poetry shell
python main.py
```
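As a standalone illustration of the role-mapping idea at the heart of the `history` property in `main.py` below, the following sketch converts stored messages into OpenAI-style chat turns. Note this is a minimal sketch: the `Msg` dataclass is a hypothetical stand-in for Honcho's message objects, which expose `is_user` and `content`.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class Msg:
    # Hypothetical stand-in for a Honcho message record
    content: str
    is_user: bool


def to_chat_history(messages: List[Msg]) -> List[dict]:
    """Map stored messages to OpenAI-style chat roles."""
    return [
        {"role": "user" if m.is_user else "assistant", "content": m.content}
        for m in messages
    ]


print(to_chat_history([Msg("hi", True), Msg("hello!", False)]))
# → [{'role': 'user', 'content': 'hi'}, {'role': 'assistant', 'content': 'hello!'}]
```

Replaying the stored transcript under `user`/`assistant` roles is what lets a stateless LLM call appear conversational: the model sees the whole session history on every turn.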
78 changes: 78 additions & 0 deletions examples/cli/mirascope/main.py
@@ -0,0 +1,78 @@
import asyncio
from typing import List

from dotenv import load_dotenv
from mirascope.openai import OpenAICall, OpenAICallParams

from honcho import Honcho

load_dotenv()
honcho = Honcho(environment="demo") # initialize the honcho client
app = honcho.apps.get_or_create("Mirascope Test") # Get your app instance
user = honcho.apps.users.get_or_create(app_id=app.id, name="test_user") # Get your user
session = honcho.apps.users.sessions.create(app_id=app.id, user_id=user.id, location_id="cli") # Make a new session


# Set up your OpenAI Call
class Conversation(OpenAICall):
    prompt_template = """
    SYSTEM:
    You are a helpful assistant that provides incredibly short and efficient
    responses.
    MESSAGES: {history}
    USER:
    {user_input}
    """
    user_input: str
    session_id: str
    app_id: str
    user_id: str

    @property
    def history(self) -> List[dict]:
        """Get the conversation history from Honcho"""
        history_list = []
        # Avoid shadowing the builtin `iter`
        message_iter = honcho.apps.users.sessions.messages.list(
            session_id=self.session_id, app_id=self.app_id, user_id=self.user_id
        )
        for message in message_iter:
            role = "user" if message.is_user else "assistant"
            history_list.append({"role": role, "content": message.content})
        return history_list

    # context: str
    call_params = OpenAICallParams(model="gpt-4o-2024-05-13", temperature=0.4)


conversation = Conversation(user_input="", app_id=app.id, user_id=user.id, session_id=session.id)


async def chat():
    while True:
        conversation.user_input = input(">>> ")
        if conversation.user_input == "exit":
            honcho.apps.users.sessions.delete(session_id=session.id, app_id=app.id, user_id=user.id)
            break
        response = ""
        cstream = conversation.stream_async()
        print("\033[96mAI:\033[0m")
        async for chunk in cstream:
            print(f"\033[96m{chunk.content}\033[0m", end="", flush=True)
            response += chunk.content
        print("\n")

        # Save User and AI messages to Honcho
        honcho.apps.users.sessions.messages.create(
            session_id=session.id, app_id=app.id, user_id=user.id, content=conversation.user_input, is_user=True
        )
        honcho.apps.users.sessions.messages.create(
            session_id=session.id, app_id=app.id, user_id=user.id, content=response, is_user=False
        )


if __name__ == "__main__":
    asyncio.run(chat())
