copy button and examples for .md

luckysanpedro committed Jan 29, 2024
1 parent 1fbbb49 commit 709fc08
Showing 15 changed files with 674 additions and 8 deletions.
20 changes: 20 additions & 0 deletions docs/css/custom.css
@@ -24,4 +24,24 @@ a.internal-link::after {

.shadow {
    box-shadow: 5px 5px 10px #999;
}

pre {
    position: relative;
}

.copy-code-button {
    position: absolute;
    right: 5px;
    top: 5px;
    cursor: pointer;
    padding: 0.5em;
    margin-bottom: 0.5em;
    background-color: rgba(247, 247, 247, 0.4); /* Light grey background with slight transparency */
    border: 1px solid #dcdcdc; /* Slightly darker border for definition */
    border-radius: 3px; /* Rounded corners */
    font-family: monospace; /* Monospace font similar to code blocks */
    font-size: 0.85em; /* Slightly smaller font size */
    color: #333; /* Dark grey text for contrast */
    outline: none; /* Remove outline to maintain minimal style on focus */
}
3 changes: 2 additions & 1 deletion docs/examples/chat.md
@@ -10,7 +10,7 @@


## Code Example

<pre><code id="codeblock">
```python
from funcchain import chain, settings
from funcchain.utils.memory import ChatMessageHistory
@@ -49,6 +49,7 @@ if __name__ == "__main__":
print("Hey! How can I help you?\n")
chat_loop()
```
</code></pre>



3 changes: 2 additions & 1 deletion docs/examples/dynamic_router.md
@@ -11,7 +11,7 @@ This should serve as an example of how to archive complex structures using funcc
A dynamic chat router that selects the appropriate handler for user queries based on predefined routes.

## Full Code Example

<pre><code id="codeblock">
```python
from enum import Enum
from typing import Any, Callable, TypedDict
@@ -98,6 +98,7 @@ router = DynamicChatRouter(

router.invoke_route("Can you summarize this csv?")
```
</code></pre>

Demo
<div class="termy">
3 changes: 2 additions & 1 deletion docs/examples/enums.md
@@ -10,7 +10,7 @@

## Full Code Example
A simple system that takes a question and decides a 'yes' or 'no' answer based on the input.

<pre><code id="codeblock">
```python
from enum import Enum
from funcchain import chain
@@ -32,6 +32,7 @@ def make_decision(question: str) -> Decision:
if __name__ == "__main__":
    print(make_decision("Do you like apples?"))
```
</code></pre>

# Demo
<div class="termy">
3 changes: 2 additions & 1 deletion docs/examples/error_output.md
@@ -11,7 +11,7 @@
The main functionality is to take a string of text and attempt to extract user information, such as name and email, and return a User object. If the information is insufficient, an Error is returned instead.

## Full Code Example

<pre><code id="codeblock">
```python
from funcchain import BaseModel, Error, chain
from rich import print
@@ -32,6 +32,7 @@ if __name__ == "__main__":
    print(extract_user_info("I'm John and my mail is [email protected]")) # returns a User object

```
</code></pre>

Demo
<div class="termy">
3 changes: 2 additions & 1 deletion docs/examples/literals.md
@@ -9,7 +9,7 @@
You can adapt this for your own usage.

## Full Code Example

<pre><code id="codeblock">
```python
from typing import Literal
from funcchain import chain
@@ -30,6 +30,7 @@ if __name__ == "__main__":
    rank = rank_output("The quick brown fox jumps over the lazy dog.")
    print(rank)
```
</code></pre>

Demo
<div class="termy">
95 changes: 95 additions & 0 deletions docs/examples/ollama.md
@@ -0,0 +1,95 @@
# Using Different LLMs with funcchain

!!! Example
See [ollama.py](https://github.com/shroominic/funcchain/blob/main/examples/ollama.py)
Also see the supported models listed in [MODELS.md](https://github.com/shroominic/funcchain/blob/main/MODELS.md)

In this example, we use the funcchain library to perform sentiment analysis on a piece of text. This showcases how funcchain can seamlessly work with different language models (LLMs), such as models served by ollama, without unnecessary code changes.

This is particularly useful for developers who want to integrate different models in a single application or simply experiment with different models.

## Full Code Example
<pre><code id="codeblock">
```python
from funcchain import chain, settings
from pydantic import BaseModel, Field
from rich import print

# define your model
class SentimentAnalysis(BaseModel):
    analysis: str = Field(description="A description of the analysis")
    sentiment: bool = Field(description="True for Happy, False for Sad")

# define your prompt
def analyze(text: str) -> SentimentAnalysis:
    """
    Determines the sentiment of the text.
    """
    return chain()

if __name__ == "__main__":
    # set global llm
    settings.llm = "ollama/wizardcoder:34b-python-q3_K_M"
    # log tokens as stream to console
    settings.console_stream = True

    # run prompt
    poem = analyze("I really like when my dog does a trick!")

    # show final parsed output
    print(poem)
```
</code></pre>

# Demo
<div class="termy">
```
poem = analyze("I really like when my dog does a trick!")

$ ..................

Add demo

```
</div>

## Instructions
!!! Step-by-Step

**Necessary Imports**
```python
from funcchain import chain, settings
from pydantic import BaseModel, Field
from rich import print
```

**Define the Data Model**
Here, we define a `SentimentAnalysis` model with a description of the sentiment analysis and a boolean field indicating the sentiment.
```python
class SentimentAnalysis(BaseModel):
    analysis: str = Field(description="A description of the analysis")
    sentiment: bool = Field(description="True for Happy, False for Sad")
```

**Create the Analysis Function**
The `analyze` function takes a string as input and is expected to return a `SentimentAnalysis` object by calling the `chain()` function from the `funcchain` library.
```python
def analyze(text: str) -> SentimentAnalysis:
"""
Determines the sentiment of the text.
"""
return chain()
```

**Execution Configuration**
In the main block, configure the global settings to set the preferred LLM, enable console streaming, and run the `analyze` function with sample text. The result is printed using the `rich` library.
```python
if __name__ == "__main__":
    settings.llm = "ollama/wizardcoder:34b-python-q3_K_M"
    settings.console_stream = True
    poem = analyze("I really like when my dog does a trick!")
    print(poem)
```

!!! Important
Note that `settings.llm` can be set to any model listed in [MODELS.md](https://github.com/shroominic/funcchain/blob/main/MODELS.md) and your funcchain code will keep working; `chain()` handles everything in the background for you.
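
For illustration only (this sketch is not part of the commit), switching the backend comes down to changing the `settings.llm` string while the prompt function stays untouched; the commented-out identifier is an example name, so check [MODELS.md](https://github.com/shroominic/funcchain/blob/main/MODELS.md) for the strings your funcchain version actually supports.

```python
from funcchain import chain, settings
from pydantic import BaseModel, Field

class SentimentAnalysis(BaseModel):
    analysis: str = Field(description="A description of the analysis")
    sentiment: bool = Field(description="True for Happy, False for Sad")

def analyze(text: str) -> SentimentAnalysis:
    """
    Determines the sentiment of the text.
    """
    return chain()

if __name__ == "__main__":
    # The prompt function above never changes; only this string does.
    # Model identifiers are examples -- verify them against MODELS.md.
    settings.llm = "ollama/wizardcoder:34b-python-q3_K_M"   # local model via ollama (from this example)
    # settings.llm = "openai/gpt-3.5-turbo"                 # hosted model (example identifier)
    settings.console_stream = True
    print(analyze("My cat ignored me all day."))
```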
88 changes: 88 additions & 0 deletions docs/examples/openai_json_mode.md
@@ -0,0 +1,88 @@
# JSON Structured Output using Funcchain with OpenAI

!!! Example
See [openai_json_mode.py](https://github.com/shroominic/funcchain/blob/main/examples/openai_json_mode.py)

This example will showcase how funcchain enables OpenAI to output even the type `int` as JSON.

This example demonstrates using the funcchain library and pydantic to create a FruitSalad model, sum its contents, and output the total in a Result model as an integer.

## Full Code Example
<pre><code id="codeblock">
```python
from funcchain import chain, settings
from pydantic import BaseModel

settings.console_stream = True

class FruitSalad(BaseModel):
    bananas: int = 0
    apples: int = 0

class Result(BaseModel):
    sum: int

def sum_fruits(fruit_salad: FruitSalad) -> Result:
    """
    Sum the number of fruits in a fruit salad.
    """
    return chain()

if __name__ == "__main__":
    fruit_salad = FruitSalad(bananas=3, apples=5)
    assert sum_fruits(fruit_salad).sum == 8  # compare the parsed field, not the Result object itself
```
</code></pre>

Demo

<div class="termy">
```python
fruit_salad = FruitSalad(bananas=3, apples=5)
assert sum_fruits(fruit_salad).sum == 8
```
</div>

Instructions
!!! Step-by-Step

**Necessary Imports**
`funcchain` for chaining functionality, and `pydantic` for the data models.
```python
from funcchain import chain, settings
from pydantic import BaseModel
```

**Defining the Data Models**
We define two Pydantic models: `FruitSalad` with integer fields for the number of bananas and apples, and `Result`, which will hold the sum of the fruits.
Feel free to adapt these classes to your needs, but the use of `pydantic` is required.
```python
class FruitSalad(BaseModel):
    bananas: int = 0
    apples: int = 0

class Result(BaseModel):
    sum: int
```



**Summing Function**
The `sum_fruits` function takes a `FruitSalad` object and uses `chain()` to solve the task with an LLM. The result is returned as a `Result` object.
```python
def sum_fruits(fruit_salad: FruitSalad) -> Result:
"""
Sum the number of fruits in a fruit salad.
"""
return chain()
```


**Execution Block**
```python
if __name__ == "__main__":
    fruit_salad = FruitSalad(bananas=3, apples=5)
    assert sum_fruits(fruit_salad).sum == 8
```

In the primary execution section of the script, we instantiate a `FruitSalad` object with predefined quantities of bananas and apples. We then verify that the `sum_fruits` function accurately calculates the total count of fruits, which should be 8 in this case.
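
As a minimal follow-up sketch (assuming the model fills in the salad correctly), the returned `Result` object can also be inspected field by field instead of being asserted against:

```python
result = sum_fruits(FruitSalad(bananas=3, apples=5))
print(result)      # Result(sum=8) when the model sums correctly
print(result.sum)  # 8
```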