Commit

Merge pull request #1 from jekalmin/add-predefined
Change "Custom Functions" to "Functions", add predefined
jekalmin authored Oct 10, 2023
2 parents 924f516 + 0cccd04 commit 017ed77
Showing 10 changed files with 228 additions and 148 deletions.
61 changes: 51 additions & 10 deletions README.md
@@ -42,18 +42,56 @@ https://github.com/jekalmin/extended_openai_conversation/assets/2917984/4a575ee7
## Customize
### Options
By clicking a button from Edit Assist, Options can be customized.<br/>
Options are same as [OpenAI Conversation](https://www.home-assistant.io/integrations/openai_conversation/) options except for "Maximum function calls per conversation"
Options include [OpenAI Conversation](https://www.home-assistant.io/integrations/openai_conversation/) options and two new options.


- `Maximum Function Calls Per Conversation`: limits the number of function calls in a single conversation.
  (Sometimes a function is called over and over again, possibly running into an infinite loop.)
- `Functions`: A list of mappings from function spec to function.
  - `spec`: The function specification passed to the [functions](https://platform.openai.com/docs/api-reference/chat/create#chat/create-functions) parameter of the [chat API](https://platform.openai.com/docs/api-reference/chat/create).
  - `function`: The function that will be called.

"Maximum function calls per conversation" is added to limit the number of function calls in a single conversation.<br/>
(Sometimes function is called over and over again, possibly running into infinite calls)

| Edit Assist | Options |
|----------------------------------------------------------------------------------------------------------------------------------------------|-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <img width="608" alt="1" src="https://github.com/jekalmin/extended_openai_conversation/assets/2917984/bb394cd4-5790-4ac9-9311-dbcab0fcca56"> | <img width="590" alt="스크린샷 2023-10-08 오후 2 15 17" src="https://github.com/jekalmin/extended_openai_conversation/assets/2917984/2d686958-7a9a-4107-9904-eac7c2ffbbb4"> |
| <img width="608" alt="1" src="https://github.com/jekalmin/extended_openai_conversation/assets/2917984/bb394cd4-5790-4ac9-9311-dbcab0fcca56"> | <img width="591" alt="스크린샷 2023-10-10 오후 10 53 57" src="https://github.com/jekalmin/extended_openai_conversation/assets/2917984/431e4bc5-87a0-4d7b-8da0-6273f955877f"> |


### Functions
Below is the default configuration of functions.

```yaml
- spec:
    name: execute_services
    description: Use this function to execute service of devices in Home Assistant.
    parameters:
      type: object
      properties:
        list:
          type: array
          items:
            type: object
            properties:
              domain:
                type: string
                description: The domain of the service
              service:
                type: string
                description: The service to be called
              service_data:
                type: object
                description: The service data object to indicate what to control.
                  The key "entity_id" is required. The value of "entity_id" should be retrieved from a list of available devices.
            required:
            - domain
            - service
            - service_data
  function:
    type: predefined
    name: execute_service
```
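With the predefined `execute_services` function above, the model fills in the `list` argument described by the spec. As a rough illustration (the domain, service, and `entity_id` below are assumptions, and the model actually sends these arguments as JSON), a single call could carry:

```yaml
# Hypothetical arguments the model might generate for execute_services.
# The entity_id must belong to an entity exposed to the conversation agent.
list:
  - domain: light
    service: turn_on
    service_data:
      entity_id: light.living_room
```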
### Custom Functions
This is an example of configuration of custom functions.
This is an example configuration of functions.
#### Example 1.
```yaml
@@ -95,7 +133,7 @@ This is an example of configuration of custom functions.
name: '{{item}}'
```
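Only the tail of Example 1 is visible in this hunk; a minimal sketch of a complete `script`-type function of that shape might look like the following (the function name, parameter, and target service are illustrative assumptions, and it assumes the `script` type accepts a standard Home Assistant script `sequence`):

```yaml
# Hypothetical script-type function: the spec declares one "item" parameter,
# and the script passes that value to a service call via a template.
- spec:
    name: add_item_to_list
    description: Add an item to the shopping list.
    parameters:
      type: object
      properties:
        item:
          type: string
          description: The item to be added
      required:
      - item
  function:
    type: script
    sequence:
    - service: shopping_list.add_item
      data:
        name: '{{item}}'
```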
Copy and paste above configuration into "Custom Functions" .
Copy and paste the above configuration into "Functions".
Then you will be able to let OpenAI call your function.
@@ -128,10 +166,13 @@ In order to accomplish "send it to Line" like [example3](https://github.com/jeka
```
The supported function types are the following:
- script
- template
## DEBUG
- `predefined`: A pre-defined function provided by "extended_openai_conversation".
  - Currently, only `name: execute_service` is supported.
- `script`: A list of services that will be called.
- `template`: The value to be returned from the function (see the sketch after this list).
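
As a rough sketch of the `template` type (the spec and the `value_template` key below are assumptions for illustration, not taken from this commit), a function that simply renders a value back to the model could look like:

```yaml
# Hypothetical template-type function; the value_template key is an assumption.
- spec:
    name: get_current_time
    description: Get the current date and time.
    parameters:
      type: object
      properties: {}
  function:
    type: template
    value_template: "{{ now() }}"
```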

## Logging
In order to monitor logs of API requests and responses, add the following config to the `configuration.yaml` file:

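A minimal sketch of such a logger entry, assuming the integration's component path and an `info` level (both assumptions):

```yaml
# Assumed Home Assistant logger configuration; adjust the level as needed.
logger:
  logs:
    custom_components.extended_openai_conversation: info
```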
117 changes: 29 additions & 88 deletions custom_components/extended_openai_conversation/__init__.py
@@ -40,7 +40,6 @@
CONF_TOP_P,
CONF_MAX_FUNCTION_CALLS_PER_CONVERSATION,
CONF_FUNCTIONS,
CONF_CUSTOM_FUNCTIONS,
DEFAULT_CHAT_MODEL,
DEFAULT_MAX_TOKENS,
DEFAULT_PROMPT,
@@ -56,12 +55,14 @@
EntityNotExposed,
CallServiceError,
FunctionNotFound,
PredefinedNotFound,
)

from .helpers import (
CustomFunctionExecutor,
ScriptCustomFunctionExecutor,
TemplateCustomFunctionExecutor,
FunctionExecutor,
PredefinedFunctionExecutor,
ScriptFunctionExecutor,
TemplateFunctionExecutor,
convert_to_template,
)

@@ -71,12 +72,13 @@
CONFIG_SCHEMA = cv.config_entry_only_config_schema(DOMAIN)


FUNCTION_EXECUTORS: dict[str, CustomFunctionExecutor] = {
"script": ScriptCustomFunctionExecutor(),
"template": TemplateCustomFunctionExecutor(),
FUNCTION_EXECUTORS: dict[str, FunctionExecutor] = {
"predefined": PredefinedFunctionExecutor(),
"script": ScriptFunctionExecutor(),
"template": TemplateFunctionExecutor(),
}

# hass.data key for logging information.
# hass.data key for agent.
DATA_AGENT = "agent"


@@ -173,6 +175,7 @@ async def async_process(
CallServiceError,
EntityNotExposed,
FunctionNotFound,
PredefinedNotFound,
) as err:
intent_response = intent.IntentResponse(language=user_input.language)
intent_response.async_set_error(
@@ -228,19 +231,19 @@ def get_exposed_entities(self):
)
return exposed_entities

def get_custom_functions(self):
def get_functions(self):
try:
custom_functions = self.entry.options.get(CONF_CUSTOM_FUNCTIONS)
if not custom_functions:
function = self.entry.options.get(CONF_FUNCTIONS)
if not function:
return []
result = yaml.safe_load(custom_functions)
result = yaml.safe_load(function)
if result:
for setting in result:
for function in setting["function"].values():
convert_to_template(function)
return result
except:
_LOGGER.error("failed to load custom functions")
_LOGGER.error("Failed to load functions")
return []

async def query(
@@ -255,8 +258,7 @@ async def query(
max_tokens = self.entry.options.get(CONF_MAX_TOKENS, DEFAULT_MAX_TOKENS)
top_p = self.entry.options.get(CONF_TOP_P, DEFAULT_TOP_P)
temperature = self.entry.options.get(CONF_TEMPERATURE, DEFAULT_TEMPERATURE)
custom_functions = self.get_custom_functions()
functions = CONF_FUNCTIONS + list(map(lambda s: s["spec"], custom_functions))
functions = list(map(lambda s: s["spec"], self.get_functions()))
function_call = "auto"
if n_requests == self.entry.options.get(
CONF_MAX_FUNCTION_CALLS_PER_CONVERSATION,
@@ -294,104 +296,43 @@ def execute_function_call(
exposed_entities,
n_requests,
):
custom_functions = self.get_custom_functions()
function_name = message["function_call"]["name"]
custom_function = next(
(s for s in custom_functions if s["spec"]["name"] == function_name),
function = next(
(s for s in self.get_functions() if s["spec"]["name"] == function_name),
None,
)
if function_name == "execute_services":
return self.execute_services(
user_input, messages, message, exposed_entities, n_requests
)
if custom_function is not None:
return self.execute_custom_function(
if function is not None:
return self.execute_function(
user_input,
messages,
message,
exposed_entities,
n_requests,
custom_function,
function,
)
else:
raise FunctionNotFound(message["function_call"]["name"])
raise FunctionNotFound(message["function_call"]["name"])

async def execute_services(
async def execute_function(
self,
user_input: conversation.ConversationInput,
messages,
message,
exposed_entities,
n_requests,
function,
):
function_executor = FUNCTION_EXECUTORS[function["function"]["type"]]
arguments = json.loads(message["function_call"]["arguments"])

result = []
for service_argument in arguments.get("list", []):
domain = service_argument["domain"]
service = service_argument["service"]
service_data = service_argument.get(
"service_data", service_argument.get("data", {})
)
entity_id = service_data.get("entity_id", service_argument.get("entity_id"))

if isinstance(entity_id, str):
entity_id = [e.strip() for e in entity_id.split(',')]
service_data["entity_id"] = entity_id

if entity_id is None:
raise CallServiceError(domain, service, service_data)
if not self.hass.services.has_service(domain, service):
raise ServiceNotFound(domain, service)
if any(self.hass.states.get(entity) is None for entity in entity_id):
raise EntityNotFound(entity_id)
exposed_entity_ids = map(lambda e: e["entity_id"], exposed_entities)
if not set(entity_id).issubset(exposed_entity_ids):
raise EntityNotExposed(entity_id)

try:
await self.hass.services.async_call(
domain=domain,
service=service,
service_data=service_data,
)
result.append(True)
except HomeAssistantError:
_LOGGER.error(e)
result.append(False)

messages.append(
{
"role": "function",
"name": message["function_call"]["name"],
"content": str(result),
}
)
return await self.query(user_input, messages, exposed_entities, n_requests)

async def execute_custom_function(
self,
user_input: conversation.ConversationInput,
messages,
message,
exposed_entities,
n_requests,
custom_function,
):
custom_function_executor = FUNCTION_EXECUTORS[
custom_function["function"]["type"]
]
arguments = json.loads(message["function_call"]["arguments"])

result = await custom_function_executor.execute(
self.hass, custom_function, arguments, user_input
result = await function_executor.execute(
self.hass, function, arguments, user_input, exposed_entities
)

messages.append(
{
"role": "function",
"name": message["function_call"]["name"],
"content": result,
"content": str(result),
}
)
return await self.query(user_input, messages, exposed_entities, n_requests)
13 changes: 10 additions & 3 deletions custom_components/extended_openai_conversation/config_flow.py
@@ -4,6 +4,7 @@
from functools import partial
import logging
import types
import yaml
from types import MappingProxyType
from typing import Any

@@ -19,6 +20,7 @@
NumberSelector,
NumberSelectorConfig,
TemplateSelector,
AttributeSelector,
)

from .const import (
@@ -28,13 +30,14 @@
CONF_TEMPERATURE,
CONF_TOP_P,
CONF_MAX_FUNCTION_CALLS_PER_CONVERSATION,
CONF_CUSTOM_FUNCTIONS,
CONF_FUNCTIONS,
DEFAULT_CHAT_MODEL,
DEFAULT_MAX_TOKENS,
DEFAULT_PROMPT,
DEFAULT_TEMPERATURE,
DEFAULT_TOP_P,
DEFAULT_MAX_FUNCTION_CALLS_PER_CONVERSATION,
DEFAULT_CONF_FUNCTIONS,
DOMAIN,
DEFAULT_NAME,
)
@@ -48,6 +51,8 @@
}
)

DEFAULT_CONF_FUNCTIONS_STR = yaml.dump(DEFAULT_CONF_FUNCTIONS, sort_keys=False)

DEFAULT_OPTIONS = types.MappingProxyType(
{
CONF_PROMPT: DEFAULT_PROMPT,
@@ -56,6 +61,7 @@
CONF_MAX_FUNCTION_CALLS_PER_CONVERSATION: DEFAULT_MAX_FUNCTION_CALLS_PER_CONVERSATION,
CONF_TOP_P: DEFAULT_TOP_P,
CONF_TEMPERATURE: DEFAULT_TEMPERATURE,
CONF_FUNCTIONS: DEFAULT_CONF_FUNCTIONS_STR,
}
)

@@ -174,7 +180,8 @@ def openai_config_option_schema(options: MappingProxyType[str, Any]) -> dict:
default=DEFAULT_MAX_FUNCTION_CALLS_PER_CONVERSATION,
): int,
vol.Optional(
CONF_CUSTOM_FUNCTIONS,
description={"suggested_value": options.get(CONF_CUSTOM_FUNCTIONS)},
CONF_FUNCTIONS,
description={"suggested_value": options.get(CONF_FUNCTIONS)},
default=DEFAULT_CONF_FUNCTIONS_STR,
): TemplateSelector(),
}
