Python implementation #16
Conversation
- Almost finished pip integration
- Commands are streamed and can be used with the zsh feature
- Added docs, lint/format ruff-compliant
- Possibility to add or modify openai key
- Multiple commands (ex: output of 'alias') now support zsh buffering
This looks excellent! Thanks for your work. I'll give it a proper review and run through some tests ASAP.
- Removed pylint comments
- Fixed minor bug in GPT-4 prompt parsing
Minor feedback
    return model


def stream_commands(response: llm.Response, command_prefix: str = "") -> None:
Couldn't this be a class?
I don't think it should be a class, even though the function structure is not the best.
IMO, namespacing is done with module names, and I don't believe we need a class here: the logic isn't really complex (and it is still only logic, which is what a function should be), there is nothing to store long-term, and no other function or class needs an object representing this.
def setup():
    """Checks if `llm` has an OpenAI key and prompts to change it or create one"""
    import llm.cli
I don't yet know how performant the `__pycache__` will be (IMO even 0.1 s for the help command feels clunky).
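One way to check whether the cost stays under that 0.1 s budget is to time the import in a fresh interpreter (a warm `__pycache__` only skips bytecode compilation; interpreter startup and the import itself still dominate). `python -X importtime` gives a per-module breakdown. A rough sketch, using `json` as a placeholder module:

```python
import subprocess
import sys
import time


def time_import(module: str) -> float:
    """Time importing `module` in a fresh interpreter.

    Includes interpreter startup, so this measures the worst case a CLI
    user actually experiences, not just the module's own import cost.
    """
    start = time.perf_counter()
    subprocess.run([sys.executable, "-c", f"import {module}"], check=True)
    return time.perf_counter() - start


if __name__ == "__main__":
    print(f"json: {time_import('json'):.3f}s")
```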
Hi @edouard-sn. Are you able to explain the use of 80caa04#diff-a2cbd9f31a7210a200606eb185a572e28807c04a7682611e8416698e02081336R60?
@all-contributors please add @edouard-sn for code, ideas, maintenance, platform. Please add @antwxne for review.
I've put up a pull request to add @edouard-sn! 🎉
I've put up a pull request to add @antwxne! 🎉
Didn't have a lot to do this weekend, so I decided to quickly port clipea to Python.
I tried to stay close to how @dave1010 designed the app while adapting it to Python.
I used the `llm` library, and you will see some local imports for llm; this was done to avoid 0.2+ s of import time when the user is, for instance, only asking for help and the router imports all commands. I dropped the `brew` support, given that the port is now pip-compliant. I didn't upload to PyPI, but can do it if needed.

Resources (config) live straight in the `./clipea` package folder; IMO it could be done better, but it is fine for now. Feel free to discuss the implementation below.
Closes #3