Multifile processing, prompts config #79

Open: ivs wants to merge 83 commits into main

Conversation

@ivs commented Oct 21, 2024

Currently WIP!

Prompts have been moved to TOML config files, so prompts can later be tuned easily for a concrete task. A prompt config for source code has been added at prompts/code.toml.
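For illustration, such a config could be read roughly like this (a sketch only; the `tunable` section and key names are assumptions, not necessarily the exact schema in this PR):

```python
import tomllib  # stdlib in Python 3.11+; use the "tomli" package on older versions
from pathlib import Path

def load_prompt_config(path: str = "prompts/code.toml") -> dict:
    """Read a TOML prompt config into a plain dict of template strings."""
    with Path(path).open("rb") as f:  # tomllib requires a binary file handle
        return tomllib.load(f)

config = load_prompt_config()
# Hypothetical keys: each prompt becomes a string that can be edited per task.
entity_prompt = config.get("entity_extraction", {}).get("prompt", "")
```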

There are some doubts regarding entity capitalization. I have switched it off for now, because I think it is harmful for code; maybe it would be nice to control it with a parameter, since it seems beneficial for natural language.

Every chunk is now embedded with metadata; directly inserted text has !none in those fields:

```
####
## FILENAME: README.md
## FILEPATH: ./README.md
## CHUNK_NUM: 0
###
```
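As a rough illustration, the header could be assembled like this (a sketch only; the exact fallback handling for the !none placeholder is an assumption based on the example above):

```python
def chunk_header(filename: str | None, filepath: str | None, chunk_num: int | None) -> str:
    """Build the metadata header prepended to each chunk before embedding.
    Plain-text inserts carry no file info, so the fields fall back to !none."""
    return (
        "####\n"
        f"## FILENAME: {filename or '!none'}\n"
        f"## FILEPATH: {filepath or '!none'}\n"
        f"## CHUNK_NUM: {chunk_num if chunk_num is not None else '!none'}\n"
        "###\n"
    )

print(chunk_header("README.md", "./README.md", 0))
```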

insert accepts either text or a Path object, or a list of either:

```python
def insert(self, input_data: Union[str, os.PathLike, List[Union[str, os.PathLike]]])
```
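For example (a usage sketch assuming an already-initialised LightRAG instance named `rag`; construction is omitted):

```python
from pathlib import Path

# rag: an already-initialised LightRAG instance (setup omitted).
rag.insert("Some raw text to index.")                      # plain text
rag.insert(Path("./README.md"))                            # a single file
rag.insert(["More raw text.", Path("./src/main.py")])      # a mixed list of both
```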

File name, path, and chunk number are now passed into entity and relationship extraction.

LarFii and others added 28 commits October 17, 2024 16:02
update Step_3.py and openai compatible script
Add lines that can be uncommented if running in a jupyter notebook
Add comment specifying jupyter req
chore: added pre-commit-hooks and ruff formatting for commit-hooks
Add examples/vram_management_demo.py
make graph visualization become colorful
@Jaykumaran commented Nov 4, 2024

You should have synced your fork with the main branch before creating the PR, as there are merge conflicts.
