Pure zsh history import function #137
Comments
baodrate added a commit to baodrate/zsh-histdb that referenced this issue on Mar 22, 2023:
originally posted at github.com/larkery/issues/137

Function to parse zsh's histfile and add it to the database. Leverages zsh's history file parsing, so it handles newlines and arbitrary characters very well. The other tools suggested in the README require other languages and don't handle multiline commands soundly.

Inserts are batched and it's pretty fast. A bit of time is wasted calling `fc -l` to read the timestamps, but it takes ~1s to parse ~20k lines for me, so it should be fast enough for most use-cases.

Notes:

* this sets the `session` to `0`
  * (sqlite starts autoincrement ids at 1, so it shouldn't coincide with any actual histdb sessions)
* `dir` is set to the empty string (`''`) instead of `NULL`, because sqlite doesn't let you use `NULL` as part of a key
* Added a `unique(session, command_id, place_id, start_time) on conflict ignore` constraint to the `history` table so history instances are de-duped.
  * Useful for importing history file backups that likely contain dupes
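The commit adds that constraint to the table definition. For a database that already exists, SQLite's `ALTER TABLE` cannot add a table constraint, so a roughly equivalent effect can be had with a unique index plus `INSERT OR IGNORE` on the import side. A minimal sketch, assuming histdb's stock `history` table columns and its default database path (the index name here is arbitrary):

```zsh
# Sketch only: assumes the default histdb database location and the stock
# history table columns. A unique index rejects exact duplicate rows; the
# importer then has to use INSERT OR IGNORE (rather than relying on a
# table-level "on conflict ignore" clause) to skip them silently.
sqlite3 "${HISTDB_FILE:-$HOME/.histdb/zsh-histdb.db}" <<'SQL'
CREATE UNIQUE INDEX IF NOT EXISTS history_dedup
    ON history (session, command_id, place_id, start_time);
SQL
```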
my solution uses sed and .import; it takes a couple of seconds to import 150k lines. I don't think this handles duration etc. correctly though, and I don't think histdb is right for me, so I probably won't improve this.
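The commenter's actual script isn't reproduced in this extract, but a minimal sketch of a sed + `.import` pipeline along these lines might look as follows. Assumptions: `EXTENDED_HISTORY`-style entries (`: <epoch>:<elapsed>;<command>`), histdb's stock `commands`/`places`/`history` schema at the default `~/.histdb/zsh-histdb.db` path, and a made-up function name. Multiline commands, commands containing tabs, and zsh's metafied byte encoding are not handled, which is part of why the fc-based approach in this issue is more robust:

```zsh
# Hypothetical sketch of a sed + sqlite3 .import importer (not the commenter's
# exact script). Handles single-line EXTENDED_HISTORY entries only.
histdb-import-sed() {
    local db=${HISTDB_FILE:-$HOME/.histdb/zsh-histdb.db}
    local tsv=$(mktemp)

    # Rewrite each entry ": <epoch>:<elapsed>;<cmd>" as "<epoch>\t<cmd>".
    # -n/p drops continuation lines of multiline commands, which lack the prefix.
    sed -nE "s/^: ([0-9]+):[0-9]+;/\1$(printf '\t')/p" \
        "${1:-${HISTFILE:-$HOME/.zsh_history}}" > "$tsv"

    sqlite3 "$db" <<EOF
.mode tabs
CREATE TEMP TABLE import_raw (start_time INTEGER, argv TEXT);
.import $tsv import_raw
INSERT OR IGNORE INTO commands (argv) SELECT argv FROM import_raw;
INSERT OR IGNORE INTO places (host, dir) VALUES ('$HOST', '');
INSERT OR IGNORE INTO history (session, command_id, place_id, start_time)
  SELECT 0, commands.id, places.id, import_raw.start_time
  FROM import_raw
  JOIN commands ON commands.argv = import_raw.argv
  JOIN places   ON places.host = '$HOST' AND places.dir = '';
EOF
    rm -f "$tsv"
}
```

As in the comment above, duration and exit status are left NULL, matching the caveat that durations aren't handled correctly.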
In case this helps anyone else, here's a function to parse zsh's histfile and add it to the database. It leverages zsh's history file parsing, so it handles newlines and arbitrary characters very well. The other tools suggested in the README require other languages and don't handle multiline commands soundly.

Inserts are batched and it's pretty fast. A bit of time is wasted calling `fc -l` to read the timestamps, but it takes ~1s to parse ~20k lines for me, so it should be fast enough for most use-cases.

Note that:

* this sets the `session` to `0` (sqlite starts its autoincrement ids at 1, so it shouldn't coincide with any histdb sessions; that said, zsh-histdb actually initializes the session to 0 if none currently exist, so a different value might be desirable (-1?))
* `dir` is set to the empty string (`''`) instead of `NULL`, because sqlite doesn't let you use `NULL` as part of a key
* a `unique(session, command_id, place_id, start_time) on conflict ignore` constraint is added to the `history` table so history instances are de-duped. Useful if you have a bunch of history file backups to import (possibly an `on conflict ... do update ... where ...` clause could be used instead).
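Since the function itself isn't reproduced in this extract, here's a minimal sketch of the approach described above, not the author's actual code. Assumptions: the `zsh/parameter` module for the `$history` array, `fc -R`/`fc -l` for loading the file and reading timestamps, histdb's stock `commands`/`places`/`history` schema at the default `~/.histdb/zsh-histdb.db` path, and a made-up function name; session is fixed to `0` and `dir` to `''` as in the notes:

```zsh
# Hypothetical sketch, not the function from the referenced commits.
histdb-import-histfile() {
    emulate -L zsh
    zmodload zsh/parameter || return 1      # provides the $history association
    local db=${HISTDB_FILE:-$HOME/.histdb/zsh-histdb.db}
    local -A stamps
    local num ts rest n cmd expected=0

    # Let zsh itself parse the histfile (metafied bytes, multiline entries).
    fc -R "${1:-$HISTFILE}"

    # "fc -l -d -t '%s'" prints "<event>  <epoch>  <command ...>" per event.
    # Map event number -> timestamp; continuation lines of multiline commands
    # are skipped heuristically (first field not a known, larger event number).
    fc -l -d -t '%s' 1 | while read -r num ts rest; do
        [[ $num == <-> && $ts == <-> && -n ${history[$num]-} ]] || continue
        (( num > expected )) || continue
        stamps[$num]=$ts
        expected=$num
    done

    # Emit one batched SQL script in a single transaction and pipe it to sqlite3.
    # $history holds the full command text per event, embedded newlines included;
    # single quotes are doubled for SQL. OR IGNORE keeps re-imports from
    # duplicating rows when the unique constraint from the notes is present.
    {
        print -r -- "BEGIN TRANSACTION;"
        print -r -- "INSERT OR IGNORE INTO places (host, dir) VALUES ('$HOST', '');"
        for n in ${(kno)history}; do
            cmd=${history[$n]//"'"/"''"}
            print -r -- "INSERT OR IGNORE INTO commands (argv) VALUES ('$cmd');"
            print -r -- "INSERT OR IGNORE INTO history (session, command_id, place_id, start_time)
                SELECT 0, commands.id, places.id, ${stamps[$n]:-0}
                FROM commands, places
                WHERE commands.argv = '$cmd'
                  AND places.host = '$HOST' AND places.dir = '';"
        done
        print -r -- "COMMIT;"
    } | sqlite3 "$db"
}
```

Because `fc -R` appends to whatever is already in the current shell's history, running this from a fresh shell keeps the import limited to the target file.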