Handling in-press publications needs its own automated system #2
As noted by @butlerpd in SasView ticket 617, this should extend beyond auto-updates and incorporate some way to handle duplicates as well.
I have been working on this in the branch auto-updater. It can delete duplicates, but I'm still working on updating publications that have changed (e.g. from in press to having a volume and page number).
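The duplicate-deletion step could be sketched as follows. This is a minimal, hypothetical illustration, not the code in the auto-updater branch: the items are plain dicts whose field names mirror Zotero's JSON (`DOI`, `title`), and the matching rule (prefer DOI, fall back to a normalized title) is an assumption about how duplicates would be identified.

```python
def normalized_key(item):
    """Build a dedup key from a publication dict: prefer the DOI,
    fall back to a lowercased, alphanumeric-only title."""
    doi = item.get("DOI", "").strip().lower()
    if doi:
        return ("doi", doi)
    title = "".join(ch for ch in item.get("title", "").lower() if ch.isalnum())
    return ("title", title)

def find_duplicates(items):
    """Return every item whose key matches an earlier item's key;
    these are the candidates to delete from the library."""
    seen, dupes = {}, []
    for item in items:
        key = normalized_key(item)
        if key in seen:
            dupes.append(item)
        else:
            seen[key] = item
    return dupes
```

The first occurrence of each key is kept, so the oldest copy of a publication survives and later re-additions are flagged for removal.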
I created a test library in Zotero that is a subset of the main SasView library. The latest commits to my branch are able to update entries with changes pulled directly from the publisher and to delete duplicates in that library with no apparent issues. These changes will require that all publications live in a group library rather than the top-level personal library, which will also change how publications are added. Implementation steps:
I've tested the new scripts against a mirror database of the SasView publications. The updating worked well, but the one duplicate I introduced was not deleted, so a little work is needed before I turn the crons back on.
I found the issue with removing duplicates and tested the fix on the server. I've started both recurring jobs, pointed at the original SasView Zotero database.
Currently, there is no easy way to update publications when they go from 'In Press' to fully published. The current workaround is to add a second copy of the same publication and delete the previous one, but the original is often not removed from publications.md. To remove the item from the list, the page and JSON files associated with it must be deleted manually on the server performing the updates.
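One way to detect that an in-press entry has been fully published is to compare the stored record against fresh publisher metadata and look for fields that have been filled in. This is a hedged sketch, not the eventual implementation: the field names mirror Zotero's journalArticle schema, and the rule itself is only an assumption about what "needs updating" means.

```python
# Fields whose appearance typically marks the move from "in press" to
# published in a journalArticle record.
PUBLISHED_FIELDS = ("volume", "issue", "pages")

def needs_update(stored, fresh):
    """True if the fresh publisher record fills in any field that the
    stored record currently lacks (empty or missing)."""
    return any(fresh.get(f) and not stored.get(f) for f in PUBLISHED_FIELDS)
```

An updater could run this check per entry and rewrite only the records where it returns True, instead of requiring a manual delete-and-re-add cycle.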
Instead, I propose an automated method that looks for any updates to the list. This could also run as a cron job on the server, but at a much lower frequency (daily or weekly?) than the current process.
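The two cadences could look something like the crontab fragment below. The script names and paths are placeholders, not the actual scripts on the server; only the scheduling pattern (frequent fetch, infrequent update/dedup pass) comes from the discussion above.

```shell
# Hypothetical crontab entries; paths and script names are illustrative only.
# Existing frequent job: fetch newly added publications every hour.
0 * * * *  /usr/bin/python3 /opt/sasview-pubs/fetch_new.py
# Proposed infrequent job: recheck existing entries for in-press -> published
# changes and remove duplicates, once a day at 02:30.
30 2 * * * /usr/bin/python3 /opt/sasview-pubs/update_existing.py
```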