Agenda: Fall Semester 2017
Last update: 2017-11-13
- review parties
- community hangouts
- ???
Geocollider: a place-name and location reconciliation service for Pleiades, written by @ryanfb. More info: "Introducing Geocollider" on the Pleiades News Blog.
Concerns/Todo:
- Testing: CSV file upload
- Testing: OpenRefine reconciliation service (a request sketch follows this list)
- Hosting
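For hands-on testing, a minimal reconciliation call can be scripted against the service. The sketch below is an assumption-laden illustration: the endpoint URL is a placeholder (Geocollider's actual deployment is not given here), and the query and response shapes follow the generic OpenRefine Reconciliation Service API.

```python
# Minimal sketch of a reconciliation request against an OpenRefine-style
# service such as Geocollider exposes. SERVICE_URL is a placeholder, not
# the real endpoint; adjust before testing.
import json
import requests

SERVICE_URL = "https://geocollider.example.org/reconcile"  # placeholder

queries = {
    "q0": {"query": "Antiochia"},
    "q1": {"query": "Zeugma"},
}
resp = requests.post(SERVICE_URL, data={"queries": json.dumps(queries)}, timeout=30)
resp.raise_for_status()
for key, block in resp.json().items():
    candidates = block.get("result", [])
    if candidates:
        best = candidates[0]
        print(key, best.get("id"), best.get("name"), best.get("score"))
    else:
        print(key, "no candidates returned")
```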
The Pleiades privacy letter is out of date. It needs to be reviewed and brought up to date; we should probably retire it and write a new "privacy policy" document to take its place. Getting this done will require us to take some policy positions and perhaps make technical changes accordingly. Possible concerns include:
- HTTPS vs. HTTP
- Google Analytics
- Third-party JavaScript libraries?
- Server logs capture and retention
- Cookies
- Public/Community Privacy policy (outdated)
We are approaching a new version of the PeriodO client that will permit the creation of referenceable, user-generated collections, making it possible for Pleiades to manage a definitive and updatable period list within PeriodO. We should therefore start talking about how the existing vocabulary can be cleaned up and how new additions should be managed (cf. the request for periods for the Islamic western Mediterranean). Who should be allowed to make these additions, and what should our documentation standards be, given that Pleiades itself is acting as an authority? This will be especially important in the near future for two reasons: the collection of periods for the Near East that Müge is working on, which we will want to incorporate into the PeriodO collection, and the potential collaboration with the Aga Khan archive, which PeriodO has also approached for period information. (A sketch of reading the published PeriodO dataset follows the list of specific items below.)
Specific items:
- guidelines for adding new periods (user and reviewer documentation)
- links to PeriodO documentation for how to do this (will be improved with client update)
- status check on Müge periods
- conversation about Aga Khan periods
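As background for the cleanup discussion, the current vocabulary can be pulled straight from the published PeriodO dataset. This is a sketch only: the dataset URL and the `periodCollections`/`definitions` field names are assumptions based on the public PeriodO JSON-LD snapshots, not anything specified in this agenda.

```python
# Sketch: list period labels from the published PeriodO dataset.
# The URL and field names are assumptions (see note above) and may need
# adjusting against the current PeriodO data model.
import requests

DATASET_URL = "https://n2t.net/ark:/99152/p0d.json"  # assumed canonical dataset URL

data = requests.get(DATASET_URL, timeout=60).json()
collections = data.get("periodCollections") or data.get("authorities") or {}
for cid, coll in collections.items():
    source_title = (coll.get("source") or {}).get("title", cid)
    for pid, period in (coll.get("definitions") or coll.get("periods") or {}).items():
        print(source_title, pid, period.get("label"), sep="\t")
```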
Work in progress:
- Modern location used as proxy for a known ancient site
- Modern location of physical feature also of interest in antiquity (perhaps has moved, like a river)
- Modern location (as a village) known or assumed to be associated with an attested ancient site, but not fully proved via archaeology
Places that contain multiple locations separated by significant spatial distances.
Work in progress: a spreadsheet of location discrepancies (a distance-check sketch follows below).
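One way to populate that spreadsheet is to compute, per place, the maximum pairwise distance among its point locations and flag anything above a threshold. The sketch below assumes Pleiades-style place JSON with GeoJSON point geometries under "locations"; the 25 km cutoff is purely illustrative.

```python
# Sketch: flag places whose locations are widely separated.
# Assumes place JSON with a "locations" list of GeoJSON-like point
# geometries; field names and the threshold are illustrative.
from itertools import combinations
from math import asin, cos, radians, sin, sqrt

EARTH_RADIUS_KM = 6371.0

def haversine_km(p, q):
    """Great-circle distance in km between two (lon, lat) pairs."""
    lon1, lat1, lon2, lat2 = map(radians, (*p, *q))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

def max_location_spread_km(place):
    """Largest distance between any two point locations attached to a place."""
    points = [
        loc["geometry"]["coordinates"]
        for loc in place.get("locations", [])
        if (loc.get("geometry") or {}).get("type") == "Point"
    ]
    if len(points) < 2:
        return 0.0
    return max(haversine_km(p, q) for p, q in combinations(points, 2))

# Usage (illustrative threshold):
# if max_location_spread_km(place_json) > 25:
#     print(place_json.get("id"), "has widely separated locations")
```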
Deprecate/withdraw? #282
- a meeting has been scheduled at MIT in January, immediately following AIA/SCS. Elliott, Holman, Moss, Robinson to attend.
- standardize DARMC citations (done)
- standardize Wikipedia citations (done - but citation guide needs update)
- standardize short title usage in the Pleiades Zotero library
  - ~250 works are missing short titles entirely (a sketch for finding these via the Zotero API follows this list)
- create a new guidance document for the Zotero library?
- align legacy citations in Pleiades to the Zotero library, filling in blanks where necessary (this will probably also involve putting missing works into the Zotero library)
- standardize existing citations for other standard works like PECS, New Pauly, etc.
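To find the ~250 works without short titles, the Zotero web API (v3) can be walked directly. This is a sketch under assumptions: GROUP_ID is a placeholder for the Pleiades group library's numeric id, and a Zotero-API-Key header would be needed if the library is not publicly readable.

```python
# Sketch: list Zotero items whose shortTitle field is empty, via the
# Zotero web API (v3). GROUP_ID is a placeholder; add an API key header
# if required for this library.
import requests

GROUP_ID = "000000"  # placeholder for the Pleiades Zotero group id
URL = f"https://api.zotero.org/groups/{GROUP_ID}/items/top"

start, missing = 0, []
while True:
    resp = requests.get(URL, params={"format": "json", "limit": 100, "start": start}, timeout=30)
    resp.raise_for_status()
    batch = resp.json()
    if not batch:
        break
    for item in batch:
        data = item.get("data", {})
        # Only item types that carry a shortTitle field are considered.
        if "shortTitle" in data and not data["shortTitle"]:
            missing.append((item["key"], data.get("title", "")))
    start += len(batch)

print(f"{len(missing)} works lack a short title")
for key, title in missing:
    print(key, title, sep="\t")
```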
Criteria for identifying/modifying via script (a screening sketch follows this list):
- PECS reference?
- OSM reference with non-modern dates and place type built/cultural?
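A screening pass over the place JSON could apply both criteria. The sketch below is an assumption-heavy illustration: the field names ("references", "shortTitle", "locations", "provenance", "featureType", "end"), the built/cultural feature-type set, and the cutoff year are all placeholders to be checked against the actual Pleiades export.

```python
# Sketch: screen a Pleiades place JSON record for the two criteria above.
# All field names, the feature-type set, and the cutoff year are
# assumptions to verify against the real export format.
BUILT_CULTURAL = {"settlement", "temple", "fort", "theatre", "villa"}  # illustrative
MODERN_CUTOFF = 1500  # illustrative boundary for "non-modern" dates

def cites_pecs(place):
    """True if any bibliographic reference on the place looks like PECS."""
    return any("PECS" in (ref.get("shortTitle") or "")
               for ref in place.get("references", []))

def osm_locations_with_ancient_dates(place):
    """Ids of locations sourced from OSM whose dates end before the modern
    era and whose feature type falls in the built/cultural set."""
    hits = []
    for loc in place.get("locations", []):
        from_osm = "OpenStreetMap" in (loc.get("provenance") or "")
        built = any(ft in BUILT_CULTURAL for ft in loc.get("featureType", []))
        non_modern = loc.get("end") is not None and loc["end"] < MODERN_CUTOFF
        if from_osm and built and non_modern:
            hits.append(loc.get("id"))
    return hits
```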
Guidelines: a textually-attested name with multiple possible (tentative) archaeologically-attested locations
i.e., there are multiple, discrete archaeological sites whose ancient names are not conclusively established, and any of them (or none) may have been a place mentioned by name in a surviving ancient text.
- connection type for relationships between a city and its fortifications?
ISAW Libraries has requested that Chinese dynasties be added to the time periods collection. Lex Berman and Ruth Mostern have recommended:
- DDBC Time Authority (for accuracy)
- "a quick list of reign periods to mess with locally from Ci Hai appendix"
- "It would be ideal, especially for the earlier periods, for someone working on imperial China to use reign periods rather than dynasties"
Reclaim Pleiades account on datahub.io and bring it back under curation.