- Spend some time looking at duplicate DOIs again...

## 2024-05-06

- Spend some time looking at duplicate DOIs again...

## 2024-05-07

- Discuss RSS feeds and OpenSearch with IWMI
- It seems our OpenSearch feed settings are using the defaults, so I need to copy some of those over from our old DSpace 6 branch
- I saw a patch for an interesting issue on DSpace GitHub: [Error submitting or deleting items - URI too long when user is in a large number of groups](https://github.com/DSpace/DSpace/issues/9544)
- I hadn't realized it, but we have lots of those errors:

```console
$ zstdgrep -a 'URI Too Long' log/dspace.log-2024-04-* | wc -l
1423
```

- Spend some time looking at duplicate DOIs again...

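- For reference, DSpace's OpenSearch settings live under the `websvc.opensearch.*` namespace in `dspace.cfg`; a sketch of the kind of values to carry over (the values below are illustrative, not our actual DSpace 6 settings):

```
websvc.opensearch.enable = true
websvc.opensearch.shortname = CGSpace
websvc.opensearch.longname = CGSpace Repository
websvc.opensearch.description = CGSpace repository search
websvc.opensearch.formats = html,atom,rss
```
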
## 2024-05-08

- Spend some time looking at duplicate DOIs again...
- I finally finished looking at the duplicate DOIs for journal articles
- I updated the list of handle redirects and there are 386 of them!

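- One way to implement a list of handle redirects like this is an nginx `map` generated from the CSV (a sketch under the assumption that we redirect at the web server; the handles below are made up for illustration):

```nginx
# hypothetical handle-to-handle redirect map (example handles only)
map $request_uri $handle_redirect {
    default             "";
    /handle/10568/11111 /handle/10568/22222;
}

server {
    # ... existing server configuration ...
    if ($handle_redirect != "") {
        return 301 $handle_redirect;
    }
}
```
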
## 2024-05-09

- Spend some time working on the IFPRI 2020–2021 batch
- I started by checking for exact duplicates (1.0 similarity) using DOI, type, and issue date

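- A quick way to surface exact duplicates on a single column like DOI is plain `sort`/`uniq` before any fuzzy matching (a sketch assuming a hypothetical `items.csv` with the DOI in the second column):

```console
$ cut -d, -f2 items.csv | tail -n +2 | sort | uniq -d
```

This only prints DOIs that occur more than once, skipping the CSV header row.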
## 2024-05-12

- I couldn't figure out how to do a complex join on withdrawn items along with their metadata, so I pulled out a few fields like titles, handles, and provenance separately:

```psql
dspace=# \COPY (SELECT i.uuid, m.text_value AS uri FROM item i JOIN metadatavalue m ON i.uuid = m.dspace_object_id WHERE withdrawn AND m.metadata_field_id=25) TO /tmp/withdrawn-handles.csv CSV HEADER;
dspace=# \COPY (SELECT i.uuid, m.text_value AS title FROM item i JOIN metadatavalue m ON i.uuid = m.dspace_object_id WHERE withdrawn AND m.metadata_field_id=64) TO /tmp/withdrawn-titles.csv CSV HEADER;
dspace=# \COPY (SELECT i.uuid, m.text_value AS submitted_by FROM item i JOIN metadatavalue m ON i.uuid = m.dspace_object_id WHERE withdrawn AND m.metadata_field_id=28 AND m.text_value LIKE 'Submitted by%') TO /tmp/withdrawn-submitted-by.csv CSV HEADER;
```

- Then joined them:

```console
$ csvjoin -c uuid /tmp/withdrawn-titles.csv /tmp/withdrawn-handles.csv /tmp/withdrawn-submitted-by.csv > /tmp/withdrawn.csv
```

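- For reference, a single query that joins `metadatavalue` once per field would probably produce the same CSV directly (an untested sketch using the same metadata_field_ids as above):

```psql
dspace=# \COPY (SELECT i.uuid, h.text_value AS uri, t.text_value AS title, p.text_value AS submitted_by FROM item i JOIN metadatavalue h ON i.uuid = h.dspace_object_id AND h.metadata_field_id=25 JOIN metadatavalue t ON i.uuid = t.dspace_object_id AND t.metadata_field_id=64 LEFT JOIN metadatavalue p ON i.uuid = p.dspace_object_id AND p.metadata_field_id=28 AND p.text_value LIKE 'Submitted by%' WHERE i.withdrawn) TO /tmp/withdrawn.csv CSV HEADER;
```
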
- This gives me an insight into who submitted 334 of the duplicates over the past few years...
- I fixed a few hundred titles with leading/trailing whitespace, newlines, and ligatures like ﬀ, ﬁ, ﬂ, ﬃ, and ﬄ

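- Unicode NFKC normalization expands those ligature code points back to plain letters, which makes for a quick spot check (assuming `python3` is available):

```console
$ python3 -c "import unicodedata; print(unicodedata.normalize('NFKC', 'e\ufb00ective'))"
effective
```
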
<!-- vim: set sw=2 ts=2: -->