## 2023-01-01
- Apply some more ORCID identifiers to items on CGSpace using my `2022-09-22-add-orcids.csv` file
- I want to update all ORCID names and refresh them in the database
- I see we have some new ones that aren’t in our list when I combine it with this file:
```console
$ cat dspace/config/controlled-vocabularies/cg-creator-identifier.xml | grep -oE '[A-Z0-9]{4}-[A-Z0-9]{4}-[A-Z0-9]{4}-[A-Z0-9]{4}' | sort -u | wc -l
1939
$ cat dspace/config/controlled-vocabularies/cg-creator-identifier.xml 2022-09-22-add-orcids.csv | grep -oE '[A-Z0-9]{4}-[A-Z0-9]{4}-[A-Z0-9]{4}-[A-Z0-9]{4}' | sort -u | wc -l
1973
```
- I will extract and process them with my `resolve-orcids.py` script:
```console
$ cat dspace/config/controlled-vocabularies/cg-creator-identifier.xml 2022-09-22-add-orcids.csv | grep -oE '[A-Z0-9]{4}-[A-Z0-9]{4}-[A-Z0-9]{4}-[A-Z0-9]{4}' | sort -u > /tmp/2023-01-01-orcids.txt
$ ./ilri/resolve-orcids.py -i /tmp/2023-01-01-orcids.txt -o /tmp/2023-01-01-orcids-names.txt -d
$ ./ilri/update-orcids.py -i /tmp/2023-01-01-orcids-names.txt -db dspace -u dspace -p 'fuuu' -m 247
```
- Load on CGSpace is high, around 9.x
- I see there is a CIAT bot harvesting via the REST API with IP 45.5.186.2
- Other than that I don’t see any particular system stats as alarming
- There has been a marked increase in load in the last few weeks, perhaps due to Initiative activity…
- Perhaps there are some stuck PostgreSQL locks from CLI tools?
```console
$ psql -c 'SELECT * FROM pg_locks pl LEFT JOIN pg_stat_activity psa ON pl.pid = psa.pid;' | grep -o -E '(dspaceWeb|dspaceApi|dspaceCli)' | sort | uniq -c
     58 dspaceCli
     46 dspaceWeb
```
- The current time on the server is 08:52 and I see the dspaceCli locks were started at 04:00 and 05:00… so I need to check which cron jobs those belong to as I think I noticed this last month too
- I’m going to wait and see if they finish, but by tomorrow I will kill them
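- For reference, this is the sort of query I can use to see exactly which PIDs and queries are behind the dspaceCli locks (the column names come from pg_stat_activity, but the exact filtering here is just a sketch):

```console
$ psql -c "SELECT DISTINCT psa.pid, psa.backend_start, psa.state, left(psa.query, 60) AS query FROM pg_locks pl LEFT JOIN pg_stat_activity psa ON pl.pid = psa.pid WHERE psa.application_name = 'dspaceCli' ORDER BY psa.backend_start;"
```

- Matching the backend_start times against the crontab entries should tell me which job is holding them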
## 2023-01-02
- The load on the server is now very low and there are no more locks from dspaceCli
- So there was some long-running dspaceCli process that simply needed time to finish!
- That finally sheds some light on the “high load on Sunday” problem where I couldn’t find any other distinct pattern in the nginx or Tomcat requests
## 2023-01-03
- The high load on the server on Sundays, which I have noticed for a long time, seems to be coming from the DSpace checker cron job
- This checks the checksums of all bitstreams to see if they match the ones in the database
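- I have not pasted the exact crontab entry here, but the job in question is the `dspace checker` CLI, and an entry along these lines (the path and flags are illustrative, not copied from the server) would produce exactly this early-morning Sunday and Wednesday pattern:

```
# Illustrative crontab entry, not the real one from the server:
# loop once through all bitstreams at 04:00 on Sunday and Wednesday
0 4 * * 0,3 /home/dspace/bin/dspace checker -l
```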
- I exported the entire CGSpace metadata to do country/region checks with `csv-metadata-quality`
- I extracted only the items with countries (about 48,000), then split the file into parts of 10,000 items each, but the upload found 2,000 changes in the first part and took several hours to complete…
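- The checks themselves were just `csv-metadata-quality` runs over each part, something like this (the file names are hypothetical, and `-u` enables the unsafe fixes):

```console
$ csv-metadata-quality -i /tmp/2023-01-03-countries-part1.csv -o /tmp/2023-01-03-countries-part1-cleaned.csv -u
```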
- IWMI sent me ORCID identifiers for new scientists, bringing our total to 2,010
## 2023-01-04
- I finally finished applying the region imports (in five batches of 10,000)
- There were about 7,500 missing regions in total…
- Now I will move on to doing the Initiative mappings
- I modified my `fix-initiative-mappings.py` script to only write out the items that have updated mappings
- This makes it way easier to apply fixes to the entire CGSpace because we don’t try to import 100,000 items with no changes in mappings
- More dspaceCli locks from 04:00 this morning (current time on server is 07:33) and today is a Wednesday
- The checker cron job runs on `0,3`, which is Sunday and Wednesday, so this is from that…
- Finally at 16:30 I decided to kill the PIDs associated with those locks…
- I am going to disable that cron job for now and watch the server load for a few weeks
- Start a harvest on AReS
## 2023-01-08
- It’s Sunday and I see some PostgreSQL locks belonging to dspaceCli that started at 05:00
- That’s strange because I disabled the `dspace checker` cron job last week, so I’m not sure what this is…
- It’s currently 2:30PM on the server so these locks have been there for almost twelve hours
- I exported the entire CGSpace to update the Initiative mappings
- Items were mapped to ~58 new Initiative collections
- Then I ran the ORCID import to catch any new ones that might not have been tagged
- Then I started a harvest on AReS
## 2023-01-09
- Fix some invalid Initiative names on CGSpace and then check for missing mappings
- Check for missing regions in the Initiatives collection
- Export a list of author affiliations from the Initiatives community for Peter to check
- It was slightly ghetto because I did it from a CSV export of the Initiatives community, then imported that into OpenRefine to split the multi-value fields, then did some sed nonsense to handle the quoting:
```console
$ csvcut -c 'cg.contributor.affiliation[en_US]' ~/Downloads/2023-01-09-initiatives.csv | \
  sed -e 's/^"//' -e 's/"$//' -e 's/||/\n/g' | \
  sort -u | \
  sed -e 's/^\(.*\)/"\1/' -e 's/\(.*\)$/\1"/' > /tmp/2023-01-09-initiatives-affiliations.csv
```
## 2023-01-10
- Export the CGSpace Initiatives collection to check for missing regions and collection mappings
## 2023-01-11
- I’m trying the DSpace 7 REST API again
```console
$ curl --head https://dspace7test.ilri.org/server/api
...
set-cookie: DSPACE-XSRF-COOKIE=42c78c56-613d-464f-89ea-79142fc5b519; Path=/server; Secure; HttpOnly; SameSite=None
dspace-xsrf-token: 42c78c56-613d-464f-89ea-79142fc5b519
$ curl -v -X POST https://dspace7test.ilri.org/server/api/authn/login --data "user=alantest%40cgiar.org&password=dspace" -H "X-XSRF-TOKEN: 42c78c56-613d-464f-89ea-79142fc5b519" -b "DSPACE-XSRF-COOKIE=42c78c56-613d-464f-89ea-79142fc5b519"
...
authorization: Bearer eyJh...9-0
$ curl -v "https://dspace7test.ilri.org/api/core/items" -H "Authorization: Bearer eyJh...9-0"
```
- I created a pull request to fix the docs
- I did quite a lot of cleanup and updates on the IFPRI batch items for the Gender Equality batch upload
- Then I uploaded them to CGSpace
- I added about twenty more ORCID identifiers to my list and tagged them on CGSpace
## 2023-01-12
- I exported the entire CGSpace and did some cleanups on all metadata in OpenRefine
- I was primarily interested in normalizing the DOIs, but I also normalized a bunch of publishing places
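- The DOI normalization itself happened via transforms in OpenRefine, but the gist is just forcing the various forms onto the canonical `https://doi.org/` prefix; an equivalent, purely illustrative sed over a one-column export (hypothetical file name) would be:

```console
$ sed -E -e 's|https?://(dx\.)?doi\.org/|https://doi.org/|' -e 's|^(10\.[0-9]+/)|https://doi.org/\1|' /tmp/dois.txt
```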
- After this import finishes I will export it again to do the Initiative and region mappings
- I ran the `fix-initiative-mappings.py` script and got forty-nine new mappings…
- I added several dozen new ORCID identifiers to my list and tagged ~500 on CGSpace
- Start a harvest on AReS