---
title: "December, 2023"
date: 2023-12-01T08:48:36+03:00
author: "Alan Orth"
categories: ["Notes"]
---

## 2023-12-01

- There is still high load on CGSpace and I don't know why
  - I don't see a high number of sessions compared to previous days in the last few weeks:

```console
$ for file in dspace.log.2023-11-[23]*; do echo "$file"; grep -a -oE 'session_id=[A-Z0-9]{32}' "$file" | sort | uniq | wc -l; done
dspace.log.2023-11-20
22865
dspace.log.2023-11-21
20296
dspace.log.2023-11-22
19688
dspace.log.2023-11-23
17906
dspace.log.2023-11-24
18453
dspace.log.2023-11-25
17513
dspace.log.2023-11-26
19037
dspace.log.2023-11-27
21103
dspace.log.2023-11-28
23023
dspace.log.2023-11-29
23545
dspace.log.2023-11-30
21298
```

- Even the number of unique IPs is not very high compared to the last week or so:

```console
# awk '{print $1}' /var/log/nginx/{access,library-access,oai,rest}.log.1 | sort | uniq | wc -l
17023
# awk '{print $1}' /var/log/nginx/{access,library-access,oai,rest}.log.2.gz | sort | uniq | wc -l
17294
# awk '{print $1}' /var/log/nginx/{access,library-access,oai,rest}.log.3.gz | sort | uniq | wc -l
22057
# awk '{print $1}' /var/log/nginx/{access,library-access,oai,rest}.log.4.gz | sort | uniq | wc -l
32956
# awk '{print $1}' /var/log/nginx/{access,library-access,oai,rest}.log.5.gz | sort | uniq | wc -l
11415
# awk '{print $1}' /var/log/nginx/{access,library-access,oai,rest}.log.6.gz | sort | uniq | wc -l
15444
# awk '{print $1}' /var/log/nginx/{access,library-access,oai,rest}.log.7.gz | sort | uniq | wc -l
12648
```

- It doesn't make any sense, so I think I'm going to restart the server...
  - After restarting the server the load went down to normal levels... who knows...
- I started trying to see how I'm going to generate the fake statistics for the Alliance bitstream that was replaced
  - For now, I exported all the statistics for the owningItem:

```console
$ chrt -b 0 ./run.sh -s http://localhost:8081/solr/statistics -a export -o /tmp/stats-export.json -f 'owningItem:b5862bfa-9799-4167-b1cf-76f0f4ea1e18' -k uid
```
- Importing them into DSpace Test didn't show the statistics in the Atmire module, but I see them in Solr...
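- A quick count of documents in the statistics core is enough to confirm they landed in Solr, something like this sketch (assuming the same Solr URL and owningItem as the export above):

```python
#!/usr/bin/env python3
# Count Solr statistics documents for the owningItem to verify an import landed.
# The Solr URL and core are the same ones used in the export above; adjust as needed.
import requests

SOLR_SELECT = "http://localhost:8081/solr/statistics/select"
OWNING_ITEM = "b5862bfa-9799-4167-b1cf-76f0f4ea1e18"

params = {
    "q": f"owningItem:{OWNING_ITEM}",
    "rows": 0,  # we only need the count, not the documents themselves
    "wt": "json",
}

response = requests.get(SOLR_SELECT, params=params)
response.raise_for_status()

num_found = response.json()["response"]["numFound"]
print(f"Statistics documents for {OWNING_ITEM}: {num_found}")
```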

## 2023-12-02

- Export CGSpace to check for missing Initiative collection mappings (a sketch of the kind of check is below)
- Start a harvest on AReS
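- Roughly, the check is: for each item in the export, make sure any Initiative named in its metadata also appears among its mapped collections; this is only a sketch (the column names follow DSpace's CSV export format, but the Initiative-to-collection lookup below is a made-up example):

```python
#!/usr/bin/env python3
# Sketch: flag items that name an Initiative in cg.contributor.initiative but are
# not mapped to that Initiative's collection. The handle in the lookup table is a
# placeholder, not a real CGSpace collection.
import csv

INITIATIVE_COLLECTIONS = {
    "Excellence in Agronomy": "10568/00000",  # hypothetical handle for illustration
}

with open("/tmp/cgspace-export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        initiatives = [i for i in row.get("cg.contributor.initiative[en_US]", "").split("||") if i]
        collections = row.get("collection", "").split("||")

        for initiative in initiatives:
            handle = INITIATIVE_COLLECTIONS.get(initiative)
            if handle and handle not in collections:
                print(f"{row['id']}: not mapped to {initiative} ({handle})")
```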

## 2023-12-04

- Send a message to Altmetric support because the item IWMI highlighted last month still doesn't show the attention score for the Handle, even though I tweeted it several times weeks ago
- Spent some time writing a Python script to fix the literal MaxMind City JSON objects in our Solr statistics (sketched below)
  - There are about 1.6 million of these, so I exported them using solr-import-export-json with the query `city:com*`, but ended up finding many that have missing bundles, container bitstreams, etc:

```
city:com* AND -bundleName:[* TO *] AND -containerBitstream:[* TO *] AND -file_id:[* TO *] AND -owningItem:[* TO *] AND -version_id:[* TO *]
```

- (Note the negation to find fields that are missing)
- I don't know what I want to do with these yet
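- The core of the fix is just extracting the English city name from the literal MaxMind object and writing it back as a plain string, roughly like this sketch (the exact format of the broken value is simplified here, and this is only an illustration, not the full script):

```python
#!/usr/bin/env python3
# Sketch of the idea behind fixing the literal MaxMind City objects: parse the
# toString()-style value and keep only the English city name. The example value
# below is illustrative of the kind of thing stored in the `city` field.
import json
import re

BROKEN = 'com.maxmind.geoip2.record.City [ {"geoname_id":184745,"names":{"en":"Nairobi"}} ]'

def fix_city(value: str) -> str:
    """Return the plain English city name if the value is a literal MaxMind object."""
    match = re.match(r"com\.maxmind\.geoip2\.record\.City \[ (.+) \]$", value)
    if not match:
        return value  # already a plain city name
    record = json.loads(match.group(1))
    return record.get("names", {}).get("en", "")

print(fix_city(BROKEN))  # Nairobi
```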

## 2023-12-05

- I finished the fix_maxmind_stats.py script, fixed the 1.6 million records, and imported them on CGSpace after testing on DSpace 7 Test
- Altmetric said there was a glitch with the Handle and DOI linking, and they successfully re-scraped the item page and linked them
  - They sent me a list of their current production IPs and I noticed that some of them are in our nginx bot network list:

```console
$ for network in $(csvcut -c network /tmp/ips.csv | sed 1d | sort -u); do grepcidr $network ~/src/git/rmg-ansible-public/roles/dspace/files/nginx/bot-networks.conf; done
108.128.0.0/13 'bot';
46.137.0.0/16 'bot';
52.208.0.0/13 'bot';
52.48.0.0/13 'bot';
54.194.0.0/15 'bot';
54.216.0.0/14 'bot';
54.220.0.0/15 'bot';
54.228.0.0/15 'bot';
63.32.242.35/32     'bot';
63.32.0.0/14 'bot';
99.80.0.0/15 'bot';
```

- I will remove those for now so that Altmetric doesn't have any unexpected issues harvesting (a quick way to re-check the overlap is sketched below)
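- For the record, the same overlap check can be done without grepcidr using Python's ipaddress module; a sketch, reading the same two files as the shell loop above:

```python
#!/usr/bin/env python3
# Sketch: check which networks from Altmetric's CSV overlap our nginx bot
# networks, using the same files as the grepcidr loop above.
import csv
import ipaddress
import re
from pathlib import Path

BOT_NETWORKS_CONF = Path("~/src/git/rmg-ansible-public/roles/dspace/files/nginx/bot-networks.conf").expanduser()
ALTMETRIC_CSV = Path("/tmp/ips.csv")

# Parse CIDRs out of the nginx geo block, e.g.: 52.48.0.0/13 'bot';
bot_networks = []
with open(BOT_NETWORKS_CONF) as f:
    for line in f:
        match = re.match(r"\s*([0-9a-fA-F:.]+/\d+)\s+'bot';", line)
        if match:
            bot_networks.append(ipaddress.ip_network(match.group(1)))

with open(ALTMETRIC_CSV, newline="") as f:
    for row in csv.DictReader(f):
        network = ipaddress.ip_network(row["network"], strict=False)
        for bot in bot_networks:
            if network.version == bot.version and network.overlaps(bot):
                print(f"{network} overlaps bot network {bot}")
```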

## 2023-12-08

- Finalized the script to generate Solr statistics for Alliance researcher Mirjam
  - The script is ilri/generate_solr_statistics.py (the general idea is sketched at the end of this section)
  - I generated ~3,200 statistics based on her records of that item's downloads and imported them on CGSpace
- Did some work on the DSpace 7 submission form
- Peter asked for lists of affiliations, investors, and publishers to do some cleanups
  - I generated a list from a CSV export instead of doing it based on a SQL dump...

```console
$ csvcut -c 'cg.contributor.affiliation[en_US]' /tmp/initiatives.csv       \
  | sed -e 1d -e 's/^"//' -e 's/"$//' -e 's/||/\n/g' -e '/^$/d'            \
  | sort | uniq -c | sort -hr                                              \
  | awk 'BEGIN { FS = "^[[:space:]]+[[:digit:]]+[[:space:]]+" } {print $2}'\
  | sed -e '1i cg.contributor.affiliation' -e 's/^\(.*\)$/"\1"/'           \
  > /tmp/2023-12-08-initiatives-affiliations.csv
```

- Export a list of authors as well:

```console
localhost/dspace7= ☘ \COPY (SELECT DISTINCT text_value AS "dc.contributor.author", count(*) FROM metadatavalue WHERE dspace_object_id in (SELECT dspace_object_id FROM item) AND metadata_field_id = 3 GROUP BY "dc.contributor.author" ORDER BY count DESC) to /tmp/2023-12-08-authors.csv WITH CSV HEADER;
COPY 102435
```
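- For reference, the general shape of what generate_solr_statistics.py produces is something like this (a simplified sketch: the owningItem is the one from the export above, the bitstream UUID is a placeholder, and the field list is a minimal assumption rather than the script's exact output):

```python
#!/usr/bin/env python3
# Simplified sketch of generating download statistics documents that could be
# imported into the Solr statistics core with solr-import-export-json. Field
# names are a minimal assumption, not the exact output of the real script.
import json
import random
import uuid
from datetime import datetime, timedelta, timezone

OWNING_ITEM = "b5862bfa-9799-4167-b1cf-76f0f4ea1e18"  # item from the export above
BITSTREAM_ID = "00000000-0000-0000-0000-000000000000"  # placeholder for the new bitstream UUID

def make_download() -> dict:
    """Build one bitstream download event at a random time in the past year."""
    timestamp = datetime.now(timezone.utc) - timedelta(minutes=random.randint(0, 525600))
    return {
        "uid": str(uuid.uuid4()),
        "type": 0,  # bitstream
        "id": BITSTREAM_ID,
        "owningItem": OWNING_ITEM,
        "bundleName": "ORIGINAL",
        "statistics_type": "view",
        "isBot": False,
        # ISO-8601 with millisecond precision for Solr
        "time": timestamp.strftime("%Y-%m-%dT%H:%M:%S.%f")[:-3] + "Z",
    }

docs = [make_download() for _ in range(3200)]
with open("/tmp/fake-stats.json", "w") as f:
    json.dump(docs, f, indent=2)
```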

## 2023-12-11

- Work on OpenRXV dependencies and podman a bit
- Peter noticed that the statistics for this month are very, very low on CGSpace
  - I don't know what is going on; perhaps it is related to me adjusting the nginx config last week?
  - Ah, it's probably because of the spider patterns I updated in 2023-11 (a quick way to test an agent against those patterns is sketched below)
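- A user agent can be tested against a spider pattern file (one regex per line, like the lists DSpace uses) with something like this sketch; the file path and example agent are just placeholders:

```python
#!/usr/bin/env python3
# Sketch: check whether a user agent matches any pattern in a spider agent list
# (one case-insensitive regex per line). Path and example agent are placeholders.
import re
import sys

PATTERNS_FILE = "agents.txt"
user_agent = sys.argv[1] if len(sys.argv) > 1 else "Mozilla/5.0 (compatible; Googlebot/2.1)"

with open(PATTERNS_FILE) as f:
    patterns = [line.strip() for line in f if line.strip() and not line.startswith("#")]

for pattern in patterns:
    if re.search(pattern, user_agent, re.IGNORECASE):
        print(f"Matches spider pattern: {pattern}")
        break
else:
    print("Does not match any spider pattern")
```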

## 2023-12-16

- Export CGSpace to check for missing Initiative collection mappings
- Start a harvest on AReS

## 2023-12-17

- Pull the latest master branch for OpenRXV and deploy it on the server
  - I threw away some changes in the tree regarding the Angular base href, and that broke AReS
  - So note to self: we need to set the base href in frontend/Dockerfile before building!
- Now Salem has fixed the country map

## 2023-12-18

- Work a bit on the IFPRI-ISNAR archive from Leigh
- More work on the DSpace 7 home page

## 2023-12-19

- More work on the DSpace 7 home page
- The Alliance TIP team is testing deposits to the DSpace 7 REST API and getting an HTTP 500 error
  - In the DSpace logs I see this after they log in, create the item, and update the metadata:

```console
2023-12-19 17:49:28,022 ERROR unknown unknown org.dspace.rest.Resource @ Something get wrong. Aborting context in finally statement.
```

## 2023-12-20

- The Alliance guys said that submitting via REST works now... sigh, so that's just some old DSpace 5/6 REST API bug
- I lowercased all our AGROVOC keywords in dcterms.subject in SQL:

```console
dspace=# BEGIN;
BEGIN
dspace=*# UPDATE metadatavalue SET text_value=LOWER(text_value) WHERE dspace_object_id IN (SELECT uuid FROM item) AND metadata_field_id=187 AND text_value ~ '[[:upper:]]';
UPDATE 462
dspace=*# COMMIT;
COMMIT
```