---
title: "May, 2022"
date: 2022-05-04T09:13:39+03:00
author: "Alan Orth"
categories: ["Notes"]
---

2022-05-04

  • I found a few more IPs making requests using the shady Chrome 44 user agent in the last few days so I will add them to the block list too:
    • 18.207.136.176
    • 185.189.36.248
    • 50.118.223.78
    • 52.70.76.123
    • 3.236.10.11
  • Looking at the Solr statistics for 2022-04
    • 52.191.137.59 is Microsoft, but they are using a normal user agent and making tens of thousands of requests
    • 64.39.98.62 is owned by Qualys, and all their requests are probing for /etc/passwd etc
    • 185.192.69.15 is in the Netherlands and is using a normal user agent, but making excessive automated HTTP requests to paths forbidden in robots.txt
    • 157.55.39.159 is owned by Microsoft and identifies as bingbot so I don't know why its requests were logged in Solr
    • 52.233.67.176 is owned by Microsoft and uses a normal user agent, but making excessive automated HTTP requests
    • 157.55.39.144 is owned by Microsoft and uses a normal user agent, but making excessive automated HTTP requests
    • 207.46.13.177 is owned by Microsoft and identifies as bingbot so I don't know why its requests were logged in Solr
    • If I query Solr for `time:2022-04* AND dns:*msnbot* AND dns:*.msn.com.` I see a handful of IPs that made 41,000 requests
  • I purged 93,974 hits from these IPs using my check-spider-ip-hits.sh script (see the Solr sketch after this list)
  • Now looking at the Solr statistics by user agent I see:
    • SomeRandomText
    • RestSharp/106.11.7.0
    • MetaInspector/5.7.0 (+https://github.com/jaimeiniesta/metainspector)
    • wp_is_mobile
    • Mozilla/5.0 (compatible; um-LN/1.0; mailto: techinfo@ubermetrics-technologies.com; Windows NT 6.1; WOW64; rv:40.0) Gecko/20100101 Firefox/40.1"
    • insomnia/2022.2.1
    • ZoteroTranslationServer
    • omgili/0.5 +http://omgili.com
    • curb
    • Sprout Social (Link Attachment)
  • I purged 2,900 hits made by these user agents from Solr using my check-spider-hits.sh script
  • I made a pull request to COUNTER-Robots for some of these agents
    • In the meantime I will add them to our local overrides in DSpace
  • Run all system updates on the AReS server, update all Docker containers (see the sketch below), and restart the server
    • Start a harvest on AReS
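
The check-spider-ip-hits.sh and check-spider-hits.sh scripts aren't reproduced here, but a rough sketch of the kind of Solr requests such a purge involves might look like the following (the statistics core being reachable on localhost:8081 is an assumption and may differ per deployment):

```console
# inspect the hits matching the msnbot reverse DNS pattern, faceted by IP
$ curl -s "http://localhost:8081/solr/statistics/select" \
    --data-urlencode 'q=time:2022-04* AND dns:*msnbot* AND dns:*.msn.com.' \
    --data-urlencode 'rows=0' \
    --data-urlencode 'facet=true' \
    --data-urlencode 'facet.field=ip'

# purge the hits from one offending IP with a delete-by-query; the same
# pattern works for user agents, e.g. <query>userAgent:RestSharp*</query>
$ curl -s "http://localhost:8081/solr/statistics/update?softCommit=true" \
    -H "Content-Type: text/xml" \
    --data-binary '<delete><query>ip:18.207.136.176</query></delete>'
```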
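
As for updating the Docker containers, one generic way to pull the latest version of every image already present on the host (a sketch, not necessarily the exact procedure used on AReS) is:

```console
# list repo:tag for every local image and pull each one
$ docker images | grep -v ^REPO | sed 's/ \+/:/g' | cut -d: -f1,2 | xargs -L1 docker pull
```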

2022-05-05

  • Update PostgreSQL JDBC driver to 42.3.5 in the Ansible infrastructure playbooks and deploy on DSpace Test
  • Peter asked me how many items we add to CGSpace every year
    • I wrote a SQL query to check the number of items grouped by their accession dates since 2009:

```console
localhost/dspacetest= ☘ SELECT EXTRACT(year from text_value::date) AS YYYY, COUNT(*) FROM metadatavalue WHERE metadata_field_id=11 GROUP BY YYYY ORDER BY YYYY DESC LIMIT 14;
 yyyy │ count
──────┼───────
 2022 │  2073
 2021 │  6471
 2020 │  4074
 2019 │  7330
 2018 │  8899
 2017 │  6860
 2016 │  8451
 2015 │ 15692
 2014 │ 16479
 2013 │  4388
 2012 │  6472
 2011 │  2694
 2010 │  2457
 2009 │   293
```

  • Note that I had an issue with casting text_value to date because one item had an accession date of `2016` instead of `2016-09-29T20:14:47Z`
    • Once I fixed that, PostgreSQL was able to `extract()` the year (see the sketch at the end of this section)
    • There were some other methods I tried that worked also, for example TO_DATE():

```console
localhost/dspacetest= ☘ SELECT EXTRACT(year from TO_DATE(text_value, 'YYYY-MM-DD"T"HH24:MI:SS"Z"')) AS YYYY, COUNT(*) FROM metadatavalue WHERE metadata_field_id=11 GROUP BY YYYY ORDER BY YYYY DESC LIMIT 14;
```

  • But it seems PostgreSQL is smart enough to recognize date formatting in strings automatically when we cast, so we don't need to convert with `TO_DATE()` first
  • Another thing I noticed is that a few hundred items have accession dates from decades ago; perhaps this is due to importing items from the CGIAR Library?
  • I spent some time merging a few pull requests for DSpace 6.4 and porting one to main for DSpace 7.x
  • I also submitted a pull request to migrate Mirage 2's build from bower and compass to yarn and node-sass
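
Going back to the accession date issue: the direct cast handles full ISO timestamps fine, and a regular expression match can flag the malformed values. A small sketch, assuming the same dspacetest database and that metadata_field_id=11 is the accession date:

```console
# PostgreSQL casts a full ISO timestamp string straight to a date
$ psql -d dspacetest -c "SELECT '2016-09-29T20:14:47Z'::date;"
# find accession dates that don't start with a full YYYY-MM-DD value
$ psql -d dspacetest -c "SELECT dspace_object_id, text_value FROM metadatavalue WHERE metadata_field_id=11 AND text_value !~ '^\d{4}-\d{2}-\d{2}' LIMIT 10;"
```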

2022-05-07

  • Start a harvest on AReS

2022-05-09

  • Submit an issue to Atmire's bug tracker inquiring about DSpace 6.4 support

2022-05-10

2022-05-12

  • CGSpace meeting with Abenet and Peter
    • We discussed the future of CGSpace and DSpace in general in the new One CGIAR
    • We discussed how to prepare for bringing in content from the Initiatives, whether we need new metadata fields to support people from IFPRI etc
    • We discussed the need for good quality Drupal and WordPress modules so sites can harvest content from the repository
    • Peter asked me to send him a list of investors/funders/donors so he can clean it up, but also to try to align it with ROR and eventually do something like we do with country codes, adding the ROR IDs and potentially showing the badge on item views
    • We also discussed removing some Mirage 2 themes for old programs and CRPs that don't have custom branding, i.e. only Google Analytics
  • Export a list of donors for Peter to clean up:

```console
localhost/dspacetest= ☘ \COPY (SELECT DISTINCT text_value as "cg.contributor.donor", count(*) FROM metadatavalue WHERE dspace_object_id IN (SELECT uuid FROM item) AND metadata_field_id = 248 GROUP BY text_value ORDER BY count DESC) to /tmp/2022-05-12-donors.csv WITH CSV HEADER;
COPY 1184
```
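
As an aside, the hard-coded metadata_field_id of 248 for cg.contributor.donor can be double-checked against the field registry (a quick sketch; the schema join is omitted for brevity):

```console
$ psql -d dspacetest -c "SELECT metadata_field_id FROM metadatafieldregistry WHERE element='contributor' AND qualifier='donor';"
```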
  • Then I created a CSV from our cg-creator-identifier.xml controlled vocabulary and ran it against our database with add-orcid-identifiers-csv.py to see if any author names happened to match items that are missing ORCIDs in CGSpace (a hypothetical way to generate such a CSV is sketched below)

```console
$ ./ilri/add-orcid-identifiers-csv.py -i /tmp/2022-05-12-add-orcids.csv -db dspace -u dspace -p 'fuuu' | tee /tmp/orcid.log
$ grep -c "Adding ORCID" /tmp/orcid.log
85
```
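
The input CSV itself isn't shown above; hypothetically, name and ORCID iD pairs could be scraped from the vocabulary's node labels with something like the following, assuming labels of the form "Some Name: 0000-0002-1825-0097" and that this matches what add-orcid-identifiers-csv.py expects:

```console
# extract the label attribute from each node in the controlled vocabulary
$ grep -oE 'label="[^"]+"' dspace/config/controlled-vocabularies/cg-creator-identifier.xml \
    | sed -e 's/^label="//' -e 's/"$//' > /tmp/2022-05-12-add-orcids.csv
```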
  • So it's only eighty-five, but better than nothing...
  • I removed the custom Mirage 2 themes for some old projects:
    • AgriFood
    • AVCD
    • LIVES
    • FeedTheFuture
    • DrylandSystems
    • TechnicalConsortium
    • EADD
  • That should knock off a few minutes of the maven build time!
  • I generated a report from the AReS nginx logs on linode18:

```console
# zcat --force /var/log/nginx/access.log.* | grep 'GET /explorer' | goaccess --log-format=COMBINED - -o /tmp/ares_report.html
```

2022-05-13

  • Peter finalized the corrections on donors from yesterday so I extracted them into fix/delete CSVs and ran them on CGSpace:

```console
$ ./ilri/fix-metadata-values.py -i 2022-05-13-fix-CGSpace-Donors.csv -db dspace -u dspace -p 'fuuu' -f cg.contributor.donor -m 248 -t correct -d
$ ./ilri/delete-metadata-values.py -i 2022-05-13-delete-CGSpace-Donors.csv -db dspace -u dspace -p 'fuuu' -f cg.contributor.donor -m 248 -d
```

  • I cleaned up a few records manually (like some that had `\r\n`) then re-exported the donors and checked against the latest ROR dump:

```console
$ ./ilri/ror-lookup.py -i /tmp/2022-05-13-donors.csv -r v1.0-2022-03-17-ror-data.json -o /tmp/2022-05-13-ror.csv
$ csvgrep -c matched -m true /tmp/2022-05-13-ror.csv | wc -l
230
$ csvgrep -c matched -m false /tmp/2022-05-13-ror.csv | csvcut -c organization > /tmp/2022-05-13-ror-unmatched.csv
```

  • Then I sent Peter a list so he can try to update some from ROR