September, 2019
2019-09-01
- Linode emailed to say that CGSpace (linode18) had a high rate of outbound traffic for several hours this morning
Here are the top ten IPs in the nginx XMLUI and REST/OAI logs this morning:
```
# zcat --force /var/log/nginx/access.log /var/log/nginx/access.log.1 | grep -E "01/Sep/2019:0" | awk '{print $1}' | sort | uniq -c | sort -n | tail -n 10
    440 17.58.101.255
    441 157.55.39.101
    485 207.46.13.43
    728 169.60.128.125
    730 207.46.13.108
    758 157.55.39.9
    808 66.160.140.179
    814 207.46.13.212
   2472 163.172.71.23
   6092 3.94.211.189
# zcat --force /var/log/nginx/rest.log /var/log/nginx/rest.log.1 /var/log/nginx/oai.log /var/log/nginx/oai.log.1 | grep -E "01/Sep/2019:0" | awk '{print $1}' | sort | uniq -c | sort -n | tail -n 10
     33 2a01:7e00::f03c:91ff:fe16:fcb
     57 3.83.192.124
     57 3.87.77.25
     57 54.82.1.8
    822 2a01:9cc0:47:1:1a:4:0:2
   1223 45.5.184.72
   1633 172.104.229.92
   5112 205.186.128.185
   7249 2a01:7e00::f03c:91ff:fe18:7396
   9124 45.5.186.2
```
- `3.94.211.189` is MauiBot, and most of its requests are to Discovery and get rate limited with HTTP 503
- `163.172.71.23` is some IP on Online SAS in France and its user agent is `Mozilla/5.0 ((Windows; U; Windows NT 6.1; fr; rv:1.9.2) Gecko/20100115 Firefox/3.6)`
It actually got mostly HTTP 200 responses:
```
# zcat --force /var/log/nginx/access.log /var/log/nginx/access.log.1 | grep -E "01/Sep/2019:0" | grep 163.172.71.23 | awk '{print $9}' | sort | uniq -c
   1775 200
    703 499
     72 503
```
And it was mostly requesting Discover pages:
```
# zcat --force /var/log/nginx/access.log /var/log/nginx/access.log.1 | grep -E "01/Sep/2019:0" | grep 163.172.71.23 | grep -o -E "(bitstream|discover|handle)" | sort | uniq -c
   2350 discover
     71 handle
```
I’m not sure why the outbound traffic rate was so high…
2019-09-02
- Follow up with Carol and Francesca from Bioversity, as they were on holiday during mid-to-late August
- I told them to check the temporary collection on DSpace Test where I uploaded the 1,427 items so they can see how it will look
- Also, I told them to advise me about the strange file extensions (.7z, .zip, .lck)
- Also, I reminded Abenet to check the metadata, as the institutional authors at least will need some modification
2019-09-10
- Altmetric responded to say that they have fixed an issue with their badge code so now research outputs with multiple handles are showing badges!
- Follow up with Bosede about the mixup with PDFs in the items uploaded in 2018-12 (aka Daniel1807.xls)
- These are the same ones that Peter noticed last week, and that Bosede and I had been discussing earlier this year but never sorted out
- It looks like these items were uploaded by Sisay on 2018-12-19, so we can use the accession date as a filter to narrow it down to 230 items (of which only 104 have PDFs, according to the Daniel1807.xls input file); see the query sketch after this list
- I checked a few manually and they are correct in the original input file, so something must have happened when Sisay was processing them for upload
- I have asked Sisay to fix them…
- Continue working on CG Core v2 migration, focusing on the crosswalk mappings
- I think we can skip the MODS crosswalk for now because it is only used in AIP exports that are meant for non-DSpace systems
- We should probably do the QDC crosswalk as well as those in `xhtml-head-item.properties`
- Ouch, there is potentially a lot of work in the OAI metadata formats like DIM, METS, and QDC (see `dspace/config/crosswalks/oai/*.xsl`)
- In general I think I should only modify the left side of the crosswalk mappings (ie, where the metadata is coming from) so we maintain the same exact output for search engines, etc
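For reference, this is the sort of accession-date query I mean above. It is only a sketch against a local test database (the database name and connection details are assumptions, not CGSpace itself), and in practice I would also scope it to the relevant collection, since other items were surely accessioned that day too:

```
$ psql -h localhost -U postgres dspacetest -c "SELECT count(*) FROM metadatavalue WHERE resource_type_id=2 AND metadata_field_id = (SELECT metadata_field_id FROM metadatafieldregistry WHERE element='date' AND qualifier='accessioned') AND text_value LIKE '2018-12-19%';"
```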
2019-09-11
- Maria Garruccio asked me to add two new Bioversity ORCID identifiers to CGSpace so I created a pull request
- Marissa Van Epp asked me to add new CCAFS Phase II project tags to CGSpace so I created a pull request
- I will wait until I hear from her before merging it, because one tag (PII-WA_agrosylvopast) has a name similar to one that already exists (PII-WA_AgroSylvopastoralSystems) and seems to be a duplicate
- More work on the CG Core v2 migrations
- I have updated my notes on the possible changes and done more work on the XMLUI replacements
2019-09-12
- Deploy PostgreSQL JDBC driver version 42.2.7 on DSpace Test and update the Ansible infrastructure scripts
2019-09-15
- Deploy Bioversity ORCID identifier updates to CGSpace
- Deploy PostgreSQL JDBC driver 42.2.7 on CGSpace
- Run system updates on CGSpace (linode18) and restart the server
- After restarting the system Tomcat came back up, but not all Solr statistics cores were loaded
- I had to restart Tomcat one more time until the cores were loaded (verified in the Solr admin; see the sketch after this list)
- Update nginx TLS cipher suite to the latest Mozilla intermediate recommendations for nginx 1.16.0 and openssl 1.0.2
- DSpace Test (linode19) is running Ubuntu 18.04 with nginx 1.17.x and openssl 1.1.1 so it can even use TLS v1.3 if we override the nginx ssl protocol in its host vars
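The quickest way I know to verify which statistics cores actually loaded is the Solr cores STATUS API. This is only a sketch and assumes Solr is reachable locally on port 8081; the actual port and path depend on how Tomcat is set up on the server:

```
$ curl -s 'http://localhost:8081/solr/admin/cores?action=STATUS&wt=json' | grep -o -E '"name"\s*:\s*"[^"]+"'
```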
XMLUI item view pages are blank on CGSpace right now
Like earlier this year, I see the following error in the Cocoon log while browsing:
```
2019-09-15 15:32:18,137 WARN org.apache.cocoon.components.xslt.TraxErrorListener - Can not load requested doc: unknown protocol: cocoon at jndi:/localhost/themes/CIAT/xsl/../../0_CGIAR/xsl//aspect/artifactbrowser/common.xsl:141:90
```
Around the same time I see the following in the DSpace log:
```
2019-09-15 15:32:18,079 INFO org.dspace.usage.LoggerUsageEventListener @ aorth@blah:session_id=A11C362A7127004C24E77198AF9E4418:ip_addr=x.x.x.x:view_item:handle=10568/103644
2019-09-15 15:32:18,135 WARN org.dspace.core.PluginManager @ Cannot find named plugin for interface=org.dspace.content.crosswalk.DisseminationCrosswalk, name="METSRIGHTS"
```
I see a lot of these errors today, but not earlier this month:
```
# grep -c 'Cannot find named plugin' dspace.log.2019-09-*
dspace.log.2019-09-01:0
dspace.log.2019-09-02:0
dspace.log.2019-09-03:0
dspace.log.2019-09-04:0
dspace.log.2019-09-05:0
dspace.log.2019-09-06:0
dspace.log.2019-09-07:0
dspace.log.2019-09-08:0
dspace.log.2019-09-09:0
dspace.log.2019-09-10:0
dspace.log.2019-09-11:0
dspace.log.2019-09-12:0
dspace.log.2019-09-13:0
dspace.log.2019-09-14:0
dspace.log.2019-09-15:808
```
Something must have happened when I restarted Tomcat a few hours ago, because earlier in the DSpace log I see a bunch of errors like this:
```
2019-09-15 13:59:24,136 ERROR org.dspace.core.PluginManager @ Name collision in named plugin, implementation class="org.dspace.content.crosswalk.METSRightsCrosswalk", name="METSRIGHTS"
2019-09-15 13:59:24,136 ERROR org.dspace.core.PluginManager @ Name collision in named plugin, implementation class="org.dspace.content.crosswalk.OREDisseminationCrosswalk", name="ore"
2019-09-15 13:59:24,136 ERROR org.dspace.core.PluginManager @ Name collision in named plugin, implementation class="org.dspace.content.crosswalk.DIMDisseminationCrosswalk", name="dim"
```
I restarted Tomcat and the item views came back, but then the Solr statistics cores didn’t all load properly
- After restarting Tomcat once again, both the item views and the Solr statistics cores all came back OK
2019-09-19
For some reason my podman PostgreSQL container isn’t working so I had to use Docker to re-create it for my testing work today:
```
# docker pull docker.io/library/postgres:9.6-alpine
# docker volume create dspacedb_data
# docker run --name dspacedb -v dspacedb_data:/var/lib/postgresql/data -e POSTGRES_PASSWORD=postgres -p 5432:5432 -d postgres:9.6-alpine
$ createuser -h localhost -U postgres --pwprompt dspacetest
$ createdb -h localhost -U postgres -O dspacetest --encoding=UNICODE dspacetest
$ psql -h localhost -U postgres dspacetest -c 'alter user dspacetest superuser;'
$ pg_restore -h localhost -U postgres -d dspacetest -O --role=dspacetest ~/Downloads/cgspace_2019-08-31.backup
$ psql -h localhost -U postgres dspacetest -c 'alter user dspacetest nosuperuser;'
$ psql -h localhost -U postgres -f ~/src/git/DSpace/dspace/etc/postgres/update-sequences.sql dspacetest
```
Elizabeth from CIAT sent me a list of sixteen authors who need to have their ORCID identifiers tagged with their publications
- I manually checked the ORCID profile links to make sure they matched the names
Then I created an input file to use with my `add-orcid-identifiers-csv.py` script:

```
dc.contributor.author,cg.creator.id
"Kihara, Job","Job Kihara: 0000-0002-4394-9553"
"Twyman, Jennifer","Jennifer Twyman: 0000-0002-8581-5668"
"Ishitani, Manabu","Manabu Ishitani: 0000-0002-6950-4018"
"Arango, Jacobo","Jacobo Arango: 0000-0002-4828-9398"
"Chavarriaga Aguirre, Paul","Paul Chavarriaga-Aguirre: 0000-0001-7579-3250"
"Paul, Birthe","Birthe Paul: 0000-0002-5994-5354"
"Eitzinger, Anton","Anton Eitzinger: 0000-0001-7317-3381"
"Hoek, Rein van der","Rein van der Hoek: 0000-0003-4528-7669"
"Aranzales Rondón, Ericson","Ericson Aranzales Rondon: 0000-0001-7487-9909"
"Staiger-Rivas, Simone","Simone Staiger: 0000-0002-3539-0817"
"de Haan, Stef","Stef de Haan: 0000-0001-8690-1886"
"Pulleman, Mirjam","Mirjam Pulleman: 0000-0001-9950-0176"
"Abera, Wuletawu","Wuletawu Abera: 0000-0002-3657-5223"
"Tamene, Lulseged","Lulseged Tamene: 0000-0002-3806-8890"
"Andrieu, Nadine","Nadine Andrieu: 0000-0001-9558-9302"
"Ramírez-Villegas, Julián","Julian Ramirez-Villegas: 0000-0002-8044-583X"
```
I tested the file on my local development machine with the following invocation:
```
$ ./add-orcid-identifiers-csv.py -i 2019-09-19-ciat-orcids.csv -db dspace -u dspace -p 'fuuu'
```
In my test environment this added 390 ORCID identifiers
I ran the same updates on CGSpace and DSpace Test and then started a Discovery re-index to force the search index to update
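For the record, the re-index is the standard Discovery full rebuild; roughly something like the following, where the installation path is just an assumption for illustration:

```
$ time ~/dspace/bin/dspace index-discovery -b
```

The `-b` flag forces a full rebuild rather than an incremental update, which is why it takes a while on a repository this size.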
Update the PostgreSQL JDBC driver to version 42.2.8 in our Ansible infrastructure scripts
- There is only one minor fix, to a use case we aren't using, so I will deploy this on the servers the next time I do updates
Run system updates on DSpace Test (linode19) and reboot it
Start looking at IITA’s latest round of batch updates that Sisay had uploaded to DSpace Test earlier this month
- For posterity, IITA’s original input file was 20196th.xls and Sisay uploaded it as “IITA_Sep_06” to DSpace Test
- Sisay said he ran the csv-metadata-quality script on the records, but I assume he didn't run the unsafe fixes or AGROVOC checks because I still see unnecessary Unicode, excessive whitespace, one invalid ISBN, missing dates, and a few invalid AGROVOC fields (see the sketch of the invocation after this list)
- In addition, a few records were missing authorship type
- I deleted two invalid AGROVOC terms because they were ambiguous
- Validate and normalize affiliations against our 2019-04 list using reconcile-csv and OpenRefine:
```
$ lein run ~/src/git/DSpace/2019-04-08-affiliations.csv name id
```
- I always forget how to copy the reconciled values in OpenRefine, but you need to make a new column and populate it using this GREL:
```
if(cell.recon.matched, cell.recon.match.name, value)
```
- I also looked through the IITA subjects to normalize some values
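As a sketch of the csv-metadata-quality invocation I had in mind above: the filenames here are hypothetical (the records would need to be in CSV form first) and the option names should be double-checked against `csv-metadata-quality --help` for the installed version:

```
$ csv-metadata-quality -i /tmp/iita-20196th.csv -o /tmp/iita-20196th-cleaned.csv -u
```

Here `-u` enables the "unsafe" fixes; the AGROVOC validation is a separate option that has to be pointed at the subject field(s).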
Follow up with Marissa again about the CCAFS phase II project tags
Generate a list of the top 1500 authors on CGSpace:
```
dspace=# \copy (SELECT DISTINCT text_value, count(*) FROM metadatavalue WHERE metadata_field_id = (SELECT metadata_field_id FROM metadatafieldregistry WHERE element = 'contributor' AND qualifier = 'author') AND resource_type_id = 2 GROUP BY text_value ORDER BY count DESC LIMIT 1500) to /tmp/2019-09-19-top-1500-authors.csv WITH CSV HEADER;
```
Then I used `csvcut` to select the column of author names, strip the header and quote characters, and save the sorted file:

```
$ csvcut -c text_value /tmp/2019-09-19-top-1500-authors.csv | grep -v text_value | sed 's/"//g' | sort > dspace/config/controlled-vocabularies/dc-contributor-author.xml
```
After adding the XML formatting back to the file I formatted it using XML tidy:
```
$ tidy -xml -utf8 -m -iq -w 0 dspace/config/controlled-vocabularies/dc-contributor-author.xml
```
I created and merged a pull request for the updates
- This is the first time we’ve updated this controlled vocabulary since 2018-09