---
title: "May, 2020"
date: 2020-05-02T09:52:04+03:00
author: "Alan Orth"
categories: ["Notes"]
---

## 2020-05-02

- Peter said that CTA is having problems submitting an item to CGSpace
  - Looking at the PostgreSQL stats it seems to be the same issue that Tezira was having last week, as I see the number of connections in 'idle in transaction' and 'waiting for lock' state increasing again (see the query sketch below)
  - I see that CGSpace (linode18) is still using PostgreSQL JDBC driver version 42.2.11, and there were some bugs related to transactions fixed in 42.2.12 (which I had updated in the Ansible playbooks, but not deployed yet)

<!--more-->
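- For reference, a rough way to see the same connection-state breakdown from the command line (a sketch: `dspace` is the database name used elsewhere in these notes, and on PostgreSQL 9.6+ lock waits show up via `wait_event_type` rather than the old `waiting` column):

```
$ psql -c "SELECT state, count(*) FROM pg_stat_activity WHERE datname = 'dspace' GROUP BY state ORDER BY count DESC;"
$ psql -c "SELECT count(*) FROM pg_stat_activity WHERE datname = 'dspace' AND wait_event_type = 'Lock';"
```
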
## 2020-05-03

- Purge a few remaining bots from CGSpace Solr statistics that I had identified a few months ago:
  - `lua-resty-http/0.10 (Lua) ngx_lua/10000`
  - `omgili/0.5 +http://omgili.com`
  - `IZaBEE/IZaBEE-1.01 (Buzzing Abound The Web; https://izabee.com; info at izabee dot com)`
  - `Twurly v1.1 (https://twurly.org)`
  - `Pattern/2.6 +http://www.clips.ua.ac.be/pattern`
  - `CyotekWebCopy/1.7 CyotekHTTP/2.0`
  - This is only about 2,500 hits total from the last ten years, and half of these bots no longer seem to exist, so I won't bother submitting them to the COUNTER-Robots project (see the delete-query sketch below)
- I noticed that our custom themes were incorrectly linking to the OpenSearch XML file
  - The bug [was fixed](https://jira.lyrasis.org/browse/DS-2592) for Mirage2 in 2015
  - Note that this did not prevent OpenSearch itself from working
  - I will patch this on our DSpace 5.x and 6.x branches

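- Purging an agent like these boils down to a Solr delete-by-query against the statistics core, something along these lines (a sketch; the `userAgent` field is part of the standard DSpace statistics schema, and I'm assuming Solr on the local port 8081 as elsewhere in these notes):

```
$ curl -s 'http://localhost:8081/solr/statistics/update?softCommit=true' \
    -H 'Content-Type: text/xml' \
    --data-binary '<delete><query>userAgent:"Twurly v1.1 (https://twurly.org)"</query></delete>'
```
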
## 2020-05-06

- Atmire responded asking for more information about the Solr statistics processing bug in CUA, so I sent them some full logs
- I also asked again about the Maven variable interpolation issue for `cua.version.number`, and whether they would be willing to upgrade CUA to use Font Awesome 5 instead of 4

## 2020-05-07

- Linode sent an alert that there was high CPU usage on CGSpace (linode18) early this morning
- I looked at the nginx logs using goaccess and I found a few IPs making lots of requests around then:

```
# cat /var/log/nginx/*.log /var/log/nginx/*.log.1 | grep -E "07/May/2020:(01|03|04)" | goaccess --log-format=COMBINED -
```

- The two main IPs making requests around then are 188.134.31.88 and 212.34.8.188
  - The first is in Russia and it is hitting mostly XMLUI Discover links using _dozens_ of different user agents, a total of 20,000 requests this week
  - The second IP is CodeObia testing AReS, a total of 171,000 hits this month
- I will purge both of those IPs from the Solr stats using my `check-spider-ip-hits.sh` script:

```
$ ./check-spider-ip-hits.sh -f /tmp/ips -s statistics -p
Purging 171641 hits from 212.34.8.188 in statistics
Purging 20691 hits from 188.134.31.88 in statistics

Total number of bot hits purged: 192332
```

- And then I will add 188.134.31.88 to the nginx bad bot list and tell CodeObia to please use a "bot" user agent
- I also changed the nginx config to block requests with blank user agents

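- To verify the blank user-agent block, a request with no User-Agent header should now be rejected (a sketch; I'm assuming the new rule returns an HTTP 403 like the other bad-bot rules):

```
$ curl -s -o /dev/null -w '%{http_code}\n' -H 'User-Agent:' 'https://cgspace.cgiar.org/'
```
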
## 2020-05-11

- Bizu said she was having issues submitting to CGSpace last week
  - The issue sounds like the one Tezira and CTA were having in the last few weeks
  - I looked at the PostgreSQL graphs and see there are a lot of connections in "idle in transaction" and "waiting for lock" state:

![PostgreSQL connections](/cgspace-notes/2020/05/postgres_connections_cgspace-week.png)

- I think I'll downgrade the PostgreSQL JDBC driver from 42.2.12 to 42.2.10, which was the version we were using before these issues started happening
- Atmire sent some feedback about my ongoing issues with their CUA module, but none of it was conclusive yet
  - Regarding Font Awesome 5 they will check how much work it will take and give me a quote
- Abenet said some users are questioning why the statistics dropped so much lately, so I made a [post to Yammer](https://www.yammer.com/dspacedevelopers/#/Threads/show?threadId=674923030216704) to explain about the robots
- Last week Peter had asked me to add a new ILRI author's ORCID iD
  - I added it to the controlled vocabulary and tagged the user's existing ~11 items in CGSpace using this CSV file with my `add-orcid-identifiers-csv.py` script:

```
$ cat 2020-05-11-add-orcids.csv
dc.contributor.author,cg.creator.id
"Lutakome, P.","Pius Lutakome: 0000-0002-0804-2649"
"Lutakome, Pius","Pius Lutakome: 0000-0002-0804-2649"
$ ./add-orcid-identifiers-csv.py -i 2020-05-11-add-orcids.csv -db dspace -u dspace -p 'fuuu' -d
```

- Run system updates on CGSpace (linode18) and reboot it
  - I had to restart Tomcat five times before all the Solr statistics cores came up OK, ugh.

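- A quick way to check which Solr statistics cores actually loaded after a restart is the core admin API (a sketch; the grep is just a crude way to pull the core names out of the JSON response):

```
$ curl -s 'http://localhost:8081/solr/admin/cores?action=STATUS&wt=json' | grep -o '"name":"statistics[^"]*"'
```
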
## 2020-05-12

- Peter noticed that CGSpace is no longer on AReS, because I blocked all requests that don't specify a user agent
  - I've temporarily disabled that restriction and asked Moayad to look into how he can specify a user agent in the AReS harvester

## 2020-05-13

- Atmire responded about Font Awesome and said they can switch to version 5 for 16 credits
  - I told them to go ahead
- Also, Atmire gave me a small workaround for the `cua.version.number` interpolation issue and said they would look into the crash that happens when processing our Solr stats
- Run system updates and reboot AReS server (linode20) for the first time in almost 100 days
- I notice that AReS now has some of CGSpace's data in it (but not all) since I dropped the user-agent restriction on the REST API yesterday

## 2020-05-17

- Create an issue in the OpenRXV project for Moayad to change the default harvester user agent ([#36](https://github.com/ilri/OpenRXV/issues/36))

## 2020-05-18

- Atmire responded and said they still can't figure out the CUA statistics issue, though they seem to only be trying to understand what's going on using static analysis
  - I told them that they should try to run the code with the Solr statistics that I shared with them a few weeks ago

## 2020-05-19

- Add ORCID identifier for Sirak Bahta
  - I added it to the controlled vocabulary and tagged the user's existing ~40 items in CGSpace using this CSV file with my `add-orcid-identifiers-csv.py` script:

```
$ cat 2020-05-19-add-orcids.csv
dc.contributor.author,cg.creator.id
"Bahta, Sirak T.","Sirak Bahta: 0000-0002-5728-2489"
$ ./add-orcid-identifiers-csv.py -i 2020-05-19-add-orcids.csv -db dspace -u dspace -p 'fuuu' -d
```

- An IITA user is having issues submitting to CGSpace and I see a rising number of PostgreSQL connections waiting in transaction and on locks:

![PostgreSQL connections](/cgspace-notes/2020/05/postgres_connections_cgspace-week2.png)

- This is the same issue Tezira, Bizu, and CTA were having in the last few weeks, and I have already downgraded the PostgreSQL JDBC driver to the last version I was using before this started (42.2.10)
  - I will downgrade it to version 42.2.9 for now...
  - The only other thing I can think of is that I upgraded Tomcat to 7.0.103 in March
- Run system updates on DSpace Test (linode26) and reboot it
- Run system updates on CGSpace (linode18) and reboot it
  - After the system came back up I had to restart Tomcat 7 three times before all the Solr statistics cores came up OK
- Send Atmire a snapshot of the CGSpace database for them to possibly troubleshoot the CUA issue with DSpace 6

## 2020-05-20

- Send CodeObia some logos and footer text for the next phase of OpenRXV development ([#18](https://github.com/ilri/OpenRXV/issues/18))

## 2020-05-25

- Add ORCID identifier for CIAT author Manuel Francisco
  - I added it to the controlled vocabulary and tagged the user's existing ~27 items in CGSpace using this CSV file with my `add-orcid-identifiers-csv.py` script:

```
$ cat 2020-05-25-add-orcids.csv
dc.contributor.author,cg.creator.id
"Díaz, Manuel F.","Manuel Francisco Diaz Baca: 0000-0001-8996-5092"
"Díaz, Manuel Francisco","Manuel Francisco Diaz Baca: 0000-0001-8996-5092"
$ ./add-orcid-identifiers-csv.py -i 2020-05-25-add-orcids.csv -db dspace -u dspace -p 'fuuu' -d
```

- Last week Maria asked again about searching for items by accession or issue date
  - A few months ago I had told her to search for the ISO8601 date in Discovery search, which appears to work because it filters the results down quite a bit
  - She pointed out that the results include hits that don't match exactly, for example when part of the search string appears elsewhere in the record, like in the timestamp
  - I checked in Solr and the results are the same, so perhaps it's a limitation in Solr...? (see the range-query sketch below)
- So this effectively means that we don't have a way to create reports for items in an arbitrary date range shorter than a year:
  - DSpace advanced search is buggy or simply not designed to work like that
  - AReS Explorer currently only allows filtering by year, but will allow months soon
  - Atmire Listings and Reports only allows a "Timespan" of a year

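- For what it's worth, an exact date-range filter is easy to express against the Discovery Solr core directly, assuming it exposes a date-typed copy of the field (the `search` core name matches our setup, but the `dc.date.accessioned_dt` field name here is an assumption and may differ in our schema):

```
$ curl -s 'http://localhost:8081/solr/search/select' \
    --data-urlencode 'q=*:*' \
    --data-urlencode 'fq=dc.date.accessioned_dt:[2020-01-01T00:00:00Z TO 2020-03-31T23:59:59Z]' \
    --data-urlencode 'rows=0'
```
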
## 2020-05-29

- Linode alerted to say that the CPU load on CGSpace (linode18) was high for a few hours this morning
- Looking at the nginx logs for this morning with goaccess:

```
# cat /var/log/nginx/*.log.1 | grep -E "29/May/2020:(02|03|04|05)" | goaccess --log-format=COMBINED -
```

- The top is 172.104.229.92, which is the AReS harvester (still not using a user agent, but it's tagged as a bot in the nginx mapping)
- Second is 188.134.31.88, which is a Russian host that we also saw in the last few weeks, using a browser user agent and hitting the XMLUI (but it is tagged as a bot in nginx as well)
- Another one is 51.158.106.4, which is some Scaleway IP making requests to XMLUI with different browser user agents that I am pretty sure I have seen before but never blocked
  - According to Solr it has made about 800 requests this year, but still... it's a bot.
- One I don't think I've seen before is 95.217.58.146, which is making requests to XMLUI with a Drupal user agent
  - According to [viewdns.info](https://viewdns.info/reverseip/?host=95.217.58.146&t=1) it belongs to [landvoc.org](https://landvoc.org/)
  - I should add Drupal to the list of bots...
- Atmire got back to me about the [Solr CUA issue in the DSpace 6 upgrade](https://tracker.atmire.com/tickets-cgiar-ilri/view-ticket?id=706) and they cannot reproduce the error
  - The next step is for me to migrate DSpace Test (linode26) to DSpace 6 and try to reproduce the error there

## 2020-05-31

- Start preparing to migrate DSpace Test (linode26) to the `6_x-dev-atmire-modules` branch
  - Run all system updates and reboot
  - For now I will disable all yearly Solr statistics cores except the current `statistics` one
  - Prepare PostgreSQL with a clean snapshot of CGSpace's DSpace 5.8 database:

```
$ sudo su - postgres
$ dropdb dspacetest
$ createdb -O dspacetest --encoding=UNICODE dspacetest
$ psql dspacetest -c 'alter user dspacetest superuser;'
$ pg_restore -d dspacetest -O --role=dspacetest /tmp/cgspace_2020-05-31.backup
$ psql dspacetest -c 'alter user dspacetest nosuperuser;'
# run DSpace 5 version of update-sequences.sql!!!
$ psql -f /home/dspace/src/git/DSpace/dspace/etc/postgres/update-sequences.sql dspacetest
$ psql dspacetest -c "DELETE FROM schema_version WHERE version IN ('5.8.2015.12.03.3');"
$ psql dspacetest -c 'CREATE EXTENSION pgcrypto;'
$ exit
```

- Now switch to the DSpace 6.x branch and start a build:

```
$ chrt -i 0 ionice -c2 -n7 nice -n19 mvn -U -Dmirage2.on=true -Dmirage2.deps.included=false package
...
[ERROR] Failed to execute goal on project additions: Could not resolve dependencies for project org.dspace.modules:additions:jar:6.3: Failed to collect dependencies at com.atmire:atmire-listings-and-reports-api:jar:6.x-2.10.8-0-SNAPSHOT: Failed to read artifact descriptor for com.atmire:atmire-listings-and-reports-api:jar:6.x-2.10.8-0-SNAPSHOT: Could not transfer artifact com.atmire:atmire-listings-and-reports-api:pom:6.x-2.10.8-0-SNAPSHOT from/to atmire.com-snapshots (https://atmire.com/artifactory/atmire.com-snapshots): Not authorized , ReasonPhrase:Unauthorized. -> [Help 1]
```

- Great! I will have to send Atmire a note about this... but for now I can sync over my local `~/.m2` directory and the build completes
- After the Maven build completed successfully I installed the updated code with Ant (make sure to delete the old spring directory):

```
$ cd dspace/target/dspace-installer
$ rm -rf /blah/dspacetest/config/spring
$ ant update
```

- Database migrations take 10:18.287s during the first startup...
  - perhaps when we do the production CGSpace migration I can do this in advance and tell users not to make any submissions?
- I had a mistake in my Solr internal URL parameter so DSpace couldn't find it, but once I fixed that DSpace starts up OK!
- Once the initial Discovery reindexing was completed (after three hours or so!) I started the Solr statistics UUID migration:

```
$ export JAVA_OPTS="-Xmx1024m -Dfile.encoding=UTF-8"
$ dspace solr-upgrade-statistics-6x -i statistics -n 250000
$ dspace solr-upgrade-statistics-6x -i statistics -n 1000000
$ dspace solr-upgrade-statistics-6x -i statistics -n 1000000
...
```

- It's taking about 35 minutes for 1,000,000 records...
- Some issues towards the end of this core:

```
Exception: Error while creating field 'p_group_id{type=uuid,properties=indexed,stored,multiValued}' from value '10'
org.apache.solr.client.solrj.impl.HttpSolrServer$RemoteSolrException: Error while creating field 'p_group_id{type=uuid,properties=indexed,stored,multiValued}' from value '10'
        at org.apache.solr.client.solrj.impl.HttpSolrServer.executeMethod(HttpSolrServer.java:552)
        at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:210)
        at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:206)
        at org.apache.solr.client.solrj.request.AbstractUpdateRequest.process(AbstractUpdateRequest.java:124)
        at org.apache.solr.client.solrj.SolrServer.add(SolrServer.java:68)
        at org.apache.solr.client.solrj.SolrServer.add(SolrServer.java:54)
        at org.dspace.util.SolrUpgradePre6xStatistics.batchUpdateStats(SolrUpgradePre6xStatistics.java:161)
        at org.dspace.util.SolrUpgradePre6xStatistics.run(SolrUpgradePre6xStatistics.java:456)
        at org.dspace.util.SolrUpgradePre6xStatistics.main(SolrUpgradePre6xStatistics.java:365)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.dspace.app.launcher.ScriptLauncher.runOneCommand(ScriptLauncher.java:229)
        at org.dspace.app.launcher.ScriptLauncher.main(ScriptLauncher.java:81)
```

- So basically there are some documents that have IDs that have *not* been converted to UUID, and have *not* been labeled as "unmigrated" either...
- Of these 101,257 documents, 90,000 are of type 5 (search), 9,000 are type storage, and 800 are type view, but it's weird because if I look at their type/statistics_type using a facet the storage ones disappear... (see the facet query sketch below)
- For now I will export these documents from the statistics core and then delete them:

```
$ ./run.sh -s http://localhost:8081/solr/statistics -a export -o statistics-unmigrated.json -k uid -f '(*:* NOT id:/.{36}/) AND (*:* NOT id:/.+-unmigrated/)'
$ curl -s "http://localhost:8081/solr/statistics/update?softCommit=true" -H "Content-Type: text/xml" --data-binary "<delete><query>(*:* NOT id:/.{36}/) AND (*:* NOT id:/.+-unmigrated/)</query></delete>"
```

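- For reference, the type/statistics_type breakdown above came from a facet query along these lines against the same filter, run before purging the documents (a sketch):

```
$ curl -s 'http://localhost:8081/solr/statistics/select' \
    --data-urlencode 'q=(*:* NOT id:/.{36}/) AND (*:* NOT id:/.+-unmigrated/)' \
    --data-urlencode 'rows=0' \
    --data-urlencode 'facet=true' \
    --data-urlencode 'facet.field=type' \
    --data-urlencode 'facet.field=statistics_type'
```
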
- Now the UUID conversion script says there is nothing left to convert, so I can try to run the Atmire CUA conversion utility:

```
$ export JAVA_OPTS="-Xmx1024m -Dfile.encoding=UTF-8"
$ dspace dsrun com.atmire.statistics.util.update.atomic.AtomicStatisticsUpdateCLI -t 1
```

- The processing is very slow and there are lots of errors like this:

```
Record uid: 7b5b3900-28e8-417f-9c1c-e7d88a753221 couldn't be processed
com.atmire.statistics.util.update.atomic.ProcessingException: something went wrong while processing record uid: 7b5b3900-28e8-417f-9c1c-e7d88a753221, an error occured in the com.atmire.statistics.util.update.atomic.processor.ContainerOwnerDBProcessor
        at com.atmire.statistics.util.update.atomic.AtomicStatisticsUpdater.applyProcessors(AtomicStatisticsUpdater.java:304)
        at com.atmire.statistics.util.update.atomic.AtomicStatisticsUpdater.processRecords(AtomicStatisticsUpdater.java:176)
        at com.atmire.statistics.util.update.atomic.AtomicStatisticsUpdater.performRun(AtomicStatisticsUpdater.java:161)
        at com.atmire.statistics.util.update.atomic.AtomicStatisticsUpdater.update(AtomicStatisticsUpdater.java:128)
        at com.atmire.statistics.util.update.atomic.AtomicStatisticsUpdateCLI.main(AtomicStatisticsUpdateCLI.java:78)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.dspace.app.launcher.ScriptLauncher.runOneCommand(ScriptLauncher.java:229)
        at org.dspace.app.launcher.ScriptLauncher.main(ScriptLauncher.java:81)
Caused by: java.lang.NullPointerException
```

- Experiment a bit with the Python [country-converter](https://pypi.org/project/country-converter/) library, as it can convert between different formats (like ISO 3166 and UN M49)
  - We need to eventually find a format we can use for all CGIAR DSpaces... (see the sketch below)

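- A minimal sketch of the kind of conversion I have in mind (the classification names like `ISO3` and `UNcode` are my recollection of the library's options and may need checking):

```
$ pip install country-converter
$ python3 -c "import country_converter as coco; print(coco.convert(names=['Kenya', 'Ivory Coast'], to='ISO3'))"
$ python3 -c "import country_converter as coco; print(coco.convert(names=['Kenya', 'Ivory Coast'], to='UNcode'))"
```
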
<!-- vim: set sw=2 ts=2: -->