2020-05-02
- Peter said that CTA is having problems submitting an item to CGSpace
- Looking at the PostgreSQL stats it seems to be the same issue that Tezira was having last week, as I see the number of connections in the ‘idle in transaction’ and ‘waiting for lock’ states increasing again (see the query sketch below)
- I see that CGSpace (linode18) is still using PostgreSQL JDBC driver version 42.2.11, and there were some bugs related to transactions fixed in 42.2.12 (which I had updated in the Ansible playbooks, but not deployed yet)
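- A quick way to see those connection states is to group pg_stat_activity by state (a rough sketch run as the postgres user, using the standard PostgreSQL 9.6+ columns, not the exact query behind our graphs):
$ psql -c 'SELECT datname, state, wait_event_type, count(*) FROM pg_stat_activity GROUP BY 1, 2, 3 ORDER BY 4 DESC;'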
2020-05-03
- Purge a few remaining bots from CGSpace Solr statistics that I had identified a few months ago
lua-resty-http/0.10 (Lua) ngx_lua/10000
omgili/0.5 +http://omgili.com
IZaBEE/IZaBEE-1.01 (Buzzing Abound The Web; https://izabee.com; info at izabee dot com)
Twurly v1.1 (https://twurly.org)
Pattern/2.6 +http://www.clips.ua.ac.be/pattern
CyotekWebCopy/1.7 CyotekHTTP/2.0
- This is only about 2,500 hits total from the last ten years, and half of these bots no longer seem to exist, so I won’t bother submitting them to the COUNTER-Robots project
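- For reference, purging a single agent like these boils down to a delete-by-query against the Solr statistics core, roughly like this (a sketch, assuming Solr is listening locally on port 8081):
$ curl -s 'http://localhost:8081/solr/statistics/update?commit=true' -H 'Content-Type: text/xml' --data-binary '<delete><query>userAgent:"Pattern/2.6 +http://www.clips.ua.ac.be/pattern"</query></delete>'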
- I noticed that our custom themes were incorrectly linking to the OpenSearch XML file
- The bug was fixed for Mirage2 in 2015
- Note that this did not prevent OpenSearch itself from working
- I will patch this on our DSpace 5.x and 6.x branches
2020-05-06
- Atmire responded asking for more information about the Solr statistics processing bug in CUA so I sent them some full logs
- Also I asked again about the Maven variable interpolation issue for `cua.version.number`, and if they would be willing to upgrade CUA to use Font Awesome 5 instead of 4
2020-05-07
- Linode sent an alert that there was high CPU usage on CGSpace (linode18) early this morning
- I looked at the nginx logs using goaccess and I found a few IPs making lots of requests around then:
# cat /var/log/nginx/*.log /var/log/nginx/*.log.1 | grep -E "07/May/2020:(01|03|04)" | goaccess --log-format=COMBINED -
- The two main IPs making requests around then are 188.134.31.88 and 212.34.8.188
- The first is in Russia and it is hitting mostly XMLUI Discover links using dozens of different user agents, a total of 20,000 requests this week
- The second IP is CodeObia testing AReS, a total of 171,000 hits this month
- I will purge both of those IPs from the Solr stats using my `check-spider-ip-hits.sh` script:
$ ./check-spider-ip-hits.sh -f /tmp/ips -s statistics -p
Purging 171641 hits from 212.34.8.188 in statistics
Purging 20691 hits from 188.134.31.88 in statistics
Total number of bot hits purged: 192332
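- For reference, the /tmp/ips input file is nothing special, just the offending addresses one per line:
$ cat /tmp/ips
212.34.8.188
188.134.31.88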
- And then I will add 188.134.31.88 to the nginx bad bot list and tell CodeObia to please use a “bot” user agent
- I also changed the nginx config to block requests with blank user agents
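- The blank user agent check is just a simple conditional in the nginx server block, roughly like this (a sketch, not the exact rule from our configuration):
# return 403 Forbidden when the User-Agent header is empty
if ($http_user_agent = "") {
    return 403;
}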
2020-05-11
- Bizu said she was having issues submitting to CGSpace last week
- The issue sounds like the one Tezira and CTA were having in the last few weeks
- I looked at the PostgreSQL graphs and see there are a lot of connections in “idle in transaction” and “waiting for lock” state:
- I think I’ll downgrade the PostgreSQL JDBC driver from 42.2.12 to 42.2.10, which was the version we were using before these issues started happening
- Atmire sent some feedback about my ongoing issues with their CUA module, but none of it was conclusive yet
- Regarding Font Awesome 5 they will check how much work it will take and give me a quote
- Abenet said some users are questioning why the statistics dropped so much lately, so I made a post to Yammer to explain about the robots
- Last week Peter had asked me to add a new ILRI author’s ORCID iD
- I added it to the controlled vocabulary and tagged the user’s existing ~11 items in CGSpace using this CSV file with my `add-orcid-identifiers-csv.py` script:
$ cat 2020-05-11-add-orcids.csv
dc.contributor.author,cg.creator.id
"Lutakome, P.","Pius Lutakome: 0000-0002-0804-2649"
"Lutakome, Pius","Pius Lutakome: 0000-0002-0804-2649"
$ ./add-orcid-identifiers-csv.py -i 2020-05-11-add-orcids.csv -db dspace -u dspace -p 'fuuu' -d
- Run system updates on CGSpace (linode18) and reboot it
- I had to restart Tomcat five times before all Solr statistics cores came up OK, ugh.
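- A quick way to check whether all the cores actually loaded after a restart is the Solr cores STATUS API (again assuming Solr on localhost:8081):
$ curl -s 'http://localhost:8081/solr/admin/cores?action=STATUS&wt=json' | grep -o -E '"name":"[^"]+"'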
2020-05-12
- Peter noticed that CGSpace is no longer on AReS, because I blocked all requests that don’t specify a user agent
- I’ve temporarily disabled that restriction and asked Moayad to look into how he can specify a user agent in the AReS harvester
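- At the HTTP level this is just a matter of sending a User-Agent header with each request, for example with curl (the agent string and endpoint here are only illustrative):
$ curl -s -A 'OpenRXV harvester' 'https://cgspace.cgiar.org/rest/items?limit=1'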
2020-05-13
- Atmire responded about Font Awesome and said they can switch to version 5 for 16 credits
- Also, Atmire gave me a small workaround for the `cua.version.number` interpolation issue and said they would look into the crash that happens when processing our Solr stats
- Run system updates and reboot AReS server (linode20) for the first time in almost 100 days
- I notice that AReS now has some of CGSpace’s data in it (but not all) since I dropped the user-agent restriction on the REST API yesterday
2020-05-17
- Create an issue in the OpenRXV project for Moayad to change the default harvester user agent (#36)
2020-05-18
- Atmire responded and said they still can’t figure out the CUA statistics issue, though they seem to only be trying to understand what’s going on using static analysis
- I told them that they should try to run the code with the Solr statistics that I shared with them a few weeks ago
2020-05-19
- Add ORCID identifier for Sirak Bahta
- I added it to the controlled vocabulary and tagged the user’s existing ~40 items in CGSpace using this CSV file with my `add-orcid-identifiers-csv.py` script:
$ cat 2020-05-19-add-orcids.csv
dc.contributor.author,cg.creator.id
"Bahta, Sirak T.","Sirak Bahta: 0000-0002-5728-2489"
$ ./add-orcid-identifiers-csv.py -i 2020-05-19-add-orcids.csv -db dspace -u dspace -p 'fuuu' -d
- An IITA user is having issues submitting to CGSpace and I see there are a rising number of PostgreSQL connections waiting in transaction and in lock:
- This is the same issue Tezira, Bizu, and CTA were having in the last few weeks, and I had already downgraded the PostgreSQL JDBC driver to the last version we were using before this started (42.2.10)
- I will downgrade it to version 42.2.9 for now…
- The only other thing I can think of is that I upgraded Tomcat to 7.0.103 in March
- Run system updates on DSpace Test (linode26) and reboot it
- Run system updates on CGSpace (linode18) and reboot it
- After the system came back up I had to restart Tomcat 7 three times before all the Solr statistics cores came up OK
- Send Atmire a snapshot of the CGSpace database for them to possibly troubleshoot the CUA issue with DSpace 6
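- The snapshot is just a regular pg_dump in custom format, something like this (a sketch; the database name matches the one used in the scripts above and the file name is illustrative):
$ pg_dump -U postgres -Fc dspace -f cgspace-2020-05-19.dump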
2020-05-20
- Send CodeObia some logos and footer text for the next phase of OpenRXV development (#18)
2020-05-25
- Add ORCID identifier for CIAT author Manuel Francisco
- I added it to the controlled vocabulary and tagged the user’s existing ~27 items in CGSpace using this CSV file with my `add-orcid-identifiers-csv.py` script:
$ cat 2020-05-25-add-orcids.csv
dc.contributor.author,cg.creator.id
"Díaz, Manuel F.","Manuel Francisco Diaz Baca: 0000-0001-8996-5092"
"Díaz, Manuel Francisco","Manuel Francisco Diaz Baca: 0000-0001-8996-5092"
$ ./add-orcid-identifiers-csv.py -i 2020-05-25-add-orcids.csv -db dspace -u dspace -p 'fuuu' -d
- Last week Maria asked again about searching for items by accession or issue date
- A few months ago I had told her to search for the ISO8601 date in Discovery search, which appears to work because it filters the results down quite a bit
- She pointed out that the results include hits that don’t exactly match, for example if part of the search string appears elsewhere like in the timestamp
- I checked in Solr and the results are the same, so perhaps it’s a limitation in Solr…? (see the range query sketch after this list)
- So this effectively means that we don’t have a way to create reports for items in an arbitrary date range shorter than a year:
- DSpace advanced search is buggy or simply not designed to work like that
- AReS Explorer currently only allows filtering by year, but will allow months soon
- Atmire Listings and Reports only allows a “Timespan” of a year
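- In Solr itself a proper range query on a date-typed field would avoid the partial matches, roughly like this (a sketch; the field name is a guess, as the date fields in the Discovery search core may be named differently):
$ curl -s 'http://localhost:8081/solr/search/select' --data-urlencode 'q=*:*' --data-urlencode 'fq=dc.date.accessioned_dt:[2020-01-01T00:00:00Z TO 2020-03-31T23:59:59Z]' --data-urlencode 'rows=0' --data-urlencode 'wt=json'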