Update notes

Alan Orth 2020-06-09 12:28:04 +03:00
parent 6075c827d1
commit 4280ad7388
Signed by: alanorth
GPG Key ID: 0FB860CC9C45B1B9
4 changed files with 221 additions and 2 deletions


@@ -129,4 +129,209 @@ sys 3m13.929s
- Unfortunately the item is still not showing up in the item mapper...
- Something happened to AReS Explorer (linode20) so I ran all system updates and rebooted it
## 2020-06-07
- Peter said he was annoyed with a CSV export from CGSpace because of the different `text_lang` attributes and asked if we can fix it
- The last time I normalized these was in 2019-06, and currently it looks like this:
```
dspace=# SELECT DISTINCT text_lang, count(text_lang) FROM metadatavalue WHERE resource_type_id=2 GROUP BY text_lang ORDER BY count DESC;
text_lang | count
-------------+---------
en_US | 2158377
en | 149540
| 49206
es_ES | 18
fr | 4
Peer Review | 1
| 0
(7 rows)
```
- In theory we can have different languages for metadata fields but in practice we don't do that, so we might as well normalize everything to "en_US" (and perhaps I should make a curation task to do this)
- For now I will do it manually on CGSpace and DSpace Test:
```
dspace=# UPDATE metadatavalue SET text_lang='en_US' WHERE resource_type_id=2;
UPDATE 2414738
```
- Note: DSpace Test doesn't have the `resource_type_id` column because it's running DSpace 6 and [the schema changed to use an object model there](https://wiki.lyrasis.org/display/DSPACE/DSpace+Service+based+api)
- We need to use this on DSpace 6:
```
dspace=# UPDATE metadatavalue SET text_lang='en_US' WHERE dspace_object_id IN (SELECT uuid FROM item);
```
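- To verify the result on DSpace 6, the earlier count query can be adapted to the new object model (a sketch, reusing the filter from the UPDATE above):
```
dspace=# SELECT DISTINCT text_lang, count(text_lang) FROM metadatavalue WHERE dspace_object_id IN (SELECT uuid FROM item) GROUP BY text_lang ORDER BY count DESC;
```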
- Peter asked if it was possible to find all ILRI items that have "zoonoses" or "zoonotic" in their titles and check if they have the ILRI subject "ZOONOTIC DISEASES" (and add it if not)
- Unfortunately the only way we currently have is to export the entire ILRI community as a CSV and filter/edit it in OpenRefine
## 2020-06-08
- I manually mapped the two Big Data items that Maria had asked about last week by exporting their metadata to CSV and re-importing it
- I still need to look into the underlying issue there, seems to be something in Solr
- Something strange is that when I search for part of the title in Discovery I get 2,000 results on CGSpace, while on my local DSpace 5.8 environment I get 2!
![CGSpace Discovery search results](/cgspace-notes/2020/06/cgspace-discovery-search.png)
![Local DSpace 5.8 Discovery search results](/cgspace-notes/2020/06/localhost-discovery-search.png)
- On DSpace Test, which is currently running DSpace 6, I get 2,000 results but the top one is the correct match and the item does show up in the item mapper
- Interestingly, if I search directly in the Solr `search` core on CGSpace with a query like `handle:10568/108315` I don't see the item, but on my local Solr I do!
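- For reference, this is roughly how to ask the `search` core whether the item's document is there at all (a sketch; adjust the host and port to wherever Solr is listening):
```
$ curl -s 'http://localhost:8081/solr/search/select' --data-urlencode 'q=handle:"10568/108315"' -d 'rows=0'
```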
- Peter asked if it was easy for me to add ILRI subject "ZOONOTIC DISEASES" to any items in the ILRI community that had "zoonotic" or "zoonoses" in their title, but were missing the ILRI subject
- I exported the ILRI community metadata, cut the three fields I needed, and then filtered and edited the CSV in OpenRefine:
```
$ dspace metadata-export -i 10568/1 -f /tmp/2020-06-08-ILRI.csv
$ csvcut -c 'id,cg.subject.ilri[en_US],dc.title[en_US]' ~/Downloads/2020-06-08-ILRI.csv > /tmp/ilri.csv
```
- Moayad asked why he's getting HTTP 500 errors on CGSpace
- I looked in the nginx logs and I see some HTTP 500 responses, but nothing in nginx's error.log
- Looking in Tomcat's log I see there are many:
```
# journalctl --since=today -u tomcat7 | grep -c 'Internal Server Error'
482
```
- They are all related to the REST API, like:
```
Jun 07 02:00:27 linode18 tomcat7[6286]: SEVERE: Mapped exception to response: 500 (Internal Server Error)
Jun 07 02:00:27 linode18 tomcat7[6286]: javax.ws.rs.WebApplicationException
Jun 07 02:00:27 linode18 tomcat7[6286]: at org.dspace.rest.Resource.processException(Resource.java:151)
Jun 07 02:00:27 linode18 tomcat7[6286]: at org.dspace.rest.ItemsResource.getItems(ItemsResource.java:195)
Jun 07 02:00:27 linode18 tomcat7[6286]: at sun.reflect.GeneratedMethodAccessor548.invoke(Unknown Source)
Jun 07 02:00:27 linode18 tomcat7[6286]: at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
Jun 07 02:00:27 linode18 tomcat7[6286]: at java.lang.reflect.Method.invoke(Method.java:498)
Jun 07 02:00:27 linode18 tomcat7[6286]: at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
...
```
- And:
```
Jun 08 09:28:29 linode18 tomcat7[6286]: SEVERE: Mapped exception to response: 500 (Internal Server Error)
Jun 08 09:28:29 linode18 tomcat7[6286]: javax.ws.rs.WebApplicationException
Jun 08 09:28:29 linode18 tomcat7[6286]: at org.dspace.rest.Resource.processFinally(Resource.java:169)
Jun 08 09:28:29 linode18 tomcat7[6286]: at org.dspace.rest.HandleResource.getObject(HandleResource.java:81)
Jun 08 09:28:29 linode18 tomcat7[6286]: at sun.reflect.GeneratedMethodAccessor360.invoke(Unknown Source)
Jun 08 09:28:29 linode18 tomcat7[6286]: at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
Jun 08 09:28:29 linode18 tomcat7[6286]: at java.lang.reflect.Method.invoke(Method.java:498)
```
- And:
```
Jun 06 08:19:54 linode18 tomcat7[6286]: SEVERE: Mapped exception to response: 500 (Internal Server Error)
Jun 06 08:19:54 linode18 tomcat7[6286]: javax.ws.rs.WebApplicationException
Jun 06 08:19:54 linode18 tomcat7[6286]: at org.dspace.rest.Resource.processException(Resource.java:151)
Jun 06 08:19:54 linode18 tomcat7[6286]: at org.dspace.rest.CollectionsResource.getCollectionItems(CollectionsResource.java:289)
Jun 06 08:19:54 linode18 tomcat7[6286]: at sun.reflect.GeneratedMethodAccessor598.invoke(Unknown Source)
Jun 06 08:19:54 linode18 tomcat7[6286]: at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
Jun 06 08:19:54 linode18 tomcat7[6286]: at java.lang.reflect.Method.invoke(Method.java:498)
```
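- A rough way to tally which REST resources are throwing these is to pull the `org.dspace.rest` class and method names out of the stack traces (a sketch):
```
# journalctl --since=2020-06-04 -u tomcat7 | grep -o -E 'org\.dspace\.rest\.[A-Za-z]+Resource\.[A-Za-z]+' | sort | uniq -c | sort -rn
```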
- Looking back, I see ~800 of these errors since I changed the database configuration last week:
```
# journalctl --since=2020-06-04 --until=today -u tomcat7 | grep -c 'javax.ws.rs.WebApplicationException'
795
```
- And only ~280 in the entire month before that...
```
# journalctl --since=2020-05-01 --until=2020-06-04 -u tomcat7 | grep -c 'javax.ws.rs.WebApplicationException'
286
```
- So it seems to be related to the database, perhaps because there are fewer connections in the pool?
- ... and on that note, working without the JDBC driver and DSpace's built-in connection pool since 2020-06-04 hasn't actually solved anything; the issue with locks and "idle in transaction" connections is creeping up again!
![PostgreSQL connections day](/cgspace-notes/2020/06/postgres_connections_ALL-day2.png)
![PostgreSQL connections week](/cgspace-notes/2020/06/postgres_connections_ALL-week2.png)
- It seems to have started today around 10:00 AM... I need to pore over the logs to see if there is a correlation
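- A quick way to see exactly which connections are idle in transaction or waiting on locks is to query `pg_stat_activity` directly (a sketch, assuming PostgreSQL 9.6+ where the wait_event columns exist):
```
dspace=# SELECT pid, state, wait_event_type, now() - xact_start AS xact_age, left(query, 40) AS query FROM pg_stat_activity WHERE state = 'idle in transaction' OR wait_event_type = 'Lock' ORDER BY xact_age DESC NULLS LAST;
```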
- I think there is some kind of attack going on because I see a bunch of requests for sequential Handles from a similar IP range in a datacenter in Sweden where the user *does not* re-use their DSpace `session_id`
- Looking in the nginx logs I see most (all?) of these requests are using the following user agent:
```
Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/74.0.3729.169 Safari/537.36
```
- Looking at the nginx access logs I see that, other than something that seems like Google Feedburner, all hosts using this user agent are in Sweden!
```
# zcat --force /var/log/nginx/access.log /var/log/nginx/access.log.1 /var/log/nginx/access.log.*.gz | grep 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/74.0.3729.169 Safari/537.36' | grep -v '/feed' | awk '{print $1}' | sort | uniq -c | sort -n
1624 192.36.136.246
1627 192.36.241.95
1629 192.165.45.204
1636 192.36.119.28
1641 192.36.217.7
1648 192.121.146.160
1648 192.36.23.35
1651 192.36.109.94
1659 192.36.24.93
1665 192.36.154.13
1679 192.36.137.125
1682 192.176.249.42
1682 192.36.166.120
1683 192.36.172.86
1683 192.36.198.145
1689 192.36.226.212
1702 192.121.136.49
1704 192.36.207.54
1740 192.36.121.98
1774 192.36.173.93
```
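- Spot checking the geolocation of a few of them is easy if the geoip-bin package and its country database are installed (a sketch):
```
$ for ip in 192.36.136.246 192.121.146.160 192.176.249.42; do geoiplookup "$ip"; done
```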
- The earliest I see any of these hosts is 2020-06-05 (three days ago)
- I will purge them from the Solr statistics and add them to the abusive IPs ipset in the Ansible deployment scripts (see the sketch after the purge output below)
```
$ ./check-spider-ip-hits.sh -f /tmp/ips -s statistics -p
Purging 1423 hits from 192.36.136.246 in statistics
Purging 1387 hits from 192.36.241.95 in statistics
Purging 1398 hits from 192.165.45.204 in statistics
Purging 1413 hits from 192.36.119.28 in statistics
Purging 1418 hits from 192.36.217.7 in statistics
Purging 1418 hits from 192.121.146.160 in statistics
Purging 1416 hits from 192.36.23.35 in statistics
Purging 1449 hits from 192.36.109.94 in statistics
Purging 1440 hits from 192.36.24.93 in statistics
Purging 1465 hits from 192.36.154.13 in statistics
Purging 1447 hits from 192.36.137.125 in statistics
Purging 1453 hits from 192.176.249.42 in statistics
Purging 1462 hits from 192.36.166.120 in statistics
Purging 1499 hits from 192.36.172.86 in statistics
Purging 1457 hits from 192.36.198.145 in statistics
Purging 1467 hits from 192.36.226.212 in statistics
Purging 1489 hits from 192.121.136.49 in statistics
Purging 1478 hits from 192.36.207.54 in statistics
Purging 1502 hits from 192.36.121.98 in statistics
Purging 1544 hits from 192.36.173.93 in statistics
Total number of bot hits purged: 29025
```
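- For reference, the manual equivalent of what the Ansible role does is roughly the following (a sketch; the actual set name and firewall rules in the deployment scripts may differ):
```
# ipset create abusive-ips hash:ip          # set name here is just an example
# while read -r ip; do ipset add abusive-ips "$ip"; done < /tmp/ips
# iptables -I INPUT -m set --match-set abusive-ips src -j DROP
```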
- Skype with Enrico, Moayad, Jane, Peter, and Abenet to see the latest OpenRXV/AReS developments
- One thing Enrico mentioned to me during the call was that they had issues with Altmetric's user agents, and he said they are apparently using `Altmetribot` and `Postgenomic V2`
- I looked in our logs and indeed we have those, so I will add them to the nginx rate limit bypass
- I checked the Solr stats and it seems there are only a few thousand hits in 2016 and a few hundred in other years, so I won't bother adding them to the DSpace robot user agents list
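- For the record, confirming those agents in the nginx logs is the same zcat pattern as above (a sketch):
```
# zcat --force /var/log/nginx/access.log /var/log/nginx/access.log.1 /var/log/nginx/access.log.*.gz | grep -c -E 'Altmetribot|Postgenomic V2'
```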
- Atmire sent an updated pull request for the Font Awesome 5 update for CUA ([#445](https://github.com/ilri/DSpace/pull/445)) so I filed feedback on [their tracker](https://tracker.atmire.com/tickets-cgiar-ilri/view-ticket?id=706)
## 2020-06-09
- I'm still thinking about the issue with PostgreSQL "idle in transaction" and "waiting for lock" connections
- As far as I can see from the Munin graphs this issue started in late April or early May
- I don't see any PostgreSQL updates around then, though I did update Tomcat to version 7.0.103 in March
- I will try to downgrade Tomcat to 7.0.99, which was the version I was using until early February, before we had seen any issues
- Also, I will use the PostgreSQL JDBC driver version 42.2.9, which is what we were using back then as well
- After deploying Tomcat 7.0.99 I had to restart Tomcat three times before all the Solr statistics cores came up OK
- Well look at that, the "idle in transaction" and locking issues started in April on DSpace Test too...
![PostgreSQL connections year DSpace Test](/cgspace-notes/2020/06/postgres_connections_ALL-day2.png)
<!-- vim: set sw=2 ts=2: -->


@@ -43,7 +43,7 @@ I tried to build the OAI registry on the freshly migrated DSpace 6 on DSpace Tes
"@type": "BlogPosting", "@type": "BlogPosting",
"headline": "June, 2020", "headline": "June, 2020",
"url": "https://alanorth.github.io/cgspace-notes/2020-06/", "url": "https://alanorth.github.io/cgspace-notes/2020-06/",
"wordCount": "2127", "wordCount": "2263",
"datePublished": "2020-06-01T13:55:39+03:00", "datePublished": "2020-06-01T13:55:39+03:00",
"dateModified": "2020-06-06T13:33:23+03:00", "dateModified": "2020-06-06T13:33:23+03:00",
"author": { "author": {
@@ -449,6 +449,20 @@ Total number of bot hits purged: 29025
</li>
<li>Atmire sent an updated pull request for the Font Awesome 5 update for CUA (<a href="https://github.com/ilri/DSpace/pull/445">#445</a>) so I filed feedback on <a href="https://tracker.atmire.com/tickets-cgiar-ilri/view-ticket?id=706">their tracker</a></li>
</ul>
<h2 id="2020-06-09">2020-06-09</h2>
<ul>
<li>I&rsquo;m still thinking about the issue with PostgreSQL &ldquo;idle in transaction&rdquo; and &ldquo;waiting for lock&rdquo; connections
<ul>
<li>As far as I can see from the Munin graphs this issue started in late April or early May</li>
<li>I don&rsquo;t see any PostgreSQL updates around then, though I did update Tomcat to version 7.0.103 in March</li>
<li>I will try to downgrade Tomcat to 7.0.99, which was the version I was using until early February, before we had seen any issues</li>
<li>Also, I will use the PostgreSQL JDBC driver version 42.2.9, which is what we were using back then as well</li>
<li>After deploying Tomcat 7.0.99 I had to restart Tomcat three times before all the Solr statistics cores came up OK</li>
</ul>
</li>
<li>Well look at that, the &ldquo;idle in transaction&rdquo; and locking issues started in April on DSpace Test too&hellip;</li>
</ul>
<p><img src="/cgspace-notes/2020/06/postgres_connections_ALL-day2.png" alt="PostgreSQL connections year DSpace Test"></p>
<!-- raw HTML omitted -->

Binary file not shown.


Binary file not shown.
