mirror of
https://github.com/alanorth/cgspace-notes.git
synced 2024-11-26 08:28:18 +01:00
<!DOCTYPE html>
<html lang="en">

<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">

<meta property="og:title" content="June, 2020" />
<meta property="og:description" content="2020-06-01

I tried to run the AtomicStatisticsUpdateCLI CUA migration script on DSpace Test (linode26) again and it is still going very slowly and has tons of errors like I noticed yesterday

I sent Atmire the dspace.log from today and told them to log into the server to debug the process


In other news, I checked the statistics API on DSpace 6 and it’s working
I tried to build the OAI registry on the freshly migrated DSpace 6 on DSpace Test and I get an error:
" />
<meta property="og:type" content="article" />
<meta property="og:url" content="https://alanorth.github.io/cgspace-notes/2020-06/" />
<meta property="article:published_time" content="2020-06-01T13:55:39+03:00" />
<meta property="article:modified_time" content="2020-06-23T16:13:27+03:00" />

<meta name="twitter:card" content="summary"/>
<meta name="twitter:title" content="June, 2020"/>
<meta name="twitter:description" content="2020-06-01

I tried to run the AtomicStatisticsUpdateCLI CUA migration script on DSpace Test (linode26) again and it is still going very slowly and has tons of errors like I noticed yesterday

I sent Atmire the dspace.log from today and told them to log into the server to debug the process


In other news, I checked the statistics API on DSpace 6 and it’s working
I tried to build the OAI registry on the freshly migrated DSpace 6 on DSpace Test and I get an error:
"/>
<meta name="generator" content="Hugo 0.72.0" />

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "BlogPosting",
  "headline": "June, 2020",
  "url": "https://alanorth.github.io/cgspace-notes/2020-06/",
  "wordCount": "3319",
  "datePublished": "2020-06-01T13:55:39+03:00",
  "dateModified": "2020-06-23T16:13:27+03:00",
  "author": {
    "@type": "Person",
    "name": "Alan Orth"
  },
  "keywords": "Notes"
}
</script>

<link rel="canonical" href="https://alanorth.github.io/cgspace-notes/2020-06/">

<title>June, 2020 | CGSpace Notes</title>

<!-- combined, minified CSS -->
<link href="https://alanorth.github.io/cgspace-notes/css/style.6da5c906cc7a8fbb93f31cd2316c5dbe3f19ac4aa6bfb066f1243045b8f6061e.css" rel="stylesheet" integrity="sha256-baXJBsx6j7uT8xzSMWxdvj8ZrEqmv7Bm8SQwRbj2Bh4=" crossorigin="anonymous">

<!-- minified Font Awesome for SVG icons -->
<script defer src="https://alanorth.github.io/cgspace-notes/js/fontawesome.min.f3d2a1f5980bab30ddd0d8cadbd496475309fc48e2b1d052c5c09e6facffcb0f.js" integrity="sha256-89Kh9ZgLqzDd0NjK29SWR1MJ/EjisdBSxcCeb6z/yw8=" crossorigin="anonymous"></script>

<!-- RSS 2.0 feed -->

</head>
<body>

<div class="blog-masthead">
<div class="container">
<nav class="nav blog-nav">
<a class="nav-link" href="https://alanorth.github.io/cgspace-notes/">Home</a>
</nav>
</div>
</div>

<header class="blog-header">
<div class="container">
<h1 class="blog-title" dir="auto"><a href="https://alanorth.github.io/cgspace-notes/" rel="home">CGSpace Notes</a></h1>
<p class="lead blog-description" dir="auto">Documenting day-to-day work on the <a href="https://cgspace.cgiar.org">CGSpace</a> repository.</p>
</div>
</header>

<div class="container">
<div class="row">
<div class="col-sm-8 blog-main">

<article class="blog-post">
<header>
<h2 class="blog-post-title" dir="auto"><a href="https://alanorth.github.io/cgspace-notes/2020-06/">June, 2020</a></h2>
<p class="blog-post-meta"><time datetime="2020-06-01T13:55:39+03:00">Mon Jun 01, 2020</time> by Alan Orth in
<span class="fas fa-folder" aria-hidden="true"></span> <a href="/cgspace-notes/categories/notes/" rel="category tag">Notes</a>
</p>
</header>
<h2 id="2020-06-01">2020-06-01</h2>
<ul>
<li>I tried to run the <code>AtomicStatisticsUpdateCLI</code> CUA migration script on DSpace Test (linode26) again and it is still going very slowly and has tons of errors like I noticed yesterday
<ul>
<li>I sent Atmire the dspace.log from today and told them to log into the server to debug the process</li>
</ul>
</li>
<li>In other news, I checked the statistics API on DSpace 6 and it’s working</li>
<li>I tried to build the OAI registry on the freshly migrated DSpace 6 on DSpace Test and I get an error:</li>
</ul>
<pre><code>$ dspace oai import -c
OAI 2.0 manager action started
Loading @mire database changes for module MQM
Changes have been processed
Clearing index
Index cleared
Using full import.
Full import
java.lang.NullPointerException
    at org.dspace.xoai.app.XOAI.willChangeStatus(XOAI.java:438)
    at org.dspace.xoai.app.XOAI.index(XOAI.java:368)
    at org.dspace.xoai.app.XOAI.index(XOAI.java:280)
    at org.dspace.xoai.app.XOAI.indexAll(XOAI.java:227)
    at org.dspace.xoai.app.XOAI.index(XOAI.java:134)
    at org.dspace.xoai.app.XOAI.main(XOAI.java:560)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.dspace.app.launcher.ScriptLauncher.runOneCommand(ScriptLauncher.java:229)
    at org.dspace.app.launcher.ScriptLauncher.main(ScriptLauncher.java:81)
</code></pre><h2 id="2020-06-02">2020-06-02</h2>
<ul>
<li>I noticed that I was able to do a partial OAI import (i.e., without <code>-c</code>)
<ul>
<li>Then I tried to clear the OAI Solr core and import, but I get the same error:</li>
</ul>
</li>
</ul>
<pre><code>$ curl http://localhost:8080/solr/oai/update -H "Content-type: text/xml" --data-binary '&lt;delete&gt;&lt;query&gt;*:*&lt;/query&gt;&lt;/delete&gt;'
$ curl http://localhost:8080/solr/oai/update -H "Content-type: text/xml" --data-binary '&lt;commit /&gt;'
$ ~/dspace63/bin/dspace oai import
OAI 2.0 manager action started
...
There are no indexed documents, using full import.
Full import
java.lang.NullPointerException
    at org.dspace.xoai.app.XOAI.willChangeStatus(XOAI.java:438)
    at org.dspace.xoai.app.XOAI.index(XOAI.java:368)
    at org.dspace.xoai.app.XOAI.index(XOAI.java:280)
    at org.dspace.xoai.app.XOAI.indexAll(XOAI.java:227)
    at org.dspace.xoai.app.XOAI.index(XOAI.java:143)
    at org.dspace.xoai.app.XOAI.main(XOAI.java:560)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.dspace.app.launcher.ScriptLauncher.runOneCommand(ScriptLauncher.java:229)
    at org.dspace.app.launcher.ScriptLauncher.main(ScriptLauncher.java:81)
</code></pre><ul>
<li>I found a <a href="https://jira.lyrasis.org/browse/DS-4363">bug report on DSpace Jira</a> describing this issue affecting someone else running DSpace 6.3
<ul>
<li>They suspect it has to do with the item having some missing group names in its authorization policies</li>
<li>I added some debugging to <code>dspace-oai/src/main/java/org/dspace/xoai/app/XOAI.java</code> to print the Handle of the item that causes the crash and then I looked at its authorization policies</li>
<li>Indeed there are some blank group names:</li>
</ul>
</li>
</ul>
<p><img src="/cgspace-notes/2020/06/item-authorizations-dspace63.png" alt="Missing group names in DSpace 6.3 item authorization policy"></p>
<ul>
<li>The same item on CGSpace (DSpace 5.8) also has groups with no name:</li>
</ul>
<p><img src="/cgspace-notes/2020/06/item-authorizations-dspace58.png" alt="Missing group names in DSpace 5.8 item authorization policy"></p>
<ul>
<li>I added some debugging and found exactly where this happens
<ul>
<li>As it turns out we can just check if the group policy is null there and it allows the OAI import to proceed</li>
<li>Aaaaand as it turns out, this was fixed in <code>dspace-6_x</code> in 2018 after DSpace 6.3 was released (see <a href="https://jira.lyrasis.org/browse/DS-4019">DS-4019</a>), so that was a waste of three hours.</li>
<li>I cherry-picked 150e83558103ed7f50e8f323b6407b9cbdf33717 into our current <code>6_x-dev-atmire-modules</code> branch</li>
</ul>
</li>
</ul>
<h2 id="2020-06-04">2020-06-04</h2>
<ul>
<li>Maria was asking about some items they are trying to map from the CGIAR Big Data collection into their Alliance of Bioversity and CIAT journal articles collection, but for some reason the items don’t show up in the item mapper
<ul>
<li>The items don’t even show up in the XMLUI Discover advanced search, and actually I don’t even see any recent items on the recently submitted part of the collection (but the item pages exist of course)</li>
<li>Perhaps I need to try a full Discovery re-index:</li>
</ul>
</li>
</ul>
<pre><code>$ time chrt -i 0 ionice -c2 -n7 nice -n19 dspace index-discovery -b

real    125m37.423s
user    11m20.312s
sys     3m19.965s
</code></pre><ul>
<li>Still I don’t see the item in XMLUI search or in the item mapper (and I made sure to clear the Cocoon cache)
<ul>
<li>I’m starting to think it’s something related to the database transaction issue…</li>
<li>I removed our custom JDBC driver from <code>/usr/local/apache-tomcat...</code> so that DSpace will use its own much older one, version 9.1-901-1.jdbc4</li>
<li>I ran all system updates on the server (linode18) and rebooted it</li>
<li>After it came back up I had to restart Tomcat five times before all Solr statistics cores came up properly</li>
<li>Unfortunately this means that the Tomcat JDBC pooling via JNDI doesn’t work, so we’re using only the 30 connections reserved for the DSpace CLI from DSpace’s own internal pool</li>
<li>Perhaps our previous issues with the database pool from a few years ago will be less of a problem now that we have much more aggressive blocking and rate limiting of bots in nginx</li>
</ul>
</li>
<li>I will also import a fresh database snapshot from CGSpace and check if I can map the item in my local environment
<ul>
<li>After importing and forcing a full reindex locally I can see the item in search and in the item mapper</li>
</ul>
</li>
<li>Abenet sent another message about two users who are having issues with submission, and I see the number of locks in PostgreSQL has skyrocketed again as of a few days ago:</li>
</ul>
<p><img src="/cgspace-notes/2020/06/postgres_locks_ALL-week.png" alt="PostgreSQL locks week"></p>
<ul>
<li>As far as I can tell this started happening for the first time in April; connections and locks:</li>
</ul>
<p><img src="/cgspace-notes/2020/06/postgres_connections_ALL-year.png" alt="PostgreSQL connections year">
<img src="/cgspace-notes/2020/06/postgres_locks_ALL-year.png" alt="PostgreSQL locks year"></p>
<ul>
<li>I think I need to just leave this as is with the DSpace default JDBC driver for now, but perhaps I could also downgrade the Tomcat version (I deployed Tomcat 7.0.103 in March, so perhaps that’s relevant)</li>
<li>Also, I’ll start <em>another</em> full reindexing to see if the issue with mapping is somehow also resolved now that the database connections are working better
<ul>
<li>Perhaps related, but this one finished much faster:</li>
</ul>
</li>
</ul>
<pre><code>$ time chrt -i 0 ionice -c2 -n7 nice -n19 dspace index-discovery -b

real    101m41.195s
user    10m9.569s
sys     3m13.929s
</code></pre><ul>
<li>Unfortunately the item is still not showing up in the item mapper…</li>
<li>Something happened to AReS Explorer (linode20) so I ran all system updates and rebooted it</li>
</ul>

<h2 id="2020-06-07">2020-06-07</h2>
<ul>
<li>Peter said he was annoyed with a CSV export from CGSpace because of the different <code>text_lang</code> attributes and asked if we can fix it</li>
<li>The last time I normalized these was in 2019-06, and currently it looks like this:</li>
</ul>
<pre><code>dspace=# SELECT DISTINCT text_lang, count(text_lang) FROM metadatavalue WHERE resource_type_id=2 GROUP BY text_lang ORDER BY count DESC;
  text_lang  |  count
-------------+---------
 en_US       | 2158377
 en          |  149540
             |   49206
 es_ES       |      18
 fr          |       4
 Peer Review |       1
             |       0
(7 rows)
</code></pre><ul>
<li>In theory we can have different languages for metadata fields but in practice we don’t do that, so we might as well normalize everything to “en_US” (and perhaps I should make a curation task to do this)</li>
<li>For now I will do it manually on CGSpace and DSpace Test:</li>
</ul>
<pre><code>dspace=# UPDATE metadatavalue SET text_lang='en_US' WHERE resource_type_id=2;
UPDATE 2414738
</code></pre><ul>
<li>Note: DSpace Test doesn’t have the <code>resource_type_id</code> column because it’s running DSpace 6 and <a href="https://wiki.lyrasis.org/display/DSPACE/DSpace+Service+based+api">the schema changed to use an object model there</a>
<ul>
<li>We need to use this on DSpace 6:</li>
</ul>
</li>
</ul>
<pre><code>dspace=# UPDATE metadatavalue SET text_lang='en_US' WHERE dspace_object_id IN (SELECT uuid FROM item);
</code></pre><ul>
<li>Peter asked if it was possible to find all ILRI items that have “zoonoses” or “zoonotic” in their titles and check if they have the ILRI subject “ZOONOTIC DISEASES” (and add it if not)
<ul>
<li>Unfortunately the only way we have currently would be to export the entire ILRI community as a CSV and filter/edit it in OpenRefine</li>
</ul>
</li>
</ul>
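<p>That export-and-filter flow could be sketched like this (hypothetical sample data and file names; on the real server the CSV would come from <code>dspace metadata-export</code>, and the editing was ultimately done in OpenRefine):</p>

```shell
# Sketch: find rows whose title mentions zoonoses/zoonotic but whose
# ILRI subject column lacks ZOONOTIC DISEASES (made-up rows, not real items).
# Note: this naive comma split would break on quoted commas in real titles,
# which is one reason to do the actual filtering in OpenRefine or csvkit.
cat > /tmp/ilri.csv <<'EOF'
id,cg.subject.ilri[en_US],dc.title[en_US]
1,ANIMAL HEALTH,Zoonotic diseases in East Africa
2,ZOONOTIC DISEASES,Mapping zoonoses hotspots
3,LIVESTOCK,Dairy value chains
EOF
awk -F',' 'NR > 1 && tolower($3) ~ /zoonos|zoonotic/ && $2 !~ /ZOONOTIC DISEASES/' /tmp/ilri.csv
```

This prints only the first sample row, since the second already has the subject and the third doesn’t mention zoonoses at all.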
<h2 id="2020-06-08">2020-06-08</h2>
<ul>
<li>I manually mapped the two Big Data items that Maria had asked about last week by exporting their metadata to CSV and re-importing it
<ul>
<li>I still need to look into the underlying issue there, seems to be something in Solr</li>
<li>Something strange is that, when I search for part of the title in Discovery, I get 2,000 results on CGSpace, but only two in my local DSpace 5.8 environment!</li>
</ul>
</li>
</ul>
<p><img src="/cgspace-notes/2020/06/cgspace-discovery-search.png" alt="CGSpace Discovery search results"></p>
<p><img src="/cgspace-notes/2020/06/localhost-discovery-search.png" alt="Localhost Discovery search results"></p>
<ul>
<li>On DSpace Test, which is currently running DSpace 6, I get 2,000 results but the top one is the correct match and the item does show up in the item mapper
<ul>
<li>Interestingly, if I search directly in the Solr <code>search</code> core on CGSpace with a query like <code>handle:10568/108315</code> I don’t see the item, but on my local Solr I see it!</li>
</ul>
</li>
<li>Peter asked if it was easy for me to add ILRI subject “ZOONOTIC DISEASES” to any items in the ILRI community that had “zoonotic” or “zoonoses” in their title, but were missing the ILRI subject
<ul>
<li>I exported the ILRI community metadata, cut the three fields I needed, and then filtered and edited the CSV in OpenRefine:</li>
</ul>
</li>
</ul>
<pre><code>$ dspace metadata-export -i 10568/1 -f /tmp/2020-06-08-ILRI.csv
$ csvcut -c 'id,cg.subject.ilri[en_US],dc.title[en_US]' ~/Downloads/2020-06-08-ILRI.csv &gt; /tmp/ilri.csv
</code></pre><ul>
<li>Moayad asked why he’s getting HTTP 500 errors on CGSpace
<ul>
<li>I looked in the nginx logs and I see some HTTP 500 responses, but nothing in nginx’s error.log</li>
<li>Looking in Tomcat’s log I see there are many:</li>
</ul>
</li>
</ul>
<pre><code># journalctl --since=today -u tomcat7 | grep -c 'Internal Server Error'
482
</code></pre><ul>
<li>They are all related to the REST API, like:</li>
</ul>
<pre><code>Jun 07 02:00:27 linode18 tomcat7[6286]: SEVERE: Mapped exception to response: 500 (Internal Server Error)
Jun 07 02:00:27 linode18 tomcat7[6286]: javax.ws.rs.WebApplicationException
Jun 07 02:00:27 linode18 tomcat7[6286]: at org.dspace.rest.Resource.processException(Resource.java:151)
Jun 07 02:00:27 linode18 tomcat7[6286]: at org.dspace.rest.ItemsResource.getItems(ItemsResource.java:195)
Jun 07 02:00:27 linode18 tomcat7[6286]: at sun.reflect.GeneratedMethodAccessor548.invoke(Unknown Source)
Jun 07 02:00:27 linode18 tomcat7[6286]: at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
Jun 07 02:00:27 linode18 tomcat7[6286]: at java.lang.reflect.Method.invoke(Method.java:498)
Jun 07 02:00:27 linode18 tomcat7[6286]: at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
...
</code></pre><ul>
<li>And:</li>
</ul>
<pre><code>Jun 08 09:28:29 linode18 tomcat7[6286]: SEVERE: Mapped exception to response: 500 (Internal Server Error)
Jun 08 09:28:29 linode18 tomcat7[6286]: javax.ws.rs.WebApplicationException
Jun 08 09:28:29 linode18 tomcat7[6286]: at org.dspace.rest.Resource.processFinally(Resource.java:169)
Jun 08 09:28:29 linode18 tomcat7[6286]: at org.dspace.rest.HandleResource.getObject(HandleResource.java:81)
Jun 08 09:28:29 linode18 tomcat7[6286]: at sun.reflect.GeneratedMethodAccessor360.invoke(Unknown Source)
Jun 08 09:28:29 linode18 tomcat7[6286]: at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
Jun 08 09:28:29 linode18 tomcat7[6286]: at java.lang.reflect.Method.invoke(Method.java:498)
</code></pre><ul>
<li>And:</li>
</ul>
<pre><code>Jun 06 08:19:54 linode18 tomcat7[6286]: SEVERE: Mapped exception to response: 500 (Internal Server Error)
Jun 06 08:19:54 linode18 tomcat7[6286]: javax.ws.rs.WebApplicationException
Jun 06 08:19:54 linode18 tomcat7[6286]: at org.dspace.rest.Resource.processException(Resource.java:151)
Jun 06 08:19:54 linode18 tomcat7[6286]: at org.dspace.rest.CollectionsResource.getCollectionItems(CollectionsResource.java:289)
Jun 06 08:19:54 linode18 tomcat7[6286]: at sun.reflect.GeneratedMethodAccessor598.invoke(Unknown Source)
Jun 06 08:19:54 linode18 tomcat7[6286]: at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
Jun 06 08:19:54 linode18 tomcat7[6286]: at java.lang.reflect.Method.invoke(Method.java:498)
</code></pre><ul>
<li>Looking back, I see ~800 of these errors since I changed the database configuration last week:</li>
</ul>
<pre><code># journalctl --since=2020-06-04 --until=today -u tomcat7 | grep -c 'javax.ws.rs.WebApplicationException'
795
</code></pre><ul>
<li>And only ~280 in the entire month before that…</li>
</ul>
<pre><code># journalctl --since=2020-05-01 --until=2020-06-04 -u tomcat7 | grep -c 'javax.ws.rs.WebApplicationException'
286
</code></pre><ul>
<li>So it seems to be related to the database, perhaps because there are fewer connections in the pool?
<ul>
<li>… and on that note, working without the custom JDBC driver (using DSpace’s built-in connection pool) since 2020-06-04 hasn’t actually solved anything: the issue with locks and “idle in transaction” connections is creeping up again!</li>
</ul>
</li>
</ul>
<p><img src="/cgspace-notes/2020/06/postgres_connections_ALL-day2.png" alt="PostgreSQL connections day">
<img src="/cgspace-notes/2020/06/postgres_connections_ALL-week2.png" alt="PostgreSQL connections week"></p>
<ul>
<li>It seems to have started today around 10:00 AM… I need to pore over the logs to see if there is a correlation
<ul>
<li>I think there is some kind of attack going on because I see a bunch of requests for sequential Handles from a similar IP range in a datacenter in Sweden where the user <em>does not</em> re-use their DSpace <code>session_id</code></li>
<li>Looking in the nginx logs I see most (all?) of these requests are using the following user agent:</li>
</ul>
</li>
</ul>
<pre><code>Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/74.0.3729.169 Safari/537.36
</code></pre><ul>
<li>Looking at the nginx access logs I see that, other than something that seems like Google Feedburner, all hosts using this user agent are in Sweden!</li>
</ul>
<pre><code># zcat --force /var/log/nginx/access.log /var/log/nginx/access.log.1 /var/log/nginx/access.log.*.gz | grep 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/74.0.3729.169 Safari/537.36' | grep -v '/feed' | awk '{print $1}' | sort | uniq -c | sort -n
   1624 192.36.136.246
   1627 192.36.241.95
   1629 192.165.45.204
   1636 192.36.119.28
   1641 192.36.217.7
   1648 192.121.146.160
   1648 192.36.23.35
   1651 192.36.109.94
   1659 192.36.24.93
   1665 192.36.154.13
   1679 192.36.137.125
   1682 192.176.249.42
   1682 192.36.166.120
   1683 192.36.172.86
   1683 192.36.198.145
   1689 192.36.226.212
   1702 192.121.136.49
   1704 192.36.207.54
   1740 192.36.121.98
   1774 192.36.173.93
</code></pre><ul>
<li>The earliest I see any of these hosts is 2020-06-05 (three days ago)</li>
<li>I will purge them from the Solr statistics and add them to the abusive IPs ipset in the Ansible deployment scripts</li>
</ul>
<pre><code>$ ./check-spider-ip-hits.sh -f /tmp/ips -s statistics -p
Purging 1423 hits from 192.36.136.246 in statistics
Purging 1387 hits from 192.36.241.95 in statistics
Purging 1398 hits from 192.165.45.204 in statistics
Purging 1413 hits from 192.36.119.28 in statistics
Purging 1418 hits from 192.36.217.7 in statistics
Purging 1418 hits from 192.121.146.160 in statistics
Purging 1416 hits from 192.36.23.35 in statistics
Purging 1449 hits from 192.36.109.94 in statistics
Purging 1440 hits from 192.36.24.93 in statistics
Purging 1465 hits from 192.36.154.13 in statistics
Purging 1447 hits from 192.36.137.125 in statistics
Purging 1453 hits from 192.176.249.42 in statistics
Purging 1462 hits from 192.36.166.120 in statistics
Purging 1499 hits from 192.36.172.86 in statistics
Purging 1457 hits from 192.36.198.145 in statistics
Purging 1467 hits from 192.36.226.212 in statistics
Purging 1489 hits from 192.121.136.49 in statistics
Purging 1478 hits from 192.36.207.54 in statistics
Purging 1502 hits from 192.36.121.98 in statistics
Purging 1544 hits from 192.36.173.93 in statistics

Total number of bot hits purged: 29025
</code></pre><ul>
<li>Skype with Enrico, Moayad, Jane, Peter, and Abenet to see the latest OpenRXV/AReS developments
<ul>
<li>One thing Enrico mentioned to me during the call was that they had issues with Altmetric’s user agents, and he said they are apparently using <code>Altmetribot</code> and <code>Postgenomic V2</code></li>
<li>I looked in our logs and indeed we have those, so I will add them to the nginx rate limit bypass</li>
<li>I checked the Solr stats and it seems there are only a few thousand hits in 2016 and a few hundred in other years, so I won’t bother adding them to the DSpace robot user agents list</li>
</ul>
</li>
<li>Atmire sent an updated pull request for the Font Awesome 5 update for CUA (<a href="https://github.com/ilri/DSpace/pull/445">#445</a>) so I filed feedback on <a href="https://tracker.atmire.com/tickets-cgiar-ilri/view-ticket?id=706">their tracker</a></li>
</ul>
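<p>The check for Altmetric’s two user agents in our logs amounts to a grep over the access logs; sketched here against a made-up sample file (the real check ran over the rotated nginx logs with <code>zcat --force</code>, as in the listing further up):</p>

```shell
# Sketch: count requests from Altmetric's two user agents in a sample log
cat > /tmp/access-sample.log <<'EOF'
1.2.3.4 - - [22/Jun/2020] "GET /handle/10568/1 HTTP/1.1" 200 "-" "Altmetribot"
1.2.3.5 - - [22/Jun/2020] "GET /rest/items HTTP/1.1" 200 "-" "Postgenomic V2"
1.2.3.6 - - [22/Jun/2020] "GET /handle/10568/3 HTTP/1.1" 200 "-" "Mozilla/5.0"
EOF
grep -cE 'Altmetribot|Postgenomic V2' /tmp/access-sample.log
```

With this sample it counts the first two lines and ignores the ordinary browser agent.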
<h2 id="2020-06-09">2020-06-09</h2>
<ul>
<li>I’m still thinking about the issue with PostgreSQL “idle in transaction” and “waiting for lock” connections
<ul>
<li>As far as I can see from the Munin graphs this issue started in late April or early May</li>
<li>I don’t see any PostgreSQL updates around then, though I did update Tomcat to version 7.0.103 in March</li>
<li>I will try to downgrade Tomcat to 7.0.99, which was the version I was using until early February, before we had seen any issues</li>
<li>Also, I will use the PostgreSQL JDBC driver version 42.2.9, which is what we were using back then as well</li>
<li>After deploying Tomcat 7.0.99 I had to restart Tomcat three times before all the Solr statistics cores came up OK</li>
</ul>
</li>
<li>Well look at that, the “idle in transaction” and locking issues started in April on DSpace Test too…</li>
</ul>
<p><img src="/cgspace-notes/2020/06/postgres_connections_ALL-day2.png" alt="PostgreSQL connections year DSpace Test"></p>
<h2 id="2020-06-13">2020-06-13</h2>
<ul>
<li>Atmire sent some questions about DSpace Test related to our ongoing CUA indexing issue
<ul>
<li>I had to clarify a few build steps and directories on the test server</li>
</ul>
</li>
<li>I notice that the PostgreSQL connection issues have not come back since 2020-06-09 when I downgraded Tomcat to 7.0.99… fingers crossed that it was something related to that!
<ul>
<li>On that note I notice that the AReS explorer is still not harvesting CGSpace properly…</li>
<li>I looked at the REST API logs on CGSpace (linode18) and saw that the AReS harvester is being denied due to not having a user agent, oops:</li>
</ul>
</li>
</ul>
<pre><code>172.104.229.92 - - [13/Jun/2020:02:00:00 +0200] "GET /rest/items?expand=metadata,bitstreams,parentCommunityList&amp;limit=50&amp;offset=0 HTTP/1.1" 403 260 "-" "-"
</code></pre><ul>
<li>I created an nginx map based on the host’s IP address that sets a temporary user agent (ua) and then changed the conditional in the REST API location block so that it checks this mapped ua instead of the default one
<ul>
<li>That should allow AReS to harvest for now until they update their user agent</li>
<li>I restarted the AReS server’s docker containers with <code>docker-compose down</code> and <code>docker-compose up -d</code> and the next day I saw CGSpace was in AReS again finally</li>
</ul>
</li>
</ul>
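<p>A minimal sketch of that nginx workaround (the variable name and substitute agent string are made up for illustration; the real config lives in our Ansible templates). The <code>map</code> keys on the client address, so the harvester’s IP gets a stand-in user agent while everyone else keeps their real one:</p>

```nginx
# Hypothetical sketch: give the AReS harvester's IP a temporary user agent
map $remote_addr $ua {
    default        $http_user_agent;
    172.104.229.92 'AReS harvester (temporary user agent)';
}

# ...the REST API location block then tests $ua instead of $http_user_agent
```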
<h2 id="2020-06-14">2020-06-14</h2>
<ul>
<li>Abenet asked for a list of authors from CIP’s community so that Gabriel can make some corrections
<ul>
<li>I generated a list of collections in CIP’s two communities using the REST API:</li>
</ul>
</li>
</ul>
<pre><code>$ curl -s 'https://cgspace.cgiar.org/rest/handle/10568/51671?expand=collections' 'https://cgspace.cgiar.org/rest/handle/10568/89346?expand=collections' | grep -oE '10568/[0-9]+' | sort | uniq &gt; /tmp/cip-collections.txt
</code></pre><ul>
<li>Then I formatted it into a SQL query and exported a CSV:</li>
</ul>
<pre><code>dspace=# \COPY (SELECT DISTINCT text_value AS author, COUNT(*) FROM metadatavalue WHERE metadata_field_id = (SELECT metadata_field_id FROM metadatafieldregistry WHERE element = 'contributor' AND qualifier = 'author') AND resource_type_id = 2 AND resource_id IN (SELECT item_id FROM collection2item WHERE collection_id IN (SELECT resource_id FROM handle WHERE handle IN ('10568/100533', '10568/100653', '10568/101955', '10568/106580', '10568/108469', '10568/51671', '10568/53085', '10568/53086', '10568/53087', '10568/53088', '10568/53089', '10568/53090', '10568/53091', '10568/53092', '10568/53093', '10568/53094', '10568/64874', '10568/69069', '10568/70150', '10568/88229', '10568/89346', '10568/89347', '10568/99301', '10568/99302', '10568/99303', '10568/99304', '10568/99428'))) GROUP BY text_value ORDER BY count DESC) TO /tmp/cip-authors.csv WITH CSV;
COPY 3917
</code></pre><h2 id="2020-06-15">2020-06-15</h2>
<ul>
<li>Macaroni Bros emailed me to say that they are having issues with thumbnail links on the REST API
<ul>
<li>For some reason all the bitstream <code>retrieveLink</code> links are wrong because they use <code>/bitstreams/</code> instead of <code>/rest/bitstreams/</code>, which leads to an HTTP 404</li>
<li>I looked on DSpace Test, which is running the DSpace 6 dev branch right now, and the links are OK there!</li>
<li>Looks like someone <a href="https://jira.lyrasis.org/browse/DS-3193">reported this issue on DSpace 6</a></li>
<li>Other large DSpace 5 sites have this same issue: <a href="https://openknowledge.worldbank.org/handle/10986/30568">https://openknowledge.worldbank.org/handle/10986/30568</a></li>
<li>I can’t believe nobody ever noticed this before…</li>
<li>I tried to port the patch from DS-3193 to DSpace 5.x and it builds, but causes an HTTP 500 Internal Server Error when generating bitstream links</li>
<li>Well, the correct URL should include <code>/rest/</code> anyway (and that’s how the URLs are in DSpace 6), so I will tell Macaroni to just make sure that those links use <code>/rest/</code></li>
</ul>
</li>
</ul>
<h2 id="2020-06-16">2020-06-16</h2>
<ul>
<li>Looks like the PostgreSQL connection/lock issue might be fixed because it’s been six days with no recurrence:</li>
</ul>
<p><img src="/cgspace-notes/2020/06/postgres_connections_ALL-week3.png" alt="PostgreSQL connections week"></p>
<ul>
<li>And CGSpace is being harvested successfully by AReS every day still</li>
<li>Fix some CIP subjects that had two different styles of dashes, causing them to show up differently in Discovery
<ul>
<li><code>SWEETPOTATO AGRI‐FOOD SYSTEMS</code> → <code>SWEETPOTATO AGRI-FOOD SYSTEMS</code></li>
<li><code>POTATO AGRI‐FOOD SYSTEMS</code> → <code>POTATO AGRI-FOOD SYSTEMS</code></li>
</ul>
</li>
<li>They also asked me to update <code>INCLUSIVE VALUE CHAINS</code> to <code>INCLUSIVE GROWTH</code>, both in the existing items on CGSpace and in the submission form</li>
</ul>
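<p>The two dash styles are easy to miss by eye: the bad one is a Unicode hyphen character rather than the ASCII hyphen-minus. A sketch of the normalization from the shell (on CGSpace itself this kind of fix would be a metadata update, for example via a CSV round-trip):</p>

```shell
# Sketch: replace the Unicode hyphen with a plain ASCII dash
printf 'SWEETPOTATO AGRI‐FOOD SYSTEMS\n' | sed 's/‐/-/g'
```

The output is <code>SWEETPOTATO AGRI-FOOD SYSTEMS</code> with an ordinary dash, which then matches the existing Discovery facet.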
|
||
<h2 id="2020-06-18">2020-06-18</h2>
|
||
<ul>
|
||
<li>I guess Atmire fixed the CUA download issue after updating the version for Font Awesome 5, but now I get an error during ant update
|
||
<ul>
|
||
<li>I tried to remove the <code>config/spring</code> directory, but it still fails</li>
|
||
<li>The same issue happens on my local environment and on the DSpace Test server</li>
|
||
<li>I raised the issue with Atmire</li>
|
||
</ul>
|
||
</li>
|
||
</ul>
<h2 id="2020-06-21">2020-06-21</h2>
<ul>
<li>The PostgreSQL connections and locks still look OK on both CGSpace (linode18) and DSpace Test (linode26) after ten days or so
<ul>
<li>I decided to upgrade the JDBC driver on DSpace Test from 42.2.9 to 42.2.14 and leave it for a few weeks to see if the issue comes back, as it was present on the test server with DSpace 6 as well</li>
<li>As far as I can tell the issue is related to something in Tomcat &gt; 7.0.99</li>
</ul>
</li>
<li>Run system updates and reboot DSpace Test</li>
<li>Re-deploy the <code>5_x-prod</code> branch on CGSpace, run system updates, and reboot the server
<ul>
<li>I had to restart Tomcat 7 once after the server came back up because some of the Solr statistics cores didn’t come up the first time</li>
<li>Unfortunately I realized that I had forgotten to update the <code>5_x-prod</code> branch to include the REST API fix from last week, so I had to rebuild and restart Tomcat again</li>
</ul>
</li>
</ul>
<h2 id="2020-06-22">2020-06-22</h2>
<ul>
<li>Skype with Peter, Abenet, Enrico, Jane, and Moayad about the latest OpenRXV developments
<ul>
<li>I will go visit CodeObia later this week to run through the list of issues and close the ones that are finished</li>
</ul>
</li>
</ul>
<h2 id="2020-06-23">2020-06-23</h2>
<ul>
<li>Peter said he’s having problems submitting an item on CGSpace and shit, it seems to be the same fucking PostgreSQL “idle in transaction” and “waiting for lock” issue we’ve been having sporadically the last few months</li>
</ul>
<p><img src="/cgspace-notes/2020/06/postgres_connections_ALL-day3.png" alt="PostgreSQL connections year CGSpace"></p>
<ul>
<li>The issue hadn’t occurred in almost two weeks after I downgraded Tomcat to 7.0.99 with PostgreSQL JDBC driver 42.2.9, so I thought it was fixed
<ul>
<li>Apparently it’s not related to the Tomcat or JDBC driver version, as even when I reverted to DSpace’s really old built-in JDBC driver it still did the same thing!</li>
<li>Could it be a memory leak or something? Why now?</li>
<li>For now I will revert to the latest Tomcat 7.0.104 and PostgreSQL JDBC 42.2.14</li>
</ul>
</li>
</ul>
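<p>When triaging this class of problem, the sessions to look for are the ones sitting “idle in transaction” in PostgreSQL’s <code>pg_stat_activity</code> view, since those hold locks while doing nothing. A hypothetical helper (assuming rows of <code>(pid, state, query)</code> already fetched, e.g. via psycopg2) to pick them out:</p>

```python
# Illustrative helper: given (pid, state, query) tuples from
# pg_stat_activity, return the sessions stuck idle inside an open
# transaction, i.e. the ones likely blocking other queries.

def idle_in_transaction(rows):
    """Return (pid, query) for sessions idle inside an open transaction."""
    return [(pid, query) for pid, state, query in rows
            if state == "idle in transaction"]

sample = [
    (1001, "active", "SELECT 1"),
    (1002, "idle in transaction", "UPDATE item SET ..."),
    (1003, "idle", ""),
]
print(idle_in_transaction(sample))
# [(1002, 'UPDATE item SET ...')]
```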
<h2 id="2020-06-24">2020-06-24</h2>
<ul>
<li>Spend some time with Moayad looking over issues for OpenRXV
<ul>
<li>We updated all the labels, tooltips, and filters</li>
<li>The next step is to go through the GitHub issues and close them if they are done</li>
</ul>
</li>
<li>I also discussed the <a href="https://github.com/ilri/dspace-statistics-api">dspace-statistics-api</a> with Mohammed Salem
<ul>
<li>He added some new features to harvest twelve months of item statistics</li>
<li>We talked about extending it to an arbitrary number of months or years, with some sensible defaults</li>
<li>The item and items endpoints could then take <code>?months=12</code> or <code>?years=2</code> to show stats for the past “x” months or years</li>
<li>We thought other arbitrary date ranges could be added with queries like <code>?date_from=2020-05</code> etc., though those would query Solr on the fly and obviously be slower…</li>
</ul>
</li>
</ul>
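<p>A sketch of how a hypothetical <code>?months=N</code> parameter could be mapped to a Solr date-range filter; the <code>time</code> field name and the translation itself are assumptions for illustration, not the API’s actual implementation:</p>

```python
# Illustrative only: turn "past N whole months" into a Solr range query
# on an assumed "time" field in the statistics core.
from datetime import date

def months_filter(n_months: int, today: date) -> str:
    """Build a Solr range query covering the past n_months whole months."""
    # Step back n_months from the first day of the current month
    year = today.year
    month = today.month - n_months
    while month <= 0:
        month += 12
        year -= 1
    start = date(year, month, 1)
    return f"time:[{start.isoformat()}T00:00:00Z TO NOW]"

print(months_filter(12, date(2020, 6, 24)))
# time:[2019-06-01T00:00:00Z TO NOW]
```

An arbitrary <code>?date_from=2020-05</code> style range would be a straightforward variation on the same idea, just substituting the caller’s start date.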
<!-- raw HTML omitted -->
</article>

</div> <!-- /.blog-main -->

<aside class="col-sm-3 ml-auto blog-sidebar">

<section class="sidebar-module">
<h4>Recent Posts</h4>
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2020-06/">June, 2020</a></li>
<li><a href="/cgspace-notes/2020-05/">May, 2020</a></li>
<li><a href="/cgspace-notes/2020-04/">April, 2020</a></li>
<li><a href="/cgspace-notes/2020-03/">March, 2020</a></li>
<li><a href="/cgspace-notes/2020-02/">February, 2020</a></li>
</ol>
</section>

<section class="sidebar-module">
<h4>Links</h4>
<ol class="list-unstyled">
<li><a href="https://cgspace.cgiar.org">CGSpace</a></li>
<li><a href="https://dspacetest.cgiar.org">DSpace Test</a></li>
<li><a href="https://github.com/ilri/DSpace">CGSpace @ GitHub</a></li>
</ol>
</section>

</aside>

</div> <!-- /.row -->
</div> <!-- /.container -->

<footer class="blog-footer">
<p dir="auto">
Blog template created by <a href="https://twitter.com/mdo">@mdo</a>, ported to Hugo by <a href='https://twitter.com/mralanorth'>@mralanorth</a>.
</p>
<p>
<a href="#">Back to top</a>
</p>
</footer>

</body>
</html>