<!DOCTYPE html>
<html lang="en">

<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">

<meta property="og:title" content="December, 2017" />
<meta property="og:description" content="2017-12-01
Uptime Robot noticed that CGSpace went down
The logs say “Timeout waiting for idle object”
PostgreSQL activity says there are 115 connections currently
The list of connections to XMLUI and REST API for today:
" />
<meta property="og:type" content="article" />
<meta property="og:url" content="https://alanorth.github.io/cgspace-notes/2017-12/" />
<meta property="article:published_time" content="2017-12-01T13:53:54+03:00"/>
<meta property="article:modified_time" content="2017-12-18T17:03:58+02:00"/>
<meta name="twitter:card" content="summary"/><meta name="twitter:title" content="December, 2017"/>
<meta name="twitter:description" content="2017-12-01
Uptime Robot noticed that CGSpace went down
The logs say “Timeout waiting for idle object”
PostgreSQL activity says there are 115 connections currently
The list of connections to XMLUI and REST API for today:
"/>
<meta name="generator" content="Hugo 0.31.1" />

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "BlogPosting",
  "headline": "December, 2017",
  "url": "https://alanorth.github.io/cgspace-notes/2017-12/",
  "wordCount": "2244",
  "datePublished": "2017-12-01T13:53:54+03:00",
  "dateModified": "2017-12-18T17:03:58+02:00",
  "author": {
    "@type": "Person",
    "name": "Alan Orth"
  },
  "keywords": "Notes"
}
</script>

<link rel="canonical" href="https://alanorth.github.io/cgspace-notes/2017-12/">

<title>December, 2017 | CGSpace Notes</title>

<!-- combined, minified CSS -->
<link href="https://alanorth.github.io/cgspace-notes/css/style.css" rel="stylesheet" integrity="sha384-O8wjsnz02XiyrPxnhfF6AVOv6YLBaEGRCnVF+DL3gCPBy9cieyHcpixIrVyD2JS5" crossorigin="anonymous">

</head>

<body>

<div class="blog-masthead">
<div class="container">
<nav class="nav blog-nav">
<a class="nav-link" href="https://alanorth.github.io/cgspace-notes/">Home</a>
</nav>
</div>
</div>

<header class="blog-header">
<div class="container">
<h1 class="blog-title"><a href="https://alanorth.github.io/cgspace-notes/" rel="home">CGSpace Notes</a></h1>
<p class="lead blog-description">Documenting day-to-day work on the <a href="https://cgspace.cgiar.org">CGSpace</a> repository.</p>
</div>
</header>

<div class="container">
<div class="row">
<div class="col-sm-8 blog-main">

<article class="blog-post">
<header>
<h2 class="blog-post-title"><a href="https://alanorth.github.io/cgspace-notes/2017-12/">December, 2017</a></h2>
<p class="blog-post-meta"><time datetime="2017-12-01T13:53:54+03:00">Fri Dec 01, 2017</time> by Alan Orth in
<i class="fa fa-tag" aria-hidden="true"></i> <a href="/cgspace-notes/tags/notes" rel="tag">Notes</a>
</p>
</header>
<h2 id="2017-12-01">2017-12-01</h2>

<ul>
<li>Uptime Robot noticed that CGSpace went down</li>
<li>The logs say “Timeout waiting for idle object”</li>
<li>PostgreSQL activity says there are 115 connections currently</li>
<li>The list of connections to XMLUI and REST API for today:</li>
</ul>

<pre><code># cat /var/log/nginx/rest.log /var/log/nginx/rest.log.1 /var/log/nginx/access.log /var/log/nginx/access.log.1 /var/log/nginx/library-access.log /var/log/nginx/library-access.log.1 | grep -E "1/Dec/2017" | awk '{print $1}' | sort -n | uniq -c | sort -h | tail
    763 2.86.122.76
    907 207.46.13.94
   1018 157.55.39.206
   1021 157.55.39.235
   1407 66.249.66.70
   1411 104.196.152.243
   1503 50.116.102.77
   1805 66.249.66.90
   4007 70.32.83.92
   6061 45.5.184.196
</code></pre>

<ul>
<li>The number of DSpace sessions isn’t even that high:</li>
</ul>

<pre><code>$ cat /home/cgspace.cgiar.org/log/dspace.log.2017-12-01 | grep -o -E 'session_id=[A-Z0-9]{32}' | sort -n | uniq | wc -l
5815
</code></pre>

<ul>
<li>Connections in the last two hours:</li>
</ul>

<pre><code># cat /var/log/nginx/rest.log /var/log/nginx/rest.log.1 /var/log/nginx/access.log /var/log/nginx/access.log.1 /var/log/nginx/library-access.log /var/log/nginx/library-access.log.1 | grep -E "1/Dec/2017:(09|10)" | awk '{print $1}' | sort -n | uniq -c | sort -h | tail
     78 93.160.60.22
    101 40.77.167.122
    113 66.249.66.70
    129 157.55.39.206
    130 157.55.39.235
    135 40.77.167.58
    164 68.180.229.254
    177 87.100.118.220
    188 66.249.66.90
    314 2.86.122.76
</code></pre>

<ul>
<li>What the fuck is going on?</li>
<li>I’ve never seen this IP (2.86.122.76) before; it has created quite a few unique Tomcat sessions today:</li>
</ul>

<pre><code>$ grep 2.86.122.76 /home/cgspace.cgiar.org/log/dspace.log.2017-12-01 | grep -o -E 'session_id=[A-Z0-9]{32}' | sort -n | uniq | wc -l
822
</code></pre>

<ul>
<li>It appears to be some new bot:</li>
</ul>

<pre><code>2.86.122.76 - - [01/Dec/2017:09:02:53 +0000] "GET /handle/10568/78444?show=full HTTP/1.1" 200 29307 "-" "Mozilla/3.0 (compatible; Indy Library)"
</code></pre>

<ul>
<li>I restarted Tomcat and everything came back up</li>
<li>I can add Indy Library to the Tomcat crawler session manager valve (see the sketch below), but it would be nice if I could simply remap the user agent in nginx</li>
<li>I will also add “Drupal” to the Tomcat crawler session manager valve because there are Drupal sites out there harvesting us and they should be treated as bots:</li>
</ul>

<pre><code># cat /var/log/nginx/rest.log /var/log/nginx/rest.log.1 /var/log/nginx/access.log /var/log/nginx/access.log.1 /var/log/nginx/library-access.log /var/log/nginx/library-access.log.1 | grep -E "1/Dec/2017" | grep Drupal | awk '{print $1}' | sort -n | uniq -c | sort -h | tail
      3 54.75.205.145
      6 70.32.83.92
     14 2a01:7e00::f03c:91ff:fe18:7396
     46 2001:4b99:1:1:216:3eff:fe2c:dc6c
    319 2001:4b99:1:1:216:3eff:fe76:205b
</code></pre>
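
<ul>
<li>For reference, a minimal sketch of the Crawler Session Manager Valve in Tomcat’s <code>server.xml</code>, assuming we extend the default <code>crawlerUserAgents</code> regex to cover these two agents (our production regex may differ):</li>
</ul>

<pre><code><!-- Share one session per crawler IP instead of letting bots create thousands -->
<Valve className="org.apache.catalina.valves.CrawlerSessionManagerValve"
       crawlerUserAgents=".*[bB]ot.*|.*Yahoo! Slurp.*|.*Feedfetcher-Google.*|.*Indy Library.*|.*Drupal.*"
       sessionInactiveInterval="60"/>
</code></pre>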

<h2 id="2017-12-03">2017-12-03</h2>

<ul>
<li>Linode alerted that CGSpace’s load was 327.5% from 6 to 8 AM again</li>
</ul>

<h2 id="2017-12-04">2017-12-04</h2>

<ul>
<li>Linode alerted that CGSpace’s load was 255.5% from 8 to 10 AM again</li>
<li>I looked at the Munin stats on DSpace Test (linode02) again to see how the PostgreSQL tweaks from a few weeks ago were holding up:</li>
</ul>

<p><img src="/cgspace-notes/2017/12/postgres-connections-month.png" alt="DSpace Test PostgreSQL connections month" /></p>

<ul>
<li>The results look fantastic! The <code>random_page_cost</code> tweak is massively important for telling the PostgreSQL query planner that random page access costs barely more than sequential access, as we’re on an SSD!</li>
<li>We could probably even reduce the number of PostgreSQL connections allotted to DSpace after this</li>
<li>Run system updates on DSpace Test (linode02) and reboot it</li>
<li>I’m going to enable the PostgreSQL <code>random_page_cost</code> tweak on CGSpace (see the sketch below)</li>
<li>For reference, here is the past month’s connections:</li>
</ul>

<p><img src="/cgspace-notes/2017/12/postgres-connections-month-cgspace.png" alt="CGSpace PostgreSQL connections month" /></p>
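
<ul>
<li>The tweak itself is a one-line change in <code>postgresql.conf</code> plus a reload; a minimal sketch, assuming a value of 1 for SSD storage (the exact value we settled on may differ):</li>
</ul>

<pre><code># in postgresql.conf: tell the planner random page reads cost the same as sequential ones
random_page_cost = 1

# then reload PostgreSQL without a full restart
$ psql -c 'SELECT pg_reload_conf();'
</code></pre>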

<h2 id="2017-12-05">2017-12-05</h2>

<ul>
<li>Linode alerted again that the CPU usage on CGSpace was high this morning from 8 to 10 AM</li>
<li>CORE updated the entry for CGSpace on their index: <a href="https://core.ac.uk/search?q=repositories.id:(1016)&fullTextOnly=false">https://core.ac.uk/search?q=repositories.id:(1016)&fullTextOnly=false</a></li>
<li>Linode alerted again that the CPU usage on CGSpace was high this evening from 8 to 10 PM</li>
</ul>

<h2 id="2017-12-06">2017-12-06</h2>

<ul>
<li>Linode alerted again that the CPU usage on CGSpace was high this morning from 6 to 8 AM</li>
<li>Uptime Robot alerted that the server went down and up around 8:53 this morning</li>
<li>Uptime Robot alerted that CGSpace was down and up again a few minutes later</li>
<li>I don’t see any errors in the DSpace logs, but nginx’s access.log shows UptimeRobot’s requests being answered with HTTP 499 (Client Closed Request)</li>
<li>Looking at the REST API logs I see some new client IP I haven’t noticed before:</li>
</ul>

<pre><code># cat /var/log/nginx/rest.log /var/log/nginx/rest.log.1 | grep -E "6/Dec/2017" | awk '{print $1}' | sort -n | uniq -c | sort -h | tail
     18 95.108.181.88
     19 68.180.229.254
     30 207.46.13.151
     33 207.46.13.110
     38 40.77.167.20
     41 157.55.39.223
     82 104.196.152.243
   1529 50.116.102.77
   4005 70.32.83.92
   6045 45.5.184.196
</code></pre>

<ul>
<li>50.116.102.77 is apparently in the US on websitewelcome.com</li>
</ul>
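
<ul>
<li>For the record, the PostgreSQL connection counts I keep quoting come from the <code>pg_stat_activity</code> view; a minimal sketch of the query (our monitoring may slice it differently):</li>
</ul>

<pre><code>dspace=# SELECT count(*) FROM pg_stat_activity WHERE datname = 'dspace';
</code></pre>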

<h2 id="2017-12-07">2017-12-07</h2>

<ul>
<li>Uptime Robot reported a few times today that CGSpace was down and then up</li>
<li>At one point Tsega restarted Tomcat</li>
<li>I never got any alerts about high load from Linode though…</li>
<li>I looked just now and see that there are 121 PostgreSQL connections!</li>
<li>The top users right now are:</li>
</ul>

<pre><code># cat /var/log/nginx/access.log /var/log/nginx/access.log.1 /var/log/nginx/library-access.log /var/log/nginx/library-access.log.1 | grep -E "7/Dec/2017" | awk '{print $1}' | sort -n | uniq -c | sort -h | tail
    838 40.77.167.11
    939 66.249.66.223
   1149 66.249.66.206
   1316 207.46.13.110
   1322 207.46.13.151
   1323 2001:da8:203:2224:c912:1106:d94f:9189
   1414 157.55.39.223
   2378 104.196.152.243
   2662 66.249.66.219
   5110 124.17.34.60
</code></pre>

<ul>
<li>We’ve never seen 124.17.34.60 before, but it’s really hammering us!</li>
<li>Apparently it is from China, and here is one of its user agents:</li>
</ul>

<pre><code>Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.2; Win64; x64; Trident/7.0; LCTE)
</code></pre>

<ul>
<li>It is responsible for over 4,500 Tomcat sessions today alone:</li>
</ul>

<pre><code>$ grep 124.17.34.60 /home/cgspace.cgiar.org/log/dspace.log.2017-12-07 | grep -o -E 'session_id=[A-Z0-9]{32}' | sort -n | uniq | wc -l
4574
</code></pre>

<ul>
<li>I’ve adjusted the nginx IP mapping that I set up last month to account for 124.17.34.60 and 124.17.34.59 using a regex, as it’s the same bot on the same subnet (see the sketch at the end of this entry)</li>
<li>I was running the DSpace cleanup task manually and it hit an error:</li>
</ul>

<pre><code>$ /home/cgspace.cgiar.org/bin/dspace cleanup -v
...
Error: ERROR: update or delete on table "bitstream" violates foreign key constraint "bundle_primary_bitstream_id_fkey" on table "bundle"
  Detail: Key (bitstream_id)=(144666) is still referenced from table "bundle".
</code></pre>

<ul>
<li>The solution, as I discovered in <a href="/cgspace-notes/2017-04">2017-04</a>, is to set the <code>primary_bitstream_id</code> to null:</li>
</ul>

<pre><code>dspace=# update bundle set primary_bitstream_id=NULL where primary_bitstream_id in (144666);
UPDATE 1
</code></pre>
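
<ul>
<li>For reference, the nginx mapping works roughly like this; a minimal sketch, assuming a hypothetical <code>$ua</code> variable that the rest of the config forwards to Tomcat as the user agent (the real names in our config may differ):</li>
</ul>

<pre><code># Pretend these IPs sent a bot user agent so Tomcat's crawler valve groups their sessions
map $remote_addr $ua {
    # same bot on the same subnet, so match both hosts with one regex
    ~^124\.17\.34\.(59|60)$    'ChineseBot';
    default                    $http_user_agent;
}
</code></pre>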

<h2 id="2017-12-13">2017-12-13</h2>

<ul>
<li>Linode alerted that CGSpace was using high CPU from 10:13 to 12:13 this morning</li>
</ul>

<h2 id="2017-12-16">2017-12-16</h2>

<ul>
<li>Re-work the XMLUI base theme to allow child themes to override the header logo’s image and link destination: <a href="https://github.com/ilri/DSpace/pull/349">#349</a></li>
<li>This required a little bit of work to restructure the XSL templates</li>
<li>Optimize PNG and SVG image assets in the CGIAR base theme using pngquant and svgo: <a href="https://github.com/ilri/DSpace/pull/350">#350</a></li>
</ul>

<h2 id="2017-12-17">2017-12-17</h2>

<ul>
<li>Reboot DSpace Test to get the new Linode Linux kernel</li>
<li>Looking at the CCAFS bulk import for Magdalena Haman (she originally sent the items in November, but some of the thumbnails were missing and the dates were messed up, so she re-sent them now)</li>
<li>A few issues with the data and thumbnails:

<ul>
<li>Her thumbnail files all use capital JPG so I had to rename them to lowercase: <code>rename -fc *.JPG</code></li>
<li>thumbnail20.jpg is 1.7MB so I have to resize it</li>
<li>I also had to add the .jpg to the thumbnail string in the CSV</li>
<li>thumbnail11.jpg is missing</li>
<li>The dates are in super long ISO 8601 format (from Excel?) like <code>2016-02-07T00:00:00Z</code>, so I converted them to simpler forms in GREL: <code>value.toString("yyyy-MM-dd")</code></li>
<li>I trimmed the whitespace in a few fields, but there wasn’t much</li>
<li>Rename her thumbnail column to filename, and format it so SAFBuilder adds the files to the thumbnail bundle with this GREL in OpenRefine: <code>value + "__bundle:THUMBNAIL"</code></li>
<li>Rename the dc.identifier.status and dc.identifier.url columns to cg.identifier.status and cg.identifier.url</li>
<li>Item 4 has weird characters in its citation, i.e.: Nagoya et de Trait</li>
<li>Some author names need normalization, e.g. <code>Aggarwal, Pramod</code> and <code>Aggarwal, Pramod K.</code></li>
<li>Something weird is going on with duplicate authors that have the same text value, like <code>Berto, Jayson C.</code> and <code>Balmeo, Katherine P.</code></li>
<li>I will send her feedback on some author names like UNEP and ICRISAT and ask her for the missing thumbnail11.jpg</li>
</ul></li>
<li>I did a test import of the data locally after building with SAFBuilder, but for some reason I had to specify the collection (even though the collections were specified in the <code>collection</code> field)</li>
</ul>

<pre><code>$ JAVA_OPTS="-Xmx512m -Dfile.encoding=UTF-8" ~/dspace/bin/dspace import --add --eperson=aorth@mjanja.ch --collection=10568/89338 --source /Users/aorth/Downloads/2016\ bulk\ upload\ thumbnails/SimpleArchiveFormat --mapfile=/tmp/ccafs.map &> /tmp/ccafs.log
</code></pre>
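
<ul>
<li>For context, each item directory in a SimpleArchiveFormat bundle can carry a <code>collections</code> file listing the target collection handles (the first being the owning collection); a sketch of the layout, with hypothetical file names:</li>
</ul>

<pre><code>SimpleArchiveFormat/
├── item_1/
│   ├── collections      # one collection handle per line, e.g. 10568/89338
│   ├── contents         # one filename per line, with optional bundle flags
│   ├── dublin_core.xml  # dc metadata
│   └── metadata_cg.xml  # metadata in the cg schema
└── item_2/
    └── ...
</code></pre>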

<ul>
<li>It’s the same on DSpace Test; I can’t import the SAF bundle without specifying the collection:</li>
</ul>

<pre><code>$ dspace import --add --eperson=aorth@mjanja.ch --mapfile=/tmp/ccafs.map --source=/tmp/ccafs-2016/SimpleArchiveFormat
No collections given. Assuming 'collections' file inside item directory
Adding items from directory: /tmp/ccafs-2016/SimpleArchiveFormat
Generating mapfile: /tmp/ccafs.map
Processing collections file: collections
Adding item from directory item_1
java.lang.NullPointerException
        at org.dspace.app.itemimport.ItemImport.addItem(ItemImport.java:865)
        at org.dspace.app.itemimport.ItemImport.addItems(ItemImport.java:736)
        at org.dspace.app.itemimport.ItemImport.main(ItemImport.java:498)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.dspace.app.launcher.ScriptLauncher.runOneCommand(ScriptLauncher.java:226)
        at org.dspace.app.launcher.ScriptLauncher.main(ScriptLauncher.java:78)
java.lang.NullPointerException
Started: 1513521856014
Ended: 1513521858573
Elapsed time: 2 secs (2559 msecs)
</code></pre>

<ul>
<li>I even tried to debug it by adding verbose logging to the <code>JAVA_OPTS</code>:</li>
</ul>

<pre><code>-Dlog4j.configuration=file:/Users/aorth/dspace/config/log4j-console.properties -Ddspace.log.init.disable=true
</code></pre>

<ul>
<li>… but the error message was the same, just with more INFO noise around it</li>
<li>For now I’ll import into a collection in DSpace Test, but I’m really not sure what’s up with this!</li>
<li>Linode alerted that CGSpace was using high CPU from 4 to 6 PM</li>
<li>The logs for today show the CORE bot (137.108.70.7) being active in XMLUI:</li>
</ul>

<pre><code># cat /var/log/nginx/access.log /var/log/nginx/access.log.1 /var/log/nginx/library-access.log /var/log/nginx/library-access.log.1 | grep -E "17/Dec/2017" | awk '{print $1}' | sort -n | uniq -c | sort -h | tail
    671 66.249.66.70
    885 95.108.181.88
    904 157.55.39.96
    923 157.55.39.179
   1159 207.46.13.107
   1184 104.196.152.243
   1230 66.249.66.91
   1414 68.180.229.254
   4137 66.249.66.90
  46401 137.108.70.7
</code></pre>

<ul>
<li>And then some CIAT bot (45.5.184.196) is actively hitting API endpoints:</li>
</ul>

<pre><code># cat /var/log/nginx/rest.log /var/log/nginx/rest.log.1 /var/log/nginx/oai.log /var/log/nginx/oai.log.1 | grep -E "17/Dec/2017" | awk '{print $1}' | sort -n | uniq -c | sort -h | tail
     33 68.180.229.254
     48 157.55.39.96
     51 157.55.39.179
     56 207.46.13.107
    102 104.196.152.243
    102 66.249.66.90
    691 137.108.70.7
   1531 50.116.102.77
   4014 70.32.83.92
  11030 45.5.184.196
</code></pre>

<ul>
<li>That’s probably OK, as I don’t think the REST API connections use up a Tomcat session…</li>
<li>CIP emailed a few days ago to ask about unique IDs for authors and organizations, and whether we can provide them via an API</li>
<li>Regarding the import issue above, it seems to be a known issue that has a patch in DSpace 5.7:

<ul>
<li><a href="https://jira.duraspace.org/browse/DS-2633">https://jira.duraspace.org/browse/DS-2633</a></li>
<li><a href="https://jira.duraspace.org/browse/DS-3583">https://jira.duraspace.org/browse/DS-3583</a></li>
</ul></li>
<li>We’re on DSpace 5.5, but there is a one-word fix to the addItem() function here: <a href="https://github.com/DSpace/DSpace/pull/1731">https://github.com/DSpace/DSpace/pull/1731</a></li>
<li>I will apply it on our branch, but I need to make a note NOT to cherry-pick it when I rebase on to the latest 5.x upstream later</li>
<li>Pull request: <a href="https://github.com/ilri/DSpace/pull/351">#351</a></li>
</ul>

<h2 id="2017-12-18">2017-12-18</h2>

<ul>
<li>Linode alerted this morning that there was high outbound traffic from 6 to 8 AM</li>
<li>The XMLUI logs show that the CORE bot from last night (137.108.70.7) is still very active:</li>
</ul>

<pre><code># cat /var/log/nginx/access.log /var/log/nginx/access.log.1 /var/log/nginx/library-access.log /var/log/nginx/library-access.log.1 | grep -E "18/Dec/2017" | awk '{print $1}' | sort -n | uniq -c | sort -h | tail
    190 207.46.13.146
    191 197.210.168.174
    202 86.101.203.216
    268 157.55.39.134
    297 66.249.66.91
    314 213.55.99.121
    402 66.249.66.90
    532 68.180.229.254
    644 104.196.152.243
  32220 137.108.70.7
</code></pre>

<ul>
<li>On the API side (REST and OAI) there is still the same CIAT bot (45.5.184.196) from last night making quite a number of requests this morning:</li>
</ul>

<pre><code># cat /var/log/nginx/rest.log /var/log/nginx/rest.log.1 /var/log/nginx/oai.log /var/log/nginx/oai.log.1 | grep -E "18/Dec/2017" | awk '{print $1}' | sort -n | uniq -c | sort -h | tail
      7 104.198.9.108
      8 185.29.8.111
      8 40.77.167.176
      9 66.249.66.91
      9 68.180.229.254
     10 157.55.39.134
     15 66.249.66.90
     59 104.196.152.243
   4014 70.32.83.92
   8619 45.5.184.196
</code></pre>

<ul>
<li>I need to keep an eye on this issue because it has nice fixes for reducing the number of database connections in DSpace 5.7: <a href="https://jira.duraspace.org/browse/DS-3551">https://jira.duraspace.org/browse/DS-3551</a></li>
<li>Update the text on the CGSpace about page to give developers some tips about using the resources more wisely (<a href="https://github.com/ilri/DSpace/pull/352">#352</a>)</li>
<li>Linode alerted that CGSpace was using 396.3% CPU from 12 to 2 PM</li>
<li>The REST and OAI API logs look pretty much the same as earlier this morning, but there’s a new IP harvesting XMLUI:</li>
</ul>

<pre><code># cat /var/log/nginx/access.log /var/log/nginx/access.log.1 /var/log/nginx/library-access.log /var/log/nginx/library-access.log.1 | grep -E "18/Dec/2017" | awk '{print $1}' | sort -n | uniq -c | sort -h | tail
    360 95.108.181.88
    477 66.249.66.90
    526 86.101.203.216
    691 207.46.13.13
    698 197.210.168.174
    819 207.46.13.146
    878 68.180.229.254
   1965 104.196.152.243
  17701 2.86.72.181
  52532 137.108.70.7
</code></pre>

<ul>
<li>2.86.72.181 appears to be from Greece, and has the following user agent:</li>
</ul>

<pre><code>Mozilla/3.0 (compatible; Indy Library)
</code></pre>

<ul>
<li>Surprisingly, it seems they are re-using their Tomcat session for all those 17,000 requests:</li>
</ul>

<pre><code>$ grep 2.86.72.181 dspace.log.2017-12-18 | grep -o -E 'session_id=[A-Z0-9]{32}' | sort -n | uniq | wc -l
1
</code></pre>

<ul>
<li>I guess there’s nothing I can do about them for now</li>
<li>In other news, I am curious how many PostgreSQL connection pool errors we’ve had in the last month:</li>
</ul>

<pre><code>$ grep -c "Cannot get a connection, pool error Timeout waiting for idle object" dspace.log.2017-1* | grep -v :0
dspace.log.2017-11-07:15695
dspace.log.2017-11-08:135
dspace.log.2017-11-17:1298
dspace.log.2017-11-26:4160
dspace.log.2017-11-28:107
dspace.log.2017-11-29:3972
dspace.log.2017-12-01:1601
dspace.log.2017-12-02:1274
dspace.log.2017-12-07:2769
</code></pre>

<ul>
<li>I made a small fix to my <code>move-collections.sh</code> script so that it handles the case when a “to” or “from” community doesn’t exist</li>
<li>The script lives here: <a href="https://gist.github.com/alanorth/e60b530ed4989df0c731afbb0c640515">https://gist.github.com/alanorth/e60b530ed4989df0c731afbb0c640515</a></li>
<li>Major reorganization of four of CTA’s French collections</li>
<li>Basically moving their items into the English ones, then moving the English ones to the top level of the CTA community, and deleting the old sub-communities</li>
<li>Move collection 10568/51821 from 10568/42212 to 10568/42211</li>
<li>Move collection 10568/51400 from 10568/42214 to 10568/42211</li>
<li>Move collection 10568/56992 from 10568/42216 to 10568/42211</li>
<li>Move collection 10568/42218 from 10568/42217 to 10568/42211</li>
<li>Export CSV of collection 10568/63484 and move items to collection 10568/51400</li>
<li>Export CSV of collection 10568/64403 and move items to collection 10568/56992</li>
<li>Export CSV of collection 10568/56994 and move items to collection 10568/42218</li>
<li>There are blank lines in this metadata, which cause DSpace to not detect changes in the CSVs</li>
<li>I had to use OpenRefine to remove all columns from the CSV except <code>id</code> and <code>collection</code>, and then update the <code>collection</code> field for the new mappings</li>
<li>Remove empty sub-communities: 10568/42212, 10568/42214, 10568/42216, 10568/42217</li>
<li>I was in the middle of applying the metadata imports on CGSpace when the system ran out of PostgreSQL connections…</li>
<li>There were 128 PostgreSQL connections at the time… grrrr.</li>
<li>So I restarted Tomcat 7 and restarted the imports</li>
<li>I assume the PostgreSQL transactions were fine, but I will remove the Discovery index for their community and re-run the light-weight indexing to hopefully reconstruct everything:</li>
</ul>

<pre><code>$ dspace index-discovery -r 10568/42211
$ schedtool -D -e ionice -c2 -n7 nice -n19 dspace index-discovery
</code></pre>

<ul>
<li>The PostgreSQL issues are getting out of control; I need to figure out how to enable connection pools in Tomcat! (see the sketch below)</li>
</ul>
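
<ul>
<li>A minimal sketch of what that might look like: a JNDI <code>DataSource</code> in Tomcat’s <code>context.xml</code>, which DSpace can be pointed at via its <code>db.jndi</code> setting (the resource name and pool sizes here are placeholders, not final values):</li>
</ul>

<pre><code><!-- Tomcat-managed PostgreSQL connection pool (commons-dbcp attribute names on Tomcat 7) -->
<Resource name="jdbc/dspace" auth="Container" type="javax.sql.DataSource"
          driverClassName="org.postgresql.Driver"
          url="jdbc:postgresql://localhost:5432/dspace"
          username="dspace" password="dspace"
          maxActive="50" maxIdle="10" maxWait="10000"/>
</code></pre>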

</article>

</div> <!-- /.blog-main -->

<aside class="col-sm-3 ml-auto blog-sidebar">

<section class="sidebar-module">
<h4>Recent Posts</h4>
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2017-12/">December, 2017</a></li>
<li><a href="/cgspace-notes/2017-11/">November, 2017</a></li>
<li><a href="/cgspace-notes/2017-10/">October, 2017</a></li>
<li><a href="/cgspace-notes/cgiar-library-migration/">CGIAR Library Migration</a></li>
<li><a href="/cgspace-notes/2017-09/">September, 2017</a></li>
</ol>
</section>

<section class="sidebar-module">
<h4>Links</h4>
<ol class="list-unstyled">
<li><a href="https://cgspace.cgiar.org">CGSpace</a></li>
<li><a href="https://dspacetest.cgiar.org">DSpace Test</a></li>
<li><a href="https://github.com/ilri/DSpace">CGSpace @ GitHub</a></li>
</ol>
</section>

</aside>

</div> <!-- /.row -->
</div> <!-- /.container -->

<footer class="blog-footer">
<p>
Blog template created by <a href="https://twitter.com/mdo">@mdo</a>, ported to Hugo by <a href='https://twitter.com/mralanorth'>@mralanorth</a>.
</p>
<p>
<a href="#">Back to top</a>
</p>
</footer>

</body>

</html>