<?xml version="1.0" encoding="utf-8" standalone="yes" ?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
<channel>
<title>Posts on CGSpace Notes</title>
<link>https://alanorth.github.io/cgspace-notes/posts/</link>
<description>Recent content in Posts on CGSpace Notes</description>
<generator>Hugo -- gohugo.io</generator>
<language>en-us</language>
<lastBuildDate>Fri, 01 Feb 2019 21:37:30 +0200</lastBuildDate>
<atom:link href="https://alanorth.github.io/cgspace-notes/posts/index.xml" rel="self" type="application/rss+xml" />
<item>
<title>February, 2019</title>
<link>https://alanorth.github.io/cgspace-notes/2019-02/</link>
<pubDate>Fri, 01 Feb 2019 21:37:30 +0200</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2019-02/</guid>
<description><h2 id="2019-02-01">2019-02-01</h2>

<ul>
<li>Linode has alerted a few times since last night that the CPU usage on CGSpace (linode18) was high despite me increasing the alert threshold last week from 250% to 275%—I might need to increase it again!</li>
<li>The top IPs before, during, and after this latest alert tonight were:</li>
</ul>

<pre><code># zcat --force /var/log/nginx/*.log /var/log/nginx/*.log.1 | grep -E &quot;01/Feb/2019:(17|18|19|20|21)&quot; | awk '{print $1}' | sort | uniq -c | sort -n | tail -n 10
245 207.46.13.5
332 54.70.40.11
385 5.143.231.38
405 207.46.13.173
405 207.46.13.75
1117 66.249.66.219
1121 35.237.175.180
1546 5.9.6.51
2474 45.5.186.2
5490 85.25.237.71
</code></pre>
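
<ul>
<li>A quick reverse DNS lookup is an easy first check on the top offender (sketch; I haven&rsquo;t recorded the output here):</li>
</ul>

<pre><code>$ host 85.25.237.71
</code></pre>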

<ul>
<li><code>85.25.237.71</code> is the &ldquo;Linguee Bot&rdquo; that I first saw last month</li>
<li>The Solr statistics the past few months have been very high and I was wondering if the web server logs also showed an increase</li>
<li>There were just over 3 million accesses in the nginx logs last month:</li>
</ul>

<pre><code># time zcat --force /var/log/nginx/* | grep -cE &quot;[0-9]{1,2}/Jan/2019&quot;
3018243

real 0m19.873s
user 0m22.203s
sys 0m1.979s
</code></pre></description>
</item>

<item>
<title>January, 2019</title>
<link>https://alanorth.github.io/cgspace-notes/2019-01/</link>
<pubDate>Wed, 02 Jan 2019 09:48:30 +0200</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2019-01/</guid>
<description><h2 id="2019-01-02">2019-01-02</h2>

<ul>
<li>Linode alerted that CGSpace (linode18) had a higher outbound traffic rate than normal early this morning</li>
<li>I don&rsquo;t see anything interesting in the web server logs around that time though:</li>
</ul>

<pre><code># zcat --force /var/log/nginx/*.log /var/log/nginx/*.log.1 | grep -E &quot;02/Jan/2019:0(1|2|3)&quot; | awk '{print $1}' | sort | uniq -c | sort -n | tail -n 10
92 40.77.167.4
99 210.7.29.100
120 38.126.157.45
177 35.237.175.180
177 40.77.167.32
216 66.249.75.219
225 18.203.76.93
261 46.101.86.248
357 207.46.13.1
903 54.70.40.11
</code></pre></description>
</item>

<item>
<title>December, 2018</title>
<link>https://alanorth.github.io/cgspace-notes/2018-12/</link>
<pubDate>Sun, 02 Dec 2018 02:09:30 +0200</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2018-12/</guid>
<description><h2 id="2018-12-01">2018-12-01</h2>

<ul>
<li>Switch CGSpace (linode18) to use OpenJDK instead of Oracle JDK</li>
<li>I manually installed OpenJDK, then removed Oracle JDK, then re-ran the <a href="http://github.com/ilri/rmg-ansible-public">Ansible playbook</a> to update all configuration files, etc</li>
<li>Then I ran all system updates and restarted the server</li>
</ul>
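
<ul>
<li>For reference, the manual switch was roughly the following (the exact package names are assumptions for an Ubuntu host, not copied from my shell history):</li>
</ul>

<pre><code># apt install openjdk-8-jdk-headless
# apt remove oracle-java8-installer
# update-alternatives --config java
</code></pre>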

<h2 id="2018-12-02">2018-12-02</h2>

<ul>
<li>I noticed that there is another issue with PDF thumbnails on CGSpace, and I see there was another <a href="https://usn.ubuntu.com/3831-1/">Ghostscript vulnerability last week</a></li>
</ul></description>
</item>

<item>
<title>November, 2018</title>
<link>https://alanorth.github.io/cgspace-notes/2018-11/</link>
<pubDate>Thu, 01 Nov 2018 16:41:30 +0200</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2018-11/</guid>
<description><h2 id="2018-11-01">2018-11-01</h2>

<ul>
<li>Finalize AReS Phase I and Phase II ToRs</li>
<li>Send a note about my <a href="https://github.com/ilri/dspace-statistics-api">dspace-statistics-api</a> to the dspace-tech mailing list</li>
</ul>

<h2 id="2018-11-03">2018-11-03</h2>

<ul>
<li>Linode has been sending mails a few times a day recently that CGSpace (linode18) has had high CPU usage</li>
<li>Today these are the top 10 IPs:</li>
</ul></description>
</item>

<item>
<title>October, 2018</title>
<link>https://alanorth.github.io/cgspace-notes/2018-10/</link>
<pubDate>Mon, 01 Oct 2018 22:31:54 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2018-10/</guid>
<description><h2 id="2018-10-01">2018-10-01</h2>

<ul>
<li>Phil Thornton got an ORCID identifier so we need to add it to the list on CGSpace and tag his existing items</li>
<li>I created a GitHub issue to track this <a href="https://github.com/ilri/DSpace/issues/389">#389</a>, because I&rsquo;m super busy in Nairobi right now</li>
</ul></description>
</item>

<item>
<title>September, 2018</title>
<link>https://alanorth.github.io/cgspace-notes/2018-09/</link>
<pubDate>Sun, 02 Sep 2018 09:55:54 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2018-09/</guid>
<description><h2 id="2018-09-02">2018-09-02</h2>

<ul>
<li>New <a href="https://jdbc.postgresql.org/documentation/changelog.html#version_42.2.5">PostgreSQL JDBC driver version 42.2.5</a></li>
<li>I&rsquo;ll update the DSpace role in our <a href="https://github.com/ilri/rmg-ansible-public">Ansible infrastructure playbooks</a> and run the updated playbooks on CGSpace and DSpace Test</li>
<li>Also, I&rsquo;ll re-run the <code>postgresql</code> tasks because the custom PostgreSQL variables are dynamic according to the system&rsquo;s RAM, and we never re-ran them after migrating to larger Linodes last month</li>
<li>I&rsquo;m testing the new DSpace 5.8 branch in my Ubuntu 18.04 environment and I&rsquo;m getting those autowire errors in Tomcat 8.5.30 again:</li>
</ul></description>
</item>

<item>
<title>August, 2018</title>
<link>https://alanorth.github.io/cgspace-notes/2018-08/</link>
<pubDate>Wed, 01 Aug 2018 11:52:54 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2018-08/</guid>
<description><h2 id="2018-08-01">2018-08-01</h2>

<ul>
<li>DSpace Test had crashed at some point yesterday morning and I see the following in <code>dmesg</code>:</li>
</ul>

<pre><code>[Tue Jul 31 00:00:41 2018] Out of memory: Kill process 1394 (java) score 668 or sacrifice child
[Tue Jul 31 00:00:41 2018] Killed process 1394 (java) total-vm:15601860kB, anon-rss:5355528kB, file-rss:0kB, shmem-rss:0kB
[Tue Jul 31 00:00:41 2018] oom_reaper: reaped process 1394 (java), now anon-rss:0kB, file-rss:0kB, shmem-rss:0kB
</code></pre>

<ul>
<li>Judging from the time of the crash it was probably related to the Discovery indexing that starts at midnight</li>
<li>From the DSpace log I see that eventually Solr stopped responding, so I guess the <code>java</code> process that was OOM killed above was Tomcat&rsquo;s</li>
<li>I&rsquo;m not sure why Tomcat didn&rsquo;t crash with an OutOfMemoryError&hellip;</li>
<li>Anyways, perhaps I should increase the JVM heap from 5120m to 6144m like we did a few months ago when we tried to run the whole CGSpace Solr core</li>
<li>The server only has 8GB of RAM so we&rsquo;ll eventually need to upgrade to a larger one because we&rsquo;ll start starving the OS, PostgreSQL, and command line batch processes</li>
<li>I ran all system updates on DSpace Test and rebooted it</li>
</ul></description>
</item>

<item>
<title>July, 2018</title>
<link>https://alanorth.github.io/cgspace-notes/2018-07/</link>
<pubDate>Sun, 01 Jul 2018 12:56:54 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2018-07/</guid>
<description><h2 id="2018-07-01">2018-07-01</h2>

<ul>
<li>I want to upgrade DSpace Test to DSpace 5.8 so I took a backup of its current database just in case:</li>
</ul>

<pre><code>$ pg_dump -b -v -o --format=custom -U dspace -f dspace-2018-07-01.backup dspace
</code></pre>
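
<ul>
<li>If the upgrade goes badly I should be able to restore that custom-format dump with <code>pg_restore</code>, something like this (a sketch I have not actually run yet):</li>
</ul>

<pre><code>$ pg_restore -v --clean --if-exists -U dspace -d dspace dspace-2018-07-01.backup
</code></pre>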

<ul>
<li>During the <code>mvn package</code> stage on the 5.8 branch I kept getting issues with java running out of memory:</li>
</ul>

<pre><code>There is insufficient memory for the Java Runtime Environment to continue.
</code></pre></description>
</item>

<item>
<title>June, 2018</title>
<link>https://alanorth.github.io/cgspace-notes/2018-06/</link>
<pubDate>Mon, 04 Jun 2018 19:49:54 -0700</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2018-06/</guid>
<description><h2 id="2018-06-04">2018-06-04</h2>

<ul>
<li>Test the <a href="https://tracker.atmire.com/tickets-cgiar-ilri/view-ticket?id=560">DSpace 5.8 module upgrades from Atmire</a> (<a href="https://github.com/ilri/DSpace/pull/378">#378</a>)

<ul>
<li>There seems to be a problem with the CUA and L&amp;R versions in <code>pom.xml</code> because they are using SNAPSHOT and it doesn&rsquo;t build</li>
</ul></li>
<li>I added the new CCAFS Phase II Project Tag <code>PII-FP1_PACCA2</code> and merged it into the <code>5_x-prod</code> branch (<a href="https://github.com/ilri/DSpace/pull/379">#379</a>)</li>
<li>I proofed and tested the ILRI author corrections that Peter sent back to me this week:</li>
</ul>

<pre><code>$ ./fix-metadata-values.py -i /tmp/2018-05-30-Correct-660-authors.csv -db dspace -u dspace -p 'fuuu' -f dc.contributor.author -t correct -m 3 -n
</code></pre>

<ul>
<li>I think a sane proofing workflow in OpenRefine is to apply the custom text facets for check/delete/remove and illegal characters that I developed in <a href="https://alanorth.github.io/cgspace-notes/2018-03/">March, 2018</a></li>
<li>Time to index ~70,000 items on CGSpace:</li>
</ul>

<pre><code>$ time schedtool -D -e ionice -c2 -n7 nice -n19 [dspace]/bin/dspace index-discovery -b

real 74m42.646s
user 8m5.056s
sys 2m7.289s
</code></pre></description>
</item>

<item>
<title>May, 2018</title>
<link>https://alanorth.github.io/cgspace-notes/2018-05/</link>
<pubDate>Tue, 01 May 2018 16:43:54 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2018-05/</guid>
<description><h2 id="2018-05-01">2018-05-01</h2>

<ul>
<li>I cleared the Solr statistics core on DSpace Test by issuing two commands directly to the Solr admin interface:

<ul>
<li><a href="http://localhost:3000/solr/statistics/update?stream.body=%3Cdelete%3E%3Cquery%3E*:*%3C/query%3E%3C/delete%3E">http://localhost:3000/solr/statistics/update?stream.body=%3Cdelete%3E%3Cquery%3E*:*%3C/query%3E%3C/delete%3E</a></li>
<li><a href="http://localhost:3000/solr/statistics/update?stream.body=%3Ccommit/%3E">http://localhost:3000/solr/statistics/update?stream.body=%3Ccommit/%3E</a></li>
</ul></li>
<li>Then I reduced the JVM heap size from 6144 back to 5120m</li>
<li>Also, I switched it to use OpenJDK instead of Oracle Java, as well as re-worked the <a href="https://github.com/ilri/rmg-ansible-public">Ansible infrastructure scripts</a> to support hosts choosing which distribution they want to use</li>
</ul></description>
</item>

<item>
<title>April, 2018</title>
<link>https://alanorth.github.io/cgspace-notes/2018-04/</link>
<pubDate>Sun, 01 Apr 2018 16:13:54 +0200</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2018-04/</guid>
<description><h2 id="2018-04-01">2018-04-01</h2>

<ul>
<li>I tried to test something on DSpace Test but noticed that it&rsquo;s down since god knows when</li>
<li>Catalina logs at least show some memory errors yesterday:</li>
</ul></description>
</item>

<item>
<title>March, 2018</title>
<link>https://alanorth.github.io/cgspace-notes/2018-03/</link>
<pubDate>Fri, 02 Mar 2018 16:07:54 +0200</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2018-03/</guid>
<description><h2 id="2018-03-02">2018-03-02</h2>

<ul>
<li>Export a CSV of the IITA community metadata for Martin Mueller</li>
</ul></description>
</item>

<item>
<title>February, 2018</title>
<link>https://alanorth.github.io/cgspace-notes/2018-02/</link>
<pubDate>Thu, 01 Feb 2018 16:28:54 +0200</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2018-02/</guid>
<description><h2 id="2018-02-01">2018-02-01</h2>

<ul>
<li>Peter gave feedback on the <code>dc.rights</code> proof of concept that I had sent him last week</li>
<li>We don&rsquo;t need to distinguish between internal and external works, so that makes it just a simple list</li>
<li>Yesterday I figured out how to monitor DSpace sessions using JMX</li>
<li>I copied the logic in the <code>jmx_tomcat_dbpools</code> provided by Ubuntu&rsquo;s <code>munin-plugins-java</code> package and used the stuff I discovered about JMX <a href="https://alanorth.github.io/cgspace-notes/2018-01/">in 2018-01</a></li>
</ul></description>
</item>

<item>
<title>January, 2018</title>
<link>https://alanorth.github.io/cgspace-notes/2018-01/</link>
<pubDate>Tue, 02 Jan 2018 08:35:54 -0800</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2018-01/</guid>
<description><h2 id="2018-01-02">2018-01-02</h2>

<ul>
<li>Uptime Robot noticed that CGSpace went down and up a few times last night, for a few minutes each time</li>
<li>I didn&rsquo;t get any load alerts from Linode and the REST and XMLUI logs don&rsquo;t show anything out of the ordinary</li>
<li>The nginx logs show HTTP 200s until <code>02/Jan/2018:11:27:17 +0000</code> when Uptime Robot got an HTTP 500</li>
<li>In dspace.log around that time I see many errors like &ldquo;Client closed the connection before file download was complete&rdquo;</li>
<li>And just before that I see this:</li>
</ul>

<pre><code>Caused by: org.apache.tomcat.jdbc.pool.PoolExhaustedException: [http-bio-127.0.0.1-8443-exec-980] Timeout: Pool empty. Unable to fetch a connection in 5 seconds, none available[size:50; busy:50; idle:0; lastwait:5000].
</code></pre>

<ul>
<li>Ah hah! So the pool was actually empty!</li>
<li>I need to increase that, let&rsquo;s try to bump it up from 50 to 75</li>
<li>After that one client got an HTTP 499 but then the rest were HTTP 200, so I don&rsquo;t know what the hell Uptime Robot saw</li>
<li>I notice this error quite a few times in dspace.log:</li>
</ul>

<pre><code>2018-01-02 01:21:19,137 ERROR org.dspace.app.xmlui.aspect.discovery.SidebarFacetsTransformer @ Error while searching for sidebar facets
org.dspace.discovery.SearchServiceException: org.apache.solr.search.SyntaxError: Cannot parse 'dateIssued_keyword:[1976+TO+1979]': Encountered &quot; &quot;]&quot; &quot;] &quot;&quot; at line 1, column 32.
</code></pre>
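
<ul>
<li>The parse error suggests the <code>+</code> characters from the URL encoding are reaching Solr&rsquo;s query parser literally; a correctly encoded version of the same range query would look something like this (host, port, and core name here are assumptions):</li>
</ul>

<pre><code>$ curl 'http://localhost:3000/solr/search/select?q=dateIssued_keyword:%5B1976%20TO%201979%5D'
</code></pre>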

<ul>
<li>And there are many of these errors every day for the past month:</li>
</ul>

<pre><code>$ grep -c &quot;Error while searching for sidebar facets&quot; dspace.log.*
dspace.log.2017-11-21:4
dspace.log.2017-11-22:1
dspace.log.2017-11-23:4
dspace.log.2017-11-24:11
dspace.log.2017-11-25:0
dspace.log.2017-11-26:1
dspace.log.2017-11-27:7
dspace.log.2017-11-28:21
dspace.log.2017-11-29:31
dspace.log.2017-11-30:15
dspace.log.2017-12-01:15
dspace.log.2017-12-02:20
dspace.log.2017-12-03:38
dspace.log.2017-12-04:65
dspace.log.2017-12-05:43
dspace.log.2017-12-06:72
dspace.log.2017-12-07:27
dspace.log.2017-12-08:15
dspace.log.2017-12-09:29
dspace.log.2017-12-10:35
dspace.log.2017-12-11:20
dspace.log.2017-12-12:44
dspace.log.2017-12-13:36
dspace.log.2017-12-14:59
dspace.log.2017-12-15:104
dspace.log.2017-12-16:53
dspace.log.2017-12-17:66
dspace.log.2017-12-18:83
dspace.log.2017-12-19:101
dspace.log.2017-12-20:74
dspace.log.2017-12-21:55
dspace.log.2017-12-22:66
dspace.log.2017-12-23:50
dspace.log.2017-12-24:85
dspace.log.2017-12-25:62
dspace.log.2017-12-26:49
dspace.log.2017-12-27:30
dspace.log.2017-12-28:54
dspace.log.2017-12-29:68
dspace.log.2017-12-30:89
dspace.log.2017-12-31:53
dspace.log.2018-01-01:45
dspace.log.2018-01-02:34
</code></pre>
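
<ul>
<li>A quick way to total those daily counts (the <code>grep -c</code> output is <code>file:count</code>, so sum the second field):</li>
</ul>

<pre><code>$ grep -c &quot;Error while searching for sidebar facets&quot; dspace.log.* | awk -F: '{sum += $2} END {print sum}'
</code></pre>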

<ul>
<li>Danny wrote to ask for help renewing the wildcard ilri.org certificate and I advised that we should probably use Let&rsquo;s Encrypt if it&rsquo;s just a handful of domains</li>
</ul></description>
</item>

<item>
<title>December, 2017</title>
<link>https://alanorth.github.io/cgspace-notes/2017-12/</link>
<pubDate>Fri, 01 Dec 2017 13:53:54 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2017-12/</guid>
<description><h2 id="2017-12-01">2017-12-01</h2>

<ul>
<li>Uptime Robot noticed that CGSpace went down</li>
<li>The logs say &ldquo;Timeout waiting for idle object&rdquo;</li>
<li>PostgreSQL activity says there are 115 connections currently</li>
<li>The list of connections to XMLUI and REST API for today:</li>
</ul></description>
</item>

<item>
<title>November, 2017</title>
<link>https://alanorth.github.io/cgspace-notes/2017-11/</link>
<pubDate>Thu, 02 Nov 2017 09:37:54 +0200</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2017-11/</guid>
<description><h2 id="2017-11-01">2017-11-01</h2>

<ul>
<li>The CORE developers responded to say they are looking into their bot not respecting our robots.txt</li>
</ul>

<h2 id="2017-11-02">2017-11-02</h2>

<ul>
<li>Today there have been no hits by CORE and no alerts from Linode (coincidence?)</li>
</ul>

<pre><code># grep -c &quot;CORE&quot; /var/log/nginx/access.log
0
</code></pre>

<ul>
<li>Generate list of authors on CGSpace for Peter to go through and correct:</li>
</ul>

<pre><code>dspace=# \copy (select distinct text_value, count(*) as count from metadatavalue where metadata_field_id = (select metadata_field_id from metadatafieldregistry where element = 'contributor' and qualifier = 'author') AND resource_type_id = 2 group by text_value order by count desc) to /tmp/authors.csv with csv;
COPY 54701
</code></pre></description>
</item>

<item>
<title>October, 2017</title>
<link>https://alanorth.github.io/cgspace-notes/2017-10/</link>
<pubDate>Sun, 01 Oct 2017 08:07:54 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2017-10/</guid>
<description><h2 id="2017-10-01">2017-10-01</h2>

<ul>
<li>Peter emailed to point out that many items in the <a href="https://cgspace.cgiar.org/handle/10568/2703">ILRI archive collection</a> have multiple handles:</li>
</ul>

<pre><code>http://hdl.handle.net/10568/78495||http://hdl.handle.net/10568/79336
</code></pre>

<ul>
<li>There appears to be a pattern but I&rsquo;ll have to look a bit closer and try to clean them up automatically, either in SQL or in OpenRefine</li>
<li>Add Katherine Lutz to the groups for content submission and edit steps of the CGIAR System collections</li>
</ul></description>
</item>

<item>
<title>CGIAR Library Migration</title>
<link>https://alanorth.github.io/cgspace-notes/cgiar-library-migration/</link>
<pubDate>Mon, 18 Sep 2017 16:38:35 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/cgiar-library-migration/</guid>
<description><p>Rough notes for importing the CGIAR Library content. It was decided that this content would go to a new top-level community called <em>CGIAR System Organization</em>.</p></description>
</item>

<item>
<title>September, 2017</title>
<link>https://alanorth.github.io/cgspace-notes/2017-09/</link>
<pubDate>Thu, 07 Sep 2017 16:54:52 +0700</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2017-09/</guid>
<description><h2 id="2017-09-06">2017-09-06</h2>

<ul>
<li>Linode sent an alert that CGSpace (linode18) was using 261% CPU for the past two hours</li>
</ul>

<h2 id="2017-09-07">2017-09-07</h2>

<ul>
<li>Ask Sisay to clean up the WLE approvers a bit, as Marianne&rsquo;s user account is both in the approvers step as well as the group</li>
</ul></description>
</item>

<item>
<title>August, 2017</title>
<link>https://alanorth.github.io/cgspace-notes/2017-08/</link>
<pubDate>Tue, 01 Aug 2017 11:51:52 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2017-08/</guid>
<description><h2 id="2017-08-01">2017-08-01</h2>

<ul>
<li>Linode sent an alert that CGSpace (linode18) was using 350% CPU for the past two hours</li>
<li>I looked in the Activity pane of the Admin Control Panel and it seems that Google, Baidu, Yahoo, and Bing are all crawling with massive numbers of bots concurrently (~100 total, mostly Baidu and Google)</li>
<li>The good thing is that, according to <code>dspace.log.2017-08-01</code>, they are all using the same Tomcat session</li>
<li>This means our Tomcat Crawler Session Valve is working</li>
<li>But many of the bots are browsing dynamic URLs like:

<ul>
<li>/handle/10568/3353/discover</li>
<li>/handle/10568/16510/browse</li>
</ul></li>
<li>The <code>robots.txt</code> only blocks the top-level <code>/discover</code> and <code>/browse</code> URLs&hellip; we will need to find a way to forbid them from accessing these!</li>
<li>Relevant issue from DSpace Jira (semi resolved in DSpace 6.0): <a href="https://jira.duraspace.org/browse/DS-2962">https://jira.duraspace.org/browse/DS-2962</a></li>
<li>It turns out that we&rsquo;re already adding the <code>X-Robots-Tag &quot;none&quot;</code> HTTP header, but this only forbids the search engine from <em>indexing</em> the page, not crawling it!</li>
<li>Also, the bot has to successfully browse the page first so it can receive the HTTP header&hellip;</li>
<li>We might actually have to <em>block</em> these requests with HTTP 403 depending on the user agent</li>
<li>Abenet pointed out that the CGIAR Library Historical Archive collection I sent July 20th only had ~100 entries, instead of 2415</li>
<li>This was due to newline characters in the <code>dc.description.abstract</code> column, which caused OpenRefine to choke when exporting the CSV</li>
<li>I exported a new CSV from the collection on DSpace Test and then manually removed the characters in vim using <code>g/^$/d</code></li>
<li>Then I cleaned up the author authorities and HTML characters in OpenRefine and sent the file back to Abenet</li>
</ul></description>
</item>

<item>
<title>July, 2017</title>
<link>https://alanorth.github.io/cgspace-notes/2017-07/</link>
<pubDate>Sat, 01 Jul 2017 18:03:52 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2017-07/</guid>
<description><h2 id="2017-07-01">2017-07-01</h2>

<ul>
<li>Run system updates and reboot DSpace Test</li>
</ul>

<h2 id="2017-07-04">2017-07-04</h2>

<ul>
<li>Merge changes for WLE Phase II theme rename (<a href="https://github.com/ilri/DSpace/pull/329">#329</a>)</li>
<li>Looking at extracting the metadata registries from ICARDA&rsquo;s MEL DSpace database so we can compare fields with CGSpace</li>
<li>We can use PostgreSQL&rsquo;s extended output format (<code>-x</code>) plus <code>sed</code> to format the output into quasi XML:</li>
</ul></description>
</item>

<item>
<title>June, 2017</title>
<link>https://alanorth.github.io/cgspace-notes/2017-06/</link>
<pubDate>Thu, 01 Jun 2017 10:14:52 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2017-06/</guid>
<description>2017-06-01 After discussion with WLE and CGSpace content people, we decided to just add one metadata field for the WLE Research Themes The cg.identifier.wletheme field will be used for both Phase I and Phase II Research Themes Then we&rsquo;ll create a new sub-community for Phase II and create collections for the research themes there The current &ldquo;Research Themes&rdquo; community will be renamed to &ldquo;WLE Phase I Research Themes&rdquo; Tagged all items in the current Phase I collections with their appropriate themes Create pull request to add Phase II research themes to the submission form: #328 Add cg.</description>
</item>

<item>
<title>May, 2017</title>
<link>https://alanorth.github.io/cgspace-notes/2017-05/</link>
<pubDate>Mon, 01 May 2017 16:21:52 +0200</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2017-05/</guid>
<description>2017-05-01 ICARDA apparently started working on CG Core on their MEL repository They have done a few cg.* fields, but not very consistent and even copy some of CGSpace items: https://mel.cgiar.org/xmlui/handle/20.500.11766/6911?show=full https://cgspace.cgiar.org/handle/10568/73683 2017-05-02 Atmire got back about the Workflow Statistics issue, and apparently it&rsquo;s a bug in the CUA module so they will send us a pull request 2017-05-04 Sync DSpace Test with database and assetstore from CGSpace Re-deploy DSpace Test with Atmire&rsquo;s CUA patch for workflow statistics, run system updates, and restart the server Now I can see the workflow statistics and am able to select users, but everything returns 0 items Megan says there are still some mapped items are not appearing since last week, so I forced a full index-discovery -b Need to remember to check if the collection has more items (currently 39 on CGSpace, but 118 on the freshly reindexed DSpace Test) tomorrow: https://cgspace.</description>
</item>

<item>
<title>April, 2017</title>
<link>https://alanorth.github.io/cgspace-notes/2017-04/</link>
<pubDate>Sun, 02 Apr 2017 17:08:52 +0200</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2017-04/</guid>
<description><h2 id="2017-04-02">2017-04-02</h2>

<ul>
<li>Merge one change to CCAFS flagships that I had forgotten to remove last month (&ldquo;MANAGING CLIMATE RISK&rdquo;): <a href="https://github.com/ilri/DSpace/pull/317">https://github.com/ilri/DSpace/pull/317</a></li>
<li>Quick proof-of-concept hack to add <code>dc.rights</code> to the input form, including some inline instructions/hints:</li>
</ul>

<p><img src="https://alanorth.github.io/cgspace-notes/2017/04/dc-rights.png" alt="dc.rights in the submission form" /></p>

<ul>
<li>Remove redundant/duplicate text in the DSpace submission license</li>
<li>Testing the CMYK patch on a collection with 650 items:</li>
</ul>

<pre><code>$ [dspace]/bin/dspace filter-media -f -i 10568/16498 -p &quot;ImageMagick PDF Thumbnail&quot; -v &gt;&amp; /tmp/filter-media-cmyk.txt
</code></pre></description>
</item>

<item>
<title>March, 2017</title>
<link>https://alanorth.github.io/cgspace-notes/2017-03/</link>
<pubDate>Wed, 01 Mar 2017 17:08:52 +0200</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2017-03/</guid>
<description><h2 id="2017-03-01">2017-03-01</h2>

<ul>
<li>Run the 279 CIAT author corrections on CGSpace</li>
</ul>
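
<ul>
<li>A correction run like this follows the same pattern as my other metadata correction work (the script arguments and CSV file name here are illustrative, not copied from my notes):</li>
</ul>

<pre><code>$ ./fix-metadata-values.py -i /tmp/2017-03-01-ciat-authors.csv -db dspace -u dspace -p 'fuuu' -f dc.contributor.author -t correct -m 3
</code></pre>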

<h2 id="2017-03-02">2017-03-02</h2>

<ul>
<li>Skype with Michael and Peter, discussing moving the CGIAR Library to CGSpace</li>
<li>CGIAR people possibly open to moving content, redirecting library.cgiar.org to CGSpace and letting CGSpace resolve their handles</li>
<li>They might come in at the top level in one &ldquo;CGIAR System&rdquo; community, or with several communities</li>
<li>I need to spend a bit of time looking at the multiple handle support in DSpace and see if new content can be minted in both handles, or just one?</li>
<li>Need to send Peter and Michael some notes about this in a few days</li>
<li>Also, need to consider talking to Atmire about hiring them to bring ORCiD metadata to REST / OAI</li>
<li>Filed an issue on DSpace issue tracker for the <code>filter-media</code> bug that causes it to process JPGs even when limiting to the PDF thumbnail plugin: <a href="https://jira.duraspace.org/browse/DS-3516">DS-3516</a></li>
<li>Discovered that the ImageMagick <code>filter-media</code> plugin creates JPG thumbnails with the CMYK colorspace when the source PDF is using CMYK</li>
<li>Interestingly, it seems DSpace 4.x&rsquo;s thumbnails were sRGB, but forcing regeneration using DSpace 5.x&rsquo;s ImageMagick plugin creates CMYK JPGs if the source PDF was CMYK (see <a href="https://cgspace.cgiar.org/handle/10568/51999">10568/51999</a>):</li>
</ul>

<pre><code>$ identify ~/Desktop/alc_contrastes_desafios.jpg
/Users/aorth/Desktop/alc_contrastes_desafios.jpg JPEG 464x600 464x600+0+0 8-bit CMYK 168KB 0.000u 0:00.000
</code></pre></description>
</item>

<item>
<title>February, 2017</title>
<link>https://alanorth.github.io/cgspace-notes/2017-02/</link>
<pubDate>Tue, 07 Feb 2017 07:04:52 -0800</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2017-02/</guid>
<description><h2 id="2017-02-07">2017-02-07</h2>

<ul>
<li>An item was mapped twice erroneously again, so I had to remove one of the mappings manually:</li>
</ul>

<pre><code>dspace=# select * from collection2item where item_id = '80278';
  id   | collection_id | item_id
-------+---------------+---------
 92551 |           313 |   80278
 92550 |           313 |   80278
 90774 |          1051 |   80278
(3 rows)
dspace=# delete from collection2item where id = 92551 and item_id = 80278;
DELETE 1
</code></pre>
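
<ul>
<li>To find any other items with duplicate mappings, a query along these lines should work (a sketch, untested):</li>
</ul>

<pre><code>dspace=# select collection_id, item_id, count(*) from collection2item group by collection_id, item_id having count(*) &gt; 1;
</code></pre>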

<ul>
<li>Create issue on GitHub to track the addition of CCAFS Phase II project tags (<a href="https://github.com/ilri/DSpace/issues/301">#301</a>)</li>
<li>Looks like we&rsquo;ll be using <code>cg.identifier.ccafsprojectpii</code> as the field name</li>
</ul></description>
</item>

<item>
<title>January, 2017</title>
<link>https://alanorth.github.io/cgspace-notes/2017-01/</link>
<pubDate>Mon, 02 Jan 2017 10:43:00 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2017-01/</guid>
<description><h2 id="2017-01-02">2017-01-02</h2>

<ul>
<li>I checked to see if the Solr sharding task that is supposed to run on January 1st had run and saw there was an error</li>
<li>I tested on DSpace Test as well and it doesn&rsquo;t work there either</li>
<li>I asked on the dspace-tech mailing list because it seems to be broken, and actually now I&rsquo;m not sure if we&rsquo;ve ever had the sharding task run successfully over all these years</li>
</ul></description>
</item>
|
||
|
||
<item>
<title>December, 2016</title>
<link>https://alanorth.github.io/cgspace-notes/2016-12/</link>
<pubDate>Fri, 02 Dec 2016 10:43:00 +0300</pubDate>

<guid>https://alanorth.github.io/cgspace-notes/2016-12/</guid>
<description><h2 id="2016-12-02">2016-12-02</h2>

<ul>
<li>CGSpace was down for five hours in the morning while I was sleeping</li>
<li>While looking in the logs for errors, I see tons of warnings about Atmire MQM:</li>
</ul>

<pre><code>2016-12-02 03:00:32,352 WARN com.atmire.metadataquality.batchedit.BatchEditConsumer @ BatchEditConsumer should not have been given this kind of Subject in an event, skipping: org.dspace.event.Event(eventType=CREATE, SubjectType=BUNDLE, SubjectID=70316, ObjectType=(Unknown), ObjectID=-1, TimeStamp=1480647632305, dispatcher=1544803905, detail=[null], transactionID=&quot;TX157907838689377964651674089851855413607&quot;)
2016-12-02 03:00:32,353 WARN com.atmire.metadataquality.batchedit.BatchEditConsumer @ BatchEditConsumer should not have been given this kind of Subject in an event, skipping: org.dspace.event.Event(eventType=MODIFY_METADATA, SubjectType=BUNDLE, SubjectID=70316, ObjectType=(Unknown), ObjectID=-1, TimeStamp=1480647632309, dispatcher=1544803905, detail=&quot;dc.title&quot;, transactionID=&quot;TX157907838689377964651674089851855413607&quot;)
2016-12-02 03:00:32,353 WARN com.atmire.metadataquality.batchedit.BatchEditConsumer @ BatchEditConsumer should not have been given this kind of Subject in an event, skipping: org.dspace.event.Event(eventType=ADD, SubjectType=ITEM, SubjectID=80044, ObjectType=BUNDLE, ObjectID=70316, TimeStamp=1480647632311, dispatcher=1544803905, detail=&quot;THUMBNAIL&quot;, transactionID=&quot;TX157907838689377964651674089851855413607&quot;)
2016-12-02 03:00:32,353 WARN com.atmire.metadataquality.batchedit.BatchEditConsumer @ BatchEditConsumer should not have been given this kind of Subject in an event, skipping: org.dspace.event.Event(eventType=ADD, SubjectType=BUNDLE, SubjectID=70316, ObjectType=BITSTREAM, ObjectID=86715, TimeStamp=1480647632318, dispatcher=1544803905, detail=&quot;-1&quot;, transactionID=&quot;TX157907838689377964651674089851855413607&quot;)
2016-12-02 03:00:32,353 WARN com.atmire.metadataquality.batchedit.BatchEditConsumer @ BatchEditConsumer should not have been given this kind of Subject in an event, skipping: org.dspace.event.Event(eventType=MODIFY, SubjectType=ITEM, SubjectID=80044, ObjectType=(Unknown), ObjectID=-1, TimeStamp=1480647632351, dispatcher=1544803905, detail=[null], transactionID=&quot;TX157907838689377964651674089851855413607&quot;)
</code></pre>

<ul>
<li>I see thousands of them in the logs for the last few months, so it&rsquo;s not related to the DSpace 5.5 upgrade</li>
<li>I&rsquo;ve raised a ticket with Atmire to ask</li>
<li>Another worrying error from dspace.log is:</li>
</ul></description>
</item>

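To gauge how noisy those warnings are before raising the ticket, a simple grep count works; a sketch over a fabricated log excerpt (the real check would run against dspace.log):

```shell
# Count Atmire BatchEditConsumer warnings in a sample log excerpt
# (fabricated lines standing in for the real dspace.log).
cat > /tmp/dspace-sample.log <<'EOF'
2016-12-02 03:00:32,352 WARN com.atmire.metadataquality.batchedit.BatchEditConsumer @ skipping
2016-12-02 03:00:32,353 WARN com.atmire.metadataquality.batchedit.BatchEditConsumer @ skipping
2016-12-02 03:01:00,000 INFO org.dspace.core.Context @ created
EOF
grep -c 'BatchEditConsumer' /tmp/dspace-sample.log
```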
<item>
<title>November, 2016</title>
<link>https://alanorth.github.io/cgspace-notes/2016-11/</link>
<pubDate>Tue, 01 Nov 2016 09:21:00 +0300</pubDate>

<guid>https://alanorth.github.io/cgspace-notes/2016-11/</guid>
<description><h2 id="2016-11-01">2016-11-01</h2>

<ul>
<li>Add <code>dc.type</code> to the output options for Atmire&rsquo;s Listings and Reports module (<a href="https://github.com/ilri/DSpace/pull/286">#286</a>)</li>
</ul>

<p><img src="https://alanorth.github.io/cgspace-notes/2016/11/listings-and-reports.png" alt="Listings and Reports with output type" /></p></description>
</item>

<item>
<title>October, 2016</title>
<link>https://alanorth.github.io/cgspace-notes/2016-10/</link>
<pubDate>Mon, 03 Oct 2016 15:53:00 +0300</pubDate>

<guid>https://alanorth.github.io/cgspace-notes/2016-10/</guid>
<description><h2 id="2016-10-03">2016-10-03</h2>

<ul>
<li>Testing adding <a href="https://wiki.duraspace.org/display/DSDOC5x/ORCID+Integration#ORCIDIntegration-EditingexistingitemsusingBatchCSVEditing">ORCIDs to a CSV</a> file for a single item to see if the author orders get messed up</li>
<li>Need to test the following scenarios to see how author order is affected:

<ul>
<li>ORCIDs only</li>
<li>ORCIDs plus normal authors</li>
</ul></li>
<li>I exported a random item&rsquo;s metadata as CSV, deleted <em>all columns</em> except id and collection, and made a new column called <code>ORCID:dc.contributor.author</code> with the following random ORCIDs from the ORCID registry:</li>
</ul>

<pre><code>0000-0002-6115-0956||0000-0002-3812-8793||0000-0001-7462-405X
</code></pre></description>
</item>

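The `||` in that value is DSpace's multi-value separator for batch CSV editing, so the field above carries three ORCIDs; a small sketch splitting on the separator (value copied from above):

```shell
# Count the ||-separated ORCIDs in a DSpace batch-CSV field value;
# awk's field separator is the regex \|\|, i.e. a literal "||".
printf '%s\n' '0000-0002-6115-0956||0000-0002-3812-8793||0000-0001-7462-405X' \
  | awk -F'\\|\\|' '{print NF}'
```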
<item>
<title>September, 2016</title>
<link>https://alanorth.github.io/cgspace-notes/2016-09/</link>
<pubDate>Thu, 01 Sep 2016 15:53:00 +0300</pubDate>

<guid>https://alanorth.github.io/cgspace-notes/2016-09/</guid>
<description><h2 id="2016-09-01">2016-09-01</h2>

<ul>
<li>Discuss helping CCAFS with some batch tagging of ORCID IDs for their authors</li>
<li>Discuss how the migration of CGIAR&rsquo;s Active Directory to a flat structure will break our LDAP groups in DSpace</li>
<li>We had been using <code>DC=ILRI</code> to determine whether a user was ILRI or not</li>
<li>It looks like we might be able to use OUs now, instead of DCs:</li>
</ul>

<pre><code>$ ldapsearch -x -H ldaps://svcgroot2.cgiarad.org:3269/ -b &quot;dc=cgiarad,dc=org&quot; -D &quot;admigration1@cgiarad.org&quot; -W &quot;(sAMAccountName=admigration1)&quot;
</code></pre></description>
</item>

<item>
<title>August, 2016</title>
<link>https://alanorth.github.io/cgspace-notes/2016-08/</link>
<pubDate>Mon, 01 Aug 2016 15:53:00 +0300</pubDate>

<guid>https://alanorth.github.io/cgspace-notes/2016-08/</guid>
<description><h2 id="2016-08-01">2016-08-01</h2>

<ul>
<li>Add updated distribution license from Sisay (<a href="https://github.com/ilri/DSpace/issues/259">#259</a>)</li>
<li>Play with upgrading Mirage 2 dependencies in <code>bower.json</code> because most are several versions out of date</li>
<li>Bootstrap is at 3.3.0 but upstream is at 3.3.7, and upgrading to anything beyond 3.3.1 breaks glyphicons and probably more</li>
<li>The bower upgrades are a dead end: too many issues, and a waste of time</li>
<li>Anything after Bootstrap 3.3.1 makes glyphicons disappear (HTTP 404 when trying to access them from the incorrect <code>fonts</code> path)</li>
<li>Start working on DSpace 5.1 → 5.5 port:</li>
</ul>

<pre><code>$ git checkout -b 55new 5_x-prod
$ git reset --hard ilri/5_x-prod
$ git rebase -i dspace-5.5
</code></pre></description>
</item>

<item>
<title>July, 2016</title>
<link>https://alanorth.github.io/cgspace-notes/2016-07/</link>
<pubDate>Fri, 01 Jul 2016 10:53:00 +0300</pubDate>

<guid>https://alanorth.github.io/cgspace-notes/2016-07/</guid>
<description><h2 id="2016-07-01">2016-07-01</h2>

<ul>
<li>Add <code>dc.description.sponsorship</code> to Discovery sidebar facets and make investors clickable in item view (<a href="https://github.com/ilri/DSpace/issues/232">#232</a>)</li>
<li>I think this query should find and replace all authors that have &ldquo;,&rdquo; at the end of their names:</li>
</ul>

<pre><code>dspacetest=# update metadatavalue set text_value = regexp_replace(text_value, '(^.+?),$', '\1') where metadata_field_id=3 and resource_type_id=2 and text_value ~ '^.+?,$';
UPDATE 95
dspacetest=# select text_value from metadatavalue where metadata_field_id=3 and resource_type_id=2 and text_value ~ '^.+?,$';
text_value
------------
(0 rows)
</code></pre>

<ul>
<li>In this case the select query was showing 95 results before the update</li>
</ul></description>
</item>

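The same trailing-comma regex can be previewed outside PostgreSQL with sed before touching metadatavalue; a hedged sketch on sample author strings (the file name is hypothetical):

```shell
# Mirror regexp_replace(text_value, '(^.+?),$', '\1'): strip exactly
# one trailing comma; names without one pass through unchanged.
printf '%s\n' 'Orth, Alan,' 'Smith, J.' > /tmp/authors.txt
sed -E 's/^(.+),$/\1/' /tmp/authors.txt
```

Because `$` anchors the match, the greedy `(.+)` in sed removes only the final comma, the same effect as the non-greedy `(^.+?)` in PostgreSQL.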
<item>
<title>June, 2016</title>
<link>https://alanorth.github.io/cgspace-notes/2016-06/</link>
<pubDate>Wed, 01 Jun 2016 10:53:00 +0300</pubDate>

<guid>https://alanorth.github.io/cgspace-notes/2016-06/</guid>
<description><h2 id="2016-06-01">2016-06-01</h2>

<ul>
<li>Experimenting with IFPRI OAI (we want to harvest their publications)</li>
<li>After reading the <a href="https://www.oclc.org/support/services/contentdm/help/server-admin-help/oai-support.en.html">ContentDM documentation</a> I found IFPRI&rsquo;s OAI endpoint: <a href="http://ebrary.ifpri.org/oai/oai.php">http://ebrary.ifpri.org/oai/oai.php</a></li>
<li>After reading the <a href="https://www.openarchives.org/OAI/openarchivesprotocol.html">OAI documentation</a> and testing with an <a href="http://validator.oaipmh.com/">OAI validator</a> I found out how to get their publications</li>
<li>This is their publications set: <a href="http://ebrary.ifpri.org/oai/oai.php?verb=ListRecords&amp;from=2016-01-01&amp;set=p15738coll2&amp;metadataPrefix=oai_dc">http://ebrary.ifpri.org/oai/oai.php?verb=ListRecords&amp;from=2016-01-01&amp;set=p15738coll2&amp;metadataPrefix=oai_dc</a></li>
<li>You can see the others by using the OAI <code>ListSets</code> verb: <a href="http://ebrary.ifpri.org/oai/oai.php?verb=ListSets">http://ebrary.ifpri.org/oai/oai.php?verb=ListSets</a></li>
<li>Working on second phase of metadata migration, looks like this will work for moving CPWF-specific data in <code>dc.identifier.fund</code> to <code>cg.identifier.cpwfproject</code> and then the rest to <code>dc.description.sponsorship</code></li>
</ul></description>
</item>

<item>
<title>May, 2016</title>
<link>https://alanorth.github.io/cgspace-notes/2016-05/</link>
<pubDate>Sun, 01 May 2016 23:06:00 +0300</pubDate>

<guid>https://alanorth.github.io/cgspace-notes/2016-05/</guid>
<description><h2 id="2016-05-01">2016-05-01</h2>

<ul>
<li>Since yesterday there have been 10,000 REST errors and the site has been unstable again</li>
<li>I have blocked access to the API now</li>
<li>There are 3,000 IPs accessing the REST API in a 24-hour period!</li>
</ul>

<pre><code># awk '{print $1}' /var/log/nginx/rest.log | uniq | wc -l
3168
</code></pre></description>
</item>

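One caveat with that pipeline: `uniq` only collapses adjacent duplicates, so interleaved requests from the same IP inflate the count; sorting first (or `sort -u`) gives the true number of distinct IPs. A sketch on fabricated log lines:

```shell
# Without sort, uniq would count 1.1.1.1 twice here; sort -u yields the
# real number of distinct client IPs (sample lines, not the real log).
printf '%s\n' '1.1.1.1 GET /rest/items' '2.2.2.2 GET /rest/items' '1.1.1.1 GET /rest/bitstreams' > /tmp/rest-sample.log
awk '{print $1}' /tmp/rest-sample.log | sort -u | wc -l
```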
<item>
<title>April, 2016</title>
<link>https://alanorth.github.io/cgspace-notes/2016-04/</link>
<pubDate>Mon, 04 Apr 2016 11:06:00 +0300</pubDate>

<guid>https://alanorth.github.io/cgspace-notes/2016-04/</guid>
<description><h2 id="2016-04-04">2016-04-04</h2>

<ul>
<li>Looking at log file use on CGSpace and notice that we need to work on our cron setup a bit</li>
<li>We are backing up all logs in the log folder, including useless stuff like solr, cocoon, handle-plugin, etc</li>
<li>After running DSpace for over five years I&rsquo;ve never needed to look in any other log file than dspace.log, let alone one from last year!</li>
<li>This will save us a few gigs of backup space we&rsquo;re paying for on S3</li>
<li>Also, I noticed the <code>checker</code> log has some errors we should pay attention to:</li>
</ul></description>
</item>

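Restricting the backup sweep to dspace.log files could be sketched with find; the directory and file names below are made up for illustration:

```shell
# Select only dspace.log.* files for backup, skipping the solr, cocoon,
# handle-plugin, etc. logs that never get looked at (sample directory).
mkdir -p /tmp/dspace-logs
touch /tmp/dspace-logs/dspace.log.2016-04-01 \
      /tmp/dspace-logs/solr.log.2016-04-01 \
      /tmp/dspace-logs/cocoon.log.2016-04-01
find /tmp/dspace-logs -maxdepth 1 -name 'dspace.log.*' -type f
```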
<item>
<title>March, 2016</title>
<link>https://alanorth.github.io/cgspace-notes/2016-03/</link>
<pubDate>Wed, 02 Mar 2016 16:50:00 +0300</pubDate>

<guid>https://alanorth.github.io/cgspace-notes/2016-03/</guid>
<description><h2 id="2016-03-02">2016-03-02</h2>

<ul>
<li>Looking at issues with author authorities on CGSpace</li>
<li>For some reason we still have the <code>index-lucene-update</code> cron job active on CGSpace, but I&rsquo;m pretty sure we don&rsquo;t need it as of the latest few versions of Atmire&rsquo;s Listings and Reports module</li>
<li>Reinstall my local (Mac OS X) DSpace stack with Tomcat 7, PostgreSQL 9.3, and Java JDK 1.7 to match environment on CGSpace server</li>
</ul></description>
</item>

<item>
<title>February, 2016</title>
<link>https://alanorth.github.io/cgspace-notes/2016-02/</link>
<pubDate>Fri, 05 Feb 2016 13:18:00 +0300</pubDate>

<guid>https://alanorth.github.io/cgspace-notes/2016-02/</guid>
<description><h2 id="2016-02-05">2016-02-05</h2>

<ul>
<li>Looking at some DAGRIS data for Abenet Yabowork</li>
<li>Lots of issues with spaces, newlines, etc causing the import to fail</li>
<li>I noticed we have a very <em>interesting</em> list of countries on CGSpace:</li>
</ul>

<p><img src="https://alanorth.github.io/cgspace-notes/2016/02/cgspace-countries.png" alt="CGSpace country list" /></p>

<ul>
<li>Not only are there 49,000 countries, we have some blanks (25)&hellip;</li>
<li>Also, lots of things like &ldquo;COTE D`LVOIRE&rdquo; and &ldquo;COTE D IVOIRE&rdquo;</li>
</ul></description>
</item>

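Normalizing punctuation and spacing shows which of those country strings are mere formatting variants and which are real typos; a rough sketch (values taken from the list above plus the canonical spelling, though a real cleanup would use OpenRefine-style clustering):

```shell
# Strip spaces, apostrophes, and backticks so formatting variants of
# the same country collapse; the L-for-I typo still stands out.
printf '%s\n' "COTE D'IVOIRE" 'COTE D IVOIRE' 'COTE D`LVOIRE' \
  | tr -d " '\`" | sort | uniq -c
```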
<item>
<title>January, 2016</title>
<link>https://alanorth.github.io/cgspace-notes/2016-01/</link>
<pubDate>Wed, 13 Jan 2016 13:18:00 +0300</pubDate>

<guid>https://alanorth.github.io/cgspace-notes/2016-01/</guid>
<description><h2 id="2016-01-13">2016-01-13</h2>

<ul>
<li>Move ILRI collection <code>10568/12503</code> from <code>10568/27869</code> to <code>10568/27629</code> using the <a href="https://gist.github.com/alanorth/392c4660e8b022d99dfa">move_collections.sh</a> script I wrote last year.</li>
<li>I realized it is only necessary to clear the Cocoon cache after moving collections—rather than reindexing—as no metadata has changed, and therefore no search or browse indexes need to be updated.</li>
<li>Update GitHub wiki for documentation of <a href="https://github.com/ilri/DSpace/wiki/Maintenance-Tasks">maintenance tasks</a>.</li>
</ul></description>
</item>

<item>
<title>December, 2015</title>
<link>https://alanorth.github.io/cgspace-notes/2015-12/</link>
<pubDate>Wed, 02 Dec 2015 13:18:00 +0300</pubDate>

<guid>https://alanorth.github.io/cgspace-notes/2015-12/</guid>
<description><h2 id="2015-12-02">2015-12-02</h2>

<ul>
<li>Replace <code>lzop</code> with <code>xz</code> in log compression cron jobs on DSpace Test—it uses less space:</li>
</ul>

<pre><code># cd /home/dspacetest.cgiar.org/log
# ls -lh dspace.log.2015-11-18*
-rw-rw-r-- 1 tomcat7 tomcat7 2.0M Nov 18 23:59 dspace.log.2015-11-18
-rw-rw-r-- 1 tomcat7 tomcat7 387K Nov 18 23:59 dspace.log.2015-11-18.lzo
-rw-rw-r-- 1 tomcat7 tomcat7 169K Nov 18 23:59 dspace.log.2015-11-18.xz
</code></pre></description>
</item>

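The space saving is easy to reproduce on any log-like file, assuming xz is installed (synthetic data below, not a real dspace.log):

```shell
# Compress a sample log with xz and compare sizes; repetitive log text
# compresses very well, which is why xz beats lzop in the listing above.
seq 1 20000 > /tmp/dspace-sample-log
xz -kf /tmp/dspace-sample-log          # -k keeps the original, -f overwrites
wc -c /tmp/dspace-sample-log /tmp/dspace-sample-log.xz
```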
<item>
<title>November, 2015</title>
<link>https://alanorth.github.io/cgspace-notes/2015-11/</link>
<pubDate>Mon, 23 Nov 2015 17:00:57 +0300</pubDate>

<guid>https://alanorth.github.io/cgspace-notes/2015-11/</guid>
<description><h2 id="2015-11-22">2015-11-22</h2>

<ul>
<li>CGSpace went down</li>
<li>Looks like DSpace exhausted its PostgreSQL connection pool</li>
<li>Last week I had increased the limit from 30 to 60, which seemed to help, but now there are many more idle connections:</li>
</ul>

<pre><code>$ psql -c 'SELECT * from pg_stat_activity;' | grep idle | grep -c cgspace
78
</code></pre></description>
</item>

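That grep pipeline can be checked against canned pg_stat_activity output; a sketch with fabricated rows (the real command queries PostgreSQL via psql):

```shell
# Count idle connections belonging to the cgspace database in a sample
# of pg_stat_activity output (rows below are made up for illustration).
cat > /tmp/pg_stat_activity.txt <<'EOF'
 cgspace | 10.0.0.1 | idle
 cgspace | 10.0.0.2 | idle
 cgspace | 10.0.0.3 | active
 postgres | 127.0.0.1 | idle
EOF
grep idle /tmp/pg_stat_activity.txt | grep -c cgspace
```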
</channel>

</rss>