<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
<channel>
<title>CGSpace Notes</title>
<link>https://alanorth.github.io/cgspace-notes/</link>
<description>Recent content on CGSpace Notes</description>
<generator>Hugo -- gohugo.io</generator>
<language>en-us</language>
<lastBuildDate>Mon, 01 Mar 2021 10:13:54 +0200</lastBuildDate><atom:link href="https://alanorth.github.io/cgspace-notes/index.xml" rel="self" type="application/rss+xml" />
<item>
<title>March, 2021</title>
<link>https://alanorth.github.io/cgspace-notes/2021-03/</link>
<pubDate>Mon, 01 Mar 2021 10:13:54 +0200</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2021-03/</guid>
<description>&lt;h2 id=&#34;2021-03-01&#34;&gt;2021-03-01&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Discuss some OpenRXV issues with Abdullah from CodeObia
&lt;ul&gt;
&lt;li&gt;He&amp;rsquo;s trying to work on the DSpace 6+ metadata schema autoimport using the DSpace 6+ REST API&lt;/li&gt;
&lt;li&gt;Also, we found some issues building and running OpenRXV currently due to ecosystem shift in the Node.js dependencies&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>February, 2021</title>
<link>https://alanorth.github.io/cgspace-notes/2021-02/</link>
<pubDate>Mon, 01 Feb 2021 10:13:54 +0200</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2021-02/</guid>
<description>&lt;h2 id=&#34;2021-02-01&#34;&gt;2021-02-01&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Abenet said that CIP found more duplicate records in their export from AReS
&lt;ul&gt;
&lt;li&gt;I re-opened &lt;a href=&#34;https://github.com/ilri/OpenRXV/issues/67&#34;&gt;the issue&lt;/a&gt; on OpenRXV where we had previously noticed this&lt;/li&gt;
&lt;li&gt;The shared link where the duplicates are is here: &lt;a href=&#34;https://cgspace.cgiar.org/explorer/shared/heEOz3YBnXdK69bR2ra6&#34;&gt;https://cgspace.cgiar.org/explorer/shared/heEOz3YBnXdK69bR2ra6&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;I had a call with CodeObia to discuss the work on OpenRXV&lt;/li&gt;
&lt;li&gt;Check the results of the AReS harvesting from last night:&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code class=&#34;language-console&#34; data-lang=&#34;console&#34;&gt;$ curl -s &#39;http://localhost:9200/openrxv-items-temp/_count?q=*&amp;amp;pretty&#39;
{
&amp;quot;count&amp;quot; : 100875,
&amp;quot;_shards&amp;quot; : {
&amp;quot;total&amp;quot; : 1,
&amp;quot;successful&amp;quot; : 1,
&amp;quot;skipped&amp;quot; : 0,
&amp;quot;failed&amp;quot; : 0
}
}
&lt;/code&gt;&lt;/pre&gt;</description>
</item>
<item>
<title>January, 2021</title>
<link>https://alanorth.github.io/cgspace-notes/2021-01/</link>
<pubDate>Sun, 03 Jan 2021 10:13:54 +0200</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2021-01/</guid>
<description>&lt;h2 id=&#34;2021-01-03&#34;&gt;2021-01-03&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Peter notified me that some filters on AReS were broken again
&lt;ul&gt;
&lt;li&gt;It&amp;rsquo;s the same issue with the field names getting &lt;code&gt;.keyword&lt;/code&gt; appended to the end that I already &lt;a href=&#34;https://github.com/ilri/OpenRXV/issues/66&#34;&gt;filed an issue on OpenRXV about last month&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;I fixed the broken filters (careful to not edit any others, lest they break too!)&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Fix an issue with start page number for the DSpace REST API and statistics API in OpenRXV
&lt;ul&gt;
&lt;li&gt;The start page had been &amp;ldquo;1&amp;rdquo; in the UI, but in the backend they were doing some gymnastics to adjust to the zero-based offset/limit/page of the DSpace REST API and the statistics API&lt;/li&gt;
&lt;li&gt;I adjusted it to default to 0 and added a note to the admin screen&lt;/li&gt;
&lt;li&gt;I realized that this issue was actually causing the first page of 100 statistics to be missing&amp;hellip;&lt;/li&gt;
&lt;li&gt;For example, &lt;a href=&#34;https://cgspace.cgiar.org/handle/10568/66839&#34;&gt;this item&lt;/a&gt; has 51 views on CGSpace, but 0 on AReS&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>December, 2020</title>
<link>https://alanorth.github.io/cgspace-notes/2020-12/</link>
<pubDate>Tue, 01 Dec 2020 11:32:54 +0200</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2020-12/</guid>
<description>&lt;h2 id=&#34;2020-12-01&#34;&gt;2020-12-01&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Atmire responded about the issue with duplicate data in our Solr statistics
&lt;ul&gt;
&lt;li&gt;They noticed that some records in the statistics-2015 core haven&amp;rsquo;t been migrated with the AtomicStatisticsUpdateCLI tool yet and assumed that I haven&amp;rsquo;t migrated any of the records yet&lt;/li&gt;
&lt;li&gt;That&amp;rsquo;s strange, as I checked all ten cores and 2015 is the only one with some unmigrated documents, according to the &lt;code&gt;cua_version&lt;/code&gt; field&lt;/li&gt;
&lt;li&gt;I started processing those (about 411,000 records):&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>CGSpace DSpace 6 Upgrade</title>
<link>https://alanorth.github.io/cgspace-notes/cgspace-dspace6-upgrade/</link>
<pubDate>Sun, 15 Nov 2020 13:27:35 +0200</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/cgspace-dspace6-upgrade/</guid>
<description>&lt;p&gt;Notes about the DSpace 6 upgrade on CGSpace in 2020-11.&lt;/p&gt;</description>
</item>
<item>
<title>November, 2020</title>
<link>https://alanorth.github.io/cgspace-notes/2020-11/</link>
<pubDate>Sun, 01 Nov 2020 13:11:54 +0200</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2020-11/</guid>
<description>&lt;h2 id=&#34;2020-11-01&#34;&gt;2020-11-01&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Continue with processing the statistics-2019 Solr core with the AtomicStatisticsUpdateCLI tool on DSpace Test
&lt;ul&gt;
&lt;li&gt;So far we&amp;rsquo;ve spent at least fifty hours processing the statistics and statistics-2019 cores&amp;hellip; wow.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>October, 2020</title>
<link>https://alanorth.github.io/cgspace-notes/2020-10/</link>
<pubDate>Tue, 06 Oct 2020 16:55:54 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2020-10/</guid>
<description>&lt;h2 id=&#34;2020-10-06&#34;&gt;2020-10-06&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Add tests for the new &lt;code&gt;/items&lt;/code&gt; POST handlers to the DSpace 6.x branch of my &lt;a href=&#34;https://github.com/ilri/dspace-statistics-api/tree/v6_x&#34;&gt;dspace-statistics-api&lt;/a&gt;
&lt;ul&gt;
&lt;li&gt;It took a bit of extra work because I had to learn how to mock the responses for when Solr is not available&lt;/li&gt;
&lt;li&gt;Tag and release version 1.3.0 on GitHub: &lt;a href=&#34;https://github.com/ilri/dspace-statistics-api/releases/tag/v1.3.0&#34;&gt;https://github.com/ilri/dspace-statistics-api/releases/tag/v1.3.0&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Trying to test the changes Atmire sent last week but I had to re-create my local database from a recent CGSpace dump
&lt;ul&gt;
&lt;li&gt;During the FlywayDB migration I got an error:&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>September, 2020</title>
<link>https://alanorth.github.io/cgspace-notes/2020-09/</link>
<pubDate>Wed, 02 Sep 2020 15:35:54 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2020-09/</guid>
<description>&lt;h2 id=&#34;2020-09-02&#34;&gt;2020-09-02&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Replace Marissa van Epp with Rhys Bucknall in the CCAFS groups on CGSpace because Marissa no longer works at CCAFS&lt;/li&gt;
&lt;li&gt;The AReS Explorer hasn&amp;rsquo;t updated its index since 2020-08-22 when I last forced it
&lt;ul&gt;
&lt;li&gt;I restarted it again now and told Moayad that the automatic indexing isn&amp;rsquo;t working&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Add &lt;code&gt;Alliance of Bioversity International and CIAT&lt;/code&gt; to affiliations on CGSpace&lt;/li&gt;
&lt;li&gt;Abenet told me that the general search text on AReS doesn&amp;rsquo;t get reset when you use the &amp;ldquo;Reset Filters&amp;rdquo; button
&lt;ul&gt;
&lt;li&gt;I filed a bug on OpenRXV: &lt;a href=&#34;https://github.com/ilri/OpenRXV/issues/39&#34;&gt;https://github.com/ilri/OpenRXV/issues/39&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;I filed an issue on OpenRXV to make some minor edits to the admin UI: &lt;a href=&#34;https://github.com/ilri/OpenRXV/issues/40&#34;&gt;https://github.com/ilri/OpenRXV/issues/40&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>August, 2020</title>
<link>https://alanorth.github.io/cgspace-notes/2020-08/</link>
<pubDate>Sun, 02 Aug 2020 15:35:54 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2020-08/</guid>
<description>&lt;h2 id=&#34;2020-08-02&#34;&gt;2020-08-02&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;I spent a few days working on a Java-based curation task to tag items with ISO 3166-1 Alpha2 country codes based on their &lt;code&gt;cg.coverage.country&lt;/code&gt; text values
&lt;ul&gt;
&lt;li&gt;It looks up the names in ISO 3166-1 first, and then in our CGSpace countries mapping (which has five or so of Peter&amp;rsquo;s preferred &amp;ldquo;display&amp;rdquo; country names)&lt;/li&gt;
&lt;li&gt;It implements a &amp;ldquo;force&amp;rdquo; mode too that will clear existing country codes and re-tag everything&lt;/li&gt;
&lt;li&gt;It is class based so I can easily add support for other vocabularies, and the technique could even be used for organizations with mappings to ROR and Clarisa&amp;hellip;&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>July, 2020</title>
<link>https://alanorth.github.io/cgspace-notes/2020-07/</link>
<pubDate>Wed, 01 Jul 2020 10:53:54 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2020-07/</guid>
<description>&lt;h2 id=&#34;2020-07-01&#34;&gt;2020-07-01&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;A few users noticed that CGSpace wasn&amp;rsquo;t loading items today, item pages seem blank
&lt;ul&gt;
&lt;li&gt;I looked at the PostgreSQL locks but they don&amp;rsquo;t seem unusual&lt;/li&gt;
&lt;li&gt;I guess this is the same &amp;ldquo;blank item page&amp;rdquo; issue that we had a few times in 2019 that we never solved&lt;/li&gt;
&lt;li&gt;I restarted Tomcat and PostgreSQL and the issue was gone&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Since I was restarting Tomcat anyways I decided to redeploy the latest changes from the &lt;code&gt;5_x-prod&lt;/code&gt; branch and I added a note about COVID-19 items to the CGSpace frontpage at Peter&amp;rsquo;s request&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>June, 2020</title>
<link>https://alanorth.github.io/cgspace-notes/2020-06/</link>
<pubDate>Mon, 01 Jun 2020 13:55:39 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2020-06/</guid>
<description>&lt;h2 id=&#34;2020-06-01&#34;&gt;2020-06-01&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;I tried to run the &lt;code&gt;AtomicStatisticsUpdateCLI&lt;/code&gt; CUA migration script on DSpace Test (linode26) again and it is still going very slowly and has tons of errors like I noticed yesterday
&lt;ul&gt;
&lt;li&gt;I sent Atmire the dspace.log from today and told them to log into the server to debug the process&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;In other news, I checked the statistics API on DSpace 6 and it&amp;rsquo;s working&lt;/li&gt;
&lt;li&gt;I tried to build the OAI registry on the freshly migrated DSpace 6 on DSpace Test and I get an error:&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>May, 2020</title>
<link>https://alanorth.github.io/cgspace-notes/2020-05/</link>
<pubDate>Sat, 02 May 2020 09:52:04 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2020-05/</guid>
<description>&lt;h2 id=&#34;2020-05-02&#34;&gt;2020-05-02&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Peter said that CTA is having problems submitting an item to CGSpace
&lt;ul&gt;
&lt;li&gt;Looking at the PostgreSQL stats it seems to be the same issue that Tezira was having last week, as I see the number of connections in the &amp;lsquo;idle in transaction&amp;rsquo; and &amp;lsquo;waiting for lock&amp;rsquo; states is increasing again&lt;/li&gt;
&lt;li&gt;I see that CGSpace (linode18) is still using PostgreSQL JDBC driver version 42.2.11, and there were some bugs related to transactions fixed in 42.2.12 (which I had updated in the Ansible playbooks, but not deployed yet)&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>April, 2020</title>
<link>https://alanorth.github.io/cgspace-notes/2020-04/</link>
<pubDate>Thu, 02 Apr 2020 10:53:24 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2020-04/</guid>
<description>&lt;h2 id=&#34;2020-04-02&#34;&gt;2020-04-02&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Maria asked me to update Charles Staver&amp;rsquo;s ORCID iD in the submission template and on CGSpace, as his name was lower case before, and now he has corrected it
&lt;ul&gt;
&lt;li&gt;I updated the fifty-eight existing items on CGSpace&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Looking into the items Udana had asked about last week that were missing Altmetric donuts:
&lt;ul&gt;
&lt;li&gt;&lt;a href=&#34;https://hdl.handle.net/10568/103225&#34;&gt;The first&lt;/a&gt; is still missing its DOI, so I added it and &lt;a href=&#34;https://twitter.com/mralanorth/status/1245632619661766657&#34;&gt;tweeted its handle&lt;/a&gt; (after a few hours there was a donut with score 222)&lt;/li&gt;
&lt;li&gt;&lt;a href=&#34;https://hdl.handle.net/10568/106899&#34;&gt;The second item&lt;/a&gt; now has a donut with score 2 since I &lt;a href=&#34;https://twitter.com/mralanorth/status/1243158045540134913&#34;&gt;tweeted its handle&lt;/a&gt; last week&lt;/li&gt;
&lt;li&gt;&lt;a href=&#34;https://hdl.handle.net/10568/107258&#34;&gt;The third item&lt;/a&gt; now has a donut with score 1 since I &lt;a href=&#34;https://twitter.com/mralanorth/status/1243158786392625153&#34;&gt;tweeted it&lt;/a&gt; last week&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;On the same note, the &lt;a href=&#34;https://hdl.handle.net/10568/106573&#34;&gt;one item&lt;/a&gt; Abenet pointed out last week now has a donut with score of 104 after I &lt;a href=&#34;https://twitter.com/mralanorth/status/1243163710241345536&#34;&gt;tweeted it&lt;/a&gt; last week&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>March, 2020</title>
<link>https://alanorth.github.io/cgspace-notes/2020-03/</link>
<pubDate>Mon, 02 Mar 2020 12:31:30 +0200</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2020-03/</guid>
<description>&lt;h2 id=&#34;2020-03-02&#34;&gt;2020-03-02&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Update &lt;a href=&#34;https://github.com/ilri/dspace-statistics-api&#34;&gt;dspace-statistics-api&lt;/a&gt; for DSpace 6+ UUIDs
&lt;ul&gt;
&lt;li&gt;Tag version 1.2.0 on GitHub&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Test migrating legacy Solr statistics to UUIDs with the as-of-yet unreleased &lt;a href=&#34;https://github.com/DSpace/DSpace/commit/184f2b2153479045fba6239342c63e7f8564b8b6#diff-0350ce2e13b28d5d61252b7a8f50a059&#34;&gt;SolrUpgradePre6xStatistics.java&lt;/a&gt;
&lt;ul&gt;
&lt;li&gt;You need to download this into the DSpace 6.x source and compile it&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>February, 2020</title>
<link>https://alanorth.github.io/cgspace-notes/2020-02/</link>
<pubDate>Sun, 02 Feb 2020 11:56:30 +0200</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2020-02/</guid>
<description>&lt;h2 id=&#34;2020-02-02&#34;&gt;2020-02-02&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Continue working on porting CGSpace&amp;rsquo;s DSpace 5 code to DSpace 6.3 that I started yesterday
&lt;ul&gt;
&lt;li&gt;Sign up for an account with MaxMind so I can get the GeoLite2-City.mmdb database&lt;/li&gt;
&lt;li&gt;I still need to wire up the API credentials and cron job into the Ansible infrastructure playbooks&lt;/li&gt;
&lt;li&gt;Fix some minor issues in the config and XMLUI themes, like removing Atmire stuff&lt;/li&gt;
&lt;li&gt;The code finally builds and runs with a fresh install&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>January, 2020</title>
<link>https://alanorth.github.io/cgspace-notes/2020-01/</link>
<pubDate>Mon, 06 Jan 2020 10:48:30 +0200</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2020-01/</guid>
<description>&lt;h2 id=&#34;2020-01-06&#34;&gt;2020-01-06&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Open &lt;a href=&#34;https://tracker.atmire.com/tickets-cgiar-ilri/view-ticket?id=706&#34;&gt;a ticket&lt;/a&gt; with Atmire to request a quote for the upgrade to DSpace 6&lt;/li&gt;
&lt;li&gt;Last week Altmetric responded about the &lt;a href=&#34;https://hdl.handle.net/10568/97087&#34;&gt;item&lt;/a&gt; that had a lower score than its DOI
&lt;ul&gt;
&lt;li&gt;The score is now linked to the DOI&lt;/li&gt;
&lt;li&gt;Another &lt;a href=&#34;https://hdl.handle.net/10568/91278&#34;&gt;item&lt;/a&gt; that had the same problem in 2019 is now also linked to the score for its DOI&lt;/li&gt;
&lt;li&gt;Another &lt;a href=&#34;https://hdl.handle.net/10568/81236&#34;&gt;item&lt;/a&gt; that had the same problem in 2019 has also been fixed&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id=&#34;2020-01-07&#34;&gt;2020-01-07&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Peter Ballantyne highlighted one more WLE &lt;a href=&#34;https://hdl.handle.net/10568/101286&#34;&gt;item&lt;/a&gt; that is missing the Altmetric score that its DOI has
&lt;ul&gt;
&lt;li&gt;The DOI has a score of 259, but the Handle has no score at all&lt;/li&gt;
&lt;li&gt;I &lt;a href=&#34;https://twitter.com/mralanorth/status/1214471427157626881&#34;&gt;tweeted&lt;/a&gt; the CGSpace repository link&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>December, 2019</title>
<link>https://alanorth.github.io/cgspace-notes/2019-12/</link>
<pubDate>Sun, 01 Dec 2019 11:22:30 +0200</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2019-12/</guid>
<description>&lt;h2 id=&#34;2019-12-01&#34;&gt;2019-12-01&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Upgrade CGSpace (linode18) to Ubuntu 18.04:
&lt;ul&gt;
&lt;li&gt;Check any packages that have residual configs and purge them:&lt;/li&gt;
&lt;li&gt;&lt;code&gt;# dpkg -l | grep -E &#39;^rc&#39; | awk &#39;{print $2}&#39; | xargs dpkg -P&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;Make sure all packages are up to date and the package manager is up to date, then reboot:&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code&gt;# apt update &amp;amp;&amp;amp; apt full-upgrade
# apt-get autoremove &amp;amp;&amp;amp; apt-get autoclean
# dpkg -C
# reboot
&lt;/code&gt;&lt;/pre&gt;</description>
</item>
<item>
<title>November, 2019</title>
<link>https://alanorth.github.io/cgspace-notes/2019-11/</link>
<pubDate>Mon, 04 Nov 2019 12:20:30 +0200</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2019-11/</guid>
<description>&lt;h2 id=&#34;2019-11-04&#34;&gt;2019-11-04&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Peter noticed that there were 5.2 million hits on CGSpace in 2019-10 according to the Atmire usage statistics
&lt;ul&gt;
&lt;li&gt;I looked in the nginx logs and see 4.6 million in the access logs, and 1.2 million in the API logs:&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code&gt;# zcat --force /var/log/nginx/*access.log.*.gz | grep -cE &amp;quot;[0-9]{1,2}/Oct/2019&amp;quot;
4671942
# zcat --force /var/log/nginx/{rest,oai,statistics}.log.*.gz | grep -cE &amp;quot;[0-9]{1,2}/Oct/2019&amp;quot;
1277694
&lt;/code&gt;&lt;/pre&gt;&lt;ul&gt;
&lt;li&gt;So 4.6 million from XMLUI and another 1.2 million from API requests&lt;/li&gt;
&lt;li&gt;Let&amp;rsquo;s see how many of the REST API requests were for bitstreams (because they are counted in Solr stats):&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code&gt;# zcat --force /var/log/nginx/rest.log.*.gz | grep -c -E &amp;quot;[0-9]{1,2}/Oct/2019&amp;quot;
1183456
# zcat --force /var/log/nginx/rest.log.*.gz | grep -E &amp;quot;[0-9]{1,2}/Oct/2019&amp;quot; | grep -c -E &amp;quot;/rest/bitstreams&amp;quot;
106781
&lt;/code&gt;&lt;/pre&gt;</description>
</item>
<item>
<title>CGSpace CG Core v2 Migration</title>
<link>https://alanorth.github.io/cgspace-notes/cgspace-cgcorev2-migration/</link>
<pubDate>Mon, 28 Oct 2019 13:27:35 +0200</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/cgspace-cgcorev2-migration/</guid>
<description>&lt;p&gt;Possible changes to CGSpace metadata fields to align more with DC, QDC, and DCTERMS as well as CG Core v2.&lt;/p&gt;
&lt;p&gt;With reference to &lt;a href=&#34;https://agriculturalsemantics.github.io/cg-core/cgcore.html&#34;&gt;CG Core v2 draft standard&lt;/a&gt; by Marie-Angélique as well as &lt;a href=&#34;http://www.dublincore.org/specifications/dublin-core/dcmi-terms/&#34;&gt;DCMI DCTERMS&lt;/a&gt;.&lt;/p&gt;</description>
</item>
<item>
<title>October, 2019</title>
<link>https://alanorth.github.io/cgspace-notes/2019-10/</link>
<pubDate>Tue, 01 Oct 2019 13:20:51 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2019-10/</guid>
<description>2019-10-01: Udana from IWMI asked me for a CSV export of their community on CGSpace. I exported it, but a quick run through the csv-metadata-quality tool shows that there are some low-hanging fruits we can fix before I send him the data. I will limit the scope to the titles, regions, subregions, and river basins for now, to manually fix some non-breaking spaces (U+00A0) there that would otherwise be removed by the csv-metadata-quality script&amp;rsquo;s &amp;ldquo;unnecessary Unicode&amp;rdquo; fix: $ csvcut -c &#39;id,dc.</description>
</item>
<item>
<title>September, 2019</title>
<link>https://alanorth.github.io/cgspace-notes/2019-09/</link>
<pubDate>Sun, 01 Sep 2019 10:17:51 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2019-09/</guid>
<description>&lt;h2 id=&#34;2019-09-01&#34;&gt;2019-09-01&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Linode emailed to say that CGSpace (linode18) had a high rate of outbound traffic for several hours this morning&lt;/li&gt;
&lt;li&gt;Here are the top ten IPs in the nginx XMLUI and REST/OAI logs this morning:&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code&gt;# zcat --force /var/log/nginx/access.log /var/log/nginx/access.log.1 | grep -E &amp;quot;01/Sep/2019:0&amp;quot; | awk &#39;{print $1}&#39; | sort | uniq -c | sort -n | tail -n 10
440 17.58.101.255
441 157.55.39.101
485 207.46.13.43
728 169.60.128.125
730 207.46.13.108
758 157.55.39.9
808 66.160.140.179
814 207.46.13.212
2472 163.172.71.23
6092 3.94.211.189
# zcat --force /var/log/nginx/rest.log /var/log/nginx/rest.log.1 /var/log/nginx/oai.log /var/log/nginx/oai.log.1 | grep -E &amp;quot;01/Sep/2019:0&amp;quot; | awk &#39;{print $1}&#39; | sort | uniq -c | sort -n | tail -n 10
33 2a01:7e00::f03c:91ff:fe16:fcb
57 3.83.192.124
57 3.87.77.25
57 54.82.1.8
822 2a01:9cc0:47:1:1a:4:0:2
1223 45.5.184.72
1633 172.104.229.92
5112 205.186.128.185
7249 2a01:7e00::f03c:91ff:fe18:7396
9124 45.5.186.2
&lt;/code&gt;&lt;/pre&gt;</description>
</item>
<item>
<title>August, 2019</title>
<link>https://alanorth.github.io/cgspace-notes/2019-08/</link>
<pubDate>Sat, 03 Aug 2019 12:39:51 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2019-08/</guid>
<description>&lt;h2 id=&#34;2019-08-03&#34;&gt;2019-08-03&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Look at Bioversity&amp;rsquo;s latest migration CSV and now I see that Francesco has cleaned up the extra columns and the newline at the end of the file, but many of the column headers have an extra space in the name&amp;hellip;&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id=&#34;2019-08-04&#34;&gt;2019-08-04&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Deploy ORCID identifier updates requested by Bioversity to CGSpace&lt;/li&gt;
&lt;li&gt;Run system updates on CGSpace (linode18) and reboot it
&lt;ul&gt;
&lt;li&gt;Before updating it I checked Solr and verified that all statistics cores were loaded properly&amp;hellip;&lt;/li&gt;
&lt;li&gt;After rebooting, all statistics cores were loaded&amp;hellip; wow, that&amp;rsquo;s lucky.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Run system updates on DSpace Test (linode19) and reboot it&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>July, 2019</title>
<link>https://alanorth.github.io/cgspace-notes/2019-07/</link>
<pubDate>Mon, 01 Jul 2019 12:13:51 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2019-07/</guid>
<description>&lt;h2 id=&#34;2019-07-01&#34;&gt;2019-07-01&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Create an &amp;ldquo;AfricaRice books and book chapters&amp;rdquo; collection on CGSpace for AfricaRice&lt;/li&gt;
&lt;li&gt;Last month Sisay asked why the following &amp;ldquo;most popular&amp;rdquo; statistics link for a range of months in 2018 works for the CIAT community on DSpace Test, but not on CGSpace:
&lt;ul&gt;
&lt;li&gt;&lt;a href=&#34;https://dspacetest.cgiar.org/handle/10568/35697/most-popular/item#simplefilter=custom&amp;amp;time_filter_end_date=01%2F12%2F2018&#34;&gt;DSpace Test&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&#34;https://cgspace.cgiar.org/handle/10568/35697/most-popular/item#simplefilter=custom&amp;amp;time_filter_end_date=01%2F12%2F2018&#34;&gt;CGSpace&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Abenet had another similar issue a few days ago when trying to find the stats for 2018 in the RTB community&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>June, 2019</title>
<link>https://alanorth.github.io/cgspace-notes/2019-06/</link>
<pubDate>Sun, 02 Jun 2019 10:57:51 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2019-06/</guid>
<description>&lt;h2 id=&#34;2019-06-02&#34;&gt;2019-06-02&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Merge the &lt;a href=&#34;https://github.com/ilri/DSpace/pull/425&#34;&gt;Solr filterCache&lt;/a&gt; and &lt;a href=&#34;https://github.com/ilri/DSpace/pull/426&#34;&gt;XMLUI ISI journal&lt;/a&gt; changes to the &lt;code&gt;5_x-prod&lt;/code&gt; branch and deploy on CGSpace&lt;/li&gt;
&lt;li&gt;Run system updates on CGSpace (linode18) and reboot it&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id=&#34;2019-06-03&#34;&gt;2019-06-03&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Skype with Marie-Angélique and Abenet about &lt;a href=&#34;https://agriculturalsemantics.github.io/cg-core/cgcore.html&#34;&gt;CG Core v2&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>May, 2019</title>
<link>https://alanorth.github.io/cgspace-notes/2019-05/</link>
<pubDate>Wed, 01 May 2019 07:37:43 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2019-05/</guid>
<description>&lt;h2 id=&#34;2019-05-01&#34;&gt;2019-05-01&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Help CCAFS with regenerating some item thumbnails after they uploaded new PDFs to some items on CGSpace&lt;/li&gt;
&lt;li&gt;A user on the dspace-tech mailing list offered some suggestions for troubleshooting the problem with the inability to delete certain items
&lt;ul&gt;
&lt;li&gt;Apparently if the item is in the &lt;code&gt;workflowitem&lt;/code&gt; table it is submitted to a workflow&lt;/li&gt;
&lt;li&gt;And if it is in the &lt;code&gt;workspaceitem&lt;/code&gt; table it is in the pre-submitted state&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;The item seems to be in a pre-submitted state, so I tried to delete it from there:&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code&gt;dspace=# DELETE FROM workspaceitem WHERE item_id=74648;
DELETE 1
&lt;/code&gt;&lt;/pre&gt;&lt;ul&gt;
&lt;li&gt;But after this I tried to delete the item from the XMLUI and it is &lt;em&gt;still&lt;/em&gt; present&amp;hellip;&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>April, 2019</title>
<link>https://alanorth.github.io/cgspace-notes/2019-04/</link>
<pubDate>Mon, 01 Apr 2019 09:00:43 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2019-04/</guid>
<description>&lt;h2 id=&#34;2019-04-01&#34;&gt;2019-04-01&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Meeting with AgroKnow to discuss CGSpace, ILRI data, AReS, GARDIAN, etc
&lt;ul&gt;
&lt;li&gt;They asked if we had plans to enable RDF support in CGSpace&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;There have been 4,400 more downloads of the CTA Spore publication from those strange Amazon IP addresses today
&lt;ul&gt;
&lt;li&gt;I suspected that some might not be successful, because the stats show less, but today they were all HTTP 200!&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code&gt;# cat /var/log/nginx/access.log /var/log/nginx/access.log.1 | grep &#39;Spore-192-EN-web.pdf&#39; | grep -E &#39;(18.196.196.108|18.195.78.144|18.195.218.6)&#39; | awk &#39;{print $9}&#39; | sort | uniq -c | sort -n | tail -n 5
4432 200
&lt;/code&gt;&lt;/pre&gt;&lt;ul&gt;
&lt;li&gt;In the last two weeks there have been 47,000 downloads of this &lt;em&gt;same exact PDF&lt;/em&gt; by these three IP addresses&lt;/li&gt;
&lt;li&gt;Apply country and region corrections and deletions on DSpace Test and CGSpace:&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code&gt;$ ./fix-metadata-values.py -i /tmp/2019-02-21-fix-9-countries.csv -db dspace -u dspace -p &#39;fuuu&#39; -f cg.coverage.country -m 228 -t ACTION -d
$ ./fix-metadata-values.py -i /tmp/2019-02-21-fix-4-regions.csv -db dspace -u dspace -p &#39;fuuu&#39; -f cg.coverage.region -m 231 -t action -d
$ ./delete-metadata-values.py -i /tmp/2019-02-21-delete-2-countries.csv -db dspace -u dspace -p &#39;fuuu&#39; -m 228 -f cg.coverage.country -d
$ ./delete-metadata-values.py -i /tmp/2019-02-21-delete-1-region.csv -db dspace -u dspace -p &#39;fuuu&#39; -m 231 -f cg.coverage.region -d
&lt;/code&gt;&lt;/pre&gt;</description>
</item>
<item>
<title>March, 2019</title>
<link>https://alanorth.github.io/cgspace-notes/2019-03/</link>
<pubDate>Fri, 01 Mar 2019 12:16:30 +0100</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2019-03/</guid>
<description>&lt;h2 id=&#34;2019-03-01&#34;&gt;2019-03-01&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;I checked IITA&amp;rsquo;s 259 Feb 14 records from last month for duplicates using Atmire&amp;rsquo;s Duplicate Checker on a fresh snapshot of CGSpace on my local machine and everything looks good&lt;/li&gt;
&lt;li&gt;I am now only waiting to hear from her about where the items should go, though I assume Journal Articles go to IITA Journal Articles collection, etc&amp;hellip;&lt;/li&gt;
&lt;li&gt;Looking at the other half of Udana&amp;rsquo;s WLE records from 2018-11
&lt;ul&gt;
&lt;li&gt;I finished the ones for Restoring Degraded Landscapes (RDL), but these are for Variability, Risks and Competing Uses (VRC)&lt;/li&gt;
&lt;li&gt;I did the usual cleanups for whitespace, added regions where they made sense for certain countries, cleaned up the DOI link formats, added rights information based on the publications page for a few items&lt;/li&gt;
&lt;li&gt;Most worryingly, there are encoding errors in the abstracts for eleven items, for example:&lt;/li&gt;
&lt;li&gt;68.15% � 9.45 instead of 68.15% ± 9.45&lt;/li&gt;
&lt;li&gt;2003�2013 instead of 2003–2013&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;I think I will need to ask Udana to re-copy and paste the abstracts with more care using Google Docs&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>February, 2019</title>
<link>https://alanorth.github.io/cgspace-notes/2019-02/</link>
<pubDate>Fri, 01 Feb 2019 21:37:30 +0200</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2019-02/</guid>
<description>&lt;h2 id=&#34;2019-02-01&#34;&gt;2019-02-01&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Linode has alerted a few times since last night that the CPU usage on CGSpace (linode18) was high despite me increasing the alert threshold last week from 250% to 275%—I might need to increase it again!&lt;/li&gt;
&lt;li&gt;The top IPs before, during, and after this latest alert tonight were:&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code&gt;# zcat --force /var/log/nginx/*.log /var/log/nginx/*.log.1 | grep -E &amp;quot;01/Feb/2019:(17|18|19|20|21)&amp;quot; | awk &#39;{print $1}&#39; | sort | uniq -c | sort -n | tail -n 10
245 207.46.13.5
332 54.70.40.11
385 5.143.231.38
405 207.46.13.173
405 207.46.13.75
1117 66.249.66.219
1121 35.237.175.180
1546 5.9.6.51
2474 45.5.186.2
5490 85.25.237.71
&lt;/code&gt;&lt;/pre&gt;&lt;ul&gt;
&lt;li&gt;&lt;code&gt;85.25.237.71&lt;/code&gt; is the &amp;ldquo;Linguee Bot&amp;rdquo; that I first saw last month&lt;/li&gt;
&lt;li&gt;The Solr statistics for the past few months have been very high and I was wondering if the web server logs also showed an increase&lt;/li&gt;
&lt;li&gt;There were just over 3 million accesses in the nginx logs last month:&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code&gt;# time zcat --force /var/log/nginx/* | grep -cE &amp;quot;[0-9]{1,2}/Jan/2019&amp;quot;
3018243
real 0m19.873s
user 0m22.203s
sys 0m1.979s
&lt;/code&gt;&lt;/pre&gt;</description>
</item>
<item>
<title>January, 2019</title>
<link>https://alanorth.github.io/cgspace-notes/2019-01/</link>
<pubDate>Wed, 02 Jan 2019 09:48:30 +0200</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2019-01/</guid>
<description>&lt;h2 id=&#34;2019-01-02&#34;&gt;2019-01-02&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Linode alerted that CGSpace (linode18) had a higher outbound traffic rate than normal early this morning&lt;/li&gt;
&lt;li&gt;I don&amp;rsquo;t see anything interesting in the web server logs around that time though:&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code&gt;# zcat --force /var/log/nginx/*.log /var/log/nginx/*.log.1 | grep -E &amp;quot;02/Jan/2019:0(1|2|3)&amp;quot; | awk &#39;{print $1}&#39; | sort | uniq -c | sort -n | tail -n 10
92 40.77.167.4
99 210.7.29.100
120 38.126.157.45
177 35.237.175.180
177 40.77.167.32
216 66.249.75.219
225 18.203.76.93
261 46.101.86.248
357 207.46.13.1
903 54.70.40.11
&lt;/code&gt;&lt;/pre&gt;</description>
</item>
<item>
<title>December, 2018</title>
<link>https://alanorth.github.io/cgspace-notes/2018-12/</link>
<pubDate>Sun, 02 Dec 2018 02:09:30 +0200</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2018-12/</guid>
<description>&lt;h2 id=&#34;2018-12-01&#34;&gt;2018-12-01&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Switch CGSpace (linode18) to use OpenJDK instead of Oracle JDK&lt;/li&gt;
&lt;li&gt;I manually installed OpenJDK, then removed Oracle JDK, then re-ran the &lt;a href=&#34;http://github.com/ilri/rmg-ansible-public&#34;&gt;Ansible playbook&lt;/a&gt; to update all configuration files, etc&lt;/li&gt;
&lt;li&gt;Then I ran all system updates and restarted the server&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id=&#34;2018-12-02&#34;&gt;2018-12-02&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;I noticed that there is another issue with PDF thumbnails on CGSpace, and I see there was another &lt;a href=&#34;https://usn.ubuntu.com/3831-1/&#34;&gt;Ghostscript vulnerability last week&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>November, 2018</title>
<link>https://alanorth.github.io/cgspace-notes/2018-11/</link>
<pubDate>Thu, 01 Nov 2018 16:41:30 +0200</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2018-11/</guid>
<description>&lt;h2 id=&#34;2018-11-01&#34;&gt;2018-11-01&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Finalize AReS Phase I and Phase II ToRs&lt;/li&gt;
&lt;li&gt;Send a note about my &lt;a href=&#34;https://github.com/ilri/dspace-statistics-api&#34;&gt;dspace-statistics-api&lt;/a&gt; to the dspace-tech mailing list&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id=&#34;2018-11-03&#34;&gt;2018-11-03&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Linode has been sending mails a few times a day recently that CGSpace (linode18) has had high CPU usage&lt;/li&gt;
&lt;li&gt;Today these are the top 10 IPs:&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>October, 2018</title>
<link>https://alanorth.github.io/cgspace-notes/2018-10/</link>
<pubDate>Mon, 01 Oct 2018 22:31:54 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2018-10/</guid>
<description>&lt;h2 id=&#34;2018-10-01&#34;&gt;2018-10-01&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Phil Thornton got an ORCID identifier so we need to add it to the list on CGSpace and tag his existing items&lt;/li&gt;
&lt;li&gt;I created a GitHub issue to track this &lt;a href=&#34;https://github.com/ilri/DSpace/issues/389&#34;&gt;#389&lt;/a&gt;, because I&amp;rsquo;m super busy in Nairobi right now&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>September, 2018</title>
<link>https://alanorth.github.io/cgspace-notes/2018-09/</link>
<pubDate>Sun, 02 Sep 2018 09:55:54 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2018-09/</guid>
<description>&lt;h2 id=&#34;2018-09-02&#34;&gt;2018-09-02&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;New &lt;a href=&#34;https://jdbc.postgresql.org/documentation/changelog.html#version_42.2.5&#34;&gt;PostgreSQL JDBC driver version 42.2.5&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;I&amp;rsquo;ll update the DSpace role in our &lt;a href=&#34;https://github.com/ilri/rmg-ansible-public&#34;&gt;Ansible infrastructure playbooks&lt;/a&gt; and run the updated playbooks on CGSpace and DSpace Test&lt;/li&gt;
&lt;li&gt;Also, I&amp;rsquo;ll re-run the &lt;code&gt;postgresql&lt;/code&gt; tasks because the custom PostgreSQL variables are dynamic according to the system&amp;rsquo;s RAM, and we never re-ran them after migrating to larger Linodes last month&lt;/li&gt;
&lt;li&gt;I&amp;rsquo;m testing the new DSpace 5.8 branch in my Ubuntu 18.04 environment and I&amp;rsquo;m getting those autowire errors in Tomcat 8.5.30 again:&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>August, 2018</title>
<link>https://alanorth.github.io/cgspace-notes/2018-08/</link>
<pubDate>Wed, 01 Aug 2018 11:52:54 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2018-08/</guid>
<description>&lt;h2 id=&#34;2018-08-01&#34;&gt;2018-08-01&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;DSpace Test had crashed at some point yesterday morning and I see the following in &lt;code&gt;dmesg&lt;/code&gt;:&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code&gt;[Tue Jul 31 00:00:41 2018] Out of memory: Kill process 1394 (java) score 668 or sacrifice child
[Tue Jul 31 00:00:41 2018] Killed process 1394 (java) total-vm:15601860kB, anon-rss:5355528kB, file-rss:0kB, shmem-rss:0kB
[Tue Jul 31 00:00:41 2018] oom_reaper: reaped process 1394 (java), now anon-rss:0kB, file-rss:0kB, shmem-rss:0kB
&lt;/code&gt;&lt;/pre&gt;&lt;ul&gt;
&lt;li&gt;Judging from the time of the crash it was probably related to the Discovery indexing that starts at midnight&lt;/li&gt;
&lt;li&gt;From the DSpace log I see that eventually Solr stopped responding, so I guess the &lt;code&gt;java&lt;/code&gt; process that was OOM killed above was Tomcat&amp;rsquo;s&lt;/li&gt;
&lt;li&gt;I&amp;rsquo;m not sure why Tomcat didn&amp;rsquo;t crash with an OutOfMemoryError&amp;hellip;&lt;/li&gt;
&lt;li&gt;Anyways, perhaps I should increase the JVM heap from 5120m to 6144m like we did a few months ago when we tried to run the whole CGSpace Solr core&lt;/li&gt;
&lt;li&gt;The server only has 8GB of RAM so we&amp;rsquo;ll eventually need to upgrade to a larger one because we&amp;rsquo;ll start starving the OS, PostgreSQL, and command line batch processes&lt;/li&gt;
&lt;li&gt;I ran all system updates on DSpace Test and rebooted it&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>July, 2018</title>
<link>https://alanorth.github.io/cgspace-notes/2018-07/</link>
<pubDate>Sun, 01 Jul 2018 12:56:54 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2018-07/</guid>
<description>&lt;h2 id=&#34;2018-07-01&#34;&gt;2018-07-01&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;I want to upgrade DSpace Test to DSpace 5.8 so I took a backup of its current database just in case:&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code&gt;$ pg_dump -b -v -o --format=custom -U dspace -f dspace-2018-07-01.backup dspace
&lt;/code&gt;&lt;/pre&gt;&lt;ul&gt;
&lt;li&gt;During the &lt;code&gt;mvn package&lt;/code&gt; stage on the 5.8 branch I kept getting issues with java running out of memory:&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code&gt;There is insufficient memory for the Java Runtime Environment to continue.
&lt;/code&gt;&lt;/pre&gt;</description>
</item>
<item>
<title>June, 2018</title>
<link>https://alanorth.github.io/cgspace-notes/2018-06/</link>
<pubDate>Mon, 04 Jun 2018 19:49:54 -0700</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2018-06/</guid>
<description>&lt;h2 id=&#34;2018-06-04&#34;&gt;2018-06-04&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Test the &lt;a href=&#34;https://tracker.atmire.com/tickets-cgiar-ilri/view-ticket?id=560&#34;&gt;DSpace 5.8 module upgrades from Atmire&lt;/a&gt; (&lt;a href=&#34;https://github.com/ilri/DSpace/pull/378&#34;&gt;#378&lt;/a&gt;)
&lt;ul&gt;
&lt;li&gt;There seems to be a problem with the CUA and L&amp;amp;R versions in &lt;code&gt;pom.xml&lt;/code&gt; because they are using SNAPSHOT and it doesn&amp;rsquo;t build&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;I added the new CCAFS Phase II Project Tag &lt;code&gt;PII-FP1_PACCA2&lt;/code&gt; and merged it into the &lt;code&gt;5_x-prod&lt;/code&gt; branch (&lt;a href=&#34;https://github.com/ilri/DSpace/pull/379&#34;&gt;#379&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;I proofed and tested the ILRI author corrections that Peter sent back to me this week:&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code&gt;$ ./fix-metadata-values.py -i /tmp/2018-05-30-Correct-660-authors.csv -db dspace -u dspace -p &#39;fuuu&#39; -f dc.contributor.author -t correct -m 3 -n
&lt;/code&gt;&lt;/pre&gt;&lt;ul&gt;
&lt;li&gt;I think a sane proofing workflow in OpenRefine is to apply the custom text facets for check/delete/remove and illegal characters that I developed in &lt;a href=&#34;https://alanorth.github.io/cgspace-notes/2018-03/&#34;&gt;March, 2018&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Time to index ~70,000 items on CGSpace:&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code&gt;$ time schedtool -D -e ionice -c2 -n7 nice -n19 [dspace]/bin/dspace index-discovery -b
real 74m42.646s
user 8m5.056s
sys 2m7.289s
&lt;/code&gt;&lt;/pre&gt;</description>
</item>
<item>
<title>May, 2018</title>
<link>https://alanorth.github.io/cgspace-notes/2018-05/</link>
<pubDate>Tue, 01 May 2018 16:43:54 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2018-05/</guid>
<description>&lt;h2 id=&#34;2018-05-01&#34;&gt;2018-05-01&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;I cleared the Solr statistics core on DSpace Test by issuing two commands directly to the Solr admin interface:
&lt;ul&gt;
&lt;li&gt;http://localhost:3000/solr/statistics/update?stream.body=%3Cdelete%3E%3Cquery%3E*:*%3C/query%3E%3C/delete%3E&lt;/li&gt;
&lt;li&gt;http://localhost:3000/solr/statistics/update?stream.body=%3Ccommit/%3E&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Then I reduced the JVM heap size from 6144 back to 5120m&lt;/li&gt;
&lt;li&gt;Also, I switched it to use OpenJDK instead of Oracle Java, as well as re-worked the &lt;a href=&#34;https://github.com/ilri/rmg-ansible-public&#34;&gt;Ansible infrastructure scripts&lt;/a&gt; to support hosts choosing which distribution they want to use&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>April, 2018</title>
<link>https://alanorth.github.io/cgspace-notes/2018-04/</link>
<pubDate>Sun, 01 Apr 2018 16:13:54 +0200</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2018-04/</guid>
<description>&lt;h2 id=&#34;2018-04-01&#34;&gt;2018-04-01&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;I tried to test something on DSpace Test but noticed that it&amp;rsquo;s been down since god knows when&lt;/li&gt;
&lt;li&gt;Catalina logs at least show some memory errors yesterday:&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>March, 2018</title>
<link>https://alanorth.github.io/cgspace-notes/2018-03/</link>
<pubDate>Fri, 02 Mar 2018 16:07:54 +0200</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2018-03/</guid>
<description>&lt;h2 id=&#34;2018-03-02&#34;&gt;2018-03-02&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Export a CSV of the IITA community metadata for Martin Mueller&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>February, 2018</title>
<link>https://alanorth.github.io/cgspace-notes/2018-02/</link>
<pubDate>Thu, 01 Feb 2018 16:28:54 +0200</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2018-02/</guid>
<description>&lt;h2 id=&#34;2018-02-01&#34;&gt;2018-02-01&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Peter gave feedback on the &lt;code&gt;dc.rights&lt;/code&gt; proof of concept that I had sent him last week&lt;/li&gt;
&lt;li&gt;We don&amp;rsquo;t need to distinguish between internal and external works, so that makes it just a simple list&lt;/li&gt;
&lt;li&gt;Yesterday I figured out how to monitor DSpace sessions using JMX&lt;/li&gt;
&lt;li&gt;I copied the logic in the &lt;code&gt;jmx_tomcat_dbpools&lt;/code&gt; provided by Ubuntu&amp;rsquo;s &lt;code&gt;munin-plugins-java&lt;/code&gt; package and used the stuff I discovered about JMX &lt;a href=&#34;https://alanorth.github.io/cgspace-notes/2018-01/&#34;&gt;in 2018-01&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>January, 2018</title>
<link>https://alanorth.github.io/cgspace-notes/2018-01/</link>
<pubDate>Tue, 02 Jan 2018 08:35:54 -0800</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2018-01/</guid>
<description>&lt;h2 id=&#34;2018-01-02&#34;&gt;2018-01-02&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Uptime Robot noticed that CGSpace went down and up a few times last night, for a few minutes each time&lt;/li&gt;
&lt;li&gt;I didn&amp;rsquo;t get any load alerts from Linode and the REST and XMLUI logs don&amp;rsquo;t show anything out of the ordinary&lt;/li&gt;
&lt;li&gt;The nginx logs show HTTP 200s until &lt;code&gt;02/Jan/2018:11:27:17 +0000&lt;/code&gt; when Uptime Robot got an HTTP 500&lt;/li&gt;
&lt;li&gt;In dspace.log around that time I see many errors like &amp;ldquo;Client closed the connection before file download was complete&amp;rdquo;&lt;/li&gt;
&lt;li&gt;And just before that I see this:&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code&gt;Caused by: org.apache.tomcat.jdbc.pool.PoolExhaustedException: [http-bio-127.0.0.1-8443-exec-980] Timeout: Pool empty. Unable to fetch a connection in 5 seconds, none available[size:50; busy:50; idle:0; lastwait:5000].
&lt;/code&gt;&lt;/pre&gt;&lt;ul&gt;
&lt;li&gt;Ah hah! So the pool was actually empty!&lt;/li&gt;
&lt;li&gt;I need to increase that, let&amp;rsquo;s try to bump it up from 50 to 75&lt;/li&gt;
&lt;li&gt;After that one client got an HTTP 499 but then the rest were HTTP 200, so I don&amp;rsquo;t know what the hell Uptime Robot saw&lt;/li&gt;
&lt;li&gt;I notice this error quite a few times in dspace.log:&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code&gt;2018-01-02 01:21:19,137 ERROR org.dspace.app.xmlui.aspect.discovery.SidebarFacetsTransformer @ Error while searching for sidebar facets
org.dspace.discovery.SearchServiceException: org.apache.solr.search.SyntaxError: Cannot parse &#39;dateIssued_keyword:[1976+TO+1979]&#39;: Encountered &amp;quot; &amp;quot;]&amp;quot; &amp;quot;] &amp;quot;&amp;quot; at line 1, column 32.
&lt;/code&gt;&lt;/pre&gt;&lt;ul&gt;
&lt;li&gt;And there are many of these errors every day for the past month:&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code&gt;$ grep -c &amp;quot;Error while searching for sidebar facets&amp;quot; dspace.log.*
dspace.log.2017-11-21:4
dspace.log.2017-11-22:1
dspace.log.2017-11-23:4
dspace.log.2017-11-24:11
dspace.log.2017-11-25:0
dspace.log.2017-11-26:1
dspace.log.2017-11-27:7
dspace.log.2017-11-28:21
dspace.log.2017-11-29:31
dspace.log.2017-11-30:15
dspace.log.2017-12-01:15
dspace.log.2017-12-02:20
dspace.log.2017-12-03:38
dspace.log.2017-12-04:65
dspace.log.2017-12-05:43
dspace.log.2017-12-06:72
dspace.log.2017-12-07:27
dspace.log.2017-12-08:15
dspace.log.2017-12-09:29
dspace.log.2017-12-10:35
dspace.log.2017-12-11:20
dspace.log.2017-12-12:44
dspace.log.2017-12-13:36
dspace.log.2017-12-14:59
dspace.log.2017-12-15:104
dspace.log.2017-12-16:53
dspace.log.2017-12-17:66
dspace.log.2017-12-18:83
dspace.log.2017-12-19:101
dspace.log.2017-12-20:74
dspace.log.2017-12-21:55
dspace.log.2017-12-22:66
dspace.log.2017-12-23:50
dspace.log.2017-12-24:85
dspace.log.2017-12-25:62
dspace.log.2017-12-26:49
dspace.log.2017-12-27:30
dspace.log.2017-12-28:54
dspace.log.2017-12-29:68
dspace.log.2017-12-30:89
dspace.log.2017-12-31:53
dspace.log.2018-01-01:45
dspace.log.2018-01-02:34
&lt;/code&gt;&lt;/pre&gt;&lt;ul&gt;
&lt;li&gt;Danny wrote to ask for help renewing the wildcard ilri.org certificate and I advised that we should probably use Let&amp;rsquo;s Encrypt if it&amp;rsquo;s just a handful of domains&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>December, 2017</title>
<link>https://alanorth.github.io/cgspace-notes/2017-12/</link>
<pubDate>Fri, 01 Dec 2017 13:53:54 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2017-12/</guid>
<description>&lt;h2 id=&#34;2017-12-01&#34;&gt;2017-12-01&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Uptime Robot noticed that CGSpace went down&lt;/li&gt;
&lt;li&gt;The logs say &amp;ldquo;Timeout waiting for idle object&amp;rdquo;&lt;/li&gt;
&lt;li&gt;PostgreSQL activity says there are 115 connections currently&lt;/li&gt;
&lt;li&gt;The list of connections to XMLUI and REST API for today:&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>November, 2017</title>
<link>https://alanorth.github.io/cgspace-notes/2017-11/</link>
<pubDate>Thu, 02 Nov 2017 09:37:54 +0200</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2017-11/</guid>
<description>&lt;h2 id=&#34;2017-11-01&#34;&gt;2017-11-01&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;The CORE developers responded to say they are looking into their bot not respecting our robots.txt&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id=&#34;2017-11-02&#34;&gt;2017-11-02&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Today there have been no hits by CORE and no alerts from Linode (coincidence?)&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code&gt;# grep -c &amp;quot;CORE&amp;quot; /var/log/nginx/access.log
0
&lt;/code&gt;&lt;/pre&gt;&lt;ul&gt;
&lt;li&gt;Generate list of authors on CGSpace for Peter to go through and correct:&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code&gt;dspace=# \copy (select distinct text_value, count(*) as count from metadatavalue where metadata_field_id = (select metadata_field_id from metadatafieldregistry where element = &#39;contributor&#39; and qualifier = &#39;author&#39;) AND resource_type_id = 2 group by text_value order by count desc) to /tmp/authors.csv with csv;
COPY 54701
&lt;/code&gt;&lt;/pre&gt;</description>
</item>
<item>
<title>October, 2017</title>
<link>https://alanorth.github.io/cgspace-notes/2017-10/</link>
<pubDate>Sun, 01 Oct 2017 08:07:54 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2017-10/</guid>
<description>&lt;h2 id=&#34;2017-10-01&#34;&gt;2017-10-01&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Peter emailed to point out that many items in the &lt;a href=&#34;https://cgspace.cgiar.org/handle/10568/2703&#34;&gt;ILRI archive collection&lt;/a&gt; have multiple handles:&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code&gt;http://hdl.handle.net/10568/78495||http://hdl.handle.net/10568/79336
&lt;/code&gt;&lt;/pre&gt;&lt;ul&gt;
&lt;li&gt;There appears to be a pattern but I&amp;rsquo;ll have to look a bit closer and try to clean them up automatically, either in SQL or in OpenRefine&lt;/li&gt;
&lt;li&gt;Add Katherine Lutz to the groups for content submission and edit steps of the CGIAR System collections&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>CGIAR Library Migration</title>
<link>https://alanorth.github.io/cgspace-notes/cgiar-library-migration/</link>
<pubDate>Mon, 18 Sep 2017 16:38:35 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/cgiar-library-migration/</guid>
<description>&lt;p&gt;Rough notes for importing the CGIAR Library content. It was decided that this content would go to a new top-level community called &lt;em&gt;CGIAR System Organization&lt;/em&gt;.&lt;/p&gt;</description>
</item>
<item>
<title>September, 2017</title>
<link>https://alanorth.github.io/cgspace-notes/2017-09/</link>
<pubDate>Thu, 07 Sep 2017 16:54:52 +0700</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2017-09/</guid>
<description>&lt;h2 id=&#34;2017-09-06&#34;&gt;2017-09-06&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Linode sent an alert that CGSpace (linode18) was using 261% CPU for the past two hours&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id=&#34;2017-09-07&#34;&gt;2017-09-07&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Ask Sisay to clean up the WLE approvers a bit, as Marianne&amp;rsquo;s user account is in both the approvers step and the group&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>August, 2017</title>
<link>https://alanorth.github.io/cgspace-notes/2017-08/</link>
<pubDate>Tue, 01 Aug 2017 11:51:52 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2017-08/</guid>
<description>&lt;h2 id=&#34;2017-08-01&#34;&gt;2017-08-01&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Linode sent an alert that CGSpace (linode18) was using 350% CPU for the past two hours&lt;/li&gt;
&lt;li&gt;I looked in the Activity pane of the Admin Control Panel and it seems that Google, Baidu, Yahoo, and Bing are all crawling with massive numbers of bots concurrently (~100 total, mostly Baidu and Google)&lt;/li&gt;
&lt;li&gt;The good thing is that, according to &lt;code&gt;dspace.log.2017-08-01&lt;/code&gt;, they are all using the same Tomcat session&lt;/li&gt;
&lt;li&gt;This means our Tomcat Crawler Session Valve is working&lt;/li&gt;
&lt;li&gt;But many of the bots are browsing dynamic URLs like:
&lt;ul&gt;
&lt;li&gt;/handle/10568/3353/discover&lt;/li&gt;
&lt;li&gt;/handle/10568/16510/browse&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;The &lt;code&gt;robots.txt&lt;/code&gt; only blocks the top-level &lt;code&gt;/discover&lt;/code&gt; and &lt;code&gt;/browse&lt;/code&gt; URLs&amp;hellip; we will need to find a way to forbid them from accessing these!&lt;/li&gt;
&lt;li&gt;Relevant issue from DSpace Jira (semi resolved in DSpace 6.0): &lt;a href=&#34;https://jira.duraspace.org/browse/DS-2962&#34;&gt;https://jira.duraspace.org/browse/DS-2962&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;It turns out that we&amp;rsquo;re already adding the &lt;code&gt;X-Robots-Tag &amp;quot;none&amp;quot;&lt;/code&gt; HTTP header, but this only forbids the search engine from &lt;em&gt;indexing&lt;/em&gt; the page, not crawling it!&lt;/li&gt;
&lt;li&gt;Also, the bot has to successfully browse the page first so it can receive the HTTP header&amp;hellip;&lt;/li&gt;
&lt;li&gt;We might actually have to &lt;em&gt;block&lt;/em&gt; these requests with HTTP 403 depending on the user agent&lt;/li&gt;
&lt;li&gt;Abenet pointed out that the CGIAR Library Historical Archive collection I sent July 20th only had ~100 entries, instead of 2415&lt;/li&gt;
&lt;li&gt;This was due to newline characters in the &lt;code&gt;dc.description.abstract&lt;/code&gt; column, which caused OpenRefine to choke when exporting the CSV&lt;/li&gt;
&lt;li&gt;I exported a new CSV from the collection on DSpace Test and then manually removed the characters in vim using &lt;code&gt;g/^$/d&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;Then I cleaned up the author authorities and HTML characters in OpenRefine and sent the file back to Abenet&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>July, 2017</title>
<link>https://alanorth.github.io/cgspace-notes/2017-07/</link>
<pubDate>Sat, 01 Jul 2017 18:03:52 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2017-07/</guid>
<description>&lt;h2 id=&#34;2017-07-01&#34;&gt;2017-07-01&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Run system updates and reboot DSpace Test&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id=&#34;2017-07-04&#34;&gt;2017-07-04&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Merge changes for WLE Phase II theme rename (&lt;a href=&#34;https://github.com/ilri/DSpace/pull/329&#34;&gt;#329&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;Looking at extracting the metadata registries from ICARDA&amp;rsquo;s MEL DSpace database so we can compare fields with CGSpace&lt;/li&gt;
&lt;li&gt;We can use PostgreSQL&amp;rsquo;s extended output format (&lt;code&gt;-x&lt;/code&gt;) plus &lt;code&gt;sed&lt;/code&gt; to format the output into quasi XML:&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>June, 2017</title>
<link>https://alanorth.github.io/cgspace-notes/2017-06/</link>
<pubDate>Thu, 01 Jun 2017 10:14:52 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2017-06/</guid>
<description>&lt;h2 id=&#34;2017-06-01&#34;&gt;2017-06-01&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;After discussion with WLE and CGSpace content people, we decided to just add one metadata field for the WLE Research Themes&lt;/li&gt;
&lt;li&gt;The &lt;code&gt;cg.identifier.wletheme&lt;/code&gt; field will be used for both Phase I and Phase II Research Themes&lt;/li&gt;
&lt;li&gt;Then we&amp;rsquo;ll create a new sub-community for Phase II and create collections for the research themes there&lt;/li&gt;
&lt;li&gt;The current &amp;ldquo;Research Themes&amp;rdquo; community will be renamed to &amp;ldquo;WLE Phase I Research Themes&amp;rdquo;&lt;/li&gt;
&lt;li&gt;Tagged all items in the current Phase I collections with their appropriate themes&lt;/li&gt;
&lt;li&gt;Create pull request to add Phase II research themes to the submission form: #328&lt;/li&gt;
&lt;li&gt;Add cg.&amp;hellip;&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>May, 2017</title>
<link>https://alanorth.github.io/cgspace-notes/2017-05/</link>
<pubDate>Mon, 01 May 2017 16:21:52 +0200</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2017-05/</guid>
<description>&lt;h2 id=&#34;2017-05-01&#34;&gt;2017-05-01&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;ICARDA apparently started working on CG Core on their MEL repository&lt;/li&gt;
&lt;li&gt;They have done a few cg.* fields, but not very consistently, and even copy some of CGSpace&amp;rsquo;s items:
&lt;ul&gt;
&lt;li&gt;&lt;a href=&#34;https://mel.cgiar.org/xmlui/handle/20.500.11766/6911?show=full&#34;&gt;https://mel.cgiar.org/xmlui/handle/20.500.11766/6911?show=full&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&#34;https://cgspace.cgiar.org/handle/10568/73683&#34;&gt;https://cgspace.cgiar.org/handle/10568/73683&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id=&#34;2017-05-02&#34;&gt;2017-05-02&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Atmire got back about the Workflow Statistics issue, and apparently it&amp;rsquo;s a bug in the CUA module, so they will send us a pull request&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id=&#34;2017-05-04&#34;&gt;2017-05-04&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Sync DSpace Test with database and assetstore from CGSpace&lt;/li&gt;
&lt;li&gt;Re-deploy DSpace Test with Atmire&amp;rsquo;s CUA patch for workflow statistics, run system updates, and restart the server&lt;/li&gt;
&lt;li&gt;Now I can see the workflow statistics and am able to select users, but everything returns 0 items&lt;/li&gt;
&lt;li&gt;Megan says there are still some mapped items that are not appearing since last week, so I forced a full &lt;code&gt;index-discovery -b&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;Need to remember to check tomorrow whether the collection has more items (currently 39 on CGSpace, but 118 on the freshly reindexed DSpace Test): https://cgspace.&amp;hellip;&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>April, 2017</title>
<link>https://alanorth.github.io/cgspace-notes/2017-04/</link>
<pubDate>Sun, 02 Apr 2017 17:08:52 +0200</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2017-04/</guid>
<description>&lt;h2 id=&#34;2017-04-02&#34;&gt;2017-04-02&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Merge one change to CCAFS flagships that I had forgotten to remove last month (&amp;ldquo;MANAGING CLIMATE RISK&amp;rdquo;): &lt;a href=&#34;https://github.com/ilri/DSpace/pull/317&#34;&gt;https://github.com/ilri/DSpace/pull/317&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Quick proof-of-concept hack to add &lt;code&gt;dc.rights&lt;/code&gt; to the input form, including some inline instructions/hints:&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;img src=&#34;https://alanorth.github.io/cgspace-notes/cgspace-notes/2017/04/dc-rights.png&#34; alt=&#34;dc.rights in the submission form&#34;&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Remove redundant/duplicate text in the DSpace submission license&lt;/li&gt;
&lt;li&gt;Testing the CMYK patch on a collection with 650 items:&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code&gt;$ [dspace]/bin/dspace filter-media -f -i 10568/16498 -p &amp;quot;ImageMagick PDF Thumbnail&amp;quot; -v &amp;gt;&amp;amp; /tmp/filter-media-cmyk.txt
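# afterwards, the colorspace of a regenerated thumbnail can be spot-checked with
# ImageMagick (a sketch; the filename here is illustrative):
$ identify -format &#39;%[colorspace]\n&#39; thumbnail.jpg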
&lt;/code&gt;&lt;/pre&gt;</description>
</item>
<item>
<title>March, 2017</title>
<link>https://alanorth.github.io/cgspace-notes/2017-03/</link>
<pubDate>Wed, 01 Mar 2017 17:08:52 +0200</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2017-03/</guid>
<description>&lt;h2 id=&#34;2017-03-01&#34;&gt;2017-03-01&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Run the 279 CIAT author corrections on CGSpace&lt;/li&gt;
&lt;/ul&gt;
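&lt;p&gt;Each correction of this sort ends up as an update on the &lt;code&gt;metadatavalue&lt;/code&gt; table. A minimal sketch for a single author (the names here are purely illustrative; the real corrections are applied in bulk from a CSV of old and new values):&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;dspace=# update metadatavalue set text_value = &#39;Sanchez, Maria M.&#39; where metadata_field_id=3 and resource_type_id=2 and text_value = &#39;Sanchez, M.M.&#39;;
&lt;/code&gt;&lt;/pre&gt;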
&lt;h2 id=&#34;2017-03-02&#34;&gt;2017-03-02&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Skype with Michael and Peter, discussing moving the CGIAR Library to CGSpace&lt;/li&gt;
&lt;li&gt;CGIAR people possibly open to moving content, redirecting library.cgiar.org to CGSpace and letting CGSpace resolve their handles&lt;/li&gt;
&lt;li&gt;They might come in at the top level in one &amp;ldquo;CGIAR System&amp;rdquo; community, or with several communities&lt;/li&gt;
&lt;li&gt;I need to spend a bit of time looking at the multiple handle support in DSpace and see if new content can be minted in both handles, or just one?&lt;/li&gt;
&lt;li&gt;Need to send Peter and Michael some notes about this in a few days&lt;/li&gt;
&lt;li&gt;Also, need to consider talking to Atmire about hiring them to bring ORCiD metadata to REST / OAI&lt;/li&gt;
&lt;li&gt;Filed an issue on DSpace issue tracker for the &lt;code&gt;filter-media&lt;/code&gt; bug that causes it to process JPGs even when limiting to the PDF thumbnail plugin: &lt;a href=&#34;https://jira.duraspace.org/browse/DS-3516&#34;&gt;DS-3516&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Discovered that the ImageMagick &lt;code&gt;filter-media&lt;/code&gt; plugin creates JPG thumbnails with the CMYK colorspace when the source PDF is using CMYK&lt;/li&gt;
&lt;li&gt;Interestingly, it seems DSpace 4.x&amp;rsquo;s thumbnails were sRGB, but forcing regeneration using DSpace 5.x&amp;rsquo;s ImageMagick plugin creates CMYK JPGs if the source PDF was CMYK (see &lt;a href=&#34;https://cgspace.cgiar.org/handle/10568/51999&#34;&gt;10568/51999&lt;/a&gt;):&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code&gt;$ identify ~/Desktop/alc_contrastes_desafios.jpg
/Users/aorth/Desktop/alc_contrastes_desafios.jpg JPEG 464x600 464x600+0+0 8-bit CMYK 168KB 0.000u 0:00.000
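# one possible fix is to explicitly convert such thumbnails to sRGB, e.g. (a sketch,
# not something that was run as part of these notes):
$ convert ~/Desktop/alc_contrastes_desafios.jpg -colorspace sRGB ~/Desktop/alc_contrastes_desafios-srgb.jpg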
&lt;/code&gt;&lt;/pre&gt;</description>
</item>
<item>
<title>February, 2017</title>
<link>https://alanorth.github.io/cgspace-notes/2017-02/</link>
<pubDate>Tue, 07 Feb 2017 07:04:52 -0800</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2017-02/</guid>
<description>&lt;h2 id=&#34;2017-02-07&#34;&gt;2017-02-07&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;An item was mapped twice erroneously again, so I had to remove one of the mappings manually:&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code&gt;dspace=# select * from collection2item where item_id = &#39;80278&#39;;
id | collection_id | item_id
-------+---------------+---------
92551 | 313 | 80278
92550 | 313 | 80278
90774 | 1051 | 80278
(3 rows)
dspace=# delete from collection2item where id = 92551 and item_id = 80278;
DELETE 1
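-- re-running the earlier select should now show one row per collection (a sketch of the verification step):
dspace=# select * from collection2item where item_id = &#39;80278&#39;;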
&lt;/code&gt;&lt;/pre&gt;&lt;ul&gt;
&lt;li&gt;Create issue on GitHub to track the addition of CCAFS Phase II project tags (&lt;a href=&#34;https://github.com/ilri/DSpace/issues/301&#34;&gt;#301&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;Looks like we&amp;rsquo;ll be using &lt;code&gt;cg.identifier.ccafsprojectpii&lt;/code&gt; as the field name&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>January, 2017</title>
<link>https://alanorth.github.io/cgspace-notes/2017-01/</link>
<pubDate>Mon, 02 Jan 2017 10:43:00 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2017-01/</guid>
<description>&lt;h2 id=&#34;2017-01-02&#34;&gt;2017-01-02&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;I checked to see if the Solr sharding task that is supposed to run on January 1st had run and saw there was an error&lt;/li&gt;
&lt;li&gt;I tested on DSpace Test as well and it doesn&amp;rsquo;t work there either&lt;/li&gt;
&lt;li&gt;I asked on the dspace-tech mailing list because it seems to be broken, and actually now I&amp;rsquo;m not sure if we&amp;rsquo;ve ever had the sharding task run successfully over all these years&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>December, 2016</title>
<link>https://alanorth.github.io/cgspace-notes/2016-12/</link>
<pubDate>Fri, 02 Dec 2016 10:43:00 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2016-12/</guid>
<description>&lt;h2 id=&#34;2016-12-02&#34;&gt;2016-12-02&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;CGSpace was down for five hours in the morning while I was sleeping&lt;/li&gt;
&lt;li&gt;While looking in the logs for errors, I see tons of warnings about Atmire MQM:&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code&gt;2016-12-02 03:00:32,352 WARN com.atmire.metadataquality.batchedit.BatchEditConsumer @ BatchEditConsumer should not have been given this kind of Subject in an event, skipping: org.dspace.event.Event(eventType=CREATE, SubjectType=BUNDLE, SubjectID=70316, ObjectType=(Unknown), ObjectID=-1, TimeStamp=1480647632305, dispatcher=1544803905, detail=[null], transactionID=&amp;quot;TX157907838689377964651674089851855413607&amp;quot;)
2016-12-02 03:00:32,353 WARN com.atmire.metadataquality.batchedit.BatchEditConsumer @ BatchEditConsumer should not have been given this kind of Subject in an event, skipping: org.dspace.event.Event(eventType=MODIFY_METADATA, SubjectType=BUNDLE, SubjectID=70316, ObjectType=(Unknown), ObjectID=-1, TimeStamp=1480647632309, dispatcher=1544803905, detail=&amp;quot;dc.title&amp;quot;, transactionID=&amp;quot;TX157907838689377964651674089851855413607&amp;quot;)
2016-12-02 03:00:32,353 WARN com.atmire.metadataquality.batchedit.BatchEditConsumer @ BatchEditConsumer should not have been given this kind of Subject in an event, skipping: org.dspace.event.Event(eventType=ADD, SubjectType=ITEM, SubjectID=80044, ObjectType=BUNDLE, ObjectID=70316, TimeStamp=1480647632311, dispatcher=1544803905, detail=&amp;quot;THUMBNAIL&amp;quot;, transactionID=&amp;quot;TX157907838689377964651674089851855413607&amp;quot;)
2016-12-02 03:00:32,353 WARN com.atmire.metadataquality.batchedit.BatchEditConsumer @ BatchEditConsumer should not have been given this kind of Subject in an event, skipping: org.dspace.event.Event(eventType=ADD, SubjectType=BUNDLE, SubjectID=70316, ObjectType=BITSTREAM, ObjectID=86715, TimeStamp=1480647632318, dispatcher=1544803905, detail=&amp;quot;-1&amp;quot;, transactionID=&amp;quot;TX157907838689377964651674089851855413607&amp;quot;)
2016-12-02 03:00:32,353 WARN com.atmire.metadataquality.batchedit.BatchEditConsumer @ BatchEditConsumer should not have been given this kind of Subject in an event, skipping: org.dspace.event.Event(eventType=MODIFY, SubjectType=ITEM, SubjectID=80044, ObjectType=(Unknown), ObjectID=-1, TimeStamp=1480647632351, dispatcher=1544803905, detail=[null], transactionID=&amp;quot;TX157907838689377964651674089851855413607&amp;quot;)
&lt;/code&gt;&lt;/pre&gt;&lt;ul&gt;
&lt;li&gt;I see thousands of them in the logs for the last few months, so it&amp;rsquo;s not related to the DSpace 5.5 upgrade&lt;/li&gt;
&lt;li&gt;I&amp;rsquo;ve raised a ticket with Atmire to ask&lt;/li&gt;
&lt;li&gt;Another worrying error from dspace.log is:&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>November, 2016</title>
<link>https://alanorth.github.io/cgspace-notes/2016-11/</link>
<pubDate>Tue, 01 Nov 2016 09:21:00 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2016-11/</guid>
<description>&lt;h2 id=&#34;2016-11-01&#34;&gt;2016-11-01&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Add &lt;code&gt;dc.type&lt;/code&gt; to the output options for Atmire&amp;rsquo;s Listings and Reports module (&lt;a href=&#34;https://github.com/ilri/DSpace/pull/286&#34;&gt;#286&lt;/a&gt;)&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;img src=&#34;https://alanorth.github.io/cgspace-notes/cgspace-notes/2016/11/listings-and-reports.png&#34; alt=&#34;Listings and Reports with output type&#34;&gt;&lt;/p&gt;</description>
</item>
<item>
<title>October, 2016</title>
<link>https://alanorth.github.io/cgspace-notes/2016-10/</link>
<pubDate>Mon, 03 Oct 2016 15:53:00 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2016-10/</guid>
<description>&lt;h2 id=&#34;2016-10-03&#34;&gt;2016-10-03&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Testing adding &lt;a href=&#34;https://wiki.lyrasis.org/display/DSDOC5x/ORCID+Integration#ORCIDIntegration-EditingexistingitemsusingBatchCSVEditing&#34;&gt;ORCIDs to a CSV&lt;/a&gt; file for a single item to see if the author orders get messed up&lt;/li&gt;
&lt;li&gt;Need to test the following scenarios to see how author order is affected:
&lt;ul&gt;
&lt;li&gt;ORCIDs only&lt;/li&gt;
&lt;li&gt;ORCIDs plus normal authors&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;I exported a random item&amp;rsquo;s metadata as CSV, deleted &lt;em&gt;all columns&lt;/em&gt; except id and collection, and made a new column called &lt;code&gt;ORCID:dc.contributor.author&lt;/code&gt; with the following random ORCIDs from the ORCID registry:&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code&gt;0000-0002-6115-0956||0000-0002-3812-8793||0000-0001-7462-405X
&lt;/code&gt;&lt;/pre&gt;</description>
</item>
<item>
<title>September, 2016</title>
<link>https://alanorth.github.io/cgspace-notes/2016-09/</link>
<pubDate>Thu, 01 Sep 2016 15:53:00 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2016-09/</guid>
<description>&lt;h2 id=&#34;2016-09-01&#34;&gt;2016-09-01&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Discuss helping CCAFS with some batch tagging of ORCID IDs for their authors&lt;/li&gt;
&lt;li&gt;Discuss how the migration of CGIAR&amp;rsquo;s Active Directory to a flat structure will break our LDAP groups in DSpace&lt;/li&gt;
&lt;li&gt;We had been using &lt;code&gt;DC=ILRI&lt;/code&gt; to determine whether a user was ILRI or not&lt;/li&gt;
&lt;li&gt;It looks like we might be able to use OUs now, instead of DCs:&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code&gt;$ ldapsearch -x -H ldaps://svcgroot2.cgiarad.org:3269/ -b &amp;quot;dc=cgiarad,dc=org&amp;quot; -D &amp;quot;admigration1@cgiarad.org&amp;quot; -W &amp;quot;(sAMAccountName=admigration1)&amp;quot;
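# a sketch of the same query scoped to an OU instead of relying on DC=ILRI; the OU
# path below is hypothetical and would need to be confirmed against the new AD tree:
$ ldapsearch -x -H ldaps://svcgroot2.cgiarad.org:3269/ -b &amp;quot;ou=ILRI,dc=cgiarad,dc=org&amp;quot; -D &amp;quot;admigration1@cgiarad.org&amp;quot; -W &amp;quot;(sAMAccountName=someuser)&amp;quot;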
&lt;/code&gt;&lt;/pre&gt;</description>
</item>
<item>
<title>August, 2016</title>
<link>https://alanorth.github.io/cgspace-notes/2016-08/</link>
<pubDate>Mon, 01 Aug 2016 15:53:00 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2016-08/</guid>
<description>&lt;h2 id=&#34;2016-08-01&#34;&gt;2016-08-01&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Add updated distribution license from Sisay (&lt;a href=&#34;https://github.com/ilri/DSpace/issues/259&#34;&gt;#259&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;Play with upgrading Mirage 2 dependencies in &lt;code&gt;bower.json&lt;/code&gt; because most are several versions out of date&lt;/li&gt;
&lt;li&gt;Bootstrap is at 3.3.0 but upstream is at 3.3.7, and upgrading to anything beyond 3.3.1 breaks glyphicons and probably more&lt;/li&gt;
&lt;li&gt;The bower stuff is a dead end: a waste of time, with too many issues&lt;/li&gt;
&lt;li&gt;Anything after Bootstrap 3.3.1 makes the glyphicons disappear (HTTP 404 when trying to load them from an incorrect &lt;code&gt;fonts&lt;/code&gt; path)&lt;/li&gt;
&lt;li&gt;Start working on the DSpace 5.5 port:&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code&gt;$ git checkout -b 55new 5_x-prod
$ git reset --hard ilri/5_x-prod
$ git rebase -i dspace-5.5
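# after the rebase, a quick sanity check of what our branch carries on top of
# upstream 5.5 (a sketch, not part of the original workflow):
$ git diff --stat dspace-5.5..55new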
&lt;/code&gt;&lt;/pre&gt;</description>
</item>
<item>
<title>July, 2016</title>
<link>https://alanorth.github.io/cgspace-notes/2016-07/</link>
<pubDate>Fri, 01 Jul 2016 10:53:00 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2016-07/</guid>
<description>&lt;h2 id=&#34;2016-07-01&#34;&gt;2016-07-01&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Add &lt;code&gt;dc.description.sponsorship&lt;/code&gt; to Discovery sidebar facets and make investors clickable in item view (&lt;a href=&#34;https://github.com/ilri/DSpace/issues/232&#34;&gt;#232&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;I think this query should find and replace all authors that have &amp;ldquo;,&amp;rdquo; at the end of their names:&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code&gt;dspacetest=# update metadatavalue set text_value = regexp_replace(text_value, &#39;(^.+?),$&#39;, &#39;\1&#39;) where metadata_field_id=3 and resource_type_id=2 and text_value ~ &#39;^.+?,$&#39;;
UPDATE 95
dspacetest=# select text_value from metadatavalue where metadata_field_id=3 and resource_type_id=2 and text_value ~ &#39;^.+?,$&#39;;
text_value
------------
(0 rows)
&lt;/code&gt;&lt;/pre&gt;&lt;ul&gt;
&lt;li&gt;In this case the select query was showing 95 results before the update&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>June, 2016</title>
<link>https://alanorth.github.io/cgspace-notes/2016-06/</link>
<pubDate>Wed, 01 Jun 2016 10:53:00 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2016-06/</guid>
<description>&lt;h2 id=&#34;2016-06-01&#34;&gt;2016-06-01&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Experimenting with IFPRI OAI (we want to harvest their publications)&lt;/li&gt;
&lt;li&gt;After reading the &lt;a href=&#34;https://www.oclc.org/support/services/contentdm/help/server-admin-help/oai-support.en.html&#34;&gt;ContentDM documentation&lt;/a&gt; I found IFPRI&amp;rsquo;s OAI endpoint: &lt;a href=&#34;http://ebrary.ifpri.org/oai/oai.php&#34;&gt;http://ebrary.ifpri.org/oai/oai.php&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;After reading the &lt;a href=&#34;https://www.openarchives.org/OAI/openarchivesprotocol.html&#34;&gt;OAI documentation&lt;/a&gt; and testing with an &lt;a href=&#34;http://validator.oaipmh.com/&#34;&gt;OAI validator&lt;/a&gt; I found out how to get their publications&lt;/li&gt;
&lt;li&gt;This is their publications set: &lt;a href=&#34;http://ebrary.ifpri.org/oai/oai.php?verb=ListRecords&amp;amp;from=2016-01-01&amp;amp;set=p15738coll2&amp;amp;metadataPrefix=oai_dc&#34;&gt;http://ebrary.ifpri.org/oai/oai.php?verb=ListRecords&amp;amp;from=2016-01-01&amp;amp;set=p15738coll2&amp;amp;metadataPrefix=oai_dc&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;You can see the others by using the OAI &lt;code&gt;ListSets&lt;/code&gt; verb: &lt;a href=&#34;http://ebrary.ifpri.org/oai/oai.php?verb=ListSets&#34;&gt;http://ebrary.ifpri.org/oai/oai.php?verb=ListSets&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Working on second phase of metadata migration, looks like this will work for moving CPWF-specific data in &lt;code&gt;dc.identifier.fund&lt;/code&gt; to &lt;code&gt;cg.identifier.cpwfproject&lt;/code&gt; and then the rest to &lt;code&gt;dc.description.sponsorship&lt;/code&gt;&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>May, 2016</title>
<link>https://alanorth.github.io/cgspace-notes/2016-05/</link>
<pubDate>Sun, 01 May 2016 23:06:00 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2016-05/</guid>
<description>&lt;h2 id=&#34;2016-05-01&#34;&gt;2016-05-01&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Since yesterday there have been 10,000 REST errors and the site has been unstable again&lt;/li&gt;
&lt;li&gt;I have blocked access to the API now&lt;/li&gt;
&lt;li&gt;There are 3,000 IPs accessing the REST API in a 24-hour period!&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code&gt;# awk &#39;{print $1}&#39; /var/log/nginx/rest.log | uniq | wc -l
3168
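# note that uniq only collapses adjacent duplicates, so a stricter count of
# distinct IPs would sort first (a sketch):
# awk &#39;{print $1}&#39; /var/log/nginx/rest.log | sort | uniq | wc -l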
&lt;/code&gt;&lt;/pre&gt;</description>
</item>
<item>
<title>April, 2016</title>
<link>https://alanorth.github.io/cgspace-notes/2016-04/</link>
<pubDate>Mon, 04 Apr 2016 11:06:00 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2016-04/</guid>
<description>&lt;h2 id=&#34;2016-04-04&#34;&gt;2016-04-04&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Looking at log file use on CGSpace and noticing that we need to work on our cron setup a bit&lt;/li&gt;
&lt;li&gt;We are backing up all logs in the log folder, including useless stuff like solr, cocoon, handle-plugin, etc&lt;/li&gt;
&lt;li&gt;After running DSpace for over five years I&amp;rsquo;ve never needed to look in any other log file than dspace.log, let alone one from last year!&lt;/li&gt;
&lt;li&gt;This will save us a few gigs of backup space we&amp;rsquo;re paying for on S3&lt;/li&gt;
&lt;li&gt;Also, I noticed the &lt;code&gt;checker&lt;/code&gt; log has some errors we should pay attention to:&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>March, 2016</title>
<link>https://alanorth.github.io/cgspace-notes/2016-03/</link>
<pubDate>Wed, 02 Mar 2016 16:50:00 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2016-03/</guid>
<description>&lt;h2 id=&#34;2016-03-02&#34;&gt;2016-03-02&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Looking at issues with author authorities on CGSpace&lt;/li&gt;
&lt;li&gt;For some reason we still have the &lt;code&gt;index-lucene-update&lt;/code&gt; cron job active on CGSpace, but I&amp;rsquo;m pretty sure we don&amp;rsquo;t need it as of the latest few versions of Atmire&amp;rsquo;s Listings and Reports module&lt;/li&gt;
&lt;li&gt;Reinstall my local (Mac OS X) DSpace stack with Tomcat 7, PostgreSQL 9.3, and Java JDK 1.7 to match environment on CGSpace server&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>February, 2016</title>
<link>https://alanorth.github.io/cgspace-notes/2016-02/</link>
<pubDate>Fri, 05 Feb 2016 13:18:00 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2016-02/</guid>
<description>&lt;h2 id=&#34;2016-02-05&#34;&gt;2016-02-05&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Looking at some DAGRIS data for Abenet Yabowork&lt;/li&gt;
&lt;li&gt;Lots of issues with spaces, newlines, etc. causing the import to fail&lt;/li&gt;
&lt;li&gt;I noticed we have a very &lt;em&gt;interesting&lt;/em&gt; list of countries on CGSpace:&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;img src=&#34;https://alanorth.github.io/cgspace-notes/cgspace-notes/2016/02/cgspace-countries.png&#34; alt=&#34;CGSpace country list&#34;&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Not only are there 49,000 countries, we have some blanks (25)&amp;hellip;&lt;/li&gt;
&lt;li&gt;Also, lots of things like &amp;ldquo;COTE D`LVOIRE&amp;rdquo; and &amp;ldquo;COTE D IVOIRE&amp;rdquo;&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>January, 2016</title>
<link>https://alanorth.github.io/cgspace-notes/2016-01/</link>
<pubDate>Wed, 13 Jan 2016 13:18:00 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2016-01/</guid>
<description>&lt;h2 id=&#34;2016-01-13&#34;&gt;2016-01-13&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Move ILRI collection &lt;code&gt;10568/12503&lt;/code&gt; from &lt;code&gt;10568/27869&lt;/code&gt; to &lt;code&gt;10568/27629&lt;/code&gt; using the &lt;a href=&#34;https://gist.github.com/alanorth/392c4660e8b022d99dfa&#34;&gt;move_collections.sh&lt;/a&gt; script I wrote last year.&lt;/li&gt;
&lt;li&gt;I realized it is only necessary to clear the Cocoon cache after moving collections—rather than reindexing—as no metadata has changed, and therefore no search or browse indexes need to be updated.&lt;/li&gt;
&lt;li&gt;Update GitHub wiki for documentation of &lt;a href=&#34;https://github.com/ilri/DSpace/wiki/Maintenance-Tasks&#34;&gt;maintenance tasks&lt;/a&gt;.&lt;/li&gt;
&lt;/ul&gt;</description>
</item>
<item>
<title>December, 2015</title>
<link>https://alanorth.github.io/cgspace-notes/2015-12/</link>
<pubDate>Wed, 02 Dec 2015 13:18:00 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2015-12/</guid>
<description>&lt;h2 id=&#34;2015-12-02&#34;&gt;2015-12-02&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Replace &lt;code&gt;lzop&lt;/code&gt; with &lt;code&gt;xz&lt;/code&gt; in log compression cron jobs on DSpace Test—it uses less space:&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code&gt;# cd /home/dspacetest.cgiar.org/log
# ls -lh dspace.log.2015-11-18*
-rw-rw-r-- 1 tomcat7 tomcat7 2.0M Nov 18 23:59 dspace.log.2015-11-18
-rw-rw-r-- 1 tomcat7 tomcat7 387K Nov 18 23:59 dspace.log.2015-11-18.lzo
-rw-rw-r-- 1 tomcat7 tomcat7 169K Nov 18 23:59 dspace.log.2015-11-18.xz
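# for reference, the .xz file above can be produced with something like this,
# keeping the original for comparison (a sketch; flags are illustrative):
# xz -k dspace.log.2015-11-18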
&lt;/code&gt;&lt;/pre&gt;</description>
</item>
<item>
<title>November, 2015</title>
<link>https://alanorth.github.io/cgspace-notes/2015-11/</link>
<pubDate>Mon, 23 Nov 2015 17:00:57 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2015-11/</guid>
<description>&lt;h2 id=&#34;2015-11-22&#34;&gt;2015-11-22&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;CGSpace went down&lt;/li&gt;
&lt;li&gt;Looks like DSpace exhausted its PostgreSQL connection pool&lt;/li&gt;
&lt;li&gt;Last week I had increased the limit from 30 to 60, which seemed to help, but now there are many more idle connections:&lt;/li&gt;
&lt;/ul&gt;
&lt;pre&gt;&lt;code&gt;$ psql -c &#39;SELECT * from pg_stat_activity;&#39; | grep idle | grep -c cgspace
78
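# a closer look at who holds the connections and in what state (a sketch; assumes
# PostgreSQL 9.2+ where pg_stat_activity has a state column):
$ psql -c &#39;SELECT datname, usename, state, count(*) FROM pg_stat_activity GROUP BY 1, 2, 3 ORDER BY 4 DESC;&#39;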
&lt;/code&gt;&lt;/pre&gt;</description>
</item>
</channel>
</rss>