CGSpace Notes https://alanorth.github.io/cgspace-notes/ Recent content on CGSpace Notes Hugo -- gohugo.io en-us Thu, 02 Apr 2020 10:53:24 +0300 April, 2020 https://alanorth.github.io/cgspace-notes/2020-04/ Thu, 02 Apr 2020 10:53:24 +0300 https://alanorth.github.io/cgspace-notes/2020-04/ <h2 id="2020-04-02">2020-04-02</h2> <ul> <li>Maria asked me to update Charles Staver&rsquo;s ORCID iD in the submission template and on CGSpace, as his name was lower case before, and now he has corrected it <ul> <li>I updated the fifty-eight existing items on CGSpace</li> </ul> </li> <li>Looking into the items Udana had asked about last week that were missing Altmetric donuts: <ul> <li><a href="https://hdl.handle.net/10568/103225">The first</a> is still missing its DOI, so I added it and <a href="https://twitter.com/mralanorth/status/1245632619661766657">tweeted its handle</a> (after a few hours there was a donut with score 222)</li> <li><a href="https://hdl.handle.net/10568/106899">The second item</a> now has a donut with score 2 since I <a href="https://twitter.com/mralanorth/status/1243158045540134913">tweeted its handle</a> last week</li> <li><a href="https://hdl.handle.net/10568/107258">The third item</a> now has a donut with score 1 since I <a href="https://twitter.com/mralanorth/status/1243158786392625153">tweeted it</a> last week</li> </ul> </li> <li>On the same note, the <a href="https://hdl.handle.net/10568/106573">one item</a> Abenet pointed out last week now has a donut with a score of 104 after I <a href="https://twitter.com/mralanorth/status/1243163710241345536">tweeted it</a> last week</li> </ul> March, 2020 https://alanorth.github.io/cgspace-notes/2020-03/ Mon, 02 Mar 2020 12:31:30 +0200 https://alanorth.github.io/cgspace-notes/2020-03/ <h2 id="2020-03-02">2020-03-02</h2> <ul> <li>Update <a href="https://github.com/ilri/dspace-statistics-api">dspace-statistics-api</a> for DSpace 6+ UUIDs <ul> <li>Tag version 1.2.0 on GitHub</li> </ul> </li> <li>Test migrating legacy Solr statistics to UUIDs with the as-yet-unreleased <a href="https://github.com/DSpace/DSpace/commit/184f2b2153479045fba6239342c63e7f8564b8b6#diff-0350ce2e13b28d5d61252b7a8f50a059">SolrUpgradePre6xStatistics.java</a> <ul> <li>You need to download this into the DSpace 6.x source and compile it (see the sketch below)</li> </ul> </li> </ul>
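<ul> <li>A sketch of what that migration run might look like once the patch is compiled into the DSpace 6.x source; the <code>solr-upgrade-statistics-6x</code> launcher command is an assumption based on the linked patch, and the paths are illustrative:</li> </ul> <pre><code># build the patched DSpace 6.x source, deploy it, then run the upgrade
# against the legacy statistics core (assumed command, not a verified run)
$ cd ~/src/dspace-6.x &amp;&amp; mvn package
$ [dspace]/bin/dspace solr-upgrade-statistics-6x
</code></pre>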
href="https://handle.hdl.net/10568/91278">item</a> that had the same problem in 2019 has now also linked to the score for its DOI</li> <li>Another <a href="https://hdl.handle.net/10568/81236">item</a> that had the same problem in 2019 has also been fixed</li> </ul> </li> </ul> <h2 id="2020-01-07">2020-01-07</h2> <ul> <li>Peter Ballantyne highlighted one more WLE <a href="https://hdl.handle.net/10568/101286">item</a> that is missing the Altmetric score that its DOI has <ul> <li>The DOI has a score of 259, but the Handle has no score at all</li> <li>I <a href="https://twitter.com/mralanorth/status/1214471427157626881">tweeted</a> the CGSpace repository link</li> </ul> </li> </ul> December, 2019 https://alanorth.github.io/cgspace-notes/2019-12/ Sun, 01 Dec 2019 11:22:30 +0200 https://alanorth.github.io/cgspace-notes/2019-12/ <h2 id="2019-12-01">2019-12-01</h2> <ul> <li>Upgrade CGSpace (linode18) to Ubuntu 18.04: <ul> <li>Check any packages that have residual configs and purge them:</li> <li><!-- raw HTML omitted --># dpkg -l | grep -E &lsquo;^rc&rsquo; | awk &lsquo;{print $2}&rsquo; | xargs dpkg -P<!-- raw HTML omitted --></li> <li>Make sure all packages are up to date and the package manager is up to date, then reboot:</li> </ul> </li> </ul> <pre><code># apt update &amp;&amp; apt full-upgrade # apt-get autoremove &amp;&amp; apt-get autoclean # dpkg -C # reboot </code></pre> November, 2019 https://alanorth.github.io/cgspace-notes/2019-11/ Mon, 04 Nov 2019 12:20:30 +0200 https://alanorth.github.io/cgspace-notes/2019-11/ <h2 id="2019-11-04">2019-11-04</h2> <ul> <li>Peter noticed that there were 5.2 million hits on CGSpace in 2019-10 according to the Atmire usage statistics <ul> <li>I looked in the nginx logs and see 4.6 million in the access logs, and 1.2 million in the API logs:</li> </ul> </li> </ul> <pre><code># zcat --force /var/log/nginx/*access.log.*.gz | grep -cE &quot;[0-9]{1,2}/Oct/2019&quot; 4671942 # zcat --force /var/log/nginx/{rest,oai,statistics}.log.*.gz | grep -cE &quot;[0-9]{1,2}/Oct/2019&quot; 1277694 </code></pre><ul> <li>So 4.6 million from XMLUI and another 1.2 million from API requests</li> <li>Let&rsquo;s see how many of the REST API requests were for bitstreams (because they are counted in Solr stats):</li> </ul> <pre><code># zcat --force /var/log/nginx/rest.log.*.gz | grep -c -E &quot;[0-9]{1,2}/Oct/2019&quot; 1183456 # zcat --force /var/log/nginx/rest.log.*.gz | grep -E &quot;[0-9]{1,2}/Oct/2019&quot; | grep -c -E &quot;/rest/bitstreams&quot; 106781 </code></pre> CGSpace CG Core v2 Migration https://alanorth.github.io/cgspace-notes/cgspace-cgcorev2-migration/ Mon, 28 Oct 2019 13:27:35 +0200 https://alanorth.github.io/cgspace-notes/cgspace-cgcorev2-migration/ <p>Possible changes to CGSpace metadata fields to align more with DC, QDC, and DCTERMS as well as CG Core v2.</p> <p>With reference to <a href="https://agriculturalsemantics.github.io/cg-core/cgcore.html">CG Core v2 draft standard</a> by Marie-Angélique as well as <a href="http://www.dublincore.org/specifications/dublin-core/dcmi-terms/">DCMI DCTERMS</a>.</p> October, 2019 https://alanorth.github.io/cgspace-notes/2019-10/ Tue, 01 Oct 2019 13:20:51 +0300 https://alanorth.github.io/cgspace-notes/2019-10/ 2019-10-01 Udana from IWMI asked me for a CSV export of their community on CGSpace I exported it, but a quick run through the csv-metadata-quality tool shows that there are some low-hanging fruits we can fix before I send him the data I will limit the scope to the titles, regions, subregions, and river basins for now 
September, 2019 https://alanorth.github.io/cgspace-notes/2019-09/ Sun, 01 Sep 2019 10:17:51 +0300 https://alanorth.github.io/cgspace-notes/2019-09/ <h2 id="2019-09-01">2019-09-01</h2> <ul> <li>Linode emailed to say that CGSpace (linode18) had a high rate of outbound traffic for several hours this morning</li> <li>Here are the top ten IPs in the nginx XMLUI and REST/OAI logs this morning:</li> </ul> <pre><code># zcat --force /var/log/nginx/access.log /var/log/nginx/access.log.1 | grep -E &quot;01/Sep/2019:0&quot; | awk '{print $1}' | sort | uniq -c | sort -n | tail -n 10 440 17.58.101.255 441 157.55.39.101 485 207.46.13.43 728 169.60.128.125 730 207.46.13.108 758 157.55.39.9 808 66.160.140.179 814 207.46.13.212 2472 163.172.71.23 6092 3.94.211.189 # zcat --force /var/log/nginx/rest.log /var/log/nginx/rest.log.1 /var/log/nginx/oai.log /var/log/nginx/oai.log.1 | grep -E &quot;01/Sep/2019:0&quot; | awk '{print $1}' | sort | uniq -c | sort -n | tail -n 10 33 2a01:7e00::f03c:91ff:fe16:fcb 57 3.83.192.124 57 3.87.77.25 57 54.82.1.8 822 2a01:9cc0:47:1:1a:4:0:2 1223 45.5.184.72 1633 172.104.229.92 5112 205.186.128.185 7249 2a01:7e00::f03c:91ff:fe18:7396 9124 45.5.186.2 </code></pre> August, 2019 https://alanorth.github.io/cgspace-notes/2019-08/ Sat, 03 Aug 2019 12:39:51 +0300 https://alanorth.github.io/cgspace-notes/2019-08/ <h2 id="2019-08-03">2019-08-03</h2> <ul> <li>Look at Bioversity&rsquo;s latest migration CSV and now I see that Francesco has cleaned up the extra columns and the newline at the end of the file, but many of the column headers have an extra space in the name&hellip;</li> </ul> <h2 id="2019-08-04">2019-08-04</h2> <ul> <li>Deploy ORCID identifier updates requested by Bioversity to CGSpace</li> <li>Run system updates on CGSpace (linode18) and reboot it <ul> <li>Before updating it I checked Solr and verified that all statistics cores were loaded properly&hellip;</li> <li>After rebooting, all statistics cores were loaded&hellip; wow, that&rsquo;s lucky.</li> </ul> </li> <li>Run system updates on DSpace Test (linode19) and reboot it</li> </ul> July, 2019 https://alanorth.github.io/cgspace-notes/2019-07/ Mon, 01 Jul 2019 12:13:51 +0300 https://alanorth.github.io/cgspace-notes/2019-07/ <h2 id="2019-07-01">2019-07-01</h2> <ul> <li>Create an &ldquo;AfricaRice books and book chapters&rdquo; collection on CGSpace for AfricaRice</li> <li>Last month Sisay asked why the following &ldquo;most popular&rdquo; statistics link for a range of months in 2018 works for the CIAT community on DSpace Test, but not on CGSpace: <ul> <li><a href="https://dspacetest.cgiar.org/handle/10568/35697/most-popular/item#simplefilter=custom&amp;time_filter_end_date=01%2F12%2F2018">DSpace Test</a></li> <li><a href="https://cgspace.cgiar.org/handle/10568/35697/most-popular/item#simplefilter=custom&amp;time_filter_end_date=01%2F12%2F2018">CGSpace</a></li> </ul> </li> <li>Abenet had another similar issue a few days ago when trying to find the stats for 2018 in the RTB community</li> </ul> June, 2019 https://alanorth.github.io/cgspace-notes/2019-06/ Sun, 02 Jun 2019 10:57:51 +0300 https://alanorth.github.io/cgspace-notes/2019-06/ <h2 id="2019-06-02">2019-06-02</h2> <ul> <li>Merge the <a href="https://github.com/ilri/DSpace/pull/425">Solr filterCache</a> and <a href="https://github.com/ilri/DSpace/pull/426">XMLUI ISI journal</a> changes 
to the <code>5_x-prod</code> branch and deploy on CGSpace</li> <li>Run system updates on CGSpace (linode18) and reboot it</li> </ul> <h2 id="2019-06-03">2019-06-03</h2> <ul> <li>Skype with Marie-Angélique and Abenet about <a href="https://agriculturalsemantics.github.io/cg-core/cgcore.html">CG Core v2</a></li> </ul> May, 2019 https://alanorth.github.io/cgspace-notes/2019-05/ Wed, 01 May 2019 07:37:43 +0300 https://alanorth.github.io/cgspace-notes/2019-05/ <h2 id="2019-05-01">2019-05-01</h2> <ul> <li>Help CCAFS with regenerating some item thumbnails after they uploaded new PDFs to some items on CGSpace</li> <li>A user on the dspace-tech mailing list offered some suggestions for troubleshooting the problem with the inability to delete certain items <ul> <li>Apparently if the item is in the <code>workflowitem</code> table it is submitted to a workflow</li> <li>And if it is in the <code>workspaceitem</code> table it is in the pre-submitted state</li> </ul> </li> <li>The item seems to be in a pre-submitted state, so I tried to delete it from there:</li> </ul> <pre><code>dspace=# DELETE FROM workspaceitem WHERE item_id=74648; DELETE 1 </code></pre><ul> <li>But after this I tried to delete the item from the XMLUI and it is <em>still</em> present&hellip;</li> </ul> April, 2019 https://alanorth.github.io/cgspace-notes/2019-04/ Mon, 01 Apr 2019 09:00:43 +0300 https://alanorth.github.io/cgspace-notes/2019-04/ <h2 id="2019-04-01">2019-04-01</h2> <ul> <li>Meeting with AgroKnow to discuss CGSpace, ILRI data, AReS, GARDIAN, etc <ul> <li>They asked if we had plans to enable RDF support in CGSpace</li> </ul> </li> <li>There have been 4,400 more downloads of the CTA Spore publication from those strange Amazon IP addresses today <ul> <li>I suspected that some might not be successful, because the stats show less, but today they were all HTTP 200!</li> </ul> </li> </ul> <pre><code># cat /var/log/nginx/access.log /var/log/nginx/access.log.1 | grep 'Spore-192-EN-web.pdf' | grep -E '(18.196.196.108|18.195.78.144|18.195.218.6)' | awk '{print $9}' | sort | uniq -c | sort -n | tail -n 5 4432 200 </code></pre><ul> <li>In the last two weeks there have been 47,000 downloads of this <em>same exact PDF</em> by these three IP addresses</li> <li>Apply country and region corrections and deletions on DSpace Test and CGSpace:</li> </ul> <pre><code>$ ./fix-metadata-values.py -i /tmp/2019-02-21-fix-9-countries.csv -db dspace -u dspace -p 'fuuu' -f cg.coverage.country -m 228 -t ACTION -d $ ./fix-metadata-values.py -i /tmp/2019-02-21-fix-4-regions.csv -db dspace -u dspace -p 'fuuu' -f cg.coverage.region -m 231 -t action -d $ ./delete-metadata-values.py -i /tmp/2019-02-21-delete-2-countries.csv -db dspace -u dspace -p 'fuuu' -m 228 -f cg.coverage.country -d $ ./delete-metadata-values.py -i /tmp/2019-02-21-delete-1-region.csv -db dspace -u dspace -p 'fuuu' -m 231 -f cg.coverage.region -d </code></pre> March, 2019 https://alanorth.github.io/cgspace-notes/2019-03/ Fri, 01 Mar 2019 12:16:30 +0100 https://alanorth.github.io/cgspace-notes/2019-03/ <h2 id="2019-03-01">2019-03-01</h2> <ul> <li>I checked IITA&rsquo;s 259 Feb 14 records from last month for duplicates using Atmire&rsquo;s Duplicate Checker on a fresh snapshot of CGSpace on my local machine and everything looks good</li> <li>I am now only waiting to hear from her about where the items should go, though I assume Journal Articles go to IITA Journal Articles collection, etc&hellip;</li> <li>Looking at the other half of Udana&rsquo;s WLE records from 2018-11 <ul> <li>I finished 
the ones for Restoring Degraded Landscapes (RDL), but these are for Variability, Risks and Competing Uses (VRC)</li> <li>I did the usual cleanups for whitespace, added regions where they made sense for certain countries, cleaned up the DOI link formats, added rights information based on the publications page for a few items</li> <li>Most worryingly, there are encoding errors in the abstracts for eleven items, for example:</li> <li>68.15% � 9.45 instead of 68.15% ± 9.45</li> <li>2003�2013 instead of 2003–2013</li> </ul> </li> <li>I think I will need to ask Udana to re-copy and paste the abstracts with more care using Google Docs</li> </ul> February, 2019 https://alanorth.github.io/cgspace-notes/2019-02/ Fri, 01 Feb 2019 21:37:30 +0200 https://alanorth.github.io/cgspace-notes/2019-02/ <h2 id="2019-02-01">2019-02-01</h2> <ul> <li>Linode has alerted a few times since last night that the CPU usage on CGSpace (linode18) was high despite me increasing the alert threshold last week from 250% to 275%—I might need to increase it again!</li> <li>The top IPs before, during, and after this latest alert tonight were:</li> </ul> <pre><code># zcat --force /var/log/nginx/*.log /var/log/nginx/*.log.1 | grep -E &quot;01/Feb/2019:(17|18|19|20|21)&quot; | awk '{print $1}' | sort | uniq -c | sort -n | tail -n 10 245 207.46.13.5 332 54.70.40.11 385 5.143.231.38 405 207.46.13.173 405 207.46.13.75 1117 66.249.66.219 1121 35.237.175.180 1546 5.9.6.51 2474 45.5.186.2 5490 85.25.237.71 </code></pre><ul> <li><code>85.25.237.71</code> is the &ldquo;Linguee Bot&rdquo; that I first saw last month</li> <li>The Solr statistics the past few months have been very high and I was wondering if the web server logs also showed an increase</li> <li>There were just over 3 million accesses in the nginx logs last month:</li> </ul> <pre><code># time zcat --force /var/log/nginx/* | grep -cE &quot;[0-9]{1,2}/Jan/2019&quot; 3018243 real 0m19.873s user 0m22.203s sys 0m1.979s </code></pre> January, 2019 https://alanorth.github.io/cgspace-notes/2019-01/ Wed, 02 Jan 2019 09:48:30 +0200 https://alanorth.github.io/cgspace-notes/2019-01/ <h2 id="2019-01-02">2019-01-02</h2> <ul> <li>Linode alerted that CGSpace (linode18) had a higher outbound traffic rate than normal early this morning</li> <li>I don&rsquo;t see anything interesting in the web server logs around that time though:</li> </ul> <pre><code># zcat --force /var/log/nginx/*.log /var/log/nginx/*.log.1 | grep -E &quot;02/Jan/2019:0(1|2|3)&quot; | awk '{print $1}' | sort | uniq -c | sort -n | tail -n 10 92 40.77.167.4 99 210.7.29.100 120 38.126.157.45 177 35.237.175.180 177 40.77.167.32 216 66.249.75.219 225 18.203.76.93 261 46.101.86.248 357 207.46.13.1 903 54.70.40.11 </code></pre> December, 2018 https://alanorth.github.io/cgspace-notes/2018-12/ Sun, 02 Dec 2018 02:09:30 +0200 https://alanorth.github.io/cgspace-notes/2018-12/ <h2 id="2018-12-01">2018-12-01</h2> <ul> <li>Switch CGSpace (linode18) to use OpenJDK instead of Oracle JDK</li> <li>I manually installed OpenJDK, then removed Oracle JDK, then re-ran the <a href="http://github.com/ilri/rmg-ansible-public">Ansible playbook</a> to update all configuration files, etc</li> <li>Then I ran all system updates and restarted the server</li> </ul> <h2 id="2018-12-02">2018-12-02</h2> <ul> <li>I noticed that there is another issue with PDF thumbnails on CGSpace, and I see there was another <a href="https://usn.ubuntu.com/3831-1/">Ghostscript vulnerability last week</a></li> </ul> November, 2018 https://alanorth.github.io/cgspace-notes/2018-11/ 
Thu, 01 Nov 2018 16:41:30 +0200 https://alanorth.github.io/cgspace-notes/2018-11/ <h2 id="2018-11-01">2018-11-01</h2> <ul> <li>Finalize AReS Phase I and Phase II ToRs</li> <li>Send a note about my <a href="https://github.com/ilri/dspace-statistics-api">dspace-statistics-api</a> to the dspace-tech mailing list</li> </ul> <h2 id="2018-11-03">2018-11-03</h2> <ul> <li>Linode has been sending mails a few times a day recently that CGSpace (linode18) has had high CPU usage</li> <li>Today these are the top 10 IPs:</li> </ul> October, 2018 https://alanorth.github.io/cgspace-notes/2018-10/ Mon, 01 Oct 2018 22:31:54 +0300 https://alanorth.github.io/cgspace-notes/2018-10/ <h2 id="2018-10-01">2018-10-01</h2> <ul> <li>Phil Thornton got an ORCID identifier so we need to add it to the list on CGSpace and tag his existing items</li> <li>I created a GitHub issue to track this <a href="https://github.com/ilri/DSpace/issues/389">#389</a>, because I&rsquo;m super busy in Nairobi right now</li> </ul> September, 2018 https://alanorth.github.io/cgspace-notes/2018-09/ Sun, 02 Sep 2018 09:55:54 +0300 https://alanorth.github.io/cgspace-notes/2018-09/ <h2 id="2018-09-02">2018-09-02</h2> <ul> <li>New <a href="https://jdbc.postgresql.org/documentation/changelog.html#version_42.2.5">PostgreSQL JDBC driver version 42.2.5</a></li> <li>I&rsquo;ll update the DSpace role in our <a href="https://github.com/ilri/rmg-ansible-public">Ansible infrastructure playbooks</a> and run the updated playbooks on CGSpace and DSpace Test</li> <li>Also, I&rsquo;ll re-run the <code>postgresql</code> tasks because the custom PostgreSQL variables are dynamic according to the system&rsquo;s RAM, and we never re-ran them after migrating to larger Linodes last month</li> <li>I&rsquo;m testing the new DSpace 5.8 branch in my Ubuntu 18.04 environment and I&rsquo;m getting those autowire errors in Tomcat 8.5.30 again:</li> </ul> August, 2018 https://alanorth.github.io/cgspace-notes/2018-08/ Wed, 01 Aug 2018 11:52:54 +0300 https://alanorth.github.io/cgspace-notes/2018-08/ <h2 id="2018-08-01">2018-08-01</h2> <ul> <li>DSpace Test had crashed at some point yesterday morning and I see the following in <code>dmesg</code>:</li> </ul> <pre><code>[Tue Jul 31 00:00:41 2018] Out of memory: Kill process 1394 (java) score 668 or sacrifice child [Tue Jul 31 00:00:41 2018] Killed process 1394 (java) total-vm:15601860kB, anon-rss:5355528kB, file-rss:0kB, shmem-rss:0kB [Tue Jul 31 00:00:41 2018] oom_reaper: reaped process 1394 (java), now anon-rss:0kB, file-rss:0kB, shmem-rss:0kB </code></pre><ul> <li>Judging from the time of the crash it was probably related to the Discovery indexing that starts at midnight</li> <li>From the DSpace log I see that eventually Solr stopped responding, so I guess the <code>java</code> process that was OOM killed above was Tomcat&rsquo;s</li> <li>I&rsquo;m not sure why Tomcat didn&rsquo;t crash with an OutOfMemoryError&hellip;</li> <li>Anyways, perhaps I should increase the JVM heap from 5120m to 6144m like we did a few months ago when we tried to run the whole CGSpace Solr core</li> <li>The server only has 8GB of RAM so we&rsquo;ll eventually need to upgrade to a larger one because we&rsquo;ll start starving the OS, PostgreSQL, and command line batch processes</li> <li>I ran all system updates on DSpace Test and rebooted it</li> </ul> July, 2018 https://alanorth.github.io/cgspace-notes/2018-07/ Sun, 01 Jul 2018 12:56:54 +0300 https://alanorth.github.io/cgspace-notes/2018-07/ <h2 id="2018-07-01">2018-07-01</h2> <ul> <li>I want to upgrade 
DSpace Test to DSpace 5.8 so I took a backup of its current database just in case:</li> </ul> <pre><code>$ pg_dump -b -v -o --format=custom -U dspace -f dspace-2018-07-01.backup dspace </code></pre><ul> <li>During the <code>mvn package</code> stage on the 5.8 branch I kept getting issues with Java running out of memory:</li> </ul> <pre><code>There is insufficient memory for the Java Runtime Environment to continue. </code></pre> June, 2018 https://alanorth.github.io/cgspace-notes/2018-06/ Mon, 04 Jun 2018 19:49:54 -0700 https://alanorth.github.io/cgspace-notes/2018-06/ <h2 id="2018-06-04">2018-06-04</h2> <ul> <li>Test the <a href="https://tracker.atmire.com/tickets-cgiar-ilri/view-ticket?id=560">DSpace 5.8 module upgrades from Atmire</a> (<a href="https://github.com/ilri/DSpace/pull/378">#378</a>) <ul> <li>There seems to be a problem with the CUA and L&amp;R versions in <code>pom.xml</code> because they are using SNAPSHOT and it doesn&rsquo;t build</li> </ul> </li> <li>I added the new CCAFS Phase II Project Tag <code>PII-FP1_PACCA2</code> and merged it into the <code>5_x-prod</code> branch (<a href="https://github.com/ilri/DSpace/pull/379">#379</a>)</li> <li>I proofed and tested the ILRI author corrections that Peter sent back to me this week:</li> </ul> <pre><code>$ ./fix-metadata-values.py -i /tmp/2018-05-30-Correct-660-authors.csv -db dspace -u dspace -p 'fuuu' -f dc.contributor.author -t correct -m 3 -n </code></pre><ul> <li>I think a sane proofing workflow in OpenRefine is to apply the custom text facets for check/delete/remove and illegal characters that I developed in <a href="https://alanorth.github.io/cgspace-notes/cgspace-notes/2018-03/">March, 2018</a></li> <li>Time to index ~70,000 items on CGSpace:</li> </ul> <pre><code>$ time schedtool -D -e ionice -c2 -n7 nice -n19 [dspace]/bin/dspace index-discovery -b real 74m42.646s user 8m5.056s sys 2m7.289s </code></pre> May, 2018 https://alanorth.github.io/cgspace-notes/2018-05/ Tue, 01 May 2018 16:43:54 +0300 https://alanorth.github.io/cgspace-notes/2018-05/ <h2 id="2018-05-01">2018-05-01</h2> <ul> <li>I cleared the Solr statistics core on DSpace Test by issuing two commands directly to the Solr admin interface (curl equivalents below): <ul> <li>http://localhost:3000/solr/statistics/update?stream.body=%3Cdelete%3E%3Cquery%3E*:*%3C/query%3E%3C/delete%3E</li> <li>http://localhost:3000/solr/statistics/update?stream.body=%3Ccommit/%3E</li> </ul> </li> <li>Then I reduced the JVM heap size from 6144m back to 5120m</li> <li>Also, I switched it to use OpenJDK instead of Oracle Java, as well as re-worked the <a href="https://github.com/ilri/rmg-ansible-public">Ansible infrastructure scripts</a> to support hosts choosing which distribution they want to use</li> </ul>
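<ul> <li>For reference, those two operations can also be done with curl instead of loading the URLs in a browser; a sketch against the same tunneled port (Solr&rsquo;s update handler accepts the XML via POST):</li> </ul> <pre><code># delete all documents from the statistics core, then commit
$ curl &quot;http://localhost:3000/solr/statistics/update&quot; -H &quot;Content-Type: text/xml&quot; --data-binary &quot;&lt;delete&gt;&lt;query&gt;*:*&lt;/query&gt;&lt;/delete&gt;&quot;
$ curl &quot;http://localhost:3000/solr/statistics/update&quot; -H &quot;Content-Type: text/xml&quot; --data-binary &quot;&lt;commit/&gt;&quot;
</code></pre>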
April, 2018 https://alanorth.github.io/cgspace-notes/2018-04/ Sun, 01 Apr 2018 16:13:54 +0200 https://alanorth.github.io/cgspace-notes/2018-04/ <h2 id="2018-04-01">2018-04-01</h2> <ul> <li>I tried to test something on DSpace Test but noticed that it&rsquo;s been down since god knows when</li> <li>Catalina logs at least show some memory errors yesterday:</li> </ul> March, 2018 https://alanorth.github.io/cgspace-notes/2018-03/ Fri, 02 Mar 2018 16:07:54 +0200 https://alanorth.github.io/cgspace-notes/2018-03/ <h2 id="2018-03-02">2018-03-02</h2> <ul> <li>Export a CSV of the IITA community metadata for Martin Mueller</li> </ul> February, 2018 https://alanorth.github.io/cgspace-notes/2018-02/ Thu, 01 Feb 2018 16:28:54 +0200 https://alanorth.github.io/cgspace-notes/2018-02/ <h2 id="2018-02-01">2018-02-01</h2> <ul> <li>Peter gave feedback on the <code>dc.rights</code> proof of concept that I had sent him last week</li> <li>We don&rsquo;t need to distinguish between internal and external works, so that makes it just a simple list</li> <li>Yesterday I figured out how to monitor DSpace sessions using JMX</li> <li>I copied the logic of the <code>jmx_tomcat_dbpools</code> plugin provided by Ubuntu&rsquo;s <code>munin-plugins-java</code> package and used the stuff I discovered about JMX <a href="https://alanorth.github.io/cgspace-notes/cgspace-notes/2018-01/">in 2018-01</a></li> </ul> January, 2018 https://alanorth.github.io/cgspace-notes/2018-01/ Tue, 02 Jan 2018 08:35:54 -0800 https://alanorth.github.io/cgspace-notes/2018-01/ <h2 id="2018-01-02">2018-01-02</h2> <ul> <li>Uptime Robot noticed that CGSpace went down and up a few times last night, for a few minutes each time</li> <li>I didn&rsquo;t get any load alerts from Linode and the REST and XMLUI logs don&rsquo;t show anything out of the ordinary</li> <li>The nginx logs show HTTP 200s until <code>02/Jan/2018:11:27:17 +0000</code> when Uptime Robot got an HTTP 500</li> <li>In dspace.log around that time I see many errors like &ldquo;Client closed the connection before file download was complete&rdquo;</li> <li>And just before that I see this:</li> </ul> <pre><code>Caused by: org.apache.tomcat.jdbc.pool.PoolExhaustedException: [http-bio-127.0.0.1-8443-exec-980] Timeout: Pool empty. Unable to fetch a connection in 5 seconds, none available[size:50; busy:50; idle:0; lastwait:5000]. </code></pre><ul> <li>Ah hah! So the pool was actually empty!</li> <li>I need to increase that, let&rsquo;s try to bump it up from 50 to 75</li> <li>After that one client got an HTTP 499 but then the rest were HTTP 200, so I don&rsquo;t know what the hell Uptime Robot saw</li> <li>I notice this error quite a few times in dspace.log:</li> </ul> <pre><code>2018-01-02 01:21:19,137 ERROR org.dspace.app.xmlui.aspect.discovery.SidebarFacetsTransformer @ Error while searching for sidebar facets org.dspace.discovery.SearchServiceException: org.apache.solr.search.SyntaxError: Cannot parse 'dateIssued_keyword:[1976+TO+1979]': Encountered &quot; &quot;]&quot; &quot;] &quot;&quot; at line 1, column 32. 
</code></pre><ul> <li>And there are many of these errors every day for the past month:</li> </ul> <pre><code>$ grep -c &quot;Error while searching for sidebar facets&quot; dspace.log.* dspace.log.2017-11-21:4 dspace.log.2017-11-22:1 dspace.log.2017-11-23:4 dspace.log.2017-11-24:11 dspace.log.2017-11-25:0 dspace.log.2017-11-26:1 dspace.log.2017-11-27:7 dspace.log.2017-11-28:21 dspace.log.2017-11-29:31 dspace.log.2017-11-30:15 dspace.log.2017-12-01:15 dspace.log.2017-12-02:20 dspace.log.2017-12-03:38 dspace.log.2017-12-04:65 dspace.log.2017-12-05:43 dspace.log.2017-12-06:72 dspace.log.2017-12-07:27 dspace.log.2017-12-08:15 dspace.log.2017-12-09:29 dspace.log.2017-12-10:35 dspace.log.2017-12-11:20 dspace.log.2017-12-12:44 dspace.log.2017-12-13:36 dspace.log.2017-12-14:59 dspace.log.2017-12-15:104 dspace.log.2017-12-16:53 dspace.log.2017-12-17:66 dspace.log.2017-12-18:83 dspace.log.2017-12-19:101 dspace.log.2017-12-20:74 dspace.log.2017-12-21:55 dspace.log.2017-12-22:66 dspace.log.2017-12-23:50 dspace.log.2017-12-24:85 dspace.log.2017-12-25:62 dspace.log.2017-12-26:49 dspace.log.2017-12-27:30 dspace.log.2017-12-28:54 dspace.log.2017-12-29:68 dspace.log.2017-12-30:89 dspace.log.2017-12-31:53 dspace.log.2018-01-01:45 dspace.log.2018-01-02:34 </code></pre><ul> <li>Danny wrote to ask for help renewing the wildcard ilri.org certificate and I advised that we should probably use Let&rsquo;s Encrypt if it&rsquo;s just a handful of domains</li> </ul> December, 2017 https://alanorth.github.io/cgspace-notes/2017-12/ Fri, 01 Dec 2017 13:53:54 +0300 https://alanorth.github.io/cgspace-notes/2017-12/ <h2 id="2017-12-01">2017-12-01</h2> <ul> <li>Uptime Robot noticed that CGSpace went down</li> <li>The logs say &ldquo;Timeout waiting for idle object&rdquo;</li> <li>PostgreSQL activity says there are 115 connections currently</li> <li>The list of connections to XMLUI and REST API for today:</li> </ul> November, 2017 https://alanorth.github.io/cgspace-notes/2017-11/ Thu, 02 Nov 2017 09:37:54 +0200 https://alanorth.github.io/cgspace-notes/2017-11/ <h2 id="2017-11-01">2017-11-01</h2> <ul> <li>The CORE developers responded to say they are looking into their bot not respecting our robots.txt</li> </ul> <h2 id="2017-11-02">2017-11-02</h2> <ul> <li>Today there have been no hits by CORE and no alerts from Linode (coincidence?)</li> </ul> <pre><code># grep -c &quot;CORE&quot; /var/log/nginx/access.log 0 </code></pre><ul> <li>Generate list of authors on CGSpace for Peter to go through and correct:</li> </ul> <pre><code>dspace=# \copy (select distinct text_value, count(*) as count from metadatavalue where metadata_field_id = (select metadata_field_id from metadatafieldregistry where element = 'contributor' and qualifier = 'author') AND resource_type_id = 2 group by text_value order by count desc) to /tmp/authors.csv with csv; COPY 54701 </code></pre> October, 2017 https://alanorth.github.io/cgspace-notes/2017-10/ Sun, 01 Oct 2017 08:07:54 +0300 https://alanorth.github.io/cgspace-notes/2017-10/ <h2 id="2017-10-01">2017-10-01</h2> <ul> <li>Peter emailed to point out that many items in the <a href="https://cgspace.cgiar.org/handle/10568/2703">ILRI archive collection</a> have multiple handles:</li> </ul> <pre><code>http://hdl.handle.net/10568/78495||http://hdl.handle.net/10568/79336 </code></pre><ul> <li>There appears to be a pattern but I&rsquo;ll have to look a bit closer and try to clean them up automatically, either in SQL or in OpenRefine (see the sketch below)</li> <li>Add Katherine Lutz to the groups for content submission and edit steps of the CGIAR System collections</li> </ul>
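<ul> <li>A first pass at finding all items with multiple handles could be done in SQL, mirroring the author-list query above; a sketch (<code>resource_type_id = 2</code> limits it to items):</li> </ul> <pre><code>dspace=# SELECT resource_id, text_value FROM metadatavalue WHERE text_value LIKE '%||%' AND resource_type_id = 2 AND metadata_field_id = (SELECT metadata_field_id FROM metadatafieldregistry WHERE element = 'identifier' AND qualifier = 'uri');
</code></pre>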
CGIAR Library Migration https://alanorth.github.io/cgspace-notes/cgiar-library-migration/ Mon, 18 Sep 2017 16:38:35 +0300 https://alanorth.github.io/cgspace-notes/cgiar-library-migration/ <p>Rough notes for importing the CGIAR Library content. It was decided that this content would go to a new top-level community called <em>CGIAR System Organization</em>.</p> September, 2017 https://alanorth.github.io/cgspace-notes/2017-09/ Thu, 07 Sep 2017 16:54:52 +0700 https://alanorth.github.io/cgspace-notes/2017-09/ <h2 id="2017-09-06">2017-09-06</h2> <ul> <li>Linode sent an alert that CGSpace (linode18) was using 261% CPU for the past two hours</li> </ul> <h2 id="2017-09-07">2017-09-07</h2> <ul> <li>Ask Sisay to clean up the WLE approvers a bit, as Marianne&rsquo;s user account is both in the approvers step as well as the group</li> </ul> August, 2017 https://alanorth.github.io/cgspace-notes/2017-08/ Tue, 01 Aug 2017 11:51:52 +0300 https://alanorth.github.io/cgspace-notes/2017-08/ <h2 id="2017-08-01">2017-08-01</h2> <ul> <li>Linode sent an alert that CGSpace (linode18) was using 350% CPU for the past two hours</li> <li>I looked in the Activity pane of the Admin Control Panel and it seems that Google, Baidu, Yahoo, and Bing are all crawling with massive numbers of bots concurrently (~100 total, mostly Baidu and Google)</li> <li>The good thing is that, according to <code>dspace.log.2017-08-01</code>, they are all using the same Tomcat session</li> <li>This means our Tomcat Crawler Session Valve is working</li> <li>But many of the bots are browsing dynamic URLs like: <ul> <li>/handle/10568/3353/discover</li> <li>/handle/10568/16510/browse</li> </ul> </li> <li>The <code>robots.txt</code> only blocks the top-level <code>/discover</code> and <code>/browse</code> URLs&hellip; we will need to find a way to forbid them from accessing these!</li> <li>Relevant issue from DSpace Jira (semi-resolved in DSpace 6.0): <a href="https://jira.duraspace.org/browse/DS-2962">https://jira.duraspace.org/browse/DS-2962</a></li> <li>It turns out that we&rsquo;re already adding the <code>X-Robots-Tag &quot;none&quot;</code> HTTP header, but this only forbids the search engine from <em>indexing</em> the page, not crawling it!</li> <li>Also, the bot has to successfully browse the page first so it can receive the HTTP header&hellip;</li> <li>We might actually have to <em>block</em> these requests with HTTP 403 depending on the user agent</li> <li>Abenet pointed out that the CGIAR Library Historical Archive collection I sent July 20th only had ~100 entries, instead of 2415</li> <li>This was due to newline characters in the <code>dc.description.abstract</code> column, which caused OpenRefine to choke when exporting the CSV</li> <li>I exported a new CSV from the collection on DSpace Test and then manually removed the blank lines in vim using <code>g/^$/d</code></li> <li>Then I cleaned up the author authorities and HTML characters in OpenRefine and sent the file back to Abenet</li> </ul> July, 2017 https://alanorth.github.io/cgspace-notes/2017-07/ Sat, 01 Jul 2017 18:03:52 +0300 https://alanorth.github.io/cgspace-notes/2017-07/ <h2 id="2017-07-01">2017-07-01</h2> <ul> <li>Run system updates and reboot DSpace Test</li> </ul> <h2 id="2017-07-04">2017-07-04</h2> <ul> <li>Merge changes for WLE Phase II theme rename (<a href="https://github.com/ilri/DSpace/pull/329">#329</a>)</li> <li>Looking at extracting the metadata registries from ICARDA&rsquo;s MEL DSpace database so we can compare fields with CGSpace</li> <li>We can use PostgreSQL&rsquo;s extended output format (<code>-x</code>) plus <code>sed</code> to format the output into quasi XML (see the sketch below):</li> </ul>
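<ul> <li>The actual command is not included in this excerpt; a minimal sketch of the idea, assuming a local copy of the MEL database (here called <code>mel</code>) with a DSpace-style <code>metadatafieldregistry</code> table:</li> </ul> <pre><code># expanded output prints one &quot;column | value&quot; line per field, which sed
# can rewrite into rough XML elements; good enough for diffing registries
$ psql -x -c 'SELECT element, qualifier, scope_note FROM metadatafieldregistry' mel \
    | sed -e 's:^-\[ RECORD .*$:&lt;record&gt;:' -e 's:^\([a-z_]*\) *| \(.*\):  &lt;\1&gt;\2&lt;/\1&gt;:'
</code></pre>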
June, 2017 https://alanorth.github.io/cgspace-notes/2017-06/ Thu, 01 Jun 2017 10:14:52 +0300 https://alanorth.github.io/cgspace-notes/2017-06/ 2017-06-01 After discussion with WLE and CGSpace content people, we decided to just add one metadata field for the WLE Research Themes The cg.identifier.wletheme field will be used for both Phase I and Phase II Research Themes Then we&rsquo;ll create a new sub-community for Phase II and create collections for the research themes there The current &ldquo;Research Themes&rdquo; community will be renamed to &ldquo;WLE Phase I Research Themes&rdquo; Tagged all items in the current Phase I collections with their appropriate themes Create pull request to add Phase II research themes to the submission form: #328 Add cg. May, 2017 https://alanorth.github.io/cgspace-notes/2017-05/ Mon, 01 May 2017 16:21:52 +0200 https://alanorth.github.io/cgspace-notes/2017-05/ 2017-05-01 ICARDA apparently started working on CG Core on their MEL repository They have done a few cg.* fields, but not very consistently, and have even copied some of CGSpace&rsquo;s items: https://mel.cgiar.org/xmlui/handle/20.500.11766/6911?show=full https://cgspace.cgiar.org/handle/10568/73683 2017-05-02 Atmire got back about the Workflow Statistics issue, and apparently it&rsquo;s a bug in the CUA module so they will send us a pull request 2017-05-04 Sync DSpace Test with database and assetstore from CGSpace Re-deploy DSpace Test with Atmire&rsquo;s CUA patch for workflow statistics, run system updates, and restart the server Now I can see the workflow statistics and am able to select users, but everything returns 0 items Megan says some mapped items are still not appearing since last week, so I forced a full index-discovery -b (see the sketch below) Need to remember to check tomorrow whether the collection has more items (currently 39 on CGSpace, but 118 on the freshly reindexed DSpace Test): https://cgspace. 
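<ul> <li>That forced full reindex is the same <code>index-discovery -b</code> invocation used in the 2018-06 notes above; a sketch, niced and ioniced so it doesn&rsquo;t starve the rest of the server:</li> </ul> <pre><code>$ time schedtool -D -e ionice -c2 -n7 nice -n19 [dspace]/bin/dspace index-discovery -b
</code></pre>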
April, 2017 https://alanorth.github.io/cgspace-notes/2017-04/ Sun, 02 Apr 2017 17:08:52 +0200 https://alanorth.github.io/cgspace-notes/2017-04/ <h2 id="2017-04-02">2017-04-02</h2> <ul> <li>Merge one change to CCAFS flagships that I had forgotten to remove last month (&ldquo;MANAGING CLIMATE RISK&rdquo;): <a href="https://github.com/ilri/DSpace/pull/317">https://github.com/ilri/DSpace/pull/317</a></li> <li>Quick proof-of-concept hack to add <code>dc.rights</code> to the input form, including some inline instructions/hints:</li> </ul> <p><img src="https://alanorth.github.io/cgspace-notes/cgspace-notes/2017/04/dc-rights.png" alt="dc.rights in the submission form"></p> <ul> <li>Remove redundant/duplicate text in the DSpace submission license</li> <li>Testing the CMYK patch on a collection with 650 items:</li> </ul> <pre><code>$ [dspace]/bin/dspace filter-media -f -i 10568/16498 -p &quot;ImageMagick PDF Thumbnail&quot; -v &gt;&amp; /tmp/filter-media-cmyk.txt </code></pre> March, 2017 https://alanorth.github.io/cgspace-notes/2017-03/ Wed, 01 Mar 2017 17:08:52 +0200 https://alanorth.github.io/cgspace-notes/2017-03/ <h2 id="2017-03-01">2017-03-01</h2> <ul> <li>Run the 279 CIAT author corrections on CGSpace</li> </ul> <h2 id="2017-03-02">2017-03-02</h2> <ul> <li>Skype with Michael and Peter, discussing moving the CGIAR Library to CGSpace</li> <li>CGIAR people possibly open to moving content, redirecting library.cgiar.org to CGSpace and letting CGSpace resolve their handles</li> <li>They might come in at the top level in one &ldquo;CGIAR System&rdquo; community, or with several communities</li> <li>I need to spend a bit of time looking at the multiple handle support in DSpace and see if new content can be minted in both handles, or just one</li> <li>Need to send Peter and Michael some notes about this in a few days</li> <li>Also, need to consider talking to Atmire about hiring them to bring ORCiD metadata to REST / OAI</li> <li>Filed an issue on DSpace issue tracker for the <code>filter-media</code> bug that causes it to process JPGs even when limiting to the PDF thumbnail plugin: <a href="https://jira.duraspace.org/browse/DS-3516">DS-3516</a></li> <li>Discovered that the ImageMagick <code>filter-media</code> plugin creates JPG thumbnails with the CMYK colorspace when the source PDF is using CMYK</li> <li>Interestingly, it seems DSpace 4.x&rsquo;s thumbnails were sRGB, but forcing regeneration using DSpace 5.x&rsquo;s ImageMagick plugin creates CMYK JPGs if the source PDF was CMYK (see <a href="https://cgspace.cgiar.org/handle/10568/51999">10568/51999</a>):</li> </ul> <pre><code>$ identify ~/Desktop/alc_contrastes_desafios.jpg /Users/aorth/Desktop/alc_contrastes_desafios.jpg JPEG 464x600 464x600+0+0 8-bit CMYK 168KB 0.000u 0:00.000 </code></pre> February, 2017 https://alanorth.github.io/cgspace-notes/2017-02/ Tue, 07 Feb 2017 07:04:52 -0800 https://alanorth.github.io/cgspace-notes/2017-02/ <h2 id="2017-02-07">2017-02-07</h2> <ul> <li>An item was mapped twice erroneously again, so I had to remove one of the mappings manually:</li> </ul> <pre><code>dspace=# select * from collection2item where item_id = '80278'; id | collection_id | item_id -------+---------------+--------- 92551 | 313 | 80278 92550 | 313 | 80278 90774 | 1051 | 80278 (3 rows) dspace=# delete from collection2item where id = 92551 and item_id = 80278; DELETE 1 </code></pre><ul> <li>Create issue on GitHub to track the addition of CCAFS Phase II project tags (<a href="https://github.com/ilri/DSpace/issues/301">#301</a>)</li> <li>Looks like 
we&rsquo;ll be using <code>cg.identifier.ccafsprojectpii</code> as the field name</li> </ul> January, 2017 https://alanorth.github.io/cgspace-notes/2017-01/ Mon, 02 Jan 2017 10:43:00 +0300 https://alanorth.github.io/cgspace-notes/2017-01/ <h2 id="2017-01-02">2017-01-02</h2> <ul> <li>I checked to see if the Solr sharding task that is supposed to run on January 1st had run and saw there was an error</li> <li>I tested on DSpace Test as well and it doesn&rsquo;t work there either</li> <li>I asked on the dspace-tech mailing list because it seems to be broken, and actually now I&rsquo;m not sure if we&rsquo;ve ever had the sharding task run successfully over all these years</li> </ul> December, 2016 https://alanorth.github.io/cgspace-notes/2016-12/ Fri, 02 Dec 2016 10:43:00 +0300 https://alanorth.github.io/cgspace-notes/2016-12/ <h2 id="2016-12-02">2016-12-02</h2> <ul> <li>CGSpace was down for five hours in the morning while I was sleeping</li> <li>While looking in the logs for errors, I see tons of warnings about Atmire MQM:</li> </ul> <pre><code>2016-12-02 03:00:32,352 WARN com.atmire.metadataquality.batchedit.BatchEditConsumer @ BatchEditConsumer should not have been given this kind of Subject in an event, skipping: org.dspace.event.Event(eventType=CREATE, SubjectType=BUNDLE, SubjectID=70316, ObjectType=(Unknown), ObjectID=-1, TimeStamp=1480647632305, dispatcher=1544803905, detail=[null], transactionID=&quot;TX157907838689377964651674089851855413607&quot;) 2016-12-02 03:00:32,353 WARN com.atmire.metadataquality.batchedit.BatchEditConsumer @ BatchEditConsumer should not have been given this kind of Subject in an event, skipping: org.dspace.event.Event(eventType=MODIFY_METADATA, SubjectType=BUNDLE, SubjectID=70316, ObjectType=(Unknown), ObjectID=-1, TimeStamp=1480647632309, dispatcher=1544803905, detail=&quot;dc.title&quot;, transactionID=&quot;TX157907838689377964651674089851855413607&quot;) 2016-12-02 03:00:32,353 WARN com.atmire.metadataquality.batchedit.BatchEditConsumer @ BatchEditConsumer should not have been given this kind of Subject in an event, skipping: org.dspace.event.Event(eventType=ADD, SubjectType=ITEM, SubjectID=80044, ObjectType=BUNDLE, ObjectID=70316, TimeStamp=1480647632311, dispatcher=1544803905, detail=&quot;THUMBNAIL&quot;, transactionID=&quot;TX157907838689377964651674089851855413607&quot;) 2016-12-02 03:00:32,353 WARN com.atmire.metadataquality.batchedit.BatchEditConsumer @ BatchEditConsumer should not have been given this kind of Subject in an event, skipping: org.dspace.event.Event(eventType=ADD, SubjectType=BUNDLE, SubjectID=70316, ObjectType=BITSTREAM, ObjectID=86715, TimeStamp=1480647632318, dispatcher=1544803905, detail=&quot;-1&quot;, transactionID=&quot;TX157907838689377964651674089851855413607&quot;) 2016-12-02 03:00:32,353 WARN com.atmire.metadataquality.batchedit.BatchEditConsumer @ BatchEditConsumer should not have been given this kind of Subject in an event, skipping: org.dspace.event.Event(eventType=MODIFY, SubjectType=ITEM, SubjectID=80044, ObjectType=(Unknown), ObjectID=-1, TimeStamp=1480647632351, dispatcher=1544803905, detail=[null], transactionID=&quot;TX157907838689377964651674089851855413607&quot;) </code></pre><ul> <li>I see thousands of them in the logs for the last few months, so it&rsquo;s not related to the DSpace 5.5 upgrade</li> <li>I&rsquo;ve raised a ticket with Atmire to ask</li> <li>Another worrying error from dspace.log is:</li> </ul> November, 2016 https://alanorth.github.io/cgspace-notes/2016-11/ Tue, 01 Nov 2016 09:21:00 +0300 https://alanorth.github.io/cgspace-notes/2016-11/ <h2 id="2016-11-01">2016-11-01</h2> <ul> <li>Add <code>dc.type</code> to the output options for Atmire&rsquo;s Listings and Reports module (<a href="https://github.com/ilri/DSpace/pull/286">#286</a>)</li> </ul> <p><img src="https://alanorth.github.io/cgspace-notes/cgspace-notes/2016/11/listings-and-reports.png" alt="Listings and Reports with output type"></p> October, 2016 https://alanorth.github.io/cgspace-notes/2016-10/ Mon, 03 Oct 2016 15:53:00 +0300 https://alanorth.github.io/cgspace-notes/2016-10/ <h2 id="2016-10-03">2016-10-03</h2> <ul> <li>Testing adding <a href="https://wiki.duraspace.org/display/DSDOC5x/ORCID+Integration#ORCIDIntegration-EditingexistingitemsusingBatchCSVEditing">ORCIDs to a CSV</a> file for a single item to see if the author orders get messed up (see the import sketch below)</li> <li>Need to test the following scenarios to see how author order is affected: <ul> <li>ORCIDs only</li> <li>ORCIDs plus normal authors</li> </ul> </li> <li>I exported a random item&rsquo;s metadata as CSV, deleted <em>all columns</em> except id and collection, and made a new column called <code>ORCID:dc.contributor.author</code> with the following random ORCIDs from the ORCID registry:</li> </ul> <pre><code>0000-0002-6115-0956||0000-0002-3812-8793||0000-0001-7462-405X </code></pre>
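<ul> <li>To test how the author order survives, the edited CSV can be imported back with DSpace&rsquo;s batch metadata editing CLI; a sketch with an illustrative file path (the tool shows the proposed changes and prompts before applying them):</li> </ul> <pre><code>$ [dspace]/bin/dspace metadata-import -f /tmp/orcid-test.csv
</code></pre>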
September, 2016 https://alanorth.github.io/cgspace-notes/2016-09/ Thu, 01 Sep 2016 15:53:00 +0300 https://alanorth.github.io/cgspace-notes/2016-09/ <h2 id="2016-09-01">2016-09-01</h2> <ul> <li>Discuss helping CCAFS with some batch tagging of ORCID IDs for their authors</li> <li>Discuss how the migration of CGIAR&rsquo;s Active Directory to a flat structure will break our LDAP groups in DSpace</li> <li>We had been using <code>DC=ILRI</code> to determine whether a user was ILRI or not</li> <li>It looks like we might be able to use OUs now, instead of DCs:</li> </ul> <pre><code>$ ldapsearch -x -H ldaps://svcgroot2.cgiarad.org:3269/ -b &quot;dc=cgiarad,dc=org&quot; -D &quot;admigration1@cgiarad.org&quot; -W &quot;(sAMAccountName=admigration1)&quot; </code></pre> August, 2016 https://alanorth.github.io/cgspace-notes/2016-08/ Mon, 01 Aug 2016 15:53:00 +0300 https://alanorth.github.io/cgspace-notes/2016-08/ <h2 id="2016-08-01">2016-08-01</h2> <ul> <li>Add updated distribution license from Sisay (<a href="https://github.com/ilri/DSpace/issues/259">#259</a>)</li> <li>Play with upgrading Mirage 2 dependencies in <code>bower.json</code> because most are several versions out of date</li> <li>Bootstrap is at 3.3.0 but upstream is at 3.3.7, and upgrading to anything beyond 3.3.1 breaks glyphicons and probably more</li> <li>bower stuff is a dead end, waste of time, too many issues</li> <li>Anything after Bootstrap 3.3.1 makes glyphicons disappear (HTTP 404 trying to access from incorrect path of <code>fonts</code>)</li> <li>Start working on DSpace 5.1 → 5.5 port:</li> </ul> <pre><code>$ git checkout -b 55new 5_x-prod $ git reset --hard ilri/5_x-prod $ git rebase -i dspace-5.5 </code></pre> July, 2016 https://alanorth.github.io/cgspace-notes/2016-07/ Fri, 01 Jul 2016 10:53:00 +0300 https://alanorth.github.io/cgspace-notes/2016-07/ <h2 id="2016-07-01">2016-07-01</h2> <ul> <li>Add <code>dc.description.sponsorship</code> to Discovery sidebar facets and make investors clickable in item view (<a href="https://github.com/ilri/DSpace/issues/232">#232</a>)</li> <li>I think this query should find and replace all authors that have &ldquo;,&rdquo; at the end of their names:</li> </ul> <pre><code>dspacetest=# update 
metadatavalue set text_value = regexp_replace(text_value, '(^.+?),$', '\1') where metadata_field_id=3 and resource_type_id=2 and text_value ~ '^.+?,$'; UPDATE 95 dspacetest=# select text_value from metadatavalue where metadata_field_id=3 and resource_type_id=2 and text_value ~ '^.+?,$'; text_value ------------ (0 rows) </code></pre><ul> <li>In this case the select query was showing 95 results before the update</li> </ul> June, 2016 https://alanorth.github.io/cgspace-notes/2016-06/ Wed, 01 Jun 2016 10:53:00 +0300 https://alanorth.github.io/cgspace-notes/2016-06/ <h2 id="2016-06-01">2016-06-01</h2> <ul> <li>Experimenting with IFPRI OAI (we want to harvest their publications)</li> <li>After reading the <a href="https://www.oclc.org/support/services/contentdm/help/server-admin-help/oai-support.en.html">ContentDM documentation</a> I found IFPRI&rsquo;s OAI endpoint: <a href="http://ebrary.ifpri.org/oai/oai.php">http://ebrary.ifpri.org/oai/oai.php</a></li> <li>After reading the <a href="https://www.openarchives.org/OAI/openarchivesprotocol.html">OAI documentation</a> and testing with an <a href="http://validator.oaipmh.com/">OAI validator</a> I found out how to get their publications</li> <li>This is their publications set: <a href="http://ebrary.ifpri.org/oai/oai.php?verb=ListRecords&amp;from=2016-01-01&amp;set=p15738coll2&amp;metadataPrefix=oai_dc">http://ebrary.ifpri.org/oai/oai.php?verb=ListRecords&amp;from=2016-01-01&amp;set=p15738coll2&amp;metadataPrefix=oai_dc</a></li> <li>You can see the others by using the OAI <code>ListSets</code> verb: <a href="http://ebrary.ifpri.org/oai/oai.php?verb=ListSets">http://ebrary.ifpri.org/oai/oai.php?verb=ListSets</a></li> <li>Working on second phase of metadata migration, looks like this will work for moving CPWF-specific data in <code>dc.identifier.fund</code> to <code>cg.identifier.cpwfproject</code> and then the rest to <code>dc.description.sponsorship</code></li> </ul> May, 2016 https://alanorth.github.io/cgspace-notes/2016-05/ Sun, 01 May 2016 23:06:00 +0300 https://alanorth.github.io/cgspace-notes/2016-05/ <h2 id="2016-05-01">2016-05-01</h2> <ul> <li>Since yesterday there have been 10,000 REST errors and the site has been unstable again</li> <li>I have blocked access to the API now</li> <li>There are 3,000 IPs accessing the REST API in a 24-hour period!</li> </ul> <pre><code># awk '{print $1}' /var/log/nginx/rest.log | uniq | wc -l 3168 </code></pre> April, 2016 https://alanorth.github.io/cgspace-notes/2016-04/ Mon, 04 Apr 2016 11:06:00 +0300 https://alanorth.github.io/cgspace-notes/2016-04/ <h2 id="2016-04-04">2016-04-04</h2> <ul> <li>Looking at log file use on CGSpace and notice that we need to work on our cron setup a bit</li> <li>We are backing up all logs in the log folder, including useless stuff like solr, cocoon, handle-plugin, etc</li> <li>After running DSpace for over five years I&rsquo;ve never needed to look in any other log file than dspace.log, let alone one from last year!</li> <li>This will save us a few gigs of backup space we&rsquo;re paying for on S3</li> <li>Also, I noticed the <code>checker</code> log has some errors we should pay attention to:</li> </ul> March, 2016 https://alanorth.github.io/cgspace-notes/2016-03/ Wed, 02 Mar 2016 16:50:00 +0300 https://alanorth.github.io/cgspace-notes/2016-03/ <h2 id="2016-03-02">2016-03-02</h2> <ul> <li>Looking at issues with author authorities on CGSpace</li> <li>For some reason we still have the <code>index-lucene-update</code> cron job active on CGSpace, but I&rsquo;m pretty sure we 
don&rsquo;t need it as of the latest few versions of Atmire&rsquo;s Listings and Reports module</li> <li>Reinstall my local (Mac OS X) DSpace stack with Tomcat 7, PostgreSQL 9.3, and Java JDK 1.7 to match environment on CGSpace server</li> </ul> February, 2016 https://alanorth.github.io/cgspace-notes/2016-02/ Fri, 05 Feb 2016 13:18:00 +0300 https://alanorth.github.io/cgspace-notes/2016-02/ <h2 id="2016-02-05">2016-02-05</h2> <ul> <li>Looking at some DAGRIS data for Abenet Yabowork</li> <li>Lots of issues with spaces, newlines, etc causing the import to fail</li> <li>I noticed we have a very <em>interesting</em> list of countries on CGSpace:</li> </ul> <p><img src="https://alanorth.github.io/cgspace-notes/cgspace-notes/2016/02/cgspace-countries.png" alt="CGSpace country list"></p> <ul> <li>Not only are there 49,000 countries, we have some blanks (25)&hellip;</li> <li>Also, lots of things like &ldquo;COTE D`LVOIRE&rdquo; and &ldquo;COTE D IVOIRE&rdquo;</li> </ul> January, 2016 https://alanorth.github.io/cgspace-notes/2016-01/ Wed, 13 Jan 2016 13:18:00 +0300 https://alanorth.github.io/cgspace-notes/2016-01/ <h2 id="2016-01-13">2016-01-13</h2> <ul> <li>Move ILRI collection <code>10568/12503</code> from <code>10568/27869</code> to <code>10568/27629</code> using the <a href="https://gist.github.com/alanorth/392c4660e8b022d99dfa">move_collections.sh</a> script I wrote last year.</li> <li>I realized it is only necessary to clear the Cocoon cache after moving collections—rather than reindexing—as no metadata has changed, and therefore no search or browse indexes need to be updated.</li> <li>Update GitHub wiki for documentation of <a href="https://github.com/ilri/DSpace/wiki/Maintenance-Tasks">maintenance tasks</a>.</li> </ul> December, 2015 https://alanorth.github.io/cgspace-notes/2015-12/ Wed, 02 Dec 2015 13:18:00 +0300 https://alanorth.github.io/cgspace-notes/2015-12/ <h2 id="2015-12-02">2015-12-02</h2> <ul> <li>Replace <code>lzop</code> with <code>xz</code> in log compression cron jobs on DSpace Test—it uses less space:</li> </ul> <pre><code># cd /home/dspacetest.cgiar.org/log # ls -lh dspace.log.2015-11-18* -rw-rw-r-- 1 tomcat7 tomcat7 2.0M Nov 18 23:59 dspace.log.2015-11-18 -rw-rw-r-- 1 tomcat7 tomcat7 387K Nov 18 23:59 dspace.log.2015-11-18.lzo -rw-rw-r-- 1 tomcat7 tomcat7 169K Nov 18 23:59 dspace.log.2015-11-18.xz </code></pre> November, 2015 https://alanorth.github.io/cgspace-notes/2015-11/ Mon, 23 Nov 2015 17:00:57 +0300 https://alanorth.github.io/cgspace-notes/2015-11/ <h2 id="2015-11-22">2015-11-22</h2> <ul> <li>CGSpace went down</li> <li>Looks like DSpace exhausted its PostgreSQL connection pool</li> <li>Last week I had increased the limit from 30 to 60, which seemed to help, but now there are many more idle connections:</li> </ul> <pre><code>$ psql -c 'SELECT * from pg_stat_activity;' | grep idle | grep -c cgspace 78 </code></pre>
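<ul> <li>A slightly more informative version of that check groups the connections by database, user, and state instead of grepping; a sketch (<code>pg_stat_activity</code> has the <code>state</code> column on PostgreSQL 9.2 and newer):</li> </ul> <pre><code>$ psql -c 'SELECT datname, usename, state, count(*) FROM pg_stat_activity GROUP BY 1, 2, 3 ORDER BY 4 DESC;'
</code></pre>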