<!DOCTYPE html>
<html lang="en" >
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
<meta property="og:title" content="March, 2020" />
<meta property="og:description" content="2020-03-02
Update dspace-statistics-api for DSpace 6&#43; UUIDs
Tag version 1.2.0 on GitHub
Test migrating legacy Solr statistics to UUIDs with the as-of-yet unreleased SolrUpgradePre6xStatistics.java
You need to download this into the DSpace 6.x source and compile it
" />
<meta property="og:type" content="article" />
<meta property="og:url" content="https://alanorth.github.io/cgspace-notes/2020-03/" />
<meta property="article:published_time" content="2020-03-02T12:31:30+02:00" />
<meta property="article:modified_time" content="2020-04-02T12:33:41+03:00" />
<meta name="twitter:card" content="summary"/>
<meta name="twitter:title" content="March, 2020"/>
<meta name="twitter:description" content="2020-03-02
Update dspace-statistics-api for DSpace 6&#43; UUIDs
Tag version 1.2.0 on GitHub
Test migrating legacy Solr statistics to UUIDs with the as-of-yet unreleased SolrUpgradePre6xStatistics.java
You need to download this into the DSpace 6.x source and compile it
"/>
<meta name="generator" content="Hugo 0.133.1">
<script type="application/ld+json">
{
"@context": "http://schema.org",
"@type": "BlogPosting",
"headline": "March, 2020",
"url": "https://alanorth.github.io/cgspace-notes/2020-03/",
"wordCount": "2001",
"datePublished": "2020-03-02T12:31:30+02:00",
"dateModified": "2020-04-02T12:33:41+03:00",
"author": {
"@type": "Person",
"name": "Alan Orth"
},
"keywords": "Notes"
}
</script>
<link rel="canonical" href="https://alanorth.github.io/cgspace-notes/2020-03/">
<title>March, 2020 | CGSpace Notes</title>
<!-- combined, minified CSS -->
<link href="https://alanorth.github.io/cgspace-notes/css/style.c6ba80bc50669557645abe05f86b73cc5af84408ed20f1551a267bc19ece8228.css" rel="stylesheet" integrity="sha256-xrqAvFBmlVdkWr4F&#43;GtzzFr4RAjtIPFVGiZ7wZ7Ogig=" crossorigin="anonymous">
<!-- minified Font Awesome for SVG icons -->
<script defer src="https://alanorth.github.io/cgspace-notes/js/fontawesome.min.f5072c55a0721857184db93a50561d7dc13975b4de2e19db7f81eb5f3fa57270.js" integrity="sha256-9QcsVaByGFcYTbk6UFYdfcE5dbTeLhnbf4HrXz&#43;lcnA=" crossorigin="anonymous"></script>
<!-- RSS 2.0 feed -->
</head>
<body>
<div class="blog-masthead">
<div class="container">
<nav class="nav blog-nav">
<a class="nav-link " href="https://alanorth.github.io/cgspace-notes/">Home</a>
</nav>
</div>
</div>
<header class="blog-header">
<div class="container">
<h1 class="blog-title" dir="auto"><a href="https://alanorth.github.io/cgspace-notes/" rel="home">CGSpace Notes</a></h1>
<p class="lead blog-description" dir="auto">Documenting day-to-day work on the <a href="https://cgspace.cgiar.org">CGSpace</a> repository.</p>
</div>
</header>
<div class="container">
<div class="row">
<div class="col-sm-8 blog-main">
<article class="blog-post">
<header>
<h2 class="blog-post-title" dir="auto"><a href="https://alanorth.github.io/cgspace-notes/2020-03/">March, 2020</a></h2>
<p class="blog-post-meta">
<time datetime="2020-03-02T12:31:30+02:00">Mon Mar 02, 2020</time>
in
<span class="fas fa-folder" aria-hidden="true"></span>&nbsp;<a href="/categories/notes/" rel="category tag">Notes</a>
</p>
</header>
<h2 id="2020-03-02">2020-03-02</h2>
<ul>
<li>Update <a href="https://github.com/ilri/dspace-statistics-api">dspace-statistics-api</a> for DSpace 6+ UUIDs
<ul>
<li>Tag version 1.2.0 on GitHub</li>
</ul>
</li>
<li>Test migrating legacy Solr statistics to UUIDs with the as-of-yet unreleased <a href="https://github.com/DSpace/DSpace/commit/184f2b2153479045fba6239342c63e7f8564b8b6#diff-0350ce2e13b28d5d61252b7a8f50a059">SolrUpgradePre6xStatistics.java</a>
<ul>
<li>You need to download this into the DSpace 6.x source and compile it</li>
</ul>
</li>
</ul>
<pre tabindex="0"><code>$ export JAVA_OPTS=&#34;-Xmx1024m -Dfile.encoding=UTF-8&#34;
$ ~/dspace63/bin/dspace solr-upgrade-statistics-6x
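# (sketch) once the migration finishes, a query like this should show zero remaining
# legacy (non-UUID) records, assuming UUIDs are the only ids containing dashes:
$ curl -s &#39;http://localhost:8081/solr/statistics/select?rows=0&amp;wt=json&amp;indent=true&#39; --data-urlencode &#39;q=(*:* NOT id:/.+-.+/)&#39; | grep numFound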
</code></pre><h2 id="2020-03-03">2020-03-03</h2>
<ul>
<li>Skype with Peter and Abenet to discuss the CG Core survey
<ul>
<li>We also discussed some other CGSpace issues</li>
</ul>
</li>
</ul>
<h2 id="2020-03-04">2020-03-04</h2>
<ul>
<li>Abenet asked me to add some new ILRI subjects to CGSpace
<ul>
<li>I <a href="https://github.com/ilri/DSpace/commit/b51a242e773bd8658d3cab4ac883975708b00386">updated the input-forms.xml</a> in our <code>5_x-prod</code> branch on GitHub</li>
<li>Abenet said we are changing <code>HEALTH</code> to <code>HUMAN HEALTH</code> so I need to fix those using my <code>fix-metadata-values.py</code> script:</li>
</ul>
</li>
</ul>
<pre tabindex="0"><code>$ ./fix-metadata-values.py -i 2020-03-04-fix-1-ilri-subject.csv -db dspace -u dspace -p &#39;fuuu&#39; -f cg.subject.ilri -m 203 -t correct -d
</code></pre><ul>
<li>But I have not run it on CGSpace yet because we want to ask Peter if he is sure about it&hellip;</li>
<li>Send a message to Macaroni Bros to ask them about their Drupal module and its readiness for DSpace 6 UUIDs</li>
</ul>
<h2 id="2020-03-05">2020-03-05</h2>
<ul>
<li>I found a very <a href="https://lucene.apache.org/solr/guide/8_1/solr-system-requirements.html#lucene-solr-prior-to-7-0">interesting comment on the Solr 8.1 guide</a> about Java compatibility:</li>
</ul>
<blockquote>
<p>Lucene/Solr 7.0 was the first version that successfully passed our tests using Java 9 and higher. You should avoid Java 9 or later for Lucene/Solr 6.x or earlier.</p>
</blockquote>
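<p>In practice that means we need to stay on Java 8 for as long as DSpace bundles Solr 4.10, so it&rsquo;s worth a quick sanity check of what the servers are actually running, for example:</p>
<pre tabindex="0"><code>$ java -version
</code></pre>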
<h2 id="2020-03-08">2020-03-08</h2>
<ul>
<li>I want to try to consolidate our yearly Solr statistics cores back into one <code>statistics</code> core using the solr-import-export-json tool</li>
<li>I will try it on DSpace test, doing one year at a time:</li>
</ul>
<pre tabindex="0"><code>$ ./run.sh -s http://localhost:8081/solr/statistics-2010 -a export -o /tmp/statistics-2010.json -k uid
$ ./run.sh -s http://localhost:8081/solr/statistics -a import -o /tmp/statistics-2010.json -k uid
$ curl -s &#34;http://localhost:8081/solr/statistics-2010/update?softCommit=true&#34; -H &#34;Content-Type: text/xml&#34; --data-binary &#34;&lt;delete&gt;&lt;query&gt;time:2010*&lt;/query&gt;&lt;/delete&gt;&#34;
$ ./run.sh -s http://localhost:8081/solr/statistics-2011 -a export -o /tmp/statistics-2011.json -k uid
$ ./run.sh -s http://localhost:8081/solr/statistics -a import -o /tmp/statistics-2011.json -k uid
$ curl -s &#34;http://localhost:8081/solr/statistics-2011/update?softCommit=true&#34; -H &#34;Content-Type: text/xml&#34; --data-binary &#34;&lt;delete&gt;&lt;query&gt;time:2011*&lt;/query&gt;&lt;/delete&gt;&#34;
$ ./run.sh -s http://localhost:8081/solr/statistics -a import -o /tmp/statistics-2012.json -k uid
$ curl -s &#39;http://localhost:8081/solr/statistics/select?q=time:2012*&amp;rows=0&amp;wt=json&amp;indent=true&#39; | grep numFound
&#34;response&#34;:{&#34;numFound&#34;:3761989,&#34;start&#34;:0,&#34;docs&#34;:[]
$ curl -s &#39;http://localhost:8081/solr/statistics-2012/select?q=time:2012*&amp;rows=0&amp;wt=json&amp;indent=true&#39; | grep numFound
&#34;response&#34;:{&#34;numFound&#34;:3761989,&#34;start&#34;:0,&#34;docs&#34;:[]
$ curl -s &#34;http://localhost:8081/solr/statistics-2012/update?softCommit=true&#34; -H &#34;Content-Type: text/xml&#34; --data-binary &#34;&lt;delete&gt;&lt;query&gt;time:2012*&lt;/query&gt;&lt;/delete&gt;&#34;
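# (sketch) the remaining yearly cores could be handled with a similar loop, following the
# same export/import/delete pattern, for as many years as disk space allows:
$ for year in 2013 2014 2015; do
    ./run.sh -s &#34;http://localhost:8081/solr/statistics-$year&#34; -a export -o &#34;/tmp/statistics-$year.json&#34; -k uid
    ./run.sh -s http://localhost:8081/solr/statistics -a import -o &#34;/tmp/statistics-$year.json&#34; -k uid
    curl -s &#34;http://localhost:8081/solr/statistics-$year/update?softCommit=true&#34; -H &#34;Content-Type: text/xml&#34; --data-binary &#34;&lt;delete&gt;&lt;query&gt;time:$year*&lt;/query&gt;&lt;/delete&gt;&#34;
  done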
</code></pre><ul>
<li>I will do this for as many cores as I can (disk space is limited) and then monitor the effect on the system and JVM memory usage
<ul>
<li>Exporting half years might work, using a filter query with months as a regular expression:</li>
</ul>
</li>
</ul>
<pre tabindex="0"><code>$ ./run.sh -s http://localhost:8081/solr/statistics-2014 -a export -o /tmp/statistics-2014-1.json -k uid -f &#39;time:/2014-0[1-6].*/&#39;
</code></pre><ul>
<li>Upgrade PostgreSQL from 9.6 to 10 on DSpace Test (linode19)
<ul>
<li>I&rsquo;ve been running it for one month in my local environment, and others have reported on the dspace-tech mailing list that they are using 10 and 11</li>
</ul>
</li>
</ul>
<pre tabindex="0"><code># apt install postgresql-10 postgresql-contrib-10
# systemctl stop tomcat7
# pg_ctlcluster 9.6 main stop
# tar -cvzpf var-lib-postgresql-9.6.tar.gz /var/lib/postgresql/9.6
# tar -cvzpf etc-postgresql-9.6.tar.gz /etc/postgresql/9.6
# pg_ctlcluster 10 main stop
# pg_dropcluster 10 main
# pg_upgradecluster 9.6 main
# pg_dropcluster 9.6 main
# dpkg -l | grep postgresql | grep 9.6 | awk &#39;{print $2}&#39; | xargs dpkg -r
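# (sketch) afterwards, verify that only the new 10/main cluster is online and re-analyze:
# pg_lsclusters
# sudo -u postgres vacuumdb --all --analyze-in-stages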
</code></pre><h2 id="2020-03-09">2020-03-09</h2>
<ul>
<li>Peter noticed that the Solr stats were not showing anything before 2020
<ul>
<li>I had to restart Tomcat three times before all cores loaded properly&hellip;</li>
</ul>
</li>
</ul>
<h2 id="2020-03-10">2020-03-10</h2>
<ul>
<li>Fix some logic issues in the nginx config
<ul>
<li>Use generic blocking of <code>[Bb]ot</code> and <code>[Cc]rawl</code> and <code>[Ss]pider</code> in the &ldquo;badbots&rdquo; rate limiting logic instead of trying to list them all one by one (bots should not be trying to index dynamic pages <em>no matter what</em> so we punish hard here)</li>
<li>We were not properly forwarding the remote IP address to Tomcat in all nginx location blocks, which caused requests in some locations to be logged as coming from 127.0.0.1 (we need to explicitly re-add the global proxy params whenever we set other headers in a location block)</li>
<li>Unfortunately this affected the REST API and there are a few hundred thousand requests from this user agent:</li>
</ul>
</li>
</ul>
<pre tabindex="0"><code>Mozilla/5.0 (Windows; U; Windows NT 6.1; en-GB; rv:1.9.2.13) Gecko/20101203 Firefox/3.6.13 (.NET CLR 3.5.30729)
</code></pre><ul>
<li>It seems to only be a problem in the last week:</li>
</ul>
<pre tabindex="0"><code># zgrep -c 64.225.40.66 /var/log/nginx/rest.log.{1..9}
/var/log/nginx/rest.log.1:0
/var/log/nginx/rest.log.2:0
/var/log/nginx/rest.log.3:0
/var/log/nginx/rest.log.4:3625
/var/log/nginx/rest.log.5:27458
/var/log/nginx/rest.log.6:0
/var/log/nginx/rest.log.7:0
/var/log/nginx/rest.log.8:0
/var/log/nginx/rest.log.9:0
</code></pre><ul>
<li>In Solr the IP is 127.0.0.1, but in the nginx logs I can luckily see the real IP (64.225.40.66), which is on Digital Ocean</li>
<li>I will purge them from Solr statistics:</li>
</ul>
<pre tabindex="0"><code>$ curl -s &#34;http://localhost:8081/solr/statistics/update?softCommit=true&#34; -H &#34;Content-Type: text/xml&#34; --data-binary &#39;&lt;delete&gt;&lt;query&gt;userAgent:&#34;Mozilla/5.0 (Windows; U; Windows NT 6.1; en-GB; rv:1.9.2.13) Gecko/20101203 Firefox/3.6.13 (.NET CLR 3.5.30729)&#34;&lt;/query&gt;&lt;/delete&gt;&#39;
</code></pre><ul>
<li>Another user agent that seems to be a bot is:</li>
</ul>
<pre tabindex="0"><code>Mozilla/5.0 ((Windows; U; Windows NT 6.1; fr; rv:1.9.2) Gecko/20100115 Firefox/3.6)
</code></pre><ul>
<li>In Solr the IP is 127.0.0.1 because of the misconfiguration, but in nginx&rsquo;s logs I see it belongs to three IPs on Online.net in France:</li>
</ul>
<pre tabindex="0"><code># zcat /var/log/nginx/access.log.*.gz /var/log/nginx/rest.log.*.gz | grep &#39;Mozilla/5.0 ((Windows; U; Windows NT 6.1; fr; rv:1.9.2) Gecko/20100115 Firefox/3.6)&#39; | awk &#39;{print $1}&#39; | sort | uniq -c
63090 163.172.68.99
183428 163.172.70.248
147608 163.172.71.24
</code></pre><ul>
<li>It is making 10,000 to 40,000 requests to XMLUI per day&hellip;</li>
</ul>
<pre tabindex="0"><code># zgrep -c &#39;Mozilla/5.0 ((Windows; U; Windows NT 6.1; fr; rv:1.9.2) Gecko/20100115 Firefox/3.6)&#39; /var/log/nginx/access.log.{1..9}
/var/log/nginx/access.log.30.gz:18687
/var/log/nginx/access.log.31.gz:28936
/var/log/nginx/access.log.32.gz:36402
/var/log/nginx/access.log.33.gz:38886
/var/log/nginx/access.log.34.gz:30607
/var/log/nginx/access.log.35.gz:19040
/var/log/nginx/access.log.36.gz:10780
/var/log/nginx/access.log.37.gz:5808
/var/log/nginx/access.log.38.gz:3100
/var/log/nginx/access.log.39.gz:1485
/var/log/nginx/access.log.3.gz:2898
/var/log/nginx/access.log.40.gz:373
/var/log/nginx/access.log.41.gz:3909
/var/log/nginx/access.log.42.gz:4729
/var/log/nginx/access.log.43.gz:3906
</code></pre><ul>
<li>I will purge those hits too!</li>
</ul>
<pre tabindex="0"><code>$ curl -s &#34;http://localhost:8081/solr/statistics/update?softCommit=true&#34; -H &#34;Content-Type: text/xml&#34; --data-binary &#39;&lt;delete&gt;&lt;query&gt;userAgent:&#34;Mozilla/5.0 ((Windows; U; Windows NT 6.1; fr; rv:1.9.2) Gecko/20100115 Firefox/3.6)&#34;&lt;/query&gt;&lt;/delete&gt;&#39;
</code></pre><ul>
<li>Shit, somehow a few thousand hits from user agents with &ldquo;Bot&rdquo; in their user agent got through
<ul>
<li>I need to re-run the <code>check-spider-hits.sh</code> script with the standard COUNTER-Robots list again, but add my own versions of a few patterns because the script/Solr doesn&rsquo;t support case-insensitive regular expressions:</li>
</ul>
</li>
</ul>
<pre tabindex="0"><code>$ ./check-spider-hits.sh -f /tmp/bots -d -p
(DEBUG) Using spiders pattern file: /tmp/bots
(DEBUG) Checking for hits from spider: Citoid
Purging 11 hits from Citoid in statistics
(DEBUG) Checking for hits from spider: ecointernet
Purging 375 hits from ecointernet in statistics
(DEBUG) Checking for hits from spider: ^Pattern\/[0-9]
Purging 1 hits from ^Pattern\/[0-9] in statistics
(DEBUG) Checking for hits from spider: sqlmap
(DEBUG) Checking for hits from spider: Typhoeus
Purging 6 hits from Typhoeus in statistics
(DEBUG) Checking for hits from spider: 7siters
(DEBUG) Checking for hits from spider: Apache-HttpClient
Purging 3178 hits from Apache-HttpClient in statistics
Total number of bot hits purged: 3571
$ ./check-spider-hits.sh -f /tmp/bots -d -p
(DEBUG) Using spiders pattern file: /tmp/bots
(DEBUG) Checking for hits from spider: [Bb]ot
Purging 8317 hits from [Bb]ot in statistics
(DEBUG) Checking for hits from spider: [Cc]rawl
Purging 1314 hits from [Cc]rawl in statistics
(DEBUG) Checking for hits from spider: [Ss]pider
Purging 62 hits from [Ss]pider in statistics
(DEBUG) Checking for hits from spider: Citoid
(DEBUG) Checking for hits from spider: ecointernet
(DEBUG) Checking for hits from spider: ^Pattern\/[0-9]
(DEBUG) Checking for hits from spider: sqlmap
(DEBUG) Checking for hits from spider: Typhoeus
(DEBUG) Checking for hits from spider: 7siters
(DEBUG) Checking for hits from spider: Apache-HttpClient
</code></pre><h2 id="2020-03-11">2020-03-11</h2>
<ul>
<li>Ask Michael Victor for permission to create a new Linode server for DSpace Test</li>
</ul>
<h2 id="2020-3-12">2020-3-12</h2>
<ul>
<li>I&rsquo;m finally working on the 170 IITA records from January on <a href="https://dspacetest.cgiar.org/handle/10568/106567">DSpace Test</a>
<ul>
<li>It&rsquo;s been two months since I last looked and I want to do a thorough check to make sure Bosede didn&rsquo;t introduce any new issues, but first I want to consolidate all the text languages for these records so it&rsquo;s easier to check them in OpenRefine</li>
<li>First I got a list of the item IDs with <code>csvcut</code> and then I updated the text languages for only those records:</li>
</ul>
</li>
</ul>
<pre tabindex="0"><code>dspace=# SELECT DISTINCT text_lang, COUNT(*) FROM metadatavalue WHERE resource_type_id=2 AND resource_id in (111295,111294,111293,111292,111291,111290,111288,111286,111285,111284,111283,111282,111281,111280,111279,111278,111277,111276,111275,111274,111273,111272,111271,111270,111269,111268,111267,111266,111265,111264,111263,111262,111261,111260,111259,111258,111257,111256,111255,111254,111253,111252,111251,111250,111249,111248,111247,111246,111245,111244,111243,111242,111241,111240,111238,111237,111236,111235,111234,111233,111232,111231,111230,111229,111228,111227,111226,111225,111224,111223,111222,111221,111220,111219,111218,111217,111216,111215,111214,111213,111212,111211,111209,111208,111207,111206,111205,111204,111203,111202,111201,111200,111199,111198,111197,111196,111195,111194,111193,111192,111191,111190,111189,111188,111187,111186,111185,111184,111183,111182,111181,111180,111179,111178,111177,111176,111175,111174,111173,111172,111171,111170,111169,111168,111299,111298,111297,111296,111167,111166,111165,111164,111163,111162,111161,111160,111159,111158,111157,111156,111155,111154,111153,111152,111151,111150,111149,111148,111147,111146,111145,111144,111143,111142,111141,111140,111139,111138,111137,111136,111135,111134,111133,111132,111131,111129,111128,111127,111126,111125) GROUP BY text_lang ORDER BY count;
</code></pre><ul>
<li>Then I exported the metadata from DSpace Test and imported it into OpenRefine
<ul>
<li>I corrected one invalid AGROVOC subject using my <code>csv-metadata-quality</code> script</li>
</ul>
</li>
<li>I exported a new list of affiliations from the database, added line numbers with <code>csvcut</code>, and then validated them in OpenRefine using <code>reconcile-csv</code>:</li>
</ul>
<pre tabindex="0"><code>dspace=# \COPY (SELECT DISTINCT text_value, count(*) FROM metadatavalue WHERE resource_type_id = 2 AND metadata_field_id = 211 GROUP BY text_value ORDER BY count DESC LIMIT 1500) to /tmp/2020-03-12-affiliations.csv WITH CSV HEADER;`
dspace=# \q
$ csvcut -l -c 0 /tmp/2020-03-12-affiliations.csv | sed -e &#39;s/^line_number/id/&#39; -e &#39;s/text_value/name/&#39; &gt; /tmp/affiliations.csv
$ lein run /tmp/affiliations.csv name id
</code></pre><ul>
<li>I always forget how to copy the reconciled values in OpenRefine, but you need to make a new column and populate it using this GREL: <code>if(cell.recon.matched, cell.recon.match.name, value)</code></li>
<li>I mapped all 170 items to their appropriate collections based on type and uploaded them to CGSpace</li>
</ul>
<h2 id="2020-03-16">2020-03-16</h2>
<ul>
<li>I&rsquo;m looking at the CPU usage of CGSpace (linode18) over the past year and I see we <em>rarely</em> even go over two CPUs on average sustained usage:</li>
</ul>
<p><img src="/cgspace-notes/2020/03/cgspace-cpu-year.png" alt="linode18 CPU usage year"></p>
<ul>
<li>Also clearly visible is the effect of CPU steal in 2019-03</li>
</ul>
<p><img src="/cgspace-notes/2020/03/cgspace-memory-year.png" alt="linode18 RAM usage year"></p>
<p><img src="/cgspace-notes/2020/03/cgspace-heap-year.png" alt="linode18 JVM heap usage year"></p>
<ul>
<li>At most we have committed 10GB of RAM; the rest is used opportunistically by the filesystem cache, likely for Solr
<ul>
<li>There was a huge drop in 2019-07 when I changed the JVM settings</li>
<li>I think we should re-evaluate our deployment and perhaps target a different instance type and add block storage for assetstore (as we determined Linode&rsquo;s block storage to be too slow for Solr)</li>
</ul>
</li>
</ul>
<h2 id="2020-03-17">2020-03-17</h2>
<ul>
<li>Update the PostgreSQL JDBC driver to version 42.2.11</li>
<li>Maria from Bioversity asked me to add a new field for the combined subjects of Bioversity and CIAT, since they merged recently
<ul>
<li>We will use <code>cg.subject.alliancebiovciat</code></li>
</ul>
</li>
</ul>
<h2 id="2020-03-18">2020-03-18</h2>
<ul>
<li>Provision new Linode server (linode26) for DSpace Test to replace the current linode19 server</li>
<li>Improve the DSpace role in our <a href="https://github.com/ilri/rmg-ansible-public">Ansible infrastructure playbooks</a>
<ul>
<li>We should install npm packages in the DSpace user&rsquo;s home directory instead of globally as root</li>
</ul>
</li>
</ul>
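<p>The general idea (just a sketch here; the real change lives in the Ansible role) is to give npm a prefix inside the DSpace user&rsquo;s home so the build&rsquo;s npm dependencies (bower, grunt-cli, etc.) no longer need to be installed globally as root:</p>
<pre tabindex="0"><code>$ npm config set prefix ~/.npm-global
$ export PATH=$HOME/.npm-global/bin:$PATH
$ npm install -g bower grunt-cli
</code></pre>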
<h2 id="2020-03-19">2020-03-19</h2>
<ul>
<li>Finalized migration of DSpace Test to linode26 and removed linode19</li>
</ul>
<h2 id="2020-03-22">2020-03-22</h2>
<ul>
<li>Look over the AReS ToRs sent by Enrico and Moayad and add a few notes about missing GitHub issues
<ul>
<li>Hopefully now they can start working on the development!</li>
</ul>
</li>
</ul>
<h2 id="2020-03-24">2020-03-24</h2>
<ul>
<li>Skype meeting about CGSpace with Peter and Abenet</li>
</ul>
<h2 id="2020-03-25">2020-03-25</h2>
<ul>
<li>I sent Atmire a message to ask if they managed to start working on the DSpace 6 port, as the last communication was twenty-six days ago when they said they were going to secure technical resources to do so</li>
<li>Start adapting the <code>dspace</code> role in our <a href="https://github.com/ilri/rmg-ansible-public">Ansible infrastructure playbooks</a> for DSpace 6 support</li>
</ul>
<h2 id="2020-03-26">2020-03-26</h2>
<ul>
<li>More work adapting the <code>dspace</code> role in our Ansible infrastructure scripts to DSpace 6</li>
<li>Update Tomcat to version 7.0.103 in the Ansible infrastructure playbooks and deploy on DSpace Test (linode26)</li>
<li>Maria sent me a few new ORCID identifiers from Bioversity so I combined them with our existing ones, filtered the unique ones, and then resolved their names using my <code>resolve-orcids.py</code> script:</li>
</ul>
<pre tabindex="0"><code>$ cat ~/src/git/DSpace/dspace/config/controlled-vocabularies/cg-creator-id.xml /tmp/bioversity-orcids | grep -oE &#39;[A-Z0-9]{4}-[A-Z0-9]{4}-[A-Z0-9]{4}-[A-Z0-9]{4}&#39; | sort | uniq &gt; /tmp/2020-03-26-combined-orcids.txt
$ ./resolve-orcids.py -i /tmp/2020-03-26-combined-orcids.txt -o /tmp/2020-03-26-combined-names.txt -d
# sort names, copy to cg-creator-id.xml, add XML formatting, and then format with tidy (preserving accents)
$ tidy -xml -utf8 -iq -m -w 0 dspace/config/controlled-vocabularies/cg-creator-id.xml
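# (sketch) quick sanity check that every identifier resolved to a name:
$ wc -l /tmp/2020-03-26-combined-orcids.txt /tmp/2020-03-26-combined-names.txt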
</code></pre><ul>
<li>I checked the database for likely matches to the author name and then created a CSV with the author names and ORCID iDs:</li>
</ul>
<pre tabindex="0"><code>dc.contributor.author,cg.creator.id
&#34;King, Brian&#34;,&#34;Brian King: 0000-0002-7056-9214&#34;
&#34;Ortiz-Crespo, Berta&#34;,&#34;Berta Ortiz-Crespo: 0000-0002-6664-0815&#34;
&#34;Ekesa, Beatrice&#34;,&#34;Beatrice Ekesa: 0000-0002-2630-258X&#34;
&#34;Ekesa, B.&#34;,&#34;Beatrice Ekesa: 0000-0002-2630-258X&#34;
&#34;Ekesa, B.N.&#34;,&#34;Beatrice Ekesa: 0000-0002-2630-258X&#34;
&#34;Gullotta, G.&#34;,&#34;Gaia Gullotta: 0000-0002-2240-3869&#34;
</code></pre><ul>
<li>Running the <code>add-orcid-identifiers-csv.py</code> script I added 32 ORCID iDs to items on CGSpace!</li>
</ul>
<pre tabindex="0"><code>$ ./add-orcid-identifiers-csv.py -i /tmp/2020-03-26-ciat-orcids.csv -db dspace -u dspace -p &#39;fuuu&#39;
</code></pre><ul>
<li>Udana from IWMI asked about some items that are missing Altmetric donuts on CGSpace
<ul>
<li><a href="https://hdl.handle.net/10568/103225">One of them</a> had a link to the paper on Nature, but was missing a DOI</li>
<li><a href="https://hdl.handle.net/10568/106899">The second item</a> had no donut so I <a href="https://twitter.com/mralanorth/status/1243158045540134913">tweeted its handle</a></li>
<li><a href="https://hdl.handle.net/10568/107258">The third item</a> also had no handle so I <a href="https://twitter.com/mralanorth/status/1243158786392625153">tweeted it</a> as well</li>
</ul>
</li>
<li>Abenet pointed out <a href="https://hdl.handle.net/10568/106573">one item</a> that she had tweeted last week that is missing a donut as well, so I <a href="https://twitter.com/mralanorth/status/1243163710241345536">tweeted it</a> too</li>
</ul>
<h2 id="2020-03-29">2020-03-29</h2>
<ul>
<li>Add two more Bioversity ORCID iDs to CGSpace and then tag ~70 of the authors&rsquo; existing publications in the database using this CSV with my <code>add-orcid-identifiers-csv.py</code> script:</li>
</ul>
<pre tabindex="0"><code>dc.contributor.author,cg.creator.id
&#34;Snook, L.K.&#34;,&#34;Laura Snook: 0000-0002-9168-1301&#34;
&#34;Snook, L.&#34;,&#34;Laura Snook: 0000-0002-9168-1301&#34;
&#34;Zheng, S.J.&#34;,&#34;Sijun Zheng: 0000-0003-1550-3738&#34;
&#34;Zheng, S.&#34;,&#34;Sijun Zheng: 0000-0003-1550-3738&#34;
</code></pre><ul>
<li>Deploy latest Bioversity and CIAT updates on CGSpace (linode18) and DSpace Test (linode26)</li>
<li>Deploy latest Ansible infrastructure playbooks on CGSpace and DSpace Test to get the latest dspace-statistics-api (v1.1.1) and Tomcat (7.0.103) versions</li>
<li>Run system updates on CGSpace and DSpace Test and reboot them
<ul>
<li>After the reboots all the Solr statistics cores came back up on the first try on both servers (yay)</li>
</ul>
</li>
</ul>
<!-- raw HTML omitted -->
</article>
</div> <!-- /.blog-main -->
<aside class="col-sm-3 ml-auto blog-sidebar">
<section class="sidebar-module">
<h4>Recent Posts</h4>
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2024-11/">November, 2024</a></li>
<li><a href="/cgspace-notes/2024-10/">October, 2024</a></li>
<li><a href="/cgspace-notes/2024-09/">September, 2024</a></li>
<li><a href="/cgspace-notes/2024-08/">August, 2024</a></li>
<li><a href="/cgspace-notes/2024-07/">July, 2024</a></li>
</ol>
</section>
<section class="sidebar-module">
<h4>Links</h4>
<ol class="list-unstyled">
<li><a href="https://cgspace.cgiar.org">CGSpace</a></li>
<li><a href="https://dspacetest.cgiar.org">DSpace Test</a></li>
<li><a href="https://github.com/ilri/DSpace">CGSpace @ GitHub</a></li>
</ol>
</section>
</aside>
</div> <!-- /.row -->
</div> <!-- /.container -->
<footer class="blog-footer">
<p dir="auto">
Blog template created by <a href="https://twitter.com/mdo">@mdo</a>, ported to Hugo by <a href='https://twitter.com/mralanorth'>@mralanorth</a>.
</p>
<p>
<a href="#">Back to top</a>
</p>
</footer>
</body>
</html>