<!DOCTYPE html>
<html lang="en" >
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
<meta property="og:title" content="May, 2020" />
<meta property="og:description" content="2020-05-02
Peter said that CTA is having problems submitting an item to CGSpace
Looking at the PostgreSQL stats it seems to be the same issue that Tezira was having last week, as I see the number of connections in ‘idle in transaction’ and ‘waiting for lock’ state are increasing again
I see that CGSpace (linode18) is still using PostgreSQL JDBC driver version 42.2.11, and there were some bugs related to transactions fixed in 42.2.12 (which I had updated in the Ansible playbooks, but not deployed yet)
" />
<meta property="og:type" content="article" />
<meta property="og:url" content="https://alanorth.github.io/cgspace-notes/2020-05/" />
<meta property="article:published_time" content="2020-05-02T09:52:04+03:00" />
<meta property="article:modified_time" content="2020-05-30T18:38:16+03:00" />
<meta name="twitter:card" content="summary"/>
<meta name="twitter:title" content="May, 2020"/>
<meta name="twitter:description" content="2020-05-02
Peter said that CTA is having problems submitting an item to CGSpace
Looking at the PostgreSQL stats it seems to be the same issue that Tezira was having last week, as I see the number of connections in ‘idle in transaction’ and ‘waiting for lock’ state are increasing again
I see that CGSpace (linode18) is still using PostgreSQL JDBC driver version 42.2.11, and there were some bugs related to transactions fixed in 42.2.12 (which I had updated in the Ansible playbooks, but not deployed yet)
"/>
<meta name="generator" content="Hugo 0.71.1" />
<script type="application/ld+json">
{
"@context": "http://schema.org",
"@type": "BlogPosting",
"headline": "May, 2020",
"url": "https://alanorth.github.io/cgspace-notes/2020-05/",
"wordCount": "1861",
"datePublished": "2020-05-02T09:52:04+03:00",
"dateModified": "2020-05-30T18:38:16+03:00",
"author": {
"@type": "Person",
"name": "Alan Orth"
},
"keywords": "Notes"
}
</script>
<link rel="canonical" href="https://alanorth.github.io/cgspace-notes/2020-05/">
<title>May, 2020 | CGSpace Notes</title>
<!-- combined, minified CSS -->
<link href="https://alanorth.github.io/cgspace-notes/css/style.6da5c906cc7a8fbb93f31cd2316c5dbe3f19ac4aa6bfb066f1243045b8f6061e.css" rel="stylesheet" integrity="sha256-baXJBsx6j7uT8xzSMWxdvj8ZrEqmv7Bm8SQwRbj2Bh4=" crossorigin="anonymous">
<!-- minified Font Awesome for SVG icons -->
<script defer src="https://alanorth.github.io/cgspace-notes/js/fontawesome.min.f3d2a1f5980bab30ddd0d8cadbd496475309fc48e2b1d052c5c09e6facffcb0f.js" integrity="sha256-89Kh9ZgLqzDd0NjK29SWR1MJ/EjisdBSxcCeb6z/yw8=" crossorigin="anonymous"></script>
<!-- RSS 2.0 feed -->
</head>
<body>
<div class="blog-masthead">
<div class="container">
<nav class="nav blog-nav">
<a class="nav-link " href="https://alanorth.github.io/cgspace-notes/">Home</a>
</nav>
</div>
</div>
<header class="blog-header">
<div class="container">
<h1 class="blog-title" dir="auto"><a href="https://alanorth.github.io/cgspace-notes/" rel="home">CGSpace Notes</a></h1>
<p class="lead blog-description" dir="auto">Documenting day-to-day work on the <a href="https://cgspace.cgiar.org">CGSpace</a> repository.</p>
</div>
</header>
<div class="container">
<div class="row">
<div class="col-sm-8 blog-main">
<article class="blog-post">
<header>
<h2 class="blog-post-title" dir="auto"><a href="https://alanorth.github.io/cgspace-notes/2020-05/">May, 2020</a></h2>
<p class="blog-post-meta"><time datetime="2020-05-02T09:52:04+03:00">Sat May 02, 2020</time> by Alan Orth in
<span class="fas fa-folder" aria-hidden="true"></span> <a href="/cgspace-notes/categories/notes/" rel="category tag">Notes</a>
</p>
</header>
<h2 id="2020-05-02">2020-05-02</h2>
<ul>
<li>Peter said that CTA is having problems submitting an item to CGSpace
<ul>
<li>Looking at the PostgreSQL stats it seems to be the same issue that Tezira was having last week, as I see the number of connections in ‘idle in transaction’ and ‘waiting for lock’ state are increasing again (see the query sketch below)</li>
<li>I see that CGSpace (linode18) is still using PostgreSQL JDBC driver version 42.2.11, and there were some bugs related to transactions fixed in 42.2.12 (which I had updated in the Ansible playbooks, but not deployed yet)</li>
</ul>
</li>
</ul>
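<p>A quick way to check these connection states directly in PostgreSQL is to group <code>pg_stat_activity</code> by state. This is a minimal sketch only; the database name <code>dspacedb</code> is an assumption and should be adjusted for the real DSpace database:</p>
<pre><code># note: the database name below is an assumption
$ psql -c "SELECT state, count(*) FROM pg_stat_activity WHERE datname = 'dspacedb' GROUP BY state ORDER BY count(*) DESC;"
</code></pre>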
<h2 id="2020-05-03">2020-05-03</h2>
<ul>
<li>Purge a few remaining bots from CGSpace Solr statistics that I had identified a few months ago (see the delete-by-query sketch after this list)
<ul>
<li><code>lua-resty-http/0.10 (Lua) ngx_lua/10000</code></li>
<li><code>omgili/0.5 +http://omgili.com</code></li>
<li><code>IZaBEE/IZaBEE-1.01 (Buzzing Abound The Web; https://izabee.com; info at izabee dot com)</code></li>
<li><code>Twurly v1.1 (https://twurly.org)</code></li>
<li><code>Pattern/2.6 +http://www.clips.ua.ac.be/pattern</code></li>
<li><code>CyotekWebCopy/1.7 CyotekHTTP/2.0</code></li>
</ul>
</li>
<li>This is only about 2,500 hits total from the last ten years, and half of these bots no longer seem to exist, so I won’t bother submitting them to the COUNTER-Robots project</li>
<li>I noticed that our custom themes were incorrectly linking to the OpenSearch XML file
<ul>
<li>The bug <a href="https://jira.lyrasis.org/browse/DS-2592">was fixed</a> for Mirage2 in 2015</li>
<li>Note that this did not prevent OpenSearch itself from working</li>
<li>I will patch this on our DSpace 5.x and 6.x branches</li>
</ul>
</li>
</ul>
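<p>For reference, individual user agents like these can be purged from the statistics core with a plain Solr delete-by-query. This is a generic sketch rather than the script I normally use, and the Solr URL and port are assumptions:</p>
<pre><code># note: the Solr URL and port are assumptions; adjust the userAgent query as needed
$ curl -s "http://localhost:8081/solr/statistics/update?commit=true" -H "Content-Type: text/xml" --data-binary '&lt;delete&gt;&lt;query&gt;userAgent:Twurly*&lt;/query&gt;&lt;/delete&gt;'
</code></pre>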
<h2 id="2020-05-06">2020-05-06</h2>
<ul>
<li>Atmire responded asking for more information about the Solr statistics processing bug in CUA so I sent them some full logs
<ul>
<li>Also, I asked again about the Maven variable interpolation issue for <code>cua.version.number</code>, and if they would be willing to upgrade CUA to use Font Awesome 5 instead of 4.</li>
</ul>
</li>
</ul>
<h2 id="2020-05-07">2020-05-07</h2>
<ul>
<li>Linode sent an alert that there was high CPU usage on CGSpace (linode18) early this morning
<ul>
<li>I looked at the nginx logs using goaccess and I found a few IPs making lots of requests around then:</li>
</ul>
</li>
</ul>
<pre><code># cat /var/log/nginx/*.log /var/log/nginx/*.log.1 | grep -E "07/May/2020:(01|03|04)" | goaccess --log-format=COMBINED -
</code></pre><ul>
<li>The two main IPs making requests around then are 188.134.31.88 and 212.34.8.188
<ul>
<li>The first is in Russia and it is hitting mostly XMLUI Discover links using <em>dozens</em> of different user agents, a total of 20,000 requests this week</li>
<li>The second IP is CodeObia testing AReS, a total of 171,000 hits this month</li>
<li>I will purge both of those IPs from the Solr stats using my <code>check-spider-ip-hits.sh</code> script:</li>
</ul>
</li>
</ul>
<pre><code>$ ./check-spider-ip-hits.sh -f /tmp/ips -s statistics -p
Purging 171641 hits from 212.34.8.188 in statistics
Purging 20691 hits from 188.134.31.88 in statistics

Total number of bot hits purged: 192332
</code></pre><ul>
<li>And then I will add 188.134.31.88 to the nginx bad bot list and tell CodeObia to please use a “bot” user agent</li>
<li>I also changed the nginx config to block requests with blank user agents (a quick test for this is sketched after this list)</li>
</ul>
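<p>To verify the blank user agent block from outside, one can make a request that sends no <code>User-Agent</code> header at all. A minimal sketch; the exact status code returned depends on how the block is configured in nginx:</p>
<pre><code># with the block in place this should print an error status (e.g. 403) rather than 200
$ curl -s -o /dev/null -w "%{http_code}\n" -A "" https://cgspace.cgiar.org/
</code></pre>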
<h2 id="2020-05-11">2020-05-11</h2>
<ul>
<li>Bizu said she was having issues submitting to CGSpace last week
<ul>
<li>The issue sounds like the one Tezira and CTA were having in the last few weeks</li>
<li>I looked at the PostgreSQL graphs and see there are a lot of connections in “idle in transaction” and “waiting for lock” state:</li>
</ul>
</li>
</ul>
<p><img src="/cgspace-notes/2020/05/postgres_connections_cgspace-week.png" alt="PostgreSQL connections"></p>
<ul>
<li>I think I’ll downgrade the PostgreSQL JDBC driver from 42.2.12 to 42.2.10, which was the version we were using before these issues started happening</li>
<li>Atmire sent some feedback about my ongoing issues with their CUA module, but none of it was conclusive yet
<ul>
<li>Regarding Font Awesome 5 they will check how much work it will take and give me a quote</li>
</ul>
</li>
<li>Abenet said some users are questioning why the statistics dropped so much lately, so I made a <a href="https://www.yammer.com/dspacedevelopers/#/Threads/show?threadId=674923030216704">post to Yammer</a> to explain about the robots</li>
<li>Last week Peter had asked me to add a new ILRI author’s ORCID iD
<ul>
<li>I added it to the controlled vocabulary and tagged the user’s existing ~11 items in CGSpace using this CSV file with my <code>add-orcid-identifiers-csv.py</code> script:</li>
</ul>
</li>
</ul>
<pre><code>$ cat 2020-05-11-add-orcids.csv
dc.contributor.author,cg.creator.id
"Lutakome, P.","Pius Lutakome: 0000-0002-0804-2649"
"Lutakome, Pius","Pius Lutakome: 0000-0002-0804-2649"
$ ./add-orcid-identifiers-csv.py -i 2020-05-11-add-orcids.csv -db dspace -u dspace -p 'fuuu' -d
</code></pre><ul>
<li>Run system updates on CGSpace (linode18) and reboot it
<ul>
<li>I had to restart Tomcat five times before all Solr statistics cores came up OK, ugh.</li>
</ul>
</li>
</ul>
<h2 id="2020-05-12">2020-05-12</h2>
<ul>
<li>Peter noticed that CGSpace is no longer on AReS, because I blocked all requests that don’t specify a user agent
<ul>
<li>I’ve temporarily disabled that restriction and asked Moayad to look into how he can specify a user agent in the AReS harvester</li>
</ul>
</li>
</ul>
<h2 id="2020-05-13">2020-05-13</h2>
<ul>
<li>Atmire responded about Font Awesome and said they can switch to version 5 for 16 credits
<ul>
<li>I told them to go ahead</li>
</ul>
</li>
<li>Also, Atmire gave me a small workaround for the <code>cua.version.number</code> interpolation issue and said they would look into the crash that happens when processing our Solr stats</li>
<li>Run system updates and reboot AReS server (linode20) for the first time in almost 100 days
<ul>
<li>I notice that AReS now has some of CGSpace’s data in it (but not all) since I dropped the user-agent restriction on the REST API yesterday</li>
</ul>
</li>
</ul>
<h2 id="2020-05-17">2020-05-17</h2>
<ul>
<li>Create an issue in the OpenRXV project for Moayad to change the default harvester user agent (<a href="https://github.com/ilri/OpenRXV/issues/36">#36</a>)</li>
</ul>
<h2 id="2020-05-18">2020-05-18</h2>
<ul>
<li>Atmire responded and said they still can’t figure out the CUA statistics issue, though they seem to only be trying to understand what’s going on using static analysis
<ul>
<li>I told them that they should try to run the code with the Solr statistics that I shared with them a few weeks ago</li>
</ul>
</li>
</ul>
<h2 id="2020-05-19">2020-05-19</h2>
<ul>
<li>Add ORCID identifier for Sirak Bahta
<ul>
<li>I added it to the controlled vocabulary and tagged the user’s existing ~40 items in CGSpace using this CSV file with my <code>add-orcid-identifiers-csv.py</code> script:</li>
</ul>
</li>
</ul>
<pre><code>$ cat 2020-05-19-add-orcids.csv
dc.contributor.author,cg.creator.id
"Bahta, Sirak T.","Sirak Bahta: 0000-0002-5728-2489"
$ ./add-orcid-identifiers-csv.py -i 2020-05-19-add-orcids.csv -db dspace -u dspace -p 'fuuu' -d
</code></pre><ul>
<li>An IITA user is having issues submitting to CGSpace and I see there is a rising number of PostgreSQL connections in ‘idle in transaction’ and ‘waiting for lock’ state:</li>
</ul>
<p><img src="/cgspace-notes/2020/05/postgres_connections_cgspace-week2.png" alt="PostgreSQL connections"></p>
<ul>
<li>This is the same issue Tezira, Bizu, and CTA were having in the last few weeks, and I already downgraded the PostgreSQL JDBC driver to the last version I was using before this started (42.2.10)
<ul>
<li>I will downgrade it to version 42.2.9 for now…</li>
<li>The only other thing I can think of is that I upgraded Tomcat to 7.0.103 in March</li>
</ul>
</li>
<li>Run system updates on DSpace Test (linode26) and reboot it</li>
<li>Run system updates on CGSpace (linode18) and reboot it
<ul>
<li>After the system came back up I had to restart Tomcat 7 three times before all the Solr statistics cores came up OK</li>
</ul>
</li>
<li>Send Atmire a snapshot of the CGSpace database for them to possibly troubleshoot the CUA issue with DSpace 6</li>
</ul>
<h2 id="2020-05-20">2020-05-20</h2>
<ul>
<li>Send CodeObia some logos and footer text for the next phase of OpenRXV development (<a href="https://github.com/ilri/OpenRXV/issues/18">#18</a>)</li>
</ul>
<h2 id="2020-05-25">2020-05-25</h2>
<ul>
<li>Add ORCID identifier for CIAT author Manuel Francisco
<ul>
<li>I added it to the controlled vocabulary and tagged the user’s existing ~27 items in CGSpace using this CSV file with my <code>add-orcid-identifiers-csv.py</code> script:</li>
</ul>
</li>
</ul>
<pre><code>$ cat 2020-05-25-add-orcids.csv
dc.contributor.author,cg.creator.id
"Díaz, Manuel F.","Manuel Francisco Diaz Baca: 0000-0001-8996-5092"
"Díaz, Manuel Francisco","Manuel Francisco Diaz Baca: 0000-0001-8996-5092"
$ ./add-orcid-identifiers-csv.py -i 2020-05-25-add-orcids.csv -db dspace -u dspace -p 'fuuu' -d
</code></pre><ul>
<li>Last week Maria asked again about searching for items by accession or issue date
<ul>
<li>A few months ago I had told her to search for the ISO8601 date in Discovery search, which appears to work because it filters the results down quite a bit</li>
<li>She pointed out that the results include hits that don’t exactly match, for example if part of the search string appears elsewhere like in the timestamp</li>
<li>I checked in Solr and the results are the same, so perhaps it’s a limitation in Solr…?</li>
<li>So this effectively means that we don’t have a way to create reports for items in an arbitrary date range shorter than a year (an example of the Solr range query we would need is sketched after this list):
<ul>
<li>DSpace advanced search is buggy or simply not designed to work like that</li>
<li>AReS Explorer currently only allows filtering by year, but will allow months soon</li>
<li>Atmire Listings and Reports only allows a “Timespan” of a year</li>
</ul>
</li>
</ul>
</li>
</ul>
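<p>For comparison, the kind of date range query we would actually need looks something like this against the Discovery Solr core. This is only a sketch: the Solr URL, the core name, and the indexed field name (<code>dc.date.accessioned_dt</code>) are assumptions and would need to be checked against the search core’s schema:</p>
<pre><code># note: the Solr URL, core name, and date field name here are assumptions
$ curl -s "http://localhost:8081/solr/search/select" --data-urlencode "q=dc.date.accessioned_dt:[2020-05-01T00:00:00Z TO 2020-05-31T23:59:59Z]" --data-urlencode "rows=0"
</code></pre>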
<h2 id="2020-05-29">2020-05-29</h2>
<ul>
<li>Linode alerted to say that the CPU load on CGSpace (linode18) was high for a few hours this morning
<ul>
<li>Looking at the nginx logs for this morning with goaccess:</li>
</ul>
</li>
</ul>
<pre><code># cat /var/log/nginx/*.log.1 | grep -E "29/May/2020:(02|03|04|05)" | goaccess --log-format=COMBINED -
</code></pre><ul>
<li>The top host is 172.104.229.92, which is the AReS harvester (still not using a user agent, but it’s tagged as a bot in the nginx mapping)</li>
<li>Second is 188.134.31.88, which is a Russian host that we also saw in the last few weeks, using a browser user agent and hitting the XMLUI (but it is tagged as a bot in nginx as well)</li>
<li>Another one is 51.158.106.4, which is some Scaleway IP making requests to XMLUI with different browser user agents that I am pretty sure I have seen before but never blocked
<ul>
<li>According to Solr it has made about 800 requests this year, but still… it’s a bot (see the query sketch after this list).</li>
</ul>
</li>
<li>One I don’t think I’ve seen before is 95.217.58.146, which is making requests to XMLUI with a Drupal user agent
<ul>
<li>According to <a href="https://viewdns.info/reverseip/?host=95.217.58.146&t=1">viewdns.info</a> it belongs to <a href="https://landvoc.org/">landvoc.org</a></li>
<li>I should add Drupal to the list of bots…</li>
</ul>
</li>
<li>Atmire got back to me about the <a href="https://tracker.atmire.com/tickets-cgiar-ilri/view-ticket?id=706">Solr CUA issue in the DSpace 6 upgrade</a> and they cannot reproduce the error
<ul>
<li>The next step is for me to migrate DSpace Test (linode26) to DSpace 6 and try to reproduce the error there</li>
</ul>
</li>
</ul>
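<p>To double check how many hits an IP like 51.158.106.4 has in the current Solr statistics core, one can query it directly; the <code>numFound</code> value in the response is the hit count. A sketch only; the Solr URL and port are assumptions:</p>
<pre><code># note: the Solr URL and port are assumptions
$ curl -s "http://localhost:8081/solr/statistics/select" --data-urlencode "q=ip:51.158.106.4" --data-urlencode "fq=time:[2020-01-01T00:00:00Z TO *]" --data-urlencode "rows=0"
</code></pre>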
<h2 id="2020-05-31">2020-05-31</h2>
<ul>
<li>Start preparing to migrate DSpace Test (linode26) to the <code>6_x-dev-atmire-modules</code> branch
<ul>
<li>Run all system updates and reboot</li>
<li>For now I will disable all yearly Solr statistics cores except the current <code>statistics</code> one</li>
<li>Prepare PostgreSQL with a clean snapshot of CGSpace’s DSpace 5.8 database:</li>
</ul>
</li>
</ul>
<pre><code>$ sudo su - postgres
$ dropdb dspacetest
$ createdb -O dspacetest --encoding=UNICODE dspacetest
$ psql dspacetest -c 'alter user dspacetest superuser;'
$ pg_restore -d dspacetest -O --role=dspacetest /tmp/cgspace_2020-05-31.backup
$ psql dspacetest -c 'alter user dspacetest nosuperuser;'
# run DSpace 5 version of update-sequences.sql!!!
$ psql -f /home/dspace/src/git/DSpace/dspace/etc/postgres/update-sequences.sql dspacetest
$ psql dspacetest -c "DELETE FROM schema_version WHERE version IN ('5.8.2015.12.03.3');"
$ psql dspacetest -c 'CREATE EXTENSION pgcrypto;'
$ exit
</code></pre><ul>
<li>Now switch to the DSpace 6.x branch and start a build:</li>
</ul>
<pre><code>$ chrt -i 0 ionice -c2 -n7 nice -n19 mvn -U -Dmirage2.on=true -Dmirage2.deps.included=false package
...
[ERROR] Failed to execute goal on project additions: Could not resolve dependencies for project org.dspace.modules:additions:jar:6.3: Failed to collect dependencies at com.atmire:atmire-listings-and-reports-api:jar:6.x-2.10.8-0-SNAPSHOT: Failed to read artifact descriptor for com.atmire:atmire-listings-and-reports-api:jar:6.x-2.10.8-0-SNAPSHOT: Could not transfer artifact com.atmire:atmire-listings-and-reports-api:pom:6.x-2.10.8-0-SNAPSHOT from/to atmire.com-snapshots (https://atmire.com/artifactory/atmire.com-snapshots): Not authorized , ReasonPhrase:Unauthorized. -> [Help 1]
</code></pre><ul>
<li>Great! I will have to send Atmire a note about this… but for now I can sync over my local <code>~/.m2</code> directory and the build completes</li>
<li>After the Maven build completed successfully I installed the updated code with Ant (make sure to delete the old spring directory):</li>
</ul>
<pre><code>$ cd dspace/target/dspace-installer
$ rm -rf /blah/dspacetest/config/spring
$ ant update
</code></pre><ul>
<li>Database migrations take 10:18.287s during the first startup…
<ul>
<li>perhaps when we do the production CGSpace migration I can do this in advance and tell users not to make any submissions?</li>
</ul>
</li>
<li>I had a mistake in my Solr internal URL parameter so DSpace couldn’t find it, but once I fixed that DSpace started up OK!</li>
<li>Once the initial Discovery reindexing completed I started the Solr statistics UUID migration:</li>
</ul>
<pre><code>$ export JAVA_OPTS="-Xmx1024m -Dfile.encoding=UTF-8"
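# ...and then run the statistics UUID migration itself; the exact invocation
# below is an assumption based on the DSpace 6 launcher command for this task
$ dspace solr-upgrade-statistics-6x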
</code></pre><ul>
<li>Experiment a bit with the Python <a href="https://pypi.org/project/country-converter/">country-converter</a> library as it can convert between different formats (like ISO 3166 and UN m49); a quick example is sketched after this list
<ul>
<li>We need to eventually find a format we can use for all CGIAR DSpaces…</li>
</ul>
</li>
</ul>
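<p>A quick smoke test of what the library does, assuming it is installed with pip; the classification names (<code>ISO3</code>, <code>UNcode</code>) are taken from its documentation, but treat the exact calls as a sketch:</p>
<pre><code># install and try a couple of conversions from country names to other coding schemes
$ pip install country_converter
$ python3 -c 'import country_converter as coco; print(coco.convert(names=["Kenya", "Viet Nam"], to="ISO3"))'
$ python3 -c 'import country_converter as coco; print(coco.convert(names=["Kenya", "Viet Nam"], to="UNcode"))'
</code></pre>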
<!-- raw HTML omitted -->
</article>
</div> <!-- /.blog-main -->
<aside class="col-sm-3 ml-auto blog-sidebar">
<section class="sidebar-module">
<h4>Recent Posts</h4>
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2020-05/">May, 2020</a></li>
<li><a href="/cgspace-notes/2020-04/">April, 2020</a></li>
<li><a href="/cgspace-notes/2020-03/">March, 2020</a></li>
<li><a href="/cgspace-notes/2020-02/">February, 2020</a></li>
<li><a href="/cgspace-notes/2020-01/">January, 2020</a></li>
</ol>
</section>
<section class="sidebar-module">
<h4>Links</h4>
<ol class="list-unstyled">
<li><a href="https://cgspace.cgiar.org">CGSpace</a></li>
<li><a href="https://dspacetest.cgiar.org">DSpace Test</a></li>
<li><a href="https://github.com/ilri/DSpace">CGSpace @ GitHub</a></li>
</ol>
</section>
</aside>
</div> <!-- /.row -->
</div> <!-- /.container -->
<footer class="blog-footer">
<p dir="auto">
Blog template created by <a href="https://twitter.com/mdo">@mdo</a>, ported to Hugo by <a href='https://twitter.com/mralanorth'>@mralanorth</a>.
</p>
<p>
<a href="#">Back to top</a>
</p>
</footer>
</body>
</html>