<!DOCTYPE html>
<html lang="en">

<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
<meta property="og:title" content="December, 2020" />
|
||
<meta property="og:description" content="2020-12-01
|
||
|
||
Atmire responded about the issue with duplicate data in our Solr statistics
|
||
|
||
They noticed that some records in the statistics-2015 core haven’t been migrated with the AtomicStatisticsUpdateCLI tool yet and assumed that I haven’t migrated any of the records yet
|
||
That’s strange, as I checked all ten cores and 2015 is the only one with some unmigrated documents, as according to the cua_version field
|
||
I started processing those (about 411,000 records):
|
||
|
||
|
||
" />
<meta property="og:type" content="article" />
<meta property="og:url" content="https://alanorth.github.io/cgspace-notes/2020-12/" />
<meta property="article:published_time" content="2020-12-01T11:32:54+02:00" />
<meta property="article:modified_time" content="2020-12-10T23:43:09+02:00" />
<meta name="twitter:card" content="summary"/>
|
||
<meta name="twitter:title" content="December, 2020"/>
|
||
<meta name="twitter:description" content="2020-12-01
|
||
|
||
Atmire responded about the issue with duplicate data in our Solr statistics
|
||
|
||
They noticed that some records in the statistics-2015 core haven’t been migrated with the AtomicStatisticsUpdateCLI tool yet and assumed that I haven’t migrated any of the records yet
|
||
That’s strange, as I checked all ten cores and 2015 is the only one with some unmigrated documents, as according to the cua_version field
|
||
I started processing those (about 411,000 records):
|
||
|
||
|
||
"/>
<meta name="generator" content="Hugo 0.79.0" />
<script type="application/ld+json">
{
"@context": "http://schema.org",
"@type": "BlogPosting",
"headline": "December, 2020",
"url": "https://alanorth.github.io/cgspace-notes/2020-12/",
"wordCount": "1378",
"datePublished": "2020-12-01T11:32:54+02:00",
"dateModified": "2020-12-10T23:43:09+02:00",
"author": {
"@type": "Person",
"name": "Alan Orth"
},
"keywords": "Notes"
}
</script>
<link rel="canonical" href="https://alanorth.github.io/cgspace-notes/2020-12/">

<title>December, 2020 | CGSpace Notes</title>

<!-- combined, minified CSS -->
<link href="https://alanorth.github.io/cgspace-notes/css/style.16633182cd803b52b9bf9e29ea1ef4b2e3d460deee0ded49466d7e16e449c158.css" rel="stylesheet" integrity="sha256-FmMxgs2AO1K5v54p6h70suPUYN7uDe1JRm1+FuRJwVg=" crossorigin="anonymous">

<!-- minified Font Awesome for SVG icons -->
<script defer src="https://alanorth.github.io/cgspace-notes/js/fontawesome.min.4ed405d7c7002b970d34cbe6026ff44a556b0808cb98a9db4008752110ed964b.js" integrity="sha256-TtQF18cAK5cNNMvmAm/0SlVrCAjLmKnbQAh1IRDtlks=" crossorigin="anonymous"></script>

<!-- RSS 2.0 feed -->

</head>
<body>

<div class="blog-masthead">
<div class="container">
<nav class="nav blog-nav">
<a class="nav-link " href="https://alanorth.github.io/cgspace-notes/">Home</a>
</nav>
</div>
</div>

<header class="blog-header">
<div class="container">
<h1 class="blog-title" dir="auto"><a href="https://alanorth.github.io/cgspace-notes/" rel="home">CGSpace Notes</a></h1>
<p class="lead blog-description" dir="auto">Documenting day-to-day work on the <a href="https://cgspace.cgiar.org">CGSpace</a> repository.</p>
</div>
</header>

<div class="container">
<div class="row">
<div class="col-sm-8 blog-main">

<article class="blog-post">
<header>
<h2 class="blog-post-title" dir="auto"><a href="https://alanorth.github.io/cgspace-notes/2020-12/">December, 2020</a></h2>
<p class="blog-post-meta">
<time datetime="2020-12-01T11:32:54+02:00">Tue Dec 01, 2020</time>
in
<span class="fas fa-folder" aria-hidden="true"></span> <a href="/cgspace-notes/categories/notes/" rel="category tag">Notes</a>
</p>
</header>
<h2 id="2020-12-01">2020-12-01</h2>
|
||
<ul>
|
||
<li>Atmire responded about the issue with duplicate data in our Solr statistics
|
||
<ul>
|
||
<li>They noticed that some records in the statistics-2015 core haven’t been migrated with the AtomicStatisticsUpdateCLI tool yet and assumed that I haven’t migrated any of the records yet</li>
|
||
<li>That’s strange, as I checked all ten cores and 2015 is the only one with some unmigrated documents, as according to the <code>cua_version</code> field</li>
|
||
<li>I started processing those (about 411,000 records):</li>
|
||
</ul>
|
||
</li>
|
||
</ul>
<pre><code class="language-console" data-lang="console">$ chrt -b 0 dspace dsrun com.atmire.statistics.util.update.atomic.AtomicStatisticsUpdateCLI -t 12 -c statistics-2015
</code></pre>
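<ul>
<li>For reference, counting how many unmigrated documents remain in a core can be done by querying for documents that are missing the <code>cua_version</code> field (a sketch; it assumes unmigrated documents simply lack the field):</li>
</ul>
<pre><code class="language-console" data-lang="console">$ curl -s 'http://localhost:8081/solr/statistics-2015/select' --data-urlencode 'q=-cua_version:[* TO *]' -d 'rows=0&wt=json'
</code></pre>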
<ul>
<li>AReS went down when the <code>renew-letsencrypt</code> service stopped the <code>angular_nginx</code> container in the pre-update hook and failed to bring it back up
<ul>
<li>I ran all system updates on the host and rebooted it; AReS came back up OK</li>
</ul>
</li>
</ul>
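<ul>
<li>If this happens again, bringing the container back up manually might look like this (a sketch, assuming AReS runs the container under plain Docker with the name from the hook):</li>
</ul>
<pre><code class="language-console" data-lang="console">$ docker ps -a --filter name=angular_nginx
$ docker start angular_nginx
</code></pre>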
<h2 id="2020-12-02">2020-12-02</h2>
<ul>
<li>Udana emailed me yesterday to ask why the CGSpace usage statistics were showing “No Data”
<ul>
<li>I noticed a message in the Solr Admin UI that one of the statistics cores failed to load, but it is up and I can query it…</li>
<li>Nevertheless, I restarted Tomcat a few times to see if all the cores would come up without an error message, but had no success (even though all the cores ARE up and I can query them, <em>sigh</em>)</li>
<li>I think I will move all the yearly Solr statistics back into the main statistics core</li>
</ul>
</li>
<li>Start testing export/import of yearly Solr statistics data into the main statistics core on DSpace Test, for example:</li>
</ul>
<pre><code class="language-console" data-lang="console">$ ./run.sh -s http://localhost:8081/solr/statistics-2010 -a export -o statistics-2010.json -k uid
$ ./run.sh -s http://localhost:8081/solr/statistics -a import -o statistics-2010.json -k uid
$ curl -s "http://localhost:8081/solr/statistics-2010/update?softCommit=true" -H "Content-Type: text/xml" --data-binary "<delete><query>*:*</query></delete>"
</code></pre><ul>
<li>I deployed Tomcat 7.0.107 on DSpace Test (CGSpace is still running Tomcat 7.0.104)</li>
<li>I finished migrating all the statistics from the yearly shards back to the main core (a count check is sketched below)</li>
</ul>
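<ul>
<li>After an import it’s worth comparing document counts between the yearly core and the main core before deleting anything (a sketch using the standard Solr select handler):</li>
</ul>
<pre><code class="language-console" data-lang="console">$ curl -s 'http://localhost:8081/solr/statistics-2010/select?q=*:*&rows=0&wt=json'
$ curl -s 'http://localhost:8081/solr/statistics/select?q=*:*&rows=0&wt=json'
</code></pre>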
<h2 id="2020-12-05">2020-12-05</h2>
<ul>
<li>I deleted all the yearly statistics shards and restarted Tomcat on DSpace Test (linode26)</li>
</ul>
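<ul>
<li>For the record, removing a yearly core can also be done through Solr’s CoreAdmin API (a sketch; <code>deleteIndex=true</code> removes the index data from disk as well):</li>
</ul>
<pre><code class="language-console" data-lang="console">$ curl -s "http://localhost:8081/solr/admin/cores?action=UNLOAD&core=statistics-2010&deleteIndex=true"
</code></pre>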
<h2 id="2020-12-06">2020-12-06</h2>
<ul>
<li>Looking into the statistics on DSpace Test after I migrated them back to the main core
<ul>
<li>All stats are working as expected… indexing time for the DSpace Statistics API is the same… and I don’t even see a difference in the JVM or memory stats in Munin, other than a minor jump last week when I was processing them</li>
</ul>
</li>
<li>I will migrate them on CGSpace too, I think
<ul>
<li>First I will start with the statistics-2010 and statistics-2015 cores, because they were the ones that were failing to load recently (despite actually being available in Solr, WTF)</li>
</ul>
</li>
</ul>
<p><img src="/cgspace-notes/2020/12/solr-statistics-2010-failed.png" alt="Error message in Solr admin UI about the statistics-2010 core failing to load"></p>
<ul>
<li>First the 2010 core:</li>
</ul>
<pre><code class="language-console" data-lang="console">$ chrt -b 0 ./run.sh -s http://localhost:8081/solr/statistics-2010 -a export -o statistics-2010.json -k uid
$ chrt -b 0 ./run.sh -s http://localhost:8081/solr/statistics -a import -o statistics-2010.json -k uid
$ curl -s "http://localhost:8081/solr/statistics-2010/update?softCommit=true" -H "Content-Type: text/xml" --data-binary "<delete><query>*:*</query></delete>"
</code></pre><ul>
<li>Judging by the DSpace logs, all of these cores had a problem starting up in the last month:</li>
</ul>
<pre><code class="language-console" data-lang="console"># grep -rsI "Unable to create core" [dspace]/log/dspace.log.2020-* | grep -o -E "statistics-[0-9]+" | sort | uniq -c
     24 statistics-2010
     24 statistics-2015
     18 statistics-2016
      6 statistics-2018
</code></pre><ul>
<li>The message is always this:</li>
</ul>
<pre><code>org.apache.solr.client.solrj.impl.HttpSolrServer$RemoteSolrException: Error CREATEing SolrCore 'statistics-2016': Unable to create core [statistics-2016] Caused by: Lock obtain timed out: NativeFSLock@/[dspace]/solr/statistics-2016/data/index/write.lock
</code></pre><ul>
<li>I will migrate these four cores and see if it makes a difference, then probably end up migrating the rest of them too
<ul>
<li>I removed the statistics-2010, statistics-2015, statistics-2016, and statistics-2018 cores and restarted Tomcat, and <em>all the statistics cores came up OK and the CUA statistics are OK</em>!</li>
</ul>
</li>
</ul>
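<ul>
<li>For future reference, the “Lock obtain timed out” error above usually points to a stale <code>write.lock</code> file left behind by an unclean shutdown, so a quick check might be (a sketch; only remove locks while Tomcat is stopped):</li>
</ul>
<pre><code class="language-console" data-lang="console"># find [dspace]/solr -name write.lock
</code></pre>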
<h2 id="2020-12-07">2020-12-07</h2>
<ul>
<li>Run <code>dspace cleanup -v</code> on CGSpace to clean up deleted bitstreams</li>
<li>Atmire sent a <a href="https://github.com/ilri/DSpace/pull/457">pull request</a> to address the duplicate owningComm and owningColl values
<ul>
<li>Built and deployed it on DSpace Test, but I am not sure how to run it yet</li>
<li>I sent feedback to Atmire on their tracker: <a href="https://tracker.atmire.com/tickets-cgiar-ilri/view-ticket?id=839">https://tracker.atmire.com/tickets-cgiar-ilri/view-ticket?id=839</a></li>
</ul>
</li>
<li>Abenet and Tezira are having issues with committing to the archive in their workflow
<ul>
<li>I looked at the server and indeed the locks and transactions are back up:</li>
</ul>
</li>
</ul>
<p><img src="/cgspace-notes/2020/12/postgres_transactions_ALL-day.png" alt="PostgreSQL Transactions day">
<img src="/cgspace-notes/2020/12/postgres_locks_ALL-day.png" alt="PostgreSQL Locks day">
<img src="/cgspace-notes/2020/12/postgres_querylength_ALL-day.png" alt="PostgreSQL Query Length day">
<img src="/cgspace-notes/2020/12/postgres_connections_ALL-day.png" alt="PostgreSQL Connections day"></p>
<ul>
<li>There are apparently 1,700 locks right now:</li>
</ul>
<pre><code class="language-console" data-lang="console">$ psql -c 'SELECT * FROM pg_locks pl LEFT JOIN pg_stat_activity psa ON pl.pid = psa.pid;' | wc -l
1739
</code></pre><h2 id="2020-12-08">2020-12-08</h2>
<ul>
<li>Atmire sent some instructions for using the DeduplicateValuesProcessor
<ul>
<li>I modified <code>atmire-cua-update.xml</code> as they instructed, but I get a million errors like this when I run AtomicStatisticsUpdateCLI with that configuration:</li>
</ul>
</li>
</ul>
<pre><code>Record uid: 64387815-d9a7-4605-8024-1c0a5c7520e0 couldn't be processed
com.atmire.statistics.util.update.atomic.ProcessingException: something went wrong while processing record uid: 64387815-d9a7-4605-8024-1c0a5c7520e0, an error occured in the com.atmire.statistics.util.update.atomic.processor.DeduplicateValuesProcessor
    at com.atmire.statistics.util.update.atomic.AtomicStatisticsUpdater.applyProcessors(SourceFile:304)
    at com.atmire.statistics.util.update.atomic.AtomicStatisticsUpdater.processRecords(SourceFile:176)
    at com.atmire.statistics.util.update.atomic.AtomicStatisticsUpdater.performRun(SourceFile:161)
    at com.atmire.statistics.util.update.atomic.AtomicStatisticsUpdater.update(SourceFile:128)
    at com.atmire.statistics.util.update.atomic.AtomicStatisticsUpdateCLI.main(SourceFile:78)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.dspace.app.launcher.ScriptLauncher.runOneCommand(ScriptLauncher.java:229)
    at org.dspace.app.launcher.ScriptLauncher.main(ScriptLauncher.java:81)
Caused by: java.lang.UnsupportedOperationException
    at org.apache.solr.common.SolrDocument$1.entrySet(SolrDocument.java:256)
    at java.util.HashMap.putMapEntries(HashMap.java:512)
    at java.util.HashMap.<init>(HashMap.java:490)
    at com.atmire.statistics.util.update.atomic.record.Record.getFieldValuesMap(SourceFile:86)
    at com.atmire.statistics.util.update.atomic.processor.DeduplicateValuesProcessor.process(SourceFile:38)
    at com.atmire.statistics.util.update.atomic.processor.DeduplicateValuesProcessor.visit(SourceFile:34)
    at com.atmire.statistics.util.update.atomic.record.UsageRecord.accept(SourceFile:23)
    at com.atmire.statistics.util.update.atomic.AtomicStatisticsUpdater.applyProcessors(SourceFile:301)
    ... 10 more
</code></pre><ul>
<li>I sent some feedback to Atmire
<ul>
<li>They responded with an updated CUA (6.x-4.1.10-ilri-RC7) that has a fix for the duplicates processor <em>and</em> a possible fix for the database locking issues (a bug in CUASolrLoggerServiceImpl that causes an infinite loop and a Tomcat timeout)</li>
<li>I deployed the changes on DSpace Test and CGSpace; hopefully they will fix both issues!</li>
</ul>
</li>
<li>In other news, after I restarted Tomcat on CGSpace the statistics-2013 core didn’t come back up properly, so I exported it and imported it into the main statistics core like I did for the others a few days ago</li>
<li>Sync DSpace Test with CGSpace’s Solr, PostgreSQL database, and assetstore… (the general shape of the sync is sketched below)</li>
</ul>
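<ul>
<li>The general shape of that sync, for reference (a sketch with hypothetical host, user, database, and path names, not the exact commands):</li>
</ul>
<pre><code class="language-console" data-lang="console">$ ssh cgspace pg_dump -U dspace -Fc dspace > cgspace.dump
$ pg_restore -U dspace -d dspacetest --clean --if-exists cgspace.dump
$ rsync -av --delete cgspace:/home/cgspace/assetstore/ /home/dspacetest/assetstore/
</code></pre>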
<h2 id="2020-12-09">2020-12-09</h2>
<ul>
<li>I was running the AtomicStatisticsUpdateCLI to remove duplicates on DSpace Test, but it failed near the end of the statistics core (after 20 hours or so) with a memory error:</li>
</ul>
<pre><code>Successfully finished updating Solr Storage Reports | Wed Dec 09 15:25:11 CET 2020
Run 1 — 67% — 10,000/14,935 docs — 6m 6s — 6m 6s
Exception: GC overhead limit exceeded
java.lang.OutOfMemoryError: GC overhead limit exceeded
    at org.noggit.CharArr.toString(CharArr.java:164)
</code></pre><ul>
<li>I increased the JVM heap to 2048m and tried again, but it failed with a memory error again…</li>
<li>I increased the JVM heap to 4096m and tried again, but it failed with another error:</li>
</ul>
<pre><code>Successfully finished updating Solr Storage Reports | Wed Dec 09 15:53:40 CET 2020
Exception: parsing error
org.apache.solr.client.solrj.impl.HttpSolrServer$RemoteSolrException: parsing error
    at org.apache.solr.client.solrj.impl.HttpSolrServer.executeMethod(HttpSolrServer.java:530)
    at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:210)
    at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:206)
    at org.apache.solr.client.solrj.request.QueryRequest.process(QueryRequest.java:91)
    at org.apache.solr.client.solrj.SolrServer.query(SolrServer.java:301)
    at com.atmire.statistics.util.update.atomic.AtomicStatisticsUpdater.getNextSetOfSolrDocuments(SourceFile:392)
    at com.atmire.statistics.util.update.atomic.AtomicStatisticsUpdater.performRun(SourceFile:157)
    at com.atmire.statistics.util.update.atomic.AtomicStatisticsUpdater.update(SourceFile:128)
    at com.atmire.statistics.util.update.atomic.AtomicStatisticsUpdateCLI.main(SourceFile:78)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.dspace.app.launcher.ScriptLauncher.runOneCommand(ScriptLauncher.java:229)
    at org.dspace.app.launcher.ScriptLauncher.main(ScriptLauncher.java:81)
Caused by: org.apache.solr.common.SolrException: parsing error
    at org.apache.solr.client.solrj.impl.BinaryResponseParser.processResponse(BinaryResponseParser.java:45)
    at org.apache.solr.client.solrj.impl.HttpSolrServer.executeMethod(HttpSolrServer.java:528)
    ... 14 more
Caused by: org.apache.http.TruncatedChunkException: Truncated chunk ( expected size: 8192; actual size: 2843)
    at org.apache.http.impl.io.ChunkedInputStream.read(ChunkedInputStream.java:200)
    at org.apache.http.conn.EofSensorInputStream.read(EofSensorInputStream.java:137)
    at org.apache.solr.common.util.FastInputStream.readWrappedStream(FastInputStream.java:80)
    at org.apache.solr.common.util.FastInputStream.refill(FastInputStream.java:89)
    at org.apache.solr.common.util.FastInputStream.read(FastInputStream.java:125)
    at org.apache.solr.common.util.FastInputStream.readFully(FastInputStream.java:152)
    ...
</code></pre><h2 id="2020-12-10">2020-12-10</h2>
<ul>
<li>The statistics-2019 core finished processing the duplicate removal, so I started the statistics-2017 core</li>
<li>Peter asked me to add ONE HEALTH to the ILRI subjects on CGSpace</li>
<li>A few items that got “lost” after approval during the database issues earlier this week seem to have gone back into their workflows
<ul>
<li>Abenet approved them again and they got new handles, phew</li>
</ul>
</li>
<li>Abenet was having an issue with the date filter on AReS, and it turns out that it’s the same <code>.keyword</code> issue I had noticed before that causes the filter to stop working
<ul>
<li>I fixed the filter to use the correct field name and filed a bug on OpenRXV: <a href="https://github.com/ilri/OpenRXV/issues/63">https://github.com/ilri/OpenRXV/issues/63</a></li>
</ul>
</li>
<li>I checked the Solr statistics on DSpace Test to see if the Atmire duplicates remover was working, but now I see a comical amount of duplicates…</li>
</ul>
<p><img src="/cgspace-notes/2020/12/solr-stats-duplicates.png" alt="Solr stats with dozens of duplicates"></p>
<ul>
<li>I sent feedback about this to Atmire</li>
<li>I will re-sync the Solr stats from CGSpace so we can try again…</li>
<li>In other news, it has been a few days since we deployed the fix for the database locking issue and things seem much better now:</li>
</ul>
<p><img src="/cgspace-notes/2020/12/postgres_connections_ALL-week.png" alt="PostgreSQL connections all week">
<img src="/cgspace-notes/2020/12/postgres_locks_ALL-week.png" alt="PostgreSQL locks all week"></p>
<h2 id="2020-12-13">2020-12-13</h2>
<ul>
<li>I tried to harvest a few times on OpenRXV in the last few days, and every time it appends all the new records to the items index instead of overwriting it:</li>
</ul>
<p><img src="/cgspace-notes/2020/12/openrxv-duplicates.png" alt="OpenRXV duplicates"></p>
<ul>
<li>I can see it in the <code>openrxv-items-final</code> index:</li>
</ul>
<pre><code class="language-console" data-lang="console">$ curl -s 'http://localhost:9200/openrxv-items-final/_count?q=*' | json_pp
{
   "_shards" : {
      "failed" : 0,
      "skipped" : 0,
      "successful" : 1,
      "total" : 1
   },
   "count" : 299922
}
</code></pre><ul>
<li>I filed a bug on OpenRXV: <a href="https://github.com/ilri/OpenRXV/issues/64">https://github.com/ilri/OpenRXV/issues/64</a></li>
<li>For now I will try to delete the index and start a re-harvest in the Admin UI:</li>
</ul>
<pre><code class="language-console" data-lang="console">$ curl -XDELETE http://localhost:9200/openrxv-items-final
{"acknowledged":true}%
</code></pre><ul>
<li>Moayad said he’s working on the harvesting, so I stopped it for now to re-deploy his latest changes</li>
<li>I updated Tomcat to version 7.0.107 on CGSpace (linode18), ran all updates, and restarted the server</li>
<li>I deleted both items indexes and restarted the harvesting:</li>
</ul>
<pre><code class="language-console" data-lang="console">$ curl -XDELETE http://localhost:9200/openrxv-items-final
$ curl -XDELETE http://localhost:9200/openrxv-items-temp
</code></pre><!-- raw HTML omitted -->
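<ul>
<li>For reference, listing the indexes with the standard Elasticsearch cat API confirms they are really gone before the re-harvest starts (a sketch):</li>
</ul>
<pre><code class="language-console" data-lang="console">$ curl -s 'http://localhost:9200/_cat/indices?v'
</code></pre>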
</article>

</div> <!-- /.blog-main -->
<aside class="col-sm-3 ml-auto blog-sidebar">

<section class="sidebar-module">
<h4>Recent Posts</h4>
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2020-12/">December, 2020</a></li>
<li><a href="/cgspace-notes/cgspace-dspace6-upgrade/">CGSpace DSpace 6 Upgrade</a></li>
<li><a href="/cgspace-notes/2020-11/">November, 2020</a></li>
<li><a href="/cgspace-notes/2020-10/">October, 2020</a></li>
<li><a href="/cgspace-notes/2020-09/">September, 2020</a></li>
</ol>
</section>
<section class="sidebar-module">
<h4>Links</h4>
<ol class="list-unstyled">
<li><a href="https://cgspace.cgiar.org">CGSpace</a></li>
<li><a href="https://dspacetest.cgiar.org">DSpace Test</a></li>
<li><a href="https://github.com/ilri/DSpace">CGSpace @ GitHub</a></li>
</ol>
</section>

</aside>
</div> <!-- /.row -->
</div> <!-- /.container -->

<footer class="blog-footer">
<p dir="auto">
Blog template created by <a href="https://twitter.com/mdo">@mdo</a>, ported to Hugo by <a href='https://twitter.com/mralanorth'>@mralanorth</a>.
</p>
<p>
<a href="#">Back to top</a>
</p>
</footer>

</body>
</html>