<!DOCTYPE html>
<html lang="en" >
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
<meta property="og:title" content="November, 2020" />
<meta property="og:description" content="2020-11-01
Continue with processing the statistics-2019 Solr core with the AtomicStatisticsUpdateCLI tool on DSpace Test
So far we&rsquo;ve spent at least fifty hours processing the statistics and statistics-2019 cores&hellip; wow.
" />
<meta property="og:type" content="article" />
<meta property="og:url" content="https://alanorth.github.io/cgspace-notes/2020-11/" />
<meta property="article:published_time" content="2020-11-01T13:11:54+02:00" />
<meta property="article:modified_time" content="2020-11-16T10:53:45+02:00" />
<meta name="twitter:card" content="summary"/>
<meta name="twitter:title" content="November, 2020"/>
<meta name="twitter:description" content="2020-11-01
Continue with processing the statistics-2019 Solr core with the AtomicStatisticsUpdateCLI tool on DSpace Test
So far we&rsquo;ve spent at least fifty hours processing the statistics and statistics-2019 cores&hellip; wow.
"/>
<meta name="generator" content="Hugo 0.78.1" />
<script type="application/ld+json">
{
"@context": "http://schema.org",
"@type": "BlogPosting",
"headline": "November, 2020",
"url": "https://alanorth.github.io/cgspace-notes/2020-11/",
"wordCount": "2131",
"datePublished": "2020-11-01T13:11:54+02:00",
"dateModified": "2020-11-16T10:53:45+02:00",
"author": {
"@type": "Person",
"name": "Alan Orth"
},
"keywords": "Notes"
}
</script>
<link rel="canonical" href="https://alanorth.github.io/cgspace-notes/2020-11/">
<title>November, 2020 | CGSpace Notes</title>
<!-- combined, minified CSS -->
<link href="https://alanorth.github.io/cgspace-notes/css/style.d20c61b183eb27beb5b2c48f70a38b91c8bb5fb929e77b447d5f77c7285221ad.css" rel="stylesheet" integrity="sha256-0gxhsYPrJ761ssSPcKOLkci7X7kp53tEfV93xyhSIa0=" crossorigin="anonymous">
<!-- minified Font Awesome for SVG icons -->
<script defer src="https://alanorth.github.io/cgspace-notes/js/fontawesome.min.4ed405d7c7002b970d34cbe6026ff44a556b0808cb98a9db4008752110ed964b.js" integrity="sha256-TtQF18cAK5cNNMvmAm/0SlVrCAjLmKnbQAh1IRDtlks=" crossorigin="anonymous"></script>
<!-- RSS 2.0 feed -->
</head>
<body>
<div class="blog-masthead">
<div class="container">
<nav class="nav blog-nav">
<a class="nav-link " href="https://alanorth.github.io/cgspace-notes/">Home</a>
</nav>
</div>
</div>
<header class="blog-header">
<div class="container">
<h1 class="blog-title" dir="auto"><a href="https://alanorth.github.io/cgspace-notes/" rel="home">CGSpace Notes</a></h1>
<p class="lead blog-description" dir="auto">Documenting day-to-day work on the <a href="https://cgspace.cgiar.org">CGSpace</a> repository.</p>
</div>
</header>
<div class="container">
<div class="row">
<div class="col-sm-8 blog-main">
<article class="blog-post">
<header>
<h2 class="blog-post-title" dir="auto"><a href="https://alanorth.github.io/cgspace-notes/2020-11/">November, 2020</a></h2>
<p class="blog-post-meta">
<time datetime="2020-11-01T13:11:54+02:00">Sun Nov 01, 2020</time>
in
<span class="fas fa-folder" aria-hidden="true"></span>&nbsp;<a href="/cgspace-notes/categories/notes/" rel="category tag">Notes</a>
</p>
</header>
<h2 id="2020-11-01">2020-11-01</h2>
<ul>
<li>Continue with processing the statistics-2019 Solr core with the AtomicStatisticsUpdateCLI tool on DSpace Test
<ul>
<li>So far we&rsquo;ve spent at least fifty hours processing the statistics and statistics-2019 cores&hellip; wow.</li>
</ul>
</li>
</ul>
<h2 id="2020-11-02">2020-11-02</h2>
<ul>
<li>Talk to Moayad and fix a few issues on OpenRXV:
<ul>
<li>Incorrect views and downloads (caused by Elasticsearch&rsquo;s default result set size of 10)</li>
<li>Invalid share link</li>
<li>Missing &ldquo;https://&rdquo; for Handles in the Simple Excel report (caused by using the <code>handle</code> instead of the <code>uri</code>)</li>
<li>Sorting the list of items by views</li>
</ul>
</li>
<li>I resumed the processing of the statistics-2018 Solr core after it had spent 20 hours getting to 60%</li>
</ul>
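The incorrect views and downloads above come from Elasticsearch returning only the first 10 hits of a search unless told otherwise. A minimal sketch of paging past that default with from/size follows; the fetch function here is a stub standing in for OpenRXV's real Elasticsearch client, not its actual code:

```python
# Sketch: Elasticsearch caps results at size=10 by default, so any code
# that sums views/downloads from a plain search silently truncates.
# fetch_page is a stand-in for an elasticsearch-py search call, e.g.
# es.search(index=..., from_=offset, size=page_size).

def fetch_all(fetch_page, page_size=1000):
    """Page through results with from/size until a short page is returned."""
    results = []
    offset = 0
    while True:
        page = fetch_page(offset, page_size)
        results.extend(page)
        if len(page) < page_size:
            return results
        offset += page_size

# Stub standing in for the real Elasticsearch query:
docs = [{"handle": f"10568/{i}", "views": i} for i in range(25)]
fetch_page = lambda offset, size: docs[offset:offset + size]

all_docs = fetch_all(fetch_page, page_size=10)
print(len(all_docs))  # 25, not the 10 a single default-size query would return
```

For result sets beyond 10,000 documents Elasticsearch rejects deep from/size paging (the `index.max_result_window` limit), so a real harvester would need the scroll API or `search_after` instead.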
<h2 id="2020-11-04">2020-11-04</h2>
<ul>
<li>After 29 hours the statistics-2017 core finished processing so I started the statistics-2016 core on DSpace Test</li>
</ul>
<h2 id="2020-11-05">2020-11-05</h2>
<ul>
<li>Peter sent me corrections and deletions for the author affiliations
<ul>
<li>I quickly proofed them for UTF-8 issues with OpenRefine and csv-metadata-quality, tested them locally, and then applied them on CGSpace:</li>
</ul>
</li>
</ul>
<pre><code>$ ./fix-metadata-values.py -i 2020-11-05-fix-862-affiliations.csv -db dspace -u dspace -p 'fuuu' -f cg.contributor.affiliation -t 'correct' -m 211
$ ./delete-metadata-values.py -i 2020-11-05-delete-29-affiliations.csv -db dspace -u dspace -p 'fuuu' -f cg.contributor.affiliation -m 211
</code></pre><ul>
<li>Then I started a Discovery re-index on CGSpace:</li>
</ul>
<pre><code>$ time chrt -b 0 ionice -c2 -n7 nice -n19 dspace index-discovery -b
real 92m24.993s
user 8m11.858s
sys 2m26.931s
</code></pre><h2 id="2020-11-06">2020-11-06</h2>
<ul>
<li>Restart the AtomicStatisticsUpdateCLI processing of the statistics-2016 core on DSpace Test after 20 hours&hellip;
<ul>
<li>This phase finished after five hours so I started it on the statistics-2015 core</li>
</ul>
</li>
</ul>
<h2 id="2020-11-07">2020-11-07</h2>
<ul>
<li>Atmire responded about the issue with duplicate values in <code>owningComm</code>, <code>containerCommunity</code>, etc.
<ul>
<li>I told them to please look into it and use some of our credits if need be</li>
</ul>
</li>
<li>The statistics-2015 core finished after 20 hours so I started the statistics-2014 core</li>
</ul>
<h2 id="2020-11-08">2020-11-08</h2>
<ul>
<li>Add &ldquo;Data Paper&rdquo; to types on CGSpace</li>
<li>Add &ldquo;SCALING CLIMATE-SMART AGRICULTURE&rdquo; to CCAFS subjects on CGSpace</li>
<li>Add &ldquo;ANDEAN ROOTS AND TUBERS&rdquo; to CIP subjects on CGSpace</li>
<li>Add CGIAR System subjects to Discovery sidebar facets on CGSpace
<ul>
<li>Also add the System subject to item view on CGSpace</li>
</ul>
</li>
<li>The statistics-2014 core finished processing after five hours, so I started processing the statistics-2013 core on DSpace Test</li>
<li>Since I was going to restart CGSpace and update the Discovery indexes anyways, I decided to check for any straggling uppercase AGROVOC entries and lowercase them:</li>
</ul>
<pre><code>dspace=# BEGIN;
dspace=# UPDATE metadatavalue SET text_value=LOWER(text_value) WHERE resource_type_id=2 AND metadata_field_id=57 AND text_value ~ '[[:upper:]]';
UPDATE 164
dspace=# COMMIT;
</code></pre><ul>
<li>Run system updates on CGSpace (linode18) and reboot it
<ul>
<li>I had to restart Tomcat once after the machine started up to get all Solr statistics cores to load properly</li>
</ul>
</li>
<li>After about ten more hours the rest of the Solr statistics cores finished processing on DSpace Test and I started optimizing them in Solr admin UI</li>
</ul>
<h2 id="2020-11-10">2020-11-10</h2>
<ul>
<li>I am noticing that CGSpace doesn&rsquo;t have any statistics showing for years before 2020, but all cores are loaded successfully in Solr Admin UI&hellip; strange
<ul>
<li>I restarted Tomcat and I see in Solr Admin UI that the statistics-2015 core failed to load</li>
<li>Looking in the DSpace log I see:</li>
</ul>
</li>
</ul>
<pre><code>2020-11-10 08:43:59,634 INFO org.dspace.statistics.SolrLogger @ Loading core with name: statistics-2015
2020-11-10 08:43:59,687 INFO org.dspace.statistics.SolrLogger @ Loading core with name: statistics-2018
2020-11-10 08:43:59,707 INFO org.dspace.statistics.SolrLogger @ Loading core with name: statistics-2015
2020-11-10 08:44:00,004 WARN org.dspace.core.ConfigurationManager @ Requested configuration module: atmire-datatables not found
2020-11-10 08:44:00,005 WARN org.dspace.core.ConfigurationManager @ Requested configuration module: atmire-datatables not found
2020-11-10 08:44:00,005 WARN org.dspace.core.ConfigurationManager @ Requested configuration module: atmire-datatables not found
2020-11-10 08:44:00,325 INFO org.dspace.statistics.SolrLogger @ Created core with name: statistics-2015
</code></pre><ul>
<li>Seems that the core gets probed twice&hellip; perhaps a threading issue?
<ul>
<li>The only thing I can think of is the <code>acceptorThreadCount</code> parameter in Tomcat&rsquo;s server.xml, which has been set to 2 since 2018-01 (we started sharding the Solr statistics cores in 2019-01 and that&rsquo;s when this problem arose)</li>
<li>I will try reducing that to 1</li>
<li>Wow, now it&rsquo;s even worse:</li>
</ul>
</li>
</ul>
<pre><code>2020-11-10 08:51:03,007 INFO org.dspace.statistics.SolrLogger @ Created core with name: statistics-2018
2020-11-10 08:51:03,008 INFO org.dspace.statistics.SolrLogger @ Loading core with name: statistics-2015
2020-11-10 08:51:03,137 INFO org.dspace.statistics.SolrLogger @ Loading core with name: statistics-2018
2020-11-10 08:51:03,153 INFO org.dspace.statistics.SolrLogger @ Loading core with name: statistics-2015
2020-11-10 08:51:03,289 INFO org.dspace.statistics.SolrLogger @ Created core with name: statistics-2015
2020-11-10 08:51:03,289 INFO org.dspace.statistics.SolrLogger @ Loading core with name: statistics-2010
2020-11-10 08:51:03,475 INFO org.dspace.statistics.SolrLogger @ Created core with name: statistics-2010
2020-11-10 08:51:03,475 INFO org.dspace.statistics.SolrLogger @ Loading core with name: statistics-2016
2020-11-10 08:51:03,730 INFO org.dspace.statistics.SolrLogger @ Created core with name: statistics-2016
2020-11-10 08:51:03,731 INFO org.dspace.statistics.SolrLogger @ Loading core with name: statistics-2017
2020-11-10 08:51:03,992 INFO org.dspace.statistics.SolrLogger @ Created core with name: statistics-2017
2020-11-10 08:51:03,992 INFO org.dspace.statistics.SolrLogger @ Loading core with name: statistics-2011
2020-11-10 08:51:04,178 INFO org.dspace.statistics.SolrLogger @ Created core with name: statistics-2011
2020-11-10 08:51:04,178 INFO org.dspace.statistics.SolrLogger @ Loading core with name: statistics-2012
</code></pre><ul>
<li>Could it be because we have two Tomcat connectors?
<ul>
<li>I restarted Tomcat a few more times before all cores loaded, and still there are no stats before 2020-01&hellip; hmmmmm</li>
</ul>
</li>
<li>I added a <a href="https://github.com/ilri/OpenRXV/commit/3816b9b3f3d9182d2ba1a899c1017c5895a59dee">lowercase formatter to OpenRXV</a> so that we can lowercase AGROVOC subjects during harvesting</li>
</ul>
<h2 id="2020-11-11">2020-11-11</h2>
<ul>
<li>Atmire responded with a quote for the work to fix the duplicate owningComm, etc in our Solr data
<ul>
<li>I told them to proceed, as it&rsquo;s within our budget of credits</li>
<li>They will write a processor for DSpace 6 to remove the duplicates</li>
</ul>
</li>
<li>I did some tests to add a usage statistics chart to the item views on DSpace Test
<ul>
<li>It is inspired by Salem&rsquo;s work on WorldFish&rsquo;s repository, and it hits the dspace-statistics-api for the current item and displays a graph</li>
<li>I got it working very easily for all-time statistics with Chart.js, but I think I will need to use Highcharts or something else because Chart.js renders to an HTML5 canvas and doesn&rsquo;t allow theming via CSS (so our Bootstrap brand colors for each theme won&rsquo;t work)
<ul>
<li>Hmm, Highcharts is not licensed under an open source license, so I will not use it</li>
<li>Perhaps I&rsquo;ll use Chartist with the popover plugin&hellip;</li>
</ul>
</li>
<li>I think I&rsquo;ll pursue this after the DSpace 6 upgrade&hellip;</li>
</ul>
</li>
</ul>
<h2 id="2020-11-12">2020-11-12</h2>
<ul>
<li>I was looking at Solr again trying to find a way to get community and collection stats by faceting on <code>owningComm</code> and <code>owningColl</code> and it seems to work actually
<ul>
<li>The duplicated values in the multi-value fields don&rsquo;t seem to affect the counts, as I had thought previously (though we should still get rid of them)</li>
<li>One major difference between the raw numbers I was looking at and Atmire&rsquo;s numbers is that Atmire&rsquo;s code filters &ldquo;Internal&rdquo; IP addresses&hellip;</li>
<li>Also, instead of doing <code>isBot:false</code> I think I should do <code>-isBot:true</code> because it&rsquo;s not a given that all documents will have this field and have it false, but we can definitely exclude the ones that have it as true</li>
</ul>
</li>
<li>First we get the total number of communities with stats (using calcdistinct):</li>
</ul>
<pre><code>facet=true&amp;facet.field=owningComm&amp;facet.mincount=1&amp;facet.limit=1&amp;facet.offset=0&amp;stats=true&amp;stats.field=owningComm&amp;stats.calcdistinct=true&amp;shards=http://localhost:8081/solr/statistics,http://localhost:8081/solr/statistics-2019,http://localhost:8081/solr/statistics-2018,http://localhost:8081/solr/statistics-2017,http://localhost:8081/solr/statistics-2016,http://localhost:8081/solr/statistics-2015,http://localhost:8081/solr/statistics-2014,http://localhost:8081/solr/statistics-2013,http://localhost:8081/solr/statistics-2012,http://localhost:8081/solr/statistics-2011,http://localhost:8081/solr/statistics-2010
</code></pre><ul>
<li>Then get stats themselves, iterating 100 items at a time with limit and offset:</li>
</ul>
<pre><code>facet=true&amp;facet.field=owningComm&amp;facet.mincount=1&amp;facet.limit=100&amp;facet.offset=0&amp;shards=http://localhost:8081/solr/statistics,http://localhost:8081/solr/statistics-2019,http://localhost:8081/solr/statistics-2018,http://localhost:8081/solr/statistics-2017,http://localhost:8081/solr/statistics-2016,http://localhost:8081/solr/statistics-2015,http://localhost:8081/solr/statistics-2014,http://localhost:8081/solr/statistics-2013,http://localhost:8081/solr/statistics-2012,http://localhost:8081/solr/statistics-2011,http://localhost:8081/solr/statistics-2010
</code></pre><ul>
<li>I was surprised to see 10,000,000 docs with <code>isBot:true</code> when I was testing on DSpace Test&hellip;
<ul>
<li>This has got to be a mistake of some kind, as I see 4 million in 2014 that are from <code>dns:localhost.</code>, perhaps that&rsquo;s when we didn&rsquo;t have useProxies set up correctly?</li>
<li>I don&rsquo;t see the same thing on CGSpace&hellip; I wonder what happened?</li>
<li>Perhaps they got re-tagged during the DSpace 6 upgrade, somehow during the Solr migration? Hmmmmm. Definitely have to be careful with <code>isBot:true</code> in the future and not automatically purge these!!!</li>
</ul>
</li>
<li>I noticed 120,000+ hits from monit, FeedBurner, and Blackboard Safeassign in 2014, 2015, 2016, 2017, etc&hellip;
<ul>
<li>I hadn&rsquo;t seen monit before, but the others have been in DSpace&rsquo;s spider agent lists for some time, so they probably only appear in older stats cores</li>
<li>The issue with purging these using <code>check-spider-hits.sh</code> is that it can&rsquo;t do case-insensitive regexes and some metacharacters like <code>\s</code> don&rsquo;t work, so I added case-sensitive patterns to a local agents file and purged them with the script</li>
</ul>
</li>
</ul>
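The case-insensitive matching that <code>check-spider-hits.sh</code> can't do is straightforward in Python, where matched agents could then be purged with a Solr delete-by-query. A sketch with made-up sample user agents (the purge step itself is omitted):

```python
import re

# Sketch: match spider user agents case-insensitively, which the grep
# in check-spider-hits.sh can't do portably. The sample agents below
# are illustrative, not pulled from the real Solr cores.

patterns = [r"monit/\d", r"FeedBurner", r"Blackboard Safeassign"]
spider_re = re.compile("|".join(patterns), re.IGNORECASE)

agents = [
    "Monit/5.25.1",
    "FeedBurner/1.0 (http://www.FeedBurner.com)",
    "blackboard safeassign",
    "Mozilla/5.0 (Windows NT 10.0) Firefox/82.0",
]

hits = [a for a in agents if spider_re.search(a)]
print(hits)  # the first three match; the real Firefox agent does not
```

Python's <code>re</code> supports both <code>re.IGNORECASE</code> and <code>\s</code>, sidestepping the two limitations noted above.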
<h2 id="2020-11-15">2020-11-15</h2>
<ul>
<li>Upgrade CGSpace to DSpace 6.3
<ul>
<li>First build, update, and migrate the database:</li>
</ul>
</li>
</ul>
<pre><code>$ dspace cleanup -v
$ git checkout origin/6_x-dev-atmire-modules
$ npm install -g yarn
$ chrt -b 0 mvn -U -Dmirage2.on=true -Dmirage2.deps.included=false -P \!dspace-lni,\!dspace-rdf,\!dspace-sword,\!dspace-swordv2,\!dspace-jspui clean package
$ sudo su - postgres
$ psql dspace -c 'CREATE EXTENSION pgcrypto;'
$ psql dspace -c &quot;DELETE FROM schema_version WHERE version IN ('5.8.2015.12.03.3');&quot;
$ exit
$ rm -rf /home/cgspace/config/spring
$ ant update
$ dspace database info
$ dspace database migrate
$ sudo systemctl start tomcat7
</code></pre><ul>
<li>After starting Tomcat, DSpace should start up OK and begin Discovery indexing, but I also want to upgrade from PostgreSQL 9.6 to 10
<ul>
<li>I installed and configured PostgreSQL 10 using the Ansible playbooks, then migrated the database manually:</li>
</ul>
</li>
</ul>
<pre><code># systemctl stop tomcat7
# pg_ctlcluster 9.6 main stop
# tar -cvzpf var-lib-postgresql-9.6.tar.gz /var/lib/postgresql/9.6
# tar -cvzpf etc-postgresql-9.6.tar.gz /etc/postgresql/9.6
# pg_ctlcluster 10 main stop
# pg_dropcluster 10 main
# pg_upgradecluster 9.6 main
# pg_dropcluster 9.6 main
# systemctl start postgresql
# dpkg -l | grep postgresql | grep 9.6 | awk '{print $2}' | xargs dpkg -r
</code></pre><ul>
<li>Then I ran all system updates and rebooted the server&hellip;</li>
<li>After the server came back up I re-ran the Ansible playbook to make sure all configs and services were updated</li>
<li>I disabled the dspace-statistics-api for now because it won&rsquo;t work until I migrate all the Solr statistics anyways</li>
<li>Start a full Discovery re-indexing:</li>
</ul>
<pre><code>$ time chrt -b 0 ionice -c2 -n7 nice -n19 dspace index-discovery -b
real 211m30.726s
user 134m40.124s
sys 2m17.979s
</code></pre><ul>
<li>Towards the end of the indexing there were a few dozen of these messages:</li>
</ul>
<pre><code>2020-11-15 13:23:21,685 INFO com.atmire.dspace.discovery.service.AtmireSolrService @ Removed Item: null from Index
</code></pre><ul>
<li>I updated all the Ansible infrastructure and DSpace branches to be the DSpace 6 ones</li>
<li>I will wait until the Discovery indexing is finished to start doing the Solr statistics migration</li>
<li>I tested the email functionality and it seems to need more configuration:</li>
</ul>
<pre><code>$ dspace test-email
About to send test email:
- To: blah@cgiar.org
- Subject: DSpace test email
- Server: smtp.office365.com
Error sending email:
- Error: com.sun.mail.smtp.SMTPSendFailedException: 451 5.7.3 STARTTLS is required to send mail [AM4PR0701CA0003.eurprd07.prod.outlook.com]
</code></pre><ul>
<li>I copied the <code>mail.extraproperties = mail.smtp.starttls.enable=true</code> setting from the old DSpace 5 <code>dspace.cfg</code> and now the emails are working</li>
<li>After the Discovery indexing finished I started processing the Solr stats one core and 2.5 million records at a time:</li>
</ul>
<pre><code>$ export JAVA_OPTS='-Dfile.encoding=UTF-8 -Xmx2048m'
$ chrt -b 0 dspace solr-upgrade-statistics-6x -n 2500000 -i statistics
</code></pre><ul>
<li>After about 6,000,000 records I got the same error that I&rsquo;ve gotten every time I test this migration process:</li>
</ul>
<pre><code>Exception: Error while creating field 'p_group_id{type=uuid,properties=indexed,stored,multiValued}' from value '10'
org.apache.solr.client.solrj.impl.HttpSolrServer$RemoteSolrException: Error while creating field 'p_group_id{type=uuid,properties=indexed,stored,multiValued}' from value '10'
at org.apache.solr.client.solrj.impl.HttpSolrServer.executeMethod(HttpSolrServer.java:552)
at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:210)
at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:206)
at org.apache.solr.client.solrj.request.AbstractUpdateRequest.process(AbstractUpdateRequest.java:124)
at org.apache.solr.client.solrj.SolrServer.add(SolrServer.java:68)
at org.apache.solr.client.solrj.SolrServer.add(SolrServer.java:54)
at org.dspace.util.SolrUpgradePre6xStatistics.batchUpdateStats(SolrUpgradePre6xStatistics.java:161)
at org.dspace.util.SolrUpgradePre6xStatistics.run(SolrUpgradePre6xStatistics.java:456)
at org.dspace.util.SolrUpgradePre6xStatistics.main(SolrUpgradePre6xStatistics.java:365)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.dspace.app.launcher.ScriptLauncher.runOneCommand(ScriptLauncher.java:229)
at org.dspace.app.launcher.ScriptLauncher.main(ScriptLauncher.java:81)
</code></pre><h2 id="2020-11-16">2020-11-16</h2>
<ul>
<li>Users are having issues submitting items to CGSpace
<ul>
<li>Looking at the data I see that connections skyrocketed since DSpace 6 upgrade yesterday, and they are all in &ldquo;waiting for lock&rdquo; state:</li>
</ul>
</li>
</ul>
<p><img src="/cgspace-notes/2020/11/postgres_connections_ALL-week.png" alt="PostgreSQL connections week">
<img src="/cgspace-notes/2020/11/postgres_locks_ALL-week.png" alt="PostgreSQL locks week"></p>
<ul>
<li>There are almost 1,500 locks:</li>
</ul>
<pre><code>$ psql -c 'SELECT * FROM pg_locks pl LEFT JOIN pg_stat_activity psa ON pl.pid = psa.pid;' | wc -l
1494
</code></pre><ul>
<li>I sent a mail to the dspace-tech mailing list to ask for help&hellip;
<ul>
<li>For now I just restarted PostgreSQL and a few users were able to complete submissions&hellip;</li>
</ul>
</li>
<li>While processing the statistics-2018 Solr core I got the <em>same</em> memory error that I have gotten every time I processed this core in testing:</li>
</ul>
<pre><code>Exception: Java heap space
java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOf(Arrays.java:3332)
at java.lang.AbstractStringBuilder.ensureCapacityInternal(AbstractStringBuilder.java:124)
at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:448)
at java.lang.StringBuffer.append(StringBuffer.java:270)
at java.io.StringWriter.write(StringWriter.java:101)
at org.apache.solr.common.util.XML.writeXML(XML.java:133)
at org.apache.solr.client.solrj.util.ClientUtils.writeVal(SourceFile:160)
at org.apache.solr.client.solrj.util.ClientUtils.writeXML(SourceFile:128)
at org.apache.solr.client.solrj.request.UpdateRequest.writeXML(UpdateRequest.java:365)
at org.apache.solr.client.solrj.request.UpdateRequest.getXML(UpdateRequest.java:281)
at org.apache.solr.client.solrj.request.RequestWriter.getContentStream(RequestWriter.java:67)
at org.apache.solr.client.solrj.request.RequestWriter$LazyContentStream.getDelegate(RequestWriter.java:95)
at org.apache.solr.client.solrj.request.RequestWriter$LazyContentStream.getName(RequestWriter.java:105)
at org.apache.solr.client.solrj.impl.HttpSolrServer.createMethod(HttpSolrServer.java:302)
at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:210)
at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:206)
at org.apache.solr.client.solrj.request.AbstractUpdateRequest.process(AbstractUpdateRequest.java:124)
at org.apache.solr.client.solrj.SolrServer.add(SolrServer.java:68)
at org.apache.solr.client.solrj.SolrServer.add(SolrServer.java:54)
at org.dspace.util.SolrUpgradePre6xStatistics.batchUpdateStats(SolrUpgradePre6xStatistics.java:161)
at org.dspace.util.SolrUpgradePre6xStatistics.run(SolrUpgradePre6xStatistics.java:456)
at org.dspace.util.SolrUpgradePre6xStatistics.main(SolrUpgradePre6xStatistics.java:365)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.dspace.app.launcher.ScriptLauncher.runOneCommand(ScriptLauncher.java:229)
at org.dspace.app.launcher.ScriptLauncher.main(ScriptLauncher.java:81)
</code></pre><ul>
<li>I increased the Java heap memory to 4096MB and restarted the processing
<ul>
<li>After a few hours I got the following error, which I have gotten several times over the last few months:</li>
</ul>
</li>
</ul>
<pre><code>Exception: Error while creating field 'p_group_id{type=uuid,properties=indexed,stored,multiValued}' from value '10'
org.apache.solr.client.solrj.impl.HttpSolrServer$RemoteSolrException: Error while creating field 'p_group_id{type=uuid,properties=indexed,stored,multiValued}' from value '10'
at org.apache.solr.client.solrj.impl.HttpSolrServer.executeMethod(HttpSolrServer.java:552)
at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:210)
at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:206)
at org.apache.solr.client.solrj.request.AbstractUpdateRequest.process(AbstractUpdateRequest.java:124)
at org.apache.solr.client.solrj.SolrServer.add(SolrServer.java:68)
at org.apache.solr.client.solrj.SolrServer.add(SolrServer.java:54)
at org.dspace.util.SolrUpgradePre6xStatistics.batchUpdateStats(SolrUpgradePre6xStatistics.java:161)
at org.dspace.util.SolrUpgradePre6xStatistics.run(SolrUpgradePre6xStatistics.java:456)
at org.dspace.util.SolrUpgradePre6xStatistics.main(SolrUpgradePre6xStatistics.java:365)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.dspace.app.launcher.ScriptLauncher.runOneCommand(ScriptLauncher.java:229)
at org.dspace.app.launcher.ScriptLauncher.main(ScriptLauncher.java:81)
</code></pre><h2 id="2020-11-17">2020-11-17</h2>
<ul>
<li>Chat with Peter about using some remaining CRP Livestock open access money to fund more work on OpenRXV / AReS
<ul>
<li>I will create GitHub issues for each of the things we talked about and then create ToRs to send to CodeObia for a quote</li>
</ul>
</li>
<li>Continue migrating Solr statistics to DSpace 6 UUID format after the upgrade on Sunday</li>
<li>Regarding the IWMI issue about flagships and strategic priorities we can use CRP Livestock as an example because all their <a href="https://cgspace.cgiar.org/handle/10568/80102">flagships are mapped to collections</a></li>
<li>Database issues are worse today&hellip;</li>
</ul>
<p><img src="/cgspace-notes/2020/11/postgres_connections_ALL-week2.png" alt="PostgreSQL connections week">
<img src="/cgspace-notes/2020/11/postgres_locks_ALL-week2.png" alt="PostgreSQL locks week"></p>
<ul>
<li>There are over 2,000 locks:</li>
</ul>
<pre><code>$ psql -c 'SELECT * FROM pg_locks pl LEFT JOIN pg_stat_activity psa ON pl.pid = psa.pid;' | wc -l
2071
</code></pre><!-- raw HTML omitted -->
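A raw <code>wc -l</code> count mixes granted locks with waiting ones; for triage it helps to group the join output by the <code>granted</code> column. A sketch of that aggregation in Python, with made-up sample rows standing in for the live pg_locks / pg_stat_activity join output above:

```python
from collections import Counter

# Sketch: separate lock *holders* (granted=True) from *waiters*
# (granted=False) in pg_locks output. The sample rows are invented;
# real rows would come from the psql join used above.

rows = [
    {"pid": 1001, "granted": True,  "mode": "RowExclusiveLock"},
    {"pid": 1002, "granted": False, "mode": "RowExclusiveLock"},
    {"pid": 1002, "granted": True,  "mode": "AccessShareLock"},
    {"pid": 1003, "granted": False, "mode": "RowExclusiveLock"},
]

waiting = [r for r in rows if not r["granted"]]
by_mode = Counter(r["mode"] for r in waiting)
print(len(waiting), dict(by_mode))  # 2 {'RowExclusiveLock': 2}
```

In live psql the same triage is a <code>GROUP BY granted, mode</code> over the join.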
</article>
</div> <!-- /.blog-main -->
<aside class="col-sm-3 ml-auto blog-sidebar">
<section class="sidebar-module">
<h4>Recent Posts</h4>
<ol class="list-unstyled">
<li><a href="/cgspace-notes/cgspace-dspace6-upgrade/">CGSpace DSpace 6 Upgrade</a></li>
<li><a href="/cgspace-notes/2020-11/">November, 2020</a></li>
<li><a href="/cgspace-notes/2020-10/">October, 2020</a></li>
<li><a href="/cgspace-notes/2020-09/">September, 2020</a></li>
<li><a href="/cgspace-notes/2020-08/">August, 2020</a></li>
</ol>
</section>
<section class="sidebar-module">
<h4>Links</h4>
<ol class="list-unstyled">
<li><a href="https://cgspace.cgiar.org">CGSpace</a></li>
<li><a href="https://dspacetest.cgiar.org">DSpace Test</a></li>
<li><a href="https://github.com/ilri/DSpace">CGSpace @ GitHub</a></li>
</ol>
</section>
</aside>
</div> <!-- /.row -->
</div> <!-- /.container -->
<footer class="blog-footer">
<p dir="auto">
Blog template created by <a href="https://twitter.com/mdo">@mdo</a>, ported to Hugo by <a href='https://twitter.com/mralanorth'>@mralanorth</a>.
</p>
<p>
<a href="#">Back to top</a>
</p>
</footer>
</body>
</html>