diff --git a/content/posts/2022-06.md b/content/posts/2022-06.md
index c0197aa5f..a197ddc17 100644
--- a/content/posts/2022-06.md
+++ b/content/posts/2022-06.md
@@ -64,4 +64,14 @@ categories: ["Notes"]
- Once I downed and upped AReS with docker-compose I was able to start a new harvest
- I also did some tests to enable ES2020 target in the backend because we're on Node.js 14 there now
+## 2022-06-13
+
+- Create a user for Mohammed Salem to test MEL submission on DSpace Test:
+
+```console
+$ dspace user -a -m mel-submit@cgiar.org -g MEL -s Submit -p 'owwwwwwww'
+```
+
+- According to my notes from [2020-10]({{< relref "2020-10.md" >}}) the account must be in the admin group in order to submit via the REST API
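+
+- A quick way to sanity-check the new account against the REST API (a rough sketch, assuming DSpace Test still exposes the legacy REST API at `/rest` and using the credentials above — the `dspacetest.cgiar.org` hostname here is illustrative):
+
+```console
+$ # Log in and store the session cookie, then check authenticated status
+$ curl -s -c /tmp/cookies.txt -H "Content-Type: application/json" -d '{"email":"mel-submit@cgiar.org","password":"owwwwwwww"}' https://dspacetest.cgiar.org/rest/login
+$ curl -s -b /tmp/cookies.txt https://dspacetest.cgiar.org/rest/status
+```
+
+- If `authenticated` comes back `false`, or submission fails with HTTP 401, that would be consistent with the admin-group requirement noted above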
+
diff --git a/docs/2015-11/index.html b/docs/2015-11/index.html
index 88c04a4fa..857d3f1a9 100644
--- a/docs/2015-11/index.html
+++ b/docs/2015-11/index.html
@@ -34,7 +34,7 @@ Last week I had increased the limit from 30 to 60, which seemed to help, but now
$ psql -c 'SELECT * from pg_stat_activity;' | grep idle | grep -c cgspace
78
"/>
-
+
diff --git a/docs/2015-12/index.html b/docs/2015-12/index.html
index 5e3a2bd6e..8574554ac 100644
--- a/docs/2015-12/index.html
+++ b/docs/2015-12/index.html
@@ -36,7 +36,7 @@ Replace lzop with xz in log compression cron jobs on DSpace Test—it uses less
-rw-rw-r-- 1 tomcat7 tomcat7 387K Nov 18 23:59 dspace.log.2015-11-18.lzo
-rw-rw-r-- 1 tomcat7 tomcat7 169K Nov 18 23:59 dspace.log.2015-11-18.xz
"/>
-
+
diff --git a/docs/2016-01/index.html b/docs/2016-01/index.html
index eff41481f..afec4bd5d 100644
--- a/docs/2016-01/index.html
+++ b/docs/2016-01/index.html
@@ -28,7 +28,7 @@ Move ILRI collection 10568/12503 from 10568/27869 to 10568/27629 using the move_
I realized it is only necessary to clear the Cocoon cache after moving collections—rather than reindexing—as no metadata has changed, and therefore no search or browse indexes need to be updated.
Update GitHub wiki for documentation of maintenance tasks.
"/>
-
+
diff --git a/docs/2016-02/index.html b/docs/2016-02/index.html
index b91a1d7d2..895d71a4f 100644
--- a/docs/2016-02/index.html
+++ b/docs/2016-02/index.html
@@ -38,7 +38,7 @@ I noticed we have a very interesting list of countries on CGSpace:
Not only are there 49,000 countries, we have some blanks (25)…
Also, lots of things like “COTE D`LVOIRE” and “COTE D IVOIRE”
"/>
-
+
diff --git a/docs/2016-03/index.html b/docs/2016-03/index.html
index 5c38fb17e..2cba1260e 100644
--- a/docs/2016-03/index.html
+++ b/docs/2016-03/index.html
@@ -28,7 +28,7 @@ Looking at issues with author authorities on CGSpace
For some reason we still have the index-lucene-update cron job active on CGSpace, but I’m pretty sure we don’t need it as of the latest few versions of Atmire’s Listings and Reports module
Reinstall my local (Mac OS X) DSpace stack with Tomcat 7, PostgreSQL 9.3, and Java JDK 1.7 to match environment on CGSpace server
"/>
-
+
diff --git a/docs/2016-04/index.html b/docs/2016-04/index.html
index 94575d29b..f005d101d 100644
--- a/docs/2016-04/index.html
+++ b/docs/2016-04/index.html
@@ -32,7 +32,7 @@ After running DSpace for over five years I’ve never needed to look in any
This will save us a few gigs of backup space we’re paying for on S3
Also, I noticed the checker log has some errors we should pay attention to:
"/>
-
+
diff --git a/docs/2016-05/index.html b/docs/2016-05/index.html
index 94705b388..e7aa7afdf 100644
--- a/docs/2016-05/index.html
+++ b/docs/2016-05/index.html
@@ -34,7 +34,7 @@ There are 3,000 IPs accessing the REST API in a 24-hour period!
# awk '{print $1}' /var/log/nginx/rest.log | uniq | wc -l
3168
"/>
-
+
diff --git a/docs/2016-06/index.html b/docs/2016-06/index.html
index 8d4c15ff4..11548e0ca 100644
--- a/docs/2016-06/index.html
+++ b/docs/2016-06/index.html
@@ -34,7 +34,7 @@ This is their publications set: http://ebrary.ifpri.org/oai/oai.php?verb=ListRec
You can see the others by using the OAI ListSets verb: http://ebrary.ifpri.org/oai/oai.php?verb=ListSets
Working on second phase of metadata migration, looks like this will work for moving CPWF-specific data in dc.identifier.fund to cg.identifier.cpwfproject and then the rest to dc.description.sponsorship
"/>
-
+
diff --git a/docs/2016-07/index.html b/docs/2016-07/index.html
index 833df59e2..b3c9f85de 100644
--- a/docs/2016-07/index.html
+++ b/docs/2016-07/index.html
@@ -44,7 +44,7 @@ dspacetest=# select text_value from metadatavalue where metadata_field_id=3 and
In this case the select query was showing 95 results before the update
"/>
-
+
diff --git a/docs/2016-08/index.html b/docs/2016-08/index.html
index 3f7145fa2..54598e23f 100644
--- a/docs/2016-08/index.html
+++ b/docs/2016-08/index.html
@@ -42,7 +42,7 @@ $ git checkout -b 55new 5_x-prod
$ git reset --hard ilri/5_x-prod
$ git rebase -i dspace-5.5
"/>
-
+
diff --git a/docs/2016-09/index.html b/docs/2016-09/index.html
index 1f76c3c3c..4a3fe92ad 100644
--- a/docs/2016-09/index.html
+++ b/docs/2016-09/index.html
@@ -34,7 +34,7 @@ It looks like we might be able to use OUs now, instead of DCs:
$ ldapsearch -x -H ldaps://svcgroot2.cgiarad.org:3269/ -b "dc=cgiarad,dc=org" -D "admigration1@cgiarad.org" -W "(sAMAccountName=admigration1)"
"/>
-
+
diff --git a/docs/2016-10/index.html b/docs/2016-10/index.html
index ba4a27b43..3f31b12fa 100644
--- a/docs/2016-10/index.html
+++ b/docs/2016-10/index.html
@@ -42,7 +42,7 @@ I exported a random item’s metadata as CSV, deleted all columns except id
0000-0002-6115-0956||0000-0002-3812-8793||0000-0001-7462-405X
"/>
-
+
diff --git a/docs/2016-11/index.html b/docs/2016-11/index.html
index 46c820c38..0d449141a 100644
--- a/docs/2016-11/index.html
+++ b/docs/2016-11/index.html
@@ -26,7 +26,7 @@ Add dc.type to the output options for Atmire’s Listings and Reports module
Add dc.type to the output options for Atmire’s Listings and Reports module (#286)
"/>
-
+
diff --git a/docs/2016-12/index.html b/docs/2016-12/index.html
index c3fe7ba4d..8c8e15792 100644
--- a/docs/2016-12/index.html
+++ b/docs/2016-12/index.html
@@ -46,7 +46,7 @@ I see thousands of them in the logs for the last few months, so it’s not r
I’ve raised a ticket with Atmire to ask
Another worrying error from dspace.log is:
"/>
-
+
diff --git a/docs/2017-01/index.html b/docs/2017-01/index.html
index b2bf4db16..0e901a56d 100644
--- a/docs/2017-01/index.html
+++ b/docs/2017-01/index.html
@@ -28,7 +28,7 @@ I checked to see if the Solr sharding task that is supposed to run on January 1s
I tested on DSpace Test as well and it doesn’t work there either
I asked on the dspace-tech mailing list because it seems to be broken, and actually now I’m not sure if we’ve ever had the sharding task run successfully over all these years
"/>
-
+
diff --git a/docs/2017-02/index.html b/docs/2017-02/index.html
index 306f37095..9e468ce0a 100644
--- a/docs/2017-02/index.html
+++ b/docs/2017-02/index.html
@@ -50,7 +50,7 @@ DELETE 1
Create issue on GitHub to track the addition of CCAFS Phase II project tags (#301)
Looks like we’ll be using cg.identifier.ccafsprojectpii as the field name
"/>
-
+
diff --git a/docs/2017-03/index.html b/docs/2017-03/index.html
index b09a062cd..0570db48f 100644
--- a/docs/2017-03/index.html
+++ b/docs/2017-03/index.html
@@ -54,7 +54,7 @@ Interestingly, it seems DSpace 4.x’s thumbnails were sRGB, but forcing reg
$ identify ~/Desktop/alc_contrastes_desafios.jpg
/Users/aorth/Desktop/alc_contrastes_desafios.jpg JPEG 464x600 464x600+0+0 8-bit CMYK 168KB 0.000u 0:00.000
"/>
-
+
diff --git a/docs/2017-04/index.html b/docs/2017-04/index.html
index 4b72e9912..9e72abadc 100644
--- a/docs/2017-04/index.html
+++ b/docs/2017-04/index.html
@@ -40,7 +40,7 @@ Testing the CMYK patch on a collection with 650 items:
$ [dspace]/bin/dspace filter-media -f -i 10568/16498 -p "ImageMagick PDF Thumbnail" -v >& /tmp/filter-media-cmyk.txt
"/>
-
+
diff --git a/docs/2017-05/index.html b/docs/2017-05/index.html
index 31b861726..87985125d 100644
--- a/docs/2017-05/index.html
+++ b/docs/2017-05/index.html
@@ -7,7 +7,7 @@
-
+
@@ -17,8 +17,8 @@
-
-
+
+
diff --git a/docs/2017-06/index.html b/docs/2017-06/index.html
index b0b192077..7eee4f026 100644
--- a/docs/2017-06/index.html
+++ b/docs/2017-06/index.html
@@ -7,7 +7,7 @@
-
+
@@ -17,8 +17,8 @@
-
-
+
+
diff --git a/docs/2017-07/index.html b/docs/2017-07/index.html
index f2037f2e7..4766516c0 100644
--- a/docs/2017-07/index.html
+++ b/docs/2017-07/index.html
@@ -36,7 +36,7 @@ Merge changes for WLE Phase II theme rename (#329)
Looking at extracting the metadata registries from ICARDA’s MEL DSpace database so we can compare fields with CGSpace
We can use PostgreSQL’s extended output format (-x) plus sed to format the output into quasi XML:
"/>
-
+
diff --git a/docs/2017-08/index.html b/docs/2017-08/index.html
index 36faa52a8..43552d892 100644
--- a/docs/2017-08/index.html
+++ b/docs/2017-08/index.html
@@ -60,7 +60,7 @@ This was due to newline characters in the dc.description.abstract column, which
I exported a new CSV from the collection on DSpace Test and then manually removed the characters in vim using g/^$/d
Then I cleaned up the author authorities and HTML characters in OpenRefine and sent the file back to Abenet
"/>
-
+
diff --git a/docs/2017-09/index.html b/docs/2017-09/index.html
index 88795bb6e..3661505fa 100644
--- a/docs/2017-09/index.html
+++ b/docs/2017-09/index.html
@@ -32,7 +32,7 @@ Linode sent an alert that CGSpace (linode18) was using 261% CPU for the past two
Ask Sisay to clean up the WLE approvers a bit, as Marianne’s user account is both in the approvers step as well as the group
"/>
-
+
diff --git a/docs/2017-10/index.html b/docs/2017-10/index.html
index 1826da2fe..b2eaef714 100644
--- a/docs/2017-10/index.html
+++ b/docs/2017-10/index.html
@@ -34,7 +34,7 @@ http://hdl.handle.net/10568/78495||http://hdl.handle.net/10568/79336
There appears to be a pattern but I’ll have to look a bit closer and try to clean them up automatically, either in SQL or in OpenRefine
Add Katherine Lutz to the groups for content submission and edit steps of the CGIAR System collections
"/>
-
+
diff --git a/docs/2017-11/index.html b/docs/2017-11/index.html
index d566d938a..8687d40b0 100644
--- a/docs/2017-11/index.html
+++ b/docs/2017-11/index.html
@@ -48,7 +48,7 @@ Generate list of authors on CGSpace for Peter to go through and correct:
dspace=# \copy (select distinct text_value, count(*) as count from metadatavalue where metadata_field_id = (select metadata_field_id from metadatafieldregistry where element = 'contributor' and qualifier = 'author') AND resource_type_id = 2 group by text_value order by count desc) to /tmp/authors.csv with csv;
COPY 54701
"/>
-
+
diff --git a/docs/2017-12/index.html b/docs/2017-12/index.html
index d77050024..84fe28bf5 100644
--- a/docs/2017-12/index.html
+++ b/docs/2017-12/index.html
@@ -30,7 +30,7 @@ The logs say “Timeout waiting for idle object”
PostgreSQL activity says there are 115 connections currently
The list of connections to XMLUI and REST API for today:
"/>
-
+
diff --git a/docs/2018-01/index.html b/docs/2018-01/index.html
index 1e2e823ae..9aa385f87 100644
--- a/docs/2018-01/index.html
+++ b/docs/2018-01/index.html
@@ -150,7 +150,7 @@ dspace.log.2018-01-02:34
Danny wrote to ask for help renewing the wildcard ilri.org certificate and I advised that we should probably use Let’s Encrypt if it’s just a handful of domains
"/>
-
+
diff --git a/docs/2018-02/index.html b/docs/2018-02/index.html
index 78add50e8..ae5414c82 100644
--- a/docs/2018-02/index.html
+++ b/docs/2018-02/index.html
@@ -30,7 +30,7 @@ We don’t need to distinguish between internal and external works, so that
Yesterday I figured out how to monitor DSpace sessions using JMX
I copied the logic in the jmx_tomcat_dbpools provided by Ubuntu’s munin-plugins-java package and used the stuff I discovered about JMX in 2018-01
"/>
-
+
diff --git a/docs/2018-03/index.html b/docs/2018-03/index.html
index 381d89881..0d98661c7 100644
--- a/docs/2018-03/index.html
+++ b/docs/2018-03/index.html
@@ -24,7 +24,7 @@ Export a CSV of the IITA community metadata for Martin Mueller
Export a CSV of the IITA community metadata for Martin Mueller
"/>
-
+
diff --git a/docs/2018-04/index.html b/docs/2018-04/index.html
index b01f8c280..d7bc7edc2 100644
--- a/docs/2018-04/index.html
+++ b/docs/2018-04/index.html
@@ -26,7 +26,7 @@ Catalina logs at least show some memory errors yesterday:
I tried to test something on DSpace Test but noticed that it’s down since god knows when
Catalina logs at least show some memory errors yesterday:
"/>
-
+
diff --git a/docs/2018-05/index.html b/docs/2018-05/index.html
index 183454303..e752de5d3 100644
--- a/docs/2018-05/index.html
+++ b/docs/2018-05/index.html
@@ -38,7 +38,7 @@ http://localhost:3000/solr/statistics/update?stream.body=%3Ccommit/%3E
Then I reduced the JVM heap size from 6144 back to 5120m
Also, I switched it to use OpenJDK instead of Oracle Java, as well as re-worked the Ansible infrastructure scripts to support hosts choosing which distribution they want to use
"/>
-
+
diff --git a/docs/2018-06/index.html b/docs/2018-06/index.html
index 256243311..476c5dc3e 100644
--- a/docs/2018-06/index.html
+++ b/docs/2018-06/index.html
@@ -58,7 +58,7 @@ real 74m42.646s
user 8m5.056s
sys 2m7.289s
"/>
-
+
diff --git a/docs/2018-07/index.html b/docs/2018-07/index.html
index 8a6b24986..ae5dd9af8 100644
--- a/docs/2018-07/index.html
+++ b/docs/2018-07/index.html
@@ -36,7 +36,7 @@ During the mvn package stage on the 5.8 branch I kept getting issues with java r
There is insufficient memory for the Java Runtime Environment to continue.
"/>
-
+
diff --git a/docs/2018-08/index.html b/docs/2018-08/index.html
index 86f8c4d49..6a67bf202 100644
--- a/docs/2018-08/index.html
+++ b/docs/2018-08/index.html
@@ -46,7 +46,7 @@ Anyways, perhaps I should increase the JVM heap from 5120m to 6144m like we did
The server only has 8GB of RAM so we’ll eventually need to upgrade to a larger one because we’ll start starving the OS, PostgreSQL, and command line batch processes
I ran all system updates on DSpace Test and rebooted it
"/>
-
+
diff --git a/docs/2018-09/index.html b/docs/2018-09/index.html
index c22d8eb84..b05ed853a 100644
--- a/docs/2018-09/index.html
+++ b/docs/2018-09/index.html
@@ -30,7 +30,7 @@ I’ll update the DSpace role in our Ansible infrastructure playbooks and ru
Also, I’ll re-run the postgresql tasks because the custom PostgreSQL variables are dynamic according to the system’s RAM, and we never re-ran them after migrating to larger Linodes last month
I’m testing the new DSpace 5.8 branch in my Ubuntu 18.04 environment and I’m getting those autowire errors in Tomcat 8.5.30 again:
"/>
-
+
diff --git a/docs/2018-10/index.html b/docs/2018-10/index.html
index 814528bf8..d95bd906d 100644
--- a/docs/2018-10/index.html
+++ b/docs/2018-10/index.html
@@ -26,7 +26,7 @@ I created a GitHub issue to track this #389, because I’m super busy in Nai
Phil Thornton got an ORCID identifier so we need to add it to the list on CGSpace and tag his existing items
I created a GitHub issue to track this #389, because I’m super busy in Nairobi right now
"/>
-
+
diff --git a/docs/2018-11/index.html b/docs/2018-11/index.html
index a0b42aadc..8c5a878c2 100644
--- a/docs/2018-11/index.html
+++ b/docs/2018-11/index.html
@@ -36,7 +36,7 @@ Send a note about my dspace-statistics-api to the dspace-tech mailing list
Linode has been sending mails a few times a day recently that CGSpace (linode18) has had high CPU usage
Today these are the top 10 IPs:
"/>
-
+
diff --git a/docs/2018-12/index.html b/docs/2018-12/index.html
index cabe74caa..77e5563d5 100644
--- a/docs/2018-12/index.html
+++ b/docs/2018-12/index.html
@@ -36,7 +36,7 @@ Then I ran all system updates and restarted the server
I noticed that there is another issue with PDF thumbnails on CGSpace, and I see there was another Ghostscript vulnerability last week
"/>
-
+
diff --git a/docs/2019-01/index.html b/docs/2019-01/index.html
index e36f62ce0..969d7ec9b 100644
--- a/docs/2019-01/index.html
+++ b/docs/2019-01/index.html
@@ -50,7 +50,7 @@ I don’t see anything interesting in the web server logs around that time t
357 207.46.13.1
903 54.70.40.11
"/>
-
+
diff --git a/docs/2019-02/index.html b/docs/2019-02/index.html
index 722c7252f..e80e959ab 100644
--- a/docs/2019-02/index.html
+++ b/docs/2019-02/index.html
@@ -72,7 +72,7 @@ real 0m19.873s
user 0m22.203s
sys 0m1.979s
"/>
-
+
diff --git a/docs/2019-03/index.html b/docs/2019-03/index.html
index 6f595695a..75f934336 100644
--- a/docs/2019-03/index.html
+++ b/docs/2019-03/index.html
@@ -46,7 +46,7 @@ Most worryingly, there are encoding errors in the abstracts for eleven items, fo
I think I will need to ask Udana to re-copy and paste the abstracts with more care using Google Docs
"/>
-
+
diff --git a/docs/2019-04/index.html b/docs/2019-04/index.html
index 11b51e079..ad0142bc1 100644
--- a/docs/2019-04/index.html
+++ b/docs/2019-04/index.html
@@ -64,7 +64,7 @@ $ ./fix-metadata-values.py -i /tmp/2019-02-21-fix-4-regions.csv -db dspace -u ds
$ ./delete-metadata-values.py -i /tmp/2019-02-21-delete-2-countries.csv -db dspace -u dspace -p 'fuuu' -m 228 -f cg.coverage.country -d
$ ./delete-metadata-values.py -i /tmp/2019-02-21-delete-1-region.csv -db dspace -u dspace -p 'fuuu' -m 231 -f cg.coverage.region -d
"/>
-
+
diff --git a/docs/2019-05/index.html b/docs/2019-05/index.html
index 9b77bf73d..2d15cc69a 100644
--- a/docs/2019-05/index.html
+++ b/docs/2019-05/index.html
@@ -48,7 +48,7 @@ DELETE 1
But after this I tried to delete the item from the XMLUI and it is still present…
"/>
-
+
diff --git a/docs/2019-06/index.html b/docs/2019-06/index.html
index cc3addbb2..fef588bc1 100644
--- a/docs/2019-06/index.html
+++ b/docs/2019-06/index.html
@@ -34,7 +34,7 @@ Run system updates on CGSpace (linode18) and reboot it
Skype with Marie-Angélique and Abenet about CG Core v2
"/>
-
+
diff --git a/docs/2019-07/index.html b/docs/2019-07/index.html
index 232317c2e..407487b0e 100644
--- a/docs/2019-07/index.html
+++ b/docs/2019-07/index.html
@@ -38,7 +38,7 @@ CGSpace
Abenet had another similar issue a few days ago when trying to find the stats for 2018 in the RTB community
"/>
-
+
diff --git a/docs/2019-08/index.html b/docs/2019-08/index.html
index 90c9b6595..fdec6a0d6 100644
--- a/docs/2019-08/index.html
+++ b/docs/2019-08/index.html
@@ -46,7 +46,7 @@ After rebooting, all statistics cores were loaded… wow, that’s luck
Run system updates on DSpace Test (linode19) and reboot it
"/>
-
+
diff --git a/docs/2019-09/index.html b/docs/2019-09/index.html
index 37bc47b6e..2dc91f3a9 100644
--- a/docs/2019-09/index.html
+++ b/docs/2019-09/index.html
@@ -72,7 +72,7 @@ Here are the top ten IPs in the nginx XMLUI and REST/OAI logs this morning:
7249 2a01:7e00::f03c:91ff:fe18:7396
9124 45.5.186.2
"/>
-
+
diff --git a/docs/2019-10/index.html b/docs/2019-10/index.html
index 31684d73c..d32443480 100644
--- a/docs/2019-10/index.html
+++ b/docs/2019-10/index.html
@@ -7,7 +7,7 @@
-
+
@@ -17,8 +17,8 @@
-
-
+
+
diff --git a/docs/2019-11/index.html b/docs/2019-11/index.html
index cfc2be3de..0b4d3230f 100644
--- a/docs/2019-11/index.html
+++ b/docs/2019-11/index.html
@@ -58,7 +58,7 @@ Let’s see how many of the REST API requests were for bitstreams (because t
# zcat --force /var/log/nginx/rest.log.*.gz | grep -E "[0-9]{1,2}/Oct/2019" | grep -c -E "/rest/bitstreams"
106781
"/>
-
+
diff --git a/docs/2019-12/index.html b/docs/2019-12/index.html
index 33d4a1fa4..fa04676cd 100644
--- a/docs/2019-12/index.html
+++ b/docs/2019-12/index.html
@@ -46,7 +46,7 @@ Make sure all packages are up to date and the package manager is up to date, the
# dpkg -C
# reboot
"/>
-
+
diff --git a/docs/2020-01/index.html b/docs/2020-01/index.html
index fa8dca8be..9664bbff0 100644
--- a/docs/2020-01/index.html
+++ b/docs/2020-01/index.html
@@ -56,7 +56,7 @@ I tweeted the CGSpace repository link
"/>
-
+
diff --git a/docs/2020-02/index.html b/docs/2020-02/index.html
index 599436ee2..6537c31d0 100644
--- a/docs/2020-02/index.html
+++ b/docs/2020-02/index.html
@@ -38,7 +38,7 @@ The code finally builds and runs with a fresh install
"/>
-
+
@@ -48,7 +48,7 @@ The code finally builds and runs with a fresh install
"@type": "BlogPosting",
"headline": "February, 2020",
"url": "https://alanorth.github.io/cgspace-notes/2020-02/",
- "wordCount": "7238",
+ "wordCount": "7239",
"datePublished": "2020-02-02T11:56:30+02:00",
"dateModified": "2022-05-05T16:50:10+03:00",
"author": {
diff --git a/docs/2020-03/index.html b/docs/2020-03/index.html
index 6d9ff71ad..21472ee8f 100644
--- a/docs/2020-03/index.html
+++ b/docs/2020-03/index.html
@@ -42,7 +42,7 @@ You need to download this into the DSpace 6.x source and compile it
"/>
-
+
diff --git a/docs/2020-04/index.html b/docs/2020-04/index.html
index ed16f8691..eed790914 100644
--- a/docs/2020-04/index.html
+++ b/docs/2020-04/index.html
@@ -48,7 +48,7 @@ The third item now has a donut with score 1 since I tweeted it last week
On the same note, the one item Abenet pointed out last week now has a donut with score of 104 after I tweeted it last week
"/>
-
+
diff --git a/docs/2020-05/index.html b/docs/2020-05/index.html
index b1d15d345..cb13418e7 100644
--- a/docs/2020-05/index.html
+++ b/docs/2020-05/index.html
@@ -34,7 +34,7 @@ I see that CGSpace (linode18) is still using PostgreSQL JDBC driver version 42.2
"/>
-
+
diff --git a/docs/2020-06/index.html b/docs/2020-06/index.html
index b1c27c9d7..e1658b3f7 100644
--- a/docs/2020-06/index.html
+++ b/docs/2020-06/index.html
@@ -36,7 +36,7 @@ I sent Atmire the dspace.log from today and told them to log into the server to
In other news, I checked the statistics API on DSpace 6 and it’s working
I tried to build the OAI registry on the freshly migrated DSpace 6 on DSpace Test and I get an error:
"/>
-
+
diff --git a/docs/2020-07/index.html b/docs/2020-07/index.html
index 26a928b24..2193394b5 100644
--- a/docs/2020-07/index.html
+++ b/docs/2020-07/index.html
@@ -38,7 +38,7 @@ I restarted Tomcat and PostgreSQL and the issue was gone
Since I was restarting Tomcat anyways I decided to redeploy the latest changes from the 5_x-prod branch and I added a note about COVID-19 items to the CGSpace frontpage at Peter’s request
"/>
-
+
diff --git a/docs/2020-08/index.html b/docs/2020-08/index.html
index 1cc7a39b5..8df6e3d80 100644
--- a/docs/2020-08/index.html
+++ b/docs/2020-08/index.html
@@ -36,7 +36,7 @@ It is class based so I can easily add support for other vocabularies, and the te
"/>
-
+
diff --git a/docs/2020-09/index.html b/docs/2020-09/index.html
index 89341db8f..b206a6848 100644
--- a/docs/2020-09/index.html
+++ b/docs/2020-09/index.html
@@ -48,7 +48,7 @@ I filed a bug on OpenRXV: https://github.com/ilri/OpenRXV/issues/39
I filed an issue on OpenRXV to make some minor edits to the admin UI: https://github.com/ilri/OpenRXV/issues/40
"/>
-
+
diff --git a/docs/2020-10/index.html b/docs/2020-10/index.html
index 4a8356763..927dd37b4 100644
--- a/docs/2020-10/index.html
+++ b/docs/2020-10/index.html
@@ -44,7 +44,7 @@ During the FlywayDB migration I got an error:
"/>
-
+
diff --git a/docs/2020-11/index.html b/docs/2020-11/index.html
index 9a7963e64..d4b5b7036 100644
--- a/docs/2020-11/index.html
+++ b/docs/2020-11/index.html
@@ -32,7 +32,7 @@ So far we’ve spent at least fifty hours to process the statistics and stat
"/>
-
+
diff --git a/docs/2020-12/index.html b/docs/2020-12/index.html
index 189b93a1c..fdba2e798 100644
--- a/docs/2020-12/index.html
+++ b/docs/2020-12/index.html
@@ -36,7 +36,7 @@ I started processing those (about 411,000 records):
"/>
-
+
diff --git a/docs/2021-01/index.html b/docs/2021-01/index.html
index f20b723d1..53117feb3 100644
--- a/docs/2021-01/index.html
+++ b/docs/2021-01/index.html
@@ -50,7 +50,7 @@ For example, this item has 51 views on CGSpace, but 0 on AReS
"/>
-
+
diff --git a/docs/2021-02/index.html b/docs/2021-02/index.html
index 9bdd5aeb9..2c1cda0ba 100644
--- a/docs/2021-02/index.html
+++ b/docs/2021-02/index.html
@@ -60,7 +60,7 @@ $ curl -s 'http://localhost:9200/openrxv-items-temp/_count?q=*&pretty
}
}
"/>
-
+
diff --git a/docs/2021-03/index.html b/docs/2021-03/index.html
index d8592678b..0b99f04b5 100644
--- a/docs/2021-03/index.html
+++ b/docs/2021-03/index.html
@@ -34,7 +34,7 @@ Also, we found some issues building and running OpenRXV currently due to ecosyst
"/>
-
+
diff --git a/docs/2021-04/index.html b/docs/2021-04/index.html
index c306b6e7d..93edbdf09 100644
--- a/docs/2021-04/index.html
+++ b/docs/2021-04/index.html
@@ -44,7 +44,7 @@ Perhaps one of the containers crashed, I should have looked closer but I was in
"/>
-
+
@@ -54,7 +54,7 @@ Perhaps one of the containers crashed, I should have looked closer but I was in
"@type": "BlogPosting",
"headline": "April, 2021",
"url": "https://alanorth.github.io/cgspace-notes/2021-04/",
- "wordCount": "4668",
+ "wordCount": "4669",
"datePublished": "2021-04-01T09:50:54+03:00",
"dateModified": "2021-04-28T18:57:48+03:00",
"author": {
diff --git a/docs/2021-05/index.html b/docs/2021-05/index.html
index 66aaa7670..791e9fa72 100644
--- a/docs/2021-05/index.html
+++ b/docs/2021-05/index.html
@@ -36,7 +36,7 @@ I looked at the top user agents and IPs in the Solr statistics for last month an
I will add the RI/1.0 pattern to our DSpace agents overload and purge them from Solr (we had previously seen this agent with 9,000 hits or so in 2020-09), but I think I will leave the Microsoft Word one… as that’s an actual user…
"/>
-
+
diff --git a/docs/2021-06/index.html b/docs/2021-06/index.html
index b66ce7368..5c60d7167 100644
--- a/docs/2021-06/index.html
+++ b/docs/2021-06/index.html
@@ -36,7 +36,7 @@ I simply started it and AReS was running again:
"/>
-
+
diff --git a/docs/2021-07/index.html b/docs/2021-07/index.html
index 337106677..84551c0b0 100644
--- a/docs/2021-07/index.html
+++ b/docs/2021-07/index.html
@@ -30,7 +30,7 @@ Export another list of ALL subjects on CGSpace, including AGROVOC and non-AGROVO
localhost/dspace63= > \COPY (SELECT DISTINCT LOWER(text_value) AS subject, count(*) FROM metadatavalue WHERE dspace_object_id in (SELECT dspace_object_id FROM item) AND metadata_field_id IN (119, 120, 127, 122, 128, 125, 135, 203, 208, 210, 215, 123, 236, 242, 187) GROUP BY subject ORDER BY count DESC) to /tmp/2021-07-01-all-subjects.csv WITH CSV HEADER;
COPY 20994
"/>
-
+
diff --git a/docs/2021-08/index.html b/docs/2021-08/index.html
index bda031060..089541799 100644
--- a/docs/2021-08/index.html
+++ b/docs/2021-08/index.html
@@ -32,7 +32,7 @@ Update Docker images on AReS server (linode20) and reboot the server:
I decided to upgrade linode20 from Ubuntu 18.04 to 20.04
"/>
-
+
diff --git a/docs/2021-09/index.html b/docs/2021-09/index.html
index b3981ff8d..01563e4a5 100644
--- a/docs/2021-09/index.html
+++ b/docs/2021-09/index.html
@@ -48,7 +48,7 @@ The syntax Moayad showed me last month doesn’t seem to honor the search qu
"/>
-
+
diff --git a/docs/2021-10/index.html b/docs/2021-10/index.html
index 181abf3b8..7832c057b 100644
--- a/docs/2021-10/index.html
+++ b/docs/2021-10/index.html
@@ -46,7 +46,7 @@ $ wc -l /tmp/2021-10-01-affiliations.txt
So we have 1879/7100 (26.46%) matching already
"/>
-
+
diff --git a/docs/2021-11/index.html b/docs/2021-11/index.html
index 394679974..c433e3a21 100644
--- a/docs/2021-11/index.html
+++ b/docs/2021-11/index.html
@@ -32,7 +32,7 @@ First I exported all the 2019 stats from CGSpace:
$ ./run.sh -s http://localhost:8081/solr/statistics -f 'time:2019-*' -a export -o statistics-2019.json -k uid
$ zstd statistics-2019.json
"/>
-
+
diff --git a/docs/2021-12/index.html b/docs/2021-12/index.html
index ddf23f5e7..a56e81376 100644
--- a/docs/2021-12/index.html
+++ b/docs/2021-12/index.html
@@ -40,7 +40,7 @@ Purging 455 hits from WhatsApp in statistics
Total number of bot hits purged: 3679
"/>
-
+
diff --git a/docs/2022-01/index.html b/docs/2022-01/index.html
index 1f7276224..b6be69a1e 100644
--- a/docs/2022-01/index.html
+++ b/docs/2022-01/index.html
@@ -24,7 +24,7 @@ Start a full harvest on AReS
Start a full harvest on AReS
"/>
-
+
diff --git a/docs/2022-02/index.html b/docs/2022-02/index.html
index d04bd8f6c..963cf1e8b 100644
--- a/docs/2022-02/index.html
+++ b/docs/2022-02/index.html
@@ -38,7 +38,7 @@ We agreed to try to do more alignment of affiliations/funders with ROR
"/>
-
+
diff --git a/docs/2022-03/index.html b/docs/2022-03/index.html
index dd4610b9a..1f95b7986 100644
--- a/docs/2022-03/index.html
+++ b/docs/2022-03/index.html
@@ -34,7 +34,7 @@ $ ./ilri/check-duplicates.py -i /tmp/tac4.csv -db dspace -u dspace -p 'fuuu&
$ csvcut -c id,filename ~/Downloads/2022-03-01-CGSpace-TAC-ICW-batch4-701-980.csv > /tmp/tac4-filenames.csv
$ csvjoin -c id /tmp/2022-03-01-tac-batch4-701-980.csv /tmp/tac4-filenames.csv > /tmp/2022-03-01-tac-batch4-701-980-filenames.csv
"/>
-
+
diff --git a/docs/2022-04/index.html b/docs/2022-04/index.html
index ab7298d45..ba83c9b78 100644
--- a/docs/2022-04/index.html
+++ b/docs/2022-04/index.html
@@ -7,7 +7,7 @@
-
+
@@ -17,8 +17,8 @@
-
-
+
+
diff --git a/docs/2022-05/index.html b/docs/2022-05/index.html
index 08cc4a413..ddae9fc62 100644
--- a/docs/2022-05/index.html
+++ b/docs/2022-05/index.html
@@ -66,7 +66,7 @@ If I query Solr for time:2022-04* AND dns:*msnbot* AND dns:*.msn.com. I see a ha
I purged 93,974 hits from these IPs using my check-spider-ip-hits.sh script
"/>
-
+
diff --git a/docs/2022-06/index.html b/docs/2022-06/index.html
index 3d6583e5c..e14656888 100644
--- a/docs/2022-06/index.html
+++ b/docs/2022-06/index.html
@@ -48,7 +48,7 @@ There seem to be many more of these:
"/>
-
+
@@ -58,7 +58,7 @@ There seem to be many more of these:
"@type": "BlogPosting",
"headline": "June, 2022",
"url": "https://alanorth.github.io/cgspace-notes/2022-06/",
- "wordCount": "461",
+ "wordCount": "509",
"datePublished": "2022-06-06T09:01:36+03:00",
"dateModified": "2022-06-08T15:36:09+03:00",
"author": {
@@ -204,6 +204,14 @@ There seem to be many more of these:
+
2022-06-13
+
+- Create a user for Mohammed Salem to test MEL submission on DSpace Test:
+
+$ dspace user -a -m mel-submit@cgiar.org -g MEL -s Submit -p 'owwwwwwww'
+
+- According to my notes from 2020-10 the account must be in the admin group in order to submit via the REST API
+
diff --git a/docs/404.html b/docs/404.html
index 38074f695..605ebf6e6 100644
--- a/docs/404.html
+++ b/docs/404.html
@@ -17,7 +17,7 @@
-
+
diff --git a/docs/categories/index.html b/docs/categories/index.html
index b1f03d895..2e6f084ce 100644
--- a/docs/categories/index.html
+++ b/docs/categories/index.html
@@ -17,7 +17,7 @@
-
+
diff --git a/docs/categories/notes/index.html b/docs/categories/notes/index.html
index d9e73fc83..8ed0a581a 100644
--- a/docs/categories/notes/index.html
+++ b/docs/categories/notes/index.html
@@ -17,7 +17,7 @@
-
+
@@ -165,7 +165,7 @@
- 2022-04-01 I did G1GC tests on DSpace Test (linode26) to compliment the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia’s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54.
+ 2022-04-01 I did G1GC tests on DSpace Test (linode26) to compliment the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia’s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54.
Read more →
diff --git a/docs/categories/notes/index.xml b/docs/categories/notes/index.xml
index 303c672c8..00ab781bd 100644
--- a/docs/categories/notes/index.xml
+++ b/docs/categories/notes/index.xml
@@ -70,7 +70,7 @@
Fri, 01 Apr 2022 10:53:39 +0300
https://alanorth.github.io/cgspace-notes/2022-04/
- 2022-04-01 I did G1GC tests on DSpace Test (linode26) to complement the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia’s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54.
+ 2022-04-01 I did G1GC tests on DSpace Test (linode26) to complement the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia’s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54.
-
@@ -686,7 +686,7 @@
Tue, 01 Oct 2019 13:20:51 +0300
https://alanorth.github.io/cgspace-notes/2019-10/
- 2019-10-01 Udana from IWMI asked me for a CSV export of their community on CGSpace I exported it, but a quick run through the csv-metadata-quality tool shows that there are some low-hanging fruits we can fix before I send him the data I will limit the scope to the titles, regions, subregions, and river basins for now to manually fix some non-breaking spaces (U+00A0) there that would otherwise be removed by the csv-metadata-quality script’s “unnecessary Unicode” fix: $ csvcut -c 'id,dc.
+ 2019-10-01 Udana from IWMI asked me for a CSV export of their community on CGSpace I exported it, but a quick run through the csv-metadata-quality tool shows that there are some low-hanging fruits we can fix before I send him the data I will limit the scope to the titles, regions, subregions, and river basins for now to manually fix some non-breaking spaces (U+00A0) there that would otherwise be removed by the csv-metadata-quality script’s “unnecessary Unicode” fix: $ csvcut -c 'id,dc.
-
diff --git a/docs/categories/notes/page/2/index.html b/docs/categories/notes/page/2/index.html
index a9e2482a7..822a5e07a 100644
--- a/docs/categories/notes/page/2/index.html
+++ b/docs/categories/notes/page/2/index.html
@@ -17,7 +17,7 @@
-
+
diff --git a/docs/categories/notes/page/3/index.html b/docs/categories/notes/page/3/index.html
index ee7405258..5604ca8df 100644
--- a/docs/categories/notes/page/3/index.html
+++ b/docs/categories/notes/page/3/index.html
@@ -17,7 +17,7 @@
-
+
diff --git a/docs/categories/notes/page/4/index.html b/docs/categories/notes/page/4/index.html
index a73a95561..8e7af856d 100644
--- a/docs/categories/notes/page/4/index.html
+++ b/docs/categories/notes/page/4/index.html
@@ -17,7 +17,7 @@
-
+
@@ -225,7 +225,7 @@
- 2019-10-01 Udana from IWMI asked me for a CSV export of their community on CGSpace I exported it, but a quick run through the csv-metadata-quality tool shows that there are some low-hanging fruits we can fix before I send him the data I will limit the scope to the titles, regions, subregions, and river basins for now to manually fix some non-breaking spaces (U+00A0) there that would otherwise be removed by the csv-metadata-quality script’s “unnecessary Unicode” fix: $ csvcut -c 'id,dc.
+ 2019-10-01 Udana from IWMI asked me for a CSV export of their community on CGSpace I exported it, but a quick run through the csv-metadata-quality tool shows that there are some low-hanging fruits we can fix before I send him the data I will limit the scope to the titles, regions, subregions, and river basins for now to manually fix some non-breaking spaces (U+00A0) there that would otherwise be removed by the csv-metadata-quality script’s “unnecessary Unicode” fix: $ csvcut -c 'id,dc.
Read more →
diff --git a/docs/categories/notes/page/5/index.html b/docs/categories/notes/page/5/index.html
index 99ce38091..c04513866 100644
--- a/docs/categories/notes/page/5/index.html
+++ b/docs/categories/notes/page/5/index.html
@@ -17,7 +17,7 @@
-
+
diff --git a/docs/categories/notes/page/6/index.html b/docs/categories/notes/page/6/index.html
index 144350d0c..11db409d5 100644
--- a/docs/categories/notes/page/6/index.html
+++ b/docs/categories/notes/page/6/index.html
@@ -17,7 +17,7 @@
-
+
diff --git a/docs/cgiar-library-migration/index.html b/docs/cgiar-library-migration/index.html
index 4cdbecf6e..01742ff04 100644
--- a/docs/cgiar-library-migration/index.html
+++ b/docs/cgiar-library-migration/index.html
@@ -18,7 +18,7 @@
-
+
diff --git a/docs/cgspace-cgcorev2-migration/index.html b/docs/cgspace-cgcorev2-migration/index.html
index e48923ed7..de8a566ae 100644
--- a/docs/cgspace-cgcorev2-migration/index.html
+++ b/docs/cgspace-cgcorev2-migration/index.html
@@ -18,7 +18,7 @@
-
+
diff --git a/docs/cgspace-dspace6-upgrade/index.html b/docs/cgspace-dspace6-upgrade/index.html
index ce7226490..781cb2352 100644
--- a/docs/cgspace-dspace6-upgrade/index.html
+++ b/docs/cgspace-dspace6-upgrade/index.html
@@ -18,7 +18,7 @@
-
+
diff --git a/docs/index.html b/docs/index.html
index 430f2c412..923f0dd6b 100644
--- a/docs/index.html
+++ b/docs/index.html
@@ -17,7 +17,7 @@
-
+
@@ -180,7 +180,7 @@
- 2022-04-01 I did G1GC tests on DSpace Test (linode26) to complement the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia’s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54.
+ 2022-04-01 I did G1GC tests on DSpace Test (linode26) to complement the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia’s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54.
Read more →
diff --git a/docs/index.xml b/docs/index.xml
index b2feb602f..4b1deabb1 100644
--- a/docs/index.xml
+++ b/docs/index.xml
@@ -70,7 +70,7 @@
Fri, 01 Apr 2022 10:53:39 +0300
https://alanorth.github.io/cgspace-notes/2022-04/
- 2022-04-01 I did G1GC tests on DSpace Test (linode26) to complement the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia’s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54.
+ 2022-04-01 I did G1GC tests on DSpace Test (linode26) to complement the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia’s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54.
-
@@ -686,7 +686,7 @@
Tue, 01 Oct 2019 13:20:51 +0300
https://alanorth.github.io/cgspace-notes/2019-10/
- 2019-10-01 Udana from IWMI asked me for a CSV export of their community on CGSpace I exported it, but a quick run through the csv-metadata-quality tool shows that there are some low-hanging fruits we can fix before I send him the data I will limit the scope to the titles, regions, subregions, and river basins for now to manually fix some non-breaking spaces (U+00A0) there that would otherwise be removed by the csv-metadata-quality script’s “unnecessary Unicode” fix: $ csvcut -c 'id,dc.
+ 2019-10-01 Udana from IWMI asked me for a CSV export of their community on CGSpace I exported it, but a quick run through the csv-metadata-quality tool shows that there are some low-hanging fruits we can fix before I send him the data I will limit the scope to the titles, regions, subregions, and river basins for now to manually fix some non-breaking spaces (U+00A0) there that would otherwise be removed by the csv-metadata-quality script’s “unnecessary Unicode” fix: $ csvcut -c 'id,dc.
-
@@ -1327,7 +1327,7 @@ COPY 54701
Thu, 01 Jun 2017 10:14:52 +0300
https://alanorth.github.io/cgspace-notes/2017-06/
- 2017-06-01 After discussion with WLE and CGSpace content people, we decided to just add one metadata field for the WLE Research Themes The cg.identifier.wletheme field will be used for both Phase I and Phase II Research Themes Then we’ll create a new sub-community for Phase II and create collections for the research themes there The current “Research Themes” community will be renamed to “WLE Phase I Research Themes” Tagged all items in the current Phase I collections with their appropriate themes Create pull request to add Phase II research themes to the submission form: #328 Add cg.
+ 2017-06-01 After discussion with WLE and CGSpace content people, we decided to just add one metadata field for the WLE Research Themes The cg.identifier.wletheme field will be used for both Phase I and Phase II Research Themes Then we’ll create a new sub-community for Phase II and create collections for the research themes there The current “Research Themes” community will be renamed to “WLE Phase I Research Themes” Tagged all items in the current Phase I collections with their appropriate themes Create pull request to add Phase II research themes to the submission form: #328 Add cg.
-
@@ -1336,7 +1336,7 @@ COPY 54701
Mon, 01 May 2017 16:21:52 +0200
https://alanorth.github.io/cgspace-notes/2017-05/
- 2017-05-01 ICARDA apparently started working on CG Core on their MEL repository They have done a few cg.* fields, but not very consistent and even copy some of CGSpace items: https://mel.cgiar.org/xmlui/handle/20.500.11766/6911?show=full https://cgspace.cgiar.org/handle/10568/73683 2017-05-02 Atmire got back about the Workflow Statistics issue, and apparently it’s a bug in the CUA module so they will send us a pull request 2017-05-04 Sync DSpace Test with database and assetstore from CGSpace Re-deploy DSpace Test with Atmire’s CUA patch for workflow statistics, run system updates, and restart the server Now I can see the workflow statistics and am able to select users, but everything returns 0 items Megan says there are still some mapped items that are not appearing since last week, so I forced a full index-discovery -b Need to remember to check if the collection has more items (currently 39 on CGSpace, but 118 on the freshly reindexed DSpace Test) tomorrow: https://cgspace.
+ 2017-05-01 ICARDA apparently started working on CG Core on their MEL repository They have done a few cg.* fields, but not very consistent and even copy some of CGSpace items: https://mel.cgiar.org/xmlui/handle/20.500.11766/6911?show=full https://cgspace.cgiar.org/handle/10568/73683 2017-05-02 Atmire got back about the Workflow Statistics issue, and apparently it’s a bug in the CUA module so they will send us a pull request 2017-05-04 Sync DSpace Test with database and assetstore from CGSpace Re-deploy DSpace Test with Atmire’s CUA patch for workflow statistics, run system updates, and restart the server Now I can see the workflow statistics and am able to select users, but everything returns 0 items Megan says there are still some mapped items that are not appearing since last week, so I forced a full index-discovery -b Need to remember to check if the collection has more items (currently 39 on CGSpace, but 118 on the freshly reindexed DSpace Test) tomorrow: https://cgspace.
-
diff --git a/docs/page/2/index.html b/docs/page/2/index.html
index 430d8f41d..946e4af12 100644
--- a/docs/page/2/index.html
+++ b/docs/page/2/index.html
@@ -17,7 +17,7 @@
-
+
diff --git a/docs/page/3/index.html b/docs/page/3/index.html
index 49c3a4d54..33e8fe776 100644
--- a/docs/page/3/index.html
+++ b/docs/page/3/index.html
@@ -17,7 +17,7 @@
-
+
diff --git a/docs/page/4/index.html b/docs/page/4/index.html
index 3836420e1..ea3c35828 100644
--- a/docs/page/4/index.html
+++ b/docs/page/4/index.html
@@ -17,7 +17,7 @@
-
+
@@ -240,7 +240,7 @@
- 2019-10-01 Udana from IWMI asked me for a CSV export of their community on CGSpace I exported it, but a quick run through the csv-metadata-quality tool shows that there are some low-hanging fruits we can fix before I send him the data I will limit the scope to the titles, regions, subregions, and river basins for now to manually fix some non-breaking spaces (U+00A0) there that would otherwise be removed by the csv-metadata-quality script’s “unnecessary Unicode” fix: $ csvcut -c 'id,dc.
+ 2019-10-01 Udana from IWMI asked me for a CSV export of their community on CGSpace I exported it, but a quick run through the csv-metadata-quality tool shows that there are some low-hanging fruits we can fix before I send him the data I will limit the scope to the titles, regions, subregions, and river basins for now to manually fix some non-breaking spaces (U+00A0) there that would otherwise be removed by the csv-metadata-quality script’s “unnecessary Unicode” fix: $ csvcut -c 'id,dc.
Read more →
diff --git a/docs/page/5/index.html b/docs/page/5/index.html
index 03c144515..039145e88 100644
--- a/docs/page/5/index.html
+++ b/docs/page/5/index.html
@@ -17,7 +17,7 @@
-
+
diff --git a/docs/page/6/index.html b/docs/page/6/index.html
index 3dc223c66..79e098d57 100644
--- a/docs/page/6/index.html
+++ b/docs/page/6/index.html
@@ -17,7 +17,7 @@
-
+
diff --git a/docs/page/7/index.html b/docs/page/7/index.html
index 145b1e0e0..f76845895 100644
--- a/docs/page/7/index.html
+++ b/docs/page/7/index.html
@@ -17,7 +17,7 @@
-
+
@@ -196,7 +196,7 @@
- 2017-06-01 After discussion with WLE and CGSpace content people, we decided to just add one metadata field for the WLE Research Themes The cg.identifier.wletheme field will be used for both Phase I and Phase II Research Themes Then we’ll create a new sub-community for Phase II and create collections for the research themes there The current “Research Themes” community will be renamed to “WLE Phase I Research Themes” Tagged all items in the current Phase I collections with their appropriate themes Create pull request to add Phase II research themes to the submission form: #328 Add cg.
+ 2017-06-01 After discussion with WLE and CGSpace content people, we decided to just add one metadata field for the WLE Research Themes The cg.identifier.wletheme field will be used for both Phase I and Phase II Research Themes Then we’ll create a new sub-community for Phase II and create collections for the research themes there The current “Research Themes” community will be renamed to “WLE Phase I Research Themes” Tagged all items in the current Phase I collections with their appropriate themes Create pull request to add Phase II research themes to the submission form: #328 Add cg.
Read more →
@@ -214,7 +214,7 @@
- 2017-05-01 ICARDA apparently started working on CG Core on their MEL repository They have done a few cg.* fields, but not very consistent and even copy some of CGSpace items: https://mel.cgiar.org/xmlui/handle/20.500.11766/6911?show=full https://cgspace.cgiar.org/handle/10568/73683 2017-05-02 Atmire got back about the Workflow Statistics issue, and apparently it’s a bug in the CUA module so they will send us a pull request 2017-05-04 Sync DSpace Test with database and assetstore from CGSpace Re-deploy DSpace Test with Atmire’s CUA patch for workflow statistics, run system updates, and restart the server Now I can see the workflow statistics and am able to select users, but everything returns 0 items Megan says there are still some mapped items that are not appearing since last week, so I forced a full index-discovery -b Need to remember to check if the collection has more items (currently 39 on CGSpace, but 118 on the freshly reindexed DSpace Test) tomorrow: https://cgspace.
+ 2017-05-01 ICARDA apparently started working on CG Core on their MEL repository They have done a few cg.* fields, but not very consistent and even copy some of CGSpace items: https://mel.cgiar.org/xmlui/handle/20.500.11766/6911?show=full https://cgspace.cgiar.org/handle/10568/73683 2017-05-02 Atmire got back about the Workflow Statistics issue, and apparently it’s a bug in the CUA module so they will send us a pull request 2017-05-04 Sync DSpace Test with database and assetstore from CGSpace Re-deploy DSpace Test with Atmire’s CUA patch for workflow statistics, run system updates, and restart the server Now I can see the workflow statistics and am able to select users, but everything returns 0 items Megan says there are still some mapped items that are not appearing since last week, so I forced a full index-discovery -b Need to remember to check if the collection has more items (currently 39 on CGSpace, but 118 on the freshly reindexed DSpace Test) tomorrow: https://cgspace.
Read more →
diff --git a/docs/page/8/index.html b/docs/page/8/index.html
index c2e2f75a1..fa4772e64 100644
--- a/docs/page/8/index.html
+++ b/docs/page/8/index.html
@@ -17,7 +17,7 @@
-
+
diff --git a/docs/page/9/index.html b/docs/page/9/index.html
index 4ca5f78c4..e82875a7b 100644
--- a/docs/page/9/index.html
+++ b/docs/page/9/index.html
@@ -17,7 +17,7 @@
-
+
diff --git a/docs/posts/index.html b/docs/posts/index.html
index 93a66cbcf..6925fafca 100644
--- a/docs/posts/index.html
+++ b/docs/posts/index.html
@@ -17,7 +17,7 @@
-
+
@@ -180,7 +180,7 @@
- 2022-04-01 I did G1GC tests on DSpace Test (linode26) to complement the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia’s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54.
+ 2022-04-01 I did G1GC tests on DSpace Test (linode26) to complement the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia’s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54.
Read more →
diff --git a/docs/posts/index.xml b/docs/posts/index.xml
index 2bf177d5c..10b995cf2 100644
--- a/docs/posts/index.xml
+++ b/docs/posts/index.xml
@@ -70,7 +70,7 @@
Fri, 01 Apr 2022 10:53:39 +0300
https://alanorth.github.io/cgspace-notes/2022-04/
- 2022-04-01 I did G1GC tests on DSpace Test (linode26) to complement the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia’s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54.
+ 2022-04-01 I did G1GC tests on DSpace Test (linode26) to complement the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia’s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54.
-
@@ -686,7 +686,7 @@
Tue, 01 Oct 2019 13:20:51 +0300
https://alanorth.github.io/cgspace-notes/2019-10/
- 2019-10-01 Udana from IWMI asked me for a CSV export of their community on CGSpace I exported it, but a quick run through the csv-metadata-quality tool shows that there are some low-hanging fruits we can fix before I send him the data I will limit the scope to the titles, regions, subregions, and river basins for now to manually fix some non-breaking spaces (U+00A0) there that would otherwise be removed by the csv-metadata-quality script’s “unnecessary Unicode” fix: $ csvcut -c 'id,dc.
+ 2019-10-01 Udana from IWMI asked me for a CSV export of their community on CGSpace I exported it, but a quick run through the csv-metadata-quality tool shows that there are some low-hanging fruits we can fix before I send him the data I will limit the scope to the titles, regions, subregions, and river basins for now to manually fix some non-breaking spaces (U+00A0) there that would otherwise be removed by the csv-metadata-quality script’s “unnecessary Unicode” fix: $ csvcut -c 'id,dc.
-
@@ -1327,7 +1327,7 @@ COPY 54701
Thu, 01 Jun 2017 10:14:52 +0300
https://alanorth.github.io/cgspace-notes/2017-06/
- 2017-06-01 After discussion with WLE and CGSpace content people, we decided to just add one metadata field for the WLE Research Themes The cg.identifier.wletheme field will be used for both Phase I and Phase II Research Themes Then we’ll create a new sub-community for Phase II and create collections for the research themes there The current “Research Themes” community will be renamed to “WLE Phase I Research Themes” Tagged all items in the current Phase I collections with their appropriate themes Create pull request to add Phase II research themes to the submission form: #328 Add cg.
+ 2017-06-01 After discussion with WLE and CGSpace content people, we decided to just add one metadata field for the WLE Research Themes The cg.identifier.wletheme field will be used for both Phase I and Phase II Research Themes Then we’ll create a new sub-community for Phase II and create collections for the research themes there The current “Research Themes” community will be renamed to “WLE Phase I Research Themes” Tagged all items in the current Phase I collections with their appropriate themes Create pull request to add Phase II research themes to the submission form: #328 Add cg.
-
@@ -1336,7 +1336,7 @@ COPY 54701
Mon, 01 May 2017 16:21:52 +0200
https://alanorth.github.io/cgspace-notes/2017-05/
- 2017-05-01 ICARDA apparently started working on CG Core on their MEL repository They have done a few cg.* fields, but not very consistent and even copy some of CGSpace items: https://mel.cgiar.org/xmlui/handle/20.500.11766/6911?show=full https://cgspace.cgiar.org/handle/10568/73683 2017-05-02 Atmire got back about the Workflow Statistics issue, and apparently it’s a bug in the CUA module so they will send us a pull request 2017-05-04 Sync DSpace Test with database and assetstore from CGSpace Re-deploy DSpace Test with Atmire’s CUA patch for workflow statistics, run system updates, and restart the server Now I can see the workflow statistics and am able to select users, but everything returns 0 items Megan says there are still some mapped items that are not appearing since last week, so I forced a full index-discovery -b Need to remember to check if the collection has more items (currently 39 on CGSpace, but 118 on the freshly reindexed DSpace Test) tomorrow: https://cgspace.
+ 2017-05-01 ICARDA apparently started working on CG Core on their MEL repository They have done a few cg.* fields, but not very consistent and even copy some of CGSpace items: https://mel.cgiar.org/xmlui/handle/20.500.11766/6911?show=full https://cgspace.cgiar.org/handle/10568/73683 2017-05-02 Atmire got back about the Workflow Statistics issue, and apparently it’s a bug in the CUA module so they will send us a pull request 2017-05-04 Sync DSpace Test with database and assetstore from CGSpace Re-deploy DSpace Test with Atmire’s CUA patch for workflow statistics, run system updates, and restart the server Now I can see the workflow statistics and am able to select users, but everything returns 0 items Megan says there are still some mapped items that are not appearing since last week, so I forced a full index-discovery -b Need to remember to check if the collection has more items (currently 39 on CGSpace, but 118 on the freshly reindexed DSpace Test) tomorrow: https://cgspace.
-
diff --git a/docs/posts/page/2/index.html b/docs/posts/page/2/index.html
index 3b489bce0..14ad917b0 100644
--- a/docs/posts/page/2/index.html
+++ b/docs/posts/page/2/index.html
@@ -17,7 +17,7 @@
-
+
diff --git a/docs/posts/page/3/index.html b/docs/posts/page/3/index.html
index 10afebe03..1b7ab4a8b 100644
--- a/docs/posts/page/3/index.html
+++ b/docs/posts/page/3/index.html
@@ -17,7 +17,7 @@
-
+
diff --git a/docs/posts/page/4/index.html b/docs/posts/page/4/index.html
index fb4d227ea..480bc3992 100644
--- a/docs/posts/page/4/index.html
+++ b/docs/posts/page/4/index.html
@@ -17,7 +17,7 @@
-
+
@@ -240,7 +240,7 @@
- 2019-10-01 Udana from IWMI asked me for a CSV export of their community on CGSpace I exported it, but a quick run through the csv-metadata-quality tool shows that there are some low-hanging fruits we can fix before I send him the data I will limit the scope to the titles, regions, subregions, and river basins for now to manually fix some non-breaking spaces (U+00A0) there that would otherwise be removed by the csv-metadata-quality script’s “unnecessary Unicode” fix: $ csvcut -c 'id,dc.
+ 2019-10-01 Udana from IWMI asked me for a CSV export of their community on CGSpace I exported it, but a quick run through the csv-metadata-quality tool shows that there are some low-hanging fruits we can fix before I send him the data I will limit the scope to the titles, regions, subregions, and river basins for now to manually fix some non-breaking spaces (U+00A0) there that would otherwise be removed by the csv-metadata-quality script’s “unnecessary Unicode” fix: $ csvcut -c 'id,dc.
Read more →
diff --git a/docs/posts/page/5/index.html b/docs/posts/page/5/index.html
index d4275be30..e0552a6ad 100644
--- a/docs/posts/page/5/index.html
+++ b/docs/posts/page/5/index.html
@@ -17,7 +17,7 @@
-
+
diff --git a/docs/posts/page/6/index.html b/docs/posts/page/6/index.html
index 9012195d4..e2e201ca6 100644
--- a/docs/posts/page/6/index.html
+++ b/docs/posts/page/6/index.html
@@ -17,7 +17,7 @@
-
+
diff --git a/docs/posts/page/7/index.html b/docs/posts/page/7/index.html
index 0f14fc5fd..6f8e9a47b 100644
--- a/docs/posts/page/7/index.html
+++ b/docs/posts/page/7/index.html
@@ -17,7 +17,7 @@
-
+
@@ -196,7 +196,7 @@
- 2017-06-01 After discussion with WLE and CGSpace content people, we decided to just add one metadata field for the WLE Research Themes The cg.identifier.wletheme field will be used for both Phase I and Phase II Research Themes Then we’ll create a new sub-community for Phase II and create collections for the research themes there The current “Research Themes” community will be renamed to “WLE Phase I Research Themes” Tagged all items in the current Phase I collections with their appropriate themes Create pull request to add Phase II research themes to the submission form: #328 Add cg.
+ 2017-06-01 After discussion with WLE and CGSpace content people, we decided to just add one metadata field for the WLE Research Themes The cg.identifier.wletheme field will be used for both Phase I and Phase II Research Themes Then we’ll create a new sub-community for Phase II and create collections for the research themes there The current “Research Themes” community will be renamed to “WLE Phase I Research Themes” Tagged all items in the current Phase I collections with their appropriate themes Create pull request to add Phase II research themes to the submission form: #328 Add cg.
Read more →
@@ -214,7 +214,7 @@
- 2017-05-01 ICARDA apparently started working on CG Core on their MEL repository They have done a few cg.* fields, but not very consistent and even copy some of CGSpace items: https://mel.cgiar.org/xmlui/handle/20.500.11766/6911?show=full https://cgspace.cgiar.org/handle/10568/73683 2017-05-02 Atmire got back about the Workflow Statistics issue, and apparently it’s a bug in the CUA module so they will send us a pull request 2017-05-04 Sync DSpace Test with database and assetstore from CGSpace Re-deploy DSpace Test with Atmire’s CUA patch for workflow statistics, run system updates, and restart the server Now I can see the workflow statistics and am able to select users, but everything returns 0 items Megan says there are still some mapped items that are not appearing since last week, so I forced a full index-discovery -b Need to remember to check if the collection has more items (currently 39 on CGSpace, but 118 on the freshly reindexed DSpace Test) tomorrow: https://cgspace.
+ 2017-05-01 ICARDA apparently started working on CG Core on their MEL repository They have done a few cg.* fields, but not very consistent and even copy some of CGSpace items: https://mel.cgiar.org/xmlui/handle/20.500.11766/6911?show=full https://cgspace.cgiar.org/handle/10568/73683 2017-05-02 Atmire got back about the Workflow Statistics issue, and apparently it’s a bug in the CUA module so they will send us a pull request 2017-05-04 Sync DSpace Test with database and assetstore from CGSpace Re-deploy DSpace Test with Atmire’s CUA patch for workflow statistics, run system updates, and restart the server Now I can see the workflow statistics and am able to select users, but everything returns 0 items Megan says there are still some mapped items that are not appearing since last week, so I forced a full index-discovery -b Need to remember to check if the collection has more items (currently 39 on CGSpace, but 118 on the freshly reindexed DSpace Test) tomorrow: https://cgspace.
Read more →
diff --git a/docs/posts/page/8/index.html b/docs/posts/page/8/index.html
index b5c1be6de..423945834 100644
--- a/docs/posts/page/8/index.html
+++ b/docs/posts/page/8/index.html
@@ -17,7 +17,7 @@
-
+
diff --git a/docs/posts/page/9/index.html b/docs/posts/page/9/index.html
index d9be15f12..5d38170f9 100644
--- a/docs/posts/page/9/index.html
+++ b/docs/posts/page/9/index.html
@@ -17,7 +17,7 @@
-
+
diff --git a/docs/tags/index.html b/docs/tags/index.html
index 57436d32e..9c8445adb 100644
--- a/docs/tags/index.html
+++ b/docs/tags/index.html
@@ -17,7 +17,7 @@
-
+
diff --git a/docs/tags/migration/index.html b/docs/tags/migration/index.html
index 82669b684..1b9c61e00 100644
--- a/docs/tags/migration/index.html
+++ b/docs/tags/migration/index.html
@@ -17,7 +17,7 @@
-
+
diff --git a/docs/tags/notes/index.html b/docs/tags/notes/index.html
index a3cffb935..e7ca42074 100644
--- a/docs/tags/notes/index.html
+++ b/docs/tags/notes/index.html
@@ -17,7 +17,7 @@
-
+
@@ -181,7 +181,7 @@
- 2017-06-01 After discussion with WLE and CGSpace content people, we decided to just add one metadata field for the WLE Research Themes The cg.identifier.wletheme field will be used for both Phase I and Phase II Research Themes Then we’ll create a new sub-community for Phase II and create collections for the research themes there The current “Research Themes” community will be renamed to “WLE Phase I Research Themes” Tagged all items in the current Phase I collections with their appropriate themes Create pull request to add Phase II research themes to the submission form: #328 Add cg.
+ 2017-06-01 After discussion with WLE and CGSpace content people, we decided to just add one metadata field for the WLE Research Themes The cg.identifier.wletheme field will be used for both Phase I and Phase II Research Themes Then we’ll create a new sub-community for Phase II and create collections for the research themes there The current “Research Themes” community will be renamed to “WLE Phase I Research Themes” Tagged all items in the current Phase I collections with their appropriate themes Create pull request to add Phase II research themes to the submission form: #328 Add cg.
Read more →
@@ -199,7 +199,7 @@
- 2017-05-01 ICARDA apparently started working on CG Core on their MEL repository They have done a few cg.* fields, but not very consistent and even copy some of CGSpace items: https://mel.cgiar.org/xmlui/handle/20.500.11766/6911?show=full https://cgspace.cgiar.org/handle/10568/73683 2017-05-02 Atmire got back about the Workflow Statistics issue, and apparently it’s a bug in the CUA module so they will send us a pull request 2017-05-04 Sync DSpace Test with database and assetstore from CGSpace Re-deploy DSpace Test with Atmire’s CUA patch for workflow statistics, run system updates, and restart the server Now I can see the workflow statistics and am able to select users, but everything returns 0 items Megan says there are still some mapped items are not appearing since last week, so I forced a full index-discovery -b Need to remember to check if the collection has more items (currently 39 on CGSpace, but 118 on the freshly reindexed DSPace Test) tomorrow: https://cgspace.
+ 2017-05-01 ICARDA apparently started working on CG Core on their MEL repository They have done a few cg.* fields, but not very consistently, and even copy some of CGSpace&rsquo;s items: https://mel.cgiar.org/xmlui/handle/20.500.11766/6911?show=full https://cgspace.cgiar.org/handle/10568/73683 2017-05-02 Atmire got back about the Workflow Statistics issue, and apparently it&rsquo;s a bug in the CUA module so they will send us a pull request 2017-05-04 Sync DSpace Test with database and assetstore from CGSpace Re-deploy DSpace Test with Atmire&rsquo;s CUA patch for workflow statistics, run system updates, and restart the server Now I can see the workflow statistics and am able to select users, but everything returns 0 items Megan says there are still some mapped items that are not appearing since last week, so I forced a full index-discovery -b Need to remember to check if the collection has more items (currently 39 on CGSpace, but 118 on the freshly reindexed DSpace Test) tomorrow: https://cgspace.
Read more →
diff --git a/docs/tags/notes/index.xml b/docs/tags/notes/index.xml
index e00f6126a..7fcff9dae 100644
--- a/docs/tags/notes/index.xml
+++ b/docs/tags/notes/index.xml
@@ -77,7 +77,7 @@
Thu, 01 Jun 2017 10:14:52 +0300
https://alanorth.github.io/cgspace-notes/2017-06/
- 2017-06-01 After discussion with WLE and CGSpace content people, we decided to just add one metadata field for the WLE Research Themes The cg.identifier.wletheme field will be used for both Phase I and Phase II Research Themes Then we’ll create a new sub-community for Phase II and create collections for the research themes there The current “Research Themes” community will be renamed to “WLE Phase I Research Themes” Tagged all items in the current Phase I collections with their appropriate themes Create pull request to add Phase II research themes to the submission form: #328 Add cg.
+ 2017-06-01 After discussion with WLE and CGSpace content people, we decided to just add one metadata field for the WLE Research Themes The cg.identifier.wletheme field will be used for both Phase I and Phase II Research Themes Then we’ll create a new sub-community for Phase II and create collections for the research themes there The current “Research Themes” community will be renamed to “WLE Phase I Research Themes” Tagged all items in the current Phase I collections with their appropriate themes Create pull request to add Phase II research themes to the submission form: #328 Add cg.
-
@@ -86,7 +86,7 @@
Mon, 01 May 2017 16:21:52 +0200
https://alanorth.github.io/cgspace-notes/2017-05/
- 2017-05-01 ICARDA apparently started working on CG Core on their MEL repository They have done a few cg.* fields, but not very consistent and even copy some of CGSpace items: https://mel.cgiar.org/xmlui/handle/20.500.11766/6911?show=full https://cgspace.cgiar.org/handle/10568/73683 2017-05-02 Atmire got back about the Workflow Statistics issue, and apparently it’s a bug in the CUA module so they will send us a pull request 2017-05-04 Sync DSpace Test with database and assetstore from CGSpace Re-deploy DSpace Test with Atmire’s CUA patch for workflow statistics, run system updates, and restart the server Now I can see the workflow statistics and am able to select users, but everything returns 0 items Megan says there are still some mapped items are not appearing since last week, so I forced a full index-discovery -b Need to remember to check if the collection has more items (currently 39 on CGSpace, but 118 on the freshly reindexed DSPace Test) tomorrow: https://cgspace.
+ 2017-05-01 ICARDA apparently started working on CG Core on their MEL repository They have done a few cg.* fields, but not very consistently, and even copy some of CGSpace&rsquo;s items: https://mel.cgiar.org/xmlui/handle/20.500.11766/6911?show=full https://cgspace.cgiar.org/handle/10568/73683 2017-05-02 Atmire got back about the Workflow Statistics issue, and apparently it&rsquo;s a bug in the CUA module so they will send us a pull request 2017-05-04 Sync DSpace Test with database and assetstore from CGSpace Re-deploy DSpace Test with Atmire&rsquo;s CUA patch for workflow statistics, run system updates, and restart the server Now I can see the workflow statistics and am able to select users, but everything returns 0 items Megan says there are still some mapped items that are not appearing since last week, so I forced a full index-discovery -b Need to remember to check if the collection has more items (currently 39 on CGSpace, but 118 on the freshly reindexed DSpace Test) tomorrow: https://cgspace.
-
diff --git a/docs/tags/notes/page/2/index.html b/docs/tags/notes/page/2/index.html
index 7261a2b3b..50d86c322 100644
--- a/docs/tags/notes/page/2/index.html
+++ b/docs/tags/notes/page/2/index.html
@@ -17,7 +17,7 @@
-
+
diff --git a/docs/tags/notes/page/3/index.html b/docs/tags/notes/page/3/index.html
index f042e485e..8ded8bae7 100644
--- a/docs/tags/notes/page/3/index.html
+++ b/docs/tags/notes/page/3/index.html
@@ -17,7 +17,7 @@
-
+