mirror of https://github.com/alanorth/cgspace-notes.git
synced 2024-11-24 23:50:17 +01:00

Add notes for 2023-03-21

This commit is contained in:
parent cfdd1cb7fa
commit 66a1f54e3a
@@ -403,4 +403,51 @@ Opened test.csv
- Start a harvest on AReS

## 2023-03-20

- Minor updates to a few of my DSpace Python scripts to fix the logging
- Minor updates to some records for Mazingira reported by Sonja
- Upgrade PostgreSQL on DSpace Test from version 12 to 14, the same way I did from 10 to 12 last year:
  - First, I installed the new version of PostgreSQL via the Ansible playbook scripts
  - Then I stopped Tomcat and all PostgreSQL clusters and used `pg_upgradecluster` to upgrade the old cluster:

```console
# systemctl stop tomcat7
# pg_ctlcluster 12 main stop
# tar -cvzpf var-lib-postgresql-12.tar.gz /var/lib/postgresql/12
# tar -cvzpf etc-postgresql-12.tar.gz /etc/postgresql/12
# pg_ctlcluster 14 main stop
# pg_dropcluster 14 main
# pg_upgradecluster 12 main
# pg_ctlcluster 14 main start
```

- After that I [reindexed the tables using a generated query](https://adamj.eu/tech/2021/04/13/reindexing-all-tables-after-upgrading-to-postgresql-13/):

```console
$ su - postgres
$ cat /tmp/generate-reindex.sql
SELECT 'REINDEX TABLE CONCURRENTLY ' || quote_ident(relname) || ' /*' || pg_size_pretty(pg_total_relation_size(C.oid)) || '*/;'
FROM pg_class C
LEFT JOIN pg_namespace N ON (N.oid = C.relnamespace)
WHERE nspname = 'public'
AND C.relkind = 'r'
AND nspname !~ '^pg_toast'
ORDER BY pg_total_relation_size(C.oid) ASC;
$ psql dspace < /tmp/generate-reindex.sql > /tmp/reindex.sql
$ <trim the extra stuff from /tmp/reindex.sql>
$ psql dspace < /tmp/reindex.sql
```
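
The manual "trim the extra stuff" step could also be scripted. A minimal sketch — a hypothetical helper, since the notes do this by hand in an editor — assuming psql's default output wraps the generated statements in a column header and a row-count footer:

```python
# Sketch: filter psql output down to just the REINDEX statements,
# dropping the "?column?" header, the dashed rule, and the "(N rows)" footer.
# (Hypothetical helper; the original trimming was done manually.)

def keep_reindex(lines):
    """Keep only the lines that are REINDEX statements."""
    return [line for line in lines if line.lstrip().startswith("REINDEX TABLE")]

# Example resembling psql's default output format:
sample = [
    "                 ?column?                 \n",
    "------------------------------------------\n",
    "REINDEX TABLE CONCURRENTLY metadatavalue /*577 MB*/;\n",
    "(95 rows)\n",
]
print("".join(keep_reindex(sample)), end="")
```

Alternatively, running psql with `-t` (tuples only) avoids the header and footer in the first place.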

- The index on `metadatavalue` shrank by 90MB, and others a bit less
  - This is nice, but not as drastic as I noticed last year when upgrading to PostgreSQL 12

## 2023-03-21

- Leigh sent me a list of IFPRI authors with ORCID identifiers so I combined them with our list and resolved all their names with `resolve_orcids.py`
  - It adds 154 new ORCID identifiers
- I did a follow-up on the publisher names from last week using the list from doi.org
  - Last week I only updated items with a DOI that had *no* publisher, but now I was curious to see how our existing publisher information compared
  - I checked a dozen or so manually and, other than the CIFOR/ICRAF and CIAT/Alliance cases, the doi.org metadata was better than our existing data, so I overwrote ours
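
The overwrite rule above could be sketched like this — a hypothetical helper, not the actual process (the real comparison was done by hand on a dozen items), with illustrative exception strings for the two cases noted:

```python
# Sketch of the publisher-overwrite decision (hypothetical; the real
# check was manual). Prefer doi.org's publisher except where our
# existing value is known to be better.

# Cases where our existing publisher beats doi.org's metadata
KEEP_EXISTING = {"CIFOR/ICRAF", "CIAT/Alliance"}  # illustrative strings

def choose_publisher(existing: str, doi_org: str) -> str:
    """Prefer the doi.org publisher unless ours is in the exception set
    or doi.org has no publisher at all."""
    if existing in KEEP_EXISTING or not doi_org:
        return existing
    return doi_org

print(choose_publisher("CIFOR/ICRAF", "Some Publisher"))   # keeps ours
print(choose_publisher("Old Name", "Correct Publisher"))   # takes doi.org's
```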

<!-- vim: set sw=2 ts=2: -->
@ -34,7 +34,7 @@ Last week I had increased the limit from 30 to 60, which seemed to help, but now
|
||||
$ psql -c 'SELECT * from pg_stat_activity;' | grep idle | grep -c cgspace
|
||||
78
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -36,7 +36,7 @@ Replace lzop with xz in log compression cron jobs on DSpace Test—it uses less
|
||||
-rw-rw-r-- 1 tomcat7 tomcat7 387K Nov 18 23:59 dspace.log.2015-11-18.lzo
|
||||
-rw-rw-r-- 1 tomcat7 tomcat7 169K Nov 18 23:59 dspace.log.2015-11-18.xz
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -28,7 +28,7 @@ Move ILRI collection 10568/12503 from 10568/27869 to 10568/27629 using the move_
|
||||
I realized it is only necessary to clear the Cocoon cache after moving collections—rather than reindexing—as no metadata has changed, and therefore no search or browse indexes need to be updated.
|
||||
Update GitHub wiki for documentation of maintenance tasks.
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -38,7 +38,7 @@ I noticed we have a very interesting list of countries on CGSpace:
|
||||
Not only are there 49,000 countries, we have some blanks (25)…
|
||||
Also, lots of things like “COTE D`LVOIRE” and “COTE D IVOIRE”
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -28,7 +28,7 @@ Looking at issues with author authorities on CGSpace
|
||||
For some reason we still have the index-lucene-update cron job active on CGSpace, but I’m pretty sure we don’t need it as of the latest few versions of Atmire’s Listings and Reports module
|
||||
Reinstall my local (Mac OS X) DSpace stack with Tomcat 7, PostgreSQL 9.3, and Java JDK 1.7 to match environment on CGSpace server
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -32,7 +32,7 @@ After running DSpace for over five years I’ve never needed to look in any
|
||||
This will save us a few gigs of backup space we’re paying for on S3
|
||||
Also, I noticed the checker log has some errors we should pay attention to:
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -34,7 +34,7 @@ There are 3,000 IPs accessing the REST API in a 24-hour period!
|
||||
# awk '{print $1}' /var/log/nginx/rest.log | uniq | wc -l
|
||||
3168
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -34,7 +34,7 @@ This is their publications set: http://ebrary.ifpri.org/oai/oai.php?verb=ListRec
|
||||
You can see the others by using the OAI ListSets verb: http://ebrary.ifpri.org/oai/oai.php?verb=ListSets
|
||||
Working on second phase of metadata migration, looks like this will work for moving CPWF-specific data in dc.identifier.fund to cg.identifier.cpwfproject and then the rest to dc.description.sponsorship
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -44,7 +44,7 @@ dspacetest=# select text_value from metadatavalue where metadata_field_id=3 and
|
||||
|
||||
In this case the select query was showing 95 results before the update
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -42,7 +42,7 @@ $ git checkout -b 55new 5_x-prod
|
||||
$ git reset --hard ilri/5_x-prod
|
||||
$ git rebase -i dspace-5.5
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -34,7 +34,7 @@ It looks like we might be able to use OUs now, instead of DCs:
|
||||
|
||||
$ ldapsearch -x -H ldaps://svcgroot2.cgiarad.org:3269/ -b "dc=cgiarad,dc=org" -D "admigration1@cgiarad.org" -W "(sAMAccountName=admigration1)"
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -42,7 +42,7 @@ I exported a random item’s metadata as CSV, deleted all columns except id
|
||||
|
||||
0000-0002-6115-0956||0000-0002-3812-8793||0000-0001-7462-405X
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -26,7 +26,7 @@ Add dc.type to the output options for Atmire’s Listings and Reports module
|
||||
Add dc.type to the output options for Atmire’s Listings and Reports module (#286)
|
||||
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -46,7 +46,7 @@ I see thousands of them in the logs for the last few months, so it’s not r
|
||||
I’ve raised a ticket with Atmire to ask
|
||||
Another worrying error from dspace.log is:
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -28,7 +28,7 @@ I checked to see if the Solr sharding task that is supposed to run on January 1s
|
||||
I tested on DSpace Test as well and it doesn’t work there either
|
||||
I asked on the dspace-tech mailing list because it seems to be broken, and actually now I’m not sure if we’ve ever had the sharding task run successfully over all these years
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -50,7 +50,7 @@ DELETE 1
|
||||
Create issue on GitHub to track the addition of CCAFS Phase II project tags (#301)
|
||||
Looks like we’ll be using cg.identifier.ccafsprojectpii as the field name
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -54,7 +54,7 @@ Interestingly, it seems DSpace 4.x’s thumbnails were sRGB, but forcing reg
|
||||
$ identify ~/Desktop/alc_contrastes_desafios.jpg
|
||||
/Users/aorth/Desktop/alc_contrastes_desafios.jpg JPEG 464x600 464x600+0+0 8-bit CMYK 168KB 0.000u 0:00.000
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -40,7 +40,7 @@ Testing the CMYK patch on a collection with 650 items:
|
||||
|
||||
$ [dspace]/bin/dspace filter-media -f -i 10568/16498 -p "ImageMagick PDF Thumbnail" -v >& /tmp/filter-media-cmyk.txt
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -18,7 +18,7 @@
|
||||
<meta name="twitter:card" content="summary"/>
|
||||
<meta name="twitter:title" content="May, 2017"/>
|
||||
<meta name="twitter:description" content="2017-05-01 ICARDA apparently started working on CG Core on their MEL repository They have done a few cg.* fields, but not very consistent and even copy some of CGSpace items: https://mel.cgiar.org/xmlui/handle/20.500.11766/6911?show=full https://cgspace.cgiar.org/handle/10568/73683 2017-05-02 Atmire got back about the Workflow Statistics issue, and apparently it’s a bug in the CUA module so they will send us a pull request 2017-05-04 Sync DSpace Test with database and assetstore from CGSpace Re-deploy DSpace Test with Atmire’s CUA patch for workflow statistics, run system updates, and restart the server Now I can see the workflow statistics and am able to select users, but everything returns 0 items Megan says there are still some mapped items are not appearing since last week, so I forced a full index-discovery -b Need to remember to check if the collection has more items (currently 39 on CGSpace, but 118 on the freshly reindexed DSPace Test) tomorrow: https://cgspace."/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -18,7 +18,7 @@
|
||||
<meta name="twitter:card" content="summary"/>
|
||||
<meta name="twitter:title" content="June, 2017"/>
|
||||
<meta name="twitter:description" content="2017-06-01 After discussion with WLE and CGSpace content people, we decided to just add one metadata field for the WLE Research Themes The cg.identifier.wletheme field will be used for both Phase I and Phase II Research Themes Then we’ll create a new sub-community for Phase II and create collections for the research themes there The current “Research Themes” community will be renamed to “WLE Phase I Research Themes” Tagged all items in the current Phase I collections with their appropriate themes Create pull request to add Phase II research themes to the submission form: #328 Add cg."/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -36,7 +36,7 @@ Merge changes for WLE Phase II theme rename (#329)
|
||||
Looking at extracting the metadata registries from ICARDA’s MEL DSpace database so we can compare fields with CGSpace
|
||||
We can use PostgreSQL’s extended output format (-x) plus sed to format the output into quasi XML:
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -60,7 +60,7 @@ This was due to newline characters in the dc.description.abstract column, which
|
||||
I exported a new CSV from the collection on DSpace Test and then manually removed the characters in vim using g/^$/d
|
||||
Then I cleaned up the author authorities and HTML characters in OpenRefine and sent the file back to Abenet
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -32,7 +32,7 @@ Linode sent an alert that CGSpace (linode18) was using 261% CPU for the past two
|
||||
|
||||
Ask Sisay to clean up the WLE approvers a bit, as Marianne’s user account is both in the approvers step as well as the group
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -34,7 +34,7 @@ http://hdl.handle.net/10568/78495||http://hdl.handle.net/10568/79336
|
||||
There appears to be a pattern but I’ll have to look a bit closer and try to clean them up automatically, either in SQL or in OpenRefine
|
||||
Add Katherine Lutz to the groups for content submission and edit steps of the CGIAR System collections
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -48,7 +48,7 @@ Generate list of authors on CGSpace for Peter to go through and correct:
|
||||
dspace=# \copy (select distinct text_value, count(*) as count from metadatavalue where metadata_field_id = (select metadata_field_id from metadatafieldregistry where element = 'contributor' and qualifier = 'author') AND resource_type_id = 2 group by text_value order by count desc) to /tmp/authors.csv with csv;
|
||||
COPY 54701
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -30,7 +30,7 @@ The logs say “Timeout waiting for idle object”
|
||||
PostgreSQL activity says there are 115 connections currently
|
||||
The list of connections to XMLUI and REST API for today:
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -150,7 +150,7 @@ dspace.log.2018-01-02:34
|
||||
|
||||
Danny wrote to ask for help renewing the wildcard ilri.org certificate and I advised that we should probably use Let’s Encrypt if it’s just a handful of domains
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -30,7 +30,7 @@ We don’t need to distinguish between internal and external works, so that
|
||||
Yesterday I figured out how to monitor DSpace sessions using JMX
|
||||
I copied the logic in the jmx_tomcat_dbpools provided by Ubuntu’s munin-plugins-java package and used the stuff I discovered about JMX in 2018-01
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -24,7 +24,7 @@ Export a CSV of the IITA community metadata for Martin Mueller
|
||||
|
||||
Export a CSV of the IITA community metadata for Martin Mueller
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -26,7 +26,7 @@ Catalina logs at least show some memory errors yesterday:
|
||||
I tried to test something on DSpace Test but noticed that it’s down since god knows when
|
||||
Catalina logs at least show some memory errors yesterday:
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -38,7 +38,7 @@ http://localhost:3000/solr/statistics/update?stream.body=%3Ccommit/%3E
|
||||
Then I reduced the JVM heap size from 6144 back to 5120m
|
||||
Also, I switched it to use OpenJDK instead of Oracle Java, as well as re-worked the Ansible infrastructure scripts to support hosts choosing which distribution they want to use
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -58,7 +58,7 @@ real 74m42.646s
|
||||
user 8m5.056s
|
||||
sys 2m7.289s
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -36,7 +36,7 @@ During the mvn package stage on the 5.8 branch I kept getting issues with java r
|
||||
|
||||
There is insufficient memory for the Java Runtime Environment to continue.
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -46,7 +46,7 @@ Anyways, perhaps I should increase the JVM heap from 5120m to 6144m like we did
|
||||
The server only has 8GB of RAM so we’ll eventually need to upgrade to a larger one because we’ll start starving the OS, PostgreSQL, and command line batch processes
|
||||
I ran all system updates on DSpace Test and rebooted it
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -30,7 +30,7 @@ I’ll update the DSpace role in our Ansible infrastructure playbooks and ru
|
||||
Also, I’ll re-run the postgresql tasks because the custom PostgreSQL variables are dynamic according to the system’s RAM, and we never re-ran them after migrating to larger Linodes last month
|
||||
I’m testing the new DSpace 5.8 branch in my Ubuntu 18.04 environment and I’m getting those autowire errors in Tomcat 8.5.30 again:
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -26,7 +26,7 @@ I created a GitHub issue to track this #389, because I’m super busy in Nai
|
||||
Phil Thornton got an ORCID identifier so we need to add it to the list on CGSpace and tag his existing items
|
||||
I created a GitHub issue to track this #389, because I’m super busy in Nairobi right now
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -36,7 +36,7 @@ Send a note about my dspace-statistics-api to the dspace-tech mailing list
|
||||
Linode has been sending mails a few times a day recently that CGSpace (linode18) has had high CPU usage
|
||||
Today these are the top 10 IPs:
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -36,7 +36,7 @@ Then I ran all system updates and restarted the server
|
||||
|
||||
I noticed that there is another issue with PDF thumbnails on CGSpace, and I see there was another Ghostscript vulnerability last week
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -50,7 +50,7 @@ I don’t see anything interesting in the web server logs around that time t
|
||||
357 207.46.13.1
|
||||
903 54.70.40.11
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -72,7 +72,7 @@ real 0m19.873s
|
||||
user 0m22.203s
|
||||
sys 0m1.979s
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -46,7 +46,7 @@ Most worryingly, there are encoding errors in the abstracts for eleven items, fo
|
||||
|
||||
I think I will need to ask Udana to re-copy and paste the abstracts with more care using Google Docs
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -64,7 +64,7 @@ $ ./fix-metadata-values.py -i /tmp/2019-02-21-fix-4-regions.csv -db dspace -u ds
|
||||
$ ./delete-metadata-values.py -i /tmp/2019-02-21-delete-2-countries.csv -db dspace -u dspace -p 'fuuu' -m 228 -f cg.coverage.country -d
|
||||
$ ./delete-metadata-values.py -i /tmp/2019-02-21-delete-1-region.csv -db dspace -u dspace -p 'fuuu' -m 231 -f cg.coverage.region -d
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -48,7 +48,7 @@ DELETE 1
|
||||
|
||||
But after this I tried to delete the item from the XMLUI and it is still present…
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -34,7 +34,7 @@ Run system updates on CGSpace (linode18) and reboot it
|
||||
|
||||
Skype with Marie-Angélique and Abenet about CG Core v2
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -38,7 +38,7 @@ CGSpace
|
||||
|
||||
Abenet had another similar issue a few days ago when trying to find the stats for 2018 in the RTB community
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -46,7 +46,7 @@ After rebooting, all statistics cores were loaded… wow, that’s luck
|
||||
|
||||
Run system updates on DSpace Test (linode19) and reboot it
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -72,7 +72,7 @@ Here are the top ten IPs in the nginx XMLUI and REST/OAI logs this morning:
|
||||
7249 2a01:7e00::f03c:91ff:fe18:7396
|
||||
9124 45.5.186.2
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -18,7 +18,7 @@
|
||||
<meta name="twitter:card" content="summary"/>
|
||||
<meta name="twitter:title" content="October, 2019"/>
|
||||
<meta name="twitter:description" content="2019-10-01 Udana from IWMI asked me for a CSV export of their community on CGSpace I exported it, but a quick run through the csv-metadata-quality tool shows that there are some low-hanging fruits we can fix before I send him the data I will limit the scope to the titles, regions, subregions, and river basins for now to manually fix some non-breaking spaces (U+00A0) there that would otherwise be removed by the csv-metadata-quality script’s “unneccesary Unicode” fix: $ csvcut -c 'id,dc."/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -58,7 +58,7 @@ Let’s see how many of the REST API requests were for bitstreams (because t
|
||||
# zcat --force /var/log/nginx/rest.log.*.gz | grep -E "[0-9]{1,2}/Oct/2019" | grep -c -E "/rest/bitstreams"
|
||||
106781
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -46,7 +46,7 @@ Make sure all packages are up to date and the package manager is up to date, the
|
||||
# dpkg -C
|
||||
# reboot
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -56,7 +56,7 @@ I tweeted the CGSpace repository link
|
||||
|
||||
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -38,7 +38,7 @@ The code finally builds and runs with a fresh install
|
||||
|
||||
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -42,7 +42,7 @@ You need to download this into the DSpace 6.x source and compile it
|
||||
|
||||
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -48,7 +48,7 @@ The third item now has a donut with score 1 since I tweeted it last week
|
||||
|
||||
On the same note, the one item Abenet pointed out last week now has a donut with score of 104 after I tweeted it last week
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -34,7 +34,7 @@ I see that CGSpace (linode18) is still using PostgreSQL JDBC driver version 42.2
|
||||
|
||||
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -36,7 +36,7 @@ I sent Atmire the dspace.log from today and told them to log into the server to
|
||||
In other news, I checked the statistics API on DSpace 6 and it’s working
|
||||
I tried to build the OAI registry on the freshly migrated DSpace 6 on DSpace Test and I get an error:
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -38,7 +38,7 @@ I restarted Tomcat and PostgreSQL and the issue was gone
|
||||
|
||||
Since I was restarting Tomcat anyways I decided to redeploy the latest changes from the 5_x-prod branch and I added a note about COVID-19 items to the CGSpace frontpage at Peter’s request
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -36,7 +36,7 @@ It is class based so I can easily add support for other vocabularies, and the te
|
||||
|
||||
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -48,7 +48,7 @@ I filed a bug on OpenRXV: https://github.com/ilri/OpenRXV/issues/39
|
||||
|
||||
I filed an issue on OpenRXV to make some minor edits to the admin UI: https://github.com/ilri/OpenRXV/issues/40
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -44,7 +44,7 @@ During the FlywayDB migration I got an error:
|
||||
|
||||
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -32,7 +32,7 @@ So far we’ve spent at least fifty hours to process the statistics and stat
|
||||
|
||||
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -36,7 +36,7 @@ I started processing those (about 411,000 records):
|
||||
|
||||
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -50,7 +50,7 @@ For example, this item has 51 views on CGSpace, but 0 on AReS
|
||||
|
||||
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -60,7 +60,7 @@ $ curl -s 'http://localhost:9200/openrxv-items-temp/_count?q=*&pretty
|
||||
}
|
||||
}
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -34,7 +34,7 @@ Also, we found some issues building and running OpenRXV currently due to ecosyst
|
||||
|
||||
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -44,7 +44,7 @@ Perhaps one of the containers crashed, I should have looked closer but I was in
|
||||
|
||||
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -36,7 +36,7 @@ I looked at the top user agents and IPs in the Solr statistics for last month an
|
||||
|
||||
I will add the RI/1.0 pattern to our DSpace agents overload and purge them from Solr (we had previously seen this agent with 9,000 hits or so in 2020-09), but I think I will leave the Microsoft Word one… as that’s an actual user…
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -36,7 +36,7 @@ I simply started it and AReS was running again:
|
||||
|
||||
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -30,7 +30,7 @@ Export another list of ALL subjects on CGSpace, including AGROVOC and non-AGROVO
|
||||
localhost/dspace63= > \COPY (SELECT DISTINCT LOWER(text_value) AS subject, count(*) FROM metadatavalue WHERE dspace_object_id in (SELECT dspace_object_id FROM item) AND metadata_field_id IN (119, 120, 127, 122, 128, 125, 135, 203, 208, 210, 215, 123, 236, 242, 187) GROUP BY subject ORDER BY count DESC) to /tmp/2021-07-01-all-subjects.csv WITH CSV HEADER;
|
||||
COPY 20994
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -32,7 +32,7 @@ Update Docker images on AReS server (linode20) and reboot the server:
|
||||
|
||||
I decided to upgrade linode20 from Ubuntu 18.04 to 20.04
|
||||
"/>
|
||||
<meta name="generator" content="Hugo 0.110.0">
|
||||
<meta name="generator" content="Hugo 0.111.3">
|
||||
|
||||
|
||||
|
||||
|
@ -16,7 +16,7 @@ I finally got through with porting the input form from DSpace 6 to DSpace 7
<meta property="og:type" content="article" />
<meta property="og:url" content="https://alanorth.github.io/cgspace-notes/2023-03/" />
<meta property="article:published_time" content="2023-03-01T07:58:36+03:00" />
<meta property="article:modified_time" content="2023-03-18T17:42:40+03:00" />
<meta property="article:modified_time" content="2023-03-19T19:48:06+03:00" />
@ -28,7 +28,7 @@ Remove cg.subject.wle and cg.identifier.wletheme from CGSpace input form after c
iso-codes 4.13.0 was released, which incorporates my changes to the common names for Iran, Laos, and Syria
I finally got through with porting the input form from DSpace 6 to DSpace 7
"/>
<meta name="generator" content="Hugo 0.110.0">
<meta name="generator" content="Hugo 0.111.3">
@ -38,9 +38,9 @@ I finally got through with porting the input form from DSpace 6 to DSpace 7
"@type": "BlogPosting",
"headline": "March, 2023",
"url": "https://alanorth.github.io/cgspace-notes/2023-03/",
"wordCount": "2810",
"wordCount": "3128",
"datePublished": "2023-03-01T07:58:36+03:00",
"dateModified": "2023-03-18T17:42:40+03:00",
"dateModified": "2023-03-19T19:48:06+03:00",
"author": {
"@type": "Person",
"name": "Alan Orth"
@ -556,6 +556,61 @@ pd.options.mode.nullable_dtypes = True
<ul>
<li>Start a harvest on AReS</li>
</ul>
<h2 id="2023-03-20">2023-03-20</h2>
<ul>
<li>Minor updates to a few of my DSpace Python scripts to fix the logging</li>
<li>Minor updates to some records for Mazingira reported by Sonja</li>
<li>Upgrade PostgreSQL on DSpace Test from version 12 to 14, the same way I did from 10 to 12 last year:
<ul>
<li>First, I installed the new version of PostgreSQL via the Ansible playbook scripts</li>
<li>Then I stopped Tomcat and all PostgreSQL clusters and used <code>pg_upgradecluster</code> to upgrade the old version:</li>
</ul>
</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span># systemctl stop tomcat7
</span></span><span style="display:flex;"><span># pg_ctlcluster <span style="color:#ae81ff">12</span> main stop
</span></span><span style="display:flex;"><span># tar -cvzpf var-lib-postgresql-12.tar.gz /var/lib/postgresql/12
</span></span><span style="display:flex;"><span># tar -cvzpf etc-postgresql-12.tar.gz /etc/postgresql/12
</span></span><span style="display:flex;"><span># pg_ctlcluster <span style="color:#ae81ff">14</span> main stop
</span></span><span style="display:flex;"><span># pg_dropcluster <span style="color:#ae81ff">14</span> main
</span></span><span style="display:flex;"><span># pg_upgradecluster <span style="color:#ae81ff">12</span> main
</span></span><span style="display:flex;"><span># pg_ctlcluster <span style="color:#ae81ff">14</span> main start
</span></span></code></pre></div><ul>
<li>After that I <a href="https://adamj.eu/tech/2021/04/13/reindexing-all-tables-after-upgrading-to-postgresql-13/">re-indexed the database tables using a query</a>:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ su - postgres
</span></span><span style="display:flex;"><span>$ cat /tmp/generate-reindex.sql
</span></span><span style="display:flex;"><span>SELECT 'REINDEX TABLE CONCURRENTLY ' || quote_ident(relname) || ' /*' || pg_size_pretty(pg_total_relation_size(C.oid)) || '*/;'
</span></span><span style="display:flex;"><span>FROM pg_class C
</span></span><span style="display:flex;"><span>LEFT JOIN pg_namespace N ON (N.oid = C.relnamespace)
</span></span><span style="display:flex;"><span>WHERE nspname = 'public'
</span></span><span style="display:flex;"><span>  AND C.relkind = 'r'
</span></span><span style="display:flex;"><span>  AND nspname !~ '^pg_toast'
</span></span><span style="display:flex;"><span>ORDER BY pg_total_relation_size(C.oid) ASC;
</span></span><span style="display:flex;"><span>$ psql dspace < /tmp/generate-reindex.sql > /tmp/reindex.sql
</span></span><span style="display:flex;"><span>$ <trim the extra stuff from /tmp/reindex.sql>
</span></span><span style="display:flex;"><span>$ psql dspace < /tmp/reindex.sql
</span></span></code></pre></div><ul>
<li>The index on <code>metadatavalue</code> shrank by 90MB, and others a bit less
<ul>
<li>This is nice, but not as drastic as I noticed last year when upgrading to PostgreSQL 12</li>
</ul>
</li>
</ul>
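The generate-then-trim round trip above can also be scripted. A minimal Python sketch of the same logic (a hypothetical helper, not one of my scripts; it mirrors the query's quoting and smallest-first ordering, and always double-quotes identifiers, which is a simplification of PostgreSQL's `quote_ident()`):

```python
def generate_reindex(tables):
    """Build REINDEX statements from (table_name, total_size_bytes) pairs,
    smallest table first, like the ORDER BY in the SQL above."""
    statements = []
    for name, size in sorted(tables, key=lambda t: t[1]):
        # quote_ident() only quotes when necessary; always quoting is safe too
        ident = '"' + name.replace('"', '""') + '"'
        statements.append(f"REINDEX TABLE CONCURRENTLY {ident}; /* {size} bytes */")
    return statements

# Hypothetical sizes, just to illustrate the ordering
for stmt in generate_reindex([("metadatavalue", 500_000_000), ("item", 40_000_000)]):
    print(stmt)
```

In practice feeding each statement to `psql -c` avoids the manual trim step entirely.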
<h2 id="2023-03-21">2023-03-21</h2>
<ul>
<li>Leigh sent me a list of IFPRI authors with ORCID identifiers so I combined them with our list and resolved all their names with <code>resolve_orcids.py</code>
<ul>
<li>It adds 154 new ORCID identifiers</li>
</ul>
</li>
<li>I did a follow-up to the publisher names from last week using the list from doi.org
<ul>
<li>Last week I only updated items with a DOI that had <em>no</em> publisher, but now I was curious to see how our existing publisher information compared</li>
<li>I checked a dozen or so manually and, other than CIFOR/ICRAF and CIAT/Alliance, the doi.org metadata was better than our existing data, so I overwrote them</li>
</ul>
</li>
</ul>
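The list merge itself is simple to sketch (hypothetical helper; <code>resolve_orcids.py</code> handles the name resolution separately): deduplicate the union of both lists and report which identifiers are new to ours.

```python
def merge_orcids(ours, theirs):
    """Return (combined, new): the deduplicated union of both ORCID lists,
    and the identifiers that were not already in ours."""
    new = sorted(set(theirs) - set(ours))
    combined = sorted(set(ours) | set(theirs))
    return combined, new

# Made-up identifiers for illustration
combined, new = merge_orcids(
    ["0000-0002-1825-0097"],
    ["0000-0002-1825-0097", "0000-0001-5109-3700"],
)
print(len(new))  # → 1
```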
<!-- raw HTML omitted -->