---
title: "May, 2019"
date: 2019-05-01T07:37:43+03:00
author: "Alan Orth"
categories: ["Notes"]
---

## 2019-05-01

- Help CCAFS with regenerating some item thumbnails after they uploaded new PDFs to some items on CGSpace
- A user on the dspace-tech mailing list offered some suggestions for troubleshooting our inability to delete certain items
- Apparently if the item is in the `workflowitem` table it has been submitted to a workflow
- And if it is in the `workspaceitem` table it is in the pre-submitted state
- The item seems to be in a pre-submitted state, so I tried to delete it from there:

```
dspace=# DELETE FROM workspaceitem WHERE item_id=74648;
DELETE 1
```

- But after this I tried to delete the item from the XMLUI and it is *still* present...

<!--more-->

- I managed to delete the problematic item from the database
- First I deleted the item's bitstream in XMLUI and then ran `dspace cleanup -v` to remove it from the assetstore
- Then I ran the following SQL:

```
dspace=# DELETE FROM metadatavalue WHERE resource_id=74648;
dspace=# DELETE FROM workspaceitem WHERE item_id=74648;
dspace=# DELETE FROM item WHERE item_id=74648;
```

- Now the item is (hopefully) really gone and I can continue to troubleshoot the issue with the REST API's `/items/find-by-metadata-field` endpoint
- Of course I ran into another HTTP 401 error when I continued trying the LandPortal search from last month:

```
$ curl -f -H "Content-Type: application/json" -X POST "http://localhost:8080/rest/items/find-by-metadata-field" -d '{"key":"cg.subject.cpwf", "value":"WATER MANAGEMENT","language": "en_US"}'
curl: (22) The requested URL returned error: 401 Unauthorized
```

- The DSpace log shows the item ID (because I modified the error text):

```
2019-05-01 11:41:11,069 ERROR org.dspace.rest.ItemsResource @ User(anonymous) has not permission to read item(id=77708)!
```

- If I delete that one I get another, making the list of item IDs so far:
- 74648
- 77708
- 85079
- Some are in the `workspaceitem` table (pre-submission), others are in the `workflowitem` table (submitted), and others are actually approved, but withdrawn...
- This is actually a worthless exercise because the real issue is that the `/items/find-by-metadata-field` endpoint is simply flawed by design: it shouldn't throw a fatal error just because the search returns items the user doesn't have permission to access
- It would take way too much time to try to fix the fucked up items that are in limbo by deleting them in SQL, and it doesn't actually fix the problem anyway, because some items are *submitted* but *withdrawn*, so they have handles and everything
- I think the solution is to recommend that people don't use the `/items/find-by-metadata-field` endpoint
- CIP is asking about embedding PDF thumbnail images in their RSS feeds again
- They asked in 2018-09 as well and I told them it wasn't possible
- To make sure, I looked at [the documentation for RSS media feeds](https://wiki.lyrasis.org/display/DSPACE/Enable+Media+RSS+Feeds) and tried it, but couldn't get it to work
- It seems to be geared towards iTunes and Podcasts... I dunno
- CIP also asked for a way to get an XML file of all their RTB journal articles on CGSpace
- I told them to use the REST API, like this (where `1179` is the ID of the RTB journal articles collection):

```
https://cgspace.cgiar.org/rest/collections/1179/items?limit=812&expand=metadata
```

## 2019-05-03

- A user from CIAT emailed to say that CGSpace submission emails have not been working for the last few weeks
- I ran the `dspace test-email` script on CGSpace and the emails are indeed failing:

```
$ dspace test-email

About to send test email:
- To: woohoo@cgiar.org
- Subject: DSpace test email
- Server: smtp.office365.com

Error sending email:
- Error: javax.mail.AuthenticationFailedException

Please see the DSpace documentation for assistance.
```

- I will ask ILRI ICT to reset the password
- They reset the password and I tested it on CGSpace

## 2019-05-05

- Run all system updates on DSpace Test (linode19) and reboot it
- Merge changes into the `5_x-prod` branch of CGSpace:
- Updates to remove deprecated social media websites (Google+ and Delicious), update the Twitter share intent, and add the item title to Twitter and email links ([#421](https://github.com/ilri/DSpace/pull/421))
- Add new CCAFS Phase II project tags ([#420](https://github.com/ilri/DSpace/pull/420))
- Add item ID to REST API error logging ([#422](https://github.com/ilri/DSpace/pull/422))
- Re-deploy CGSpace from the `5_x-prod` branch
- Run all system updates on CGSpace (linode18) and reboot it
- Tag version 1.1.0 of the [dspace-statistics-api](https://github.com/ilri/dspace-statistics-api) (with Falcon 2.0.0)
- Deploy it on DSpace Test

## 2019-05-06

- Peter pointed out that the Solr statistics are only showing data from 2019
- I looked at the Solr Admin UI and I see:

```
statistics-2018: org.apache.solr.common.SolrException:org.apache.solr.common.SolrException: Error opening new searcher
```

- As well as this error in the logs:

```
Caused by: org.apache.lucene.store.LockObtainFailedException: Lock obtain timed out: NativeFSLock@/home/cgspace.cgiar.org/solr/statistics-2018/data/index/write.lock
```

- Strangely enough, I *do* see the statistics-2018, statistics-2017, etc. cores in the Admin UI...
- I restarted Tomcat a few times (and even deleted all the Solr write locks), and at least five times there were issues loading one of the statistics cores, which left the Atmire stats incomplete
- Also, I tried to increase the `writeLockTimeout` in `solrconfig.xml` from the default of 1000ms to 10000ms

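- For reference, that setting lives in the `<indexConfig>` block of the core's `solrconfig.xml`; a minimal sketch of the change, with only the timeout raised from the default:

```
<indexConfig>
  <!-- wait up to 10 seconds (default is 1000 ms) for the index write lock -->
  <writeLockTimeout>10000</writeLockTimeout>
</indexConfig>
```
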
- Eventually the Atmire stats started working, despite errors about "Error opening new searcher" in the Solr Admin UI
- I wrote to the dspace-tech mailing list again on the thread from March, 2019
- There were a few alerts from UptimeRobot about CGSpace going up and down this morning, along with an alert from Linode about 596% load
- Looking at the Munin stats I see an exponential rise in DSpace XMLUI sessions, firewall activity, and PostgreSQL connections this morning:

![CGSpace XMLUI sessions day](/cgspace-notes/2019/05/2019-05-06-jmx_dspace_sessions-day.png)

![linode18 firewall connections day](/cgspace-notes/2019/05/2019-05-06-fw_conntrack-day.png)

![linode18 postgres connections day](/cgspace-notes/2019/05/2019-05-06-postgres_connections_db-day.png)

![linode18 CPU day](/cgspace-notes/2019/05/2019-05-06-cpu-day.png)

- The number of unique sessions today is *ridiculously* high compared to the last few days considering it's only 12:30PM right now:

```
$ grep -o -E 'session_id=[A-Z0-9]{32}' dspace.log.2019-05-06 | sort | uniq | wc -l
101108
$ grep -o -E 'session_id=[A-Z0-9]{32}' dspace.log.2019-05-05 | sort | uniq | wc -l
14618
$ grep -o -E 'session_id=[A-Z0-9]{32}' dspace.log.2019-05-04 | sort | uniq | wc -l
14946
$ grep -o -E 'session_id=[A-Z0-9]{32}' dspace.log.2019-05-03 | sort | uniq | wc -l
6410
$ grep -o -E 'session_id=[A-Z0-9]{32}' dspace.log.2019-05-02 | sort | uniq | wc -l
7758
$ grep -o -E 'session_id=[A-Z0-9]{32}' dspace.log.2019-05-01 | sort | uniq | wc -l
20528
```

- The number of unique IP addresses from 2 to 6 AM this morning is already several times higher than the average for that time of the morning this past week:

```
# zcat --force /var/log/nginx/access.log /var/log/nginx/access.log.1 | grep -E '06/May/2019:(02|03|04|05|06)' | awk '{print $1}' | sort | uniq | wc -l
7127
# zcat --force /var/log/nginx/access.log.1 /var/log/nginx/access.log.2.gz | grep -E '05/May/2019:(02|03|04|05|06)' | awk '{print $1}' | sort | uniq | wc -l
1231
# zcat --force /var/log/nginx/access.log.2.gz /var/log/nginx/access.log.3.gz | grep -E '04/May/2019:(02|03|04|05|06)' | awk '{print $1}' | sort | uniq | wc -l
1255
# zcat --force /var/log/nginx/access.log.3.gz /var/log/nginx/access.log.4.gz | grep -E '03/May/2019:(02|03|04|05|06)' | awk '{print $1}' | sort | uniq | wc -l
1736
# zcat --force /var/log/nginx/access.log.4.gz /var/log/nginx/access.log.5.gz | grep -E '02/May/2019:(02|03|04|05|06)' | awk '{print $1}' | sort | uniq | wc -l
1573
# zcat --force /var/log/nginx/access.log.5.gz /var/log/nginx/access.log.6.gz | grep -E '01/May/2019:(02|03|04|05|06)' | awk '{print $1}' | sort | uniq | wc -l
1410
```

- Just this morning, between 2 and 6 AM, the number of unique sessions was *very* high compared to previous mornings:

```
$ cat dspace.log.2019-05-06 | grep -E '2019-05-06 (02|03|04|05|06):' | grep -o -E 'session_id=[A-Z0-9]{32}' | sort | uniq | wc -l
83650
$ cat dspace.log.2019-05-05 | grep -E '2019-05-05 (02|03|04|05|06):' | grep -o -E 'session_id=[A-Z0-9]{32}' | sort | uniq | wc -l
2547
$ cat dspace.log.2019-05-04 | grep -E '2019-05-04 (02|03|04|05|06):' | grep -o -E 'session_id=[A-Z0-9]{32}' | sort | uniq | wc -l
2574
$ cat dspace.log.2019-05-03 | grep -E '2019-05-03 (02|03|04|05|06):' | grep -o -E 'session_id=[A-Z0-9]{32}' | sort | uniq | wc -l
2911
$ cat dspace.log.2019-05-02 | grep -E '2019-05-02 (02|03|04|05|06):' | grep -o -E 'session_id=[A-Z0-9]{32}' | sort | uniq | wc -l
2704
$ cat dspace.log.2019-05-01 | grep -E '2019-05-01 (02|03|04|05|06):' | grep -o -E 'session_id=[A-Z0-9]{32}' | sort | uniq | wc -l
3699
```

- Most of the requests were GETs:

```
# cat /var/log/nginx/{access,library-access}.log /var/log/nginx/{access,library-access}.log.1 | grep -E '06/May/2019:(02|03|04|05|06)' | grep -o -E "(GET|HEAD|POST|PUT)" | sort | uniq -c | sort -n
1 PUT
98 POST
2845 HEAD
98121 GET
```

- I'm not exactly sure what happened this morning, but it looks like some legitimate user traffic; perhaps someone launched a new publication and it got a bunch of hits?
- Looking again, I see 84,000 requests to `/handle` this morning (not including logs for library.cgiar.org because those get an HTTP 301 redirect to CGSpace and appear here in `access.log`):

```
# cat /var/log/nginx/access.log /var/log/nginx/access.log.1 | grep -E '06/May/2019:(02|03|04|05|06)' | grep -c -o -E " /handle/[0-9]+/[0-9]+"
84350
```

- But it would be difficult to find a pattern in those requests because they cover 78,000 *unique* Handles (i.e., direct browsing of items, collections, or communities) and only 2,492 discover/browse requests (total, not unique):

```
# cat /var/log/nginx/access.log /var/log/nginx/access.log.1 | grep -E '06/May/2019:(02|03|04|05|06)' | grep -o -E " /handle/[0-9]+/[0-9]+ HTTP" | sort | uniq | wc -l
78104
# cat /var/log/nginx/access.log /var/log/nginx/access.log.1 | grep -E '06/May/2019:(02|03|04|05|06)' | grep -o -E " /handle/[0-9]+/[0-9]+/(discover|browse)" | wc -l
2492
```

- In other news, I see that some IP is making several requests per second to the exact same REST API endpoints, for example:

```
# grep /rest/handle/10568/3703?expand=all rest.log | awk '{print $1}' | sort | uniq -c
3 2a01:7e00::f03c:91ff:fe0a:d645
113 63.32.242.35
```

- According to [viewdns.info](https://viewdns.info/reverseip/?host=63.32.242.35&t=1) that server belongs to Macaroni Brothers
- The user agent of the non-REST-API requests from the same IP is Drupal
- This is one very good reason to limit REST API requests, and perhaps to enable caching via nginx (a rough sketch of what that could look like is below)

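- A minimal sketch of nginx caching for the REST API (the upstream name, paths, and sizes here are made up, not the actual CGSpace configuration):

```
# in the http {} context
proxy_cache_path /var/cache/nginx/rest levels=1:2 keys_zone=rest_cache:10m max_size=1g inactive=60m;

# in the server {} context
location /rest/ {
    proxy_pass http://tomcat_http;
    proxy_cache rest_cache;
    proxy_cache_valid 200 10m;
    add_header X-Cache-Status $upstream_cache_status;
}
```
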
## 2019-05-07

- The total number of unique IPs on CGSpace yesterday was almost 14,000, which is several thousand higher than previous day totals:

```
# zcat --force /var/log/nginx/access.log.1 /var/log/nginx/access.log.2.gz | grep -E '06/May/2019' | awk '{print $1}' | sort | uniq | wc -l
13969
# zcat --force /var/log/nginx/access.log.2.gz /var/log/nginx/access.log.3.gz | grep -E '05/May/2019' | awk '{print $1}' | sort | uniq | wc -l
5936
# zcat --force /var/log/nginx/access.log.3.gz /var/log/nginx/access.log.4.gz | grep -E '04/May/2019' | awk '{print $1}' | sort | uniq | wc -l
6229
# zcat --force /var/log/nginx/access.log.4.gz /var/log/nginx/access.log.5.gz | grep -E '03/May/2019' | awk '{print $1}' | sort | uniq | wc -l
8051
```

- The total number of sessions yesterday was also *much* higher than on the previous days:

```
$ cat dspace.log.2019-05-06 | grep -E 'session_id=[A-Z0-9]{32}' | sort | uniq | wc -l
144160
$ cat dspace.log.2019-05-05 | grep -E 'session_id=[A-Z0-9]{32}' | sort | uniq | wc -l
57269
$ cat dspace.log.2019-05-04 | grep -E 'session_id=[A-Z0-9]{32}' | sort | uniq | wc -l
58648
$ cat dspace.log.2019-05-03 | grep -E 'session_id=[A-Z0-9]{32}' | sort | uniq | wc -l
27883
$ cat dspace.log.2019-05-02 | grep -E 'session_id=[A-Z0-9]{32}' | sort | uniq | wc -l
26996
$ cat dspace.log.2019-05-01 | grep -E 'session_id=[A-Z0-9]{32}' | sort | uniq | wc -l
61866
```

- The usage statistics seem to agree that yesterday was crazy:

![Atmire Usage statistics spike 2019-05-06](/cgspace-notes/2019/05/2019-05-07-atmire-usage-week.png)

- Sarah from RTB asked me about the RSS / XML link for the CGIAR.org website again
- Apparently Sam Stacey is trying to add an RSS feed so the items get automatically syndicated to the CGIAR website
- I sent her the link to the collection's RSS feed
- Add a requests cache to the `resolve-addresses.py` script (a sketch of the idea is below)

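- A minimal sketch of the caching with the `requests-cache` library (the helper function and cache name are illustrative, not the actual script code):

```
# a cached session means repeated runs don't burn through ipapi.co's
# limit of 1,000 requests per day
import requests
import requests_cache

requests_cache.install_cache("resolve-addresses", expire_after=86400)

def resolve_address(ip):
    # hypothetical helper: fetch the JSON record for one IP address
    return requests.get(f"https://ipapi.co/{ip}/json/").json()
```
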
## 2019-05-08

- A user said that CGSpace emails have stopped sending again
- Indeed, the `dspace test-email` script is showing an authentication failure:

```
$ dspace test-email

About to send test email:
- To: wooooo@cgiar.org
- Subject: DSpace test email
- Server: smtp.office365.com

Error sending email:
- Error: javax.mail.AuthenticationFailedException

Please see the DSpace documentation for assistance.
```

- I checked the settings and apparently I had updated the password incorrectly last week after ICT reset it; the relevant `dspace.cfg` properties are sketched below

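- For reference, the SMTP settings are the `mail.*` properties in `dspace.cfg`, roughly like this (the port and addresses shown are placeholders, not the real account):

```
mail.server = smtp.office365.com
mail.server.port = 587
mail.server.username = cgspace-support@cgiar.org
mail.server.password = xxxxxxxx
mail.from.address = cgspace-support@cgiar.org
```
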
- Help Moayad with certbot-auto for Let's Encrypt scripts on the new AReS server (linode20)
- Normalize all `text_lang` values for metadata on CGSpace and DSpace Test (as I had tested last month):

```
UPDATE metadatavalue SET text_lang='en_US' WHERE resource_type_id=2 AND metadata_field_id != 28 AND text_lang IN ('ethnob', 'en', '*', 'E.', '');
UPDATE metadatavalue SET text_lang='en_US' WHERE resource_type_id=2 AND metadata_field_id != 28 AND text_lang IS NULL;
UPDATE metadatavalue SET text_lang='es_ES' WHERE resource_type_id=2 AND metadata_field_id != 28 AND text_lang IN ('es', 'spa');
```

- Send Francesca Giampieri from Bioversity a CSV export of all their items issued in 2018
- They will be doing a migration of 1500 items from their TYPO3 database into CGSpace soon and want an example CSV with all required metadata columns

## 2019-05-10

- I finally had time to analyze the 7,000 IPs from the major traffic spike on 2019-05-06 after several runs of my `resolve-addresses.py` script (ipapi.co has a limit of 1,000 requests per day)
- Resolving the unique IP addresses to organization and AS names reveals some pretty big abusers:
- 1213 from Region40 LLC (AS200557)
- 697 from Trusov Ilya Igorevych (AS50896)
- 687 from UGB Hosting OU (AS206485)
- 620 from UAB Rakrejus (AS62282)
- 491 from Dedipath (AS35913)
- 476 from Global Layer B.V. (AS49453)
- 333 from QuadraNet Enterprises LLC (AS8100)
- 278 from GigeNET (AS32181)
- 261 from Psychz Networks (AS40676)
- 196 from Cogent Communications (AS174)
- 125 from Blockchain Network Solutions Ltd (AS43444)
- 118 from Silverstar Invest Limited (AS35624)
- All of the IPs from these networks are using generic user agents like the following (and many more, which they change frequently):

```
"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/50.0.2703.0 Safari/537.36"
```

- I found a [blog post from 2018 detailing an attack from a DDoS service](https://www.qurium.org/alerts/azerbaijan/azerbaijan-and-the-region40-ddos-service/) that matches our pattern exactly
- They specifically mention:

<pre>The attack that targeted the "Search" functionality of the website, aimed to bypass our mitigation by performing slow but simultaneous searches from 5500 IP addresses.</pre>

- So this was definitely an attack of some sort... only God knows why
- I noticed a few new bots that don't use the word "bot" in their user agent and therefore don't match Tomcat's Crawler Session Manager Valve:
- `Blackboard Safeassign`
- `Unpaywall`

## 2019-05-12

- I see that the Unpaywall bot is responsible for a few thousand XMLUI sessions every day (IP addresses come from the nginx access.log):

```
$ cat dspace.log.2019-05-11 | grep -E 'ip_addr=(100.26.206.188|100.27.19.233|107.22.98.199|174.129.156.41|18.205.243.110|18.205.245.200|18.207.176.164|18.207.209.186|18.212.126.89|18.212.5.59|18.213.4.150|18.232.120.6|18.234.180.224|18.234.81.13|3.208.23.222|34.201.121.183|34.201.241.214|34.201.39.122|34.203.188.39|34.207.197.154|34.207.232.63|34.207.91.147|34.224.86.47|34.227.205.181|34.228.220.218|34.229.223.120|35.171.160.166|35.175.175.202|3.80.201.39|3.81.120.70|3.81.43.53|3.84.152.19|3.85.113.253|3.85.237.139|3.85.56.100|3.87.23.95|3.87.248.240|3.87.250.3|3.87.62.129|3.88.13.9|3.88.57.237|3.89.71.15|3.90.17.242|3.90.68.247|3.91.44.91|3.92.138.47|3.94.250.180|52.200.78.128|52.201.223.200|52.90.114.186|52.90.48.73|54.145.91.243|54.160.246.228|54.165.66.180|54.166.219.216|54.166.238.172|54.167.89.152|54.174.94.223|54.196.18.211|54.198.234.175|54.208.8.172|54.224.146.147|54.234.169.91|54.235.29.216|54.237.196.147|54.242.68.231|54.82.6.96|54.87.12.181|54.89.217.141|54.89.234.182|54.90.81.216|54.91.104.162)' | grep -E 'session_id=[A-Z0-9]{32}' | sort | uniq | wc -l
2206
```

- I added "Unpaywall" to the list of bots in the Tomcat Crawler Session Manager Valve (see the sketch of the valve configuration below)

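- For reference, the valve is configured in Tomcat's `server.xml`; a minimal sketch, with a much shorter `crawlerUserAgents` regex than the real one:

```
<Valve className="org.apache.catalina.valves.CrawlerSessionManagerValve"
       crawlerUserAgents=".*[bB]ot.*|.*Unpaywall.*|.*Blackboard Safeassign.*" />
```
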
- Set up nginx to use TLS and proxy pass to Node.js on the AReS development server (linode20); a rough sketch of the server block is below
- Run all system updates on linode20 and reboot it
- Also, there is 10 to 20% CPU steal on that VM, so I will ask Linode to move it to another host
- Commit changes to the `resolve-addresses.py` script to add proper CSV output support

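- A rough sketch of that nginx server block (the hostname, port, and certificate paths are assumptions, not the real AReS configuration):

```
server {
    listen 443 ssl http2;
    server_name ares.example.org;

    ssl_certificate     /etc/letsencrypt/live/ares.example.org/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/ares.example.org/privkey.pem;

    location / {
        # the Node.js application listens on localhost only
        proxy_pass http://127.0.0.1:3000;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```
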
## 2019-05-14

- Skype with Peter and AgroKnow about the CTA storytelling modification they want to make to the CTA ICT Update collection on CGSpace
- I told them they should aim to modify the collection theme and insert some custom HTML / JS
- I need to send Panagis some documentation about Mirage 2 and the DSpace build process, as well as the Maven settings for the build

## 2019-05-15

- Tezira says she's having issues with email reports for approved submissions, but I received an email about collection subscriptions this morning, and I tested with `dspace test-email` and it's also working...
- Send a list of DSpace build tips to Panagis from AgroKnow
- Finally fix AReS v2 to work via DSpace Test and send it to Peter et al to get their feedback
- We had issues with CORS due to Moayad using a hard-coded domain name rather than a relative URL

## 2019-05-16

- Export a list of all investors (`dc.description.sponsorship`) for Peter to look through and correct:

```
dspace=# \COPY (SELECT DISTINCT text_value, count(*) FROM metadatavalue WHERE resource_type_id = 2 AND metadata_field_id = 29 GROUP BY text_value ORDER BY count DESC LIMIT 1500) to /tmp/2019-05-16-investors.csv WITH CSV HEADER;
COPY 995
```

- Fork the [ICARDA AReS v1 repository](https://github.com/icarda-git/AReS) to [ILRI's GitHub](https://github.com/ilri/AReS) and give access to the CodeObia guys
- The plan is that we develop the v2 code there

## 2019-05-17

- Peter sent me a bunch of fixes for investors from yesterday
- I did a quick check in OpenRefine (trim and collapse whitespace, clean smart quotes, etc.) and then applied them on CGSpace:

```
$ ./fix-metadata-values.py -i /tmp/2019-05-16-fix-306-Investors.csv -db dspace -u dspace -p 'fuuu' -f dc.description.sponsorship -m 29 -t correct -d
$ ./delete-metadata-values.py -i /tmp/2019-05-16-delete-297-Investors.csv -db dspace -u dspace -p 'fuuu' -m 29 -f dc.description.sponsorship -d
```

- Then I started a full Discovery re-indexing:

```
$ export JAVA_OPTS="-Dfile.encoding=UTF-8 -Xmx1024m"
$ time schedtool -B -e ionice -c2 -n7 nice -n19 dspace index-discovery -b
```

- I was going to make a new controlled vocabulary of the top 100 terms after these corrections, but I noticed a bunch of duplicates and variations when I sorted them alphabetically
- Instead, I exported a new list and asked Peter to look at it again
- Apply Peter's new corrections on DSpace Test and CGSpace:

```
$ ./fix-metadata-values.py -i /tmp/2019-05-17-fix-25-Investors.csv -db dspace -u dspace -p 'fuuu' -f dc.description.sponsorship -m 29 -t correct -d
$ ./delete-metadata-values.py -i /tmp/2019-05-17-delete-14-Investors.csv -db dspace -u dspace -p 'fuuu' -m 29 -f dc.description.sponsorship -d
```

- Then I re-exported the sponsors and took the top 100 to update the existing controlled vocabulary ([#423](https://github.com/ilri/DSpace/pull/423))
- I will deploy the changes on CGSpace the next time we re-deploy

## 2019-05-19

- Add "ISI journal" to the item view sidebar at the request of Maria Garruccio
- Update the `fix-metadata-values.py` and `delete-metadata-values.py` scripts to add some basic checking of CSV fields and to colorize the shell output using Colorama (a small sketch of the idea is below)

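- The Colorama part is simple; a minimal sketch of the idea (the field names and helper are hypothetical, not the actual script code):

```
import sys

from colorama import Fore, init

init(autoreset=True)

def check_row(row, required=("text_value", "correct")):
    # hypothetical helper: bail out loudly if a CSV row is missing a required field
    missing = [field for field in required if not row.get(field)]
    if missing:
        print(Fore.RED + f"Missing fields {missing} in row: {row}")
        sys.exit(1)
    print(Fore.GREEN + f"Row OK: {row['text_value']}")
```
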
## 2019-05-24

- Update the AReS README.md on the GitHub repository to add a proper introduction, credits, requirements, installation instructions, and legal information
- Update CIP subjects in input forms on CGSpace ([#424](https://github.com/ilri/DSpace/pull/424))

## 2019-05-25

- Help Abenet proof ten Africa Rice publications
- Convert some dates to strings (they were numbers in Excel)
- Trim whitespace on all fields
- Correct and standardize affiliations
- Validate subject terms against AGROVOC
- Add rights information to all items
- Correct and standardize sponsors
- Generate a Simple Archive Format bundle with SAFBuilder and import it into the [AfricaRice Articles in Journals](https://cgspace.cgiar.org/handle/10568/101106) collection on CGSpace:

```
$ dspace import -a -e me@cgiar.org -m 2019-05-25-AfricaRice.map -s /tmp/SimpleArchiveFormat
```

## 2019-05-27

- Peter sent me over two thousand corrections for the authors on CGSpace that I had dumped last month
- I proofed them for whitespace and invalid special characters in OpenRefine and then applied them on CGSpace and DSpace Test:

```
$ ./fix-metadata-values.py -i /tmp/2019-05-27-fix-2472-Authors.csv -db dspace -u dspace -p 'fuuu' -f dc.contributor.author -m 3 -t corrections -d
```

- Then start a full Discovery re-indexing on each server:

```
$ export JAVA_OPTS="-Dfile.encoding=UTF-8 -Xmx1024m"
$ time schedtool -B -e ionice -c2 -n7 nice -n19 dspace index-discovery -b
```

- Export a new list of all authors from the CGSpace database to send to Peter:

```
dspace=# \copy (select distinct text_value, count(*) as count from metadatavalue where metadata_field_id = (select metadata_field_id from metadatafieldregistry where element = 'contributor' and qualifier = 'author') AND resource_type_id = 2 group by text_value order by count desc) to /tmp/2019-05-27-all-authors.csv with csv header;
COPY 64871
```

- Run all system updates on DSpace Test (linode19) and reboot it
- Paola from CIAT asked for a way to generate a report of the top keywords for each year of their articles and journals
- I told them that the best way (even though it's low tech) is to work on a CSV dump of the collection, for example with a short script like the sketch below

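- Something like this quick sketch would work on such a dump (the column names are only what a DSpace metadata export typically uses and would need adjusting to match the real CSV):

```
import csv
from collections import Counter, defaultdict

# count keywords per year from a DSpace metadata CSV export
counts = defaultdict(Counter)
with open("ciat-articles.csv", newline="") as f:
    for row in csv.DictReader(f):
        year = row.get("dc.date.issued", "")[:4]
        for subject in filter(None, row.get("dc.subject[en_US]", "").split("||")):
            counts[year][subject.strip()] += 1

for year in sorted(counts):
    print(year, counts[year].most_common(10))
```
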
## 2019-05-29

- A CIMMYT user was having problems registering or logging into CGSpace
- I tried to register her and it gave an error, then I remembered that for CGIAR LDAP users we actually just need to log in and it will automatically create an eperson
- I told her to try to log in with the LDAP login method and let me know what happens (then I can look in the logs too)

## 2019-05-30

- I see the following error in the DSpace log when the user tries to log in with her CGIAR email and password via the LDAP login:

```
2019-05-30 07:19:35,166 INFO org.dspace.authenticate.LDAPAuthentication @ anonymous:session_id=A5E0C836AF8F3ABB769FE47107AE1CFF:ip_addr=185.71.4.34:failed_login:no DN found for user sa.saini@cgiar.org
```

- For now I just created an eperson with her personal email address until I have time to check LDAP to see what's up with her CGIAR account:

```
$ dspace user -a -m blah@blah.com -g Sakshi -s Saini -p 'sknflksnfksnfdls'
```

<!-- vim: set sw=2 ts=2: -->