<ul>
<li>Troubleshooting the missing Altmetric scores on AReS
<ul>
<li>Turns out that I didn’t actually fix them last month because the check for <code>content.altmetric</code> still exists, and I can’t access the DOIs using <code>_h.source.DOI</code> for some reason</li>
<li>I can access all other kinds of item metadata using the Elasticsearch label, but not DOI!!!</li>
<li>I will change <code>DOI</code> to <code>tomato</code> in the repository setup and start a re-harvest… I need to see if this is some kind of reserved word or something…</li>
<li>Even as <code>tomato</code> I can’t access that field as <code>_h.source.tomato</code> in Angular, but it does work as a filter source… sigh</li>
</ul>
</li>
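<li>A quick sanity check here is to pull one harvested document straight from Elasticsearch and look for the DOI field (a sketch; the index name <code>openrxv-items</code> is an assumption about this AReS instance):</li>
</ul>
<pre tabindex="0"><code class="language-console" data-lang="console">$ curl -s 'http://localhost:9200/openrxv-items/_search?size=1' | python3 -m json.tool | grep -i '"doi"'
</code></pre><ul>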
<li>Checking last month’s Solr statistics to see if there are any new bots that I need to purge and add to the list
<ul>
<li>78.203.225.68 made 50,000 requests on one day in August, and it is using this user agent: <code>Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/87.0.4280.66 Safari/537.36</code></li>
<li>It’s a fixed line ISP in Montpellier according to AbuseIPDB.com, and has not been flagged as abusive, so it must be some CGIAR SMO person doing some web application harvesting from the browser</li>
<li>130.255.162.154 is in Sweden and made 46,000 requests in August and it is using this user agent: <code>Mozilla/5.0 (Macintosh; Intel Mac OS X 11.1; rv:84.0) Gecko/20100101 Firefox/84.0</code></li>
<li>35.174.144.154 is on Amazon and made 28,000 requests with this user agent: <code>Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/78.0.3904.97 Safari/537.36</code></li>
<li>192.121.135.6 is in Sweden and made 9,000 requests with this user agent: <code>Mozilla/5.0 (Macintosh; Intel Mac OS X 11.1; rv:84.0) Gecko/20100101 Firefox/84.0</code></li>
<li>185.38.40.66 is in Germany and made 6,000 requests with this user agent: <code>Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:89.0) Gecko/20100101 Firefox/89.0 BoldBrains SC/1.10.2.4</code></li>
<li>3.225.28.105 is in Amazon and made 3,000 requests with this user agent: <code>Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/50.0.2661.102 Safari/537.36</code></li>
<li>I also noticed that we still have tons (25,000) of requests by MSNbot using this normal-looking user agent: <code>Mozilla/5.0 (Windows NT 6.3; Trident/7.0; rv:11.0) like Gecko</code></li>
<li>I can identify them by their reverse DNS: msnbot-40-77-167-105.search.msn.com.</li>
<li>I had already purged a bunch of these by their IPs in 2021-06, so it looks like I have to do that again</li>
<li>While looking at the MSN requests I noticed tons of requests from other strange hosts, judging by their reverse DNS: malta2095.startdedicated.com., astra5139.startdedicated.com., and many others</li>
<li>They must be related, because I see them all using the exact same user agent: <code>Mozilla/5.0 (Windows NT 6.3; Trident/7.0; rv:11.0) like Gecko</code></li>
<li>So this startdedicated.com DNS is some Bing bot also…</li>
</ul>
</li>
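<li>For reference, the kind of Solr facet query that surfaces the top IPs for the month looks roughly like this (a sketch; the port and core name assume a typical DSpace 6 setup):</li>
</ul>
<pre tabindex="0"><code class="language-console" data-lang="console">$ curl -s 'http://localhost:8081/solr/statistics/select' \
    --data-urlencode 'q=time:[2021-08-01T00:00:00Z TO 2021-08-31T23:59:59Z]' \
    --data-urlencode 'rows=0' \
    --data-urlencode 'facet=true' \
    --data-urlencode 'facet.field=ip' \
    --data-urlencode 'facet.limit=20'
</code></pre><ul>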
<li>I extracted all the IPs and purged them using my <code>check-spider-ip-hits.sh</code> script
<ul>
<li>In total I purged 225,000 hits…</li>
</ul>
</li>
</ul>
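<ul>
<li>The purge itself is then a single invocation of that script (a sketch; the path and the <code>-f</code>/<code>-p</code> options may not be exact):</li>
</ul>
<pre tabindex="0"><code class="language-console" data-lang="console">$ ./ilri/check-spider-ip-hits.sh -f /tmp/ips.txt -p
</code></pre>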
<h2 id="2021-09-12">2021-09-12</h2>
<ul>
<li>Start a harvest on AReS</li>
</ul>
<h2 id="2021-09-13">2021-09-13</h2>
<ul>
<li>Mishell Portilla asked me about thumbnails on CGSpace being small
<ul>
<li>For example, <a href="https://cgspace.cgiar.org/handle/10568/114576">10568/114576</a> has a lot of white space on the left side</li>
<li>I created a new thumbnail with vipsthumbnail:</li>
</ul>
</li>
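<li>A sketch of the kind of vipsthumbnail invocation used for this (the input file name and output width are illustrative):</li>
</ul>
<pre tabindex="0"><code class="language-console" data-lang="console">$ vipsthumbnail cover.pdf -s 600 -o '%s.jpg[Q=85,optimize_coding,strip]'
</code></pre><ul>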
<li>Some people from the Alliance contacted me last week about AICCRA metadata
<ul>
<li>They have internal things called Components and Clusters, so they were asking how to store these in CGSpace</li>
<li>I suggested adding new metadata fields: <code>cg.subject.aiccraComponent</code> and <code>cg.subject.aiccraCluster</code></li>
<li>On second thought, these are identifiers so perhaps this is better: <code>cg.identifier.aiccraComponent</code> and <code>cg.identifier.aiccraCluster</code></li>
</ul>
</li>
</ul>
<h2 id="2021-09-15">2021-09-15</h2>
<ul>
<li>Add ORCID identifiers for new ILRI staff to our controlled vocabulary
<ul>
<li>Also tag their twenty-five existing items on CGSpace:</li>
</ul>
</li>
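<li>A sketch of that routine (the CSV file name, the example author, and the script options shown here are illustrative):</li>
</ul>
<pre tabindex="0"><code class="language-console" data-lang="console">$ cat /tmp/2021-09-15-add-orcids.csv
dc.contributor.author,cg.creator.identifier
"Carberry, Josiah","Josiah Carberry: 0000-0002-1825-0097"
$ ./ilri/add-orcid-identifiers-csv.py -i /tmp/2021-09-15-add-orcids.csv -db dspace -u dspace -p 'fuuu'
</code></pre><ul>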
<li>Meeting with Leroy Mwanzia and some other Alliance people about depositing to CGSpace via API
<ul>
<li>I gave them some technical information about the CGSpace API and links to the controlled vocabularies and metadata registries we are using</li>
<li>I also told them that I would create some documentation listing the metadata fields, which ones are mandatory, and their respective controlled vocabularies</li>
</ul>
</li>
<li>Start writing a Python script to parse <code>input-forms.xml</code> to create documentation for submissions
<ul>
<li>Found a bug with the DSpace 6.3 REST API: it returns HTTP 500 for <code>dc.title</code> even though it exists in the registry: <a href="https://demo.dspace.org/rest/registries/schema/dc/metadata-fields/title">https://demo.dspace.org/rest/registries/schema/dc/metadata-fields/title</a></li>
<li>Seems to be with any field that does not have a qualifier</li>
<li>I filed an issue: <a href="https://github.com/DSpace/DSpace/issues/7946">https://github.com/DSpace/DSpace/issues/7946</a></li>
</ul>
</li>
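<li>As a starting point, the field definitions can be pulled out of <code>input-forms.xml</code> on the command line (a sketch; the form name <code>traditional</code> is illustrative):</li>
</ul>
<pre tabindex="0"><code class="language-console" data-lang="console">$ xmllint --xpath '//form[@name="traditional"]//field/dc-element' input-forms.xml
</code></pre><ul>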
<li>I decided to update all the metadata field descriptions in our registry so I can use that instead of the “hint” for each field in the input form
<ul>
<li>I will include examples as well so that it becomes a better resource</li>
</ul>
</li>
</ul>
<h2 id="2021-09-17">2021-09-17</h2>
<ul>
<li>I filed <a href="https://github.com/AgriculturalSemantics/cg-core/issues/41">an issue about using SPDX License Identifiers in CG Core v2</a></li>
<li>Peter Ballantyne emailed me to say that CGSpace was very slow
<ul>
<li>The front page was returning a blank white page</li>
<li>I looked at the database and the connections look low:</li>
</ul>
</li>
</ul>
<pre tabindex="0"><code class="language-console" data-lang="console">$ psql -c 'SELECT * FROM pg_stat_activity' | wc -l
63
</code></pre><ul>
<li>Load on the server is under 1.0, and there are only about 1,000 XMLUI sessions, which seems to be normal for this time of day according to Munin</li>
<li>But the DSpace log file shows tons of database issues:</li>
</ul>
<pre tabindex="0"><code class="language-console" data-lang="console">$ grep -c "Timeout waiting for idle object" dspace.log.2021-09-17
14779
</code></pre><ul>
<li>The earliest one I see is around midnight (now is 2PM):</li>
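<li>A sketch of pulling out that first occurrence (just the earliest matching line in the log):</li>
</ul>
<pre tabindex="0"><code class="language-console" data-lang="console">$ grep -m1 'Timeout waiting for idle object' dspace.log.2021-09-17
</code></pre><ul>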
<li>Continue working on cleaning up and annotating the metadata registry on CGSpace
<ul>
<li>I removed two old metadata fields that we stopped using earlier this year with the CG Core v2 migration: <code>cg.targetaudience</code> and <code>cg.title.journal</code></li>
</ul>
</li>
</ul>
<h2 id="2021-09-18">2021-09-18</h2>
<ul>
<li>Make more progress on parsing and documenting the CGSpace submission form
<ul>
<li>Publish on GitHub: <a href="https://github.com/ilri/cgspace-submission-guidelines">https://github.com/ilri/cgspace-submission-guidelines</a></li>
</ul>
</li>
</ul>
<h2 id="2021-09-19">2021-09-19</h2>
<ul>
<li>Improve CGSpace Submission Guidelines metadata parsing and documentation
<ul>
<li>GitHub Pages is live now: <a href="https://ilri.github.io/cgspace-submission-guidelines/">https://ilri.github.io/cgspace-submission-guidelines/</a></li>
</ul>
</li>
<li>Start a full harvest on AReS
<ul>
<li>The harvest completed successfully, but for some reason there were only 92,000 items…</li>
<li>I updated all Docker images, rebuilt the application, then ran all system updates and rebooted the system:</li>
</ul>
</li>
</ul>
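<ul>
<li>Roughly the following, as a sketch (the OpenRXV checkout location and compose file name are assumptions):</li>
</ul>
<pre tabindex="0"><code class="language-console" data-lang="console">$ docker images | grep -v ^REPO | sed 's/ \+/:/g' | cut -d: -f1,2 | xargs -L1 docker pull
$ cd OpenRXV
$ docker-compose -f docker/docker-compose.yml build
$ sudo apt update
$ sudo apt full-upgrade
$ sudo reboot
</code></pre>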
<ul>
<li>Separately, the LDAP lookup that CGSpace uses for logins is failing:</li>
</ul>
<pre tabindex="0"><code class="language-console" data-lang="console">ldap_sasl_bind(SIMPLE): Can't contact LDAP server (-1)
</code></pre><ul>
<li>I sent a message to CGNET to ask about the server settings and see if our IP is still whitelisted
<ul>
<li>It turns out that CGNET created a new Active Directory server (AZCGNEROOT3.cgiarad.org) and decommissioned the old one last week</li>
<li>I updated the configuration on CGSpace and confirmed that it is working</li>
</ul>
</li>
<li>Create another test account for Rafael from Bioversity-CIAT to submit some items to DSpace Test:</li>
</ul>
<pre tabindex="0"><code class="language-console" data-lang="console">$ dspace user -a -m tip-submit@cgiar.org -g CIAT -s Submit -p 'fuuuuuuuu'
</code></pre><ul>
<li>I added the account to the Alliance Admins group, which should allow him to submit to any Alliance collection
<ul>
<li>According to my notes from <a href="/cgspace-notes/2020-10/">2020-10</a> the account must be in the admin group in order to submit via the REST API</li>
</ul>
</li>
<li>Run <code>dspace cleanup -v</code> process on CGSpace to clean up old bitstreams</li>
<li>Export lists of authors, donors, and affiliations for Peter Ballantyne to clean up:</li>
</ul>
<pre tabindex="0"><code class="language-console" data-lang="console">localhost/dspace63= > \COPY (SELECT DISTINCT text_value as "dc.contributor.author", count(*) FROM metadatavalue WHERE dspace_object_id IN (SELECT uuid FROM item) AND metadata_field_id = 3 GROUP BY text_value ORDER BY count DESC) to /tmp/2021-09-20-authors.csv WITH CSV HEADER;
COPY 80901
localhost/dspace63= > \COPY (SELECT DISTINCT text_value as "cg.contributor.donor", count(*) FROM metadatavalue WHERE dspace_object_id IN (SELECT uuid FROM item) AND metadata_field_id = 248 GROUP BY text_value ORDER BY count DESC) to /tmp/2021-09-20-donors.csv WITH CSV HEADER;
COPY 1274
localhost/dspace63= > \COPY (SELECT DISTINCT text_value as "cg.contributor.affiliation", count(*) FROM metadatavalue WHERE dspace_object_id IN (SELECT uuid FROM item) AND metadata_field_id = 211 GROUP BY text_value ORDER BY count DESC) to /tmp/2021-09-20-affiliations.csv WITH CSV HEADER;