<!DOCTYPE html>
<html lang="en" >
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
<meta property="og:title" content="April, 2022" />
<meta property="og:description" content="2022-04-01 I did G1GC tests on DSpace Test (linode26) to compliment the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia&rsquo;s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54." />
<meta property="og:type" content="article" />
<meta property="og:url" content="https://alanorth.github.io/cgspace-notes/2022-04/" />
<meta property="article:published_time" content="2022-04-01T10:53:39+03:00" />
<meta property="article:modified_time" content="2022-05-04T11:09:45+03:00" />
<meta name="twitter:card" content="summary"/>
<meta name="twitter:title" content="April, 2022"/>
<meta name="twitter:description" content="2022-04-01 I did G1GC tests on DSpace Test (linode26) to compliment the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia&rsquo;s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54."/>
<meta name="generator" content="Hugo 0.102.3" />
<script type="application/ld+json">
{
"@context": "http://schema.org",
"@type": "BlogPosting",
"headline": "April, 2022",
"url": "https://alanorth.github.io/cgspace-notes/2022-04/",
"wordCount": "2015",
"datePublished": "2022-04-01T10:53:39+03:00",
"dateModified": "2022-05-04T11:09:45+03:00",
"author": {
"@type": "Person",
"name": "Alan Orth"
},
"keywords": "Notes"
}
</script>
<link rel="canonical" href="https://alanorth.github.io/cgspace-notes/2022-04/">
<title>April, 2022 | CGSpace Notes</title>
<!-- combined, minified CSS -->
<link href="https://alanorth.github.io/cgspace-notes/css/style.c6ba80bc50669557645abe05f86b73cc5af84408ed20f1551a267bc19ece8228.css" rel="stylesheet" integrity="sha256-xrqAvFBmlVdkWr4F&#43;GtzzFr4RAjtIPFVGiZ7wZ7Ogig=" crossorigin="anonymous">
<!-- minified Font Awesome for SVG icons -->
<script defer src="https://alanorth.github.io/cgspace-notes/js/fontawesome.min.f5072c55a0721857184db93a50561d7dc13975b4de2e19db7f81eb5f3fa57270.js" integrity="sha256-9QcsVaByGFcYTbk6UFYdfcE5dbTeLhnbf4HrXz&#43;lcnA=" crossorigin="anonymous"></script>
<!-- RSS 2.0 feed -->
</head>
<body>
<div class="blog-masthead">
<div class="container">
<nav class="nav blog-nav">
<a class="nav-link " href="https://alanorth.github.io/cgspace-notes/">Home</a>
</nav>
</div>
</div>
<header class="blog-header">
<div class="container">
<h1 class="blog-title" dir="auto"><a href="https://alanorth.github.io/cgspace-notes/" rel="home">CGSpace Notes</a></h1>
<p class="lead blog-description" dir="auto">Documenting day-to-day work on the <a href="https://cgspace.cgiar.org">CGSpace</a> repository.</p>
</div>
</header>
<div class="container">
<div class="row">
<div class="col-sm-8 blog-main">
<article class="blog-post">
<header>
<h2 class="blog-post-title" dir="auto"><a href="https://alanorth.github.io/cgspace-notes/2022-04/">April, 2022</a></h2>
<p class="blog-post-meta">
<time datetime="2022-04-01T10:53:39+03:00">Fri Apr 01, 2022</time>
in
<span class="fas fa-folder" aria-hidden="true"></span>&nbsp;<a href="/categories/notes/" rel="category tag">Notes</a>
</p>
</header>
<h2 id="2022-04-01">2022-04-01</h2>
<ul>
<li>I did G1GC tests on DSpace Test (linode26) to complement the CMS tests I did yesterday
<ul>
<li>The Discovery indexing took this long:</li>
</ul>
</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>real 334m33.625s
</span></span><span style="display:flex;"><span>user 227m51.331s
</span></span><span style="display:flex;"><span>sys 3m43.037s
</span></span></code></pre></div><h2 id="2022-04-04">2022-04-04</h2>
<ul>
<li>Start a full harvest on AReS</li>
<li>Help Marianne with submit/approve access on a new collection on CGSpace</li>
<li>Go back in Gaia&rsquo;s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc)</li>
<li>Looking at the Solr statistics for 2022-03 on CGSpace
<ul>
<li>I see 54.229.218.204 on Amazon AWS made 49,000 requests, some of them with this user agent: <code>Apache-HttpClient/4.5.9 (Java/1.8.0_322)</code>, and many others with a normal browser agent, so that&rsquo;s fishy!</li>
<li>The DSpace agent pattern <code>http.?agent</code> seems to have caught the first ones, but I&rsquo;ll purge the IP ones</li>
<li>I see 40.77.167.80 is Bing or MSN Bot, but using a normal browser user agent, and if I search Solr for <code>dns:*msnbot* AND dns:*.msn.com.</code> I see over 100,000 hits, which is a problem I noticed a few months ago too&hellip;</li>
<li>I extracted the MSN Bot IPs from Solr using an IP facet (see the sketch after this list), then used the <code>check-spider-ip-hits.sh</code> script to purge them</li>
</ul>
</li>
</ul>
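<p>For reference, extracting those IPs is just a facet query on the <code>ip</code> field of the Solr statistics core, roughly like this (a sketch from memory; the Solr host, port, and core name may differ on the server):</p>
<pre tabindex="0"><code>$ curl -s &#39;http://localhost:8081/solr/statistics/select?q=dns:*msnbot*+AND+dns:*.msn.com.&amp;rows=0&amp;wt=json&amp;facet=true&amp;facet.field=ip&amp;facet.mincount=1&amp;facet.limit=-1&#39;
</code></pre>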
<h2 id="2022-04-10">2022-04-10</h2>
<ul>
<li>Start a full harvest on AReS</li>
</ul>
<h2 id="2022-04-13">2022-04-13</h2>
<ul>
<li>UptimeRobot mailed to say that CGSpace was down
<ul>
<li>I looked and found the load at 44&hellip;</li>
</ul>
</li>
<li>There seem to be a lot of locks from the XMLUI:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ psql -c <span style="color:#e6db74">&#39;SELECT * FROM pg_locks pl LEFT JOIN pg_stat_activity psa ON pl.pid = psa.pid;&#39;</span> | grep -o -E <span style="color:#e6db74">&#39;(dspaceWeb|dspaceApi)&#39;</span> | sort | uniq -c | sort -n
</span></span><span style="display:flex;"><span> 3173 dspaceWeb
</span></span></code></pre></div><ul>
<li>Looking at the top IPs in nginx&rsquo;s access log one IP in particular stands out:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span> 941 66.249.66.222
</span></span><span style="display:flex;"><span> 1224 95.108.213.28
</span></span><span style="display:flex;"><span> 2074 157.90.209.76
</span></span><span style="display:flex;"><span> 3064 66.249.66.221
</span></span><span style="display:flex;"><span> 95743 185.192.69.15
</span></span></code></pre></div><ul>
<li>185.192.69.15 is in the UK</li>
<li>I added a block for that IP in nginx (see the snippet below) and the load went down&hellip;</li>
</ul>
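<p>The block itself is just an nginx <code>deny</code> rule; a minimal sketch, assuming a drop-in config file that nginx includes (the path here is hypothetical, and the real config may be organized differently):</p>
<pre tabindex="0"><code># echo &#39;deny 185.192.69.15;&#39; &gt;&gt; /etc/nginx/conf.d/blocklist.conf
# nginx -t &amp;&amp; systemctl reload nginx
</code></pre>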
<h2 id="2022-04-16">2022-04-16</h2>
<ul>
<li>Start harvest on AReS</li>
</ul>
<h2 id="2022-04-18">2022-04-18</h2>
<ul>
<li>I woke up to several notices from UptimeRobot that CGSpace had gone down and up in the night (of course I&rsquo;m on holiday out of the country for Easter)
<ul>
<li>I see there are many locks in use from the XMLUI:</li>
</ul>
</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ psql -c <span style="color:#e6db74">&#39;SELECT * FROM pg_locks pl LEFT JOIN pg_stat_activity psa ON pl.pid = psa.pid;&#39;</span> | grep -o -E <span style="color:#e6db74">&#39;(dspaceWeb|dspaceApi)&#39;</span> | sort | uniq -c
</span></span><span style="display:flex;"><span> 8932 dspaceWeb
</span></span></code></pre></div><ul>
<li>Looking at the top IPs making requests it seems they are Yandex, bingbot, and Googlebot:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span># cat /var/log/nginx/access.log /var/log/nginx/access.log.1 | awk <span style="color:#e6db74">&#39;{print $1}&#39;</span> | sort | uniq -c | sort -h
</span></span><span style="display:flex;"><span> 752 69.162.124.231
</span></span><span style="display:flex;"><span> 759 66.249.64.213
</span></span><span style="display:flex;"><span> 864 66.249.66.222
</span></span><span style="display:flex;"><span> 905 2a01:4f8:221:f::2
</span></span><span style="display:flex;"><span> 1013 84.33.2.97
</span></span><span style="display:flex;"><span> 1201 157.55.39.159
</span></span><span style="display:flex;"><span> 1204 157.55.39.144
</span></span><span style="display:flex;"><span> 1209 157.55.39.102
</span></span><span style="display:flex;"><span> 1217 157.55.39.161
</span></span><span style="display:flex;"><span> 1252 207.46.13.177
</span></span><span style="display:flex;"><span> 1274 157.55.39.162
</span></span><span style="display:flex;"><span> 2553 66.249.66.221
</span></span><span style="display:flex;"><span> 2941 95.108.213.28
</span></span></code></pre></div><ul>
<li>One IP is using a strange user agent though:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>84.33.2.97 - - [18/Apr/2022:00:20:38 +0200] &#34;GET /bitstream/handle/10568/109581/Banana_Blomme%20_2020.pdf.jpg HTTP/1.1&#34; 404 10890 &#34;-&#34; &#34;SomeRandomText&#34;
</span></span></code></pre></div><ul>
<li>Overall, it seems we had 17,000 unique IPs connecting in the last nine hours (currently 9:14AM and log file rolled over at 00:00):</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span># cat /var/log/nginx/access.log | awk <span style="color:#e6db74">&#39;{print $1}&#39;</span> | sort | uniq | wc -l
</span></span><span style="display:flex;"><span>17314
</span></span></code></pre></div><ul>
<li>That&rsquo;s a lot of unique IPs, and I see some patterns of IPs in China making ten to twenty requests each
<ul>
<li>The ISPs I&rsquo;ve seen so far are ChinaNet and China Unicom</li>
</ul>
</li>
<li>I extracted all the IPs from today and resolved them:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span># cat /var/log/nginx/access.log | awk <span style="color:#e6db74">&#39;{print $1}&#39;</span> | sort | uniq &gt; /tmp/2022-04-18-ips.txt
</span></span><span style="display:flex;"><span>$ ./ilri/resolve-addresses-geoip2.py -i /tmp/2022-04-18-ips.txt -o /tmp/2022-04-18-ips.csv
</span></span></code></pre></div><ul>
<li>The top ASNs by IP are:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ csvcut -c <span style="color:#ae81ff">2</span> /tmp/2022-04-18-ips.csv | sed 1d | sort | uniq -c | sort -n | tail -n <span style="color:#ae81ff">10</span>
</span></span><span style="display:flex;"><span> 102 GOOGLE
</span></span><span style="display:flex;"><span> 139 Maxihost LTDA
</span></span><span style="display:flex;"><span> 165 AMAZON-02
</span></span><span style="display:flex;"><span> 393 &#34;China Mobile Communications Group Co., Ltd.&#34;
</span></span><span style="display:flex;"><span> 473 AMAZON-AES
</span></span><span style="display:flex;"><span> 616 China Mobile communications corporation
</span></span><span style="display:flex;"><span> 642 M247 Ltd
</span></span><span style="display:flex;"><span> 2336 HostRoyale Technologies Pvt Ltd
</span></span><span style="display:flex;"><span> 4556 Chinanet
</span></span><span style="display:flex;"><span> 5527 CHINA UNICOM China169 Backbone
</span></span><span style="display:flex;"><span>$ csvcut -c <span style="color:#ae81ff">4</span> /tmp/2022-04-18-ips.csv | sed 1d | sort | uniq -c | sort -n | tail -n <span style="color:#ae81ff">10</span>
</span></span><span style="display:flex;"><span> 139 262287
</span></span><span style="display:flex;"><span> 165 16509
</span></span><span style="display:flex;"><span> 180 204287
</span></span><span style="display:flex;"><span> 393 9808
</span></span><span style="display:flex;"><span> 473 14618
</span></span><span style="display:flex;"><span> 615 56041
</span></span><span style="display:flex;"><span> 642 9009
</span></span><span style="display:flex;"><span> 2156 203020
</span></span><span style="display:flex;"><span> 4556 4134
</span></span><span style="display:flex;"><span> 5527 4837
</span></span></code></pre></div><ul>
<li>I spot checked a few IPs from each of these and they are definitely just making bullshit requests to Discovery and HTML sitemap etc</li>
<li>I will download the IP blocks for each ASN except Google and Amazon and ban them</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ wget https://asn.ipinfo.app/api/text/nginx/AS4837 https://asn.ipinfo.app/api/text/nginx/AS4134 https://asn.ipinfo.app/api/text/nginx/AS203020 https://asn.ipinfo.app/api/text/nginx/AS9009 https://asn.ipinfo.app/api/text/nginx/AS56041 https://asn.ipinfo.app/api/text/nginx/AS9808
</span></span><span style="display:flex;"><span>$ cat AS* | sed -e <span style="color:#e6db74">&#39;/^$/d&#39;</span> -e <span style="color:#e6db74">&#39;/^#/d&#39;</span> -e <span style="color:#e6db74">&#39;/^{/d&#39;</span> -e <span style="color:#e6db74">&#39;s/deny //&#39;</span> -e <span style="color:#e6db74">&#39;s/;//&#39;</span> | sort | uniq | wc -l
</span></span><span style="display:flex;"><span>20296
</span></span></code></pre></div><ul>
<li>I extracted the IPv4 and IPv6 networks:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ cat AS* | sed -e <span style="color:#e6db74">&#39;/^$/d&#39;</span> -e <span style="color:#e6db74">&#39;/^#/d&#39;</span> -e <span style="color:#e6db74">&#39;/^{/d&#39;</span> -e <span style="color:#e6db74">&#39;s/deny //&#39;</span> -e <span style="color:#e6db74">&#39;s/;//&#39;</span> | grep <span style="color:#e6db74">&#34;:&#34;</span> | sort &gt; /tmp/ipv6-networks.txt
</span></span><span style="display:flex;"><span>$ cat AS* | sed -e <span style="color:#e6db74">&#39;/^$/d&#39;</span> -e <span style="color:#e6db74">&#39;/^#/d&#39;</span> -e <span style="color:#e6db74">&#39;/^{/d&#39;</span> -e <span style="color:#e6db74">&#39;s/deny //&#39;</span> -e <span style="color:#e6db74">&#39;s/;//&#39;</span> | grep -v <span style="color:#e6db74">&#34;:&#34;</span> | sort &gt; /tmp/ipv4-networks.txt
</span></span></code></pre></div><ul>
<li>I suspect we need to aggregate these networks since there are so many and nftables doesn&rsquo;t like it when they overlap:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ wc -l /tmp/ipv4-networks.txt
</span></span><span style="display:flex;"><span>15464 /tmp/ipv4-networks.txt
</span></span><span style="display:flex;"><span>$ aggregate6 /tmp/ipv4-networks.txt | wc -l
</span></span><span style="display:flex;"><span>2781
</span></span><span style="display:flex;"><span>$ wc -l /tmp/ipv6-networks.txt
</span></span><span style="display:flex;"><span>4833 /tmp/ipv6-networks.txt
</span></span><span style="display:flex;"><span>$ aggregate6 /tmp/ipv6-networks.txt | wc -l
</span></span><span style="display:flex;"><span>338
</span></span></code></pre></div><ul>
<li>I deployed these lists on CGSpace, ran all updates, and rebooted the server
<ul>
<li>This list is SURELY too broad because we will block legitimate users in China&hellip; but right now how can I discern?</li>
<li>Also, I need to purge the hits from these 14,000 IPs in Solr when I get time</li>
</ul>
</li>
<li>Looking back at the Munin graphs a few hours later I see this was indeed some kind of spike that was out of the ordinary:</li>
</ul>
<p><img src="/cgspace-notes/2022/04/postgres_connections_ALL-day.png" alt="PostgreSQL connections day">
<img src="/cgspace-notes/2022/04/jmx_dspace_sessions-day.png" alt="DSpace sessions day"></p>
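<p>As for deploying the aggregated lists, they presumably end up in nftables sets; a rough sketch of doing it by hand (the table, chain, and set names here are made up for illustration):</p>
<pre tabindex="0"><code># nft add table inet filter
# nft add chain inet filter input &#39;{ type filter hook input priority 0; policy accept; }&#39;
# nft add set inet filter ipv4-blocklist &#39;{ type ipv4_addr; flags interval; }&#39;
# nft add set inet filter ipv6-blocklist &#39;{ type ipv6_addr; flags interval; }&#39;
# nft add rule inet filter input ip saddr @ipv4-blocklist drop
# nft add rule inet filter input ip6 saddr @ipv6-blocklist drop
# while read -r network; do nft add element inet filter ipv4-blocklist &#34;{ $network }&#34;; done &lt; /tmp/ipv4-networks-aggregated.txt
# while read -r network; do nft add element inet filter ipv6-blocklist &#34;{ $network }&#34;; done &lt; /tmp/ipv6-networks-aggregated.txt
</code></pre>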
<ul>
<li>I used <code>grepcidr</code> with the aggregated network lists to extract IPs matching those networks from the nginx logs for the past day:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span># cat /var/log/nginx/access.log /var/log/nginx/access.log.1 | awk <span style="color:#e6db74">&#39;{print $1}&#39;</span> | sort -u &gt; /tmp/ips.log
</span></span><span style="display:flex;"><span># <span style="color:#66d9ef">while</span> read -r network; <span style="color:#66d9ef">do</span> grepcidr $network /tmp/ips.log &gt;&gt; /tmp/ipv4-ips.txt; <span style="color:#66d9ef">done</span> &lt; /tmp/ipv4-networks-aggregated.txt
</span></span><span style="display:flex;"><span># <span style="color:#66d9ef">while</span> read -r network; <span style="color:#66d9ef">do</span> grepcidr $network /tmp/ips.log &gt;&gt; /tmp/ipv6-ips.txt; <span style="color:#66d9ef">done</span> &lt; /tmp/ipv6-networks-aggregated.txt
</span></span><span style="display:flex;"><span># wc -l /tmp/ipv4-ips.txt
</span></span><span style="display:flex;"><span>15313 /tmp/ipv4-ips.txt
</span></span><span style="display:flex;"><span># wc -l /tmp/ipv6-ips.txt
</span></span><span style="display:flex;"><span>19 /tmp/ipv6-ips.txt
</span></span></code></pre></div><ul>
<li>Then I purged them from Solr using the <code>check-spider-ip-hits.sh</code>:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ ./ilri/check-spider-ip-hits.sh -f /tmp/ipv4-ips.txt -p
</span></span></code></pre></div><h2 id="2022-04-23">2022-04-23</h2>
<ul>
<li>A handful of spider user agents that I identified were merged into COUNTER-Robots so I updated the ILRI override in our DSpace and regenerated the <code>example</code> file that contains most patterns
<ul>
<li>I updated CGSpace, then ran all system updates and rebooted the host</li>
<li>I also ran <code>dspace cleanup -v</code> to prune the database</li>
</ul>
</li>
</ul>
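<p>The usual follow-up after adding new agent patterns is to purge any existing hits that match them from Solr; a sketch using the <code>check-spider-hits.sh</code> script from the same ilri scripts directory (the script name, flags, and path to the regenerated example file are from memory, so treat them as assumptions):</p>
<pre tabindex="0"><code>$ ./ilri/check-spider-hits.sh -f dspace/config/spiders/agents/example -p
</code></pre>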
<h2 id="2022-04-24">2022-04-24</h2>
<ul>
<li>Start a harvest on AReS</li>
</ul>
<h2 id="2022-04-25">2022-04-25</h2>
<ul>
<li>Looking at the countries on AReS I decided to collect a list to remind Jacquie at WorldFish again about how many incorrect ones they have
<ul>
<li>There are about sixty incorrect ones, some of which I can correct via the value mappings on AReS, but most I can&rsquo;t</li>
<li>I set up value mappings for seventeen countries, then sent another sixty or so to Jacquie and Salem to hopefully delete</li>
</ul>
</li>
<li>I notice we have over 1,000 items with region <code>Africa South of Sahara</code>
<ul>
<li>I am surprised to see these because we did a mass migration to <code>Sub-Saharan Africa</code> in 2020-10 when we aligned to UN M.49</li>
<li>Oh! It seems I used a capital O in <code>Of</code>!</li>
<li>This is curious, I see we missed <code>East Asia</code> and <code>Northern America</code>, because those are still in our list, but UN M.49 uses <code>Eastern Asia</code> and <code>Northern America</code>&hellip; I will have to raise that with Peter and Abenet later</li>
<li>For now I will just re-run my fixes:</li>
</ul>
</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ cat /tmp/regions.csv
</span></span><span style="display:flex;"><span>cg.coverage.region,correct
</span></span><span style="display:flex;"><span>East Africa,Eastern Africa
</span></span><span style="display:flex;"><span>West Africa,Western Africa
</span></span><span style="display:flex;"><span>Southeast Asia,South-eastern Asia
</span></span><span style="display:flex;"><span>South Asia,Southern Asia
</span></span><span style="display:flex;"><span>Africa South of Sahara,Sub-Saharan Africa
</span></span><span style="display:flex;"><span>North Africa,Northern Africa
</span></span><span style="display:flex;"><span>West Asia,Western Asia
</span></span><span style="display:flex;"><span>$ ./ilri/fix-metadata-values.py -i /tmp/regions.csv -db dspace -u dspace -p <span style="color:#e6db74">&#39;fuuu&#39;</span> -f cg.coverage.region -m <span style="color:#ae81ff">227</span> -t correct
</span></span></code></pre></div><ul>
<li>Then I started a new harvest on AReS</li>
</ul>
<h2 id="2022-04-27">2022-04-27</h2>
<ul>
<li>I woke up to many up/down notices for CGSpace from UptimeRobot
<ul>
<li>The server has load 111.0&hellip; sigh.</li>
</ul>
</li>
<li>According to Grafana it seems to have started at 4:00 AM</li>
</ul>
<p><img src="/cgspace-notes/2022/04/cgspace-load.png" alt="Grafana load"></p>
<ul>
<li>There are a metric fuck ton of database locks from the XMLUI:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ psql -c <span style="color:#e6db74">&#39;SELECT * FROM pg_locks pl LEFT JOIN pg_stat_activity psa ON pl.pid = psa.pid;&#39;</span> | grep -o -E <span style="color:#e6db74">&#39;(dspaceWeb|dspaceApi)&#39;</span> | sort | uniq -c
</span></span><span style="display:flex;"><span> 128 dspaceApi
</span></span><span style="display:flex;"><span> 16890 dspaceWeb
</span></span></code></pre></div><ul>
<li>As for the server logs, I don&rsquo;t see many IPs connecting today:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span># cat /var/log/nginx/access.log | awk <span style="color:#e6db74">&#39;{print $1}&#39;</span> | sort | uniq | wc -l
</span></span><span style="display:flex;"><span>2924
</span></span></code></pre></div><ul>
<li>But there appear to be some IPs making many requests:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span># cat /var/log/nginx/access.log | awk <span style="color:#e6db74">&#39;{print $1}&#39;</span> | sort | uniq -c | sort -h
</span></span><span style="display:flex;"><span>...
</span></span><span style="display:flex;"><span> 345 207.46.13.53
</span></span><span style="display:flex;"><span> 646 66.249.66.222
</span></span><span style="display:flex;"><span> 678 54.90.79.112
</span></span><span style="display:flex;"><span> 1529 136.243.148.249
</span></span><span style="display:flex;"><span> 1797 54.175.8.110
</span></span><span style="display:flex;"><span> 2304 174.129.118.171
</span></span><span style="display:flex;"><span> 2523 66.249.66.221
</span></span><span style="display:flex;"><span> 2632 52.73.204.196
</span></span><span style="display:flex;"><span> 2667 54.174.240.122
</span></span><span style="display:flex;"><span> 5206 35.172.193.232
</span></span><span style="display:flex;"><span> 5646 35.153.131.101
</span></span><span style="display:flex;"><span> 6373 3.85.92.145
</span></span><span style="display:flex;"><span> 7383 34.227.10.4
</span></span><span style="display:flex;"><span> 8330 100.24.63.172
</span></span><span style="display:flex;"><span> 8342 34.236.36.176
</span></span><span style="display:flex;"><span> 8369 44.200.190.111
</span></span><span style="display:flex;"><span> 8371 3.238.116.153
</span></span><span style="display:flex;"><span> 8391 18.232.101.158
</span></span><span style="display:flex;"><span> 8631 3.239.81.247
</span></span><span style="display:flex;"><span> 8634 54.82.125.225
</span></span></code></pre></div><ul>
<li>54.82.125.225, 3.239.81.247, 18.232.101.158, 3.238.116.153, 44.200.190.111, 34.236.36.176, 100.24.63.172, 3.85.92.145, 35.153.131.101, 35.172.193.232, 54.174.240.122, 52.73.204.196, 174.129.118.171, 54.175.8.110, and 54.90.79.112 are all on Amazon and using this normal-looking user agent:</li>
</ul>
<pre tabindex="0"><code>Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/44.0.2403.125 Safari/537.3
</code></pre><ul>
<li>None of these hosts are re-using their DSpace session ID so they are definitely not normal browsers as they are claiming:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ grep 54.82.125.225 dspace.log.2022-04-27 | grep -oE <span style="color:#e6db74">&#39;session_id=[A-Z0-9]{32}:ip_addr=&#39;</span> | sort | uniq | wc -l
</span></span><span style="display:flex;"><span>5760
</span></span><span style="display:flex;"><span>$ grep 3.239.81.247 dspace.log.2022-04-27 | grep -oE <span style="color:#e6db74">&#39;session_id=[A-Z0-9]{32}:ip_addr=&#39;</span> | sort | uniq | wc -l
</span></span><span style="display:flex;"><span>6053
</span></span><span style="display:flex;"><span>$ grep 18.232.101.158 dspace.log.2022-04-27 | grep -oE <span style="color:#e6db74">&#39;session_id=[A-Z0-9]{32}:ip_addr=&#39;</span> | sort | uniq | wc -l
</span></span><span style="display:flex;"><span>5841
</span></span><span style="display:flex;"><span>$ grep 3.238.116.153 dspace.log.2022-04-27 | grep -oE <span style="color:#e6db74">&#39;session_id=[A-Z0-9]{32}:ip_addr=&#39;</span> | sort | uniq | wc -l
</span></span><span style="display:flex;"><span>5887
</span></span><span style="display:flex;"><span>$ grep 44.200.190.111 dspace.log.2022-04-27 | grep -oE <span style="color:#e6db74">&#39;session_id=[A-Z0-9]{32}:ip_addr=&#39;</span> | sort | uniq | wc -l
</span></span><span style="display:flex;"><span>5899
</span></span><span style="display:flex;"><span>...
</span></span></code></pre></div><ul>
<li>And we can see a massive spike in sessions in Munin:</li>
</ul>
<p><img src="/cgspace-notes/2022/04/jmx_dspace_sessions-day2.png" alt="DSpace sessions day"></p>
<ul>
<li>I see the following IPs using that user agent today:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span># grep <span style="color:#e6db74">&#39;Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/44.0.2403.125 Safari/537.36&#39;</span> /var/log/nginx/access.log | awk <span style="color:#e6db74">&#39;{print $1}&#39;</span> | sort | uniq -c | sort -h
</span></span><span style="display:flex;"><span> 678 54.90.79.112
</span></span><span style="display:flex;"><span> 1797 54.175.8.110
</span></span><span style="display:flex;"><span> 2697 174.129.118.171
</span></span><span style="display:flex;"><span> 2765 52.73.204.196
</span></span><span style="display:flex;"><span> 3072 54.174.240.122
</span></span><span style="display:flex;"><span> 5206 35.172.193.232
</span></span><span style="display:flex;"><span> 5646 35.153.131.101
</span></span><span style="display:flex;"><span> 6783 3.85.92.145
</span></span><span style="display:flex;"><span> 7763 34.227.10.4
</span></span><span style="display:flex;"><span> 8738 100.24.63.172
</span></span><span style="display:flex;"><span> 8748 34.236.36.176
</span></span><span style="display:flex;"><span> 8787 3.238.116.153
</span></span><span style="display:flex;"><span> 8794 18.232.101.158
</span></span><span style="display:flex;"><span> 8806 44.200.190.111
</span></span><span style="display:flex;"><span> 9021 54.82.125.225
</span></span><span style="display:flex;"><span> 9027 3.239.81.247
</span></span></code></pre></div><ul>
<li>I added those IPs to the firewall and then purged their hits from Solr:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ ./ilri/check-spider-ip-hits.sh -f /tmp/ips.txt -p
</span></span><span style="display:flex;"><span>Purging 6024 hits from 100.24.63.172 in statistics
</span></span><span style="display:flex;"><span>Purging 1719 hits from 174.129.118.171 in statistics
</span></span><span style="display:flex;"><span>Purging 5972 hits from 18.232.101.158 in statistics
</span></span><span style="display:flex;"><span>Purging 6053 hits from 3.238.116.153 in statistics
</span></span><span style="display:flex;"><span>Purging 6228 hits from 3.239.81.247 in statistics
</span></span><span style="display:flex;"><span>Purging 5305 hits from 34.227.10.4 in statistics
</span></span><span style="display:flex;"><span>Purging 6002 hits from 34.236.36.176 in statistics
</span></span><span style="display:flex;"><span>Purging 3908 hits from 35.153.131.101 in statistics
</span></span><span style="display:flex;"><span>Purging 3692 hits from 35.172.193.232 in statistics
</span></span><span style="display:flex;"><span>Purging 4525 hits from 3.85.92.145 in statistics
</span></span><span style="display:flex;"><span>Purging 6048 hits from 44.200.190.111 in statistics
</span></span><span style="display:flex;"><span>Purging 1942 hits from 52.73.204.196 in statistics
</span></span><span style="display:flex;"><span>Purging 1944 hits from 54.174.240.122 in statistics
</span></span><span style="display:flex;"><span>Purging 1264 hits from 54.175.8.110 in statistics
</span></span><span style="display:flex;"><span>Purging 6117 hits from 54.82.125.225 in statistics
</span></span><span style="display:flex;"><span>Purging 486 hits from 54.90.79.112 in statistics
</span></span><span style="display:flex;"><span><span style="color:#960050;background-color:#1e0010">
</span></span></span><span style="display:flex;"><span><span style="color:#960050;background-color:#1e0010"></span>Total number of bot hits purged: 67229
</span></span></code></pre></div><ul>
<li>Then I created a CSV with these IPs and reported them to AbuseIPDB.com:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ cat /tmp/ips.csv
</span></span><span style="display:flex;"><span>IP,Categories,ReportDate,Comment
</span></span><span style="display:flex;"><span>100.24.63.172,4,2022-04-27T04:00:37-10:00,&#34;Excessive automated HTTP requests&#34;
</span></span><span style="display:flex;"><span>174.129.118.171,4,2022-04-27T04:00:37-10:00,&#34;Excessive automated HTTP requests&#34;
</span></span><span style="display:flex;"><span>18.232.101.158,4,2022-04-27T04:00:37-10:00,&#34;Excessive automated HTTP requests&#34;
</span></span><span style="display:flex;"><span>3.238.116.153,4,2022-04-27T04:00:37-10:00,&#34;Excessive automated HTTP requests&#34;
</span></span><span style="display:flex;"><span>3.239.81.247,4,2022-04-27T04:00:37-10:00,&#34;Excessive automated HTTP requests&#34;
</span></span><span style="display:flex;"><span>34.227.10.4,4,2022-04-27T04:00:37-10:00,&#34;Excessive automated HTTP requests&#34;
</span></span><span style="display:flex;"><span>34.236.36.176,4,2022-04-27T04:00:37-10:00,&#34;Excessive automated HTTP requests&#34;
</span></span><span style="display:flex;"><span>35.153.131.101,4,2022-04-27T04:00:37-10:00,&#34;Excessive automated HTTP requests&#34;
</span></span><span style="display:flex;"><span>35.172.193.232,4,2022-04-27T04:00:37-10:00,&#34;Excessive automated HTTP requests&#34;
</span></span><span style="display:flex;"><span>3.85.92.145,4,2022-04-27T04:00:37-10:00,&#34;Excessive automated HTTP requests&#34;
</span></span><span style="display:flex;"><span>44.200.190.111,4,2022-04-27T04:00:37-10:00,&#34;Excessive automated HTTP requests&#34;
</span></span><span style="display:flex;"><span>52.73.204.196,4,2022-04-27T04:00:37-10:00,&#34;Excessive automated HTTP requests&#34;
</span></span><span style="display:flex;"><span>54.174.240.122,4,2022-04-27T04:00:37-10:00,&#34;Excessive automated HTTP requests&#34;
</span></span><span style="display:flex;"><span>54.175.8.110,4,2022-04-27T04:00:37-10:00,&#34;Excessive automated HTTP requests&#34;
</span></span><span style="display:flex;"><span>54.82.125.225,4,2022-04-27T04:00:37-10:00,&#34;Excessive automated HTTP requests&#34;
</span></span><span style="display:flex;"><span>54.90.79.112,4,2022-04-27T04:00:37-10:00,&#34;Excessive automated HTTP requests&#34;
</span></span></code></pre></div><ul>
<li>An hour or so later two more IPs on Amazon started making requests with that user agent too:
<ul>
<li>3.82.22.114</li>
<li>18.234.122.84</li>
</ul>
</li>
<li>Load on the server went back up, sigh</li>
<li>I added those IPs to the firewall drop list and purged their hits from Solr as well:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ ./ilri/check-spider-ip-hits.sh -f /tmp/ips.txt -p
</span></span><span style="display:flex;"><span>Purging 2839 hits from 3.82.22.114 in statistics
</span></span><span style="display:flex;"><span>Purging 592 hits from 18.234.122.84 in statistics
</span></span><span style="display:flex;"><span><span style="color:#960050;background-color:#1e0010">
</span></span></span><span style="display:flex;"><span><span style="color:#960050;background-color:#1e0010"></span>Total number of bot hits purged: 3431
</span></span></code></pre></div><ul>
<li>Oh god, there are more coming
<ul>
<li>3.81.21.251</li>
<li>54.162.92.93</li>
<li>54.226.171.89</li>
</ul>
</li>
</ul>
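<p>Presumably these will get the same treatment as the others: add them to the firewall drop set and purge their hits from Solr. A sketch, reusing the illustrative set name from above and a hypothetical <code>/tmp/more-ips.txt</code> containing the three addresses:</p>
<pre tabindex="0"><code># while read -r ip; do nft add element inet filter ipv4-blocklist &#34;{ $ip }&#34;; done &lt; /tmp/more-ips.txt
$ ./ilri/check-spider-ip-hits.sh -f /tmp/more-ips.txt -p
</code></pre>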
<h2 id="2022-04-28">2022-04-28</h2>
<ul>
<li>Had a meeting with FAO and the team from SEAFDEC, who run many repositories that are integrated with AGROVOC
<ul>
<li>Elvi from SEAFDEC has modified the <a href="https://github.com/eulereadgbe/DSpace/blob/sair-6.3/dspace-api/src/main/java/org/dspace/content/authority/AgrovocAuthority.java">DSpace-CRIS 6.x VIAF lookup plugin to query AGROVOC</a></li>
<li>Also, they are doing a nice integration similar to the WorldFish / MELSpace repositories where they store the AGROVOC URIs in DSpace and show the terms with an icon in the UI</li>
<li>See: <a href="https://repository.seafdec.org.ph/handle/10862/6320">https://repository.seafdec.org.ph/handle/10862/6320</a></li>
</ul>
</li>
</ul>
<!-- raw HTML omitted -->
</article>
</div> <!-- /.blog-main -->
<aside class="col-sm-3 ml-auto blog-sidebar">
<section class="sidebar-module">
<h4>Recent Posts</h4>
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-09/">September, 2022</a></li>
<li><a href="/cgspace-notes/2022-08/">August, 2022</a></li>
<li><a href="/cgspace-notes/2022-07/">July, 2022</a></li>
<li><a href="/cgspace-notes/2022-06/">June, 2022</a></li>
<li><a href="/cgspace-notes/2022-05/">May, 2022</a></li>
</ol>
</section>
<section class="sidebar-module">
<h4>Links</h4>
<ol class="list-unstyled">
<li><a href="https://cgspace.cgiar.org">CGSpace</a></li>
<li><a href="https://dspacetest.cgiar.org">DSpace Test</a></li>
<li><a href="https://github.com/ilri/DSpace">CGSpace @ GitHub</a></li>
</ol>
</section>
</aside>
</div> <!-- /.row -->
</div> <!-- /.container -->
<footer class="blog-footer">
<p dir="auto">
Blog template created by <a href="https://twitter.com/mdo">@mdo</a>, ported to Hugo by <a href='https://twitter.com/mralanorth'>@mralanorth</a>.
</p>
<p>
<a href="#">Back to top</a>
</p>
</footer>
</body>
</html>