<!DOCTYPE html>
<html lang="en">

<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">

<meta property="og:title" content="November, 2017" />
<meta property="og:description" content="2017-11-01

The CORE developers responded to say they are looking into their bot not respecting our robots.txt

2017-11-02

Today there have been no hits by CORE and no alerts from Linode (coincidence?)

# grep -c "CORE" /var/log/nginx/access.log
0

Generate list of authors on CGSpace for Peter to go through and correct:

dspace=# \copy (select distinct text_value, count(*) as count from metadatavalue where metadata_field_id = (select metadata_field_id from metadatafieldregistry where element = 'contributor' and qualifier = 'author') AND resource_type_id = 2 group by text_value order by count desc) to /tmp/authors.csv with csv;
COPY 54701

" />
<meta property="og:type" content="article" />
<meta property="og:url" content="https://alanorth.github.io/cgspace-notes/2017-11/" />

<meta property="article:published_time" content="2017-11-02T09:37:54+02:00"/>
<meta property="article:modified_time" content="2017-11-05T15:53:35+02:00"/>

<meta name="twitter:card" content="summary"/><meta name="twitter:title" content="November, 2017"/>
<meta name="twitter:description" content="2017-11-01

The CORE developers responded to say they are looking into their bot not respecting our robots.txt

2017-11-02

Today there have been no hits by CORE and no alerts from Linode (coincidence?)

# grep -c "CORE" /var/log/nginx/access.log
0

Generate list of authors on CGSpace for Peter to go through and correct:

dspace=# \copy (select distinct text_value, count(*) as count from metadatavalue where metadata_field_id = (select metadata_field_id from metadatafieldregistry where element = 'contributor' and qualifier = 'author') AND resource_type_id = 2 group by text_value order by count desc) to /tmp/authors.csv with csv;
COPY 54701

"/>
<meta name="generator" content="Hugo 0.30.2" />
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "BlogPosting",
  "headline": "November, 2017",
  "url": "https://alanorth.github.io/cgspace-notes/2017-11/",
  "wordCount": "1445",
  "datePublished": "2017-11-02T09:37:54+02:00",
  "dateModified": "2017-11-05T15:53:35+02:00",
  "author": {
    "@type": "Person",
    "name": "Alan Orth"
  },
  "keywords": "Notes"
}
</script>

<link rel="canonical" href="https://alanorth.github.io/cgspace-notes/2017-11/">

<title>November, 2017 | CGSpace Notes</title>

<!-- combined, minified CSS -->
<link href="https://alanorth.github.io/cgspace-notes/css/style.css" rel="stylesheet" integrity="sha384-O8wjsnz02XiyrPxnhfF6AVOv6YLBaEGRCnVF+DL3gCPBy9cieyHcpixIrVyD2JS5" crossorigin="anonymous">

</head>
<body>

<div class="blog-masthead">
<div class="container">
<nav class="nav blog-nav">
<a class="nav-link " href="https://alanorth.github.io/cgspace-notes/">Home</a>
</nav>
</div>
</div>

<header class="blog-header">
<div class="container">
<h1 class="blog-title"><a href="https://alanorth.github.io/cgspace-notes/" rel="home">CGSpace Notes</a></h1>
<p class="lead blog-description">Documenting day-to-day work on the <a href="https://cgspace.cgiar.org">CGSpace</a> repository.</p>
</div>
</header>

<div class="container">
<div class="row">
<div class="col-sm-8 blog-main">

<article class="blog-post">
<header>
<h2 class="blog-post-title"><a href="https://alanorth.github.io/cgspace-notes/2017-11/">November, 2017</a></h2>
<p class="blog-post-meta"><time datetime="2017-11-02T09:37:54+02:00">Thu Nov 02, 2017</time> by Alan Orth in
<i class="fa fa-tag" aria-hidden="true"></i> <a href="/cgspace-notes/tags/notes" rel="tag">Notes</a>
</p>
</header>
<h2 id="2017-11-01">2017-11-01</h2>

<ul>
<li>The CORE developers responded to say they are looking into their bot not respecting our robots.txt</li>
</ul>

<h2 id="2017-11-02">2017-11-02</h2>

<ul>
<li>Today there have been no hits by CORE and no alerts from Linode (coincidence?)</li>
</ul>

<pre><code># grep -c "CORE" /var/log/nginx/access.log
0
</code></pre>

<ul>
<li>Generate list of authors on CGSpace for Peter to go through and correct:</li>
</ul>

<pre><code>dspace=# \copy (select distinct text_value, count(*) as count from metadatavalue where metadata_field_id = (select metadata_field_id from metadatafieldregistry where element = 'contributor' and qualifier = 'author') AND resource_type_id = 2 group by text_value order by count desc) to /tmp/authors.csv with csv;
COPY 54701
</code></pre>
<ul>
<li>Abenet asked if it would be possible to generate a report of items in Listings and Reports that had “International Fund for Agricultural Development” as the <em>only</em> investor</li>
<li>I opened a ticket with Atmire to ask if this was possible: <a href="https://tracker.atmire.com/tickets-cgiar-ilri/view-ticket?id=540">https://tracker.atmire.com/tickets-cgiar-ilri/view-ticket?id=540</a></li>
<li>Work on making the thumbnails in the item view clickable</li>
<li>Basically, once you read the METS XML for an item it becomes easy to trace the structure to find the bitstream link</li>
</ul>

<pre><code>//mets:fileSec/mets:fileGrp[@USE='CONTENT']/mets:file/mets:FLocat[@LOCTYPE='URL']/@xlink:href
</code></pre>

<ul>
<li>METS XML is available for all items with this pattern: /metadata/handle/10568/95947/mets.xml</li>
<li>I whipped up a quick hack to print a clickable link with this URL on the thumbnail, but it needs to handle a few corner cases, like when there is a thumbnail but no content bitstream!</li>
<li>Help proof fifty-three CIAT records for Sisay: <a href="https://dspacetest.cgiar.org/handle/10568/95895">https://dspacetest.cgiar.org/handle/10568/95895</a></li>
<li>A handful of issues with <code>cg.place</code> using formats like “Lima, PE” instead of “Lima, Peru”</li>
<li>Also, some dates with completely invalid formats like “2010- 06” and “2011-3-28”</li>
<li>I also collapsed consecutive whitespace in a handful of fields</li>
</ul>
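<ul>
<li>As a sketch of the idea (assuming <code>xmllint</code> is available; since <code>xmllint --xpath</code> doesn’t register namespace prefixes, <code>local-name()</code> stands in for the <code>mets:</code> and <code>xlink:</code> prefixes), the content bitstream URL can be pulled out of an item’s METS XML on the command line:</li>
</ul>

<pre><code>$ curl -s 'https://dspacetest.cgiar.org/metadata/handle/10568/95947/mets.xml' \
    | xmllint --xpath "string(//*[local-name()='fileGrp'][@USE='CONTENT']//*[local-name()='FLocat'][@LOCTYPE='URL']/@*[local-name()='href'])" -
</code></pre>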
<h2 id="2017-11-03">2017-11-03</h2>

<ul>
<li>Atmire got back to us to say that they estimate it will take two days of labor to implement the change to Listings and Reports</li>
<li>I said I’d ask Abenet if she wants that feature</li>
</ul>

<h2 id="2017-11-04">2017-11-04</h2>

<ul>
<li>I finished looking through Sisay’s CIAT records for the “Alianzas de Aprendizaje” data</li>
<li>I corrected about half of the authors to standardize them</li>
<li>Linode emailed this morning to say that the CPU usage was high again, this time at 6:14 AM</li>
<li>It’s the first time in a few days that this has happened</li>
<li>I had a look to see what was going on, but it isn’t the CORE bot:</li>
</ul>
<pre><code># awk '{print $1}' /var/log/nginx/access.log | sort -n | uniq -c | sort -h | tail
    306 68.180.229.31
    323 61.148.244.116
    414 66.249.66.91
    507 40.77.167.16
    618 157.55.39.161
    652 207.46.13.103
    666 157.55.39.254
   1173 104.196.152.243
   1737 66.249.66.90
  23101 138.201.52.218
</code></pre>

<ul>
<li>138.201.52.218 is from some Hetzner server, and I see it making 40,000 requests yesterday too, but none before that:</li>
</ul>

<pre><code># zgrep -c 138.201.52.218 /var/log/nginx/access.log*
/var/log/nginx/access.log:24403
/var/log/nginx/access.log.1:45958
/var/log/nginx/access.log.2.gz:0
/var/log/nginx/access.log.3.gz:0
/var/log/nginx/access.log.4.gz:0
/var/log/nginx/access.log.5.gz:0
/var/log/nginx/access.log.6.gz:0
</code></pre>

<ul>
<li>It’s clearly a bot as it’s making tens of thousands of requests, but it’s using a “normal” user agent:</li>
</ul>

<pre><code>Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2227.0 Safari/537.36
</code></pre>

<ul>
<li>For now I don’t know what this user is!</li>
</ul>
<h2 id="2017-11-05">2017-11-05</h2>

<ul>
<li>Peter asked if I could fix the appearance of “International Livestock Research Institute” in the author lookup during item submission</li>
<li>It looks to be just an issue with the user interface expecting authors to have both a first and last name:</li>
</ul>

<p><img src="/cgspace-notes/2017/11/author-lookup.png" alt="Author lookup" />
<img src="/cgspace-notes/2017/11/add-author.png" alt="Add author" /></p>

<ul>
<li>But in the database the authors are correct (none with weird <code>, /</code> characters):</li>
</ul>

<pre><code>dspace=# select distinct text_value, authority, confidence from metadatavalue value where resource_type_id=2 and metadata_field_id=3 and text_value like 'International Livestock Research Institute%';
                 text_value                 |              authority               | confidence
--------------------------------------------+--------------------------------------+------------
 International Livestock Research Institute | 8f3865dc-d056-4aec-90b7-77f49ab4735c |          0
 International Livestock Research Institute | f4db1627-47cd-4699-b394-bab7eba6dadc |          0
 International Livestock Research Institute |                                      |         -1
 International Livestock Research Institute | 8f3865dc-d056-4aec-90b7-77f49ab4735c |        600
 International Livestock Research Institute | f4db1627-47cd-4699-b394-bab7eba6dadc |         -1
 International Livestock Research Institute |                                      |        600
 International Livestock Research Institute | 8f3865dc-d056-4aec-90b7-77f49ab4735c |         -1
 International Livestock Research Institute | 8f3865dc-d056-4aec-90b7-77f49ab4735c |        500
(8 rows)
</code></pre>

<ul>
<li>So I’m not sure if this is just a graphical glitch or if editors have to edit this metadata field prior to approval</li>
<li>Looking at monitoring Tomcat’s JVM heap with Prometheus, it looks like we need to use JMX + <a href="https://github.com/prometheus/jmx_exporter">jmx_exporter</a></li>
<li>This guide shows how to <a href="https://geekflare.com/enable-jmx-tomcat-to-monitor-administer/">enable JMX in Tomcat</a> by modifying <code>CATALINA_OPTS</code></li>
<li>I was able to successfully connect to my local Tomcat with jconsole!</li>
</ul>
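<ul>
<li>For reference, a minimal sketch of the JMX options that go into <code>CATALINA_OPTS</code> (the port is an arbitrary choice here, and with authentication and SSL disabled this should only be exposed on localhost or behind a firewall):</li>
</ul>

<pre><code>CATALINA_OPTS="$CATALINA_OPTS -Dcom.sun.management.jmxremote \
  -Dcom.sun.management.jmxremote.port=9010 \
  -Dcom.sun.management.jmxremote.rmi.port=9010 \
  -Dcom.sun.management.jmxremote.ssl=false \
  -Dcom.sun.management.jmxremote.authenticate=false"
</code></pre>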
<h2 id="2017-11-07">2017-11-07</h2>

<ul>
<li>CGSpace went down and up a few times this morning, first around 3 AM, then again around 7 AM</li>
<li>Tsega had to restart Tomcat 7 to fix it temporarily</li>
<li>I will start by looking at bot usage (access.log.1 includes usage until 6 AM today):</li>
</ul>

<pre><code># cat /var/log/nginx/access.log.1 | awk '{print $1}' | sort -n | uniq -c | sort -h | tail
    619 65.49.68.184
    840 65.49.68.199
    924 66.249.66.91
   1131 68.180.229.254
   1583 66.249.66.90
   1953 207.46.13.103
   1999 207.46.13.80
   2021 157.55.39.161
   2034 207.46.13.36
   4681 104.196.152.243
</code></pre>

<ul>
<li>104.196.152.243 seems to be a top scraper for a few weeks now:</li>
</ul>

<pre><code># zgrep -c 104.196.152.243 /var/log/nginx/access.log*
/var/log/nginx/access.log:336
/var/log/nginx/access.log.1:4681
/var/log/nginx/access.log.2.gz:3531
/var/log/nginx/access.log.3.gz:3532
/var/log/nginx/access.log.4.gz:5786
/var/log/nginx/access.log.5.gz:8542
/var/log/nginx/access.log.6.gz:6988
/var/log/nginx/access.log.7.gz:7517
/var/log/nginx/access.log.8.gz:7211
/var/log/nginx/access.log.9.gz:2763
</code></pre>

<ul>
<li>This user is responsible for hundreds and sometimes thousands of Tomcat sessions:</li>
</ul>

<pre><code>$ grep 104.196.152.243 dspace.log.2017-11-07 | grep -o -E 'session_id=[A-Z0-9]{32}' | sort -n | uniq | wc -l
954
$ grep 104.196.152.243 dspace.log.2017-11-03 | grep -o -E 'session_id=[A-Z0-9]{32}' | sort -n | uniq | wc -l
6199
$ grep 104.196.152.243 dspace.log.2017-11-01 | grep -o -E 'session_id=[A-Z0-9]{32}' | sort -n | uniq | wc -l
7051
</code></pre>

<ul>
<li>The worst thing is that this user never specifies a user agent string, so we can’t lump it in with the other bots using the Tomcat Crawler Session Manager Valve</li>
<li>They don’t request dynamic URLs like “/discover” but they seem to be fetching handles from XMLUI instead of REST (and some with <code>//handle</code>, note the regex below):</li>
</ul>
<pre><code># grep -c 104.196.152.243 /var/log/nginx/access.log.1
4681
# grep 104.196.152.243 /var/log/nginx/access.log.1 | grep -c -P 'GET //?handle'
4618
</code></pre>
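<ul>
<li>For reference, the Crawler Session Manager Valve is enabled in Tomcat’s <code>server.xml</code> with something like the following (the regex shown is Tomcat’s default <code>crawlerUserAgents</code>), which is exactly why it can’t catch a client that sends a “normal” user agent:</li>
</ul>

<pre><code>&lt;Valve className="org.apache.catalina.valves.CrawlerSessionManagerValve"
       crawlerUserAgents=".*[bB]ot.*|.*Yahoo! Slurp.*|.*Feedfetcher-Google.*" /&gt;
</code></pre>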
<ul>
<li>I just realized that <code>ciat.cgiar.org</code> points to 104.196.152.243, so I should contact Leroy from CIAT to see if we can change their scraping behavior</li>
<li>The next IP (207.46.13.36) seems to be Microsoft’s bingbot, but all its requests specify the “bingbot” user agent and there are no requests for dynamic URLs that are forbidden, like “/discover”:</li>
</ul>

<pre><code>$ grep -c 207.46.13.36 /var/log/nginx/access.log.1
2034
# grep 207.46.13.36 /var/log/nginx/access.log.1 | grep -c "GET /discover"
0
</code></pre>

<ul>
<li>The next IP (157.55.39.161) also seems to be bingbot, and none of its requests are for URLs forbidden by robots.txt either:</li>
</ul>

<pre><code># grep 157.55.39.161 /var/log/nginx/access.log.1 | grep -c "GET /discover"
0
</code></pre>

<ul>
<li>The next few seem to be bingbot as well, and they declare a proper user agent and do not request dynamic URLs like “/discover”:</li>
</ul>
<pre><code># grep -c -E '207.46.13.[0-9]{2,3}' /var/log/nginx/access.log.1
5997
# grep -E '207.46.13.[0-9]{2,3}' /var/log/nginx/access.log.1 | grep -c "bingbot"
5988
# grep -E '207.46.13.[0-9]{2,3}' /var/log/nginx/access.log.1 | grep -c "GET /discover"
0
</code></pre>

<ul>
<li>The next few seem to be Googlebot, and they declare a proper user agent and do not request dynamic URLs like “/discover”:</li>
</ul>

<pre><code># grep -c -E '66.249.66.[0-9]{2,3}' /var/log/nginx/access.log.1
3048
# grep -E '66.249.66.[0-9]{2,3}' /var/log/nginx/access.log.1 | grep -c Google
3048
# grep -E '66.249.66.[0-9]{2,3}' /var/log/nginx/access.log.1 | grep -c "GET /discover"
0
</code></pre>

<ul>
<li>The next seems to be Yahoo, which declares a proper user agent and does not request dynamic URLs like “/discover”:</li>
</ul>

<pre><code># grep -c 68.180.229.254 /var/log/nginx/access.log.1
1131
# grep 68.180.229.254 /var/log/nginx/access.log.1 | grep -c "GET /discover"
0
</code></pre>

<ul>
<li>The last of the top ten IPs seems to be some bot with a weird user agent, and it is not behaving too well:</li>
</ul>

<pre><code># grep -c -E '65.49.68.[0-9]{3}' /var/log/nginx/access.log.1
2950
# grep -E '65.49.68.[0-9]{3}' /var/log/nginx/access.log.1 | grep -c "GET /discover"
330
</code></pre>

<ul>
<li>Their user agents vary, e.g.:
<ul>
<li><code>Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/61.0.3163.100 Safari/537.36</code></li>
<li><code>Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.11 (KHTML, like Gecko) Chrome/23.0.1271.97 Safari/537.11</code></li>
<li><code>Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.1; WOW64; Trident/7.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; .NET4.0C; .NET4.0E)</code></li>
</ul></li>
<li>I’ll just keep an eye on that one for now, as it only made a few hundred requests to dynamic discovery URLs</li>
<li>While it’s not in the top ten, Baidu is one bot that seems to not give a fuck:</li>
</ul>
<pre><code># grep -c Baiduspider /var/log/nginx/access.log.1
8068
# grep Baiduspider /var/log/nginx/access.log.1 | grep -c -E "GET /(browse|discover)"
1431
</code></pre>

<ul>
<li>According to their documentation their bot <a href="http://www.baidu.com/search/robots_english.html">respects <code>robots.txt</code></a>, but I don’t see this being the case</li>
<li>I think I will end up blocking Baidu as well…</li>
<li>Next is for me to look and see what was happening specifically at 3 AM and 7 AM when the server crashed</li>
<li>I should look in the nginx access.log, rest.log, oai.log, and DSpace’s dspace.log.2017-11-07</li>
<li>Here are the top IPs between 2 and 9 AM:</li>
</ul>

<pre><code># cat /var/log/nginx/access.log /var/log/nginx/access.log.1 | grep -E '07/Nov/2017:0[2-8]' | awk '{print $1}' | sort -n | uniq -c | sort -h | tail
    279 66.249.66.91
    373 65.49.68.199
    446 68.180.229.254
    470 104.196.152.243
    470 197.210.168.174
    598 207.46.13.103
    603 157.55.39.161
    637 207.46.13.80
    703 207.46.13.36
    724 66.249.66.90
</code></pre>

<ul>
<li>Of those, most are Google, Bing, Yahoo, etc, except 63.143.42.244 and 63.143.42.242, which are Uptime Robot</li>
</ul>
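<ul>
<li>If I do end up blocking Baidu as mentioned above, a minimal nginx sketch would be a user agent check in the <code>server</code> block (Baiduspider is the agent string seen in the logs above):</li>
</ul>

<pre><code>if ($http_user_agent ~* "baiduspider") {
    return 403;
}
</code></pre>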
</article>

</div> <!-- /.blog-main -->

<aside class="col-sm-3 ml-auto blog-sidebar">

<section class="sidebar-module">
<h4>Recent Posts</h4>
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2017-11/">November, 2017</a></li>
<li><a href="/cgspace-notes/2017-10/">October, 2017</a></li>
<li><a href="/cgspace-notes/cgiar-library-migration/">CGIAR Library Migration</a></li>
<li><a href="/cgspace-notes/2017-09/">September, 2017</a></li>
<li><a href="/cgspace-notes/2017-08/">August, 2017</a></li>
</ol>
</section>

<section class="sidebar-module">
<h4>Links</h4>
<ol class="list-unstyled">
<li><a href="https://cgspace.cgiar.org">CGSpace</a></li>
<li><a href="https://dspacetest.cgiar.org">DSpace Test</a></li>
<li><a href="https://github.com/ilri/DSpace">CGSpace @ GitHub</a></li>
</ol>
</section>

</aside>

</div> <!-- /.row -->
</div> <!-- /.container -->

<footer class="blog-footer">
<p>
Blog template created by <a href="https://twitter.com/mdo">@mdo</a>, ported to Hugo by <a href='https://twitter.com/mralanorth'>@mralanorth</a>.
</p>
<p>
<a href="#">Back to top</a>
</p>
</footer>

</body>

</html>