- I removed Tsega's access to Linode dashboard as well
- I ended up creating a Jira issue for my `db.jndi` documentation fix: [DS-3803](https://jira.duraspace.org/browse/DS-3803)
- The DSpace developers said they wanted each pull request to be associated with a Jira issue

## 2018-01-17

- Abenet asked me to proof and upload 54 records for LIVES
- A few records were missing countries (even though they're all from Ethiopia)
- Also, there are whitespace issues in many columns, and the items are mapped to the LIVES and ILRI articles collections instead of Theses (a quick way to spot the stray whitespace is sketched after the import command below)
- In any case, I imported them like this:

```
$ export JAVA_OPTS="-Dfile.encoding=UTF-8 -Xmx512m -XX:+TieredCompilation -XX:TieredStopAtLevel=1"
$ dspace import -a -e aorth@mjanja.ch -s /tmp/2018-01-16\ LIVES/SimpleArchiveFormat -m lives.map &> lives.log
```
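
- As an aside, a quick way to spot stray leading or trailing whitespace in the SAF metadata before importing (a rough sketch, assuming the standard `dublin_core.xml` layout with one value per line):

```
$ grep -nE '<dcvalue[^>]*>[[:space:]]|[[:space:]]</dcvalue>' /tmp/2018-01-16\ LIVES/SimpleArchiveFormat/*/dublin_core.xml
```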

- And, fantastic: before I started the import there were 10 PostgreSQL connections, and then CGSpace crashed during the upload
- When I looked, there were 210 PostgreSQL connections! (a query for counting them is sketched below)
- I don't see any high load in XMLUI or REST/OAI:

```
# cat /var/log/nginx/access.log /var/log/nginx/access.log.1 /var/log/nginx/library-access.log /var/log/nginx/library-access.log.1 | grep -E "17/Jan/2018" | awk '{print $1}' | sort -n | uniq -c | sort -h | tail
    381 40.77.167.124
    403 213.55.99.121
    431 207.46.13.60
    445 157.55.39.113
    445 157.55.39.231
    449 95.108.181.88
    453 68.180.229.254
    593 54.91.48.104
    757 104.196.152.243
    776 66.249.66.90
# cat /var/log/nginx/rest.log /var/log/nginx/rest.log.1 /var/log/nginx/oai.log /var/log/nginx/oai.log.1 | grep -E "17/Jan/2018" | awk '{print $1}' | sort -n | uniq -c | sort -h | tail
     11 205.201.132.14
     11 40.77.167.124
     15 35.226.23.240
     16 157.55.39.231
     16 66.249.64.155
     18 66.249.66.90
     22 95.108.181.88
     58 104.196.152.243
   4106 70.32.83.92
   9229 45.5.184.196
```
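
- For reference, this is roughly how to see where the PostgreSQL connections are coming from (just a sketch, assuming a local account that is allowed to query `pg_stat_activity`):

```
$ psql -c 'SELECT application_name, count(*) FROM pg_stat_activity GROUP BY application_name;'
```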

- But I do see this strange message in the dspace log:

```
2018-01-17 07:59:25,856 INFO org.apache.http.impl.client.SystemDefaultHttpClient @ I/O exception (org.apache.http.NoHttpResponseException) caught when processing request to {}->http://localhost:8081: The target server failed to respond
2018-01-17 07:59:25,856 INFO org.apache.http.impl.client.SystemDefaultHttpClient @ Retrying request to {}->http://localhost:8081
```
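
- For next time, a quick sanity check that Solr on port 8081 is actually responding would be something like this (just a sketch, assuming DSpace's standard `statistics` core):

```
$ curl -s 'http://localhost:8081/solr/statistics/select?q=*:*&rows=0&wt=json'
```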

- I have NEVER seen this error before, and there is no error before or after that in DSpace's solr.log
- Tomcat's catalina.out does show something interesting, though, right at that time:

```
[====================> ]40% time remaining: 7 hour(s) 14 minute(s) 45 seconds. timestamp: 2018-01-17 07:57:02
[====================> ]40% time remaining: 7 hour(s) 14 minute(s) 45 seconds. timestamp: 2018-01-17 07:57:11
[====================> ]40% time remaining: 7 hour(s) 14 minute(s) 44 seconds. timestamp: 2018-01-17 07:57:37
[====================> ]40% time remaining: 7 hour(s) 16 minute(s) 5 seconds. timestamp: 2018-01-17 07:57:49
Exception in thread "http-bio-127.0.0.1-8081-exec-627" java.lang.OutOfMemoryError: Java heap space
        at org.apache.lucene.util.FixedBitSet.clone(FixedBitSet.java:576)
        at org.apache.solr.search.BitDocSet.andNot(BitDocSet.java:222)
        at org.apache.solr.search.SolrIndexSearcher.getProcessedFilter(SolrIndexSearcher.java:1067)
        at org.apache.solr.search.SolrIndexSearcher.getDocListNC(SolrIndexSearcher.java:1557)
        at org.apache.solr.search.SolrIndexSearcher.getDocListC(SolrIndexSearcher.java:1433)
        at org.apache.solr.search.SolrIndexSearcher.search(SolrIndexSearcher.java:514)
        at org.apache.solr.handler.component.QueryComponent.process(QueryComponent.java:485)
        at org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:218)
        at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:135)
        at org.apache.solr.core.SolrCore.execute(SolrCore.java:1967)
        at org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:777)
        at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:418)
        at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:207)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
        at org.dspace.solr.filters.LocalHostRestrictionFilter.doFilter(LocalHostRestrictionFilter.java:50)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
        at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:221)
        at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
        at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:505)
        at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:169)
        at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
        at org.apache.catalina.valves.CrawlerSessionManagerValve.invoke(CrawlerSessionManagerValve.java:180)
        at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:956)
        at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
        at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:436)
        at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1078)
        at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:625)
        at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:318)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
```

- You can see the timestamp above; I think it's from some Atmire nightly task, but I can't figure out which one
- So I restarted Tomcat and tried the import again, which finished very quickly and without errors!

```
$ dspace import -a -e aorth@mjanja.ch -s /tmp/2018-01-16\ LIVES/SimpleArchiveFormat -m lives2.map &> lives2.log
```
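
- As a quick sanity check that all 54 records made it in: the import's map file has one line per imported item, so something like this should show 54 (just a sketch):

```
$ wc -l lives2.map
```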

- Looking at the JVM graphs from Munin, it does look like the heap ran out of memory (see the blue dip just before the green spike when I restarted Tomcat):

![Tomcat JVM usage day](/cgspace-notes/2018/01/jvm_memory-day.png)