Help wanted!

Tuesday, November 16, 2021 at 8:26 AM UTC

Today's post is a bit unusual: instead of telling you something, I am asking for your help.

I have an app that collects data from another database (documents from various views) and transforms it into JSON. Basically, I go through a view and, for each document in it, add a JSON object to a JSON array, which is finally stored as a string inside a MIME entity of a separate document. The goal is to "preload" that data in JSON format so it can be accessed much faster later than if everything were fetched and computed in real time. The code is executed periodically by an agent that calls the XPage via its URL, so no user context other than the one that signed the app is involved.
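
To make that a bit more concrete, here is a stripped-down sketch of what one crawl roughly looks like. The view name "SourceView" and the toJson() helper are placeholders, not my actual code, and all error handling is left out:

// simplified sketch of one crawl - "SourceView" and toJson() are placeholders
import org.openntf.domino.Database;
import org.openntf.domino.Document;
import org.openntf.domino.View;

public class PreloadBuilder {

    public String buildPreloadJson(Database sourceDb) {
        StringBuilder json = new StringBuilder("[");
        View view = sourceDb.getView("SourceView"); // placeholder view name
        Document doc = view.getFirstDocument();
        boolean first = true;
        while (doc != null) {
            if (!first) {
                json.append(",");
            }
            json.append(toJson(doc)); // the real code computes a much richer JSON object per document
            first = false;
            doc = view.getNextDocument(doc);
        }
        return json.append("]").toString(); // later stored as a string in a MIME entity
    }

    private String toJson(Document doc) {
        // placeholder serializer - just enough to make the sketch complete
        return "{\"unid\":\"" + doc.getUniversalID() + "\"}";
    }
}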

I am using the OpenNTF Domino API here and the code is called from an XPage using a custom REST service control.

As the views I crawl contain about 4,000 documents (each of them quite complex and containing MIME as well), the resulting JSON can end up being 20-25 MB in size.

Sometimes the script stops and responds with an internal server error (500) instead of saving my preload document, continuing with the next view, and finally ending up with a 200 status.

I already tried to reduce the overall size and the amount of data per document by skipping all MIME items, but this didn't help.
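
Just so it's clear what I mean by skipping: while serializing a document, I simply leave out every MIME item, roughly like this (the helper name is made up):

// rough sketch of the MIME filter - the helper name is made up
import org.openntf.domino.Item;

private boolean includeItem(Item item) {
    // skip MIME parts so their (potentially large) content
    // never ends up in the generated JSON
    return item.getType() != Item.MIME_PART;
}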

When the error occurs and the server stops executing the code, I get this in the error log:

java.lang.IllegalStateException: Internal ClassLoader mismatch

The complete error message(s) that were recorded right before the script stopped can be found here: https://docs.google.com/document/d/14KqX5dPa0m0DGLPQKkPZyxNWzjdHDYisV5mha-vvb0c/edit?usp=sharing

Finally, my code tries to create and save the document that should hold the complete JSON in a MIME entity; it looks like this:

// document ("buffered") created in my database
// some fields are set before
// and the doc is saved initially
// then:
Stream str = XSPUtil.getCurrentSession().createStream();
InputStream ist = new ByteArrayInputStream(content.getBytes("UTF-8"));
// - sometimes it crashes here
str.setContents(ist); // <- crash here, according to the logs
MIMEEntity mime = buffered.createMIMEEntity(Constants.PWA_PRELOAD_CONTENT_FIELD);
mime.setContentFromBytes(str, "application/json;charset=UTF-8", MIMEEntity.ENC_NONE);
// - sometimes it crashes here, too
buffered.save();
Utils.recycle(buffered);
// end

So there is nothing special at first glance. As you can see, the lines where it crashes differ from time to time. The logs I attached show that the crash happened in the line where the stream gets its content.

Again: this doesn't happen every time, and I cannot tell under which exact circumstances it happens.
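
For completeness, here is the same write with explicit cleanup of the stream and the MIME entities (same variables and helpers as above); this is just a sketch of a variation, not a verified fix:

// same variables as above - explicit cleanup, just an idea, not a verified fix
Stream str = XSPUtil.getCurrentSession().createStream();
InputStream ist = new ByteArrayInputStream(content.getBytes("UTF-8"));
try {
    str.setContents(ist);
    MIMEEntity mime = buffered.createMIMEEntity(Constants.PWA_PRELOAD_CONTENT_FIELD);
    mime.setContentFromBytes(str, "application/json;charset=UTF-8", MIMEEntity.ENC_NONE);
    // commit the MIME changes before saving
    buffered.closeMIMEEntities(true, Constants.PWA_PRELOAD_CONTENT_FIELD);
    buffered.save();
} finally {
    str.close();
    Utils.recycle(buffered);
}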

If you have any idea what the problem could be, please let me know in the comments or, even better, drop me a note via email (obusse at googlemail dot com).

Any input is much appreciated!

Latest comments to this post

Oliver Busse wrote on 17.11.2021, 14:58

Chris, the existing data does not change very much, but new data is created, so the resulting content grows constantly. There is filtering involved, though, to omit data that is "old" enough.

Chris Toohey wrote on 17.11.2021, 00:55

How often does the data in the remote database change?
