I have an application that generates reports by building views as DXL and importing them. The views are not complex: a few columns and one line selection formula.
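For context, the DXL for such a view is roughly the fragment below. This is only an illustration: the view name, selection formula, and item names are invented, and the exact attributes vary by Domino release, though the elements follow the standard DXL schema.

```xml
<view name='TempReport' xmlns='http://www.lotus.com/dxl'>
  <code event='selection'>
    <formula>SELECT Form = "Invoice" &amp; Status = "Open"</formula>
  </code>
  <column itemname='CustomerName'>
    <columnheader title='Customer'/>
  </column>
  <column itemname='Amount'>
    <columnheader title='Amount'/>
  </column>
</view>
```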
After a few days of use, the import consistently starts to fail with a LotusScript error indicating an insufficient-memory condition.
If my wife needs to park at a meter and there aren’t enough quarters in the glove box, that doesn’t prove that she used them up and failed to replenish them. Maybe it was I who failed to add quarters to the pool, but my wife is the one who experiences the “insufficient quarters” condition.
In the same way, the fact that the DXL importer is reporting the error doesn’t show that the DXL importer caused it, because it is not the only one driving that car. It may simply be the biggest requester of memory, so it notices the shortage first (or other processes may also be hitting memory errors and aborting, but their error reporting isn’t as good, so you don’t know about it).
To narrow down the problem, you could write an agent that does nothing but repeatedly import the same DXL, and run it locally. See whether that alone runs out of memory.
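Such a stress-test agent is only a few lines. A minimal sketch, assuming Domino 6 or later (where the NotesDXLImporter class is available); the file path and iteration count are placeholders:

```lotusscript
Sub Initialize
    Dim session As New NotesSession
    Dim db As NotesDatabase
    Dim stream As NotesStream
    Dim importer As NotesDXLImporter
    Dim i As Integer

    Set db = session.CurrentDatabase
    For i = 1 To 1000
        ' Re-open the same DXL file and import it on every iteration
        Set stream = session.CreateStream
        If Not stream.Open("c:\test\view.dxl") Then Exit Sub
        Set importer = session.CreateDXLImporter(stream, db)
        importer.DesignImportOption = DXLIMPORTOPTION_CREATE
        Call importer.Process
        Call stream.Close
        ' Watch memory while this runs; a leak in the import path
        ' should show up long before the loop finishes
        Print "Import iteration: " & i
    Next
End Sub
```

If memory climbs steadily across iterations even though each imported view is identical, that points at the import path; if it stays flat, the leak is probably elsewhere.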
Incidentally, there are other ways of generating reports besides creating views. Tables in a rich text field are easy enough to create (see the ReportGenerator class on openntf.org) and don’t slow down your whole application the way a lot of views will.
Yes, I know, but this one looks very much like the DXL import is causing the problem. While investigating this, I noticed I had given wrong info: in one report case the selection formula is very large, and I believe that is causing the problem. I’ll post here when I have more info.
Thanks for the tip. I’ll have to take a look at the ReportGenerator at some point. These reports are directed to Excel and the view is deleted after use.
So you’re using a view solely as a searching and sorting mechanism for data that you write to a spreadsheet. I would say eliminate the middleman; search for the documents and sort the data yourself, and write it directly to the spreadsheet. Creating the view is unnecessary.
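As a sketch of that approach — the formula, item names, and output path are invented for illustration, and this writes a CSV that Excel opens rather than driving Excel over OLE:

```lotusscript
Sub Initialize
    ' Search for the documents directly instead of building a temp view,
    ' then stream the values straight to a file Excel can open
    Dim session As New NotesSession
    Dim db As NotesDatabase
    Dim col As NotesDocumentCollection
    Dim doc As NotesDocument
    Dim out As NotesStream

    Set db = session.CurrentDatabase
    ' Same formula you would have put in the view's selection formula
    Set col = db.Search({Form = "Invoice" & Status = "Open"}, Nothing, 0)

    Set out = session.CreateStream
    Call out.Open("c:\reports\report.csv")
    Call out.WriteText("Customer,Amount", EOL_CRLF)
    Set doc = col.GetFirstDocument
    While Not doc Is Nothing
        Call out.WriteText(doc.CustomerName(0) & "," & doc.Amount(0), EOL_CRLF)
        Set doc = col.GetNextDocument(doc)
    Wend
    Call out.Close
End Sub
```

Note that db.Search returns an unsorted collection, so sorting (and any categorization) would have to be done in script before writing the rows out.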
It’s not a middleman; it’s the report itself, which opens in Excel (the content type of the $$ViewTemplate form is set to application/vnd.ms-excel). This should be the most efficient way of reporting data without using a full-text index (unless of course IBM’s view-indexing code is very bad :).
And then there is also one big thing: categorization. I hate coding (sometimes multi-level) categorization in LotusScript; the view does it for me easily. Maybe the ReportGenerator class you suggested will do it too. I’ll take a look at it the next time I need to code reporting.
But back to the problem itself: I haven’t been able to reproduce the error in the dev environment (at least not yet). So you might be right; maybe it’s something else that eats up the memory. But everything else works when I start getting this error, so it seems the DXL importer uses a lot of memory. If it’s something else, it will be very difficult to find…
Sometimes we get the error even with lower values (all the LSI memory values are lower than when reporting works).
Also, sometimes these values go up and sometimes they go down, even when reporting is used intensively. So it looks like these values are not very useful as an indicator.
Just to say I get this error from time to time as well. I’m sure (although I can’t prove it) that my scheduled agents that create documents with DXL have a memory leak: eventually I get these errors in the log, and at some point I have to reboot the server or it will stop completely due to lack of memory.
This never happened before I started using DXL.
The 7.0.3 fix list mentions DXL memory-leak fixes, and things have been slightly better since we upgraded to 7.0.3, so I would suggest upgrading. It’s not completely fixed, though.
I’ll freely admit though that this is just my hunch and not based on any true ‘fact’.
By the way, have you noticed the spell checker in this forum doesn’t recognise DXL…