Tabular Import more than 65535 documents

I have a Notes database with a view into which I import a large number of documents from a tab-delimited text file. This text file has a header row followed by 96,000 rows of data.

I have a *.col file that defines the import, and it includes the following at the top:

HEADERLINE 1 FOOTER LINE 0 LINESPERPAGE 0

When I perform the standard Notes import, it works its way through the text file, apparently importing correctly.

However, after various checks I found that two rows from the text file were NOT imported.

These happened to be rows 65536 and 65537 of the import text file.

If I extract these two rows into a separate file (1 header + 2 data rows), they import normally, so the content itself is not the issue.

I did a similar import on a similar database, and again rows 65536 and 65537 were not imported.

So, allowing for the header row, my interpretation is that the Notes importer uses a 16-bit integer with a 65535 limit. When the counter hits that value it wraps around, losing the row it was working on and then skipping the one after it (perhaps treating it as a header row).
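Purely to illustrate the hypothesis (this is not the importer's actual code; the counter and its wrap behaviour are assumptions), an unsigned 16-bit row counter would behave like this:

```python
# Illustration only: an unsigned 16-bit counter wraps past 65535
# back to 0, which would be consistent with rows 65536/65537
# going missing around that boundary.
def next_row(counter: int) -> int:
    """Advance a hypothetical 16-bit row counter, wrapping at 65535."""
    return (counter + 1) & 0xFFFF

counter = 65534
for _ in range(4):
    counter = next_row(counter)
    print(counter)  # prints 65535, 0, 1, 2
```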

I’m wondering whether anyone else has seen something like this, and whether it was solved other than by splitting the original import file.
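For reference, the splitting workaround I mean could be sketched like this (a minimal sketch, not a tested tool; the function name, chunk size, and output naming are my own choices), copying the header row into each chunk so every part stays well under the 65535 limit:

```python
# Hypothetical workaround: split a tab-delimited import file into
# chunks of at most `rows_per_chunk` data rows, repeating the header
# row at the top of each chunk so each part imports standalone.
def split_import_file(path: str, rows_per_chunk: int = 60000) -> int:
    """Write path.part1.txt, path.part2.txt, ...; return chunk count."""
    with open(path, "r", encoding="utf-8") as src:
        header = src.readline()
        chunk_no = 0
        out = None
        for i, line in enumerate(src):
            if i % rows_per_chunk == 0:
                if out is not None:
                    out.close()
                chunk_no += 1
                out = open(f"{path}.part{chunk_no}.txt", "w",
                           encoding="utf-8")
                out.write(header)  # repeat header in every chunk
            out.write(line)
        if out is not None:
            out.close()
    return chunk_no
```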

Many Thanks

Subject: Tabular Import more than 65535 documents

I haven’t had this issue, but I did find an import product from AGE Computer via this site: the Domino Import Utility, at http://www.agecom.com.au/ I tested it last week and was able to import data into two databases with ease, one a new database and one updating existing documents. Awesome product!