Our customer wants us to check a list of URLs (more than 100 links) every month and see whether any of them are broken. Most of the URLs link to our Notes pages in different Notes databases.
Is there a way to check all those URLs automatically? I would like to put the URLs in a text file.
You can read from a URL in a Java agent using the java.net.URL and java.io.InputStream classes. For example, you could loop through your list of URLs and attempt to read from each one; if the read fails, log that URL as broken. I don't know of a way to do this in LotusScript alone.
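As a rough sketch of that approach in plain Java (the urls.txt file name on the command line, the HEAD request, the timeouts, and the 200–399 "healthy" range are all my assumptions, not anything specified above), this reads one URL per line and reports any that fail:

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;

public class LinkChecker {

    // Returns the HTTP status code, or -1 if the URL is malformed or unreachable.
    static int checkUrl(String spec) {
        try {
            URL url = new URL(spec);
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("HEAD");   // cheaper than GET for a link check
            conn.setConnectTimeout(10000);
            conn.setReadTimeout(10000);
            return conn.getResponseCode();
        } catch (IOException e) {            // MalformedURLException is an IOException
            return -1;
        }
    }

    public static void main(String[] args) throws IOException {
        if (args.length == 0) {
            System.out.println("Usage: java LinkChecker urls.txt");
            return;
        }
        // One URL per line; blank lines are skipped.
        try (BufferedReader in = new BufferedReader(new FileReader(args[0]))) {
            String line;
            while ((line = in.readLine()) != null) {
                line = line.trim();
                if (line.isEmpty()) continue;
                int code = checkUrl(line);
                if (code < 200 || code >= 400) {
                    System.out.println("BROKEN (" + code + "): " + line);
                }
            }
        }
    }
}
```

You could run this from a monthly scheduled agent or a simple batch job and mail the output to whoever maintains the links.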
You can easily use LotusScript to traverse URLs within your applications by automating Internet Explorer through COM. Here's a starting point:
' Assumes ie holds an Internet Explorer COM object, e.g.:
'   Set ie = CreateObject("InternetExplorer.Application")
' path is the server's URL prefix and sdoc is a NotesDocument
url = path & sdoc.UniversalID & "?OpenDocument"

' Wait for any previous navigation to finish
Do While ie.Busy
    DoEvents
Loop

ie.Navigate url

' Wait for the new page to load
Do While ie.Busy
    DoEvents
Loop

' Poll until the full page, including the closing </body> tag, is rendered
body = ie.Document.body.outerHTML
While InStr(LCase$(body), "</body>") = 0
    body = ie.Document.body.outerHTML
Wend
In addition, you can check the image links to see whether they're valid. Just parse the outerHTML you collected, pick out the <img…> tags to get the src URLs, then try to fetch each image file and handle any errors accordingly.
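If you are doing the checking from a Java agent instead, that parsing step can be sketched with a regular expression (the pattern below only handles quoted src attributes, an assumption that covers typical generated HTML):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class ImgExtractor {

    // Pulls the src attribute out of each <img ...> tag in an HTML string.
    static List<String> imageSources(String html) {
        List<String> srcs = new ArrayList<>();
        Pattern p = Pattern.compile(
            "<img[^>]*\\ssrc\\s*=\\s*[\"']([^\"']+)[\"']",
            Pattern.CASE_INSENSITIVE);
        Matcher m = p.matcher(html);
        while (m.find()) {
            srcs.add(m.group(1));
        }
        return srcs;
    }
}
```

Each returned src can then be resolved against the page URL and checked the same way as the document links.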
To actually fetch each image file from LotusScript on Windows, you can declare the URLDownloadToFile API:

Declare Function URLDownloadToFile Lib "urlmon" Alias "URLDownloadToFileA" ( _
    Byval pCaller As Long, Byval szURL As String, Byval szFileName As String, _
    Byval dwReserved As Long, Byval lpfnCB As Long) As Long