I would like to exclude the HTTPS pages from being indexed, while the HTTP pages remain allowed. According to the search engine guidelines, you should create a separate robots.txt for each protocol.
How can this be done for a Domino site, since both HTTPS and HTTP share the same HTML root?
I solved this by adding a form named robots.txt to my website database, with a computed text whose value depends on the HTTPS CGI variable. I then added a Web Site Rule mapping /robots.txt to this form. It works fine, but if someone knows a more standard solution I would like to hear it.
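For anyone trying the same approach, the computed text could use a formula along these lines. This is only a sketch: it assumes the form contains a field named HTTPS that Domino populates from the CGI variable of the same name (which is "ON" for SSL requests), and the exact Disallow rules are placeholders.

```
REM {Sketch: assumes a field named HTTPS on the form,
filled from the HTTPS CGI variable ("ON" when served over SSL)};
@If(HTTPS = "ON";
	"User-agent: *" + @NewLine + "Disallow: /";
	"User-agent: *" + @NewLine + "Disallow:")
```

Depending on your design, you may also need to make sure the form is returned as plain text rather than HTML (for example via the form's content type property), so crawlers can parse it as a normal robots.txt.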