apycot #1381388 Add a URL checker crawler test [open]

It would be useful to have a crawler that checks websites for errors (404, 500, 501, etc.). The idea would be to get a report like the ones produced by graphical tools such as gurlchecker.
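As a rough illustration of what such a crawler could report, here is a minimal Python sketch (standard library only) that walks a site and records the HTTP status of every page it reaches. The start URL, page limit and report shape are assumptions made for the example, not part of this ticket.

    # Minimal sketch of a same-site crawler collecting HTTP status codes.
    from html.parser import HTMLParser
    from urllib.error import HTTPError, URLError
    from urllib.parse import urljoin, urlparse
    from urllib.request import urlopen


    class LinkExtractor(HTMLParser):
        """Collect href targets from <a> tags."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == 'a':
                for name, value in attrs:
                    if name == 'href' and value:
                        self.links.append(value)


    def crawl(start_url, max_pages=100):
        """Return {url: status} for pages reachable from start_url (same host only)."""
        host = urlparse(start_url).netloc
        seen, todo, report = set(), [start_url], {}
        while todo and len(seen) < max_pages:
            url = todo.pop()
            if url in seen:
                continue
            seen.add(url)
            try:
                with urlopen(url, timeout=10) as resp:
                    report[url] = resp.status
                    if 'text/html' in resp.headers.get('Content-Type', ''):
                        parser = LinkExtractor()
                        parser.feed(resp.read().decode('utf-8', errors='replace'))
                        for link in parser.links:
                            target = urljoin(url, link)
                            if urlparse(target).netloc == host:
                                todo.append(target)
            except HTTPError as exc:        # 404, 500, 501, ...
                report[url] = exc.code
            except URLError as exc:         # DNS failure, connection refused, ...
                report[url] = str(exc.reason)
        return report


    if __name__ == '__main__':
        # Hypothetical start URL, just for the example.
        for url, status in sorted(crawl('http://localhost:8080/').items()):
            print(status, url)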

In the first place this could be used to check "public" websites. After that, we could use it to test temporarily created websites (check out the cube, create an instance, launch cw, crawl the website).

A quick look at the command-line tools available in Debian/Ubuntu gave me: linkchecker (looks better) or linklint (not convinced). A sketch of how the former could be driven from a check follows below.
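On the integration side, a hedged sketch of how a check could drive linkchecker, assuming only its basic "linkchecker <url>" invocation and the convention (to be verified) that it exits with a non-zero status when broken links are found:

    # Sketch: run linkchecker against a URL and return its verdict.
    import subprocess

    def run_linkchecker(url):
        """Run 'linkchecker <url>'; return (exit_code, textual_report)."""
        proc = subprocess.run(['linkchecker', url],
                              capture_output=True, text=True)
        # Assumption: non-zero exit code means broken links were reported.
        return proc.returncode, proc.stdout

The textual report would then have to be parsed (or at least attached to the test result) to get something comparable to a gurlchecker report.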

priority: normal
type: enhancement
done in: <not specified>
closed by: <not specified>