algorithmic sabotage

I'm thinking it might be good to maintain an allowed page full of junk alongside disallowed, poisoned areas in robots.txt. Crawlers that respect robots.txt would pick up content that doesn't matter, while crawlers that ignore the file would be fed scraper-hostile content.
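
Roughly something like this minimal sketch, assuming a bare-bones Python http.server setup; the /junk/ and /poison/ paths and the stand-in pages are just placeholders, not anything I'm actually running:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# robots.txt that points well-behaved crawlers at the harmless junk
# and declares the poisoned area off-limits.
ROBOTS_TXT = """\
User-agent: *
Allow: /junk/
Disallow: /poison/
"""

# Low-value but harmless filler for crawlers that respect robots.txt.
JUNK_PAGE = "<html><body><p>Lorem ipsum dolor sit amet.</p></body></html>"

# Scraper-hostile filler for crawlers that ignore robots.txt,
# e.g. generated nonsense or a maze of self-referencing links.
POISON_PAGE = "<html><body><p>Colorless green ideas sleep furiously.</p></body></html>"


class BaitHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/robots.txt":
            self._send(ROBOTS_TXT, "text/plain")
        elif self.path.startswith("/poison/"):
            # Only reachable by ignoring the Disallow rule above.
            self._send(POISON_PAGE, "text/html")
        else:
            # Everything else, including /junk/, is the allowed filler.
            self._send(JUNK_PAGE, "text/html")

    def _send(self, body: str, content_type: str):
        data = body.encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", content_type)
        self.send_header("Content-Length", str(len(data)))
        self.end_headers()
        self.wfile.write(data)


if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8000), BaitHandler).serve_forever()
```

In practice the poisoned pages could be served by whatever generator you prefer; the point is just that the two kinds of content are split along the Allow/Disallow line so a crawler's behaviour decides what it gets.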