A search engine shouldn’t have to pay a website for the honor of bringing it visits and ad views.
Fuck reddit, get delisted, no problem.
Weird that Google is ignoring their robots.txt, though.
Even if they’re paying Reddit for the privilege of saying that glue is perfect on pizza, having

User-agent: *
Disallow: /

should block Googlebot too. That would mean Google programmed an exception into Googlebot to ignore robots.txt on that domain, and that shouldn’t be done. What’s the purpose of the file then?
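A minimal sketch with Python’s stdlib robots.txt parser (the URL is just a placeholder, not an actual fetch) confirms the point: with only a catch-all group in the file, the rules apply to Googlebot just like any other crawler.

```python
from urllib.robotparser import RobotFileParser

# The catch-all-only robots.txt discussed above.
robots_txt = """\
User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# No Googlebot-specific group exists, so the * group applies to it too.
print(rp.can_fetch("Googlebot", "https://www.reddit.com/r/pizza"))  # False
```

So a well-behaved Googlebot honoring this file would stop crawling; continuing anyway means the rules are being ignored, not misread.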
Because robots.txt is entirely honor-based (a crawler doesn’t even need to pretend to be another bot, it can simply ignore the file), it should instead be:

User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /
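That version can be checked the same way with the stdlib parser (placeholder URL again): the Googlebot-specific group, whose empty Disallow permits everything, takes effect for Googlebot, while everyone else falls through to the catch-all block.

```python
from urllib.robotparser import RobotFileParser

# The proposed robots.txt: an explicit carve-out for Googlebot,
# a blanket block for everyone else.
robots_txt = """\
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Googlebot matches its own group; an empty Disallow allows everything.
print(rp.can_fetch("Googlebot", "https://www.reddit.com/r/pizza"))     # True
# Any other crawler only matches the * group and is blocked.
print(rp.can_fetch("SomeOtherBot", "https://www.reddit.com/r/pizza"))  # False
```

Same outcome as the exclusivity deal, but stated openly in the file instead of via a hidden exception in the crawler.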