Quote:
Originally Posted by 3x2
Whatever BT decide, they will still have no way of obtaining content provider consent.
The delays are more likely down to BT figuring out how to make the whole scheme harder to detect for both client and server. If it does turn out to be difficult to detect then I, as a content provider, will simply ban BT address ranges.
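He's right that an address-range ban is technically trivial once the ranges are known. A rough sketch of what a content provider could drop into an Apache 2.2 config, with documentation-only placeholder ranges standing in for BT's actual allocations:

    # Refuse requests from Webwise/Phorm source addresses.
    # 192.0.2.0/24 and 198.51.100.0/24 are placeholders only -
    # substitute the real BT ranges once they are identified.
    <Directory "/var/www/html">
        Order Allow,Deny
        Allow from all
        Deny from 192.0.2.0/24
        Deny from 198.51.100.0/24
    </Directory>

The catch, as 3x2 says, is that this only works if Webwise traffic is identifiable in the first place.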
This is what worries me too. The one thing I have heard nothing from BT about is how webmasters could selectively block Webwise access with a robots.txt user-agent string. They have completely ignored that question, and insisted that the only thing they will respect is a total robots.txt ban on all robots. It's not good enough, legally or morally. Even on a purely PR basis, their stance can be made to look really grubby if we focus on how poorly they compare with the search engines.
If they can read robots.txt to see a general ban on robots, then they can read it to find a Webwise user-agent ban. But they won't declare a user-agent string. It makes Kent Ertugrul's rants about the evils of Google sound really pathetic.
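To make the comparison concrete: if BT declared a user-agent token, a selective ban would be a one-minute job for any webmaster, using exactly the mechanism the search engines already honour. Something like this, where "Webwise" is a hypothetical token, since BT have declared nothing:

    # Hypothetical token - BT have published no user-agent for Webwise.
    User-agent: Webwise
    Disallow: /

    # Search engines declare themselves, so we can be selective:
    User-agent: Googlebot
    Disallow: /private/

    # Everyone else is unaffected.
    User-agent: *
    Disallow:

Google gives webmasters that choice; BT refuses to.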
I think this is worth focussing on intensively - we've got some movement on the issue of opt-in.
We've got them thinking about cookies.
We are getting nowhere on the issue of webmaster informed consent. So I'd like to see some focussed attention on the contrast between the lack of choice the Webwise model offers webmasters and the selective control webmasters already have over the search engines, Google, Yahoo et al.
Kent's words about the evils of Google at the Town Hall have given us our starting point. Let's accept that as a battle ground and start applying the pressure.