There has been a lot of talk on Threadwatch about whether Google Analytics should be considered evil, and most people said it should be. Here is some factual proof that it is!
If you have dug into Analytics beyond just adding the short snippet of code to your Web pages, you may know that you can use the urchinTracker() JavaScript function to create “virtual pages” that don’t actually exist. This is handy when you are tracking multiple “steps” of a process that all share the same URI.
As an example, if your shopping cart requires 4 steps — from sign-up to payment — and the entire process happens on the same physical page via a series of POSTs (i.e. cart.php), you can dynamically output “urchinTracker(‘step1.html’)” through “urchinTracker(‘step4.html’)” rather than simply “urchinTracker()”. Most log file analyzers will see 4 requests to cart.php, whereas Analytics will record hits for step1.html through step4.html.
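To make that concrete, here is a minimal sketch of what the dynamically output tracking call could look like. urchinTracker() is the real Google Analytics (Urchin) function from the standard snippet; the helper names (virtualPageForStep, trackCartStep) are my own hypothetical illustrations, not anything Google ships.

```javascript
// Hypothetical helper: build the "virtual page" name for a given
// cart step. Step 1 becomes "step1.html", step 4 "step4.html", etc.
function virtualPageForStep(step) {
  return 'step' + step + '.html';
}

// Hypothetical wrapper: on each POST back to cart.php, report the
// virtual page to Analytics instead of letting it log yet another
// identical hit on cart.php itself.
function trackCartStep(step) {
  var page = virtualPageForStep(step);
  // Guard so the sketch doesn't blow up if the Analytics snippet
  // (which defines urchinTracker) hasn't loaded on the page.
  if (typeof urchinTracker === 'function') {
    urchinTracker(page);
  }
  return page;
}
```

Server-side you would simply emit trackCartStep(1) through trackCartStep(4) into the page depending on which step of the checkout the visitor just completed.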
This by itself doesn’t prove anything, but when you consider that my AWStats logs show GoogleBot has added a few extra pages to its crawl list, it becomes clear. GoogleBot is now crawling step1.html, step2.html, step3.html and step4.html even though they do not exist! The only way Google could know about these pages is if they use data gathered from my urchinTracker(“step#.html”) code!
Lovely… I wasn’t very eager to use it in the first place; now I know for certain I won’t, as long as I’m in my right mind. For smallish sites with moderate traffic, Statcounter is more than enough. Meanwhile, people are coming out with ways to block Google Analytics cookies.