That's less than one API hit per 17 English page views - based on these numbers, the overwhelming majority of the time users do not use this feature at all.
It would be more interesting to know the adoption of this among website developers, and some use cases where it benefits them (as I don't see any). As it stands, it is quite useless. If the feature doesn't do anything, the numbers are just a consequence of the buzz around it, nothing more.
This whole thing reeks of taking the easy way out and dumping the problem on the user. Why can't you analyze usage patterns in a controlled environment to identify the typical number of page loads?
You end up with a statistical answer like "20 hits to our home page equal 6.3 users, statistically speaking."
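That kind of estimate is just raw hits divided by an empirically measured average of hits per unique user. A minimal sketch, assuming a hypothetical average of 3.17 home-page hits per user (the figure implied by 20 hits ≈ 6.3 users):

```python
# Rough hits-to-users estimate. avg_hits_per_user is an ASSUMED
# constant you'd have to measure in a controlled environment first.
def estimate_users(hits: int, avg_hits_per_user: float = 3.17) -> float:
    return hits / avg_hits_per_user

print(round(estimate_users(20), 1))  # -> 6.3
```

The whole exercise is only as good as that measured constant, which is exactly why it stays a statistical answer rather than an exact one.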
Hi, true indeed @zzo38computer. I see that there are
192 total page visits as of this writing,
186 page hits on the home route,
5 page hits on the about route, and
0 page hits on the subscribe route (thus, 0 subscribers :'))
I don't want to jump to conclusions this fast, but my prediction is that almost everyone has a hard time navigating the JSON view. What do you think?
I think it should be obvious that they measure lots of things and that the data says the page isn't used very much. This is a case of the data not telling you the full story. That happens sometimes. Data needs to be interpreted, and sometimes people get it wrong. There's probably a lesson in there somewhere.
However... this is tech, and the way engineering teams in tech work these days means it's probably something as simple as this: the feature was considered low impact and not "career enhancing", so no one was willing to take on maintaining or promoting it. The only option was to kill it. When someone saw how passionate users are about it, they changed their mind.
Those would indeed be interesting statistics. Maybe I can create my own little "analytics" tool and show stats for every page visited.
I'll provide a follow-up shortly.
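A minimal sketch of such a tool, assuming a simple in-memory counter keyed by path (all names here are hypothetical; a real version would persist the counts and hook into the web framework's request handler):

```python
from collections import Counter

# Hypothetical per-page hit counter for a tiny homegrown analytics tool.
page_hits: Counter = Counter()

def record_visit(path: str) -> None:
    """Count one hit against the given route."""
    page_hits[path] += 1

def stats() -> dict:
    """Return hits and share of total traffic per route."""
    total = sum(page_hits.values())
    return {path: {"hits": n, "share": n / total}
            for path, n in page_hits.items()}

# Example usage with made-up traffic:
record_visit("/")
record_visit("/")
record_visit("/about")
print(stats())
```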
For the other points, nearly no traffic is driven to my site by Google. Nearly 80% of the searches were done by me as testing. It's not necessarily a bad thing that Google isn't driving traffic to my site, because it's not complete and in the future it will turn into something more of a "white paper/rambling" website, but right now there is little-to-no useful information on it.
150 million page views a month works out to 57.9 pageviews a second. So the vast majority of those requests must be JS, CSS, and static image requests, although 98.5% seems a bit off?
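The per-second figure checks out, assuming a 30-day month:

```python
# Sanity check: 150M page views per month as a per-second rate.
page_views_per_month = 150_000_000
seconds_per_month = 30 * 24 * 3600  # 2,592,000 in a 30-day month

print(round(page_views_per_month / seconds_per_month, 1))  # -> 57.9
```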
We were featured on PH and while it does indeed drive a lot of traffic, the quality of it is about the same as from StumbleUpon's paid discovery, i.e. near zero. They come, they see, they bounce. No conversion at all.
It's not bad logic, it's an imperfect analogy which you are taking at face value instead of understanding the point that something can be useful even if infrequently used.
Whether or not it applies in this case is undetermined, but it does underline the fact that using only visit metrics for this particular page does not give the whole story.
Which is congruent with the reaction the announcement got.
That it's horrible as a popularity index and should never be used as such. The number of visits is very low, and the population that visits the site is highly self-selected.
Akamai's numbers are simple % of requests, which means 'hits'. So, a single Chrome user visiting 10 pages that happen to make 100 CDN requests is counted 10x as much as an IE user that visits 2 pages that happen to make 10 CDN requests. In short, the numbers are useless if you're looking for % of visitors.
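To illustrate the distortion with the toy numbers above (a hypothetical two-user scenario, one per browser):

```python
# One Chrome user generating 100 CDN requests vs. one IE user
# generating 10: request-weighted counting inflates Chrome's share.
requests = {"Chrome": 100, "IE": 10}
visitors = {"Chrome": 1, "IE": 1}

def share(counts: dict) -> dict:
    """Each key's fraction of the total count."""
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

print(share(requests))  # request-weighted: Chrome ~90.9%
print(share(visitors))  # visitor-weighted: Chrome 50%
```

Same two people, wildly different percentages - which is why per-request numbers can't answer a per-visitor question.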
NetMarketShare uses visitors, which is preferable if you're looking at percentage of users, which is what we're discussing here.
This response is pure sophistry.