Net Apps base their stats on unique visitors per site per day. ("We 'count' unique visitors to our network sites, and only count one unique visit to each network site per day.") We [StatCounter] base our stats on page views
So if an OS X user reads more pages than a Linux user on a given site, that user will carry more "weight" in the StatCounter analysis.
Basically, different methods that measure different things.
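To make the distinction concrete, here is a minimal sketch (the log data and visitor IDs are made up for illustration) of how the two counting methods can rank the same traffic differently:

```python
from collections import Counter

# (visitor_id, os, pages_viewed_today) on one site, one day -- invented numbers
log = [
    ("u1", "OS X", 10),   # one heavy reader
    ("u2", "Linux", 1),
    ("u3", "Linux", 1),
]

# NetApplications-style: one unique visitor per site per day
unique_share = Counter(os for _, os, _ in log)

# StatCounter-style: every page view counts
pageview_share = Counter()
for _, os, pages in log:
    pageview_share[os] += pages

print(unique_share)    # Linux leads 2 to 1 by unique visitors
print(pageview_share)  # OS X leads 10 to 2 by page views
```

Same traffic, opposite winners, depending only on what you count.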
Can someone explain why the Net Applications numbers are so different from the StatCounter numbers? I would expect some variation between counting "users" versus "page views", but the difference here is stark.
I (W3Counter, 70k sites) measure unique users, and it's always tracked much closer to StatCounter's numbers than NetApps. Same for every other site that tracks this stuff. NetApps has always been an outlier, especially on their IE numbers.
The difference for NetApps is how they track unique users, particularly for countries such as China.
My understanding is that they basically look at the total number of unique visitors from each country that they get, then weight that number by the number of reported internet users in that country. So if NetApps sees 10 users from country A and 5 users from country B, but country B has twice as many 'connected' people as country A, then they both get equal share.
What this basically leads to is that China, which has a very large population of people who use the internet infrequently but are still counted as connected (and may be using shared computers), gets inflated numbers based on total population. This is also why IE8's numbers are so large: it's still in wide use in China.
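A minimal sketch of the country-level weighting described above. The exact NetApplications formula isn't public, so the country names, visitor counts, and internet-user figures here are all invented for illustration:

```python
# Unique visitors seen in the sample, by country -- hypothetical numbers
observed = {
    "A": {"IE": 2, "Chrome": 8},
    "B": {"IE": 4, "Chrome": 1},
}
# Reported internet users per country (e.g. CIA figures) -- also hypothetical
internet_users = {"A": 100, "B": 200}

total_users = sum(internet_users.values())
weighted = {}
for country, browsers in observed.items():
    sample_size = sum(browsers.values())
    country_weight = internet_users[country] / total_users
    for browser, count in browsers.items():
        # each country's sample is scaled to its share of global internet users
        weighted[browser] = weighted.get(browser, 0) + (count / sample_size) * country_weight

print(weighted)
```

In this toy example Chrome wins the raw count (9 visitors to IE's 6), but because country B's sample is scaled up to its larger connected population, IE ends up with the majority (0.6) after weighting.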
The NetApps numbers probably make sense if you ask "what are the browser shares of all the individual users across the planet who could possibly load my website". But when looking at browser usage by number of page loads or predicted visitors, the numbers could very well be much closer to StatCounter's.
Net Applications apparently have a sample of 40,000 websites, whereas StatCounter have a sample of 3,000,000 [1]. StatCounter shows radically different numbers (e.g. Chrome by far the most used, whereas NA shows IE majority). IMO, StatCounter's numbers are more likely to be accurate.
Some stats count unique visitor totals of each Browser.
Other stats count page hit totals of each Browser.
Then there are trackers that don't include every version of a particular browser.
And there is no telling who's getting a worldwide sample, who's excluding some countries, whose samples over-represent a country because their traffic-collection base is there, etc.
These numbers are very different from those shown by StatCounter [1] and Wikipedia [2]. If you look at Wikipedia's comparisons, they compare data from many sources. NetApplications (the dataset used by this article) is by far the one source that differs greatly from the rest of the data.
This likely has to do with the types of audiences that visit the sites they monitor, so the data should be taken with a grain of salt.
StatCounter is a popular source of platform-usage statistics, but I strongly suspect it is heavily biased. I've never come across a website that uses StatCounter (though I haven't investigated aggressively), and I've never found an installation guide in Japanese. Perhaps its user base is biased toward smaller websites in certain countries?
The first would still require a large number of people to update their browser in a short period.
Seems they are pulling data from caniuse.com which uses StatCounter Global. StatCounter updates their data 4 times a day and doesn't seem to have changed recently. http://gs.statcounter.com/faq#methodology.
They weight results against the number of internet users in different countries according to the CIA; most others don't weight at all. The major source of difference is China; just select China on StatCounter and see.
Of course, they're also the only source that tries to count "unique visitors" rather than traffic, and uses country-level weighting to attempt to correct for sampling bias.
But personally, I still find StatCounter more useful.
The StatCounter measure seems more accurate to me too, if only because their numbers are more in line with Wikimedia's. And I mean, who doesn't use Wikipedia? I imagine few people are in a better position to measure browser usage than they are, possibly not even Google (though perhaps via their ad network?).