While they can certainly penalize sites that optimize today with respect to the criteria of yesterday, that effectively just means changing the ranking criteria. Nothing stops site owners from optimizing with respect to the new criteria tomorrow. The full implications of the proposition are reminiscent of Russell's paradox, or worthy of inclusion in the next edition of Hofstadter's Gödel, Escher, Bach.
Well no, it's clearly stated that the goal is to even the playing field, not change the rules. If SEOs build too many backlinks (and for big keywords you need them in the range of hundreds of thousands) they will get punished. What are they gonna do, build fewer backlinks? This is of course over-simplified, but it does come across as a sound solution.
Well, that's pretty good news... if it works. There's a lot of garbage on the first page just because the page/site has a lot of backlinks and has the whole site SEO'd by the book (titles, keyword density, headers, etc.).
Too many sites are ranking by stuffing keywords and buying links.
Next, Google needs to find a way to rank a page based on its role in a broader user experience - rather than measuring it in isolation.
Sites (especially in retail) are forced to optimise a page in a specific way to get indexed successfully. The result is everyone putting too much content on a single page - detracting from usability and creating homogeneity for users - like cars all optimised in wind tunnels.
You don't always necessarily get banned permanently, or banned at all. Google might just discount all the links they find to be bought and adjust your ranking according to your new link count minus the paid links. You can always clean up the links yourself and then submit a reconsideration request. Whether that request is accepted, that's up to Google.
True, but sometimes you do get a big slap; BMW is probably the best-known case. And getting back in can be a problem for sites without a high profile, and forever after you're on Google's watch list :-)
Historically, a large component of Google's ranking algorithm involves the number of links pointing to a website (and a number of other qualities like anchor text, unique domains, surrounding text, etc.). If my memory serves me, there are over 300 ranking factors - but links are still the most important.
By buying links (undetected) you can manipulate the quality and relevancy of your website in Google's eyes and cause it to rank better than it should.
To add to this, the reason they can't penalize you is because then you would just buy links for your competitor. All Google can do is drop value on links when they detect malicious behavior. As such, buying links still largely results in a positive effect on your rankings. Google is starting to focus more on other factors than links though, social signals in particular are becoming a big factor.
I think algorithmically they can only discount links from bad neighborhoods, etc... but if they catch you, they will manually penalize you.
I think we are a ways away from social signals being a "big factor." While google does claim that they look at social signals, I think from a practical standpoint the only value they provide is raising a red flag when someone manipulates social signals like buying likes and plus ones. The same way google looks at a link profile of a site, they can catch a bad social profile that seems unnatural.
Chrome being penalized was a bit of a special case. Google got caught (even though it wasn't directly them) and they gave a harsher punishment to themselves because they claim they hold themselves to a higher standard. The whole thing was really damage control to save their reputation, they already had a lot of scandals this year.
In reality, links are almost never seen as a negative. Which is really a shame, because it makes it harder for good-guy SEO's to legitimately build links when you can just buy 10,000 on Fiverr and not worry about it. And most black-hat SEOs buy links from many players, so even if some are discounted they won't usually still come out on top.
Google is getting better at it though. Recently they've been cracking down on blog networks. This is purely conjecture, but I do believe that links are on their way out as the primary signal.
Apologies, missed a typo: "And most black-hat SEOs buy links from many players, so even if some are discounted they WILL usually still come out on top."
You pay for people, link farmers, to link back to your webpage with specific anchor text. The link farmers typically have a network of websites they can distribute this over, offering you a few hundred or thousand links back to your site with your targeted keywords. Sometimes, the link farms even have good pagerank, which makes it even better.
That's the whole point of the Google backlinks - get as many links back to your site as you can, and you'll get good SEO.
And if you do this, and Google detects it, they nail you.
It's very unlikely. This seems to be one of the reasons google has been turning what appears to be a blind eye toward paid links - it's hard not to throw out the baby with the bath water.
Exactly what I was thinking. Probably not any more than it would cost the website owner, however much that is. Depending on how successful your competition is and how much money you have in the bank this could be a viable strategy.
And the implications of that make me shiver a little.
(Disclosure: I have no idea how much it would cost. But my hypothesis is that it would be the same as if the website owner had done it.)
This is a quite realistic concern. There is a story going around from Germany where smaller online shops were the targets of link blackmailing. If they refused to pay certain "fees" they were hit with massive amounts of destructive links, i.e. those with "sex", "viagra" or "porn" keywords. Some shops indeed refused to pay and promptly lost massive amounts of traffic/customers.
Here is the original story, unfortunately only in German:
http://www.golem.de/news/google-ranking-wie-ein-erpresser-ei...
People who have a high-PR website may sell a link (usually in the footer, on all pages) to websites in a similar niche to help them rank better. However, for some people it's not blackhat enough. Spammers nowadays tend to use automated software like the following:
-xrumer - Developed by a few Russian people and primarily used by Russian pharmacies for fraud, this is the most powerful "spammer" tool out there. You can literally spend $1k and get 1 million backlinks within a matter of days and have your site in the #1 position for whatever keyword you're aiming for.
-scrapebox - Similar to xrumer, but while xrumer targets forums (vBulletin etc), scrapebox specializes in harvesting large lists of WordPress blogs (through Google + proxies/botnets) and then posts comments on them (usually all those generic comments you see on WordPress blogs such as "nice post", "good advice, bookmarked" etc).
I could literally go on for hours and hours about how people are exploiting search engines. I think it's high time for Google to develop an AI for this rather than just relying on algorithms. A better alternative to PageRank needs to be implemented.
>benefits
In the most basic terms, for spammers, more links = better search rankings = more traffic = more $$$
Why is that a bad thing? Maybe they could do something more productive with their lives than figuring out ways to corrupt one of the most useful information tools humanity has built...
Indeed. I (an SEO engineer for a large publisher) spend 70% of my time giving feedback to our developers on how to build a site correctly - what's the best way to handle pagination or faceting on a 10M-page site, for example.
The most obvious means to counter this strategy is to watch the rank delta of any given site and dampen it. If a site goes from 12 incoming links to 12,000,000 in a few weeks, penalize proportionally pending human review.
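Just to make that concrete, here's a toy sketch of what I mean - the function name, growth cap and numbers are all made up to illustrate the idea, not anything Google actually does:

    def dampened_link_credit(previous_links, current_links, weeks_elapsed,
                             max_weekly_growth=2.0):
        """Credit at most a plausible organic growth rate; flag the rest for review."""
        baseline = max(previous_links, 1)              # avoid zero for brand-new sites
        allowed = baseline * (max_weekly_growth ** weeks_elapsed)
        credited = min(current_links, allowed)
        needs_review = current_links > allowed
        return credited, needs_review

    # A site going from 12 links to 12,000,000 in three weeks only gets credit
    # for ~96 links and goes into the human-review queue.
    print(dampened_link_credit(12, 12_000_000, 3))     # (96.0, True)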
Should the human reviewer check every link by hand?
Think of codecademy.com, now first for "learn to code". I mean, is it the best place to learn how to code? At the moment, I'd say it's not. Is it the third result, "codeyear.com" (run by Codecademy as well)? I'm sure there are better sites that could take that spot and provide more value to the user than that landing page pointing to Codecademy (nothing personal against the Codecademy guys of course, big kudos for their venture so far and best wishes for the future).
The problem with today's algorithm is that it can't really tell which site is really good and which one is just popular, mainstream, consolidated, with a high PR. Think lifehacker.com second for "learn to code" (on my side of the world, on google.com and chrome incognito mode on).
Right now, if every single newspaper in the world covers your site about how to build a homemade nuclear device, actually writes something about the matter in the article (as would be natural), uses "how to build an atomic bomb" as anchor text and links to your site, your site will skyrocket to the top for that kw (and many similar ones). Even if it's just a school project or a joke that made it to the news.
Where's the quality check?
Anyway I'm sure that smart people are currently working on these details and that a comment reply can't really do justice to the complexity of the issue.
I'm sure humans can quickly ascertain whether a site suddenly went viral or is being gamed, as the number of such sites is fairly small. I never said a comment reply is the panacea or can possibly do justice to the totality of search quality, just that an explosive delta in PageRank should not be a major issue for a company with the resources of a Google.
Google's claim on this subject is that they not only use PR or popularity of pages that link to you but also their authority (or domain relevance). So a link from Stack Overflow would count more for a search term like "learn to code" than say, a link from Howcast.
"-xrumer - Developed by a few russian people and primarily used by russian pharmacies for fraud, this is the most powerful "spammer" too out there. You can literally spend $1k and get 1million backlinks within a matter of days and have your site in the #1 position for whatever keyword you're aiming for."
Sorry bro, that ain't true anymore. Xrumer brutal blasts used to rule years ago, but I can assure you that just having a million crappy profile/forum post links popping up at the same moment nowadays isn't going to help your site's ranking like it used to. It's just not as powerful as it was before xrumer was translated and sold to non-Russian-speaking folks.
Xrumer is still being deployed by the "big dogs", but not to link to your main "money site" directly. More like to link to other pages with links pointing to your site (or to pages with links to other pages with links to other pages with links to other pages that eventually link to your money site). Never directly - too risky (even though it's debated how risky it is, but still, why take the chance of finding out).
Buying links today means (more like) purchasing services like seolinkvine.com or buildmyrank.com, where you buy posts (about the kw you want to target, carrying a contextual link) on networks of sites with a high home page PageRank.
Nonetheless Google is catching up with these networks, recently deindexing a lot of their sites and making their users, who spent lots of money for those links, feel really, really, really bad.
Pros in tough industries had to adapt to Google's algo evolution: they now build their own networks, they put out decent content on them, they provide a good user experience, they tend not to over-optimize, they host them on different A-class IPs, they have their domains registered with different names/addresses/etc, they build multiple tiers of links to their backlinks with custom-developed software (or Zennoposter), private proxies and their 24/7 running servers, and so on.
It's not a cheap process nor a quick one.
As of now, for long-term rankings, spamming alone just isn't enough.
To be on the safe side you really need to follow Matt Cutts's evergreen advice: build a kick-ass site that even a human reviewer working for Google would love to see in the top spots.
Knowing how easy it was to manipulate Google results (and to some extent, it still is), imo that's the right direction from the perspective of an everyday user who's searching for something, not clicking on AdSense ads or buying something.
Google is a little smarter than all those spam tools... I think some might work, in some instances, but the real threat to Google, I think, is buying links from high-quality private sites off the grid - where you actually analyze which links are helping your competitors rank, using tools like Open Site Explorer and Majestic SEO, and approach those sites privately to buy links or place sponsored blog posts on them...
Just to be clear, there are 40-some factors that affect the value of a link, and that is just on a generic level; these factors carry different weight in different verticals, and based on other signals.
http://wiep.net/link-value-factors/
I think we need to wait and see how google rolls out this new penalty to see how it plays out.
I think you underestimate how complex the google machine is. The sites that are ranking by "stuffing keywords and buying links" are typically ranking because the competition isn't offering any better quality or signals to google...
Just to explain, there are over 200 publicized ranking factors and I would bet that each one carries a different weight based on all sorts of signals google sees about your sites as a whole, your competitors sites...
So spammy old strategies like the ones mentioned above are hardly a strategy for ranking; they are just working in those instances because of broader signals at play...
A simple example of this would be... Most SEOs agree that page titles are one of the main ranking factors as far as relevancy goes. But if Google sees half your titles are duplicates, they might discount all your titles across your site, because they determined it to be a bad signal in your specific instance...
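A toy version of that "discount duplicated titles" idea - purely hypothetical, with a made-up cutoff, not anyone's real algorithm - might look like:

    from collections import Counter

    def title_signal_weight(pages, duplicate_ratio_cutoff=0.5):
        """pages: dict mapping URL -> <title> text. Returns 0.0 if so many
        titles are duplicated that the signal should be ignored site-wide."""
        counts = Counter(title.strip().lower() for title in pages.values())
        duplicated = sum(c for c in counts.values() if c > 1)
        ratio = duplicated / max(len(pages), 1)
        return 0.0 if ratio >= duplicate_ratio_cutoff else 1.0

    pages = {"/a": "Cheap widgets", "/b": "Cheap widgets",
             "/c": "About us", "/d": "Cheap widgets"}
    print(title_signal_weight(pages))   # 0.0 -> titles discounted across the site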
I have little hope that it will take longer than a few days/weeks until a wave of SEO-deoptimization services hits the market. The incentives for beating "the system" are just too high.
As an SEO I am interested to see how this update goes. Google updates usually have a 'shotgun' result. They might kill the intended targets but there is a lot of damage from the spray to sites that might be unwarranted.
Define 'unwarranted'. For every site with a good ranking there are probably a dozen that rank worse but provide equally good content. Maybe just shaking things up a bit regularly isn't a bad idea in itself.
By 'unwarranted' I am referring to websites who might get dinged in this update but are not overdoing the SEO, or possibly not doing any SEO at all. It happened with the Panda update so I can't help but think that it will happen again.
If you're doing SEO at all you should be punished imo. Let Google do its job without interfering and trying to manipulate please. And no, I don't believe there is such a thing as "whitehat SEO". Whitehat SEO is just common sense (write good content and don't screw up your markup).
That's like saying trying to sell your product is a sin.
Mind telling me how a consumer web startup is supposed to get others who are interested in X to visit its site about X? This is the basis of SEO: matching searchers to pages. It doesn't happen magically just by wishful thinking and writing good content, unfortunately.
No, it's like saying trying to lure people into buying something worse instead of the better is a sin. Like the product manufacturer secretly sneaking around in stores hiding competing products and putting their own in the front.
In this world, there will always be people who will attempt to sell you a shoddy product, and snake oil salesman.
But because that's the case, and you got a quality product, you're saying to sit tight, don't get out of the building, don't try to sell... and hope people will eventually choose you?
So how exactly is a consumer startup going to get people who want to know X to visit your site about X?
You're assuming that the SEO being performed is to manipulate or 'trick' users, and while this does happen, it's not always the case. Search spiders are far from perfect at recognizing the relevance of webpages; SEO helps them understand what a page is all about.
Also, not everyone understands what good content is, or how to properly markup their websites.
The field of SEO is quickly (or slowly depending on who you ask) being transformed into a much more holistic discipline where content creation, usability, conversion optimization, pr and marketing are falling into SEO's job duties. It's much more than page markup and links.
Sortition is the word of the week! It would be a nice feature to cycle the top N results for a query and give the public a chance to see multiple takes on a topic.
Unless the penalties from Google are long term I can't see this as being very effective, as you would assume competent SEOs will just tweak their processes in response.
From what I've read over the years, basic SEO mainly boils down to :
- Have good, relevant content
- Choose your page titles and URLs carefully
- Get lots of links to your pages, from as many different domains as possible, and try to get them from high-PageRank domains
- Where possible try to get anchor text relevant to the terms you want to rank for - but don't go overboard with a large percentage being the same phrase, as it appears artificial
- An older site can benefit you, as will exact match domains for the main TLDs
Except for site age, they are all easily changeable by the SEO.
Maybe Pagerank is too easily gameable, and what is old will become new again, and some of the approaches tried in the 90s and replaced by Pagerank will return with a new twist.
The problem is that Google actually aren't good enough at search yet.
They still rely on this dumb word-based approach to document retrieval. Example, if my page is about how to manage your time, and it's a really fantastic resource about that, but I don't actually mention the phrase "how to manage your time" anywhere, I won't rank for people searching for that phrase. I should, but I won't.
So I have to write my content for two audiences - humans and Google.
I don't want to do this. I'd be much happier just building great content for my human readers, and if I happen not to mention the exact keyword phrase a searcher might use, it doesn't matter, Google still knows it should put my site at the top for that phrase, because it understands that my site is about that phrase, even though I don't mention it exactly.
But Google just isn't good enough yet. Someone can set up a page which has a bunch of headers and URLs and variations of the keywords and beat me to #1, even if their content is utter rubbish for human consumption.
That's why SEO still exists. It's symptomatic of a bug in Google.
This penalty will help, hopefully, but we're still going to be in a state where we have to compromise our content to serve two masters.
Yeah semantic search, if solved, would address this problem.
That's really what I was getting at. Stripped right down, Google is still just viewing documents as a bag of words[1]. I mean, they have PageRank and they will apply more weight to words in headings, and they have 6-gram indexes and synonyms and all that clever stuff, but at its core it's still lexically centered, not semantically centered.
Meaning Google makes no effort to score your whole page against each word in the dictionary; it only attempts to score 1- to 6-grams from the page and inbound links? That's unfortunate.
No. It means that for your comment, Google will store each word, each pair of words, each triplet of words and so on, up to 6-word-long sequences. So if someone searched for 'Google makes no effort' then your comment will come up higher than a document which includes those words in a different order.
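Roughly like this, if you want a toy picture of what "1- to 6-gram indexing" means (the real index is obviously nothing this naive):

    def shingles(text, max_n=6):
        """Every contiguous word sequence of length 1..max_n."""
        words = text.lower().split()
        return [" ".join(words[i:i + n])
                for n in range(1, max_n + 1)
                for i in range(len(words) - n + 1)]

    # An exact-order query like "google makes no effort" matches one of these
    # grams directly; the same words in a different order would not.
    print(shingles("Meaning Google makes no effort")[:8])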
Not yet, to my knowledge. My old IR lecturer (Stephen Clark) used to think highly of Powerset, who look like they've since been acquired by Bing. And while the launch and implementation of Cuil was an epic fail, it did seem to have an interesting approach, and I'd have liked to see what it could have offered.
Google is actually pretty good at getting the semantic meaning, but the SERPs show snippets of the content, and users only click on results that match the query.
Google could stop bolding the keyword matches in the SERPs, but that would be a disaster because they are useful for picking the best result.
My point is that this is not purely a Google bug, but also a bug in human nature. People like exact keyword matches to their queries.
> the SERPs show snippets of the content, and users only click on results that match the query
I'd love to see a source to back that up. The only data I've seen on this is various eyetracking and/or click tracking studies which suggest higher up on SERP=more clicks.
I think this is going to be very interesting. I can understand the benefit of it, but as an SEO, I also feel that because we are working in the dark it is as much a science as an art. To me, this just means I have to work on fine tuning what it takes. Also, I think it will be harder for those who play on the wrong side of the tracks to rank well, which in turn may make it easier for those who do to rank well.
I think Google needs to be really careful with this. It would make sense that the first place they should look for over optimization is anchor text manipulation. The problem is that this could open up the door for "Negative SEO" services where you can blast a competitor with high traffic keyword anchor texts. The safest place to look will be on page factors and if they decide to go after anchor text manipulation there should be some type of authority metric that protects established sites.
Google is probably already doing that, which is why the current advice on traffic-generation methods is along the lines of varying anchor text.
This is done mostly by identifying related / tangential / complementary / supplementary keyword terms for the resource linked to. Non-whitehat SEOers are wising up to the idea of a site targeting niches, and that considerably widens the list of appropriate keywords.
By tackling long-tail keywords en masse - these are less competitive, easier to rank for, and better correlated with buying intent - they gradually make significant inroads toward ranking strongly for the big-head keyword in a manner that looks more natural.
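To make "looks more natural" a bit more concrete, here's a toy concentration check - the cutoff and names are invented, not anyone's real detector - of the kind of anchor-text profile that could get flagged:

    from collections import Counter

    def anchor_profile_looks_unnatural(anchors, dominance_cutoff=0.6):
        """Flag a backlink profile where one exact phrase dominates. Organic
        profiles are usually a mix of brand names, bare URLs and varied phrases."""
        counts = Counter(a.strip().lower() for a in anchors)
        top_phrase, top_count = counts.most_common(1)[0]
        return top_count / len(anchors) >= dominance_cutoff

    anchors = ["cheap flights"] * 900 + ["example.com", "click here", "Example Travel"] * 33
    print(anchor_profile_looks_unnatural(anchors))   # True -> looks manufactured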
I gave up trying to optimize for SEO a long time ago. The future is good content, period. Technology will eventually figure out a way to get the good content in front of users, whether it's through a Google-like search engine or something totally different.
I'd rather focus my energy on driving traffic in other ways, such as traditional media, list building, PR - and focusing on creating great content.
I just do the bare minimum stuff when it comes to SEO (title, h1 tags, some anchor text here and there). Other than that, I don't pay attention to SEO in the actual content of the pages.
I think this is wishful thinking. Google has been trying to make search smarter for many, many years. The truth is that links are still a good signal, and you should try to consciously build links. There are many, many articles out on the internet that are well-written and informative, but no one can find them because they're not in the top 10 in Google.
Creating good content and promoting it through PR should result in natural links - the exact type that Google looks for.
If you try to stay on top of the latest SEO tactics, you end up hurting yourself in the long term. For example, a while back it was recommended to create huge directories of content (something like: Travel deals in Ohio, Travel deals in NY, etc, etc). With the Panda update, you are now penalized if you have too much similar content. Why deal with all that headache? Just create good content.
Yep, that's what I said. Creating good content ALONE is not enough and never will be. The internet, like life is not a meritocracy. Just because you write great content doesn't mean Google will do you a favor and feature it. You need to promote it, which is what linkbuilding post-Panda is all about. Not blog commenting, forum spamming, or submitting to directories.
But telling someone to "just write good content" is like telling a programmer "just create a good product". The world doesn't work like that. You need to hustle, shout and say Look at how awesome this site/app/article is! Link to me! I don't like it that much, but that's how the game is, and if you don't play it accordingly, you'll end up with awesome content and no visitors.
Why do people believe links are still a good signal? It is a textbook case of distorting a metric by relying on it.
Traffic to (and engagement on) a site seems to be a much better signal, for the many sites that use Google Analytics, at least. But not everyone uses GA though...
Backpedalling now, PageRank is about weighted links. Link farms should have no weight to give, and links from the user-generated parts of pages (which are easy to detect on most blog platforms) should be easy enough to weight fairly too. So it is hard to see why Googlebot gets fooled by any blackhat link techniques that Google humans are aware of, except maybe the case of selling links (influence peddling), but one can argue that is a social/legal issue, not a technical problem.
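For anyone unfamiliar, a minimal textbook version of the weighted-links idea - nothing like production PageRank, just the principle that a page can only pass on the rank it actually has:

    def pagerank(links, damping=0.85, iterations=50):
        """links: dict mapping page -> list of pages it links to."""
        pages = set(links) | {p for outs in links.values() for p in outs}
        rank = {p: 1.0 / len(pages) for p in pages}
        for _ in range(iterations):
            new_rank = {p: (1 - damping) / len(pages) for p in pages}
            for page, outs in links.items():
                for target in outs:
                    # Each page splits the rank it holds across its outlinks.
                    new_rank[target] += damping * rank[page] / len(outs)
            rank = new_rank
        return rank

    # A thousand links from worthless farm pages pass on almost nothing,
    # because the farm pages themselves have almost no rank to give.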
Today a client came to me whose website wouldn't appear at all in search results. Webmaster tools told her nothing.
Taking a look at the site, the domain registrar wasn't doing a proper nameserver/CNAME/permanent redirect or the like; instead it was putting the shop host into an old-skool frame. Not only that, but for some bizarre reason they created a subdomain and were framing that too.
So a frame inside a frame inside a frame. It's genuine hard faults like these that are the reason SEO will exist for the foreseeable future.
Thought 1:
This may be overly cynical, but it is hard to believe that this isn't just another way to dump more people into their adwords system.
People who "over optimize" on a large scale are probably the same people making money off of well ranked SERP's, penalizing those people pushes them to buy adwords.
They are in essence pre-qualified customers for adwords since they have already demonstrated a willingness to invest real time/money in ranking. Sure this will just trigger yet another race to optimize optimization, but until things are refigured out more money will be dumped into adwords.
Thought 2:
This SERP "Market Volatility" is a great way for Google and SEO's to make a little more money.
This is also a good way of reminding some people how much their revenue is dependent on Google, when you see an overnight dive in revenue it captures the attention/mind share of higher ups who have maybe been taking their well oiled SERP machine for granted.
It's like a one night only "Google Dance" reunion tour, and gets people obsessing over them again.
That sounds like a marvelous future. Everyone who can justify throwing money at the problem can use paid search and everyone left in organic search can wish them all good riddance and get on with producing content.
Except this makes sites like Stack Overflow, Huffington Post, Mashable, Yelp, and most sites that depend on advertising economically unsustainable. Before you say "No...", remember even Yelp, a great word-of-mouth resource, gets over 70% of its visits through search engines. As does Stack Overflow.
What will be left will be personal bloggers, charities, and hobbyist sites. Be careful what you wish for.
HuffPo is a spam site that Panda should have killed.
Stack Overflow is a unique-content site with practically no human-unfriendly SEO (except maybe the huge list of related questions that tends to fool Google, where the search term is split over two questions and neither is actually relevant).
I don't know of anyone who specifically optimizes for Bing - most of the ranking signals are the same or at least similar to Google, so with far less traffic than G it's not worth it to focus on bing SEO (or for bing to change their algorithm to include widely disparate ranking signals)
Many factors make Bing SEO different. Since Bing doesn't crawl as many pages as Google, they are more harsh on low quality sites and links coming from "content farms". Specifically, Bing is:
- Less likely to find results in forums
- Harsher on results from “content mills” (even after Panda)
- Less likely to give you ranking for a keyword if it is not found verbatim in the page
- Also provides way fewer results for phone number queries
ehow should have been hit with the Panda update? Do you have a "block all example.com results" link on your Google results page? (You might have to visit the link and then hit back in your browser to get it to appear.)
Do you have any examples of search terms that don't return useful results until page two or three?
Let me tell you, searching for typical tech support issues, eHow is always on the first page and the content is clearly spam - usually only tangentially related to the query.
Try Googling "exchange server specify IP address" and try to find something useful.
I don't know why people say about.com is a spam site. It is a paraphrase site, with long, readable content pages that link to other resources. I learn much more, and more quickly, about T-Mobile cell phone plans from about.com than I do from wading through T-Mobile's worthless website.
I'd also like to note that for people creating content on the web, especially programming-type people who deal in logic and certainty, the SEO system resembles something like black magic. It's pretty clear that unless you understand SEO and apply it, you're never going to be seen no matter how good your content is. Now it appears that if you understand it too well that's also a bad thing.
The goal here is to let the users themselves inform the search engine as to what content is good -- hence the plus-ones, social search and all of that stuff. But all of this is still indirect evidence. Unless you could plug a computer inside the head of a person and watch their every thought, the only real data you have for input is server logs, click-throughs, and all kinds of other things that computers do, not people.
I just don't see this being solved any time soon. But I do see it getting so complex and unwieldy that it continues to frustrate searchers and content producers alike. Meanwhile the bad actors will continue to have a heyday.
Don't you think the obvious solution is to build an algorithm that thinks like the human brain? I suspect Google isn't that far away from that right now.
I hope they'll be able to do it right. But, as soon as they change their algorithms, SEOs will reverse engineer them again and change the optimizations. There can be no definitive solutions, at least not with current technologies, because SEOs are very smart and very fast at adapting.
As an aside, the first over-optimization that I would target if I were google are keyword-based domain names. Keywords in the domain name are given WAY too much weight; how often do you find that the top three results for the search "XXXX YYY ZZZZZ" to be very shallow but keywords-rich websites with the domain names www.XXXXYYYZZZZZ.com, www.XXXXYYYZZZZZ.org, www.XXXXYYYZZZZZ.net?
I wonder if Google is getting trapped near a local peak.
I watched the video they released on improving search quality yesterday (http://insidesearch.blogspot.com/2012/03/video-search-qualit...) and was initially impressed that they gave so much thought to slightly improving a small subset of 0.1% of queries. That meeting must have had 40 people attending!
But afterwards it left me with the feeling that Google is becoming so big and entrenched in old ways of doing things, that they may not be focusing enough on the next big improvement in search. Penalties and heuristics can only go so far -- eventually they'll need something approaching AI -- and if any company can do it it's Google.
I don't know what concrete idea you have in mind for how Google should approach AI, but Google does fund a lot of research in Information Retrieval and other subfields of AI (machine translation, machine learning, ...).
If you were thinking more along the lines of: "you ask a question and it understands your question and answers it like a human would", then I can tell you that this would require a scientific breakthrough first. That is not really something that you can plan for. Also Google mainly excels in engineering, less so in research. I actually think that Yahoo and Microsoft have stronger research divisions.
For years those penalties have often been mentioned on SEO forums. This is nothing new, just another filter they will add. Or maybe it's just a little FUD to keep people nervous?
This is completely the wrong thing to do, and in my opinion it means that Google has essentially just admitted they're bad at search.
Firstly, for ecommerce sites etc, due to the lack of content - especially unique content (because there are only so many ways you can say something is X length) - these sites are forced to SEO themselves. This is even more apparent as Google is placing its "Google Product Search" within the results as well. Sure, you could add yourself to Google Product Search, but that's not the point - Google should be adding ecommerce sites etc. in there automatically to make it a level playing field. Sure, you can argue that it is hard to do and that might be the case, but there is proof out there in the marketplace that this is possible - look at what TheFind etc are doing in this space. Additionally, I think there is a lot more that can be done in the Shopping Search space as well as other areas of search, which I will cover below.
Having covered those, I will also highlight another problem which is hindering Google: their search engine is based around the PageRank algorithm, which despite evolving is actually hurting Google in trying to solve the problem of SEO.
I believe this is the case because of PageRank and the general Google Search algorithms which are in place - their search at its core is based on 'citations' like those in academic papers (yes, it has evolved over time), but it is still, at least loosely, based on and developed from this ranking system. Hence the problem of paid links - although you can report them [1], this reporting method isn't effective and doesn't work. Additionally, there are tons of ways to get links which appear natural, won't look overly SEO'd, and are extremely easy to get and game as well.
Google seems to be taking their search towards Semantic & Social Search approaches, which will probably solve some of these aspects, but everyone is working out (and many have already worked out) how to game social search - although it is at least a step in a decent direction.
Currently, Google just provides the links and documents etc. which it believes are the most useful based on their PageRank/keyword approach, and even if it can improve search to a much greater degree, the issue is also related to AdSense/AdWords.
This is because these two print money for Google. For instance, AdWords really is used for Google Search (yes, it does feed into AdSense as well, but advertisers really want their ads on Google Search results). The last statistic I heard about Google Search is from around 2007, when Google was making $0.12 on average per search - you could probably calculate easily how much Google is making now by looking at their income and dividing it by publicly available search numbers, but I'm going to say it's more than that because in 2004 they were making $0.09 per search. This is actually an issue for the user because I believe Google really just wants their ad results to be perfect - they don't really want you to have to click on the organic result, because then Google isn't making any money - as long as their results are good enough, they're OK with that. If you don't believe me, take a look at [2] - that's a whole lot of ads above the fold. Oh, and if you try to do that yourself, be prepared to be punished [3].
Now AdSense - sure, it doesn't make that much of Google's income, but it still is highly profitable; if it wasn't, they wouldn't be doing it. If Google really wants to fix search then they need to review every single AdSense site, and I mean every single AdSense site, since Google only reviews the site which you apply with. This leads to a huge problem as MFAs (Made For AdSense sites) slip through the net afterwards, and if you look at those sites, they generally have poor content and are optimised for the search engine and for the user to leave, whether by clicking an ad or the back button. Sure, Google will lose some AdSense income on their balance sheet, but if they want to fix search this is also a good place to start.
Sure, they're trying to solve this issue and get to grips with it, but I don't think they will anywhere in the near future. You can argue this fix they have announced will solve some SEO problems but, as I said in my opening sentence, I believe this is the wrong approach; the reasons above highlight several issues already and there are tons more. In fact, I actually hope you disagree with me, because not only do I value your opinion, I am interested in the other aspects you may add to it.
And you are bad at commenting on the internet. The goal is almost explicitly to eliminate paid backlinking as a strategy where it involves useless, un-thought-out spam text that only creates value for the backlinked website in terms of "Google juice".
I don't need to see any ecommerce site in the search results anyway. So good luck making them visible via SEO.
It's annoying when searching for information about a product. If I want to buy it, I go directly to Amazon, not to Google search.
You've actually just highlighted another problem with Google there as well ;)
Google doesn't really know what you're searching for; it doesn't understand the context, which is why you see shopping results when you're just looking for information. Yahoo actually tried to solve this with a slider offering more shopping vs. info results, but it didn't catch on, as users don't want to click anything - they just want to type words into the box and get the perfect answer, be it an ad or an actual result. If you make a user do anything else then you've already lost the game.
This is another reason why I don't think Google will fix this SEO problem it has any time soon, and I definitely don't think this current 'fix' will solve it either
The trouble with SEO as a business is that you are always second-guessing the whims of people whose plans and actions are unknown to you, other than the fact that they are almost certainly working against you; furthermore, they will usually have much better resources available.
A little off topic but: the idea of SEO has always bugged me: like a store owner investing in having really clean windows, but crappy products inside the store.
Admittedly, that is easy for me to say because I don't need high traffic at my site. I need my web site to have a few high value visitors: people who want to work with me or communicate because we are into the same technology. The way I "optimize" my site is writing about what most interests me, and that attracts people with my interests. Seems pretty straight forward to me.
Do not confuse black hat SEO with white hat SEO. One is not the other.
Creating good, worthwhile content is a key part of SEO. If you write, or create, good, worthwhile content, you are taking part in SEO. If you take your time to write a headline that best reflects your article, and mark it up correctly, that's a part of SEO. If your site is using easy-to-read names, that's SEO. If people link to your content and write about it, that's a part of SEO. If you make the description below your page title on Google's search results actually worthwhile rather than some random piece of text, that's a part of SEO.
I am probably misconstruing things, as I often do and being simple-minded on top of that, but this seems like a circle-jerk and makes me wonder what happened to Tim Berners-Lee's musings about the Semantic Web, and why search hasn't progressed in that direction in some exponentially powerful and revolutionary way.
I believe the reason that we see so little of the semantic web is that advertisement is such a dominant business model on the internet. Since it is hard to embed ads in semantic web content, and the will to pay for it isn't there, the sad result is that it is not worth producing.
Various bits of the lowercase semantic web (and its related uppercase Semantic Web cousin) show through in Google results - mostly Google Rich Snippets, e.g. reviews metadata. These mostly require human intervention to enable on a site-by-site basis (sometimes by the site owner, sometimes by humans at Google itself).
The deeper problem of the semantic web is that it's metadata, which tends not to be visible to visitors. Because of that it's easy to forget, overlook or leave incomplete, and it lets non-whitehat SEOers do invisibly the kind of over-optimisation they would otherwise have to do in full view of the human audience.
I wish it wasn't true, but Cory Doctorow's portmanteau of metacrap is unfortunately still accurate ( http://www.well.com/~doctorow/metacrap.htm ). We have made several small metadata improvements over the years, but not much progress in substantially overturning Doctorow's original concerns.
Human beings are not great sources of accurate and up-to-date metadata. So we rely on scripts and services to fill in the blanks that we don't fill ourselves (publish times being recorded by WordPress, for example).
Converting human content into metadata ends up being an automated, human-hands-off affair, which means that those automated tools need to be able to parse and extract information / meaning from human prose. Very much what Google has been doing since inception.
It's flawed because it relies on automated interpretation of prose. But there isn't a viable non-flawed method of getting the same information without imposing academic-like constraints to the Web.
I hope this destroys all the scabby blogs that link everything to more of their own garbage - Mashable, Engadget, The Verge etc. Be nice to see them focus on not being shit instead of being rewarded for it.
The Google guys are so smart, with enough time they'll get all their algorithms so optimized they'll eventually exclude every site from ranking on the first page of results. It'll just be a blank white page, the ultimate in minimalism and speed. It'll be an engineering feat not rivaled in the 21st century.
To clarify for your benefit, I wasn't complaining about Google's optimization or lack-thereof (spammy vs not spammy). I think it's hilarious, watching the endless dance that Google is going through because their fundamental approach to search is broken and they're trying to drag that broken approach into the future. It's like watching Microsoft with each iteration of Windows & Office, trying to figure out how they can cheat death and drag 1980s software into the future.
On a business level, I don't care about Google's survival or their optimizations. My product doesn't benefit from SEO, nor from their search engine. I'm indifferent to them. If they make a good product, great; if they don't, someone else will eat their lunch eventually.
There is some truth to it, the approach was never really good. There is just an extreme need for it and they are still the best one around.
I wonder what happens 20 years from now; Google can't win against SEO in the long term. There are just too many people on the other side. As long as they don't develop some AI, the results will get worse. I find myself using "site:" more and more often to get what I want.
Does anyone else find that announcement weird? "Over-SEO" is just a kind of low-quality content, so it's been a Google target for years. This announcement seems like it's just trying to scare off SEO fanatics, or an admission that Google has been using ridiculously naive linear scoring models for keywords.
The huge elephant in the room is that Google can kill so many startups and small businesses practically overnight with an algorithm change.
People can say "create quality content, don't worry, you can always file reconsideration request if it's an accident" don't realize how Kafka-like Google really is. If an algorithm penalizes your site by accident, and your traffic drops by 99%.. good luck trying to talk to an actual human being working in Google to get your problem fixed!
Disclaimer: I work for Google, but not on search. These opinions are my own and based only on the linked article.
I took a beating on the SEO blogs for calling SEO a bug a few months ago, but I'm glad the rest of the world is finally realizing that I'm right :)
Ultimately, Google is trying to rank you highly for providing the best content; you shouldn't be spending your time trying to figure out how to game Google by making superficial changes to the presentation of your content. Want to rank better? Write better!
The whole problem is rooted in the fact that Google is a leaky abstraction. It tries to be an omniscient Sherpa, guiding the wary Web traveler with its infinite understanding of the Web and the individual user's needs. The reality, though, is that Google is actually just a computer program. So there is a gap between the user's mental model of Google and what Google actually does, and it's this gap that SEO exploits for its own profit.
An infinite amount of exploitation would mean that Google would just return results randomly, so it makes a lot of sense to detect signs of SEO and penalize the behavior before it further broadens the perfection/reality gap. Gaming the system is currently profitable, since the worst thing that can happen to you is nothing, but the best thing is that you get more traffic. A penalty realigns the risk/reward spectrum to favor "write better content" rather than "spam a bunch of wikis".
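A back-of-the-envelope way to see that realignment - the probabilities and payoffs below are made up purely to illustrate the argument:

    def expected_gain(p_caught, reward, penalty):
        return (1 - p_caught) * reward + p_caught * penalty

    # No penalty: the worst case is you gain nothing, so spamming always pays.
    print(expected_gain(p_caught=0.3, reward=100, penalty=0))     # 70.0
    # With a real penalty, the same detection rate makes spamming a net loss.
    print(expected_gain(p_caught=0.3, reward=100, penalty=-300))  # -20.0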
There's a case to be made that detecting and penalizing SEO is making results worse, if, as you suppose, SEO for a sufficiently good search engine is equivalent to producing good content.
Obviously Google doesn't think their search is good enough, and I would agree -- piling more "inputs" and arbitrary branches into a ranking algorithm, however, is no solution. This will only devolve into an endless game of cat and mouse until a new search engine comes along and does to Google what Google did to Yahoo.
>Google is trying to rank you highly for providing the best content...Want to rank better? Write better!
The real world doesn't always work like that. I do a lot of work for an eCommerce site that sells wholesale. Their customers have little to no interest in reading text, all they want are pictures. Which means, the catalogue pages, which are optimised for actual human visitors and not Google robots, contain little to no text, only pictures. As a result, the catalogue pages (most important part of the site) do not rank anywhere with Google, and never will. Google is unable to handle websites that are - quite correctly - all about the pictures. Competitor sites that outrank this particular site design their catalogue pages for Google, not for humans, and rank well because of it.
Google and Matt Cutts, in general, have a larger problem. Cutts never acknowledges that Google works primarily on language, which is one of the most complex things the human race has ever come up with.
And though language is always at stake in Google search, I don't see any serious semiotic study in whatever they make.
And, apart from that, Google totally sucks at languages other than English. They think of a new thing for English-based search and then they just propagate it everywhere with no substantial modifications whatsoever.
I've got many sites under me, all writing original content. They're not gonna win a Pulitzer, of course, but at least they're edited and accurately followed by teams of human authors who won't even copy press releases, as a measure to prevent non-original content from appearing on the sites. Despite all of this, we got seriously pandalized.
The results where we once used to sit on top are in many cases filled by scraper sites who steal our content and rank 2 or 3, sometimes up to 5, positions above us on the SERP. I can't even count the times I've filled in the anti-scraping form anymore. And that's not all, because many times popular sites rank higher than us just because they're popular, although their content is horribly written, short, and totally uninformative.
That happens all the time. You know where we still go pretty strong? On Google News, where the human intervention sometimes really applies.
Google should see what's going on everywhere, and their insistence on having Matt Cutts as their only public voice on this huge issue is becoming really frustrating.
He ends up looking like a fake good guy, perpetrating the hypocrisy of a corporation too big and too convinced it has the ability to solve all the hyper-complex search problems - laden with human-generated unpredictability and the natural human tendency towards deception - just with pretty algorithms.
EDIT: small clarification