Marcus Tandler | Mediadonis | Just another Online-Marketing Superhero

A link ain't just a link anymore

My good friend Bill over at SEO by the Sea blogs about a lot of patents filed by the major search engines, and I mean A LOT of patents. His blog is one of the most interesting blogs in the SEO / online marketing sphere, and should definitely be in your RSS reader if you consider yourself a true SEO expert. I tweet most of his posts, and the patents he talks about in them, but the patent he covered two days ago is worth a whole post and a closer look, because I consider it one of the most interesting and important patents from Google in recent years. Well, at least in my humble opinion 🙂

The patent is called "Ranking documents based on user behavior and/or feature data", and was filed in 2004 (!) by three Googlers. It was only granted two days ago, and shares some insight into how a search engine might look at "a wide range of factors to determine how much weight each link on a page may pass along".

A link just ain't a link anymore. There's a whole lot of stuff search engines look at when determining how much juice (or whatever you wanna call it) a link actually passes to the page it links to, and how much it will actually help that particular page rank.


There's a lot of stuff that has been floating around the web for a long time now, but there's never been any proof. The patent talks about some things which one might consider proof that these theories might indeed be true. And you gotta think about one fact: the patent was filed in 2004, so what just sounded pretty smart back then (when stuff like this most definitely wasn't used) might be the foundation for some even heavier stuff that's in use right now – 6 years later (!) – or will come into play in the next couple of months or years. So you should definitely think about this kind of stuff now, to plan ahead and anticipate Google's next move. Because we all know that paid links are a BIG pain in the ass for Google right now, and they're definitely working on stuff to get a better grip on that one thing that can still fuck up their best efforts to deliver the best search results for the user.

Bill lists a whole bunch of examples that might be used to determine the strength of any one particular link – whether features of the link itself, and/or features of the source and/or target document the link points to. It basically seems to come down to a "reasonable surfer": the more likely the common user is to click on the link, the more juice it's going to pass to the target document. This can be determined either by using user data collected via a web browser or the Google Toolbar, or by certain attributes of the link, like its position on the page, the font size and colour, etc. – in short, "the probability that a reasonable surfer will access the document after following a number of forward links".
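To make the idea concrete, here's a minimal sketch in Python of how such a "reasonable surfer" weighting could look. The feature names, weights, and numbers are pure guesses for illustration – the patent lists the kinds of features, not an actual formula:

```python
def click_probability(link):
    """Crude heuristic score for how likely a reasonable surfer
    is to click this link (higher = more likely). All weights are
    invented for illustration."""
    score = 1.0
    if link["position"] == "footer":
        score *= 0.1          # footer links rarely get clicked
    elif link["position"] == "main_content":
        score *= 1.5          # prominent, in-context links
    if link["font_size"] < 10:
        score *= 0.5          # tiny text draws few clicks
    if link["same_color_as_text"]:
        score *= 0.3          # barely recognizable as a link
    return score

def distribute_juice(page_juice, links):
    """Split a page's link juice proportionally to click probability,
    instead of evenly as in the classic PageRank model."""
    scores = [click_probability(l) for l in links]
    total = sum(scores)
    return [page_juice * s / total for s in scores]

links = [
    {"position": "main_content", "font_size": 14, "same_color_as_text": False},
    {"position": "footer", "font_size": 9, "same_color_as_text": False},
]
juice = distribute_juice(1.0, links)
```

Run on two links – one in the main content, one tiny in the footer – the main-content link ends up with the lion's share of the juice.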

A really easy example would be a whole bunch of links in the footer of a page – I think it's pretty easy to give those links a lot less power to pass juice on to their respective target pages, since it's quite clear that these links don't get clicked very often and are (probably) only there to manipulate the SERPs. Like I said before, you've probably heard this tip – "Don't buy links in the footer of a page" – it has been floating around the SEO forums and blogs for quite a while now, even though there's been no indication in any patent. It just makes sense!
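If you wire such down-weighted footer links into the classic PageRank iteration, you can watch the effect propagate. The toy three-page graph and the 10:1 click weights below are invented numbers, just to contrast the old uniform random surfer with a reasonable-surfer variant:

```python
def pagerank(graph, weights, damping=0.85, iters=50):
    """Plain power-iteration PageRank with per-edge weights.
    graph: {node: [targets]}, weights: {(src, dst): click weight}.
    Uniform weights give the classic random surfer."""
    nodes = list(graph)
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iters):
        new = {n: (1 - damping) / len(nodes) for n in nodes}
        for src, targets in graph.items():
            total_w = sum(weights[(src, t)] for t in targets)
            for t in targets:
                new[t] += damping * rank[src] * weights[(src, t)] / total_w
        rank = new
    return rank

graph = {"A": ["B", "C"], "B": ["A"], "C": ["A"]}
# Classic random surfer: A splits its juice evenly between B and C.
uniform = pagerank(graph, {("A", "B"): 1, ("A", "C"): 1,
                           ("B", "A"): 1, ("C", "A"): 1})
# Same graph, but A's link to C is a low-visibility footer link.
weighted = pagerank(graph, {("A", "B"): 10, ("A", "C"): 1,
                            ("B", "A"): 1, ("C", "A"): 1})
```

With uniform weights B and C rank identically; with the footer link down-weighted, C ends up with only a fraction of B's rank, even though the raw link graph never changed.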

You can read through the examples over at Bill's post (or in the patent itself) – I just want to point out one more really interesting thing:

Examples of features associated with a link might include […] commerciality of the anchor text associated with the link

Whoa – now this is interesting! This is actually something I've been telling people for quite a while now. It just makes sense to take a look at the commerciality of the anchor text, since it can be a strong indicator of whether a particular link was set voluntarily/organically or was set to manipulate the ranking of the target document in the SERPs for that anchor text. If you have a website talking about Linux, and 98% of all links from that website point to Linux-related websites ("a topical cluster with which the anchor text of the link is associated"), BUT there are also some links with non-Linux-related anchor texts that are commercial ("payday loans", "play poker", etc.) and sit in a specific area without much other content around them (so basically just a bunch of links in the footer or sidebar, like you still see them all around the internet) – there's a good chance these links were just put there to manipulate the SERPs. I think it's even so clear-cut that you can devalue these links algorithmically without having to worry all too much about any collateral damage. OK, this is a pretty easy example, but there's a whole lot of stuff listed in the patent that can make other, somewhat more difficult cases look just as clear.
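Here's a toy version of that check in Python – dominant-topic detection plus a blacklist of commercial phrases. The topic keywords, the commercial-terms list, and the 90% threshold are all made-up assumptions, nothing from the patent:

```python
# Illustrative blacklist of obviously commercial anchor phrases.
COMMERCIAL_TERMS = {"payday loans", "play poker", "buy viagra", "casino"}

def site_topic_share(anchors, topic_terms):
    """Fraction of anchor texts that match the site's dominant topic."""
    on_topic = sum(1 for a in anchors
                   if any(t in a.lower() for t in topic_terms))
    return on_topic / len(anchors)

def suspicious_anchors(anchors, topic_terms, threshold=0.9):
    """Flag commercial, off-topic anchors on an otherwise topical site.
    If the site isn't clearly single-topic, flag nothing."""
    if site_topic_share(anchors, topic_terms) < threshold:
        return []
    return [a for a in anchors
            if a.lower() in COMMERCIAL_TERMS
            and not any(t in a.lower() for t in topic_terms)]

# A site that is 98% about Linux, plus one paid-looking link:
anchors = ["linux kernel tuning"] * 49 + ["payday loans"]
flagged = suspicious_anchors(anchors, {"linux"})
```

On a mixed-topic site the same function flags nothing – which mirrors the point about avoiding collateral damage: the commercial anchor is only suspicious *because* everything else on the site belongs to one topical cluster.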

So read Bill's post, scan through the patent, and try to think about this kind of stuff the next time you're buying – err, getting a link organically 🙂

70 Comments on "A link ain't just a link anymore"

  1. Bill says:

    Hi Marcus,

    I’m in total agreement with you. This is definitely one of the most interesting patents I’ve seen from Google in quite a while, and while it covers some things about links that many people have been talking about, it brings a new level of credibility and meaning to those conversations.

    It also raises a bunch of new questions, like the commercial text issue that you point out.

    My thought on footer links – on some web pages, footer links actually are useful as navigational links to other pages on a site or other sites, if used with some restraint and without looking like spam.

    If those footer links on those pages are actually used by visitors to the pages (and user behavior data is one aspect of the patent as well), then they may pass along more value on those pages than on other sites which may stuff footers with large amounts of commercial looking footer links to topics and pages that might not be very related.

    One of the points behind this patent filing and the features that they consider is that the features they review for a link are considered together rather than in isolation.

    So, a link with commercial anchor text placed in the main content area of a page that is relevant to the content of that page may pass along more value than a link with noncommercial text in the footer of the same page in a smaller font size that isn’t very relevant to the page.

  2. mediadonis says:

    Total ACK, Bill – especially on always considering a bunch of features together, instead of just looking at one particular feature by itself. Like always with Google – whether it's webspam detection or even the ranking itself – it's always a bunch of factors, and not one isolated factor, that will get your site banned or ranked better. It's always in the mix 🙂

  3. Gretus says:


    Regulating the power of a link based on its actual click-through rate sounds interesting. But since the web in general keeps getting more "current", I do wonder what happens to links in old blog posts or on subpages whose visitor numbers approach zero?

    No idea, but quite a lot of content on the web probably has no visitors at all, or no clicks on the pages linked there!?



  4. mediadonis says:

    As I already said in my reply to Bill, user data – e.g. the click-through rate you mention – is only one of many factors Google might draw on when evaluating the strength of a link. The patent mentions a whole lot of factors, and it's precisely in combination with each other that they paint a coherent picture.

    As Bill also puts it in his comment:
    "One of the points behind this patent filing and the features that they consider is that the features they review for a link are considered together rather than in isolation."

    Personally, I believe user & clickthrough data will have the least influence in this scenario, since that data is simply still the most unreliable and also the "easiest" to manipulate. Perhaps user data will rather only be used as a litmus test for (algorithmically speaking) contentious decisions.

  5. Nedim Sabic says:

    Google is all about spam elimination and I'm OK with that. It doesn't affect a normal SEO who is watching out for topically relevant links out there.

    Getting the CTR of links, even when they're not on some Google Analytics service, might be hard. The Google Toolbar is relevant but might also be risky, because not everyone is using it. It would also be hard on pages where users can change the template, and on news pages where new content is linked via its title and disappears from the homepage within a day, but is still relevant.

    Tracking down topically relevant external paid links is a hard mission to handle, and I think Google will struggle with this for a long time. Claiming that something is paid is much harder than saying it's topically irrelevant. CTR doesn't look like the solution.

    Cheers and thx for sharing,
    Nedim Sabic

  6. mediadonis says:

    Again, it's NOT just about CTR & user data! There's a whole lot of stuff Google has mentioned in the patent. Why does everyone seem to obsess over user data? In my eyes, user data is the most unreliable and most easily gamed metric mentioned in the patent.

  7. Hardly anyone still doubts that topically relevant links from verticals work better. And how much influence users and CTR have should be clear by now to anyone who has taken a look at the newest features in GWMT…

  8. webSimon says:

    That Google wants everyone to be aware of this through this presentation should be clear to everyone as well…

    It would be very clever of Google to pretend to take CTR into account even if it didn't. Everyone would try to improve their CTRs, and spamming would have no effect at all. Extremely practical.

    Nothing gets eaten as hot as it's cooked.

  9. Arne says:

    Great stuff, Marcus! As always, your posts are fundamentally great and juicy for most SEOs, and always edgy early signs of fundamental changes!! Thanks buddy! 🙂

    I think we see the rise of a new quality level of SEO. Like you said, Marcus, there are still some folks linking poker anchors to sites from absolutely irrelevant content pages (your example of the Linux site). This will bail out soon. We have gained more and more clients who need better, more relevant, organically grown links and have seen what bad SEO-Messies did to their pages. So I think the bad footer SEOs with their poor-quality link farms will soon go extinct in the evolution. 😉

  10. mediadonis says:

    LOL – "SEO-Messies" 🙂

  11. Kralle says:

    Ultimately this is nothing but an evolution of the "random surfer model" of the good old PageRank algorithm along qualitative criteria, rather than purely quantitative ones.
    Figures like CTR are really nothing but an attempt to make these qualitative criteria objectively measurable. The same goes for link position, typography (colour contrast etc.), topical relevance and so on – ultimately everything that is measurable and influences the probability of a user clicking on a link or not.
    CTR is anyway more or less just a control variable for weighting and verifying other factors accordingly, and only makes sense because Google has suitably representative data here.
    Google has always tried to model this "random surfer" as realistically as possible when evaluating links, and keeps refining its model with factors as soon as they become measurable or algorithmically representable.

  12. This is the best post I’ve read on link building in a long while, so thanks for this first of all.

    I've been almost basing my work life around some of these factors, especially watching anchor text variance, and of course the "commerciality". I can only imagine the meetings they're having about this now.

    The patent has me quite excited, as it should work nicely for my clients' natural link building. Hopefully bringing down the copious link buyers I'm up against 🙂

  13. irio says:

    Great, really cool articles, as usual, but please, please, do a spell check or hire an editor 🙂

    Otherwise, I read your articles religiously! 🙂

  14. Dirk says:

    That's a good article. Google increasingly ranks good content, and that's how it should be.

  15. Franko says:

    I'm curious how the Panda update will turn out in Germany.

  16. Karen says:

    This is my first time visiting here. I found so many entertaining things in your blog, especially the discussion. Judging from the tons of comments on your posts, I guess I'm not the only one having all the fun here! Keep up the excellent work.

