
How to Check Which Links Can Harm Your Site’s Rankings


Posted by Modesto Siotos

This post was originally in YouMoz, and was promoted to the main blog because it provides great value and interest to our community. The author’s views are entirely his or her own and may not reflect the views of SEOmoz, Inc.

Matt Cutts’ statement in March 2012 that Google would be rolling out an update against “overoptimised” websites caused great turmoil within the SEO community. A few days later, thousands of blogs were removed from Google’s index, and Matt tweeted confirming that Google had started taking action against blog networks.

Even though thousands of blogs of low or average authority were manually removed from Google’s index, they weren’t the only victims. For instance, www.rachaelwestdesigns.com, a PR7, DA70 domain, was also removed, probably due to its very high number of blogroll (site-wide) backlinks.

These actions indicate that the new update targeting “overoptimised” websites has already begun to roll out, but it is uncertain how much of it we have seen so far.

At around the same time, Google sent the following message to thousands of webmasters via Google’s Webmaster Tools:

In the above statement, it is unclear what Google’s further actions will be. In any case, working out the number of “artificial” or “unnatural” links with precision is a laborious, almost impossible task. Some low quality links may not be reported by third party link data providers, and even worse, because Google has started deindexing several low quality domains, the task can end up being a real nightmare, as several domains cannot be found even in Google’s index.

Nevertheless, there are some actions that can help SEOs assess the backlink profile of any website. Because, in theory, any significant number of low quality links could hurt, it would make sense to gather as much data as possible and not just examine the most recent backlinks. Several thousand domains have already been removed from Google’s index, resulting in millions of links being completely devalued, according to Distilled’s Tom Anthony (2012 Linklove).

Therefore, the impact on the SERPs has already been significant, and as always happens on these occasions, there will be new winners and losers once the dust settles. However, at this stage it is a bit early to draw any conclusions, because it is unclear what Google’s next actions are going to be. Nevertheless, getting ready for those changes would make perfect sense, and spotting them as soon as they occur would allow for quicker decision-making and immediate action, as far as link building strategies are concerned.

As Pedro Dias, an Ex-Googler from the search quality/web spam team tweeted, “Link building, the way we know it, is not going to last until the end of the year” (translated from Portuguese).

The Right Time For a Backlinks Risk Assessment

Carrying out a backlinks audit in order to identify the percentage of low-quality backlinks would be a good starting point. A manual, thorough assessment would only be possible for relatively small websites as it is much easier to gather and analyse backlinks data – for bigger sites with thousands of backlinks that would be pointless. The following process expands on Richard Baxter’s solution on ‘How to check for low quality links‘, and I hope it makes it more complete.

  1. Identify as many linking root domains as possible using various backlinks data sources.
  2. Check the ToolBar PageRank (TBPR) for all linking root domains and pay attention to the TBPR distribution
  3. Work out the percentage of linking root domains that have been deindexed
  4. Check social metrics distribution (optional)
  5. Repeat steps 2,3 and 4 periodically (e.g. weekly, monthly) and check for the following:
  • A spike towards the low end of the TBPR distribution
  • Increasing number of deindexed linking root domains on a weekly/monthly basis
  • Unchanged numbers of social metrics, remaining in very low levels

A Few Caveats

The above process does come with some caveats, but on the whole it should provide some insight and help make a backlink risk assessment in order to work out a short/long-term action plan. Even though the results may not be 100% accurate, it should be fairly straightforward to spot negative trends over a period of time.

Data from backlink intelligence services has flaws. No matter where you get your data from (e.g. Majestic SEO, Open Site Explorer, Ahrefs, Blekko, Sistrix), there is no way to get the same depth of data Google has. Third party tools are often not up to date, and in some cases the linking root domains are not even linking back anymore. Therefore, it would make sense to filter all identified linking root domains and keep only those still linking to your website. At iCrossing we use a proprietary tool, but there are commercial link check services available in the market (e.g. Buzzstream, Raven Tools).

ToolBar PageRank gets updated infrequently (roughly 4-5 times a year), so in most cases the returned TBPR values represent the TBPR the linking root domain gained in the last TBPR update. It would therefore be wise to check when TBPR was last updated before drawing any conclusions. Carrying out the above process straight after a TBPR update would probably give more accurate results. However, in some cases Google may instantly drop a site’s TBPR in order to make public that the site violates their quality guidelines and to discourage advertisers. Therefore, low TBPR values such as n/a (greyed out) or 0 can in many cases flag up low quality linking root domains.

Deindexation may be natural. Even though Google these days is deindexing thousands of low quality blogs, coming across a website with no indexed pages in Google’s SERPs doesn’t necessarily mean that it has been penalised. It may be an expired domain that no longer exists, an accidental deindexation (e.g. a meta robots noindex on every page of the site), or some other technical glitch. However, deindexed domains that still have a positive TBPR value could flag websites that Google has recently removed from its index due to guidelines violations (e.g. link exchanges, PageRank manipulation).

Required Tools

For large data sets, NetPeak Checker performs faster than SEO Tools, where large data sets can make Excel freeze for a while. NetPeak Checker is a standalone free application which provides very useful information for a given list of URLs, such as domain PageRank, page PageRank, Majestic SEO data, OSE data (PA, DA, mozRank, mozTrust etc.), server responses (e.g. 404, 200, 301), number of indexed pages in Google and a lot more. All results can then be exported and processed further in Excel.

1. Collect linking root domains

Identifying as many linking root domains as possible is fundamental, and relying on just one data provider isn’t ideal. Combining data from Webmaster Tools, Majestic SEO and Open Site Explorer may be enough, but the more data the better, especially if the examined domain has been around for a long time and has received a large number of backlinks over time. Backlinks from the same linking root domain should be deduplicated so we end up with a long list of unique linking root domains. Not found (404) linking root domains should also be removed.
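As a rough sketch of this step (assuming Python; the provider exports and URLs below are hypothetical placeholders), merging backlinks from several sources down to unique root domains might look like this:

```python
from urllib.parse import urlparse

def root_domain(url):
    """Reduce a backlink URL to its root domain (www. stripped)."""
    netloc = urlparse(url).netloc.lower()
    return netloc[4:] if netloc.startswith("www.") else netloc

def unique_root_domains(backlink_lists):
    """Combine backlink exports from all providers into a sorted
    list of unique linking root domains."""
    domains = set()
    for backlinks in backlink_lists:
        domains.update(root_domain(url) for url in backlinks)
    return sorted(domains)

# Hypothetical exports from two data sources:
wmt = ["http://www.example-blog.com/post-1", "http://example-blog.com/post-2"]
ose = ["http://another-site.org/resources"]
print(unique_root_domains([wmt, ose]))
# ['another-site.org', 'example-blog.com']
```

Filtering out 404 domains would then be a matter of checking each root domain’s server response and discarding the dead ones.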

2. Check PageRank distribution

Once a good number of unique linking root domains has been identified, the next step is scraping the ToolBar PageRank for each one of them. Ideally, this step should be applied only to those root domains that are still linking to our website; the ones that don’t should be discarded if that’s not too complicated. Then, using a pivot chart in Excel, we can conclude whether the current PageRank distribution should be a concern. A spike towards the lower-end values (such as 0 and n/a) should be treated as a negative indication, as in the graph below.
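A minimal sketch of the same distribution check done in code rather than an Excel pivot chart, assuming the TBPR values have already been scraped into a list (the sample values are invented):

```python
from collections import Counter

# TBPR values scraped for each unique linking root domain;
# "n/a" stands for greyed-out / unranked domains.
values = ["n/a", 0, 0, 1, 2, 2, 3, 5, "n/a", 0]

dist = Counter(values)           # TBPR bucket -> number of domains
low_end = dist["n/a"] + dist[0]  # domains at the worrying low end
print(dict(dist))
print(low_end / len(values))     # share of low-end domains: 0.5
```

A low-end share that grows between checks is exactly the spike towards 0/n-a described above.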

3. Check for deindexed root domains

Working out the percentage of linking root domains which are not indexed is essential. If deindexed linking root domains still have a positive TBPR value, most likely they have been recently deindexed by Google.
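A hedged sketch of this check, assuming indexed-page counts and TBPR values have already been collected per domain (the tuples and figures are illustrative):

```python
def deindexation_report(domains):
    """domains: (root_domain, pages_indexed_in_google, tbpr) tuples.
    A domain with zero indexed pages but a positive TBPR was most
    likely deindexed recently."""
    deindexed = [d for d in domains if d[1] == 0]
    recently = [d for d in deindexed if isinstance(d[2], int) and d[2] > 0]
    percentage = 100.0 * len(deindexed) / len(domains)
    return percentage, recently

sample = [("a.com", 120, 3), ("b.net", 0, 4), ("c.org", 0, "n/a")]
pct, recent = deindexation_report(sample)
print(round(pct, 1))  # 66.7 percent of linking root domains not indexed
print(recent)         # [('b.net', 0, 4)]
```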

4. Check social metrics distribution (optional)

Adding in the mix the social metrics (e.g. Facebook Likes, Tweets and +1s) of all identified linking root domains may be useful in some cases. The basic idea here is that low quality websites would have a very low number of social mentions as users wouldn’t find them useful. Linking root domains with low or no social mentions at all could possibly point towards low quality domains.

5. Check periodically

Repeating steps 2, 3 and 4 on a weekly or monthly basis could help identify whether there is a negative trend due to an increasing number of linking root domains being removed. If both the PageRank distribution and deindexation rates are deteriorating, sooner or later the website will experience ranking drops that will result in traffic loss. A weekly deindexation rate graph like the following one could give an indication of the degree of link equity loss:
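A simple sketch of the periodic check, with invented weekly snapshot figures:

```python
# Weekly deindexation-rate snapshots (percent of linking root domains
# deindexed); the dates and figures are invented for illustration.
weekly_rates = {"2012-04-02": 4.8, "2012-04-09": 6.1, "2012-04-16": 9.5}

def is_deteriorating(rates):
    """True if the deindexation rate rose every consecutive week."""
    values = [rates[week] for week in sorted(rates)]
    return all(later > earlier for earlier, later in zip(values, values[1:]))

print(is_deteriorating(weekly_rates))  # True: the trend is worsening
```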

Note: For more details on how to set-up NetPeak and apply the above process using Excel please refer to my post on Connect.icrossing.co.uk.

Remedies & Actions

So far, several websites have seen ranking drops as a result of some of their linking root domains being removed from Google’s index. Linking root domains with very low PageRank values and low social shares over a period of time should be manually/editorially reviewed in order to assess their quality. Links from such domains are likely to be devalued sooner or later, therefore a new link building strategy should be devised. Working towards a more balanced PageRank distribution should be the main objective, since links from low quality websites will naturally keep appearing to some extent.

In general, the more authoritative and trusted a website is, the more low quality linking root domains can link to it without causing any issues. Big brands’ websites are less likely to be impacted because they are more trusted domains. That means that low authority/trust websites are more at risk, especially if most of their backlinks come from low quality domains, they have a high number of site-wide links, or their backlink profile consists of an unnatural anchor text distribution.

Therefore, if any of the above issues have been identified, increasing the website’s trust, reducing the number of unnatural site-wide links and making the anchor text distribution look more natural should be the primary remedies.

About the author

Modesto Siotos (@macmodi) works as a Senior Natural Search Analyst for iCrossing UK, where he focuses on technical SEO issues, link tactics and content strategy. Modesto is happy to share his experiences with others and posts regularly on Connect, a UK digital marketing blog.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


How Authorship (and Google+) Will Change Linkbuilding


Posted by Tom Anthony

Google's relationship with links has changed over the last 15 years – it started out as a love affair but nowadays the Facebook status would probably read: "It's Complicated". I think Google are beginning to suffer from trust issues, brought about by well over a decade of the SEO community manipulating the link graph. In this post I'm going to lay out how I think Authorship and Google+ are one of the ways that Google are trying to remedy this situation.

I'll move on to what that means we should be thinking about doing differently in the future, and am sharing a free link-building tool you can all try out to experiment with these ideas. The tool will allow you to see who is linking to you rather than where is linking to you, and will provide you with social profiles for these authors, as well as details of where else they write.

To start I want to quickly look at a brief history of Google's view of links.

Are links less important than they were?

Back in the early days Google treated all links as being equal. A link in the footer was as good as a link in the main content, a link in bad content was as good as a link in good content, and so on. However, then the new generation of SEOs arrived and started 'optimizing' for links. The black hats created all sorts of problems, but the white hats were also manipulating the link graph. What this meant was now Google had to begin scrutinizing links to decide how trust-worthy they were.

Every link would be examined for various accompanying signals, and it would be weighted according to these signals. It was no longer a case of all links being equal. Reciprocal links began to have a diminished effect, links in footers were also not as powerful, and so it went for a variety of other signals. Over the last decade Google have begun using a wide range of new signals for determining the answer to the question they have to answer for every single link: How much do we trust this link?

They've also introduced an increasing number of signals for evaluating pages beyond the link based signals that made them. If we look at the ranking factors survey results from SEOmoz for 2011, we see that link based factors make up just over 40% of the algorithm. However, in the 2009 survey they were closer to 55% of the algorithm.

So in the last 2 years, 15% of the algorithm that was links has been replaced by other signals in relative importance. The results are from a survey, but a survey of people who live and breathe this stuff, and it seems to match up well with what the community as a whole believes, and with what we observe in the increasing importance of social signals and the like.

This reduction in the relative power of links seems to imply that Google aren't able to trust links as much as they once did. Whilst it's clear links are still the backbone of the algorithm, it is also clear Google has been constantly searching for other factors to offset the 'over-optimization' that links have suffered from.

Are social signals the answer?

The SEO community has been talking a lot about social signals the last couple of years, and whether they are going to replace links. I'd argue that social signals can tell you a lot about trust, timeliness, perhaps authority and other factors, but that they are quite limited in terms of relevancy. Google still need the links – they aren't going anywhere anytime soon.

To visualise this point in a different way, let's look at a toy example of the Web Graph. The nodes represent websites (or webpages), and the connections between them are the links between these websites:

Illustration of the Web Graph

And a corresponding toy example of the Social Graph:

Illustration of the Social Graph

We can now visualise Social 'Votes' (be they likes/tweets/+1s/pins or shares of some other type) for different websites. We can see that nodes on the Social Graph send their votes to nodes on the Web Graph:

Illustration of Social Votes

The Social Graph is sending signals over to the websites. They are basically saying 'Craig likes this site', or 'Rand shared this page'. In other words, the social votes are signals about web sites/pages and not about the links — they don't operate on the graph in the same manner as links.

Whilst social signals do give Google an absolute wealth of information, they don't directly help improve the situation with links and how some links are more trustworthy than others.

Putting the trust back into links

So Google have needed to find a way to provide people with the ability to improve the quality of a link, to verify that links are trust-worthy. I believe that verifying the author of a link is a fantastic way to achieve this, and it fits neatly into the model.

In June last year Google introduced rel author, the method that allows a web page to announce the author of the page by pointing to a Google+ profile page (which has to link back to the site for 2 way verification).
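As a rough sketch (assuming Python 3; the page fetch and the profile-side back-link check are omitted, and the profile URL is a made-up example), detecting the article-page half of that two-way verification might look like this:

```python
from html.parser import HTMLParser

class RelAuthorParser(HTMLParser):
    """Collect hrefs of anchors carrying rel="author"."""
    def __init__(self):
        super().__init__()
        self.author_hrefs = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "a" and "author" in (a.get("rel") or "").split():
            self.author_hrefs.append(a.get("href"))

# A fragment of an article page using authorship markup:
page_html = '<a rel="author" href="https://plus.google.com/12345">Tom</a>'
p = RelAuthorParser()
p.feed(page_html)
print(p.author_hrefs)  # ['https://plus.google.com/12345']
```

Full verification would then fetch the Google+ profile and confirm it links back to the site.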

We're seeing the graphs merge into a new Web Graph augmented by author data, where some links are explicitly authored links:

WebGraph showing Authored Links

With this model it isn't: 'Distilled linked to SEOmoz' but it is 'Tom Anthony linked on Distilled to Rand Fishkin on SEOmoz'. It's the first time there has been a robust mechanism for this.

This is incredibly powerful for Google as it allows them to do exactly what I mentioned above – they can now verify the author of a web page. This gives two advantages:

  • Knowing this is an authored link, by a human who they have data about, they can place far more trust in it. It's likely that a link authored manually by a human is of higher quality, and that a human is unlikely to claim responsibility for a link if it is spammy.
  • Furthermore it allows them to change the weighting of links according to the AuthorRank of the author who placed the link.

The latter point is very important: it could impact how links pass link juice. I believe this will shift the link juice model towards:

AuthorRank x PageRank = AuthoredPageRank

I've shown it here as a simple multiplication (and without all the other factors I imagine go into this), but it highlights the main principle: authors with a higher AuthorRank (as determined by both their social standing and by the links coming into their authored pages, I'd imagine) give a bigger boost to the links they author:

Examples of Authored PageRank

The base strength of the link still comes from the website, but Rand is a verified author who Google know a lot about, and as he has a strong online presence, he multiplies the power of links that he authors.

I'm a less well-known author, so I don't give as much of a boost to my links as Rand would. However, I still give links a boost over those from anonymous authors, because Google now trust me a bit more. They know where else I write, and that I'm active in the niche and on social networks.
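The principle can be sketched as a toy calculation (the scores are invented for illustration; Google's real weighting is unknown):

```python
# Toy illustration of the AuthorRank x PageRank idea above.
def authored_page_rank(page_rank, author_rank=1.0):
    """Anonymous links keep author_rank = 1.0, i.e. no boost."""
    return page_rank * author_rank

base = 5.0  # invented base link strength from the website
print(authored_page_rank(base))       # anonymous author: 5.0
print(authored_page_rank(base, 1.2))  # lesser-known verified author
print(authored_page_rank(base, 2.0))  # well-known verified author: 10.0
```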

Where to Who

So what does all this imply that you do? The obvious things are ensuring that you (and your clients) are using authorship markup, and of course you should try to become trustable in the eyes of Google. However, if you're interested in doing that stuff, you probably were already doing it.

The big thing is that we need a shift in our mindset from where we are getting links from to who we are getting links from. We need to still do the traditional stuff, sure, but we need to start thinking about ‘who’ more and more. Of course, we do that some of the time already. Distilled noticed when Seth Godin linked to our Linkbait Guide. I noticed when Bruce Schneier linked to me recently, but we need to begin doing this all in a scalable fashion.

With OpenSiteExplorer, Majestic and many other link building tools, we have a wide array of ways to look at where we are getting links from in a scalable way.

I hope I've managed to convince you that we need to begin to examine this from the perspective that Google increasingly will be. We need tools for looking at who is linking to who. Here's the thing – all the information we need for this is out there. Let me show you…

Authored links – A data goldmine

We'll examine an example post, 'Dear Google', from Gianluca Fiorelli that he published in December. Gianluca is using Google's authorship markup to highlight that he is the author of the post.

Let's take a look at what information we can pull out from this markup.

The rel author attribute in the HTML source of the page points to his Google+ page, from there we can establish a lot of details about Gianluca:

Authorship Markup leads to Google+ profile info


From his Google+ profile we can establish where Gianluca lives, his bio, where he works, etc. We can also get an indicator of his social popularity from the number of Circles he is in, and by examining the other social profiles he links to (for example, following the link to his Twitter profile and seeing how many Twitter followers he has).

We've talked a lot in the industry in the last couple of years about identifying influencers in a niche, and about building relationships with people. Yet, there is an absolute abundance of information available about authors of links we or our competitors already have — why are we not using it!?!

All of this data can be crawled and gathered automatically, exactly in the way that Google crawls the authorship markup, which allows us to begin thinking about building the scalable sorts of tools I have mentioned. In the absence of any tools, I went right ahead and built one…

AuthorCrawler – A tool for mining Author Data for Linkbuilding

I first unveiled this tool a couple of weeks ago at LinkLove London, but I'm pleased to release it publicly today. (As an aside, if you like getting exclusive access to cool toys like this then you should check out SearchLove San Fran in June or MozCon in July).

AuthorCrawler is a free, open-source tool that pulls the backlinks to a URL, crawls the authorship markup on the page, and gives you a report of who is linking to a URL. It is fully functional, but it is a proof-of-concept tool, and isn't intended to be an extensive or robust solution. However, it does allow us to get started experimenting with this sort of data in a scalable way.

When you run the report, you'll get something similar to this example report (or take a look at the interactive version) I ran for SEOmoz.org:

AuthorCrawler Single URL report

It pulls the top 1000 backlinks for the homepage, then crawls each of them looking for authorship markup, which, if found, is followed to crawl the author's data (number of Circles, Twitter followers). Very importantly, it also pulls the 'Contributes to' field from Google+, so you can see where else this author writes. You might find people linking to your site who also write elsewhere, possibly on more powerful sites; these are great people to build a relationship with – they are already aware of you, warm to you (they're already linking) and could provide links from other domains.

You can sort the report by the PA/DA of where the link was placed, or by the social follower counts of the authors. You can also click through to the authors' Google+ and Twitter profiles to quickly see what they're currently up to.

I'm pretty excited by this sort of report and I think it opens up some creative ideas for new approaches to building both links and relationships. However, I still felt we could take this a little bit further.

I'm sure many of you will know the link intersect tool, in the labs section of SEOmoz. It allows you to enter your URL, and the URLs of other domains in your niche (most likely your competitors, but not necessarily), and it examines the back links to each of these and reports on domains/pages that are linking to multiple domains in your niche. It also reports whether you currently have a link from that page, so you can quickly identify some possible places to target for links. It's a great tool!

So, I took the principle from the link intersect tool and I applied the authorship crawling code to create an Author Intersect tool. It will give you a report that looks like this (you can check the interactive example report also):

Multi URL report from AuthorCrawler tool

Now what you have is really cool – you have a list of people who are writing about your niche, who are possibly linking to your competitors, whose social presence you can also see at a glance. These are great people to reach out to and build relationships with – they are primed to link to you!
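The intersect principle behind such a report can be sketched in a few lines (assuming the author profile URLs per domain have already been gathered by crawling rel=author markup; the profile URLs below are hypothetical placeholders):

```python
from collections import Counter

def author_intersect(your_authors, competitor_author_sets, min_overlap=2):
    """Return authors seen on min_overlap or more competitor backlink
    profiles, excluding those already linking to you."""
    counts = Counter(a for authors in competitor_author_sets for a in authors)
    return {a for a, n in counts.items() if n >= min_overlap} - your_authors

yours = {"plus.google.com/111"}
competitors = [
    {"plus.google.com/111", "plus.google.com/222"},
    {"plus.google.com/222", "plus.google.com/333"},
]
print(author_intersect(yours, competitors))  # {'plus.google.com/222'}
```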

The tool is pretty simple to use – if you're unsure there is an instructions page on the site to get you started.

Wrap Up

We are in the early days of authorship, but I think Google are going to keep on pushing Google+ hard, and I think authorship's importance is just going to increase. Correspondingly, I think tools such as this are going to become an increasing part of an SEO's toolkit in the next 12 months, and I'm excited to see where it goes.

I've only just begun to dig into the ways we can use tools like these – so I'd love to hear from others what they get up to with it. So go and download the tool and try it out. Have fun! :)



9 Tangible Linkable Asset Ideas and How to Build Links to Them


Posted by kaiserthesage

This post was originally in YouMoz, and was promoted to the main blog because it provides great value and interest to our community. The author’s views are entirely his or her own and may not reflect the views of SEOmoz, Inc.

When I started to work as an SEO for an Australian-based SEO agency in early 2010, I didn't know anything about the work (optimizing websites and building links to them) and was definitely unsure of most of the things I worked on during that time.

All I did was follow all the instructions given to me, build links in volume and research/learn all the basics of SEO from scratch. I got the hang of it after a couple of months, and I thought that I was doing great. Then I got fired.

I guess it sounds like a tragic story, but it's not quite true, since I was immediately hired by Affilorama and Traffic Travis right after getting ditched by my former employer. Fortunately, this led me to getting acquainted with the work of Ross Hudgens, Garret French and Wil Reynolds in mid-2010 – the people in this industry who have really influenced my thinking on SEO, particularly on scaling almost all the processes and methodologies involved in optimizing a website, which certainly includes building and promoting “linkable assets”.

So let’s head over to the main topic of this post (sorry for the long introduction) and start by defining what a linkable asset is. Basically, a linkable asset is any part of a website or organization that its target audience will genuinely perceive as worth citing or referencing. It could be people, content, events or anything that is really interesting to a specifically targeted market.

This aspect of a website is so important to any form of online marketing campaign, especially these days, seeing as these materials are able to benefit a site/brand in so many ways, such as:

  • Continuously attract links to the domain
  • Strengthen the site’s online brand presence (substantiate the brand’s authoritativeness)
  • Generate more interested/fascinated brand followers and leads for the business
  • Become more visible through search and social channels (and yield more traffic to the site)

To give you a clearer picture of how linkable assets work, I’ll give several samples below as well as the link building methods that you can implement to promote each type of content.




How to build links to online Award-giving Bodies:

  • Provide embeddable widgets – Offer widgets that the award’s nominees, finalists and winners can use and embed to their sites/blogs, which will link back to your site.
  • Get press mentions – find columnists and authority bloggers who will most likely be interested to cover your online event (particularly those who write about your business’ industry). Engage and pitch a newsworthy angle about your upcoming event. For a more in-depth guide on pitching news to authority news sites, you can check out Chris Winfield’s recent post on getting press coverage.
  • Reach out to content curators – identify the top curators in your industry, probably bloggers who have published lists of top blogs and resources in your field. Contact these people and ask if they’d be interested in writing up your event, or offer to do a guest post for them.
  • Leverage social sharing to nominees, members and/or winners – encourage participants to share their entry, as the more your content gets across their network and audience, the greater chances of getting second wave coverage/links from small and medium-sized blogs.

News Voting Feature



A news voting feature is best built on top of already-existing communities that have a strong following, like industry-specific forums and blogs, since they already have users who can regularly submit articles and contribute to discussions. It’s also a great way to engage an existing community, seeing that you can incentivize the approach by allowing your community to promote their own content within the site.

How to build links to a news voting section of a site:

  • Get press coverage – as always, getting links from news sites that have strong readership can help drive massive traffic to your site, especially in its launching stage, and can eventually bring more natural link acquisition opportunities from bloggers in your industry who might write about your site’s news voting section. Track and make a list of the people who’ll share the news articles about your launch, and segment those who have blogs, as you can also reach out to these people and ask if they’ll be interested to link to your news voting page.
  • Embeddable widgets for top members – you can also choose to offer widgets to your active members to generate more links to your site.
  • Acquire links from industry resources pages – Find resources pages in your industry and offer your news voting section to be included on their list of resources (you can start with queries like “keyword news” + inurl:resources). Given that this area of the site will be mostly user-generated, your link requests will have higher chances of getting approved.
  • Get blogroll links – start with blogs that have already linked to your site in the past and with individuals you have already connected with, and pitch the idea of including your news voting page in their blogroll. Psychologically, the request will have more impact, since the page will surely be offering fresh pages/articles about your industry from around the web (which means the page is able to offer real value to possible click-through visitors).

Free Learning Tools and Extensive Lessons



How to build links to free lessons:

  • Contextual links from externally distributed content – cite your extensive free lessons whenever you contribute to other blogs through guest blogging. Place the links within your guest posts’ content and always vary your links’ anchor texts. You can also link to them through the other formats of content you distribute, such as free whitepapers, slide presentations and newsletters.
  • Push content via social media – increase awareness by launching a social media campaign for your free lessons. The more people discover the content, the more editorial link opportunities it can generate. You can start with a StumbleUpon marketing campaign through paid discovery, or by promoting the shared links through su.pr to increase unique pageviews to your free lessons.
  • Linker Outreach – make a list of known linkers and social sharers in your industry and let them know about your free course. You can easily identify these people by tracking your competitors’ social and link data, particularly from your competitors’ strong content. To learn more about this method, you can check out this guide on linker outreach that I wrote several months ago.
  • Request links from .edu sites – this type of material will almost always get higher response rates when pitched to .edu sites, since the offered content provides high-value information. Search for .edu sites (e.g. “keyword resources” site:.edu) that might be interested in adding your lessons to their resources pages.
  • Build links through community discussions – search related forums and Q&A sites for questions that relate to the information supplied by your lessons. Link to your free lessons page when contributing to these highly relevant discussions and let the link serve as a reference.
  • Get featured in other bloggers' newsletters – if you’ve done your homework and have managed to build relationships with bloggers in your field who have a substantial number of email subscribers, then pitching to have your lessons featured in their newsletters is a very feasible idea. Encourage their audience to take a look at your site, and try to retain them once they land on your free lessons page.
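Prospecting queries like the “keyword resources” site:.edu footprint mentioned above can be generated in bulk rather than typed one by one. Here's a minimal Python sketch of the idea; the keywords and footprints below are illustrative placeholders, not from the original post:

```python
# Build advanced search queries for link prospecting.
# Keywords and footprints are illustrative placeholders.
keywords = ["seo", "link building", "content marketing"]

# Footprints commonly used for .edu / resources-page prospecting.
footprints = [
    '"{kw} resources" site:.edu',
    '"{kw} links" site:.edu',
    'intitle:"{kw}" inurl:resources',
]

def build_queries(keywords, footprints):
    """Return one search query per (keyword, footprint) pair."""
    return [fp.format(kw=kw) for kw in keywords for fp in footprints]

for q in build_queries(keywords, footprints):
    print(q)
```

Each generated query can then be run manually (or fed into a rank-checking tool) to produce a prospect list to work through.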

Video Series


How to build links to a page with series of videos:

  • Embed and incorporate videos when submitting guest blogs – this will make your guest posts look more comprehensive, and it also gives you the right to link back to the category or main page where you host your videos.
  • Promote via StumbleUpon – this social platform is home to millions of cerebral, social media-savvy users who basically know how social media works, so you’ll definitely want to get a page filled with high-quality videos in front of them. You can invest $20 – $100 in paid discovery just to jumpstart your social media campaign, and can probably expect to see your pageviews multiply if you’ve positioned your social buttons well to act as obvious CTAs. The more exposure the content gets from these types of viewers, the more link acquisition opportunities your page earns.
  • Track the links and social shares from your competitors’ videos – you can use tools like Topsy and Ahrefs to identify the sites and Twitter profiles that have shared their content. List these people/blogs and try to get in touch with them, then ask if they’d be interested in seeing your videos and perhaps sharing and/or linking to them as well.

Job Boards



How to build links to Job Boards:

  • Blogroll links – many independent blogs publish tutorials to help their readers learn, earn and perhaps land a job, so asking them to link to your site’s job board is both reasonable and relevant. Start with blogs that have already linked to your site in the past, as these blogs are already aware of your brand and to some degree trust you as a resource in your field of expertise. You can eventually expand to your other link/blog prospects as you build relationships with them.
  • Acquire links from those who are posting job offers on your site – some of these businesses could be good link/content partners for your site, so it’s best to build relationships with them as well.
  • Encourage visitors to socially share their entry or the job board page – building social signals is quite important these days, as it will not just help in making the page more visible through search and social, but it also denotes high-activity and usage of the page.
  • Request links from .edu sites – there are tons of .edu sites that list job vacancies/openings from different companies, primarily to make it easier for their students to find jobs right after they graduate. Use Google Search to find job resources pages on .edu sites and make contact to ask if your site’s job board can be included on their resources page. Specificity is the key to getting high approval rates on your link requests, so ensure that the jobs offered on your page will bring value to the page you’re trying to get a link from.

Graphic design jobs

Bonus tip: you can use this scraping method and CitationLabs’ contact finder to easily extract contact details for each of your target .edu sites, since these pages genuinely welcome links to job listings.

Data Visualization


How to build links to these types of rich-media content:

  • Create news through your data and pitch the story to news sites and authority blogs – journalists and top bloggers love data and numbers. If you can do extensive research on your industry and produce stats that support a newsworthy story, you can improve your chances of getting solid links from authority domains just by presenting your data to columnists/bloggers who specifically write about your industry.
  • Offer embed codes – make it easier for others to copy and embed your rich-media content on their own blogs (with an embed code that links back to the original source of the content – your site).
  • Feature it on your guest blogs to increase approval rate – you can also build more content that supports the data/information provided by your infographic/video and submit those as guest blogs, along with your infographic/video embedded within your guest entry. This will then amplify the reach of your data, as more brand signals will be sent out to people (your blog prospects’ audiences) who will be able to see your contributed content.
  • Promote heavily through social media – reach out to known influencers in your industry and ask for feedback, or ask whether they can share your content on social networks (Facebook, Twitter, Pinterest, etc.). It's important to evaluate whether your content is really compelling and share-worthy before sending your pitch.
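The “offer embed codes” tip above usually means giving readers a copy-paste HTML snippet whose image links back to your page, so every embed becomes an attributed link. Here's a rough Python sketch of how such a snippet might be generated; all URLs and names are hypothetical placeholders:

```python
from html import escape

def embed_code(image_url, page_url, title, site_name, width=600):
    """Generate a copy-paste embed snippet that credits the original
    source with a link -- the mechanism behind 'offer embed codes'.
    All URLs and names passed in are hypothetical examples."""
    return (
        f'<a href="{escape(page_url)}">'
        f'<img src="{escape(image_url)}" alt="{escape(title)}" width="{width}"></a>'
        f'<br>Infographic by <a href="{escape(page_url)}">{escape(site_name)}</a>'
    )

print(embed_code(
    "http://example.com/images/industry-stats.png",
    "http://example.com/industry-stats-infographic",
    "Industry Stats", "Example.com"))
```

Display the generated snippet in a small textarea beneath the infographic so readers can grab it in one click.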

Coin a term

Inbound Marketing


Coining your own industry term or technical terminology is a form of thought leadership, and it’s definitely a linkable asset: people will give credit to your brand whenever they use the term you created. That’s why it’s imperative to build a definition page for the term(s) you’re planning to coin – one that clearly defines the meaning, usage and history of the word – so you can own it in the SERPs.

How to build links to your technical terminology’s definition page:

  • Use it frequently when distributing content externally – use the term and make it link back to your term’s definition page (hosted within your domain) when you’re submitting guest posts to other blogs, participating on community discussions and distributing free downloadable ebooks or slide presentations.
  • Create a Wikipedia page for your industry term – use your definition page as well as other high-authority pages/articles that have used the term as references.
  • Set up Google Alerts for your term – track blogs/sites that might use your term through Google Alerts, and try to ask for link attribution whenever you see it getting mentioned by other sites (if it’s not linking back to your definition page).
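The Google Alerts tip above boils down to one check per alerted page: does it mention your term without linking back to you? Here's a rough heuristic sketch in Python (not a full HTML parser; the domain and sample markup are made up for illustration):

```python
import re

def unlinked_mention(html, term, your_domain):
    """Return True if the page mentions `term` but contains no link
    to `your_domain` -- i.e., a candidate for a link-attribution
    request. A rough heuristic, not a full HTML parser."""
    mentions_term = term.lower() in html.lower()
    links_to_you = re.search(
        r'href=["\'][^"\']*' + re.escape(your_domain), html, re.I)
    return mentions_term and links_to_you is None

# Hypothetical page captured from a Google Alert:
page = '<p>We love the idea of inbound marketing.</p>'
print(unlinked_mention(page, "inbound marketing", "example.com"))  # True
```

Pages that come back True go on your outreach list for a polite attribution request; pages already linking can simply be thanked.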

Extremely Useful Apps and Browser-based Tools

Open Site Explorer


How to build links to Web-based tools:

  • Every major tool version update is newsworthy – if your site offers free web-based tools, you should take advantage of major updates, as you can publicize them through content distribution (press releases and blog posts). Google does it – why shouldn’t you?
  • Get links from bloggers (experiential reviews) – reach out to highly relevant blogs and see if they’d be interested in trying out your tools. Provide them with all the resources they might need to understand how your tool works, as this can make them more interested in writing about it. You can also check this list of alternative blogger outreach techniques to improve your chances of acquiring links from them.
  • Obtain links from list pages (top and best resources/tools in your niche) – find pages that list the best tools and resources in your field. Engage the publisher of the content and invite them to try out your tool. Send a link request if they’re satisfied. You can also use the broken link building method to speed up the process of acquiring links from these list/resources pages.
  • Guest blogging – write advanced tutorials on using your tool and/or on how it can improve its target users’ productivity, and then submit it to high-traffic and highly relevant blogs. Use strong calls-to-action on these guest entries, to have better chances of absorbing and converting their readers.

Custom Categories

Eric Ward's Best Practices


Custom categories, or high-quality resources pages, can easily attract links, since they contain links to highly useful pages; visitors who find the curated links valuable will more often than not save, share or bookmark the page.

This type of page also has a greater chance of achieving high search rankings for industry head terms, since its content is absolutely relevant to them (both the internal and external links it hosts and the anchor texts used pertain to thematically related subtopics).

How to build links to custom categories:

  • Guest blogs – build contextual links to your custom categories through your guest blogging campaign.
  • Interviews – link to it whenever you get the chance to be interviewed by other bloggers, given that it’s a good page to refer their readers to: a place where they can see almost all of your published works in one spot.
  • Author, Social and Forum Profiles – building links through your external profile pages (from other web communities) is also a great way to make this page more visible to your target audience. This will also allow search engines to regularly crawl the links in your custom category/resources page (as well as the new links that will be continuously added to the page).
  • Constantly drive new traffic to gain more natural links – in my experience, once a page consistently generates new visitors (for instance, when it ranks highly for its targeted head terms), it naturally attracts and acquires more links.

Finding possible linkable assets

There are also other types of web content that could possibly fit as a linkable asset that you can work on for your link development campaign. It could be a well-researched blog post, crowdsourced content, a forum thread, or even sales/product pages.

You can simply find and identify these strong pages resting within your site by assessing and sorting your site’s pages by:

  • Most linked pages or pages that are naturally attracting links (via Google Webmaster Tools)

Google Webmaster Tools

  • Most visited pages with high user-activity, particularly from search engines (via Google Analytics)

Google Analytics

Once you have identified pages that can help you build more links with minimal effort (by constantly bringing in targeted visitors who have a high probability of sharing or linking to the page), start enhancing these pages to strengthen their ability to automate a fraction of your link building process. Enhancements could target areas/elements of the page such as:

  • Design
  • Usability
  • Length of content
  • Call to action
  • Sociability
  • Internal links to the site’s other important pages
  • More inbound links to the page

It’s also best to understand the linking behavior around your newly discovered assets (or even your competitors’ linkable assets). Know why people naturally link to them, so you’ll have more ideas about how to replicate the approach for this content as well as for your site’s other potential linkable assets.

Discerning the natural linking activity around your pages will also enable you to create powerful outreach templates for building more solid links to them, as you’ll be able to weigh the value that resonated with your previous linkers, which can then be elaborated into the value proposition of your outreach copy.

Prolong the purpose of the content

Optimize for search

Optimize the page to target industry-specific keywords. It will have a better chance of competing for tough keywords, since you’ll be working to drive powerful links to the page, and the page itself is capable of attracting links (natural linkers will mostly use the content’s title as anchor text when linking to it).

Always Test and Update calls-to-action

This is vital, especially if your site’s strong, link-worthy pages are constantly driving new traffic to the site: you can change their calls-to-action whenever you have new offers and/or products, which will allow you to convert new visitors more effectively.

Brand strengthening

Let the continuously driven traffic to the page know who created the content. Highlight brand and trust signals on some parts of the content to improve brand retention.

Social CTA to force multiply social sharing

Make the content’s social buttons very visible so the page continuously gains social shares as it attracts new visitors (from search engines and other referring sources).

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


#SocialSuccess – An Inbound Marketing Case Study for B2B

Comments Off

Posted by searchbrat

This post was originally in YouMoz, and was promoted to the main blog because it provides great value and interest to our community. The author’s views are entirely his or her own and may not reflect the views of SEOmoz, Inc.

There has been a lot of great discussion about the term “inbound marketing” of late and exactly what is covered by that phrase. For the purposes of this case study we are using the hubspot definition of inbound and outbound marketing. The following is a case study of how we (Salesforce.com) used inbound marketing along with social advertising and great retargeting to grow both our traffic and leads in the UK. Whether you are in B2B or B2C marketing, this case study should be relevant to you and your markets.

The new B2B Purchase Journey

The online landscape for marketers is changing at a rapid pace. People don’t buy the way they used to. There is a new purchase journey with three key elements:

  1. Search-initiated – Most people begin their research of a new product via search engines; 78% of Internet users conduct product research online (source: Pew Internet & American Life Project, May 2010)
  2. Social-powered – The growth of social networks has meant we can now tap into our own external networks for recommendations. Twitter’s active user base alone generates 90 million tweets per day, and 24% of adults have posted comments or reviews online about the things they buy. We now have a lot of user-generated content to review before making a purchase decision.
  3. Buyer controlled – People can now choose where and when to engage with your brand, plus what content they would like to consume. You have to produce marketing strategies they choose to engage with.

For B2B companies this means their sales people are being engaged a lot later in the purchase cycle, which presents marketing with a great opportunity to become an integral part of the overall sales process.

"Get Found"

Considering the above, we decided to run a pilot project in the UK around the concept of “Get Found” (coined by Brian Halligan of HubSpot). Our aim was to get found by the people who are actively looking for help with the kinds of issues we address. We would do this by harvesting our own expertise into content that helps our prospects do their jobs better.

Since the core mediums involved in this project were search, social and content, we needed to consider how these different tactics are starting to converge and try to hit our sweet spot.

Inbound Marketing Sweet Spot

To do this we needed to answer three key questions:

  • What do our prospects care about?
  • How can we harvest our expertise to help?
  • How can we get this content to market now?

Our Answer – “Content Rich Microsite”

When discussing microsites, a lot of people probably conjure up images of those used in new product launches (which have a very short life span) or those used to build elaborate link schemes. Our solution was to build a content-rich microsite filled with the kind of content our target market would value. One critical aspect of the project was the location of the site. If you look at the salesforce.com structure, you will notice we already have a lot of great blogs sitting on http://blogs.salesforce.com/company/. Since I am interested in EMEA, and in particular the UK for this project, I wanted the site to sit within our UK folder, so it would benefit from all the inbound links and social shares generated. To build our microsite strategy, we had to address six key points:

1. Personas:
Who would this site be for?

For me, persona development is the foundation of any good inbound marketing strategy. I am a massive fan of it: from the usability and design of your site to content development, personas ensure your strategy stays on target. In fact, one of the best link building posts I read last year involved a type of persona development. We ran an intensive persona workshop (with the help of iqcontent.com) that included people from marketing, sales and customer feedback, and came up with 5-6 profiles of users we were trying to reach.

We mapped these against different stages of the purchase cycle and segmented by company size. All of this would help us when it came to content strategy and promotion.

2. Theme:
What would be the overarching theme that would hold all of our content together?

We used Radian6 (our own social media monitoring tool), analytics and feedback from the personas to come up with “The Social-Powered Business”.

3. Topics:
How do we take that theme and break it down into specific topics we can generate content around?

For us, this was pretty easy; we looked at the areas of business where social media had the greatest impact (sales, customer service, collaboration and marketing). It’s also important that your topics and themes are aligned to your products (we are trying to generate leads after all).

4. Process:
Exactly where would this content come from and how would it be validated?

Getting people excited about the project is key. You need to have people who will help with content development, feedback and amends. We used our own collaboration tool Chatter to build an internal social network around the project that consisted of 56 people. All content development was driven through that group.

5. Resources:

Of course we needed to source budget and a team.

6. Metrics:
How would we measure success?

This is a really important part of establishing any successful strategy. Brand awareness is never a good enough metric; traffic, leads and pipeline are what count. We built a dashboard in Omniture with all key business metrics to measure our project.

The Launch – #socialsuccess

In 12 weeks we managed to develop:

  • Strategy
  • Personas
  • Website
  • 32 pieces of content

and our #socialsuccess site was launched on January 3rd, 2012.

The following five items were important in terms of making the launch of the site a success.

1. Content Types

For launch we chose four different categories from which we could generate content:

  1. Created: Original content that was created from scratch. These are obviously the most resource-intensive. They included things like an eBook, infographics, articles and SlideShare presentations.
  2. Curated: These are round-up style posts. Choosing a topic like social selling and pointing to the best resources from the web on this topic.
  3. Collaborative: We chose some of the best thought leaders around our topics and reached out to see if they would contribute some content.
  4. Legacy: One of the easiest ways companies can quickly scale their content for inbound marketing is to repurpose content they already have into different assets. For example, our Dreamforce event in San Francisco produces a huge number of expert presentations that are recorded over three days and put onto YouTube. We simply took the best videos and turned them into articles.

2. Product Messaging

Remember, this sort of content is not product-centric. Best practice for this kind of content is to follow the 80/20 rule – 80% non-product and 20% product; for launch we stuck to 90/10. Product references were used where they made sense, but only on a limited basis.

3. Promoting the site

If you build it, they probably won’t come unless you have an awesome promotion plan. Some of the things we did to promote the site were:

  1. Facebook/Twitter: Of course, all our best content was shared via our own Facebook, Twitter and Google+ pages
  2. Home page takeover: We took over the home page of our corporate site (www.salesforce.com/uk) to promote this new microsite
  3. Expert advocates: We collaborated with 15 experts for launch, who were kind enough to share our content with their networks.
  4. Email/Newsletter: We promoted the site launch to our UK email database and also created a newsletter called #socialsuccess Insider to keep connected with users who signed up via our eBook download.
  5. Guest Blogging: We did some guest blogging on relevant sites to promote #socialsuccess
  6. PR: We did some PR around some of the pieces we produced
  7. Employees: We galvanized our internal employees to share with their external networks

4. Outbound Marketing

We supported all our inbound marketing with great outbound tactics:

  1. Twitter: We ran sponsored tweets for our premium content (eBooks). We saw some really great CTR numbers for these; I highly recommend them.

  2. LinkedIn Banner Ads: We ran some advertising on LinkedIn targeted at our core personas developed above (LinkedIn has some great targeting options, like targeting by job title). Again, we saw a far higher CTR from these ads (those offering content) than from those just advertising a product.

  3. Google Display Network: We are currently rolling out the same type of ads (those offering our premium content) on GDN.

5. Experts

Reaching out to thought leaders in your market is a great way to produce some highly valuable content. We were lucky enough to have some great experts involved in the initial content, who shared their expert advice with our audience and were kind enough to share our content with their own.

The Results

The project was launched officially on January 3rd, 2012 and we have seen some great results already. The feedback we have been getting on our social channels about the content has been great.

But we have also seen great results in terms of our business metrics (keep in mind we are in B2B):

  • Traffic for January was up 80% YoY
  • Traffic from social sites was up 2,500%
  • We have over 6500 people signed up to our newsletter
  • Our eBook has been downloaded over 10,000 times (generating 10,000 leads)

Our inbound marketing experiment has really shown us how impactful this stuff can be. We are currently working on similar sites in France, Germany and also new topics sites for EMEA.

So it’s Onwards and Upwards!!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


6 Changes Every SEO Should Make BEFORE the Over-Optimization Penalty Hits – Whiteboard Friday

Comments Off

Posted by randfish

Having overly optimized web pages could soon get your websites in some hot water with Google and their search results. It has recently been announced that Google will start to penalize websites that engage in over-optimization practices.

In this week's Whiteboard Friday, we will be covering some changes that you should be making to your SEO practices in order to avoid this type of penalization.

We hope you enjoy and don't forget to leave comments below! Happy Friday Everyone!

Video Transcription

Howdy SEOmoz fans. Welcome to another edition of Whiteboard Friday. This week we’ve been hearing a lot of chatter in the SEO blogosphere, on Twitter and on the forums about this new potential Google penalty that’s coming down the line around over-optimization. Now, one of Google’s representatives mentioned at a conference, South by Southwest, down in Austin, Texas, about a month ago actually, that Google would be looking into penalizing over-optimized websites and folks who have engaged in over-the-top SEO.

There’s been a lot of speculation around when that’s coming out, whether that’s coming out. There are a few things happening, actually, this week and last night about, "Hey is this already something we’re seeing?" Seer Interactive, right, Wil Reynolds’ fantastic SEO company out of Philadelphia had this penalty, and people were wondering whether that was related to this. Not really sure.

But before this penalty hits, for goodness sake, SEO folks, let’s make these changes to our websites because we could be in real trouble if we don’t impact these things beforehand. I think these are some of the most likely candidates to be hit by Google’s over-optimization penalty, some of the most likely patterns they’re going to try and match against in this upcoming change. So let’s talk through them.

Number one, your titles need to be authentic. They need to sound real. They need to sound like a human being wrote them that was not intending necessarily simply to rank for phrase after phrase. I’ll give you a good example. Bad: web design services, web design firm space brand name, whatever your brand name is, web design. What does it sound like? It sounds like all you’re trying to do is rank for keywords, not show off your brand name, especially if this is your home page or those kinds of things. You’re repeating keywords three times. Web design is in this title three times. Think about whether a normal human being would read that title and think, oh yeah, that sounds legitimate. No, they’d think to themselves there’s something fishy here, something spammy, something’s wrong, something manipulative. Try instead, probably equally effective, if not more, brand name web design Portland Spiffiest Design Services. Now look, I’ve got the word "design services," which you wanted to get in here. I’ve got the city where you are that you’re trying to target, got brand name web design, right, sort of branding myself as the product and the keyword. Much, much better.

Try and look through your sites and see if this is a potential issue. I’ve seen tons of sites where SEO folks have just gone overboard again and again. Don’t get me wrong, I used to do this too. One of the crappiest things about this is that even if you rank, your click-through rates go down. You can rank in position two or three and be getting fewer clicks than the people below you, because people don’t think these are legitimate titles and perceive them to be manipulative, especially if you’re targeting more high-end, savvy or sophisticated technology customers.

Number two, manipulative internal links. I see this a lot on side bars, inside of content, where people have taken all of the instances of a particular word or repeated it throughout the side bar or in the footer, those kinds of things, and are pointing with exact match anchors to the same page over and over again. Now, we all know as SEOs that the first anchor text link counts and only one on the page is going to pass that value. Linking repeatedly to the same page with the same anchor is not helpful for SEO, and it makes our sites look really spammy and manipulative and questionable to someone who’s browsing it. Why would we want to hurt our conversion rates like this, and why would we want to point out to the engines that, hey, over here, I’m trying to manipulate you? What are you thinking? This is crazy.

Instead, go with logical, useful, change it up when you’re linking to pages, maybe a couple of times, in some spaces. You have a blog post and it mentions a page on your site that you want people to actually go to and that you think is useful in context. Great, link over there. Fine, use the anchor text. Maybe use a modified version of the anchor text, a little longer, a little shorter, a little more natural sounding, and you’re going to get these same results, but you’re going to do it in a much more effective way. You’re not going to be at risk of whatever is happening with this over-optimization penalty.
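One quick way to spot the repeated-exact-match-anchor pattern described above is to count (href, anchor text) pairs on a page. Here's a rough sketch using Python's standard-library HTML parser; the sample markup is invented, and real pages would need more careful handling:

```python
from collections import Counter
from html.parser import HTMLParser

class AnchorAudit(HTMLParser):
    """Count (href, anchor text) pairs on a page. Repeated exact-match
    anchors pointing at the same URL are the pattern warned about
    above. A rough sketch, not a production crawler."""
    def __init__(self):
        super().__init__()
        self.pairs = Counter()
        self._href = None   # href of the <a> we are currently inside
        self._text = []     # text fragments collected inside that <a>

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.pairs[(self._href, "".join(self._text).strip())] += 1
            self._href = None

# Hypothetical page fragment with a duplicated exact-match anchor:
html = ('<a href="/web-design">web design</a> some copy '
        '<a href="/web-design">web design</a>')
audit = AnchorAudit()
audit.feed(html)
for (href, text), n in audit.pairs.items():
    if n > 1:
        print(f"repeated anchor: {text!r} -> {href} ({n} times)")
```

Running this over your templates flags the sidebar/footer blocks worth cleaning up before any over-optimization penalty arrives.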

Number three, cruddy, link-filled footers. I see this all the time still. You’re just having a bunch of exact anchor links down in here that no one would actually really click and that come in lists. I often see them in light gray on light gray so that it’s not particularly easy to read. Use your footer wisely. Use your footer to link to the things that people expect to find in the footer. If you really need to get anchor text on pages, find natural ways to put it in the real menu at the top, or in the content itself. Don’t be trying to mess around and throw footer links site-wide across things. This is 2002 stuff, man. We’re ten years later. It’s at least a decade past that.

Number four, text content blocks built primarily for the engines. You know how sometimes you get to a page and there’s good content, usable stuff, an image, a call to action, and then weirdly there’s this block of junk. It’s this block of blah, blah, keyword, keyword, blah, blah, blah, keyword, keyword, blah, blah, blah. Why is that there? Why does that exist? Does that really work? Does that really trick the engines? Yeah, it tricks them into thinking that they should penalize you. Get that out of there. Rewrite that stuff, man. Seriously, this is going to cost you far more than it’s going to help you. If you’ve got those spammy blocks of text in your pages, that have no purpose other than to get your keywords or some keyword into the text, and it’s not actually helping anyone, it’s not a good call to action, it’s not helping your conversion rate, it will actually drive people away from you. Why are you trying to rank if not to get people to do good things on your site, and like your brand, and appreciate you and come back again and again, and tell their friends, and share it socially, and link to you? Don’t be putting this stuff in here. This is dangerous for all of those reasons, and super dangerous given this over-optimization penalty that’s potentially coming down the line.

Number five, backlinks from penalty-likely sources. So this is one of the toughest ones because it’s really hard to control if you’ve already gotten links from these places. But you can see with those 700,000 Google Webmaster Tools pings that they sent everybody that said, hey, it looks like you’ve done some manipulative linking, and that kind of thing. Be really careful with all of these: link networks, anything that says private link network, or I have a link network and I’ll place your site on it, or building up a network of sites that you then interlink to one another. Come on. There are so many better ways to get links. You’re putting a lot of time and effort and energy into building all of that stuff. You can do so many authentic things with that time. This is time terribly spent. Comment spam, especially the kind that’s sent through automated software blasts, so you think of your XRumer or your SENuke, the article marketing robot, or whatever, that’s going to submit your site to tons of places or find open holes in the web where they can leave comments and link spam and that kind of stuff. Forum signature links, this is actually one where I suspect it’s one of the places where Google really gets to know, hey, this guy clearly is a manipulative, black hat/gray hat SEO, because look, they’re pointing to the same site where we found all the link spam from forum signatures, particularly on webmaster sorts of boards. That clearly indicates that’s their site and they’re trying to rank for it, and all that kind of stuff. They’ve got a long profile, and they keep linking to all these things from their forum signatures. Just be very cautious about this. I’m not saying don’t link to it, but maybe don’t use your exact match anchor text, or try to make it more of a branding play, try and make it more authentic feeling. Certainly participating in communities is a great thing. Just watch that.

Reciprocal lists, right, people are emailing each other back and forth and saying, "Hey, I’ll put you on my list of links. You put me on yours. Oh, and we’ll do it 20 times and we’ll form this big reciprocal circle that’s going to get all of us penalized." How great is that?

Article marketing sites. I’ve talked about article marketing in the past. Generally when you see, hey, we’re an article marketing site and we can help you rank higher, and submit your content to us and we’ll link out, and the same is true for SEO-focused directories, anytime you see a site that is essentially extolling the virtues of participating there, or contributing there, as being primarily related to the link and the anchor text and the PageRank you’re going to get, you can bet your sweet hiney that Google does not want to count that. That’s exactly what they’re trying to prevent, and I’d worry, whether it’s this penalty or a penalty that Google makes in the future, that this is the kind of stuff that gets hit.

Last one, number six, large amounts of pages that are targeting very similar, kind of modified versions of keywords and keyword intents, with only slight variations, slight variation being the key here. So think:
used cars Seattle, used autos Seattle, pre-owned cars Seattle. Why are those three different pages? It sort of feels like keywordy, SEO-y spam, right, and then they’re pointing exact match anchors at all of these. This is the same page. You can target all three of these keywords very nicely on one page that’s called Used and Pre-owned Cars/Autos in Seattle. Right, one page, good, you’ve got it. You’ve combined all of the things. You want to have that great user experience there. You don’t want to have to build that three times. You’re not trying to build a bunch of spammy anchor texts to each one that’s pointing from each of the different ones. The used cars Seattle page has a link to the used autos Seattle page, and it’s sort of like, "What?" From a user perspective, "Why is that there? What is the difference between a car and an automobile exactly? I don’t understand why these two exist." This kind of thing is something where I think it’s a very clear pattern match that the engines can detect. Looks like they did some research and then just built a page for everything, and then they pointed links at all of them. It’s manipulative, right? This is the kind of thing, also, that will get you in trouble.
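To make the consolidation idea concrete, here’s a minimal, illustrative Python sketch of grouping keyword variants that should share a single page. The synonym map is purely hypothetical, a stand-in for whatever equivalences your own keyword research turns up:

```python
from collections import defaultdict

# Hypothetical synonym map; in practice you'd build this from your own
# keyword research, not from any official source.
SYNONYMS = {"autos": "cars", "automobile": "cars", "pre-owned": "used"}

def normalize(keyword):
    """Collapse synonym variants and word order so near-duplicates match."""
    tokens = [SYNONYMS.get(t, t) for t in keyword.lower().split()]
    return tuple(sorted(tokens))

def group_variants(keywords):
    """Bucket keyword variants that should share one consolidated page."""
    groups = defaultdict(list)
    for kw in keywords:
        groups[normalize(kw)].append(kw)
    return list(groups.values())

variants = ["used cars Seattle", "used autos Seattle", "pre-owned cars Seattle"]
print(group_variants(variants))
# All three variants collapse into a single group, i.e. one page.
```

The point isn’t the code itself but the decision it encodes: if two keywords normalize to the same thing, they deserve one page, not two.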

So, one, two, three, four, five, six. Six things you should change, and even though I’m not the Count from Sesame Street, you should still pay careful attention to these, because I’m super nervous that when this penalty comes out, there are just going to be so many webmasters and SEOs who are doing this kind of stuff, and I don’t know which one Google’s going to hit on this time and what they might hit on in the future. But I just want you to be okay. I want your sites to do well, and this is such bad stuff for user experience too. So please avoid it. Be careful. Good luck to you, and we’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Tell Us About Your Favorite Dashboard!


Posted by Karen Semyan

One of the recent water cooler conversations around the Mozplex has been about dashboards. The question: What makes a great dashboard? We all use these top-level reports in various apps every day, for professional and personal reasons, and some are better than others.

At their best, these reports can do an amazing job of making our work more efficient. You check the dashboard, review your progress, gather some insights, and know what to do next. Etta James cues up, the clouds part, sunshine beams down on your desk, and a unicorn gallops in slow motion past your office window. 

But at their worst, dashboards are lacking in useful info, cluttered, or convoluted. They amount to one more click between you and the real details you need in an app, adding to the clown-car cycle of chasing down your next actions. 

So we put the question to you…

What dashboards give you your “At Last” moment? Or are at least useful? What features on those dashboards are the most useful?  

Take a moment to fill out this survey and share your thoughts.

To get you thinking about this, here are some favorites from Mozzers, in no particular order:


Rand says: Beautiful UI/UX, fun to look at, colorful, bleeding edge. 
Miranda says: Clean design, interesting use of typography, and nice supporting visuals.

WebTrends sample dashboard


Courtney says: It’s super detailed and yet, I know what to do at first glance. The yellow, green, red indicators show my progress and warn me when I’m approaching or over budget. Alerts at the top of the page provide insights into how I’m doing and what I can do better. Goals provide easy benchmarking. This holistic view paints the entire picture in a way that is easy to digest and suggests actions, and I love that I can dig deeper into any of these topics with a single click (or two).

Mint dashboard

New Relic

Thomas says: I get quick access to recent over-time data for the most important metric in a way that can be dissected easily. A statistical score for most important metrics, plus traffic. You can change the timeframe quickly. They provide alerts, have nice use of color, and use consistent help-hovers.

New Relic dashboard sample


Adam says: It’s perhaps not the most beautiful dashboard, but it’s broadly customizable. There’s something to be said for a big bold dashboard that shows off your key daily metrics in big bold type.

GeckoBoard sample


Joanna says: For me a dashboard needs to both summarize the movement of my data and suggest a next step. I think AdWords does a solid job, but I also find that paid marketing platforms in general do a great job of surfacing the changes I should prioritize investigating. For me it’s all about summarizing and prioritizing…and it being pretty, of course. Give me all that and I'm not going anywhere.


Rand says: It gets me all the info I need, and it’s customizable.


Joanna says: KISSInsights has test summaries and important info, all laid out very digestibly. 

More favorites include: 


Mixpanel dashboard



Please share your favorites with us!



Why Big Brands Get All the Breaks


Posted by Dr. Pete

If you live outside of the ivory walls of the Fortune 500, it can sometimes seem like Google gives big brands all the breaks. This isn’t just sour grapes – some examples are very public. When JC Penney and Overstock got a slap on the wrist for widespread and intentional link manipulation, it was hard not to feel slighted.

There’s been a lot of debate about how Google, both manually and algorithmically, may favor big brands, but I think the debate misses something more fundamental. Since the beginning of the internet, the eventual advantage of big brands was only a matter of time. This post is about why I think that advantage was inevitable, why it’s not going away, and what you can do to compete.

The Wild West

In the early days of the public internet, building a website was like heading into the Wild West – all you needed to stake your claim was a wagon and a frontier spirit, as long as you survived the cholera, dysentery, starvation, and bear attacks (i.e. learning HTML)…

Lone home on the internet range

Sure, you didn’t get many visitors, but at least it was quiet and no one minded if you wallpapered your house with dancing hamsters. Then, along came the search engines. At first, it was great – the pioneers got all the visitors. With the allure of free land and free customers, though, the quiet didn’t last…

Internet settlers begin to arrive.

Much to the dismay of early adopters, it didn’t stop at a few neighbors. Pretty soon, people started to make real money online, and along came…

The Gold Rush

Big brands didn’t rush to the internet early on because they simply didn’t have any reason to. They let the pioneers do the hard work of drawing the maps and clearing the brush, until the first prospector discovered gold. When online-only brands started to draw sky-high IPOs and generate ad revenue, the big brands took notice, and the dot-com bubble started to inflate…

Big brands take over - "Ma, get my gun!"

Before this becomes a history lesson, let me cut to the point. The risks in any uncharted territory are often taken by the people who have nothing to lose, and that’s not the big brands. As soon as there was gold to be had, the companies with money and power made their move to claim it. The early movers had an advantage, but it wasn’t destined to last forever.

Googling for Gold

So, what does all of this have to do with Google?  While Google probably has made changes along the way that favor big brands (like 2009’s “Vince” update), I suspect that many of the changes in the search landscape really just reflect the broader evolution of the internet. In other words, as big brands followed the gold, so did Google.

Over time, signals that favor brand-building have naturally found their way into the algorithm. Let’s step back from any specific algorithm update and look at the progression of ranking signals since the early days of search engines…

1993+, On-page ranking signals, Weak brand influence

Declaring the “first” search engine is an argument waiting to happen, but I’m going to pin the launch of mainstream search around the time of Excite in 1993. The early engines relied almost exclusively on on-page ranking signals, like keywords in page titles, content, and (at the time) META tags. This leveled the playing field for a lot of small businesses, as anyone could create content that was keyword-targeted. Big brands could exert their influence by spending more money, but the direct influence of their brands on on-page signals was fairly weak.

Of course, the downside of on-page signals is that they were also easy to game, and the early search engines suffered from a lot of spam and quality issues. Then, along came Larry and Sergey and their PageRank algorithm, which relied on links to rank websites. In 1998, Google officially launched to the public…

1998+, Links as ranking signals, Medium brand influence

Link-based rankings gradually gave big brands more of an advantage – their offline presence naturally led to news articles and write-ups, and they began to collect strong link profiles. I call this influence “Medium” because it was mostly indirect. Link buying was (and is) strongly discouraged, so big brands had to work through one-off channels, such as viral marketing.

What’s important to note here is that Google didn’t create PageRank and the link-graph specifically to hand big brands an advantage. They created PageRank as a response to the declining quality of search results powered only by on-page signals.

In 2009, with the success of social media sites like Twitter, Google launched real-time search. Soon after, both Google and Bing would begin to integrate social signals into the algorithm…

2009+, Social signals, Strong brand influence

While the impact of social signals on ranking is still evolving, these signals are directly influenced by the power of a brand. Offline advertising drives brand awareness and mentions and this directly leads to social media activity. As social mentions begin to affect ranking more and more, brands now have a direct channel for their influence to impact SEO.

Step 1 – Get Over It

So, what can you do about the advantage that big brands have in the evolving internet landscape? First, some tough love – you have to get over it. This was inevitable, and whether or not Google was complicit to some degree doesn’t matter. The internet was destined to reflect the offline world, and in the offline world big brands are rich and powerful. We had a nice run, but it was naïve to expect that to last forever.

Step 2 – Act Like a Brand

Ok, so Step 1 wasn’t very helpful. I see too many SEO situations where people obsess about the competition and what’s “fair” – it’s time to step back and learn from the big brands. If your entire focus is on a few on-page factors and manual link-building, you’ll live and die by the algorithm. Big brands are part of the public consciousness – they bombard us on multiple channels, and don’t put all of their eggs in the Google basket.

Obviously, you can’t spend billions of dollars simply trying to implant your brand in people’s brains, but you can tap into the brand awareness you already have. Somewhere, your product or service – if it’s at all decent – has fans and evangelists. Engage with them, reward them, and start thinking about your brand as more than just Top 10 rankings. Social media is a perfect place to start – stop just Tweeting links and begging for Likes and build relationships. In other words, stop focusing on the direct SEO impact so much and start looking at the health of your brand outside of search.

Step 3 – Be a Pioneer (Again)

Search is changing faster than ever. I’ve seen too many companies recently that rely on Google for their survival and have watched their rankings slip over the past year or two. Many of these are good businesses run by good people, but they’re also businesses who made good on SEO years ago and, at some point, started to coast. Meanwhile, the internet changed, the algorithm changed, and the competition changed. If you’re resting on your laurels from 2005, you’re in for a wake-up call. It may not be tomorrow, but it will happen, and it will happen quickly and without mercy.

The early movers had an advantage on the internet because they were willing to take risks that the big brands couldn’t. You can’t live forever in the glory days of being the first person to set up shop. It’s time to branch out again – get active on social channels, including new and unproven channels. Try out new tags and on-page approaches (like Schemas). They won’t all work, but when they do, you’ll be somewhere that the big brands aren’t yet. Your greatest power as a small to mid-sized business is agility. You can set up a social profile or add a few pages to your site without a committee meeting, budget approval, and 6 months of deliberation. That’s a 6-month head-start, but to get it you have to move now.


How to Improve Your Rankings with Semantic Keyword Research


Posted by neilpatel

This post was originally in YouMoz, and was promoted to the main blog because it provides great value and interest to our community. The author’s views are entirely his or her own and may not reflect the views of SEOmoz, Inc.

With Google’s Panda, Search Plus Your World, and Venice updates, the SEO landscape has changed in the last year alone. And while that means your SEO strategy will change too, there is one thing that remains the same…keywords.

Keywords remain important to your content and link strategies. 

But there is one change coming down the Google pipeline that will change keywords…semantic search technology and the human element.

What is semantic search?

Basically, semantic search is technology that tries to determine what users mean when they type in a certain keyword.

They explore the semantics of those words…or the meaning behind them.

For example, if someone typed in “laptop” do they mean:

  • That they want to buy a laptop?
  • Have one repaired?
  • Upgraded?
  • Are they even talking about a computer, but something entirely different?

In the real world most people don’t search with one keyword…additional keywords give additional clues.

Even then search engines aren’t always right. Google guesses because all it has to go on are the keywords you enter into their engine.

Semantic search will look at how those words relate to each other and look for clues on how you entered them…location being crucial.

Say you used Google search on your smart phone to find “laptop repair.” Semantic search will recognize your location via your GPS on your phone and deliver you “laptop repair” results based on location.

In other words, results related to “laptop repair + [your city].”
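The location-aware rewrite described above can be sketched in a few lines of Python. This is purely illustrative of the idea, not of anything Google actually runs; the intent list is a made-up stand-in for whatever signals a real engine would use:

```python
def localize_query(query, city=None):
    """Sketch of the rewrite the post describes: if a query looks like a
    local-service intent and we know the searcher's city, append it."""
    LOCAL_INTENTS = {"repair", "plumber", "dentist", "locksmith"}  # illustrative
    if city and LOCAL_INTENTS & set(query.lower().split()):
        return f"{query} {city}"
    return query

print(localize_query("laptop repair", city="Seattle"))   # -> "laptop repair Seattle"
print(localize_query("laptop reviews", city="Seattle"))  # unchanged: no local intent
```

The useful takeaway for keyword research is the same one the engine makes: some keywords carry implicit local intent, and your content should target the expanded, localized form.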

How does this impact SEO?

In your SEO campaigns semantic search means you will have to identify the right keywords based upon user intent in the real world…and then create content around those terms.

This is where the semantic keyword research comes in.

In the old world of keyword research, you simply sought out the keywords with the highest search volume. The meaning between your list of keywords and the content you created was equal. There was a one-to-one relationship.

“SEO strategy tools” meant “SEO strategy tools.”

In the semantic keyword world, you build a database of keywords that are full of meaning…”SEO strategy tools” could be used in four different contexts.

Your job is to figure out how.                                                   

What are the advantages of semantic keyword research?

Having a database of semantic keywords to create high-converting blog posts is one advantage…but there are other benefits, like improved CTR and PPC optimization, as Wordstream explained:

  • Higher click-through rate – Whether it’s in your emails, landing pages, text ads or blog posts, highly-targeted semantic keywords will improve your CTR. When all these elements of the conversion funnel are aligned semantically…your results will go through the roof.
  • Lower minimum bids – Higher relevance and targeted action with your content due to semantic keyword enhancements will lead to lower minimum bids on your PPC campaigns.
  • Higher quality score – When you achieve high relevance around your semantic keywords…the search engines will reward your content with a high quality score that leads to better rankings.

Tools you can use to find semantic keywords

Employing a semantic keyword plan is crucial. But how do you come up with those keywords? Here are five approaches to use…including recommended tools.

Using Google advanced search

Google’s advanced search results provide a quick way to generate some semantic keywords. Just type in a query like “laptop discount codes” and click “Show search tools”:

Then click “related searches”…

…and then you’ll see all of the semantic terms for “laptop discount codes”:

As you can see from the results above, when searchers think of “laptop discount codes” they are thinking in specific terms of a brand for the most part.

In other words, the term “laptop” was changed into a brand…giving you semantic options.

And don’t forget to use Google Instant for further ideas on keywords:

By the way, all those keywords are completely different than what you got in the “related search.”

Now let’s look at a reverse case of semantic keyword use. In this case we’ll look up the term “laptop repair”:

“Laptop repair” is synonymous with screen repair, brand-specific laptop repair, and even different ways of looking at repair, like “troubleshooting.”

And this is where it gets interesting. Look at the Google Instant version of “laptop repair” and you see this:

You get all the options related to location.

Keep in mind that your job isn’t simply to scoop all of these up. You have to decide what people are thinking when they search with these terms. In some cases it will be obvious…in others it won’t be.

That’s what semantic keyword research is all about.
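If you’d rather gather these suggestion-style keywords programmatically, the unofficial Google Suggest endpoint (the one browsers query for autocomplete) returns JSON you can parse. A hedged Python sketch; the endpoint URL and the response shape are assumptions that are unsupported and may change at any time:

```python
import json
from urllib.request import urlopen
from urllib.parse import quote

# Unofficial, unsupported suggest endpoint (assumption; may change).
SUGGEST_URL = "https://suggestqueries.google.com/complete/search?client=firefox&q={}"

def fetch_suggestions(seed):
    """Fetch autocomplete suggestions for a seed keyword (live network call)."""
    with urlopen(SUGGEST_URL.format(quote(seed))) as resp:
        return parse_suggestions(resp.read().decode("utf-8"))

def parse_suggestions(payload):
    """The endpoint returns JSON shaped like [seed, [suggestion, ...]]."""
    seed, suggestions = json.loads(payload)[:2]
    return suggestions

# Demonstrated on a canned response so the sketch runs without the network:
sample = '["laptop repair", ["laptop repair seattle", "laptop repair near me"]]'
print(parse_suggestions(sample))
```

Treat the output the same way the post advises treating related searches: raw material to interpret for intent, not a list to scoop up wholesale.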

You can refine your results with Google Insights for Search where you can narrow keywords down via categories, for instance.

Semantic keyword research with bookmarking tags

If you want to find out how some searchers think about keywords, examine how tags are used at social bookmarking sites like Delicious.

Here’s a search on their database for my blog QuickSprout:

As you can see, there are a total of 578 posts that have been bookmarked in Delicious.

To examine the tags that people use to bookmark that content…in a sense, seeing how people are viewing the content and giving you an inside track to their mind…look at how people created “Stacks,” “Links” and “Related Tags.”

You can perform the same process on new social bookmarking sites like Diigo, Pinterest and Licorize.

Again, it’s important to think through how to use these keywords and not just scoop them up.
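One simple way to work with bookmark tags once you’ve exported or scraped them is to tally tag frequencies into a rough cloud. A small Python sketch with made-up sample data standing in for a real Delicious or Diigo export:

```python
from collections import Counter

# Illustrative sample data: tags as exported from a bookmarking service.
bookmarks = [
    {"url": "http://example.com/a", "tags": ["seo", "marketing", "keywords"]},
    {"url": "http://example.com/b", "tags": ["seo", "analytics"]},
    {"url": "http://example.com/c", "tags": ["seo", "keywords", "blogging"]},
]

def tag_cloud(bookmarks, top_n=5):
    """Count how often each tag appears across bookmarks; a rough proxy
    for how readers categorize (and search for) your content."""
    counts = Counter(tag for b in bookmarks for tag in b["tags"])
    return counts.most_common(top_n)

print(tag_cloud(bookmarks))
# -> [('seo', 3), ('keywords', 2), ...]
```

The frequency ranking is the starting point; deciding what searchers mean by those tags is still the research step the post describes.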

Building a semantic keyword cloud through social monitoring tools

Using social media monitoring tools to track the mentions of your brand is pretty common and an effective way to stay on top of the competition and conversation.

Using these same tools for semantic keyword research is just as effective to build a cloud of keywords around a particular sentiment.

For example, in Social Mention, search for a keyword. Here I used “Virgin Atlantic.”

Then you can get a quick look at the top keywords being used around the brand.

Other social mention tools you can use to help you build a cloud of semantic keywords are TweetDeck, Raven Tool’s Social Monitor and Technorati.

Optimizing semantic keywords around trends

A great strategy to keep in front of the public and at the top of search engines is to optimize your semantic keywords around a trending topic.

This means you have to keep your eyes on high-volume topics. Here are the tools to do that:

  • Google Trends – This is the obvious place you should check first.
  • Ice Rocket – Search the latest buzz on blogs through this search engine dedicated to blogs. While you will see what’s hot on the blogosphere…one thing you won’t see is any older posts.
  • Trendrr TV – And if you want to keep on top of what’s hot on TV, then Trendrr is the place you need to look.
  • TweetVolume – While it’s offline for the moment, sign up to be the first to use this powerful tool to see popular keywords and trends on Twitter.

Gathering semantic keywords through social search

When it comes to researching on the social web, the first place I always look for keyword ideas is Twitter Search.

The way to use it is to look up a keyword like “SEO strategies.” Then look at what people are tweeting about that keyword phrase:

What you want to find is how people are using that term. So look at the words surrounding the keyword…and then decide how to use it to build your own semantic keywords.

OpenBook – This site will let you see what people are sharing on Facebook. And just like Twitter Search, look at the context on how the keyword is used to determine query intent:


There are dozens of tools out there you can use to build a semantic keyword cloud. Hopefully you understand the approach that I’m recommending so that you can then apply these principles to other tools.

Keep in mind that as much as semantic keyword research is about finding actual keywords you can use in your SEO campaigns…it’s just as much about building a complete profile of your target customer. And the better you can understand your target customer the better your campaign results will be!

What other tools do you use to create semantic keyword groups? 

About the author: Neil Patel is the co-founder of KISSmetrics, an analytics provider that helps companies make better business decisions.


ZOMG! Mozcon Agenda Now Live (and Early Bird Pricing Ends Friday)


Posted by randfish

Every year, our annual summer customer conference, Mozcon, sells out to capacity, and this year is shaping up to be no different. In my opinion this is largely due to the speakers and the format. The content is among the best I see each year because the conference has developed a reputation as a forcing function for "upping one's game" on quality of tactics and presentation delivery.

Actionable Tactics

I'm excited because over the past six years Mozcon has risen from a small training event to become something that influences and inspires me throughout the year. It makes me want to step up my game – in my writing, my presentations, my entrepreneurship and the work I do in the marketing field overall. I've seen it do that for hundreds of others, too, and it's the best reason I can give as to why you should be there.

I'm posting about it tonight because we've just announced the full speaker and presentation lineup:

Mozcon 2012 Agenda

This exceptional group of folks are those we've seen deliver consistently phenomenal talks at events around the globe and many are the writers who've delivered exceptional content here on the Moz blog and across the web.

Mozcon Inspiration

It seems almost hard to believe, but in the latter half of 2011, I felt so many of the seeds planted by great Mozcon talks sprouting in the blogosphere and social channels of the SEO/marketing world. From Avinash's presentation on analytics to Bob Rains' stories of moving from black hat to white hat to Wil Reynolds' mining of the social graph for link opportunities and Martin MacDonald's unforgettable look at scalable embeds, the sparks that flew at Mozcon caught fire time and again.

This Friday's the final day to get the early bird price of $699. I hope to see lots of you there!

p.s. FYI for those who didn't catch it, Google whacked a lot of sites over the past 48 hours. I expect we'll have a blog post up soon on the topic, but be sure to check out this good discussion started by Cyrus Shepard on G+.

p.p.s. If you can't make it to Mozcon, there are two other west coast events in June I highly recommend – SMX Advanced in Seattle (also usually a sellout show) and Distilled's first ever Searchlove on the west coast, in Berkeley, CA (at which I'm speaking).


Playbook for Success: A Hall of Famer’s Tactics for Women in Leadership


No matter the sport, we appreciate an athlete’s influence on a professional sports league. That influence is exponentially higher when an athlete has made trailblazing firsts that further a society as well as a sport.

Nancy Lieberman is exactly that trailblazer. An Olympic champion with sport-transforming success in women’s collegiate and professional basketball, Nancy achieved high regard among athletes from all sports. Her status in an early, developing field – and the atypical challenges that come with being at the forefront – demonstrates the best traits in managing steps toward success.

Her book Playbook for Success: A Hall of Famer’s Business Tactics for Teamwork and Leadership makes a splendid interpretation of those steps for businesswomen as well as an inspiration for those with a sports enthusiast’s heart.

The book focuses on leadership development and establishing teamwork, spiced with Nancy Lieberman’s career moments and personal insights. Nicknamed “Lady Magic” for her basketball prowess – she is also a friend of Magic Johnson – Lieberman can certainly offer solid observations on how businesswomen can maneuver to their strengths, particularly in male-dominated fields. She is the first woman to coach a men’s professional team, the Texas Legends – an NBA development team associated with the Dallas Mavericks.

Some of the tips may be familiar for those who have been in business a while, but do not take my statement as a pejorative one.  Lieberman’s comments strike a down-to-the-basics tone, a reflection of an athlete’s efficient mindset.  You’ll read insights that explain the benefits for those oft-said tips, a particular aid to budding entrepreneurs and young leaders.   See this quote regarding budgeting one’s own time as an example:

“Time management also involves managing your time when you aren’t at work so that you are ready to roll each and every day. “

Lieberman’s spin shows how such tips make an impact over life as well.  She touches on many subjects that women encounter through their professional lives. Here are a few other notable thoughts.

On responsibility: 

“You can’t force people to be responsible if he or she chooses to be unprepared and dodge his or her duties. You can, however, be prepared to manage the morale and complications that are certain to arise because of that one person’s lack of integrity.”

On women’s health:

“Believe it or not menopause is a major workplace issue these days, and it’s not going away anytime soon. As baby boomers continue to age it is going to become virtually pervasive through at least 2020….Don’t let these biological changes – or any change in life throws at you, for that matter – hinder your performance or how you’re perceived. Instead, prepare, gather knowledge, and understand your situation, so that you can cope with it constructively.”

On taking steps to develop your strategy:

“Homework comes in so many different forms these days – about your industry, your competitors, or trends in the market. If you have a feeling a certain strategy might work, research it….Homework also requires that you make phone calls and ask people who’ve walked a similar road what they’ve done in similar situations.”

One fascinating aspect of the book is Lieberman’s choice to seek mentorship from excellent athletes outside her immediate league, such as a demonstration with Michael Jordan and a phone call that led to a friendship with Muhammad Ali. Some people may consider it name-dropping while reading Playbook for Success, but I say imagine a walk in Lieberman’s shoes. As an Olympic champion who played women’s basketball before today’s WNBA existed, you would draw the respect, interest, and private insights of excellent mentors in other fields. That attraction to innovators is always evident, at least from what I’ve read or heard. Lieberman earned that trait, having “done her homework.” She also makes readers feel that they can earn it, too:

“Honestly, you do not have to be a former player or sports fanatic to learn how to use sports as a tool for winning in your business. Anyone can be a winner by having faith, self-confidence and the correct mind-set.”

Lieberman expresses her persona with a centered awareness. In the chapter “Game Time,” she reminds the reader “to enjoy the big wins, but reserve a little humility at the same time.” There’s also a glossary at the end that connects sports terms to business. It isn’t meant to talk down to anyone who isn’t sports savvy; rather, it reads as shared thoughts from a basketball maven who grew up playing in the tough pickup games of Harlem’s Rucker Park.

Playbook for Success is not a business-process book, but the ideas are flexible enough to fit your personal approach to a situation. They also speak to all businesspeople about being competitive when no trail has been established. To borrow from the sports glossary at the end, the ball is in your court to pick up this book and read how a true champion in any competition – on the field and off – thinks and behaves.

As Lieberman wrote:

“Remember, leaders take people not where they want to go, but where they need to go.”

From Small Business Trends

Playbook for Success: A Hall of Famer’s Tactics for Women in Leadership

