Welcome to My Blog

Here I share information about Google and SEO. Please check it out and share your suggestions on these subjects.

Thursday, May 31, 2012

The 10 most common Landing Page Optimization mistakes

The first thing that most people think of when they want to boost their online sales is increasing their website traffic. Thus they focus on getting more links, achieving better Search Engine Rankings, investing more in advertising, sending more emails and newsletters, finding more affiliates, becoming more active on Social Media and using other ways that can help them increase the number of their visitors. But is this always the right approach? Certainly not! Before spending all of your resources trying to acquire expensive traffic, you should first ask yourself: “Are my landing pages optimized? Do they help me generate sales?”

Landing Page Optimization is probably one of the most important, yet most ignored, parts of an Online Marketing strategy. Keep in mind that in many cases, increasing your traffic by 10% will not have the same impact on your revenues as improving your conversion rates by the same percentage. This is primarily because increasing your traffic incurs several costs that can squeeze your profit margins. Thus, as we discussed in a previous article, building effective landing pages should be a top priority in order to improve your conversion rates and increase your sales.

That is why in this article we discuss the 10 most common Landing Page Optimization mistakes and we explain how these problems affect the online marketing campaign:

1. Hidden or Unclear Call to Action

As we said in a previous blog post, the call to action is a simple way to interact directly with your visitors and encourage them to take an immediate action (such as buy, call, subscribe, register, download etc) after visiting your website or viewing your page. The call to action should be distinctive, visible at the top of the page, and it should communicate a clear message to your readers. Failing to use a clear call to action, or failing to focus the user's attention on it, can definitely lead to lower conversion rates.
[Image: eye tracking data]

2. Having too much Text

Having too much text on the landing pages of your website is not helpful for your visitors. Keep in mind that the landing page should get the user's attention and help him/her understand the basic features of your product/service. The rule "less is more" applies in this situation. Present to the user the most important aspects of your product/service and provide a way to read more details if he/she wants. Use clever graphics/visuals to convey additional messages and don't forget that a picture is worth a thousand words.
Now I am sure that many SEOs will argue that such an approach will have a dramatic impact on the SEO strategy of the website. Think again! You can place this text below the fold (where it will not distract users) or use JavaScript/jQuery to hide and show the content when visitors want more information (something like the simple toggle sketched below). There are several ways to solve this problem technically. Just think outside the box and keep in mind that visitors have limited time, so you only have a few seconds to get their attention.
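As a rough illustration (not from the original article), here is a minimal TypeScript sketch of that kind of "Read more" toggle: the long copy stays in the HTML so it can still be crawled, but it is collapsed for visitors by default. The element IDs are made up for the example.

```typescript
// Hypothetical IDs for the toggle link and the extra copy block.
const toggle = document.getElementById("read-more-toggle");
const extraCopy = document.getElementById("extra-copy");

if (toggle && extraCopy) {
  extraCopy.style.display = "none"; // collapsed by default, below the fold
  toggle.addEventListener("click", () => {
    const hidden = extraCopy.style.display === "none";
    extraCopy.style.display = hidden ? "block" : "none";
    toggle.textContent = hidden ? "Read less" : "Read more";
  });
}
```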

3. Too many Links/Choices

Providing too many choices to users is not advised, since it can confuse them and make them leave your website. Help your users navigate your site more easily, and increase the odds of focusing them on the call to action, by keeping the number of choices small. This applies especially when the visitor lands on the page from an AdWords or PPC advertisement.
[Image: your landing page]
Again, you don't have to sacrifice your SEO campaign and destroy the link architecture of your website in order to create effective landing pages. You can resolve these issues easily by using several web development techniques and by implementing clever designs. The links can remain on the page as long as you focus the user's attention on the choices that you want him/her to make.

4. Visual Distractions

Avoid at all costs any visual distractions that will make the user ignore the call to action. Removing flashing advertising banners, floating boxes, popups or any other similar distractions is essential to increasing conversion rates. Keep in mind that using too many calls to action on the same page can also be considered a distraction. Even though you can have more than one call to action on the same page, make sure you prioritize them and do not confuse/distract the user from your main goal.

5. Requesting too much information on forms

In many cases landing pages contain HTML forms where the visitor has to fill in his/her information in order to proceed (download, buy, subscribe, register, contact sales etc). If possible, eliminate the forms that are not necessary, since many users consider them barriers. In case they are needed, try to request from the user only the information that is absolutely necessary and avoid asking for things that are not useful/relevant or that he/she might be reluctant to provide immediately. Don't forget that you can ask for this information later, once you gain the trust of the user.

6. Message Mismatch

It’s a common mistake to use advertising campaigns with misleading text in order to drive traffic to the landing page. A message mismatch between the ad and the landing page can lead to high bounce rates and low conversions. By using misleading ads, not only do you attract untargeted traffic and waste your resources, but you also risk getting banned by the advertising networks, since this is a clear violation of their Terms of Use. The same applies not only to ads but also to all of your promotional campaigns and links that you place across the web.
[Image: site optimisation - landing pages and ads]

7. Lack of Trust

Another reason why you might have low conversion rates is that the potential clients who visit your website do not trust you. In order to resolve this problem, first of all make sure you have a well-designed website, which is an indication that you are a serious company. Also make sure you include the logos and reviews of your satisfied clients, or provide reviews from industry experts. Finally, keep in mind that you should include in a visible position any trust seals or badges that show that your website is safe for online transactions.

8. Not tracking the results of your landing pages

In order to be able to optimize a landing page, you must know how well it performs. A major mistake that lots of people make is that they don't monitor and track the results of their landing pages. Make sure you invest time and effort in reviewing your traffic logs, tracing the behaviour of users within your website and, above all, tracking every click by using Google Analytics Event Tracking, Virtual Pageviews or similar techniques (a minimal sketch follows below). This information will be invaluable when you start evaluating and optimizing the results of each landing page.
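For illustration, here is a minimal sketch of what that click tracking could look like with the classic ga.js asynchronous API (the _gaq queue) that Google Analytics used at the time. It assumes the standard async snippet is already on the page; the element ID and the category/action/label values are made-up examples, not names from the article.

```typescript
// Provided by the standard ga.js async snippet on the page.
declare const _gaq: { push(command: unknown[]): void };

const cta = document.getElementById("cta-button"); // hypothetical element ID

cta?.addEventListener("click", () => {
  // Event Tracking: record the click under a category/action/label.
  _gaq.push(["_trackEvent", "landing-page", "cta-click", "hero-button"]);

  // Alternatively, a Virtual Pageview so the click shows up in the
  // content reports as if it were a page of its own.
  _gaq.push(["_trackPageview", "/virtual/cta-click"]);
});
```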

9. Not testing the results of your landing pages

If you don't test different versions of the landing page you can't be sure that it works properly (or how well it works). By using A/B testing you can get lots of useful data about the behaviour and needs of the user and use this information to improve your pages and make the right marketing decisions. Don't forget that in order to evaluate the results of A/B testing you need to gather results over a period of time and, of course, set up all the necessary mechanisms that will help you monitor the results of each page. Google's Website Optimizer is a useful service that can help you perform such tests; nevertheless, there are several other solutions and tools that you can use. A bare-bones client-side split is sketched below.
[Image: A/B testing]
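For illustration only, here is a bare-bones sketch of a client-side A/B split: a visitor is randomly assigned to variant A or B once, the choice is remembered in a cookie, and the variant is reported as a Google Analytics event so the two versions can be compared later. It assumes the classic ga.js snippet is on the page; the cookie name, test name and CSS class are invented for the example and are not part of any tool mentioned above.

```typescript
declare const _gaq: { push(command: unknown[]): void };

// Assign the visitor to a variant once and remember it in a cookie.
function getVariant(): "A" | "B" {
  const match = document.cookie.match(/(?:^|; )lp_variant=([AB])/);
  if (match) return match[1] as "A" | "B";
  const variant = Math.random() < 0.5 ? "A" : "B";
  document.cookie = `lp_variant=${variant}; path=/; max-age=${60 * 60 * 24 * 30}`;
  return variant;
}

const variant = getVariant();

// A CSS class on <body> lets the stylesheet show the right version.
document.body.classList.add(`variant-${variant.toLowerCase()}`);

// Report the assignment so conversions can be segmented by variant.
_gaq.push(["_trackEvent", "ab-test", "landing-page-headline", variant]);
```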

10. Not optimizing your Landing Pages

Webmasters and Online Marketers optimize their Ad/SEO/Social Media/Online Marketing campaigns all the time. Why shouldn't we do the same for the Landing Pages? Redesigning the Landing Pages of the website by taking into account the results that we track, the A/B tests and the feedback from users is absolutely necessary in order to improve the conversion rates. Don't forget that it's your client who should "design" your website, not your Web Designer or Online Marketing consultant. And since different clients have contradictory needs, you need to find a solution that satisfies most of them. This can only be done by dedicating time and effort to monitoring, testing and optimizing your landing pages.

Landing Page Optimization is a difficult and challenging task that requires you to present information in an optimal way, to know your clients and understand their needs, to monitor closely the performance of the pages and to invest time and effort in improving their results and trying new approaches. Still, Landing Page Optimization is not rocket science, and in many cases, by dedicating the appropriate time, you can achieve very good results just by making a few targeted changes to your marketing approach.

Source : http://www.webseoanalytics.com/blog/the-10-most-common-landing-page-optimization-mistakes/

Thursday, May 24, 2012

Internal Linking to Promote Keyword Clusters

In “Keyword Clustering for Maximum Search Profitability” we discussed the idea of clustering keywords and how doing so can increase the speed and effectiveness of your link building efforts. The article was based on the principle of passing internal strength between the pages of your website.

Today we'll discuss this subject in more detail. Be warned, it'll involve a little bit of math (we are dealing with Google's algorithm after all, so there's almost always going to be an element of that). That said, I'm going to do the math for you to provide what's most important: an understanding of why the formulas work and simple ways to determine what needs to be done. If you understand the why, the math essentially does itself.

What Are Internal Links?

We all know that internal links are the links within your website that enable visitors to get from one page to another, a point we won't dwell on further. The question you want to answer here is, “What do internal links mean to a search engine and how are they weighted?”

At its core, an internal link adds value to your pages in a manner similar to third party links to your site. That said, this would be a poor attempt at education if I assumed knowledge so let's take it from the top and answer first the question… how does strength pass from one page to another? (Note: many of the principles of this apply to both internal and external links to a page.) From there we'll look at how external links impact the weight flow.

When I ponder the value of a link, either internal or from a third party, I consider the world of Orwell's "Animal Farm". The first and foremost thought in my head is that each page has a vote – a chance to cast its ballot in favor of other resources. Where it gets Orwellian is in his infamous quote, which I will bastardize for my use here: "All votes are equal, but some are more equal than others."

To put this in the context of internal links, a link from the homepage of a site or another strong page will be weighted higher than a link from a weak page 12 levels deep in its hierarchy. With this sentiment we know that the old real estate adage of "location, location, location" holds as true in SEO as it does in the "real world"; however, it gets even more true when we consider the other elements that come into play.

A picture is worth a thousand words, so below you'll find an image of a simple site hierarchy. Due to my complete lack of design ability, hopefully this picture is worth at least 75 words or at least doesn't draw from the overall word count of this article. It will serve the purpose needed here at least.

A Simple Website Structure

Below is a seven-page website (six internal pages and a homepage). Now let's consider how the weight will pass from one page to another. In my calculations I am not factoring in the evaporation of weight that occurs with every link (Matt Cutts discussed this at the 40-second mark in his video here).

Because this happens with every link on your site and your competitors' sites as well, it can be viewed as a level playing field and negated for simplicity, though it is reinforcement for limiting the number of links on a page to minimize evaporation. But back to the point at hand.

No matter what the site is, one can assume the homepage value is 100. This is because I'm only factoring in the link passing within the site, not valuing the site against others. So let's begin.

[Image: internal-link-example-1 – a simple seven-page site hierarchy]
If the homepage value is 100, the value passes as:
  • Homepage – 100
  • One – 33.3
  • Two – 33.3
  • Three – 33.3
  • Four – 16.7
  • Five – 16.7
  • Six – 33.3
This assumes that each of these pages links only to the pages in the diagram and the weight is split evenly among all links. In the real world, however, it would be more realistic (though messy in the illustration) to assume that each page links to each page higher in the hierarchy plus the homepage. So let's look at what each page will pass.

The homepage starting value is 100 meaning that it will indeed pass 33.3 value to each of the pages one level down.

Rather than linking only downward, however, these pages will also link back to the home page, giving the following values:

The Home page passes:
  • 33.3 to Page One
  • 33.3 to Page Two
  • 33.3 to Page Three
Page One passes:
  • 11.1 to Home
  • 11.1 to Page Four
  • 11.1 to Page Five
Page Two passes:
  • 33.3 to Home
Page Three passes:
  • 16.7 to Home
  • 16.7 to Page Six
Page Four passes:
  • 5.6 to Home
  • 5.6 to Page One
Page Five passes:
  • 5.6 to Home
  • 5.6 to Page One
Page Six passes:
  • 8.4 to Home
  • 8.4 to Page Three
So at the end we end up with the following values:
  • Home – 180.7
  • Page One – 44.5
  • Page Two – 33.3
  • Page Three – 41.7
  • Page Four – 11.1
  • Page Five – 11.1
  • Page Six – 16.7
So clearly we see a situation where the second level in a hierarchy gains value by having a larger number of pages as sub-sections of it.
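To make that arithmetic easier to rerun for other structures, here is a small sketch of the same even-split model described above: each page's passing budget is whatever it received from the page above it, every page splits that budget evenly across its links, and a page's final value is its starting value (100 for the homepage, 0 elsewhere) plus everything it receives. The page names mirror the diagram; the code itself is just an illustration, and small differences from the figures above are rounding.

```typescript
// Outgoing links for each page: every page links back to Home as well as
// down its own branch, as described in the scenario above.
const links: Record<string, string[]> = {
  Home: ["One", "Two", "Three"],
  One: ["Home", "Four", "Five"],
  Two: ["Home"],
  Three: ["Home", "Six"],
  Four: ["Home", "One"],
  Five: ["Home", "One"],
  Six: ["Home", "Three"],
};

// Parent of each page in the hierarchy, used for the top-down pass.
const parent: Record<string, string> = {
  One: "Home", Two: "Home", Three: "Home",
  Four: "One", Five: "One", Six: "Three",
};

// Top-down pass: each page's budget is what it receives from its parent.
const budget: Record<string, number> = { Home: 100 };
for (const page of ["One", "Two", "Three", "Four", "Five", "Six"]) {
  const p = parent[page];
  budget[page] = budget[p] / links[p].length;
}

// Redistribution: every page splits its budget evenly across its links.
const received: Record<string, number> = {};
for (const [page, targets] of Object.entries(links)) {
  for (const target of targets) {
    received[target] = (received[target] ?? 0) + budget[page] / targets.length;
  }
}

// Final value = starting value (Home keeps its 100) + everything received.
for (const page of Object.keys(links)) {
  const final = (page === "Home" ? 100 : 0) + (received[page] ?? 0);
  console.log(`${page}: ${final.toFixed(1)}`);
}
// Home: 180.6, One: 44.4, Two: 33.3, Three: 41.7, Four: 11.1, Five: 11.1, Six: 16.7
```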

Now let's look at a more realistic (albeit advanced) example and consider the weight passing if each page links to those within its cluster as well as all the pages above it.

For example, Page Four would link to pages One and Five as they're within its cluster and also the homepage and pages Two and Three. This mimics an environment where pages One, Two, and Three are in the main navigation along with the homepage.

Further to that, we'll consider the inclusion of breadcrumb navigation adding additional links to pages within the direct hierarchy and thus part of the cluster. As discussed in the video above (and witnessed many times in practical application) the addition of these links passes more “juice” to those pages (the simple math is, two links to a page gives it twice the weight – we'll go with that for our purposes here).

So let's look at how that weight passes this time:

The Home page passes:
  • 25 to itself
  • 25 to Page One
  • 25 to Page Two
  • 25 to Page Three
Page One passes:
  • 7.1 to Home
  • 3.6 to itself
  • 3.6 to Page Two
  • 3.6 to Page Three
  • 3.6 to Page Four
  • 3.6 to Page Five
Page Two passes:
  • 10 to Home
  • 5 to Page One
  • 5 to itself
  • 5 to Page Three
Page Three passes:
  • 8.3 to Home
  • 4.2 to Page One
  • 4.2 to Page Two
  • 4.2 to itself
  • 4.2 to Page Six
Page Four passes:
  • 0.9 to Home
  • 0.9 to Page One
  • 0.5 to Page Two
  • 0.5 to Page Three
  • 0.5 to itself
  • 0.5 to Page Five
Page Five passes:
  • 0.9 to Home
  • 0.9 to Page One
  • 0.5 to Page Two
  • 0.5 to Page Three
  • 0.5 to Page Four
  • 0.5 to itself
Page Six passes:
  • 1.2 to Home
  • 0.6 to Page One
  • 0.6 to Page Two
  • 1.2 to Page Three
  • 0.6 to itself
So at the end we end up with the following values:
  • Home – 153.4
  • Page One – 40.2
  • Page Two – 39.4
  • Page Three – 40.0
  • Page Four – 4.6
  • Page Five – 4.6
  • Page Six – 4.8
So we end up with a curious situation in the math. One might conclude that it's better to keep a limited number of sub-pages on your site – after all, Page Six is carrying more weight than Page Four, so it must be a better structure.

The only takeaway I hope you draw at this stage is that it's a good idea to take out expired products or useless pages (for your users as much as for the engines). To illustrate why, I'm going to shift our site into the real world and imagine once again that we're selling bike parts (see the last article for reference here).

Let's imagine the following page definitions:
  • Homepage – My bike store
  • Page One – Suspension forks page
  • Page Two – Privacy Policy
  • Page Three – Dual-suspension frames page
  • Page Four - Marzocchi 44 Rlo page
  • Page Five - Marzocchi 44 TST2 page
  • Page Six - Banshee spitfire page
To see the impact on ROI, let's imagine a scenario where I build a link to both pages Four and Six. Each link will carry the same weight (let's give it the arbitrary value of 5 in our weight model above), so let's now calculate what happens.

I'm going to omit the math and simply list the final numbers. We'll assume that the starting values are those defined above, so the weight and re-factoring on the engine's part will produce higher numbers (as the values from the pages low in the hierarchy add weight to the pages above them).
  • Homepage – 248.8
  • Page One – 131
  • Page Two – 125.6
  • Page Three – 129.4
  • Page Four – 25.4
  • Page Five – 20.4
  • Page Six – 26.2
So here we see that the weight given to Page Six is higher than that of either of the two other third-tier pages; however, there are two important points we need to consider before asserting that a hierarchy with a single path to each product is superior.

Had we divided the paths off into four from the homepage to accommodate each of the product pages, the split in weight from the homepage would have yielded the following example.

[Image: internal-link-example-2 – a flatter structure with four paths off the homepage]


If this is the case we would have found the following to be the final values (assuming the link additions to pages four and six as above and the same breadcrumb navigation):
  • Homepage – 228.5
  • Extra Page – 87.1
  • Page One – 85.1
  • Page Two – 80.6
  • Page Three – 87.1
  • Page Four – 10.8
  • Page Five – 9.8
  • Page Six – 10.8
The weakening of the links off the homepage weakened the entire site, reducing all the potential rankings of the internal pages.

While the link to Page Six in the first example yielded a higher page weight on that specific page when it was in a single path on the third tier of the site, the benefit was limited to that page alone. When we linked into a cluster, the benefit to the individual page was reduced slightly as the weight shifted with the increased number of internal links; however, the weight of a number of pages improved (including the product category page).

By clustering your targeted keywords together you'll be building links to groupings of pages that will, by design, help the rankings and page weight of each other.

A Beneficial Issue With The Calculations

It's only fair to note when there are known issues and unknowns with the data. In the above calculations, where I added in weight from third party links, I reduced the weight of those links along with the internal weight.

The treatment of weight from external sources by the major search engines is undoubtedly different from internal weight, and it's likely that the target page of the link would hold the full weight, or a larger portion of it, and then pass the weighting along without diminishing its own.

What I mean by this is that the link to Page Four in the initial example held a weight of 5 and was divided in our math by 8 (the total number of links on the page). It's far more likely that the page would keep all, or nearly all, of that 5 and then pass on weight internally without diminishing its own weight, or diminishing it only slightly.

Essentially what this means is that building a logical clustered hierarchy is, if anything, even more effective than outlined in the examples above.

Final Word

The data matches closely to what the search engines are trying to push webmasters toward: provide a logical and well-coded site architecture that serves your visitors well and you'll be rewarded. Imagine if Amazon tried to apply the example from the second graphic above and provide a link to each product page on their homepage. Usability would be horrible, page load speed would be a disaster, and because math works well, their rankings would be non-existent.

I don't expect all the readers of this article to draw out diagrams and do the math behind this for their own sites; it's time-consuming enough with a seven-page example, let alone a 1,000-plus page site. However, as the numbers expand, the math stays the same and the benefits only amplify. And that's why clustering keywords as discussed in last month's article works.

Source : http://searchenginewatch.com/article/2179376/Internal-Linking-to-Promote-Keyword-Clusters

Tuesday, May 22, 2012

Two Weeks In, Google Talks Penguin Update, Ways To Recover & Negative SEO

It’s been about two weeks since Google launched its Penguin Update. Google’s happy the new spam-fighting algorithm is improving things as intended. But some hurt by it are still wondering how to recover, and there remain concerns about “negative SEO” as a threat. I caught up with Matt Cutts, the head of Google’s web spam team, on these and some related questions.

Penguin: “A Success”

The goal of any algorithm update is to improve search results. So how’s Penguin been for Google?
“It’s been a success from our standpoint,” Cutts said.

What About Those Weird Results?

Of course, soon after Penguin was released, people quickly started citing examples of odd results. The official Viagra site wasn’t listed, while hacked sites were. An empty web site was listed for “make money online,” and there were reports of other empty sites ranking well. Scraper sites were reported outranking the sites they scraped.

How could Penguin be a success with these types of things happening?

Cutts said that many of these issues existed before Penguin launched and were not caused by the new spam-fighting algorithm.

Indeed, the Viagra issue, which has now been fixed, was a problem before Penguin hit. Penguin didn’t cause it.

False Positives? A Few Cases

How about false positives, people who feel they’ve been unfairly hit by Penguin when they weren’t doing any spam?

“We’ve seen a few cases where we might want to investigate more, but this change hasn’t had the same impact as Panda or Florida,” Cutts said.

The Panda Update was Google’s big update that targeted low-quality content last year. The Florida Update was a major Google update in 2003 intended to improve its search quality.

I’d agree that both of those seemed to have impacted more sites than Penguin has, based on having watched reactions to all these updates. Not everyone will agree with me, of course. It’s also worth the regular reminder that for any site that “lost” in the rankings, someone gained. You rarely hear from those who gain.

Bottom line, Google seems pretty confident that the Penguin Update is indeed catching people who were spamming, as was intended.

Why Spam Still Gets Through

Certainly when I’ve looked into reports, I’ve often found spam at the core of why someone dropped. But if Penguin is working, why are some sites that are clearly spamming still getting through?

“No algorithm is perfect. While we’d like to achieve perfection, our litmus test is, ‘Do things get better than before?’,” Cutts said.

Cutts also explained that Penguin was designed to be quite precise, to act against pages only when there was extremely high confidence that spam was involved. The downside is that some spam might get through, but the upside is that you have fewer false positives.

How Can You Recover?

One of the most difficult things with this update is telling people how to recover. Anyone hit by Penguin was deemed to be spamming Google.

In the past, if you spammed Google, you were told to file a reconsideration request. However, Google’s specifically said that reconsideration requests won’t help those hit by Penguin. They’ll recover naturally, Google says, if they clean the spam up.

However, one of the main issues I’ve seen when looking at sites hit by Penguin is bad linking practices. People have used sponsored WordPress themes, engaged in poor quality reciprocal linking, purchased links or participated in linking networks, such as those recently targeted by Google.

How do people pull themselves out of these link networks, if perhaps they don’t have control over those links now?

“It is possible to clean things up,” Cutts said, and he suggested people review two videos he’s done on this topic:

[Two embedded videos]


“The bottom line is, try to resolve what you can,” Cutts said.

Waiting On Penguin To Update Again

If you do clean things up, how will you know? Ideally, you’ll see your traffic from Google recover, the next time Penguin is updated.

That leads to another important point. Penguin, like Panda, is a filter that gets refreshed from time-to-time. Penguin is not constantly running but rather is used to tag things as spam above-and-beyond Google’s regular spam filtering on a periodic basis.

Is Penguin a site-wide penalty like Panda or page-specific? Cutts wouldn’t say. But given that Panda has site-wide impacts, I think it’s a fair assumption that Penguin works the same.

What that means is that if some of your site is deemed Penguin-like, all of it may suffer. Again, recovery means cleaning up the spam. If you’ve cleaned and still don’t recover, ultimately, you might need to start all over with a fresh site, Cutts said.

New Concerns Over Negative SEO

Before Penguin, talk of “negative SEO” had been ramping up. Since then, it seems to have gotten worse in some places. I’ve seen post-after-post making it sound as if anyone is now in serious danger that some competitor can harm them.

At the core of these fears seems to be a perfect storm of assumptions. Google recently targeted some linking schemes. That caused some people to lose traffic. Google also sent out warnings about sites with “artificial” or “unnatural” links. That generated further concerns in some quarters. Then the Penguin Update hit, which caused more people to lose traffic as they were either hit for link spam or no longer benefited from link spam that was wiped out.

These things made it ripe for people to assume that pointing bad links at a site can hurt it. But as I wrote before, negative SEO concerns aren’t new. They’ve been around for years. Despite this, we’ve not seen it become a major concern.

Google has said it’s difficult for others to harm a site, and that’s indeed seemed to be the case. In particular, pointing bad links at a good site with many other good signals seems to be like trying to infect it with a disease that it has antibodies to. The good stuff outweighs the bad.

Cutts stressed again that negative SEO is rare and hard. “We have done a huge amount of work to try to make sure one person can’t hurt another person,” he said.

Cutts also stressed again what Google has said before. Most of those 700,000 messages to publishers that Google sent out earlier this year were not about bad link networks. Nor were they all suddenly sent on the same day. Rather, many sites have had both manual and algorithmic penalties attached to them over time that were never revealed. Google recently decided to open up about these.

After Negative SEO Campaign, A Link Warning

Of course, new messages do go out, which leads to the case of Dan Thies. His site was targeted by some trying to show that negative SEO works. He received an unnatural link warning after this happened. He also lost some rankings. Is this the proof that negative SEO really works?

Thies told me that his lost rankings were likely due to changes he made himself, when he removed a link across all pages on his site that led back to his home page. After restoring that, he told me, he regained his rankings.

His overall traffic, he said, never got worse. That tends to go against the concerns that negative SEO is a lurking threat, because if it had worked enough to tag his site as part of the Penguin Update, he should have seen a huge drop.

Still, what about the link warning? Thies did believe that came because of the negative SEO attempt. That’s scary stuff. He also said he filed three reconsideration requests, each of which returned messages saying that no spam actions were found. Was he hit with a warning but not one that was also associated with a penalty?

I asked Cutts about the case, but he declined to comment on Thies’s particular situation. He did say that typically a link warning is a precursor to a ranking drop. If the site fixes the problem and does a reconsideration request quickly enough, that might prevent a drop.

Solving The Concerns

I expect we’ll continue to see discussions of negative SEO, with a strong belief by some that it’s a major concern for anyone. I was involved in one discussion over at SEO Book about this that’s well worth a read.

When it’s cheaper to buy links than ever, it’s easy to see why there are concerns. Stories like what happened to Thies or this person, who got a warning after 24,000 links appeared pointing at his site in one day, are worrisome.

Then again, the person’s warning came after he apparently dropped in rankings because of Penguin. So did these negative SEO links actually cause the drop, or was it something else? As is common, it’s hard to tell, because the actual site isn’t provided.

To further confuse matters, some who lost traffic because of Penguin might not be victims of a penalty at all. Rather, Google may have stopped allowing some links to pass credit, if they were deemed to be part of some attempt to just manipulate rankings. If sites were heavily dependent on these artificial links, they’d see a drop just because the link credit was pulled, not because they were hit with a penalty.

I’ve seen a number of people now publicly wishing for a way to “disavow” links pointing at them. Google had no comment about adding such a feature at this time, when I asked about this. I certainly wouldn’t wait around for it now, if you know you were hit by Penguin. I’d do what you can to clean things up.

One good suggestion out of the SEO Book discussion was that Google not penalize sites for bad links pointing at them. Ignore the links, don’t let the links pass credit, but don’t penalize the site. That’s an excellent suggestion for defusing negative SEO concerns, I’d say.

I’d also stress again that from what I’ve seen, negative SEO isn’t really what most hit by Penguin should probably be concerned about. It seems far more likely they were hit by spam they were somehow actively involved in, rather than something a competitor did.

Recovering From Penguin

Our Google Penguin Update Recovery Tips & Advice post from two weeks ago gave some initial advice about dealing with Penguin, and that still holds up. In summary, if you know that you were hit by Penguin (because your traffic dropped on April 24):
  • Clean up on-page spam you know you’ve done
  • Clean up bad links you know you’ve been involved with, as best you can
  • Wait for news of a future Penguin Update and see if you recover after it happens
  • If it doesn’t, try further cleaning or consider starting over with a fresh site
  • If you really believe you were a false positive, file a report as explained here
Just in, by the way, a list of WordPress plug-ins that apparently insert hidden links. If you use some of these, and they have inserted hidden links, that could have caused a penalty.

I’d also say again, take a hard look at your own site. When I’ve looked at sites, it’s painfully easy to find bad link networks they’ve been part of. That doesn’t mean that there’s not spam that’s getting past Penguin. But complaining about what wasn’t caught isn’t a solution to improving your own situation, if you were hit.


Sunday, May 20, 2012

SEO checklist - 10 things to check right now

If you’re new to SEO, and frankly, even if you’re not, there’s a lot to take in, a lot to remember. To make sure your bases are covered, here are ten things to check right now. (Start at the top of your web pages and work down.)

1) The blue text you see in the search results (see the picture further down) is called the ‘title tag’. What do yours look like? Are they enticing? Keyword rich? If someone read yours, would you want to click through?

2) Does your meta description, the bit that comes underneath the title in the search results, describe your web page accurately? Is it inviting? If I was a random user, would I click it? The answer to these questions should always be yes.

[Image: SERPs listing]

3) Are all your URLs (eg, www.example.com) easy to understand? If a human read them, could they get a sense of what your page was about? Test it now. Pick any page on your website (not your home page), copy what appears in the address bar of your browser and email a relative. An old one. Could they roughly describe your page without seeing it?

4) If someone was to look at your page for three seconds, could they tell you what it's about? Make sure your headers, the headlines in the copy, and the H1 and H2 tags in the code, are descriptive and keyword rich.

5) Is your navigation clear and intuitive? If you asked a stranger to find something on one of your deep-level pages, let’s say, find a specific product or article, could they do it just by using their mouse? Great! OK, now test it. Ask a friend to do this now.

6) Are your images optimized? Right click on an image, choose ‘inspect element’ and look at where it says “img alt” - is this an accurate description of the image? Does it even exist? The alt tag is what search engines ‘see’ when you upload an image. It’s also used by screen readers for the visually impaired. Ensuring your alt tags are descriptive makes your site more accessible and better for search engines. (A quick console check for missing alt text is sketched below.)
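If you want a quicker check than right-clicking image by image, here is a small console sketch (an illustration, not part of the original checklist) that lists every image on the current page with a missing or empty alt attribute:

```typescript
// Paste into the browser console on any page you want to spot-check.
const images = Array.from(document.querySelectorAll<HTMLImageElement>("img"));
const missingAlt = images.filter((img) => !img.getAttribute("alt")?.trim());

console.log(`${missingAlt.length} of ${images.length} images have no alt text:`);
missingAlt.forEach((img) => console.log(img.src));
```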

7) Does your copy (text) sing? Are the words on your pages inviting, informative and punchy? And do they use the language your customers use, or are they jargonized?

8) Is the copy on every page unique? (It should be.)

9) Does the footer (the bit at the bottom of the web page) offer some more navigation options? Navigation that’s keyword rich, and helpful for users (perhaps they have a slow internet connection - could they still get to where they want to go quickly?). If I just looked at the footer of your home page, would I see links to your most important pages?

10) Do you have clean code? (Right click on your web page and click 'view page source’ to see the code.) Is it free of JavaScript and Flash where possible? If you’re not a coder and you asked this question of someone who is, what would they say?

Source : http://www.wordtracker.com/academy/seo-checklist

Wednesday, May 16, 2012

Life After Google Penguin – Going Beyond the Name

In looking back at my recent posts here it seems, though not by design, there was a theme emerging. Have a look...
And that was all pre-Penguin no less. Seems my Spidey-sense was tingling. The world of search engine optimization just keeps getting more convoluted. Now more than ever, very little is clear.

To date I have not touched upon the Penguin update because, well, we just didn't know. There wasn't enough data to say much. Of course that really hasn't changed, but there are a few things we can certainly look at to help better understand the situation at hand.
But let's give it a go anyway, shall we?

[Image: Penguins at the Googleplex]

A Name is Just a Name

The first thing we need to consider is that there are numerous Google algorithm updates, some of which aren't named. In the weeks before the infamous Penguin rolled out, there was a Panda hit and another link update. The three of them being within a five-week period makes a lot of the analysis problematic.

And that's the point worth mentioning. Don't try too hard to look for dates and names. Look more to the effects.

We're here to watch the evolution of the algos and adapt accordingly. Named or not, doesn't matter. Sure, it can be great for diagnosing a hit, but beyond that, it means little.

Regardless of the myriad of posts on the various named updates, none of us really know what is going on. That's where the instinct part of the job comes in. Again, knowing the evolution of search goes a long way.

What is Web Spam?

To understand how web spam is defined, you need to look at how search engineers view SEO. While there are many, I like this:
“any deliberate human action that is meant to trigger an unjustifiably favorable relevance or importance for some web page, considering the page's true value.” (from Web Spam Taxonomy, Stanford)
And:
“Most SEOs claim that spamming is only increasing relevance for queries not related to the topic(s) of the page. At the same time, many SEOs endorse and practice techniques that have an impact on importance scores to achieve what they call "ethical" web page positioning or optimization. Please note that according to our definition, all types of actions intended to boost ranking, without improving the true value of a page, are considered spamming.” (emphasis mine)
Well, la-dee-da, huh? We can infer that Google has eased that stance by trying to define white hat and black hat, but at the end of the day any and all manipulation is seen in a less than favorable light.

The next part of your journey is to establish in your mind what types of activities are commonly seen as web spam. Here are a few:
  • Link manipulation: Paid links, hidden, excessive reciprocal, shady links etc.
  • Cloaking: Serving different content to users and Google.
  • Malware: Serving nastiness from your site.
  • Content: Spam/keyword stuffing, hidden text, duplication/scraping.
  • Sneaky JavaScript redirects.
  • Bad neighborhoods: Links, server, TLD.
  • Doorway pages.
  • Automated queries to Google: Tools on your site, probably a bad idea.
That's about the core of the main offenders. To date with the Penguin update, people have been mostly talking about links. Imagine that... SEOs obsessed with links!

However, we should go a bit deeper and surely consider the other on-site aspects. If not on your site, then on the site links are coming from.

On-site Web Spam

Hopefully most people reading this, those with experience in web development and SEO (or running websites), don't use borderline tactics on their sites. We do know there are certainly on-site elements to both the Penguin and Panda updates... so it's worth looking at.
Here are some common areas search engines look at for on-site web spam:
  • Domain: Some testing has shown that .info and .biz domains are far more spam laden than more traditional TLDs.
  • Words per page: Interestingly it seems spam pages have more text than non-spam pages (although over 1,500 words, the curve receded). Studies have shown the spam sweet spot to be in the 750-1,500 word region.
  • Keywords in title: This was mentioned in more than a few papers and should be high on the audit list. Avoid stuffing; be concise.
  • Anchors to anchor text: In other studies, engineers looked at the ratio of text to anchor text on a page.
  • Percentage of visible text: This involves hidden text and nasty ALT text. What percentage of text is actually being rendered on the page.
  • Compressibility: As a mechanism to fight keyword stuffing, search engines can also look at compression ratios - or more specifically, at repetitious or spun content (a rough sketch follows below).
  • Globally popular words: Another good way to find keyword stuffing is to compare the words on the page to existing query data and known documents. Essentially if someone is keyword stuffing around given terms, they will be in a more unnatural usage than user queries and known good pages.
  • Query spam: By looking at the pattern of the queries, in combination with other signals, behavioral data manipulation would become statistically apparent.
  • Phrase-based: looking for textual anomalies in the form of related phrases. This is like keyword stuffing on steroids. Looking for statistical anomalies can often highlight spammy documents.
(some snippets taken from my post "Web Spam; the Definitive Guide")
And yes, there's actually more. The main thing to take from this is that there are often many ways that the search engines look at on-site spam, not just the obvious ones. Once more, this is about your site and the sites linking to you.
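As a rough illustration of the compressibility signal mentioned in the list above: heavily repeated or spun text compresses far better than natural writing, so an unusually high compression ratio can act as a flag. A minimal Node.js sketch follows; the sample strings and the threshold are arbitrary assumptions, not values used by any search engine.

```typescript
import { deflateSync } from "node:zlib";

// Compression ratio: original byte length divided by compressed byte length.
// Repetitive, keyword-stuffed text tends to score much higher than natural prose.
function compressionRatio(text: string): number {
  const original = Buffer.byteLength(text, "utf8");
  const compressed = deflateSync(Buffer.from(text, "utf8")).length;
  return original / compressed;
}

const natural =
  "Our shop stocks suspension forks from several brands, with advice on travel, weight and budget.";
const stuffed = "cheap bike forks cheap bike forks cheap bike forks ".repeat(40);

console.log(compressionRatio(natural).toFixed(2)); // modest ratio
console.log(compressionRatio(stuffed).toFixed(2)); // much higher ratio

// An arbitrary illustrative threshold, not a real search engine value.
const SUSPICIOUS_RATIO = 4.0;
console.log(compressionRatio(stuffed) > SUSPICIOUS_RATIO); // true
```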

A lot of the on-site web spam that's a true risk will be from hacking. Sure, your CMS might be spitting out some craziness, or your WordPress plug-in created a zillion internal links, but those are the exceptions. If you're using on-site spam tactics, I am sure you know it. Few people actually use on-site crap post-Panda; many times it's the site being hacked that causes issues. So be vigilant.

Link Spam

Is the Penguin update all about links? I'd go against the grain and say no. Not only do we have to consider some of the above elements, but there also seems to be an element of 'trust' and authority at play here. If anything, we may be seeing a shift away from the traditional PageRank model of scoring, which of course many may perceive as a penalty due to links.

But what is link spam? That answer has been a bit of a moving target over the years, but here are some common elements:
  • Link stuffing: Creating a ton of low-value pages and pointing all the links (even on-site ones) at the target page. Spam sites tend to have a higher ratio of these types of unnatural appearances.
  • Nepotistic links: Everything from paid links to traded ones, (reciprocal) and three-way links.
  • Topological spamming (link farms): Search engines will look at the percentage of links in the graph compared to known "good" sites. Typically those looking to manipulate the engines will have a higher percentage of links from these locales.
  • Temporal anomalies: Another area where spam sites generally stand out from other pages in the corpus are in the historical data. There will be a mean average of link acquisition and decay with "normal" sites in the index. Temporal data can be used to help detect spammy sites participating in unnatural link building habits.
  • TrustRank: This method has more than a few names, TrustRank being the Yahoo flavor. The concept revolves around having "good neighbors". Research shows that good sites link to good ones and vice versa.
(some snippets taken from my post "Web Spam; the Definitive Guide")

I could spend hours on each of these, but you get the idea. With many people theorizing about networks, anchor texts, etc., the larger picture often evades us. There are so many ways that Google might be dealing with 'over optimization' that we're not talking about.

Over the last 18 months or so we have seen a lot of changes, including the spate of unnatural-linking messages that went out. Again, Penguin or not doesn't matter. What matters is that Google is certainly looking harder at link spam, so you should be too.

It wouldn't hurt to keep a tinfoil hat handy as well… Look no further than this Microsoft patent that talks about spying on SEO forums. Between that and the fact that SEOs write about their tactics far and wide, it's not exactly hard for search engineers to see what we're up to.

[Image: Google Groups Therapy]

How Are We Adapting in a Post-Penguin World?

What's it all mean? Well, I haven't a bloody clue. Anyone who says they've got it sorted likely needs to take their head out of a certain orifice.

What you should do is become more knowledgeable in how search engines work and the history of Google. Operate from intelligence, not ignorance.

Have you considered the elements outlined in this post when analyzing data and trying to figure out what's going on? I know I didn't. It was researching this post that reminded me of the myriad of various spam signals Google might look at.

Here's some of my thinking so far:
  • It really is a non-optimized world: Don't try too hard for that perfect title. Avoid obsessing over on-page ratios. You don't need that exact match anchor all the time, in fact you don't even need a link (think named entities). In many ways, less-is-more is the call of the day.
  • Keep a history: Be sure to always track everything. And when doing link profile or other types of forensic audits, compare fresh and historic data (such as in Majestic).
  • Watch on-site links: From internal link ratios to anchors and outbound links, they all matter. From spam signals to trust scoring, they can potentially affect your site.
  • Faddish: Another interesting point (how much it plays into things, we don't know) is that Google may have an issue with the tactic du jour.
  • Watch your profile: In the new age of SEO it likely pays to be tracking your link profiles. If something malicious pops up, deal with it and make notes of dates and contact attempts.
  • On site: Hammer it and make it squeaky clean. The harder links get, the more one needs to watch the on-site. Schedule audits more frequently to watch for issues.
  • Topical-relevance: When looking at links think about topical-relevance. Are the links coming from sites/pages that are overly diverse (and have weak authority)?
  • Link ratios: Watch for a low spread in anchor texts as well as the ratio of total links to referring domains (the lower the better; it generally means fewer site-wide links). A rough way to compute both is sketched after this list.
  • Cleaning up: When possible look at link profiles and clean up suspect links. And I wouldn't wait until you get an unnatural linking message or tanked rankings.
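As promised in the link ratios point above, here is a rough sketch of how those two ratios could be computed from a backlink export. The data format, field names and sample numbers are all made up for illustration.

```typescript
// One row per backlink, as you might export from a link research tool.
interface Backlink {
  referringDomain: string;
  anchorText: string;
}

function profileRatios(links: Backlink[]) {
  const domains = new Set(links.map((l) => l.referringDomain));
  const anchors = new Set(links.map((l) => l.anchorText.toLowerCase().trim()));

  return {
    totalLinks: links.length,
    referringDomains: domains.size,
    // High values here often mean lots of site-wide links.
    linksPerDomain: links.length / domains.size,
    // A very low spread (few distinct anchors) suggests over-optimized anchor text.
    anchorSpread: anchors.size / links.length,
  };
}

// Made-up sample data for illustration.
const sample: Backlink[] = [
  { referringDomain: "example-blog.com", anchorText: "cheap bike forks" },
  { referringDomain: "example-blog.com", anchorText: "cheap bike forks" },
  { referringDomain: "forum.example.org", anchorText: "cheap bike forks" },
  { referringDomain: "news.example.net", anchorText: "My Bike Store" },
];

console.log(profileRatios(sample));
// { totalLinks: 4, referringDomains: 3, linksPerDomain: 1.33..., anchorSpread: 0.5 }
```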
We've seen a ton of data (this one is interesting) since this all went down, and while there are common elements, nothing is conclusive (again, there has been a spate of updates). What is more important is to understand what Google wants and where they're headed. It's just another step in the long road of search evolution; don't get caught up in the names.

Taking the easy way out rarely works for success in life. SEO is no different.

Understand how a threshold might be used. This thing of ours is like the old story of the two of us in the woods when a hungry bear appears. I don't have to outrun the bear; just you. Ensure your strategy is within a safe threshold and it should work out just fine.

It's About Time

To close out, there is one part of this that keeps nagging: history. If you've been squashed by the recent updates (including Penguin) it may not entirely be about recent activities. There is a sense that Google is indeed keeping a history and that this may be playing into the larger scheme of things.

Some of the most interesting Google patents were the series on historical elements. Be sure to go back and read some of those older posts.
Sure, they're 3-4 years old, but they are probably some of the more telling signs of the mindset change many in the world of SEO need.
Source : http://searchenginewatch.com/article/2174997/Life-After-Google-Penguin-Going-Beyond-the-Name

7 Time-Saving Google Analytics Custom Reports

Google Analytics Custom Reports can be incredible time savers if you have the right reports. Instead of spending time digging around for important metrics, you can find what you need separated neatly into columns, ready for analysis that leads to actionable insight.

1. Content Efficiency Analysis Report

This report is from none other than the master of Google Analytics, Avinash Kaushik. Brands all over the world are starting to double down on content so it's important to answer questions such as:
  • What types of content (text, videos, pictures, etc.) perform best?
  • What content delivers the most business value?
  • What content is the most engaging?
[Image: Content Efficiency Analysis Report in Google Analytics]

The Content Efficiency Analysis Report comes in handy by putting all the key content metrics into one spot.

Here are the columns that the report will pull in:
  • Page title
  • Entrances
  • Unique Visitors
  • Bounces
  • Pageviews
  • Avg. Time on Page
  • Per Visit Goal Value
  • Goal Completions

2. Keyword Analysis Report

[Image: Keyword Analysis Report in Google Analytics]

If you're doing SEO, you want to make sure that your optimization efforts are working as intended. Is the right keyword pointing to the right page?

The first tab of this report, Targeting, will break things down by placing the title and keyword side-by-side. The four metrics you'll see are:
  • Unique Visitors
  • Goal Completions
  • Goal Conversion Rate
  • Avg. Page Load Time (sec)
Using the 4 metrics above, you'll be able to judge whether you need to make adjustments to your campaign or not.

The second tab, Engagement, will tell you how effective each page is by looking at the following six metrics:
  • Unique Pageviews
  • Pages/Visit
  • Avg. Time on Page
  • Bounce Rate
  • Percentage Exit
  • Goal Conversion Rate
The third and final tab, Revenue, will tell you how much money a keyword is bringing you based on 3 metrics:
  • Revenue
  • Per Visit Value
  • Ecommerce Conversion Rate

3. Link Analysis Report

[Image: Link Analysis Report in Google Analytics]

What websites are sending you the best traffic? If you're link building, which links are worth going back to for more? Link building isn't all about rankings; it's about increasing traffic and conversions as well. If you find a few gems, it's worth looking into them more.

Here are the columns you'll see in the report:
  • Source
  • Landing Page
  • Visits
  • Goal Completions
  • Pages/Visit
  • Bounce Rate
  • Percentage New Visits

4. PPC Keywords Report

If you're paying for search traffic, you obviously want to discover high performing keywords. You can then take this data and use it for future SEO campaigns.

Here are the metrics in this report:
  • Visits
  • CPC
  • Goal Completions
  • Cost per Conversion
By breaking things down easily, you'll be able to home in on which keywords you need to put on hold and which ones you need to pour more cash into.

5. Social Media Report

[Image: Social Media Report in Google Analytics]

Ah yes, a report that tells you how different social media channels are performing for you. This is a simple way to figure out where you should consider investing more of your time socially.

The social media report looks at:
  • Visits
  • Social Actions
  • Goal Completions
  • Goal Conversion Rate
  • Goal Value

6. E-commerce Traffic Report

[Image: E-commerce Traffic Report in Google Analytics]

If you run an e-commerce site, it's important to break down your different traffic channels to see which one performs best. Why is one channel performing better than the other? Is it worth it to invest more in a campaign that is trending upwards? Is your investment with paid advertising effective?

This report answers some of your e-commerce questions by looking at the following metrics:
  • Visits
  • Percentage New Visits
  • Bounce Rate
  • Pages/Visit
  • Revenue
  • Average Value
  • Per Visit Value

7. Browser Report

[Image: Browser Report in Google Analytics]

This report will tell you how different browsers are performing for your site. You'll immediately see which browsers are your winners and which ones might have problems.

For example, if Chrome and Firefox seem to be doing OK but Internet Explorer has extremely high bounce rates, you might want to look into Internet Explorer more. After all, Internet Explorer still accounts for a significant share of the browser market.

Bonus: Custom Reporting in Google Analytics

Jaime from SEOmoz created a wonderful realtime Google Analytics report. Here's what it looks like:

[Image: real-time Google Analytics data in a spreadsheet]
This spreadsheet allows you to compare different metrics of your choice with different start and end dates as well. You can easily see how your campaigns are performing from a high level all in the comfort of a clean Google Doc.

Want even more custom reports? Make sure to read Greg Habermann’s top five most used Google Analytics Custom Reports to learn about and get custom reports for Unique Visitors by Page; Conversion by Time of Day; Customer Behavior; Top Converting Landing Pages; and Long Tail Converters.

Conclusion

Google Analytics Custom Reports ultimately save you a lot of time and help you make actionable decisions that will help your bottom line. Take a few minutes to set these reports up and explore them. You won't regret it.

What are some useful Google Analytics Custom Reports that you use?

Source : http://searchenginewatch.com/article/2175001/7-Time-Saving-Google-Analytics-Custom-Reports

Sunday, May 13, 2012

Google: Can't Recover From Penguin? Start A New Site

Danny Sullivan published a new story yesterday named Two Weeks In, Google Talks Penguin Update, Ways To Recover & Negative SEO.

In that article, he interviews Google's spam lead, Matt Cutts, on ways to recover from the Google Penguin update. There are some solid tips there, but some scary ones as well.

Here is one that is scary for those who were hit by the Penguin update:
If you've cleaned and still don't recover, ultimately, you might need to start all over with a fresh site, Cutts said.
Yes, that is scary for someone who was hit, is frantically trying to make changes, but has not seen any recovery. Now, if you have not seen a recovery yet, I wouldn't worry - I don't think they have refreshed the update yet, so there wouldn't be any recoveries, in my opinion.

But Google is not going to roll this back. Google's Matt Cutts said, "It's been a success from our standpoint." Were there false positives? A few, Cutts said: "we've seen a few cases where we might want to investigate more, but this change hasn't had the same impact as Panda or Florida." Very interesting.

Key Takeaways:

(1) Google is not going to roll this update back.
(2) Google says it had less of an impact than Panda or Florida.
(3) Don't take drastic measures yet, do what you can now so when Google does refresh the update, maybe you can break free.

Source :  http://www.seroundtable.com/google-penguin-recovery-15136.html

Friday, May 11, 2012

How Google Creates a New Era for SEO

Google has been making substantial changes to its search engine algorithms. This has led many website owners to reconsider how they implement their SEO campaigns. Over the last year, two new algorithm changes have emerged that are going to have a major impact on website owners.

Google Panda was enacted last year to help remove low quality content sites from the front page of its indexes. This update had a major impact on a number of websites, particularly content farms and low-level affiliates.

The other update has not been put into place yet and we know even less about it than Google Panda. Google's head of web spam, Matt Cutts, had a discussion with Danny Sullivan of Search Engine Land last month about some of Google's practices. In this discussion, Cutts let it slip that Google was working on a new algorithm change to penalize sites that have engaged in too much SEO.
[Image: How Google Creates a New Era for SEO]
Both of these topics are expected to have a major impact on search engine rankings in the coming months. Here is the lowdown on what they both mean.


Google Panda Update – Dreaded for SEO

Google Panda has really opened up a can of worms for many marketers. Almost every Internet entrepreneur I know has started panicking about how the Panda Update is going to impact them. That is completely unnecessary, though. The Panda Update was intended to hit content farms like eHow and many of the affiliate sites that use spun or stolen content.

After Panda, many of the leading content farms lost substantial amounts of traffic. One of the most extreme cases was Acesshowbiz.com, which lost 93% of its SEO traffic. Meanwhile, a number of leading content providers like YouTube increased their search engine traffic by 10%.

Google is clearly looking much more closely at quality content now. Internet entrepreneurs have got the message that they are supposed to update their sites with fresh content as regularly as possible. Although fresh content remains a priority, entrepreneurs who have felt the Panda's bite are likely to find that they need to put more emphasis on creating insightful, original content than on just updating regularly.

The Panda Update suggests that you should do the following:
  1. Make sure your site is written for humans more than search engines.
  2. Get rid of duplicate content as much as possible (a canonical tag sketch follows these tips).
  3. Publish content that has not already been featured elsewhere, such as on article directories.
  4. Make sure your content is authoritative, rather than just rewrites of other people’s articles or spun content.
These tips could help you considerably as you try to keep your site on the top of the search engines.
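One common way to act on tip 2, sketched here with made-up URLs (example.com and the page path are placeholders), is to point duplicate versions of a page at the preferred version with a canonical tag in the page's <head>:

  <link rel="canonical" href="http://www.example.com/original-article">  <!-- example.com and the path are placeholders -->

This suggests to Google which version of the content should be treated as the original.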

Next Algorithm Change

The next algorithm change could mean any number of things for your site. Matt Cutts was pretty vague in his statement, but most SEO experts have at least some idea of what he was getting at.

The new algorithm change is supposedly intended to target over-optimized websites. Adam Audette of Rimm-Kaufman Group stated that SEO should be invisible, and the sites most likely to get nailed by Google's new update are those that don't have any real business model behind them.

Audette said that SEO should be an “invisible layer” that is added to a site after creating value for readers. Too many SEOs think that if they get to the front page of Google, the bucks will start rolling in. This notion is obviously flawed, considering how much readers hate over-optimized content. The new algorithm change is intended to keep these sites from even getting to the top of Google.

What are some of the changes you may need to make in your SEO model? I would follow these points:

Respect the Panda
 
Whatever new algorithm update Google has in the works isn't meant to replace Google Panda. It targets over-optimized content, while Panda was directed towards content that provides no value. However, there is definitely overlap between the two. Appeasing the Panda by creating great content will help shift your efforts away from over-optimizing your site for SEO.

Refrain from Black Hat Strategies
Many Internet marketers shun black hat SEOs like they are the worst kind of sinner. I personally don't have any ethical objection to most black hat SEOs. However, I will say one thing about most black hat SEO tactics: they rarely lead to long-term results.

Google has been waging war with black hat SEOs from day one. However, Matt Cutts' new statement showed that Google is clearly working even harder to boot black hat spammers off the front page. Here are some of the things he specifically mentioned:
  1. “Excessive link exchanges.” The reason Google evaluates backlinks is to assess how authoritative a website is in the eyes of others. Google hates seeing sites that just exchange links with each other, because such links give no indication of the real value of the site. Personally, I don’t see anything wrong with sites linking back and forth to network and share resources. I think Google can tell when sites are clearly trying to manipulate the algorithm and wants to ding anyone who does anything unnatural.
  2. “Overuse of keywords.” Keyword stuffing has been a no-no for a long-time. Avoid using keywords unnecessarily throughout your content.
  3. “Beyond what Google would normally expect.” Cutts could mean any number of things with this one. Some SEOs have ventured guesses that I think were right on. One suggestion is that he meant unnecessarily linking to the homepage from the body of the article or from the footer (illustrated below). I have seen plenty of bloggers link to the homepage in the middle of a post, for no reason, with a random string of keywords. It gets really annoying, to be honest, and I am sure it looks weird to Google as well.
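To illustrate the pattern in point 3, here is the kind of mid-post homepage link that looks unnatural; the domain and keyword string are made up:

  <a href="http://www.example.com/">cheap blue widgets best blue widgets buy blue widgets online</a>  <!-- example.com and the keywords are placeholders -->

A link like that serves the reader no purpose and exists only to push keyword-rich anchor text at Google.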
Most of the penalties should be clear to marketers by now. Google has been flagging unnatural use of keywords and linking for a long time. We will need to assess what Google is really doing differently this time around. Quite frankly, I think they are implementing some new changes that we may not have considered previously. Matt Cutts made a statement back in 2009 that Google doesn’t have any “over-optimization penalty” for websites. However, he specifically used that term in his most recent statement. Has he changed his mind, or has he just taken a new stance on the terminology he uses?

It is much too early to tell. Website owners are going to need to see how their sites are ranked now and what they will need to do differently in the future. Carefully monitor your website's rankings before and after Google's supposed algorithm change. This will give you some idea of whether or not you have over-optimized your website.

If your site sees a drop in rankings over the next couple of months, you can pretty safely bet that your site is considered “over-optimized.” Sadly, that doesn't give you any indication as to what you have done to over-optimize it.

I would say you should do a page-by-page analysis of your site. Analyze every single link, block of text and meta description to see what may look unnatural. You may need to play around a bit to figure it out. Anything that looks unnatural to you should probably be changed.

Just the same, I wouldn’t make any new changes just because your site drops a little for a while. Google often applies a heavy hand to its algorithm changes in the beginning, which harms innocent sites. It could reverse some of those mistakes later, which would help your site regain its ranking. There is no sense undoing a good thing just because Google has inadvertently penalized your site temporarily.

These two new algorithms may be just the beginning of the changes we are going to see with Google over the next few years. They should remind us that Google is constantly working to improve the quality of the user experience. Therefore, we will have to come to terms with the fact that our techniques to enhance search engine rankings may become increasingly obsolete.

As you build your blog, you may need to turn your efforts away from keyword usage and traditional link building strategies as you attempt to establish yourself on the front page of the world's most popular search engine.

Sources : http://www.1stwebdesigner.com/design/google-creates-new-era-seo/

Tuesday, May 8, 2012

Google Announced 50+ Search Updates, Which Are Penguin Related?

In typical Google fashion, late on Friday, Google released its now-monthly update on the changes made to Google search over the past month. It is really great that Google does this, and this time they shared 53 changes made in April. Here is last month's update.
Below I have grouped and listed the changes I find to be most important.

But let's try to see which items in this list are Penguin related. Can we even figure that out?

 Penguin Related?
  • Anchors bug fix
  • Keyword stuffing classifier improvement
  • More authoritative results
  • Improvement in a freshness signal
  • No freshness boost for low-quality content
  • Improvements to how search terms are scored in ranking
If I had to guess, these, and maybe more, are all related to the Penguin update.
Here are some more that I find important but wouldn't specifically relate to Penguin, Panda or the other named updates:

Ranking Changes:

  • Improvement in a freshness signal. [launch codename "citron", project codename "Freshness"] This change is a minor improvement to one of the freshness signals which helps to better identify fresh documents.
  • No freshness boost for low-quality content. [launch codename "NoRot", project codename "Freshness"] We have modified a classifier we use to promote fresh content to exclude fresh content identified as particularly low-quality.
  • Smoother ranking changes for fresh results. [launch codename "sep", project codename "Freshness"] We want to help you find the freshest results, particularly for searches with important new web content, such as breaking news topics. We try to promote content that appears to be fresh. This change applies a more granular classifier, leading to more nuanced changes in ranking based on freshness.
  • Improvements to how search terms are scored in ranking. [launch codename "Bi02sw41"] One of the most fundamental signals used in search is whether and how your search terms appear on the pages you're searching. This change improves the way those terms are scored.
  • Backend improvements in serving. [launch codename "Hedges", project codename "Benson"] We've rolled out some improvements to our serving systems making them less computationally expensive and massively simplifying code.
  • Keyword stuffing classifier improvement. [project codename "Spam"] We have classifiers designed to detect when a website is keyword stuffing. This change made the keyword stuffing classifier better.
  • More authoritative results. We've tweaked a signal we use to surface more authoritative content.

Link Analysis Changes:

  • Anchors bug fix. [launch codename "Organochloride", project codename "Anchors"] This change fixed a bug related to our handling of anchors.

Index Updates:

  • Increase base index size by 15%. [project codename "Indexing"] The base search index is our main index for serving search results and every query that comes into Google is matched against this index. This change increases the number of documents served by that index by 15%. *Note: We're constantly tuning the size of our different indexes and changes may not always appear in these blog posts.
  • New index tier. [launch codename "cantina", project codename "Indexing"] We keep our index in "tiers" where different documents are indexed at different rates depending on how relevant they are likely to be to users. This month we introduced an additional indexing tier to support continued comprehensiveness in search results.

Search Listings:

  • More domain diversity. [launch codename "Horde", project codename "Domain Crowding"] Sometimes search returns too many results from the same domain. This change helps surface content from a more diverse set of domains.
  • Categorize paginated documents. [launch codename "Xirtam3", project codename "CategorizePaginatedDocuments"] Sometimes, search results can be dominated by documents from a paginated series. This change helps surface more diverse results in such cases.
  • Country identification for webpages. [launch codename "sudoku"] Location is an important signal we use to surface content more relevant to a particular country. For a while we've had systems designed to detect when a website, subdomain, or directory is relevant to a set of countries. This change extends the granularity of those systems to the page level for sites that host user generated content, meaning that some pages on a particular site can be considered relevant to France, while others might be considered relevant to Spain.
  • Disable salience in snippets. [launch codename "DSS", project codename "Snippets"] This change updates our system for generating snippets to keep it consistent with other infrastructure improvements. It also simplifies and increases consistency in the snippet generation process.
  • More text from the beginning of the page in snippets. [launch codename "solar", project codename "Snippets"] This change makes it more likely we'll show text from the beginning of a page in snippets when that text is particularly relevant.
  • Tweak to trigger behavior for Instant Previews. This change narrows the trigger area for Instant Previews so that you won't see a preview until you hover and pause over the icon to the right of each search result. In the past the feature would trigger if you moused into a larger button area.
  • Better query interpretation. This launch helps us better interpret the likely intention of your search query as suggested by your last few searches.
  • News universal results serving improvements. [launch codename "inhale"] This change streamlines the serving of news results on Google by shifting to a more unified system architecture.
  • More efficient generation of alternative titles. [launch codename "HalfMarathon"] We use a variety of signals to generate titles in search results. This change makes the process more efficient, saving tremendous CPU resources without degrading quality.
  • More concise and/or informative titles. [launch codename "kebmo"] We look at a number of factors when deciding what to show for the title of a search result. This change means you'll find more informative titles and/or more concise titles with the same information.
  • "Sub-sitelinks" in expanded sitelinks. [launch codename "thanksgiving"] This improvement digs deeper into megasitelinks by showing sub-sitelinks instead of the normal snippet.
  • Better ranking of expanded sitelinks. [project codename "Megasitelinks"] This change improves the ranking of megasitelinks by providing a minimum score for the sitelink based on a score for the same URL used in general ranking.
  • Sitelinks data refresh. [launch codename "Saralee-76"] Sitelinks (the links that appear beneath some search results and link deeper into the site) are generated in part by an offline process that analyzes site structure and other data to determine the most relevant links to show users. We've recently updated the data through our offline process. These updates happen frequently (on the order of weeks).
  • Less snippet duplication in expanded sitelinks. [project codename "Megasitelinks"] We've adopted a new technique to reduce duplication in the snippets of expanded sitelinks.

Local Changes:

  • More local sites from organizations. [project codename "ImpOrgMap2"] This change makes it more likely you'll find an organization website from your country (e.g. mexico.cnn.com for Mexico rather than cnn.com).
  • Improvements to local navigational searches. [launch codename "onebar-l"] For searches that include location terms, e.g. [dunston mint seattle] or [Vaso Azzurro Restaurant 94043], we are more likely to rank the local navigational homepages in the top position, even in cases where the navigational page does not mention the location.
  • More comprehensive predictions for local queries. [project codename "Autocomplete"] This change improves the comprehensiveness of autocomplete predictions by expanding coverage for long-tail U.S. local search queries such as addresses or small businesses.

Images & Videos:

  • Improvements to SafeSearch for videos and images. [project codename "SafeSearch"] We've made improvements to our SafeSearch signals in videos and images mode, making it less likely you'll see adult content when you aren't looking for it.
  • Improved SafeSearch models. [launch codename "Squeezie", project codename "SafeSearch"] This change improves our classifier used to categorize pages for SafeSearch in 40+ languages.
Sources : http://www.seroundtable.com/google-updates-april-15111.html

Monday, May 7, 2012

Link Building A-Z Guide – Definitions & Terms

When those of us in search marketing talk and write about link building, we tend to use terms that we think are very commonly understood. We bandy around phrases like "CTR on page 1 of the SERPs is better than on page 2" and "god help me if my content gets deindexed."

However, for the new guys and gals out there (and that includes people who are both learning about building links and clients who seek link services) this link building guide will help define and explain some of the more common link building terms, from A to Z.

A - Anchor Text, AC Rank, Actual PageRank

Anchor text
The content inside the anchor element (<a>anchor text</a>), designed to give readers and search engines an idea of what the content you are pointing to is about. The anchor element contains an href attribute where the target of the link is designated. The anchor element is often called an anchor tag.
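For example, a hypothetical link (example.com and the page path are placeholders):

  <a href="http://www.example.com/blue-widgets">blue widgets</a>  <!-- example.com and the path are placeholders -->

Here "blue widgets" is the anchor text and the href attribute holds the link target.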

AC Rank
Majestic SEO's measure of a page's importance, on a scale of 0 to 10. It can be considered an alternative to Google's PageRank and is used in various link tool programs. The AC Rank stands for A Citation Rank.

Actual PageRank
Google's internal value for your page. It is not what you see in a tool or on your toolbar, as those aren't updated frequently enough to reflect the true value.

B - Backlink Profile, Blog Network, Bing, Blekko, Bait

Backlink profile
A term used to describe the links coming into a site from sources other than the site itself.

Blog networks
Exactly what they sound like: networked blogs. Their importance in link building has recently been compromised as several high-profile and large networks (e.g., BuildMyRank) have been devalued.

Bing
The most popular alternative to Google's search engine at the current time, owned by Microsoft.

Blekko
Also a great alternative to Google and prides itself on being a spam-free search engine. It has some great features that can help you when link building.

Bait [link bait]
Content that is specifically designed in order to naturally attract links.

C - Conversion, CTR, Content

Conversion
A term used to describe an event where a user performs a certain action that is valuable to you as a site owner. Some webmasters view a contact email as a conversion, for example, while others simply view an actual sale as one.

CTR [click through rate]
A term associated with PPC but becoming more popular in the general SEO vernacular, as some speculate that it may become more important in ranking. Your CTR is the number of times your listing is clicked divided by the number of times it is shown (each appearance is triggered by a search and referred to as an impression in PPC), expressed as a percentage. For example, 5 clicks from 200 impressions is a 2.5% CTR.

Content
The subject matter, in text and images, of your site and its pages. Content is also used to describe anything that your brand produces, whether it's a guest post on another site, an article that you distribute, a press release, or an infographic.

D - Deep Link Ratio, Directories, Drain Rank, Deindexed

Deep link ratio
The percentage of your inbound links that point to your subpages rather than just your home page. For example, if 80 of your 100 inbound links point to subpages, your deep link ratio is 80%. Views differ widely on what ratio is ideal.

Directories
One of the most consistent ways that people have built links throughout the years. There are paid and free directories, directories that accept all submissions, and many that are quite picky about what they'll accept. While they have fallen out of fashion somewhat recently, they are still a valid source of traffic.

Drain rank
This refers to the idea that linking out to other sites drains your PageRank.

Deindexed
Refers to being thrown out of a search engine and removed from its database.

E - External Link, Equity

External links
Links that go from your site to someone else's site. Some people nofollow them in order to prevent the target sites from receiving any link juice.

Equity
The group of links pointing to your site at a point in time.

F - Followed Link, Footer Link, Footprint

Followed links
Links that are allowed to send link juice to their targets. For ranking purposes, these are the kind of links that you want. A link without a rel=nofollow is a followed link.

Footer links
Links that appear in the footer of a site, generally on every page. These were originally so abused that many SEOs now consider a footer link to be very poor. However, there are still legitimate footer links.

Footprints
Ways of identifying patterns in how you build links. For example, if 75 percent of your links come from non-U.S.-hosted sites and all sit on blogrolls, that's a big footprint. A "natural" backlink profile, by its organic nature, should not have many obvious footprints, so easily identifiable footprints are a potential bad sign for your site. However, you can have a good footprint too, such as a lot of great, authoritative links from respected news sources because your site is constantly being cited there.

G - Google, Guest Posting, Graph

Google
So powerful, it's now a verb. No matter what anyone says, almost all of us market to what Google wants.

Guest posting
A popular way of building links and creating new content. Many sites actively recruit for new guest posters and some are amenable to the idea when contacted. The whole idea of a guest post is to raise exposure for a brand on another site, but it's quickly becoming a spammy and abused method. However, when done correctly, guest posts can bring you some fantastic traffic.

Link graph
Generally speaking, the link graph is a representation of links for sites. It can be thought of as being the "normal" for a niche of sites but may also refer to links for a certain market sector/keyword/locality/etc. You can use a link graph for competitive research to define what everyone else is doing and see where you stand in relation to that. A complicated thing to define, as it's not a discrete concept.

H - Href, Hashtag, Hidden Link

href
An HTML attribute that lists the target of a link. An example is <a href="http://www.w3schools.com">Visit W3Schools</a>.

Hashtags
Widely used on social network platforms in order to associate a tweet/comment with something. They begin with #. On Twitter, hashtags are used to help trend certain ideas. For link building purposes, hashtag searches on Twitter are useful for finding good potential link targets.

Hidden link
A link that is intentionally coded in order to not appear as a link. It can be hidden using a text color that is the same as the background, placed inside an irrelevant image, font size 0, etc. These are viewed as manipulative and deceptive and can cause Google to remove your site from their index.
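Purely as an illustration of the pattern described above (not something to use), a link hidden by giving it the same color as a white page background might look like this, with example.com standing in for the target:

  <a href="http://www.example.com/" style="color:#ffffff;">hidden anchor text</a>  <!-- hypothetical example; assumes a white page background -->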

I - Image Link, Internal Link, Inbound Link

Image link
An image that is linked to a target. Image links are part of a natural link profile and can pass link juice, but they do not include anchor text as regular text links do. Instead, they use alt text (which is also used by screen readers) to give information about the link target.
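A sketch of a simple image link, with a made-up domain, file name, and alt text:

  <a href="http://www.example.com/"><img src="logo.png" alt="Example Widgets logo"></a>  <!-- all names here are placeholders -->

The alt text ("Example Widgets logo") plays roughly the role that anchor text plays in a regular text link.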

Internal Link
A link from one page of your site to another page on your site.

Inbound links
Links coming to your site from a site other than your own. The anchor text of an inbound link supposedly tells the search engines what your page is about, thus helping you rank for that term.

J - Juice

Juice
A term used to describe the benefit received from a link, also referred to as link juice.

K - Keyword

Keywords
Words or phrases for which you want to rank in the search engines. They should be present in your copy and in links pointing to your site.

L - Link Profile

Link profile
The collective group of sites that link to you.

M - MozRank

MozRank
A measure of the link popularity of a webpage from SEO software provider SEOmoz. It is becoming a more important metric by the day, almost akin to PageRank.

N - Nofollowed Link

nofollowed link
These are indicated by placing a rel="nofollow" into the link code. A nofollow is designed to tell Google that the link should not pass value to the target. Nofollows are also used internally for PageRank sculpting and to indicate that a link is sponsored/paid. Nofollow links are not good for ranking purposes but they can be good for traffic.
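For example, a hypothetical sponsored link marked as nofollowed (example.com is a placeholder):

  <a href="http://www.example.com/" rel="nofollow">sponsored link</a>  <!-- example.com is a placeholder -->

Remove the rel="nofollow" and the same link becomes a followed link.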

O - Outbound Linking

Outbound linking
The practice of linking from your site to another. Many people nofollow these links in an effort to conserve link juice, but that practice has become a bit more frowned upon recently.

P - PageRank, Panda, Penguin, Paid Links

PageRank
Google's measure of a page's importance. There's a difference in what you can see as your PageRank and what Google thinks it is.

Panda
A Google algorithm update that can make grown men cry. It first struck fear into our hearts in February 2011 and was an effort to push higher-quality sites up in the SERPs. Since the first update, we've seen several more. There's way, way too much to go into here, but Search Engine Watch has covered it extensively.

Penguin
A new search algorithm designed to detect, and boot out, spam. Like Panda, it made us cry, and several sites were "accidentally" affected by it, so badly that there's actually a form to fill out if you think you're one of those accidental cases. Again, there's too much to go into in this guide.

Paid links
Refers to links that are bought and placed on a website with the intention of helping the buyer's website rank better. When not indicated as such, they are a violation of Google's guidelines and a risky tactic. Paid links can be problematic both for the site selling them and for the webmaster buying them, as both practices can get you penalized. According to Google, if a link has been purchased, it should be indicated as such with a nofollow.

Q - Query

Query
Simply a question that you ask a search engine or a database, whether or not it's phrased as a question. We refer to queries in terms of how many times someone searches for a keyphrase, and when checking where you rank in an engine for it.

R - Rel, Robots, Redirects, Rot, Rank

Rel
An attribute that gives the relationship, or role, of a link. The current uses critical for link building are to indicate whether a link should be followed (the default) or nofollowed (rel="nofollow").

Robots
Search engine bots, but "robots" can also be slang for the robots.txt file, which gives instructions to search engines about what to do with your site. If you don't want certain pages crawled, you block them in the robots file, which lives at url.com/robots.txt. There are also meta robots tags, such as <META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">.
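A minimal robots.txt sketch, assuming a hypothetical /private/ directory you don't want crawled:

  # /private/ is a hypothetical directory
  User-agent: *
  Disallow: /private/

The meta robots tag shown above does a similar job on a per-page basis.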

Rot
A term used to describe what happens when there are links pointing to pages that are no longer available and not properly redirected or handled.

Rank
Where you show up on the SERPs.

S - Sitewides, Social Signals, SERPs, Spam

Sitewide links
Links that are on every page of a site. You commonly see them in sidebars and footers, and while they once were a pretty easy way to get good rankings quickly, they're no longer viewed so positively. You do tend to find them in almost any backlink profile though, as they are part of a natural profile.

Social signals
Signs that your site/post/article is doing well socially, on the main social network platforms. Social signals are thought to be an ever-increasing method of measuring importance in the search engines and may become a bigger part of algorithms.

SERPs [search engine results pages]
The pages Google, Bing, and others show you after you've performed a search.

Spam
Jokingly defined as "sites positioned above mine," but really anything that clutters the web and makes for a poor user experience. Spam links are links that are irrelevant and low-quality but pursued simply to improve rankings.

T - Twitter, Toolbar Pagerank (TBPR)

Twitter
A social media platform where users communicate through 140 characters or less. It's becoming more and more useful for finding good information as it happens.

Toolbar PageRank [TBPR]
The publicly visible number from 0 to 10 that reflects Google's most recently published estimate of how important your page is. It is not Google's true, internal value for your site.

U - Underline, Unnatural Link Warnings

Underline
Most links are signified by underlining the linked keywords. Because links are commonly styled with underlining, style manipulations that make a link indistinguishable from the surrounding text can be considered a form of hidden link.

Unnatural link warnings 
Like lice, these are something nobody wants to see. They are messages in Google's Webmaster Tools indicating that potentially unnatural links pointing to your site have been detected.

V - Velocity

Velocity
Your link growth speed. It can be measured with Link Research Tools.

W - Webmaster Tools

Webmaster Tools
Google's free platform that you can use to keep an eye on your site in Google's eyes. It can be a first line of defense when you notice any negative changes with rankings and traffic.

X - Xenu

Xenu's Link Sleuth
One of those old-school things that anyone who's been involved in SEO for more than a few years probably loves. Xenu's Link Sleuth identifies broken links on sites.

Y - Yahoo

Yahoo
The other search engine. Many link builders will refer to being listed in the Yahoo Directory, which used to be one of those things that we all recommended. Today, Bing provides the search results you see on Yahoo.

Z - Zzzzz

Zzzzz
Sleep, which you definitely need if you're going to link build. It's tiring work!

Sources : http://searchenginewatch.com/article/2172916/Link-Building-A-Z-Guide-Definitions-Terms