The Google “Horde” Struck Me Down!

Posted: May 30th, 2012 | Author: | Filed under: Natural Language Search | No Comments »
Mongol Horde Attacks

April was a month of changes at Google, with the Penguin update and a change codenamed "Horde." Not everyone is happy with organic search right now, including me, even as a Google stockholder in this depressed economy.

Did My Business Really Suffer in Loss of Organic Rankings From Panda and Penguin, or Was it Something Else?

There were many updates on the list of April 2012 algorithm changes. One is called launch codename "Horde." Was it Penguin or Panda that caused my results to fall? Many of my social network friends, who are personal injury attorneys, are in a mad dash to pay anyone to get them more branded and naked anchors to comply with Penguin and Panda. They want to get back the multiple top slots for the same terms where they were number one, page one, prior to the April algo update. But if you're still number one for one of those terms, and page two for the other, you were probably hit by Horde for being too damn good; now you're a "Domain Crowder." (Damn you!)

Full disclosure: I am a Google stockholder, so I should be very happy with the launch of Horde. But I am not, because it no longer matters as much how terrific your domain is, with its authorship signals, all its great resources, and all its votes. At this point, Google is not weighing these normal ranking factors anywhere near as heavily as it did when ranking organic search in the past.

This April launch is listed as [launch codename "Horde", project codename "Domain Crowding"] here.

What this change has done is make it impossible for a powerful website, even one that survived Penguin and Panda, with hundreds of thousands of backlinks and thousands of pages of AWESOME content, to hold the first page, number one slot on Google for every keyword it would "naturally" rank for (because the algo now says this is no longer "natural"?). Even if your domain has better information than the domain now ranking above it, it just "don't matter."

This is the buzz all over the search community, yet no one but me is really talking about Horde. Inferior sites that happened to survive Penguin, because they have only a few backlinks that happen to fit some mathematical calculation of 2-5% exact-match keyword weight, are taking over that number one, page one slot, while the better, older site with the same or similar "weight" gets knocked down.
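To make the "2-5% exact-match weight" idea above concrete, here is a minimal sketch of how one might measure the exact-match share of a backlink profile. The sample anchors, the helper name, and the 2-5% band as a target are all illustrative assumptions, not Google's actual math.

```python
# Toy sketch: estimate what fraction of a backlink profile uses
# exact-match anchor text for a given money keyword. The sample
# anchors below are invented for illustration.

def exact_match_ratio(anchors, keyword):
    """Return the share of anchors that exactly match the keyword."""
    keyword = keyword.lower().strip()
    matches = sum(1 for a in anchors if a.lower().strip() == keyword)
    return matches / len(anchors) if anchors else 0.0

backlink_anchors = [
    "Los Angeles personal injury lawyers",   # exact match
    "Smith & Jones Law",                     # branded anchor
    "www.example-lawfirm.com",               # naked URL anchor
    "click here",
    "Los Angeles personal injury lawyers",   # exact match
    "great article on injury law",
]

ratio = exact_match_ratio(backlink_anchors, "Los Angeles personal injury lawyers")
print(f"Exact-match share: {ratio:.1%}")   # 2 of 6 anchors, well above a 2-5% band
```

A profile like this one, with a third of its anchors as exact matches, is exactly the kind of "weight" the attorneys above are scrambling to dilute with branded and naked anchors.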

For example: if you had a site that ranked number one for both "Los Angeles personal injury lawyers" and "Los Angeles personal injury attorneys", you will find that you now rank for only one of those terms, and an inferior, newer site occupies the other number one slot.

Is Google Horde Bad News for SEO People?

The complaint I am hearing from many of my attorney friends is: "My SEO guy ranked me for all these exact-match terms, and poof, they are gone. Now he wants me to pay him to get me a bunch of naked and branded anchors, whatever those are, so I can comply with Panda and Penguin."

Not so fast. My research indicates that working hard to rank for many money keywords in the SERPs is IMPOSSIBLE due to Horde. In fact, whether out of confusion, distrust of SEO people, Penguin, or all of the above, most agree that PPC is now required to show up for those money keywords that would normally be yours just from having a better quality site. It does not even matter if the site has backlinks from trusted sites for those terms. Am I wrong? I don't think I am. Read the update list for April and see for yourself.

Is PPC As Good As Organic for Getting Clients?

This is probably bad news for SEO people, small law firms, and other small businesses, who will now in all likelihood be paying $50 to $100 per click to get the same calls as before (though I have to say, the calls from PPC are never anywhere near as good as those from people who researched the organic results to find me).

From personal experience, no matter what keyword is used, most of the calls my law firm has gotten from PPC are not viable injury claims. Usually it is an SEO company telling me I need to hire them to get better PPC rankings, a competitor clicking on my ad to knock out my bids for the day, or a person complaining about an inconvenience who wants money, such as someone who had to wait in line too long at a fast food restaurant and believes they can file a lawsuit. In any event, each call is costing the firm at least $20.00 a click, and just try disputing those clicks!

Looks Like The Happy Times of Organic Search are Over

The writing is on the wall. It appears this "Horde" change will leave small law firms, and other businesses that have relied on one domain for organic rankings, facing either a shutdown, layoffs (people who would normally do in-house SEO and write content will no longer fit the cost-to-benefit math), or spending more to create many targeted domains.

Just having a decent PPC campaign for a law firm is going to cost at least $10,000 in after-tax dollars a month, based upon my research and past testing. That will mean laying off two or three part-time or full-time employees just to pay for the campaign.
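The back-of-the-envelope math behind that figure can be sketched using the numbers quoted in this post ($50-$100 per click, $10,000 a month). The 5% click-to-call rate is a made-up assumption for illustration only.

```python
# Back-of-the-envelope PPC math using the figures quoted in this post.
# The 5% call rate is a hypothetical assumption, not measured data.

monthly_budget = 10_000                              # after-tax dollars/month
cost_per_click_low, cost_per_click_high = 50, 100    # quoted per-click range
call_rate = 0.05                                     # assumed: 1 in 20 clicks calls

clicks_best = monthly_budget // cost_per_click_low   # cheaper clicks
clicks_worst = monthly_budget // cost_per_click_high # pricier clicks

print(f"Clicks per month: {clicks_worst}-{clicks_best}")
print(f"Calls per month (assumed {call_rate:.0%} rate): "
      f"{int(clicks_worst * call_rate)}-{int(clicks_best * call_rate)}")
print(f"Cost per call: ${monthly_budget / (clicks_best * call_rate):,.0f}"
      f"-${monthly_budget / (clicks_worst * call_rate):,.0f}")
```

Under those assumptions the budget buys only 100-200 clicks and a handful of calls a month, which is why the cost per call, let alone per viable claim, gets ugly fast.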

Is Horde Good for the User and Marketer?

The good side to "Horde", if there is one, according to Google, is to make things "fair". But surviving these updates meant to rid the internet of poor quality websites is affecting every site, including those of us who have worked tirelessly to build strong backlinks and quality content. Where is that going to leave us? Closing down our online businesses, laying off employees, or giving up on Google altogether? People I have spoken with about this are beginning to turn to Bing, which has an excellent reputation for reversing suspicious PPC charges.

What About Microsites as a Way to Get Back My Old Keyword Slot?

I think this is one way to try to get back your rankings. The days of the megasite are apparently over, so you can create a microsite and try to recover that way. But beware: if that site has substantially similar content to your other site, you may be in for a penalty or a filter. The landmine of vague and ambiguous organic search anti-spam rules is another reason PPC is looking more and more attractive. So hard choices are ahead for online marketers and SEO companies. Sure, there will always be people selling "Panda Recovery" and "Penguin Recovery", but whether anyone can recover from Horde remains to be seen. This change forces hard cost-to-benefit choices about hosting, content spinning, staff, and the rest. You may just want to take another look at PPC, or Bing PPC. My experience with Bing is that they readily reverse bogus clicks, whereas Google made the cost to benefit of disputing bogus clicks unmanageable. I will discuss Local Search and Places as a method of partial recovery in a later article. I would appreciate any comments or input, as this is a lot of theory.

Penguin Algorithm Update 1.1 Pushed by Google

Posted: May 27th, 2012 | Author: | Filed under: Natural Language Search | No Comments »

Earlier, I talked about the 40 changes to Google Search in February 2012. And before I could even discuss how to get back your rankings after suffering from the Panda 3.3, 3.4, and 3.5 and Penguin 1 updates, all of a sudden, on Friday, Penguin 1.1 came along. The better news here is that there is a form you can fill out and send to Google if you feel your site was unfairly de-ranked by this particular update to the Google organic algo. So just when you thought you could enjoy your vacation weekend, whammo! Late Friday, Google put out the first update to its web spam campaign, the Penguin algorithm.

Yep. Matt Cutts announced on Twitter that it was a data refresh, and said it would impact less than one-tenth of a percent of English-language searches on Google. Webmasters and SEO people had been speculating in recent weeks about whether Google had already released an update, or even more than one. Cutts said this is the first update Google has released since Penguin launched on April 24th.

Cutts and Google said that Penguin is an algorithm change targeting web spam, specifically websites they feel are violating Google's quality guidelines. The SEO industry was quick to question whether the release of Penguin made search results better or actually made them worse. Since it is an algorithm change, Google said it was not going to handle reconsideration requests made via Webmaster Central, but it did set up a form for webmasters to use if they felt Penguin targeted their sites by mistake. After the Friday update, the form remains online.


Google’s 40 Highlighted Changes that Improve Search Quality

Posted: May 26th, 2012 | Author: | Filed under: Natural Language Search | No Comments »

This took a while, and I am still writing my article on the new weighted scores we will need for the most recent versions of Penguin and Panda. I am sure many of you have always focused on exact-match keywords and saw a drop in rankings in the last three months, and maybe an increase for pages that were not so heavily optimized. That is for a later discussion, but let's look at some of the other changes to organic search on Google since February of 2012.

There were many improvements made during February alone, some of which SEO companies hate and many webmasters like me love. It was a record month for what I have seen Google do over the years to improve search quality, with a "reported" 40 changes. Google says its search systems are continually being improved to provide the best UI elements, sitelinks, indexing, related searches, SafeSearch, synonyms, autocomplete, and so much more. While the individual changes may seem simple, they are vitally important and add up to a radically improved search engine. I am glad, seeing as how I just bought some shares of Google.

Google’s February’s list of changes:

  • Update for SafeSearch – Due to improvements made on the way we deal with adult content, it is much more accurate and robust. Now it is less likely that you will have to deal with adult content that is irrelevant and showing up on various queries.
  • Expanded sitelinks that are duplicated less often – Signals have been adjusted which helps to lower duplication in snippets of expanded sitelinks. Snippets that are relevant are now generated, which are centered much less on the query and much more on the content of the page. [‘thanksgiving’ is the launch codename, ‘Megasitelinks’ is the project codename].
  • Two old query classifiers have been disabled – As new classifiers and signals are applied to rank search results, some of the older procedures become outdated. Two of the old classifiers, which related to query freshness, are disabled with this improvement. ['Mango' is the launch codename, 'Freshness' is the project codename].
  • Spelling of the Korean language is improved – When a Korean query is performed in the wrong keyboard mode by the user (known as an input method editor or ‘IME’), they will find the correction of spelling to be improved. To be more specific, this change helps a user that accidentally enters Latin queries in Hangul mode and vice-versa.
  • A lot more coverage for searches that are related – A new source of data is brought to you with this launch, which helps the generation of the section ‘Searches related to’.  Coverage is increased considerably, which means it is a feature that will appear frequently for a larger amount of queries. An added advantage is your searches are more refined with the search queries provided. [‘Fuzhou’ is the launch codename].
  • Interval-based tracking of history for easier indexing – This improvement changes the procedures used for document tracking. ['Intervals' is the project codename].
  • Spell correction for English has been improved – The quality of English spelling correction has been improved, especially for queries that are rare. This has been done by simply making just one of the functions of scoring much more accurate. [‘Kamehameha’ is the launch codename].
  • YouTube predictions are more local – The ranking for YouTube predictions has been improved to provide predictions that are more local to the user. For example, when the query [lady gaga in] is typed on YouTube's United States version, [lady gaga in Times Square] might be predicted, while [lady gaga in India] may be predicted on YouTube's Indian version. ['Suggest' is the project codename].
  • The procedures for autocomplete have undergone a minor tune-up – Inappropriate and offensive terms fall under policies that are very narrow for autocomplete. The procedures that are used to implement our policies on this subject continue to be refined with this improvement. [‘Suggest’ is the project codename].
  • The expanded sitelinks categorizer has been tweaked – This improvement adjusts the signal we use to identify duplicated snippets. The categorizer we were using was not performing well for expanded sitelinks, so we no longer apply it in such cases. The end result is sitelinks that are more relevant. ['Snippy' is the launch codename and 'Megasitelinks' is the project codename].
  • Local results that are improved – The new system we’ve launched for locating a user’s city is much more reliable. This gives us the power to spot when documents and queries are local to a user.
  • Three new classifier languages have been added for error pages – We use signals to detect "crypto 404" pages (soft 404s): pages that return valid text to the browser, but where the text is only an error message such as 'Page not found'. This classifier has been extended to Dutch, Portuguese, and Italian.
  • Thumbnail sizes appearing on the results page are more consistent – For a large portion of image content that shows up on the results page, the size of thumbnails has been adjusted. This improvement provides an experience that is more consistent across tablet and mobile, and result types. The new sizes of thumbnails applies to movie posters, news results, application and recipe results, book results, shopping results, and more.
  • Images that are fresher – Our signals have been adjusted for surfacing fresh images. When images appear on the Internet, we are now able to surface fresh images much more frequently. [‘tumeric’ is the launch codename].
  • Link evaluation – Link characteristics are often used to help determine the topic of a linked page. The way links are evaluated has changed, and a link-analysis method we used for many years is being turned off. Parts of our scoring are regularly turned off to keep the system cleaner, easier to understand, and easier to maintain.
  • Mobile flight search – Coverage has been increased and features improved, so finding flights on your mobile device is as easy as locating them directly from your desktop. With this improvement it is easier than ever to find flights departing from the U.S.
  • Official pages are detected more accurately – More accurate identifications can be made due to the adjustment that has been made with the way official pages are detected. This means pages will no longer be misidentified as being official pages. [‘WRE’ is the launch codename].
  • The data refresh signal for related searches has been updated – The queries that many users type in succession are one of the signals we use to generate the 'searches related to' section. If 'apple' is frequently searched directly after 'banana', it is often a sign that the two words are related. The models generating these refinements have been updated, providing users with more relevant queries.
  • SafeSearch detection in image search has been improved – The signals we use to detect adult content are improved in image search. This aligns other signals with the ones we use for other search results. [‘Michandro’ is the launch codename and ‘SafeSearch’ is the project codename].
  • Travel-related searches have been improved – Triggering for flight-related search queries has been improved, so users benefit from a better experience with the Flight Search feature and more accurate flight results. ['nesehorn' is the launch codename].
  • URL country information has been refreshed – Country associations for URLs have been updated and use data that is more recent. [‘country-id data refresh’ is the project codename and ‘longdew’ is the launch codename].
  • Spam update – Weaknesses in spam protections have been located and fixed.
  • An update to ‘Site’ query has been made – Query ranking is improved with this change for those using the ‘site’ operator. This is done through an increase of diverse results. [‘Semicolon’ is the launch codename; ‘Dice’ is the project codename].
  • Freshness improvements – New signals have been applied; helping to surface content that is fresh much quicker than before. [‘iotfreshweb’ is the launch codename, ‘Freshness’ is the project codename].
  • Future concert dates – Relevant data is aggregated to provide upcoming concert tour dates from many websites.
  • Shopping rich snippets launched internationally – This makes it easier to identify sites providing products relevant to the consumer's needs. Things like availability, product prices, review counts, and ratings are highlighted. ['rich snippets' is the project codename].
  • In Universal Search our image size can be expanded – The quantity of images we show has been expanded in Universal Search. This provides images that are relevant to sets of searches that are larger. [‘terra’ is the launch codename, ‘Images Universal’ is the project codename].
  • Account settings page now has visual refresh – A higher level of page consistency is provided with the completion of the account settings page visual refresh.
  • Google Korea search results are better organized – Korean searches have been improved by organizing results into blog, news, and homepage sections. ['smoothieking' is the launch codename, 'Sokoban4' is the project codename].
  • Synonyms for foreign languages are improved – This expands an improvement previously made for English to other languages, so query terms return relevant pages containing synonyms. ['floating context synonyms' is the launch codename, 'Synonyms' is the project codename].
  • Health searches have been improved – Searches for health symptoms now result in related health conditions that may apply, making it easier to refine searches.
  • Web History is now available in 20 new countries – This allows users to search and browse through webpages they previously visited. The improvement has launched in Morocco, Malaysia, Estonia, Pakistan, Nigeria, the Philippines, the Republic of Moldova, Belarus, Kuwait, Jamaica, Kazakhstan, Azerbaijan, Iraq, Bosnia and Herzegovina, Sri Lanka, Luxembourg, Tunisia, Trinidad and Tobago, Lebanon, and Ghana. Users must have a Google Account with Web History enabled.
  • An update has been made to the Google bar – The drop-down Google menu has been replaced in this update with a set of links that are expanded and more consistent. ['Kennedy' is the project codename].
  • Signals that spike topics have been consolidated – This improvement makes it easier for spiking topics to be detected and it eliminates system redundancy. Several signals are used to detect popular topics that are spiking. Signals can now be computed in real-time rather than having to be processed offline. [‘news deserving score’ is the launch codename, and ‘Freshness’ is the project codename].
  • Related searches for images are much better – The update on related searches allows users to explore topics visually, making it simpler to locate perfect images.
  • The Turkish weather search feature is better than ever – Turkish weather forecasts are presented more accurately and more often directly on the search results page, the result of fine-tuning the signals used.
  • Ranking for local search results are improved – By putting more reliance on the results for main searches and using this as a signal, Local Universal results triggering has been greatly improved. [‘Venice’ is the launch codename].
  • Update to Panda – The Panda system's data has been refreshed in this launch, so it responds more accurately to recent changes on the web.
  • Video channel snippets have been improved – This is an improvement that provides direct link coverage that is expanded and a higher level of quality rich video snippets.
  • Universal news coverage is improved – This launch fixed a bug that caused News Universal results to fail to appear even when testing indicated they would be useful.

2010 Google PageRank Update Happening Now | April 3, 2010

Posted: April 3rd, 2010 | Author: | Filed under: Natural Language Search | No Comments »

Yes, that's right, the April 3, 2010 Google PageRank update is happening right now. Install the Google Toolbar to see how much trust and authority your web site has with G. I was happy, since my main personal injury lawyer site went back to its rightful place as a PR5 (PageRank 5). You will recall that an SEO company had threatened to report my sites to Google and have them de-indexed unless I used the SEO company. ("We work with Yahoo! and Google to guarantee you first page results... blah blah blah!")

Sure enough, several of my sites went to PR 0 and my main site dropped to PR 4. But luckily, I filed a reconsideration request and was able to get my blog back to PR 3, and now, finally, my main domain is back up!! YES!!!

Many people say Google PageRank doesn’t matter, but I disagree. It is a direct reflection on how good of a webmaster you are and how good you are at following the Google webmaster guidelines. So tell me, did your site gain or lose PR today?

How Tweets Are Ranked In Google Real-Time

Posted: March 24th, 2010 | Author: | Filed under: Natural Language Search, Tech, Twitter | Tags: | No Comments »

Twitter is full of tweets that are irrelevant. To filter those out, Google has a new tool called real-time search. It surfaces blog posts and tweets from the web right away, while filtering out tweets that are not relevant.

This video from Google goes into more detail:

An algorithm is the main way Google's search engine ranks pages, and PageRank is vital to it. A tweet can be marked as important based on the author's followers: the rank can increase or decrease depending on how many followers the account has. In effect, Twitter members decide which tweets are most liked and important when they follow people.
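Google has never published its real-time ranking formula, so the following is only a toy follower-weighted score to illustrate the idea above. The function name, the log weighting, and the sample tweets are all invented assumptions.

```python
import math

# Toy illustration of follower-weighted tweet scoring. Google's actual
# real-time ranking formula is not public; everything here is invented.

def tweet_score(followers, retweets):
    """Score a tweet by its author's reach and its pickup.

    Log-scaling keeps a million-follower account from automatically
    drowning out everything else: reach matters, but is not the
    whole story.
    """
    return math.log10(followers + 1) + 2 * math.log10(retweets + 1)

tweets = [
    {"text": "celebrity one-liner", "followers": 2_000_000, "retweets": 3},
    {"text": "breaking local news", "followers": 5_000, "retweets": 400},
]

# Heavily retweeted news from a small account can outrank a celebrity.
for t in sorted(tweets, key=lambda t: tweet_score(t["followers"], t["retweets"]), reverse=True):
    print(f"{tweet_score(t['followers'], t['retweets']):.2f}  {t['text']}")
```

Under these made-up weights, the widely retweeted local-news tweet scores above the celebrity's, which matches the idea that followers feed the rank but engagement can move it either way.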

With this new "latest results" tool, Google could very well change how real-time information is surfaced on the web.

Did My Site Lose Google Page Rank Due To Slow Site Speed?

Posted: March 22nd, 2010 | Author: | Filed under: Natural Language Search | Tags: , | No Comments »

Did my site lose PR because Google is now including site speed as a ranking factor? This is a question many who lost PR in the January 2010 update, me at least, were asking.

The Buzz in 2009

Matt Cutts announced: “Historically, we haven’t had to use it in our search rankings, but a lot of people within Google think that the web should be fast,” and “It should be a good experience, and so it’s sort of fair to say that if you’re a fast site, maybe you should get a little bit of a bonus. If you really have an awfully slow site, then maybe users don’t want that as much.”

Matt Cutts Later Clarifies Speed as a Ranking Factor Issue, Noting that Relevance is Still the “Primary Component”

Matt Cutts set the record straight in 2010, at least to some degree from my perspective. It does appear that, all things being equal as far as the number of votes pointing back to a site, some preference will be given to the faster site. Cutts is reported to have said "…If you have two sites that are equally relevant (same backlinks…everything else is the same), you'd probably prefer the one that's a little bit faster, so page speed can be an interesting theory to try out for a factor in scoring different websites…"

So it is clear, then, that you should be concerned about site speed, as it could be one of the many ranking factors Google uses, for both Local Search and organic search, in a super competitive industry. And lately, everything is super competitive when it comes to ranking in Google.

The Yellow Pages seem to be tanking as a form of advertising. I never use them. Do you? I mean other than as a doorstop? So as you can see, you need to keep an eye on what is coming and be ready. Site speed is important to some people at Google, as it should be. Slow sites are a bummer. Slow connection speeds add to the problem. So find a way to speed up your site.
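If you want a quick read on your own site's speed before worrying about this factor, here is a minimal sketch using only the Python standard library. The URL is a placeholder, and raw transfer time is only a rough proxy; a real audit would also look at rendering.

```python
import time
import urllib.request

# Minimal sketch: time how long a page takes to download.
# Replace the placeholder URL with your own site. Transfer time
# alone is a rough proxy for the "site speed" discussed above.

def fetch_time(url, attempts=3):
    """Return the average number of seconds taken to fetch a URL."""
    total = 0.0
    for _ in range(attempts):
        start = time.perf_counter()
        with urllib.request.urlopen(url, timeout=10) as response:
            response.read()          # pull the full body, not just headers
        total += time.perf_counter() - start
    return total / attempts

# Example usage (placeholder URL):
# print(f"Average fetch time: {fetch_time('https://example.com'):.2f}s")
```

Averaging a few attempts smooths out one-off network hiccups, which matters because, as noted above, slow connections add to the problem independently of your server.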

What Is Google PageRank And The Google PageRank Toolbar?

Posted: February 19th, 2010 | Author: | Filed under: Natural Language Search | Tags: , , , , , , , , , , | No Comments »
Exploring PR.

Owning a website does not mean anything until you earn Google's trust. When you own a website, it is important to know what Google PageRank is and what it does. Pages on the web have numeric values, and those values make a huge difference in how a page might "rank" with Google. PageRank, or "PR", is a method Google uses to express how much trust it gives your website and/or its individual pages, based on a voting system. PR appears in your Google Toolbar as a green meter. If you run your mouse cursor over the Toolbar, you will see a number from 1 all the way to 10; otherwise the number is concealed and all you will see is the green (or in some cases white or clear) bar. Typically, if the PR bar shows only a sliver of green to the left, it is a PR 1-3. From PR 4-10 the bar shows more green as the numeric value increases, so a PR 10 will be solid green. See the example below:

PageRank 2


Example of a PR7 Value

When someone "votes" for a website's content by providing a "backlink", this gives it a higher PR rating, and the PageRank goes up. A vote is typically cast when a "do follow" site provides a link pointing back to your website; this is called a "backlink". So if I put a link on my website with your URL (your web address), pointing back to a page on your site that I like (maybe one of your child pages has an article I enjoy), that counts as a vote, because I backlinked to your site.

If the page on my site where I place your link has a high PR, it will pass higher PR to your page, and Google will view that page as a more trustworthy source. If the rating is low, it won't increase as much. When a page has just one PR7 vote from a high-PR site – for example, a home page link on the Matt Cutts Blog above – that PR7 backlink may be viewed as very valuable by Google. It may take hundreds or even thousands of backlinks from PR2 and PR3 sites to get the same PR boost.
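The vote-passing idea above can be sketched with the textbook PageRank iteration. This is the published academic formulation, not Google's production algorithm, and the four-page link graph is made up for illustration.

```python
# Textbook PageRank iteration illustrating how "votes" (links) pass
# value: a link from a well-linked page is worth more than a link
# from an obscure one. The four-page graph below is invented.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pages = list(links)
    rank = {p: 1 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                      # dangling page: spread evenly
                for p in pages:
                    new[p] += damping * rank[page] / len(pages)
            else:                                 # split this page's vote
                for target in outlinks:
                    new[target] += damping * rank[page] / len(outlinks)
        rank = new
    return rank

graph = {
    "hub": ["yoursite", "other"],   # well-linked page voting for yours
    "other": ["hub"],
    "small": ["yoursite"],          # obscure page's low-value vote
    "yoursite": ["hub"],
}
for page, score in sorted(pagerank(graph).items(), key=lambda kv: -kv[1]):
    print(f"{page:9s} {score:.3f}")
```

Running it shows "hub" accumulating the most rank and "small" the least, which is the same intuition as one PR7 vote outweighing piles of PR2 votes.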

But Google tweaks the PR algo ("algorithm") constantly. (With the advent of social media and Google Buzz, my theory is that traffic to your site is becoming a more significant rating factor than the mere number or PR value of backlinks.) That is why staying up to date with the latest PageRank rules is a vital part of your job as a webmaster. A website owner who does not stay current could get a penalty, or simply grow stale in the eyes of Google, without knowing or understanding why. Many people look at a site's PageRank because it tells them whether a site or page matters. Having a high PageRank with Google is not as easy as it sounds, assuming you even think it sounds easy. Buying backlinks will get you wiped out by Google, but competitors making false, unfounded reports to Google that you are buying backlinks may hurt you just as badly. There are many reasons Google can penalize a site.

In all events, the Google Toolbar PageRank tells people what a website's PageRank was as of the last Toolbar update. If a person lands on a page that the toolbar lists as 0, that page does not have a high PR and may have a long way to go before reaching the top with the Google search engine. Typically, a website that is number one in the search engine results positions will have a higher PR than those that rank lower, but that is not always the case.

The Google Toolbar is not updated often; as a matter of fact, it is updated perhaps less than five times a year. The last Toolbar update was on February 13th and 14th, 2010. This gives website owners time to add valuable content to their websites and promote their sites, so that when the next update happens they have a chance to make the PR toolbar greener. During that waiting period, owners still need to research and study Google's rules to make sure nothing has changed.

One downfall of the Google Toolbar PageRank value is that it can sometimes be spoofed, or given a false ranking. It can be manipulated by people such as hackers who want you to think their site has high PR, so they can try to sell backlinks to webmasters from a "high PR site". Get it?

This is nothing new to Google, and they consider it a bug in the current toolbar. That said, Google has been working hard to patch the bug so people cannot use it to cheat others. Google keeps a close watch on webmasters who try this and has warned them not to do it.

There are some who don't like using the toolbar because they feel it is not what it once was, and the PageRanks can bounce all over the place. What seems to matter these days is how the pages on a website are linked. If they are well linked and have unique content, it should be easy to earn good PageRank without depending on toolbar updates.

Website owners should focus on getting to the top of Google the right way and not worry too much about what the Toolbar says or does not say. Keep indexing new, informative pages, keep promoting the way you should, and in the end it will pay off. Remember, the green toolbar is just one of many factors Google looks at when assigning your site's position in the SERPs.

Content is king so don’t chase PR. Let PR chase your content. If you have great content, people at sites like Amplify and Digg will vote for you. You can get great, trustworthy backlinks from high PR sites with good articles, news and blogs. Good luck!

Read my feed