Wednesday, December 5, 2012

| StuntDuBL
Hit by traffic…not cars.
SEMMY for Social Media
Feb 2nd 2010, 01:56

Thanks to everyone who helped me by voting for the post 7 Reasons Your Social Media Failed (and how to fix it) for the 2010 Social Media SEMMY. I’m honored that my writing is held in such high regard. Getting something like this is a bit of a big deal to me – not because I really am this vain (okay, maybe a little vain), but because I know I was nominated and voted for by search and social community folks who appreciated me taking the time to write something decent to learn from and to help make their day to day a bit easier. I’m very lucky to be a part of a great community of search and social marketers who have inspired me to a level of success I really never thought was possible (I mean seriously, writing an acceptance speech? I’m trying to be cool here, but I love you guys:) Thanks to everyone in the community who has ever helped me with something. Keep helping folks; one day it always pays off (if only in beers at conferences and shiny badges:)


Cracking the Google Algorithm, and Understanding Search Patents with Ted "tedster" Ulle
Jan 28th 2010, 22:48

Ted Ulle, aka "Tedster", surpassed a mind-blowing 26,000 posts at WebmasterWorld, where he has been an administrator for years – one of the unsung heroes of SEO and a gracious contributor to one of the web’s most comprehensive and informative SEO forums. Tedster was credited with discovering the possible cause of Google’s 950 penalty, among a multitude of other discoveries and assertions that have helped to shape the thinking of the community over the last decade.

I’m really excited to announce a video that I got a chance to record with Ted, a good friend from WebmasterWorld.com for years whom I refer to as "my Google oracle". The video has been dubbed "Cracking the Google Algorithm", and deals with the influential search patents the search engines have released. If you don’t know Tedster, you had better get to know Ted Ulle, because there are few folks around who understand search algos at such a high level and can create actionable strategy from that understanding. I was really thrilled with how both the interview and the video came out. Ted is in a very elite group of people whom I hold in the highest regard when it comes to opinions on search algos.

What were the top 5 most significant algorithm changes in the last 5 years?

    1. The Jagger Update, and the Big Daddy infrastructure it prepared the way for, was a major watershed. When this happened near the end of 2005, everflux began to show in the SERPs. Rather than once-a-month ranking updates, the ranking shuffle became continual.

      Monthly Google History: http://www.webmasterworld.com/google/3801699.htm

    2. Google’s war on paid links, which began as far back as 2005, raised quite a ruckus. At first Google’s negative actions were taken manually, and then algorithmically. Algorithmic false positives began to confuse things even more, and I wish they had just stopped with showing false PageRank on the toolbar.
    3. Phrase-based indexing, as described in the 2006 patents, brought a deeper level of semantic intelligence to the search results. This power continues to grow today. One big effect – it makes over-emphasis on keywords, especially in anchor text, a problem when it used to be an asset. But there was a major advantage for the content writer, who could now throw off the rigidity to a major degree and vary their vocabulary in a more natural way. (A toy sketch of the phrase-based idea follows this list.)

      reference: http://www.webmasterworld.com/google/3247207.htm

    4. Geo-located results began to create different rankings even for various areas of the same city in the US and the UK, starting somewhere around 2005. Anyone who was still chasing raw rankings as their only metric should have quickly learned that the time for a change was long overdue.
    5. Google’s user "intention engine" has had a major effect, and that rolled out in a big way in 2009. This was coupled with a kind of automated taxonomy of query terms. Now, sometimes a certain kind of site will just never rank for a certain keyword, no matter what they try. The site’s taxonomy has to line up with the taxonomy of the query term.

      reference: http://www.webmasterworld.com/google/3980481.htm
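
To make the phrase-based indexing idea in point 3 a bit more concrete, here is a toy sketch in Python. It is not Google’s implementation – the related-phrase list, the threshold, and the sample text are all invented for illustration – but it shows the core mechanic the 2006 patents describe: a document that contains a natural number of phrases related to a query looks relevant, while an implausibly high count starts to look like stuffing.

```python
# Toy reading of the phrase-based indexing idea from the 2006 patents.
# The related-phrase list and the "expected" threshold are invented for
# illustration; a real system would learn these from its index.

RELATED_PHRASES = {
    "digital camera": [
        "memory card", "megapixel", "optical zoom",
        "image sensor", "battery life", "photo printer",
    ],
}

def related_phrase_count(text: str, query_phrase: str) -> int:
    """Count how many known related phrases appear alongside the query phrase."""
    text = text.lower()
    return sum(1 for phrase in RELATED_PHRASES.get(query_phrase, []) if phrase in text)

def classify(text: str, query_phrase: str, expected_max: int = 4) -> str:
    """More related phrases usually means more relevant -- up to a point.
    A count far beyond what natural writing produces reads as a spam signal."""
    count = related_phrase_count(text, query_phrase)
    if count == 0:
        return "probably off-topic"
    if count <= expected_max:
        return f"relevant ({count} related phrases)"
    return f"possible keyword stuffing ({count} related phrases)"

natural = "Our digital camera review covers the image sensor, optical zoom and battery life."
stuffed = ("digital camera memory card megapixel optical zoom image sensor "
           "battery life photo printer digital camera digital camera")

print(classify(natural, "digital camera"))   # relevant (3 related phrases)
print(classify(stuffed, "digital camera"))   # possible keyword stuffing (6 related phrases)
```

This is also why varying your vocabulary, as Ted suggests, is the natural-writing behaviour rather than a trick.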

Which will be the top 5 changes in the next 5 years?

    1. Google will improve their algorithm that tries to identify "bad", or manipulative, links. They can already nullify links at the level of the site, the page, or even the individual link, and the heuristics will get more precise going forward. It’s also hard for webmasters to reverse engineer this approach, because there is no major symptom that stands out, like there is with a penalty.
    2. So "bad" link nullification becomes a kind of stealth action for Google. Many have noticed that PR on the whole seems to be harder to get. Sites that used to have a home page PR 8 may now be a PR 6, for instance. This is partly due to the bad-link wipeout that Google is already rolling with.
    3. The beginnings of sentiment analysis may begin to show up in the next few years. I expect to see it first as a rating of where content falls on a fact-to-opinion spectrum. Full sentiment analysis (rating content on a "favorable-to-critical" opinion spectrum) is already in use for some social media monitoring, but that is probably too big a technical challenge to expect Google to apply in the general search results. For example, how can an algorithm recognize irony, where the author is writing words with the opposite of their true meaning?

      However, Google will be rolling with sentiment analysis in some areas. For example, I wouldn’t be surprised to see it employed in AdWords Quality Scores at some point within five years.

    4. Another place Google might experiment with sentiment analysis is in their experimental "real time" search – Twitter integration and so on. However, the pitfall with sentiment analysis is that Google would also begin to INFLUENCE opinions, rather than just making them findable. In areas like politics this could be a very slippery slope.
    5. Finally, there’s one area where Google may legally need to integrate some sentiment analysis, and that’s in the Search Suggestions that tempt the search user with ideas as they type. Google lost a court case in France in 2010, and they were required to remove the word "scam" from one brand’s search suggestions.

      Those search suggestions are easily spammed, especially on brand names, and Google needs to find a good algorithm to limit their exposure to slander and libel claims. Sentiment analysis could be at least part of the answer.
      reference: http://www.bigmouthmedia.com/live/articles/google-scam-suggestion-condemned-by-high-court.asp/6680/

    6. Additionally, site speed will be included as a factor over this coming year, and will be refined going forward. This might give a ranking advantage to sites that can afford a Content Delivery Network (CDN).

      However, you don't have to fork over the big bucks for Akamai and the like any more, just to gain a speed advantage and overcome latency on the web. There are a number of solid peer-to-peer CDN options these days. In fact, the CDN industry may be a strong growth area as this new ranking factor takes root. Google has a vision for what the web SHOULD be like, and they are pushing it quite actively.

      Caffeine will have a major effect on the speed of processing updates to the rankings. A lot of the factors that Google has been mentioning in patents, such as Historical Factors or Phrase-Based Indexing, sound good but don’t seem to be very active right now. With the new and speedy Caffeine infrastructure, a lot of those become computationally more feasible, and will be updated more frequently.

    7. The wild card for me is HTML 5. Google is leading that charge, and how they will treat early adopters will be very interesting to watch. There are many features of HTML 5 that will allow a web author to send VERY clear signals about the page: what’s the content, what’s the menu, what’s just auxiliary information, and so on. (A short sketch of this idea follows below.)

      reference: http://www.smashingmagazine.com/2009/07/06/html-5-cheat-sheet-pdf/
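
As a rough illustration of those "clear signals", here is a minimal sketch of how a parser can pull apart content, menu, and auxiliary information once a page uses the HTML 5 semantic elements. It assumes the beautifulsoup4 library, and the markup is invented for the example – this is not how Google’s crawler works, just the kind of unambiguous structure HTML 5 makes possible.

```python
# Minimal sketch: HTML 5 semantic elements make it trivial for a parser to
# tell content from navigation and asides. Requires: pip install beautifulsoup4
from bs4 import BeautifulSoup

html = """
<body>
  <nav><a href="/">Home</a> <a href="/blog">Blog</a></nav>
  <article>
    <h1>Cracking the Google Algorithm</h1>
    <p>Interview with Ted Ulle about search patents.</p>
  </article>
  <aside>Related posts and ads live here.</aside>
</body>
"""

soup = BeautifulSoup(html, "html.parser")

# Each semantic element announces its own role -- no heuristics needed.
main_content = soup.find("article").get_text(" ", strip=True)
menu_links = [a.get_text(strip=True) for a in soup.find("nav").find_all("a")]
auxiliary = soup.find("aside").get_text(strip=True)

print("Content  :", main_content)
print("Menu     :", menu_links)
print("Auxiliary:", auxiliary)
```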

Which penalties, filters, or bannings are most pervasive?

Keyword stuffed anchor text (internal and external) and backlink manipulation penalties are the most common. Links have long been Google’s ticklish underbelly, and if you mess around too much in there, they will scratch you right back.

What are 4 caveats to tripping filters or incurring penalties?

  • Watch your backlink profile. If you’re not gaining natural backlinks, then don’t try to prop up your lack of natural citations with a lot of manipulation. Instead, put on your thinking cap and understand why no one wants to link to you – and fix that.
  • Beware of overdoing any single keyword. You no longer need to yell at Google to get the point across – and if you do, they’re likely to shout you down. So vary that vocabulary in a natural way. The information retrieval concept to understand is "keyword co-occurrence" (a toy sketch follows this list).

    reference: http://www.webmasterworld.com/google/3336435.htm

  • Focus more on your visitors than you do on the latest SEO methods. Nothing avoids penalties like building for your visitors rather than Google.
  • Beware of duplicating content on the same type of TLD. If you can access the content on one international TLD (com, net, org, etc.) then that’s enough. But don’t worry about duplicates across country-code TLDs, since they don’t compete with each other.
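
Since that "keyword co-occurrence" concept is worth internalizing, here is a toy Python sketch of the idea. The window size, tokenization, and sample sentences are arbitrary choices of mine, not anything Google publishes – the point is simply that naturally written text surrounds a keyword with a varied set of terms, while stuffed text keeps repeating the same few.

```python
# Toy co-occurrence counter: which words show up near a target keyword?
# Purely illustrative -- the window size and tokenization are arbitrary choices.
from collections import Counter
import re

def cooccurrence(text: str, keyword: str, window: int = 4) -> Counter:
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter()
    for i, w in enumerate(words):
        if w == keyword:
            neighbours = words[max(0, i - window): i] + words[i + 1: i + 1 + window]
            counts.update(n for n in neighbours if n != keyword)
    return counts

natural = ("Our mortgage guide explains interest rates, repayment schedules "
           "and how lenders assess a mortgage application.")
stuffed = "cheap mortgage best mortgage low mortgage rates mortgage mortgage deals"

print(cooccurrence(natural, "mortgage").most_common(5))  # many distinct neighbours
print(cooccurrence(stuffed, "mortgage").most_common(5))  # the same few, over and over
```
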
    What are some filters that can be easily identified and overcome, and how?

    Backlink manipulations, whether paid links or merely strategic alliances, usually result in very obvious ranking drops. This jumps out for a site owner because page 1 rankings fall to page 5 overnight. The cure is to back out of it, and submit a reconsideration request where you come clean.

    Overdoing it on internal anchor text is another cause of dramatic re-ranking. You can fall from page 1 to deep in the SERPs overnight. Again, the fix is just to back off, but in this case you don’t need to submit a reconsideration request. As soon as the threshold violations are recalculated, you should pop out of trouble. The phrase-based patents, again, detail the way these thresholds are calculated.
         
    Sometimes a ranking loss is just a gradual slide because you’re not getting any new, fresh link juice. If that’s the case, put on your marketing hat and let the world know what you’ve got.
         

    What factors most impact sitelinks?

    Today, I’d say the single biggest factor is traffic. In the beginning sitelinks had more to do with your menu structure and internal linking. But now we see Google surfacing the popular sections of the site into sitelinks, whether they are on the main menu or not.

    For some sites, a steady flow of fresh content into a section also seems to be a factor. But it’s hard to isolate the freshness factor from the influence of traffic — the two just go together.

     

    Which people at WebmasterWorld (or elsewhere) were the most influential in you becoming the Google prognosticator and successful consultant that you have become today?

    It’s difficult to single anyone out – because really it’s been the whole community. I was influenced early on by people like Brett Tabke, Dave Naylor, Greg Boser (webguerilla), Shakil Khan, Todd Friesen, and Bob Jordan – but there are many others.

    It was Nick Wilson, for instance, who kicked my butt into learning CSS. Since then, using source ordered content has been a big, long-term win for me. Edward Lewis (pageoneresults) got me serious about taming the IIS server. Without that knowledge, I would be unable to work with many major corporate clients.

    In terms of consulting, I really owe Neil Marshall in a big way, because he set me straight early on about valuing my knowledge and being willing to put an appropriate price tag on it.

     

    If you could tell every webmaster one thing they shouldn't miss out on doing for their site, what would it be?  

    Build an Information Architecture that supports your marketing objectives. That is, build the site template from your market’s point of view rather than your company’s internal point of view. And having done that market research and gained a solid understanding, then be sure your site architecture makes it clear and easy for the visitor to take the actions you most want to see.

    Said another way, conversions start with the page template – and there’s a great deal of accumulated wisdom out there to tap into. Having taken your best shot at launch, you can then refine it with A/B or multivariate testing. And from there you’re off into analytics land. It’s a long and happy trip, but it all starts with the IA.
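
On the A/B testing step, here is a minimal Python sketch of a standard two-proportion z-test for deciding whether a new page template really converts better than the old one. The traffic and conversion numbers are invented for illustration, and this is generic statistics rather than anything specific to Ted’s process.

```python
# Minimal A/B test check: two-proportion z-test on conversion counts.
# The visit and conversion numbers below are invented for illustration.
import math

def ab_test_p_value(conv_a: int, visits_a: int, conv_b: int, visits_b: int) -> float:
    """Return the two-sided p-value for 'variant B converts differently from A'."""
    p_a, p_b = conv_a / visits_a, conv_b / visits_b
    p_pool = (conv_a + conv_b) / (visits_a + visits_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visits_a + 1 / visits_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Original template: 180 conversions from 6,000 visits (3.0%).
# New template:      240 conversions from 6,000 visits (4.0%).
p = ab_test_p_value(180, 6000, 240, 6000)
print(f"p-value: {p:.4f}")  # well under 0.05 here, so the lift looks real
```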

    Thanks Ted for taking the time, being so wonderfully open and helpful, and doing such great things at WebmasterWorld. You are the type of person who makes me proud to be a part of the search optimizer and webmaster community. For more great insights from Ted, check out the Google forum at WebmasterWorld (the awesome community where I learned so much of what I know), or the full video interview on "Cracking the Google Algorithm" here.


