With a limited number of recoveries nearly a year after Panda, the first bite might seem like a big concern. However, the "too many ads" algorithm updates far more frequently than Panda does:
If you decide to update your page layout, the page layout algorithm will automatically reflect the changes as we re-crawl and process enough pages from your site to assess the changes. How long that takes will depend on several factors, including the number of pages on your site and how efficiently Googlebot can crawl the content. On a typical website, it can take several weeks for Googlebot to crawl and process enough pages to reflect layout changes on the site.
And for those who got hit by Panda then tried to make up for those lower ad revenues with more AdSense ad units, they probably just got served round #2 of Panda Express. ;)
Is it Screen Layout, or Something Else?
In the past Google suggested to a nuked AdWords advertiser that more of his above-the-fold real estate should be content than ads.
However Google has such a rich data set with AdSense that I don't think they would just look at layout. If I were them I would factor in all sorts of metrics like:
average page views per visitor
repeat visits & brand searches
clickstream data from Chrome & the Google toolbar (so even if you are using other ad networks, they can still sample the data)
Some sites are primarily driven off of mobile views while other sites might be seen on large monitors. When Google sees every page load & measures the CTRs, tracking actual user response is better than guesstimating it.
They could come up with some pretty good metrics from those & then for any high traffic/high earning site they could manually review them to see if they deserve to get hit or not & adjust + refine the "algorithm" until those edge cases disappeared. Google's lack of credible competition in contextual & display ads means they can negotiate pretty tough terms with publishers that they feel are not adding enough value to the ecosystem.
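To make the above concrete, here is a toy sketch of the kind of engagement metrics the post speculates about (average page views per visitor, share of repeat viewers). This is purely illustrative: the log format and field names are hypothetical, and nothing here reflects any actual Google pipeline.

```python
# Hypothetical illustration: compute simple engagement metrics from a
# clickstream sample. Log format (visitor_id, url) is an assumption.
from collections import defaultdict

def engagement_metrics(pageviews):
    """pageviews: list of (visitor_id, url) tuples from a clickstream sample."""
    views_per_visitor = defaultdict(int)
    for visitor, _url in pageviews:
        views_per_visitor[visitor] += 1

    visitors = len(views_per_visitor)
    if not visitors:
        return {"avg_pageviews_per_visitor": 0.0, "repeat_view_rate": 0.0}

    avg_views = sum(views_per_visitor.values()) / visitors
    # Fraction of visitors who viewed more than one page
    repeat_rate = sum(1 for v in views_per_visitor.values() if v > 1) / visitors
    return {"avg_pageviews_per_visitor": avg_views,
            "repeat_view_rate": repeat_rate}

sample = [("a", "/home"), ("a", "/article"), ("b", "/home"),
          ("c", "/home"), ("c", "/home")]
print(engagement_metrics(sample))
```

A search engine with clickstream data at scale could compute metrics like these per site, then hand-review the high-traffic edge cases, which is essentially the refinement loop the paragraph above describes.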
There was no claim of click fraud, copyright issues, or anything like that.
There was no claim of advertiser complaints.
Google offers no customer support phone number, no "you might want to work on this" advice, doesn't list which of the sites in the account they felt could be improved, and RETROACTIVELY nuked past "earnings" ... depending on where it is in the schedule that can amount to anywhere from 30 to 50+ days (I remember Teeceo mentioned how they waited until the day before the AdSense payday to smoke his stuff way back in the day to have maximum impact!)
On Google's latest quarterly earnings call they highlighted how year on year Google's revenues were up 25% but the network revenues only grew at 15%. They also explained the slower network revenue growth as being associated with improved search quality & algorithm updates like Panda.
Left unsaid in such a statement was that until those algorithms rolled out, Google admitted they funded spam. ;) The whole AdSense & content farm problem was created through incentive structures with unintended consequences.
Is the Garbage Disappearing, or Just Moving to a New Landfill?
If you track what is going on with the Google+ over-promotion (long overdue post coming on that front shortly!) or how Google is still pre-paying Demand Media to upload video "content" to Youtube, Google still may be funding the same model, but doing so while gaining a tighter control of relevancy so they can better sort good stuff from crap (when you host content & track user response you have all the metrics in the world to determine how relatively good you think it is). If they over-promote these sites then in the short run they create the same skewed business model problem.
Sure hosting the user experience makes it easier to sort the wheat from the chaff, but the other big risk here is the impact on the rest of the publishing ecosystem. There will be lots of thin spam from popular people on Google+ (anyone launched a celebrity-focused Pay-Per-Plus site yet?) & in-depth editorial content might not be economically feasible in certain categories where there literally is no organic SERP above the fold.
I will compliment them on their efforts to clean up some of the worst offenses (from the prior generation of "bad incentives"). If you were hit by it, Panda was every bit as big/brutal as the famous Florida update. If this update is anything near as significant as the Panda update (in how it impacts smaller independent webmasters) then it is going to force more of them/us to move up the value chain.
That may mean pain in the short run, but (for those who take it as a wake-up call to develop brand & organic non-search traffic streams) far more rewards in the long run for those who remain after the herd is thinned.
My grandfather was an autoworker, and I have a weapon he manufactured and carried to work to protect himself from the company. It's a big iron pipe with a hunk of lead on the head. I think about how far we've come as companies from those days, when workers had to protect themselves from the company.
I think for many SEOs the idea of starting over is painful, but the best SEOs often enjoy the forced evolution & the game of it all. They don't roll over & play dead or forget SEO. And if Google didn't put hard resets in every once in a while, then the big hedge funds would be mopping up the SERPs and cleaning our clocks with the help of Helicopter Ben.
Areas For Improvement
Of course this could be taken as a positive post toward Google (and it mostly is), but I don't want to come across as a fanboi, so I thought I should do a shout out to a couple things they still need to fix in order to be consistent:
If Google is going to tell people that thick, deep content is needed to gain sustainable exposure then they shouldn't be ranking thin Google+ pages in the SERPs just because they are a Google product. Even people who have *always* given Google the benefit of the doubt (full on fanbois) found the Google+ placement in the SERPs distasteful.
Google's AdSense is still sending out some of those automated "you are leaving money on the table" styled emails reminding publishers to use 3 ad units. If such behavior may lead to a smoke job, then the recommendation shouldn't be offered in the first place. Right below the "use 3 ad units" there needs to be a "proceed with caution" styled link (in red) that links to the recent "too many ads" post.
Old case studies that are no longer in line with best practices in the current market should have some sort of notice/notification added to them so new webmasters don't get the wrong idea.
Some of the AdSense heatmaps are roadmaps to penalization. These should have been fixed before yesterday's announcement, but if they are still up there next week then Google is willfully & intentionally trying to destroy any small business owner that follows that "best practice" advice.
Your Feedback Needed
Since this update impacted far fewer sites than the Panda update, there are fewer sample/example sites. Did any of your websites get hit? If so, how would you describe ...
BBVA, Spain's second-largest bank by assets, is teaming up with Google to use its search engine results to provide advanced forecasts of hotel and tourism demand in the country, part of a plan to market real-time economic indicators to its clients.
The bank and internet group will announce on Monday a scheme called the "BBVA-Google tourism activity in Spain indicator". The first pilot project has focused on measuring advance demand for hotel stays and tourism interest in Spain by using search engine data.
Private investors get to see that search data before anyone else does. If you have a retirement plan invested in stocks, then you are at an asymmetrical information disadvantage because Google is providing an in-depth look at that search data to competing investors who can trade on the information before it is public.
Is search traffic a big deal? Is there enough signal there to matter? Yes. And yes.
I read an investment report earlier today about a company where the hedge fund's rating & valuation was largely based on / justified by the SEO strategy of the underlying company & their current Google rankings...the report even had keyword ranking charts in it!
Was Google paid for giving BBVA access to the above data? Or was it thrown in as a freebie in exchange for getting over 100,000 BBVA workers to switch to the cloud & go Google on the enterprise software front?
If Google has over 90% search marketshare in many EU countries & is willing to leverage proprietary search data to win contracts in other fields, how does anyone compete against that data bundling?
Further, think of all the damage hedge funds & huge banks have done to societies the globe over this past decade & now Google is directly helping the bad guys.
That is Google's approach to their proprietary information: if you invest in their ecosystem and use their analytics tools you can't get your own analytics data (as they have to protect "user privacy"), but they will gladly sell that same data off to someone else.
If there is no public outrage at this "test" then the data units will start getting more granular. Rather than measuring categories Google may sell data on a per-site or per-company basis. Looking at how Google has consistently disintermediated "partners" everywhere else, if Google is feeling bold they may suggest that selling the data to others also permits Google to trade on the data as well.
What's far scarier than an angry search engineer looking at your large paid link buy or a rogue Google "contractor" hacking up your site? A Google hedge fund with a substantial short position on your stock. :D
Recall that Eric Schmidt has stated:
"One day we had a conversation where we figured we could just try and predict the stock market..." Eric Schmidt continues, "and then we decided it was illegal. So we stopped doing that."
Wow...this is pretty...um...transparent.
According to this post, Google was caught scraping Mocality, calling the listed businesses, soliciting them to move to Google "Get Your Business Online", disparaging the directory they were scraping on the client call, and then lying about having the permission of the directory they were scraping to try to con businesses into working with Google.
A few select quotes:
There are absolutely no costs, and this will be agreed on before it's put on… No one will come and tell you like Mocality used to do, someone tells you it's free and then they come to ask for money. You know that Google doesn't fool around here. ... Mocality used to charge people and many of the people who used to be in Mocality we have taken them and transferred them here. Didn't we also find you on Mocality? ... Ai…they used to…but some people didn't used to pay. They [Mocality] used to go and ask people to pay them around Ksh. 20,000 and people refused. It was things like that.)
Google's business model *is* buying or building things that are free and then later pulling back features and/or sneaking costs in on them. Whether it be clubbing Android carriers with compatibility, saying search ads are evil then placing them everywhere, Google Maps API terms changes, terms changes on the Google AdWords API, Google hotel place listings with endless price ads, or keyword (not provided) in web analytics while trying to force you to register in Google Webmaster Tools to get any keyword data at all!
As if that wasn't bad enough, when the fake business asked Google if Mocality was ok with this, this was the exchange:
My question is does Mocality know that you're getting their con…our contacts from their directory? ~~~ Yah. They know. They know that very well. They have agreed with Google when they were on that thing.
I have long stated that the difference between spam and quality content is who is spamming. With the recent widely criticized over-promotion of Google+ in the search results and this sort of scrape-the-source, lie & disintermediate behavior, Google's true character is shining through.
Facebook & Twitter are smart not to leave the barn door open for Google.
All information wants to be free and wrapped in Google's ads. Or so the saying goes. But until they can be trusted it won't be. They have done A LOT of brand damage to themselves in the past couple months. Update: Google was mortified that they got caught doing this:
We were mortified to learn that a team of people working on a Google project improperly used Mocality's data and misrepresented our relationship with Mocality to encourage customers to create new websites. We've already unreservedly apologised to Mocality. We're still investigating exactly how this happened, and as soon as we have all the facts, we'll be taking the appropriate action with the people involved.
Sometimes it's the little things in life....Boomerang for Gmail (and Outlook) is an incredibly useful, lightweight, powerful link outreach app.
Link building has a special place in the SEO industry. Beyond being one of the harder skill-sets to master and acquire, link building is likely the most important element of an SEO campaign.
Link building can also be the most difficult job to:
Scale internally and externally
Train someone to do efficiently
Hire someone for
How to hire link builders and how to train them are certainly worthy of their own (upcoming) blog posts but this post is going to sing the praises of a Gmail and Outlook plugin that is essential for my link building workflow.
Boomerang for Gmail (and Outlook)
Outside of the really cool name this plugin makes my workflow much more streamlined and efficient.
I don't use Outlook so I'll be focusing on the Gmail plug-in here. The Outlook plugin has most of the functionality of the Gmail edition (minus the Send On options) and you can check out the Outlook version here.
The key benefits to using Boomerang (referencing the Gmail app going forward) are:
Schedule emails to be sent at a later date/time
Set reminders on emails so they pop back up at a specified time
Set email reminders from your smartphone
Send Emails Later
You can install Boomerang for Gmail here. You can use this for Gmail and Google apps and you'll need to use Firefox or Chrome.
You'll manage Boomerang in two places; you can get to it in your Gmail toolbar:
From here you can access your scheduled messages to make any changes and access various help and how-to's.
The other area where you access Boomerang is in the email dialogue box. When you go to compose a new message or click to reply to one you'll see the Boomerang button and see all the options available for sending the message:
If you click on anything other than the specific time option at the bottom, the message is scheduled straight away.
If you need to access your Boomerang-ed messages, just go back to the top Gmail toolbar, click Boomerang, and click access Scheduled messages.
The other cool option when composing a new message is listed right below the subject line. From here you can have Boomerang return the message to your Inbox if no one replies or even if they do (marked as unread, starred, etc; these options can be changed in the "access scheduled messages" option on the top Gmail/Boomerang toolbar option):
You have the exact same option when replying to messages as well.
This is incredibly useful for a variety of link building actions such as:
Tracking the effectiveness of email pitches
Scheduling a bunch of pitches to line up with various promotions and outreach campaigns, in one shot
Using in conjunction with Gmail's canned responses for scalable link outreach and management
Never forgetting about a link prospect
Making Gmail a self-contained link outreach system for staff members
Avoiding awkward time zone issues on email deliveries if you have staff outside your targeted market's location
While the Send On features are the most useful for link outreach, the Reminder functions can be useful as well.
Boomerang has Gmail-like functionality in the way it auto-offers a solution. Here you can see I've got a Staples coupon that expires on January 16th. Boomerang is asking me if I'd like to return this to my inbox on that date:
Outside of that functionality you can click the Boomerang reminder icon in the toolbar to get the reminder options available to you:
So rather than setting something in your calendar or in your task management application, you can use Boomerang to re-populate the email when needed.
You can add a condition to this and say that you only want to be reminded of the message at the selected time "IF" no one responds, simply by checking that option above. Otherwise, it will come back whether someone responds or not.
You can also use your iPhone, Blackberry, or Android to set up a message for yourself to arrive in your inbox at a certain time with their mobile option.
Letting an app access your data on mail.google.com shouldn't be taken lightly. Here is what they say about privacy:
Why does Boomerang for Gmail need access to my email account?
Like most other Gmail plugins, we need access to the full email data to be able to move and send messages. In our queries, we only store the headers of the message (subject, sender, time) so that we can uniquely ID the message you want to schedule. We don't store any message text.
Does it mean you have my Gmail password?
No, we don't have access to your Gmail password. You are authorizing through Google's official OpenID system.
Sign Up for Boomerang
You can get a full-featured pro account trial for free, for 30 days here. I am anxious for them to release the open/click tracking for even deeper link outreach analysis.
If you are looking for a more enterprise level solution, with team-wide tracking and monitoring, please check out our reviews of Buzzstream and Raven Tools.
"All things are subject to interpretation. Whichever interpretation prevails at a given time is a function of power and not truth." - Friedrich Nietzsche
Everyone Except Me Should be Open
Being labeled as open or transparent is a great public relations strategy. Executed effectively it gets ditto heads to feel like they are part of a movement and spread your propaganda.
However actually being transparent is often a poor business strategy.
When WordAds opened up someone in the comments suggested that they should win by being open like Google. I read that and laughed. Where Google is losing you can count on them pushing the open label in order to build momentum & destroy the asymmetrical information advantages of existing market leaders. But where Google leads, non-transparency is the norm.
A few examples & comparisons:
Claiming to run an open auction, while running obfuscated quality metrics that price gouge advertisers.
At the same time Google is trying to push social sites to offer transparent data, they decided to block some Google search referral data (unless you are paying for the clicks, then you get that data).
When planning some of the features behind Google+ one of their employees wrote a book about the social circles concept with Google's blessings. Then, after he wrote the book, Google revoked permission to publish it!
Nuking affiliate links of some websites & then investing in Viglink, a network that automatically turns links into affiliate links.
Burning some networks of websites for being doorway pages & then investing in the Whaleshark Media roll up & launching Google Places.
Nuking some UK financial comparison sites for link buying & then buying BeatThatQuote.
Suggesting 60 or 90 days of penalty is a reasonable penalty for sketchy links & allowing BeatThatQuote to rank 2 weeks after penalizing it without cleaning up any of the paid links.
Android is open but internal Google emails revealed that carriers were getting wise to Google using compatibility as a club.
Not sharing revenue share stats with AdSense partners for a half-decade.
When websites are nuked they are frequently given no explanation. Worse yet, their content often re-appears in the search results on some other domain that stole it, in many cases while being wrapped in AdSense ads.
Arbitrarily making it hard to export AdWords campaigns to other services (& making it against the TOS to do same via the API).
The Panda update was needed to rid the web of garbage content. And yet Google is pre-paying Demand Media to post videos on YouTube. Since the Panda update downstream Google traffic to YouTube has more than doubled & YouTube is serving over a trillion streams per year!
Calls for "transparency" in SEO may sound great on their face, but once you peel back the covers the absurdity is laughable. If Google didn't discriminate against certain types of players & if Google didn't compete in the very markets that it judges then perhaps transparency would be a good idea.
However Google is perhaps the single biggest direct competitor in many markets, so to be fully transparent with them when they are the opposite with you is a naive business strategy:
I also disagree that outing each other would make the industry less like a mafia, because SEOs aren't the mafia. SEO is a symbiotic marketing channel reliant on Google, until the next big search engine/method comes along. In a mafioso analogy, Google would be the mafia - as they control the market. Removing all webspam wouldn't necessarily create better search results or a fairer market, as Google still decides who wins and who loses. The biggest winner being Google itself, the next level being their friends.
Secrecy is also the cornerstone of all marketing channels. Social Media for instance works in a similar way to SEO, except they have secret voting methods rather than secret linking methods. You don't see major social media companies outing a rival's voting methods, as it would shine a torch on their own methods. Even outside of marketing, McDonalds probably worked out KFC's magic blend of herbs and spices decades ago, but it's not in their best interest to tell everybody.
Outing webspam helps an SEO blog to keep their UVs up and their VCs happy. It helps a failing newspaper to appear modern and edgy, whilst allowing the contributor to launch a protection racket off the back of another company's misery.
The question is less whether black hat and webspam are a good thing or not, but whether Google is the unbiased and benevolent institution that should make the rules. Google is a business and pursues its very own interests; aware of its market power, it acts with a lot of arrogance, aggressiveness and obviously double standards. That was also Aaron's point, but SEOmoz has completely missed it as of late.
I expect an SEO portal/community to focus on how stuff actually works/can work, not to propagate how the monopolist wants it to work. It is their risk of doing business if they decide on an algorithm, not ours. It is our risk, however, to decide whether to stick to the rules or not. And it's not only about ethics but has several practical implications...
Full Disclosure Required, Except From Us
On paid links Google claims to require machine AND human readable disclosure. Then on their own site they use an ad color background that literally fades to white on many monitors. Maybe it is legitimate that they are only able to fool some of the users some of the time. But some of their ad initiatives have 0 disclosure at all. None.
That is now part of the "organic" search results, but is that a paid ad?
You wouldn't know by looking at it, but according to the WSJ it is: "Google lists booking links to the airlines as advertisements, but the company declined to comment on how much money it makes from the arrangement."
There is no disclosure that you are in a paid ad funnel until the very last click. And those who fail to pay are either unlisted, listed last, or have a broken booking process where their brand is arbitraged in an attempt to flip the click to somewhere else. According to Leocha, "Google and the airlines have a sweetheart deal with each other, and the consumers are getting screwed."
In the hotel market Google is also testing comparison ads & price ads.
Notice how little they care about relevancy so long as they keep the click on Google or are paid for the referral. They rank the car rental company Avis as a top Las Vegas hotel! And even the ad links that are sold off of that do not line up: Priceline pushes the Palazzo Luxury Suites & Booking.com pushes the Venetian.
Retarding Investment in the Search Ecosystem
What do you suppose the above behavior does to cash flow & multiples of websites in that vertical? Of course it contracts them & retards investment. Who wants to start a new hotel website at this point? What other verticals have investment held back by the fear of Google's eventual entry?
If you only had to manage competing against other market players & staying inside Google's editorial guidelines, then investing isn't that difficult. But if you have to stay within Google's guidelines in the short term while trying to build a business that remains sustainable even after Google enters & destroys the market, it is far more difficult.
Skimming the Cream
At any time Google can enter any market and skim off the cream: "An independent study from Leads360 showed consumers using Google's comparison ads converted better than any other lead provider."
Other affiliate networks which do not own the search channel have to fight through quality issues if they try to build similar scale.
A Self-serving Bias You Can Count On
When Google enters a market it might buy out a competitor, buy out a supplier, bundle, use predatory pricing, grant themselves superior search placement, adjust the relevancy algorithms and/or editorial guidelines, violate IP, scrape 3rd party content, work with sketchy advertisers & publishers to undermine competing business models, or any combination of the above.
They are rarely transparent with their interests when they enter a market. Almost everything is labeled as "a beta" and "just a test." They promise to "act appropriately" & you may not be aware of the steamroller until you are under it.
A Google spokesman said "applications that are installed without clear disclosure, that are hard to remove and that modify users' experiences in unexpected ways are bad for users and the Web as a whole."
Google's founding research highlighted how bad ad-driven search engines were & then Google's core revenue engine of paid search was built on their violation of Overture's patent. They keep buying swaths of patents to protect against their other violations.
The business model of "violate & then buy protection" has helped lead to a protection-racket styled marketplace in patents that makes the risk of innovation for smaller players so expensive that it drives them under.
Where Google has gained a dominant position in a marketplace they can begin misdirecting for profit. Let's say you link to your own location on Google Maps to drive traffic to Google & help your users locate your office. Well in some cases they then reciprocate by confusing users by putting an ad in your location bubble.
Once again, you are forced to buy your own brand unless you teach your customers (and prospective customers) to avoid Google products.
Sure I May Have Failed, But at Least That Failure Was Transparent...
If you are fully transparent against an arbitrary set of guidelines when the company that judges you also competes against you & brushes up against the limits of the DOJ & FTC then you might lose for no reason other than being transparent. And not only are you competing directly against Google, but the algorithms are biased toward certain players.
Today the Internet is an information highway where anybody — no matter how large or small, how traditional or unconventional — has equal access. But the phone and cable monopolies, who control almost all Internet access, want the power to choose who gets access to high-speed lanes and whose content gets seen first and fastest. They want to build a two-tiered system and block the on-ramps for those who can't pay.
For many businesses the unknown Panda risk is every bit as damaging as the great firewall of China. Each additional unknown kills x% of small new online businesses. If unemployment is high, companies are not hiring & the bar for self-employment is too high then the web stagnates.
If the old established corporate competition needs to be as good as you to compete then there is little risk to being transparent if the competition is doing nothing beyond following you around. But if the playing field is tilted and the competition only needs to be 5% as good as you are to beat you (and can easily come from behind to copy any success you have) then full on transparency brings much more risk than potential profits.
You Are the Ad
We are moving into a media world where the content becomes ads & even how people interact with the ads and content becomes a part of the ad.
even after you remove the vote for a site they still keep showing it
you may vote for site A & they will show your image as voting for site B
when they show your picture they claim you voted specifically for the page being advertised (even if that page is promoting a scam or something else you wouldn't endorse)
Once again, I will highlight that they use the votes against the wrong sites & pages and that they keep showing the votes even weeks after you remove them.
Where is the transparency in that deceptive crap?
Yahoo! offers a useless "buying guide" for fish tanks that is nothing more than a paid pointer to Overstock.com.
If you click on their coupons tab on that fish tanks search Yahoo! shows you coupons for tank tops, which is pretty idiotic.
Why is this Yahoo! Shopping & Yahoo! Deals product so ugly? They outsourced it years ago. So it is a non-product & thus the integration can't be anything but crappy.
Why do Yahoo! & Bing typically get a pass? They own a fairly low search marketshare. Missing traffic from either or both of those is certainly significant enough to be felt, however even when they are combined it is still less than half of what Google controls in most markets. Market leaders are expected to operate in less conflicted & less self-serving ways than also-ran players in their market do. If Microsoft had had only 10% or 15% marketshare for their operating system then it is unlikely their browser bundling would have come under such scrutiny.
Transparency in The Real World
In the past I highlighted how every form of media is manipulated in Why Outing is Bad, but I thought it would be fun to run through some other markets and highlight how transparency often exists only as an illusion (to lure in punters so they can be rooked).
TrueCar aimed to make that market more transparent by giving consumers pricing data online to remove some of the asymmetrical advantage dealers have & make the sales process smoother for consumers. How does the automotive market respond? Honda issued threats to their dealers & now TrueCar has a hate video ranking for their brand.
This nontransparency is not something new, but rather the way it has always been.
It exists at every level of society. Countries spy on one another & companies may choose to show different views of the world to different markets.
News International's leading profit centre, the News of the World, was dependent on a very ugly culture of lawbreaking, hacking and impunity. This freewheeling, ask-no-questions attitude spread to other parts of the organisation, such as the Times and the Sunday Times, both of which have used illegal or unethical techniques. Even more troubling, when senior News International management were confronted with evidence of wrongdoing, the company made false statements and took actions which prevented key evidence from reaching the public domain.
Both cases involve News America Marketing, an obscure but lucrative division of the News Corporation that is a big player in the business of retail marketing, including newspaper coupon inserts and in-store promotions. The company has come under scrutiny for a pattern of conduct that includes below-cost pricing, paying customers not to do business with competitors and accusations of computer hacking.
Were The Robber Barons Transparent?
Going back into history it is sort of hard to pick a starting point (one can go to the spice trade & orders that are unsealed at sea, or likely earlier than that) but to pick a somewhat recent starting point, we could look at the railroads:
So how did unnecessary, inefficient railroads get built? Because of government subsidies. In short, the federal government paid to build the railroads through massive financing subsidies and also gave them ample land grants. The trick to building a railroad was not knowing anything about railroads or even about business; it was having friends in Washington who could give you the right financing and land subsidies.
Even then, the railroads lost money. Not only was there insufficient demand for their services, but they were run by people who were generally incompetent. (For one thing, they didn't even know their own costs of doing business.) Yet the people who owned the railroads made fabulous amounts of money (of which Stanford University is one symbol). The main way to do this was simple. The people who controlled a railroad (generally by putting up very little of their own money, thanks to the government subsidies) would also wholly own a construction company. They would cause the railroad to overpay the construction company to build the railroad—in effect transferring wealth from railroad stockholders and creditors into their own pockets
Remember Google's $500 million fine for pushing ads selling overseas Viagra in the US? Now they promote scaremongering ads against fakes from filthy labs.
Coca-cola runs The Beverage Institute & has "doctors" highlight how healthy soda is.
At the same time, when Pepsi was sued over an alleged rat being in a can of Mountain Dew, Pepsi's defense claimed "the mouse would have dissolved in the soda had it been in the can from the time of its bottling until the day the plaintiff drank it," turning the mouse into a 'jelly-like' substance. But don't worry folks, it's healthy. :D
At least we still have water.
When they are not busy making it illegal to collect rainwater, Bechtel wants you to follow them on Twitter.
It is hard to know what is in our food & those who label things as organic have to fill out more paperwork than those who manufacture frankenfood. Then there are the baseline chemicals sold as biodegradable which are not. ;)
Oh well, at least we have insurance.
State Farm is the #1 ranked bad faith insurance company, but at least they upload & advertise irrelevant funny videos to YouTube to create brand signal for Google.
When looking at my credit card bill I saw a scammy $22.99 charge on it for a credit report I have never ordered. I looked up information about the "company" offering that service & the #1 result (with sitelinks) was my darn credit card company's website! They had to conduct a block on themselves, but if you don't notice it they will steal $23 a month until you die. ;)
Bank of New York Mellon ripped off their clients with unsavory Forex rates: "As investigators sought to determine whether the bank overcharged clients to execute their currency trades, a senior BNY Mellon executive nicknamed "Rambo" urged traders not to tell clients how much money they made on trading, according to the informant."
A former Federal Reserve member writes about the Fed: "No matter the legalistic interpretation, the Fed is, working through the ECB, bailing out European banks and, indirectly, spendthrift European governments. It is difficult to count the number of things wrong with this arrangement."
"What's happened is that, almost overnight, we've switched from democracy in real-property recording to oligarchy in real-property recording. There was no court case behind this, no statute from Congress or the state legislatures. It was accomplished in a private corporate decision. The banks just did it." - Christopher Peterson
The financial markets are becoming glorified crack houses: "Frankly, I am concerned that Wall Street is becoming little more than a glorified crack house. Day after day, the sole focus of Wall Street is on more sugar, stronger sugar, Big Bazookas of sugar, unlimited sugar, and anything that will get somebody to deliver the sugar faster. This is like offering a lollipop to quiet down a 2-year old throwing a tantrum, and expecting that the result will be fewer tantrums. What we have increasingly observed over the past decade is nothing but the gradual destruction of the ability of the financial markets to allocate capital for the benefit of future growth. By preventing the natural discipline of the markets to impose losses on poor stewards of capital, and to impose interest rates high enough to force debtors to allocate the capital usefully, the world's policy makers are increasingly wrecking the prospects for long-term economic growth."
Individuals who put in extra hours of work because they are sold on the promise of their options may also find those disappear: "Taking away the value of options that are vested means that the concept of vesting becomes bogus. It doesn't matter whether the employee understood if this was the deal or not, it's a scummy practice, and it's ultimately self-defeating (both for the company and the industry as a whole). Who would go to work for Skype (or any PE-backed company) in the future? "
Limitless fraud before the courts & dancing on the graves of the newly homeless: "Court records show that the firm angered state court judges for alleged false statements and filing suspect documents. Arthur Schack, a state court judge in Brooklyn, in a 2010 ruling said that pleadings by the Baum firm on behalf of HSBC Bank, a unit of London-based HSBC Holdings, in a foreclosure case were "so incredible, outrageous, ludicrous and disingenuous that they should have been authorized by the late Rod Serling, creator of the famous science-fiction television series, The Twilight Zone." ... The law firm said it would shut down after New York Times columnist Joe Nocera in November published photographs of a 2010 Baum firm Halloween party in which employees dressed up as homeless people. Another showed part of Baum's office decorated to look like a row of foreclosed houses."
That theft of physical property is ongoing: "Also announced over the weekend was the jaw-dropping, yet illuminating fact that the MF Global bankruptcy was fraudulently, nefariously and illegally drawn up as a Chapter 7 BK for a SECURITIES DEALER and NOT a commodity brokerage as it should have been. Look, MF Global was the second-largest non-bank FCM in the United States next to NewEdge which is the old FIMAT. If MF Global wasn't an FCM, then there are no FCMs. Of course it was an FCM. It had $7.2 billion in customer seg funds as of August 31, 2011. And yet MF Global was immediately, from the get-go, put into Chapter 7 BK as a SECURITIES FIRM. This is fraud. MF Global's BK should have OBVIOUSLY been established under Subchapter IV of the Chapter 7 code as a COMMODITY BROKERAGE."
And as banking criminals literally steal money, destroy lives & undermine the rule of law to grow their "profits" sleazeballs like Jamie Dimon think that the reason people hate them is envy.
The above makes no mention of helping Greece hide governmental debt, bid-rigging bribes in Jefferson County, robosigning bogus foreclosure documents, and a host of other crimes. But one thing in common with all the above crimes is this: no jailtime for the banksters.
Since there is nothing stopping those criminals they keep up their crimes:
Big banks represent the ultimate in concentrated economic power in today's economies. They are able to resist all meaningful reform that could really change their compensation schemes. Their executives want to get all the upside while facing none of the true downside.
But capitalism without the prospect of failure is not any kind of market economy. We are running a large-scale, nontransparent, and dangerous government subsidy scheme for the benefit primarily of a very few, extremely wealthy people.
The actions of the financial cartel are both obvious & predictable. And the damage they do is felt worldwide:
Credit-financed economic booms, by turns in private then public credit as one ratchets up the other over a series of booms and busts, are as irresistible to politicians as hookers and maids. ... The failures of American FIRE Economy policies are behind the movements in Libya, Yemen, and Syria, as reflation measures, from quantitative easing to currency depreciation, steal purchasing power from low income families world wide, acting as the most regressive tax imaginable. Simmering hatreds are exacerbated by the developing global crisis over oil supplies and costs. ... The so-called debate about debt ceilings, spending cuts, and entitlements reductions is a red herring. The public debt crisis arose from the 2007 - 2008 private credit market crisis, not the government liabilities that have been building for decades. The mistake of both the left and the right is thinking that we can escape an output gap without facing up to the politically unpopular task of demanding that creditors take a loss on loans taken out during the credit bubble era.
A creditor that makes bad loans deserves to go out of business. Their outsized compensation can't be justified unless they are also made to eat their losses. But rather than holding them accountable for their own actions, societies the world over absorb that pain.
"Fascism should more appropriately be called Corporatism because it is a merger of state and corporate power"- Benito Mussolini
Money is a human construct. The fact that our money is now backed by nothing more than our collective future ability to "produce" relegates us to that of slaves.
Blood hours are a finite measure. Heartbeats.
What's in your wallet? Is it the new debt slavery card: "A personal bankruptcy is supposed to cut borrowers loose from lenders and debt collectors, but Capital One Financial Corp.—one of the nation's largest credit-card issuers—sometimes doesn't want to let go." Citigroup has an "effective" strategy they employ in some 3rd world countries to deal with those who can't pay:
After dropping his younger daughter at school, Octa walked into Citibank's credit card collection department on the fifth floor of the Jamsostek tower just after 10 a.m. Four hours later, he left the 25-story building slumped motionless in a wheelchair -- a victim of what police allege was a violent assault by debt collectors. Driven to a nearby hospital in a Citibank car, Octa was pronounced dead on arrival.
before being bailed out by governments, banks had never made any return in their history, assuming that their assets are properly marked to market. Nor should they produce any return in the long run, as their business model remains identical to what it was before, with only cosmetic modifications concerning trading risks.
So the facts are clear. But, as individual taxpayers, we are helpless, because we do not control outcomes, owing to the concerted efforts of lobbyists, or, worse, economic policymakers. Our subsidizing of bank managers and executives is completely involuntary.
The way the banks make money now is by hiding their losers off balance-sheet, or by forcing them on the taxpayers, and after having themselves declared "systemically important," adjusting their on balance-sheet exposures accordingly, crashing the system and cashing out on their leveraged derivative bets, also at the taxpayers' expense.
In real life, if there is such a thing anymore, all of the major banks are arguably insolvent. So, in reality, they're not making any money at all, they are merely having it transferred to them by their political operatives in Congress and the Federal Reserve Bank. This, after all, is the modern purpose of the Congress, and has always been the purpose of the Federal Reserve System.
government and banks are stuck together like a couple of dogs screwing and we don't know which is on top. Here, Republicans need government to finance war and Democrats need it to finance social programs. Both need it to finance both, as that is how government attempts to maintain power and influence over the people this day and time.
When Senate Democrats finally brokered a compromise over the proposed health-care law, a group of hedge funds were let in on the deal, learning details hours before a public announcement on Dec. 8, 2009.
The news was potentially worth millions of dollars to the investors, though none would publicly divulge how they used the information. They belong to a select group who pay for early, firsthand reports on Capitol Hill.
In the past, periods dominated by virtual credit money have also been periods where there have been social protections for debtors. Once you recognize that money is just a social construct, a credit, an IOU, then first of all what is to stop people from generating it endlessly? And how do you prevent the poor from falling into debt traps and becoming effectively enslaved to the rich? That's why you had Mesopotamian clean slates, Biblical Jubilees, Medieval laws against usury in both Christianity and Islam and so on and so forth.
Since antiquity the worst-case scenario that everyone felt would lead to total social breakdown was a major debt crisis; ordinary people would become so indebted to the top one or two percent of the population that they would start selling family members into slavery, or eventually, even themselves.
Well, what happened this time around? Instead of creating some sort of overarching institution to protect debtors, they create these grandiose, world-scale institutions like the IMF or S&P to protect creditors. They essentially declare (in defiance of all traditional economic logic) that no debtor should ever be allowed to default. Needless to say the result is catastrophic. We are experiencing something that to me, at least, looks exactly like what the ancients were most afraid of: a population of debtors skating at the edge of disaster.
And, I might add, if Aristotle were around today, I very much doubt he would think that the distinction between renting yourself or members of your family out to work and selling yourself or members of your family to work was more than a legal nicety. He'd probably conclude that most Americans were, for all intents and purposes, slaves. ... Clearly any pretence that markets maintain themselves, that debts always have to be honored, went by the boards in 2008. That's one of the reasons I think you see the beginnings of a reaction in a remarkably similar form to what we saw during the heyday of the 'Third World debt crisis' – what got called, rather weirdly, the 'anti-globalization movement'. This movement called for genuine democracy and actually tried to practice forms of direct, horizontal democracy. In the face of this there was the insidious alliance between financial elites and global bureaucrats (whether the IMF, World Bank, WTO, now EU, or what-have-you).
And, in spite of the FBI highlighting the massive mortgage fraud, and the above quote, the president (who is a horrible human being) aims to keep the population misinformed & ignorant, publicly stating that what Wall St did wasn't illegal!
this is how the much-lauded "freedom of the press" myth in the US actually works. If you perform the job of an actual journalist, telling truth to power, forget about attending press conferences at the White House, Pentagon or State Department. You won't even be admitted in the building.
The people who most heavily rely on pseudonyms in online spaces are those who are most marginalized by systems of power. "Real names" policies aren't empowering; they're an authoritarian assertion of power over vulnerable people.
That is what they say, typically at the bottom of the posts, in blog posts that equate Google Chrome to being the Internet & spread misinformation about how Chrome is good for small business.
some of those sites are paid posts and have live links in them to Google Chrome without using nofollow & talk about SEO in the same post as well!
some of those posts link to the example businesses Google was paying to have covered
and all the posts are effectively "buying YouTube video views" for this video youtube.com/watch?v=QFLP7HD1s7k
You can say they didn't require the links, that the links were incidental, that leaving nofollow off was an accident, etc. ... but does Google presume the same level of innocence when torching webmasters? They certainly did not for the bloggers who reviewed K-Mart, & the Google reconsideration request form states:
"In general, sites that directly profit from traffic (e.g. search engine optimizers, affiliate programs, etc.) may need to provide more evidence of good faith before a site will be reconsidered."
The Orwellian things about Google using the above strategy to market Chrome are:
Google has a clear pro-corporate big brand bias to their algorithms & layout (Vince & Panda updates + the part near the top of the SERPs for some searches that says "brands" as a filter type).
The more usage data Google collects the more stupid hoops it forces smaller businesses to jump through in order to compete, thereby further driving them under. (If small business owners didn't have enough time & resources for SEO, do they now also have time to get reviews, get local citations, deal with social stuff on Twitter + Facebook + Youtube + Google+ and a bit of SEO?)
Google polices how small businesses can even make income online. When K-Mart paid some small business bloggers to do sponsored posts Matt Cutts wrote a post (mattcutts.com/blog/sponsored-conversations/) about how he torched those small bloggers (while doing nothing to K-Mart) & equated that exercise to selling links that promote bogus brain cancer solutions. Yet Google Japan was already dinged for this sort of paid post activity & now Google is doing the same thing again.
The fact that Google is paying to spread that sort of misinformation about how their browser is helping small businesses is sort of like BP buying ads about doing tourism in the gulf. Only, since Google destroying smaller businesses is something more abstract happening on virtual lands, the PR propaganda campaign is much more effective, because (unlike oil washing ashore) people do not see what is not there. (The birds still die, but the black, oil-covered carcass isn't rotting on the beach.)
Should you follow Google & buy ads on these sites? Are they christened & beyond reproach? I would sort of be afraid to buy exposure on the blogs where Google is buying coverage...if that latent public relations disaster eventually blows up in their face, they may assume others are as guilty as Google is & burn down the whole forest.
Google the dictator meet Google the marketer. You guys are going to get on well together! Update: Danny highlighted how Google's Chrome ad buy created a lot of the low-quality filler pablum content that the Panda update was alleged to discourage.
Website Auditor is one of the 4 tools found in Link-Assistant's SEO Power Suite. Website Auditor is Link-Assistant's on-page optimization tool.
We recently reviewed 2 of their other tools, SEO Spyglass and Rank Tracker. You can check out the review of SEO Spyglass here and Rank Tracker here.
What Does Website Auditor Do?
Website Auditor crawls your entire site (or any site you want to research) and gives you a variety of on-page SEO data points to help you analyze the site you are researching.
We are reviewing the Enterprise version here; some options may not be available if you are using the Professional version.
In order to give you a thorough overview of a tool we think it's best to look at all the options available. You can compare versions here.
Getting Started with Website Auditor
To get started, just enter the URL of the site you want to research:
I always like to enable the expert options so I can see everything available to me. The next step is to select the "page ranking factors":
Here, you have the ability to get the following data points from the tool on a per-page basis:
HTTP status codes
Page titles, meta descriptions, meta keywords
Total links on the page
Links on the page to external sites
W3C validation errors
CSS validation errors
Any canonical URLs associated with the page
HTML Code Size
Links on the page with the no-follow attribute
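All of these data points can be pulled from raw HTML with a standard parser. Purely as a hedged sketch (this is not Website Auditor's internals, and the sample markup below is made up), here is how a few of them might be extracted in Python:

```python
from html.parser import HTMLParser

class PageFactorParser(HTMLParser):
    """Collects a handful of the per-page data points listed above."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self._in_title = False
        self.meta = {}        # description / keywords
        self.canonical = None
        self.links = []       # (href, rel) pairs

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and a.get("name") in ("description", "keywords"):
            self.meta[a["name"]] = a.get("content", "")
        elif tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")
        elif tag == "a" and "href" in a:
            self.links.append((a["href"], a.get("rel") or ""))

    def handle_data(self, data):
        if self._in_title:
            self.title += data

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

# hypothetical page markup for demonstration
doc = """<html><head><title>Boat Insurance</title>
<meta name="description" content="Get a quote">
<link rel="canonical" href="https://example.com/boat/">
</head><body>
<a href="https://example.com/quote">quote</a>
<a href="https://other.example/" rel="nofollow">ad</a>
</body></html>"""

p = PageFactorParser()
p.feed(doc)
total_links = len(p.links)
nofollow_links = [h for h, r in p.links if "nofollow" in r]
external_links = [h for h, _ in p.links if "example.com/" not in h]
```

A real crawler would also need to fetch the pages, resolve relative URLs, and count HTTP status codes, but the per-page factors themselves fall out of the parse this simply.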
Your next option is to select the crawl depth. For deep analysis you can certainly select no crawl limit and click the option to find unlinked-to pages in the index.
If you frequently want to go nuts with the crawl depth, I'd suggest looking into a VPS to house the application so you can run it remotely. Deep, deep crawls can take quite a while.
I know HostGator's VPS plans as well as a Rackspace Cloud Server can be used with this, and I'm sure most VPS hosting options will allow for this as well.
I'm just going to run 2 clicks deep here for demonstration purposes.
Next up is filtering options. Maybe you only want to crawl a certain section or sections of a site. For example, maybe I'm just interested in the auto insurance section of the Geico site for competitive research purposes.
Also, for E-commerce sites you may want to exclude certain parameters in the URL to avoid mucked up results (or any site for that matter). Though there is an option (see below) where you can have Website Auditor treat pages that are similar but might have odd parameters as the same page.
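The idea behind treating near-duplicate URLs as one page is simple enough to sketch. Assuming (hypothetically) that session IDs, sort orders, and tracking tags are the noisy parameters you want to ignore, the normalization could look something like this in Python:

```python
from urllib.parse import urlsplit, urlunsplit

def normalize(url, ignore_params=("sessionid", "sort", "utm_source")):
    """Treat URLs that differ only by noisy query parameters as one page."""
    parts = urlsplit(url)
    kept = [p for p in parts.query.split("&")
            if p and p.split("=")[0] not in ignore_params]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       "&".join(sorted(kept)), ""))

a = normalize("https://shop.example/item?id=7&sessionid=abc")
b = normalize("https://shop.example/item?sort=price&id=7")
# both collapse to the same canonical form
```

This is how crawlers in general avoid counting the same product page dozens of times; whatever Website Auditor does internally, the effect for your reports is the same.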
Another option I like to use is pulling up just the blog section of a site to look for popular posts link-wise and social media wise. Whatever you want to do in this respect, you do it here:
So here, I'm including all the normal file extensions and extension-less files in the report and I'm looking for all the stuff under their quote section (as I'm researching the insurance quote market).
The upfront filtering is one of my favorite features because I exclude unnecessary pages from the crawl and only get exactly what I'm looking for, quickly. Now, click next and the report starts:
Working With the Results
Another thing I like about Link-Assistant products is the familiar interface across all 4 of their tools. If you saw our other reviews, you'll be familiar with the results pane below.
Before that, Website Auditor will ask you about getting more factors. When I do the initial crawl I do not include stuff that will cause captchas or require proxies, like cache dates and PR. But here, you can update and add more factors if you wish:
Once you click that, you are brought to the settings page and given the option to add more factors. I've specifically highlighted the social ones:
I'll skip these for now and go back to the initial results section. This displays your initial results and I've also highlighted all the available options with colored arrows:
Your arrow legend is as follows:
Orange - You can save the current project or all projects, start a new project, close the project, or open another project
Green - you can build a white-labeled Optimization report (with crawl, domain, link, and popularity metrics plugged in), analyze a single page for on-page optimization, update a workspace, selected pages, or the entire project for selected factors, rebuild the report with the same pages but different factors, or create an XML sitemap for selected webpages.
Yellow - Search for specific words inside the report (I use this for narrowing down to a topic)
Red - Create and update Workspaces to customize the results view
Purple - Flip between the results pane, the white-label report, or specific webpages for metric updates
Workspaces for Customizing Results
The Workspaces tab allows you to edit current Workspaces (add/remove metrics) or create new ones that you can rename whatever you want and which will show up in the Workspaces drop-down:
Simply click on the Workspaces icon to get to the Workspaces preference option:
You can create new workspaces, edit or remove old ones, and also set specific filtering conditions relative to the metrics available to you:
Spending some time upfront playing around with the Workspace options can save you loads of time on the backend with respect to drilling down to either specific page types, specific metrics, or a combination of both.
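Under the hood, a workspace filter is just a set of conditions applied to the crawled rows. As a rough illustration (the row fields below are hypothetical, not Website Auditor's actual schema):

```python
# rows of crawl results; keys mirror the metrics chosen for the workspace
rows = [
    {"url": "/quote/boat", "status": 200, "external_links": 4},
    {"url": "/quote/auto", "status": 404, "external_links": 0},
    {"url": "/blog/tips",  "status": 200, "external_links": 12},
]

# a workspace-style filter: only live pages with few external links
conditions = [
    lambda r: r["status"] == 200,
    lambda r: r["external_links"] < 10,
]
view = [r for r in rows if all(c(r) for c in conditions)]
```

Saving a few of these condition sets as named workspaces is exactly the kind of upfront work that pays off when you come back to drill into a big crawl later.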
Analyzing a Page
When you go to export a Website Auditor file (you can also just control/command + a to select everything in the results pane and copy/paste to a spreadsheet) you'll see 2 options:
Page Ranking Factors (the data in the results pane)
Page Content Data
You can analyze a page's content (or multiple pages at once) for on-page optimization factors relative to a keyword you select.
There are 2 ways you can do this. You can highlight a page in the Workspace, right click and select analyze page content. Or, you can click on the Webpages button above the filter box then click the Analyze button in the upper left. Here is the dialog box for the second option:
The items with the red X's next to them denote which pages can be analyzed (the pages just need to have content, often you see duplicates for /page and /page/)
So, wanting to see how the boat page looks, I highlight it and click Next to get to the area where you can enter your keywords:
Enter the keywords you want to evaluate the page against (I entered boat insurance and boat insurance quotes) then select what engine you want to evaluate the page against (this pulls competition data in from the selected engine).
The results pane here shows you a variety of options related to the keywords you entered and the page you selected:
You have the option to view the results by a single keyword (insurance) or multi-word keywords (boat insurance) or both. Usually I'm looking at multi-word keyphrases so that's what I typically select and the report tells you the percentage the keyword makes up of a specific on-page factor.
The on-page factors are:
Total page copy
Title tag, meta description, and meta keywords
H1 and H2-H6 (H2-H6 are grouped)
Link anchor text
% in bold and in italics
Website Auditor takes all of that and spits out a custom Score metric which is meant to illustrate which keyword is most prominent, on average, across the board.
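Link-Assistant doesn't publish the Score formula, so treat this strictly as an illustrative guess: one naive way to build such a metric is to compute the keyword's share of the words in each on-page factor and average the shares.

```python
def keyword_share(text, phrase):
    """Fraction of the words in `text` made up by occurrences of `phrase`."""
    words = text.lower().split()
    if not words:
        return 0.0
    k = phrase.lower().split()
    hits = sum(1 for i in range(len(words) - len(k) + 1)
               if words[i:i + len(k)] == k)
    return hits * len(k) / len(words)

# per-factor text for one page (hypothetical values)
factors = {
    "body":    "compare boat insurance quotes and boat insurance rates online",
    "title":   "boat insurance quotes",
    "h1":      "cheap boat insurance",
    "anchors": "boat insurance home",
}

shares = {f: keyword_share(t, "boat insurance") for f, t in factors.items()}
score = sum(shares.values()) / len(shares)   # naive unweighted average
```

The real Score presumably weights the factors differently (a title match should count for more than body copy), but the per-factor percentages it reports are exactly this kind of share.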
You can create a white-label report off of this as well, in addition to being able to export the data the same way as the Page Factor data described above (CSV, HTML, XML, SQL, Cut and Paste).
Custom Settings and Reports
You have the option to set both global and per project preferences inside of Website Auditor.
Per Project Preferences:
Customer information for the reports
Search filters (extensions, words/characters in the URL, etc)
Customizing Workspace defaults for the Website reports and the Web page report
Setting up custom tags
Selecting default Page Ranking Factors
Setting up Domain factors (which appear on the report) like social metrics, traffic metrics from Compete and Alexa, age and IP, and factors similar to the Page Factors but for the domain
XML publishing information
Your Global preferences cover all the application specific stuff like:
Emulation settings and Captcha settings
Company information for reports
Preferred search engines and API keys
Publishing options (ftp, email, html, etc)
Website Auditor also offers detailed reporting options (all of which can be customized in the Preferences area of the application). You can get customized reports for both Page Factor metrics and Page Content Metrics.
I would like to see them improve the reporting access a bit. The reports look nice and are helpful, but customizing the text or inputting your own narratives is accessed via a somewhat arcane dialog box, which makes it hard to fix things if you screw up the code.
Give Website Auditor a Try
There are other desktop on-page/crawling tools on the market and some of them are quite good. I like some of the features inside of Website Auditor (report outputting, custom crawl parameters, social aspects) enough to continue using it in 2012.
I've asked for clarification on this but I believe their Live Plan (which you get free for the first 6 months) must be renewed in order for the application to interact with a search engine.
I do hope they consider changing that. I understand that some features won't work once a search engine changes something, and that is worthy of a charge, but tasks like pulling a ranking report or executing a site crawl shouldn't be lumped in with that.
Nonetheless, I would still recommend the product as it's a good product and the support is solid but I think it's important to understand the pricing upfront. You can find pricing details here for both their product fees and their Live Plan fees.
SEO Spyglass is one of the 4 tools Link-Assistant sells (individually) and as a part of their SEO Power Suite.
We did a review of their Rank Tracker application a few months ago and we plan to review their other 2 tools in upcoming blog posts.
Key Features of SEO Spyglass
The core features of SEO Spyglass are:
White Label Reporting
Historical Link Tracking
As with most software tools there are features you can and cannot access, or limits you'll hit, depending on the version you choose. You can see the comparison here.
Perhaps the biggest feature is their newest feature. They recently launched their own link database, a couple of months early in beta, as the tool had been largely dependent on the now dead Yahoo! Site Explorer.
The launch of a third or fourth-ish link database (Majestic SEO, Open Site Explorer, and Ahrefs rounding out the others) is a win for link researchers. It still needs a bit of work, as we'll discuss below, but hopefully they plan on taking some of the better features of the other tools and incorporating them into their own.
After all, good artists copy and great artists steal :)
Setting Up a Project for a Specific Keyword
One of my pet peeves with software is feature bloat which in turn creates a rough user experience. Link-Assistant's tools are incredibly easy to use in my experience.
Once you fire up SEO Spyglass you can choose to research links from a competing website or links based off of a keyword.
Most of the time I use a competitor's URL when doing link research, but SEO Spyglass doubles as a link prospecting tool as well, so here I'll pick a keyword I might want to target: "SEO Training".
The next screen is where you'll choose the search engine that is most relevant to where you want to compete. They have support for a bunch of different countries and search engines and you can see the break down on their site.
So if you are competing in the US, you can pull data for the top ranking site off of the following engines (only one at a time):
Google Blog Search
Yahoo! (similar to Bing of course)
And some other smaller web properties
I'll select Google and the next screen is where you select the sources you want Spyglass to use for grabbing the links of the competing site it will find off of the preceding screen:
So SEO Spyglass will grab the top competitor from your chosen SERP and run multiple link sources off of that site (would love to see some API integration with Majestic and Open Site Explorer here).
This is where you'll see their own Backlink Explorer for the first time.
Next you can choose unlimited backlinks (Enterprise Edition only) or you can limit it by Project or Search Engine. For the sake of speed I'm going to limit it to 100 links per search engine (that we selected in a previous screen) and exclude duplicates (links found in one engine and another) just to get the most accurate, usable data possible:
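The duplicate-exclusion step is conceptually just a merge that keeps the first engine to report each URL. A minimal sketch (engine names and URLs here are made up for illustration):

```python
def merge_backlinks(per_engine):
    """Merge per-engine result lists, keeping the first engine that found each URL."""
    seen = {}
    for engine, urls in per_engine.items():
        for url in urls:
            # strip a trailing slash so http://b.example/ and http://b.example dedupe
            seen.setdefault(url.rstrip("/"), engine)
    return seen

results = merge_backlinks({
    "google": ["https://a.example/post", "https://b.example/"],
    "bing":   ["https://b.example", "https://c.example/page"],
})
# 3 unique backlinks; https://b.example is only counted once
```

Excluding duplicates this way is what keeps the per-engine link limits meaningful: 100 links per engine minus the overlap is the real unique count you end up analyzing.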
When you start pinging engines, specifically Google in this example, you routinely will get captchas like this:
On this small project I entered about 8 of them and the project found 442 backlinks (here is what you'll see after the project is completed):
One way around captchas is to pay someone to run this tool for you and enter them manually, but for large projects that is not ideal as captchas will pile up and you could get the IP temporarily banned.
Link-Assistant offers an Anti-Captcha plan to combat this issue, you can see the pricing here.
Given the size of the results pane it is hard to see everything but you are initially returned with:
an icon of what search engine the link was found in
the backlinking page
the backlinking domain
Spyglass will then ask you if you want to update the factors associated with these links.
Your options by default are:
Yahoo! Directory Listing
On-page info (title, meta description, meta keywords)
Total links to the page
External links to other sites from the page
Page rank of the page itself
You can add more factors by clicking the Add More button. You're taken to the Spyglass Preferences pane where you can add more factors:
You can add a ton of social media stuff here including popularity on Facebook, Google +, Page-level Twitter mentions and so on.
You can also pick up bookmarking data and various cache dates. Keep in mind that the more you select, especially with stuff like cache dates, the more likely you are to run into captchas.
SEO Spyglass also offers Search Safety Settings (inside of the preferences pane, middle of the left column in the above screenshot) where you can update human emulation settings and proxies to both speed up the application and to help avoid search engine bans.
I've used Trusted Proxies with Link-Assistant and they have worked quite well.
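The "human emulation" idea behind those settings is simple to sketch: randomize the gap between queries and back off sharply whenever a captcha shows up. A minimal illustration in Python (the function names and default timings are my own, not Link-Assistant's implementation):

```python
import random
import time

def human_delay(base=4.0, jitter=3.0):
    """Pause a random, human-looking interval between queries."""
    time.sleep(base + random.uniform(0, jitter))

def backoff_delay(attempt, base=30):
    """Seconds to wait after the Nth consecutive captcha: 30, 60, 120, ..."""
    return base * (2 ** attempt)
```

A scraper loop would call `human_delay()` before each query and sleep for `backoff_delay(n)` after the nth captcha in a row, which keeps the query pattern from looking machine-gun regular.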
You can't control the factors globally (you have to do it for each project), but you can update Spyglass to only offer you specific backlink sources.
I'm going to deselect PageRank here to speed up the project (you can always update later or use other tools for PageRank scrapes).
Working With the Results
When the data comes back you can do a number of things with it. You can:
Build a custom report
Rebuild it if you want to add link sources or backlink factors
Update the saved project later on
Analyze the links within the application
Update and add custom workspaces
These options are all available within the results screen (again, this application is incredibly easy to use):
I've blurred out the site information as I see little reason to highlight the site here. But you can see where the data has populated for the factors I selected.
In the upper left hand corner of the applications is where you can build the report, analyze the data from within the application, update the project, or rebuild it with new factors:
All the way to the right is where you can filter the data inside the application and create a new workspace:
Your filtering options are seen to the left of the workspaces here. It's not full blown filtering and sorting but if you are looking for some quick information on specific link queries, it can be helpful.
Each item listed there is a Workspace. You can create your own or edit one of the existing ones. Whatever factors you include in the Workspace are what will show in the results pane.
So think of Workspaces as your filtering options. Your available metrics/columns are:
Search Engine (where the link was found)
Last Found Date (for updates)
Status of Backlink (active, inactive, etc)
Links Back (does the link found by the search engine actually link to the site? This is a good way of identifying short term, spammy link bursts)
Link Value (essentially based on the original PageRank formula)
Notes (notes you've left on the particular link). This is very limited and is essentially a single Excel-type row
Yahoo! Directory Listing
Total Links to page/domain
Most of the data is useful, though I think link value is a bit overvalued: in my experience I've found links that showed 0 link value in the tool but clearly benefited the sites they linked to.
PageRank queries in bulk will cause lots of captchas, and given how out of date PR can be, it isn't a metric I typically include on large reports.
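Since "link value" is described as being based on the original PageRank formula, it helps to see what that formula actually does: each page splits its score evenly across its outbound links, which is why a link from a page with hundreds of external links carries little value. A toy version of the calculation (my own sketch, not Spyglass's actual scoring):

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to.
    Returns an approximate PageRank score for every page in the graph."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}  # start with a uniform score
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, targets in links.items():
            if targets:
                share = damping * pr[page] / len(targets)  # score splits across outlinks
                for t in targets:
                    new[t] += share
        pr = new
    return pr
```

Running it on a tiny graph shows the intuition: a page that attracts more links ends up with a higher score, and adding outbound links to a page dilutes what each individual link passes.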
Analyzing the Data
When you click on the Analyze tab in the upper left you can analyze in multiple ways:
All backlinks found for the project
Only backlinks you highlight inside the application
Only backlinks in the selected Workspace
The Analyze tab is a separate window overlaying the report:
You can't export from this window but if you just do a control/command-a you can copy and paste to a spreadsheet.
Your options here:
Keywords - keywords and ratios of specific keywords in the title and anchor text of backlinks
Anchor Text - anchor text distribution of links
Anchor URL - pages being linked to on the site and the percentages of link distribution (good for evaluating deep link distribution and pages targeted by the competing site as well as popular pages on the site...content ideas :) )
Domains - domains linking to the competing site and the percentages
TLD - percentage of links coming from .com, .net, .org, .info, .uk, and so on
IP address - links coming from specific IPs and the percentages
Dmoz - backlinks that are in Dmoz and ones that are not
Yahoo! - same as Dmoz
Links Back - percentages of links found that actually link to the site in question
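The anchor text and distribution breakdowns above are easy to reproduce yourself if you'd rather work from exported data in a spreadsheet. A quick sketch (the `anchor` column name is my assumption, not Spyglass's export format):

```python
from collections import Counter

def anchor_distribution(backlinks):
    """backlinks: list of dicts, each with an 'anchor' key.
    Returns each anchor text with its percentage of all links found."""
    counts = Counter(link["anchor"] for link in backlinks)
    total = sum(counts.values())
    return {anchor: round(100 * n / total, 1) for anchor, n in counts.items()}
```

The same Counter approach works for the domain, TLD, and IP breakdowns; you just change which field you count.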
Updating and Rebuilding
Updating is pretty self-explanatory. Click the Update tab and select whether or not to update all the links, the selected links, or the Workspace specific links:
(It's the same dialog box as when you actually set up the project)
Rebuilding the report is similar to updating except updating doesn't allow you to change the specified search engine.
When you Rebuild the report you can select a new search engine. This is helpful when comparing what is ranking in Google versus Bing.
Click Rebuild and update the search engine plus add/remove backlink factors.
There are 2 ways to get to the reporting data inside of Spyglass.
There is a quick SEO Report Tab and the Custom Report Builder:
Much like the Workspaces in the prior example, there are reporting template options on the right side of the navigation:
It functions the same way as Workspaces do in terms of being able to completely customize the report and data. You can access your Company Profile (your company's information and logo), Publishing Profiles (delivery methods like email, FTP, and so on), as well as Report Templates in the settings option:
You can't edit the ones that are there now except for playing around with the code used to generate the report. It's kind of an arcane way to do reporting as you can really hose up the code (below the variables in red is all the HTML):
You can create your own template with the following reporting options:
All the stats described earlier that are available as backlink factors
Top 30 anchor URLs
Top 30 anchor texts
Top 30 links by "link value"
Top 30 domains by "link value"
Conclusion (where you can add your own text and images)
Overall the reporting options are solid and offer lots of data. It's a little more work to customize the reports but you do have lots of granular customization options and once they are set up you can save them as global preferences.
As with other software tools you can set up scheduled checks and report generation.
Researching a URL
The process for researching a URL is the same as described above, except you already know the URL rather than having SEO Spyglass find the top competing site for it.
You have the same deep reporting and data options as you do with a keyword search. It will be interesting to watch how their database grows because, for now, you can (with the Enterprise version) research an unlimited number of backlinks.
SEO Spyglass in Practice
Overall, I would recommend trying this tool out. If nothing else, it is another source of backlinks which pulls from other search engines as well (Google, Blekko, Bing, etc).
The reporting is good and you have a lot of options with respect to customizing specific link data parameters for your reports.
I would like to see more exclusionary options when researching a domain, like the ability to filter out redirects and sub-domain links. It doesn't do much good if you want a quick competitive report but a quarter or more of it comes from something like a subdomain of the site you are researching.
SEO Spyglass's pricing is as follows:
Purchase a professional option or an enterprise option (comparison)
In running a couple of comparisons against Open Site Explorer and Majestic SEO it was clear that Spyglass has a decent database but needs more filtering options (sub-domains mainly). It's not as robust as OSE or Majestic yet, but that's to be expected. I still found a variety of unique links in its database that I did not see in the other tools.
You can get a pretty big discount if you purchase their suite of tools as a bundle rather than individually.
Buzzstream recently rolled out a beautiful UI update and I've been impressed with their offering for a while now.
We like to review products which we ourselves use, as well as products that we feel are impressive. For me, Buzzstream fits both of those characteristics.
Buzzstream is a tool that I am fully adding to my toolset for 2012 and I think you should give it a shot as well.
What is Buzzstream?
Buzzstream has two products:
Buzzstream for Link Building
Buzzstream for Social Media
We will be focusing on the link building tool in this post. Buzzstream for Link Building focuses solely on link building functionality from soup (prospecting) to nuts (tracking, reporting, relationship management).
One of my favorite aspects of this tool is its dedicated nature. It focuses on making link building more collaborative, more scalable, and more effective. It does all three quite well and reinforces the belief that sometimes a dedicated tool is the answer.
Why Buzzstream for Link Building?
Link building has come a long way in recent years with respect to degree of difficulty, quality requirements, and the need to track links and manage relationships.
Link building is such a key piece of an online marketing campaign (not just passing link juice but bringing in targeted, quality traffic and building up brand equity) that I think having a robust tool for it makes a lot of sense, especially when you can use a tool like Buzzstream.
Here are some of the key features of Buzzstream that we'll be covering here:
Link Reporting and Tracking
IMAP Email Integration
Buzzmarker - Link Bookmarking Tool
The dashboard gives you a good, high-level overview of your account's history and tasks.
You can filter the history by:
Showing complete history (notes, emails, twitter, logged calls, blog comments)
One of the above mentioned history fields
Show for all projects or a specific project
All items for/from all users or for/from a specific user
The filtering capabilities are solid and make project spot checks very easy. For a quick export of your history in .csv format, just click on the folder to the left of the task area (in the right column).
Here is what the dashboard looks like:
To the right of the history pane is the task pane as well as recently viewed link prospects. The task pane also offers some good filtering capabilities:
I like the clean, visual look of the dashboard as well as the quick and helpful filtering capabilities. If you are running multiple campaigns with multiple members involved then I think you'll quickly appreciate the way Buzzstream has structured their dashboard.
To begin your link prospecting search, you can go to the Websites link and jump right in.
Then click on the Prospects icon to start your research. Here, you will need to set up a profile and up to 20 keywords and keyphrases for the search. I usually name the search after the main keyword I'm looking for, so in this case we'll rock SEO Tools and I'll throw in a couple more specific keywords for the search function.
In addition to prospecting you can specifically search the following countries:
You also have your choice between website results, news results, and blog results under the Search Type option.
Also, you can have this auto-run daily for new results (which is a great feature!) as well as have notifications sent to a specific person (you or a team member or contractor) when new results arrive.
If you no longer wish to receive results but want to save the search for later, just click the inactive button and reactivate when needed.
Another cool feature here is the blacklist. Dump in sites you wish to exclude from your searches on a per-project or account-level basis. This is extremely helpful for streamlining new prospecting searches across your entire account (block out competitors, your other properties, sites you know you'll never get a link from, etc.).
Working With Link Prospects
When you open the profile again you are presented with the results.
The results come with default columns, but you can click the Columns icon to play with tons of additional, useful options.
Click on that and you get all these column options:
Most Recent Activity
Date Added To Project
Last Modified (any project)
Last Modified (this project)
Last Viewed (any project)
Last Viewed (this project)
Last Communication Date
Inbound Links - SeoMoz
Juice Passing Links
City State Zip
Preferred Contact Method
"Contact Us" URL
Suggested Profile Info
Prospecting Metrics (for keywords in your search)
Highest SERP Position
Average SERP Position
SERP Count - Top 10
SERP Count - Top 20
Buzzstream does a good job here of giving you control over so many different options. The other nice thing here is you can add a bunch of metrics or customize whatever you want, do a quick export, and set everything back to normal if you don't want or need all these metrics every time.
Here's a snippet of what the results look like with no filtering:
From here you can do all sorts of filtering with just about all of the options I outlined above. You can also click on a specific link and manage it at any point:
From here you can do just about anything:
Add a task, tag or note
Assign it to someone
Update the relationship stage
Rate the link
Put your own custom field in there
Copy or move it to another project (love this feature)
Remove it from the project
Check the WhoIs information
Approve it for the project
Add to your block list
Also, you can see the Twitter, FB, email, and phone icons next to each link; Buzzstream will pull those in when available. You can also add a site yourself by clicking the Add Site button, where you can add as much or as little info as you have or want:
What I like to do is update the search with all the SEO related metrics and then filter (not looking for addresses or anything at this point, just SEO metrics).
Here are the filtering options:
The options pretty much cover everything you can add as a metric to their prospect results page. You can also create a specific filter and save it for future use (a big time saver for ongoing prospect research).
Once you are done filtering out the junk you can begin to work the prospect list by:
Assigning it to an employee, a contractor, or yourself :)
Updating the contact history by adding notes
Updating the relationship stage
Once the link is secured you can simply add it to the tracking and reporting component by clicking on the link and selecting "approve".
There are so many filtering options and editing options, as mentioned above, that I really encourage you to get in there and play around with it. You can customize it to fit your specific link building needs (big or small) which is a really nice feature to have (a tool that can scale up or down with you and your business).
Link Reporting and Tracking
I went ahead and approved the link-assistant.com domain as being a link I recently secured. To work with approved links you just need to move on over to the Links tab:
Again, you have a ton of filtering options here:
Buzzstream, via the Column tab, gives you lots of helpful data on a per link basis to help with overall link management and reporting:
You can also import all your links by clicking the import tab (Buzzstream gives you a template to use for this right from the import dialog box).
From here the next logical step is to set up link tracking to automatically notify you of any changes to links you are tracking.
Buzzstream offers automated and manual link tracking. Buzzstream will let you track the following link data types via their automated backlink checker (this runs every 2 weeks) and manual link checker:
Newly verified links
Links that have changed (anchor text, no-follow, and so on)
Links that have been removed
Previous linking pages that are 404's
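Those status checks boil down to fetching each known linking page and classifying what comes back. A rough sketch of the classification step (my own logic and return labels, not Buzzstream's implementation; a real checker would fetch the page first and handle messier HTML):

```python
import re

def classify_backlink(status_code, html, target, expected_anchor):
    """Classify a previously verified backlink from the fetched linking page."""
    if status_code == 404:
        return "page 404"
    # Find an <a> tag whose href points at our target URL.
    pattern = r'<a\s[^>]*href="%s"[^>]*>(.*?)</a>' % re.escape(target)
    match = re.search(pattern, html, re.IGNORECASE | re.DOTALL)
    if not match:
        return "link removed"
    tag = match.group(0)
    if "nofollow" in tag.lower():
        return "changed (nofollow)"
    if match.group(1).strip() != expected_anchor:
        return "changed (anchor text)"
    return "active"
```

Each outcome maps to one of the report buckets above: 404s, removed links, changed links, and links still verified as active.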
You can select who receives this report, and the manual report via email. Manual reports can be completed by going to the links tab and clicking on the Run Backlink Checker Icon:
The report is then delivered to the specified email address (can be changed in project settings) in short order (longer for bigger checks of course).
I would recommend targeting the more important links here. There is a lot of churn on the web, and cloud-based link tracking tools do have tracking limits (Buzzstream comes in at 500 links for the basic plan, 25,000 for Plus, and 100,000 for Premium). They also have a solo plan for 1 user and up to 1,500 tracked links.
They offer custom plans as well.
The link reporting is good, but it is one area where I think they could use some improvement (ability to spit out anchor text distribution reports, upload logos, automated report emailing, etc.).
To generate a report you click on the pie (mmmmm pie) icon on the Links page:
Once you click there you get 2 options:
Link Report - reporting on link opportunities and completed links
Spend Report - reporting on the cost of links that cost money
Here is the dialog box for the Links Report:
Export options are PDF, HTML, and XML for Word and Excel.
The Spend Report is clean and simple to read, here is the dialog box for that:
The reports are quick to generate and clean. I think if they add some more customization options it will be a home run; it's still better than most reporting options out there.
Keeping Up with Contacts
You can store, add, and access key contacts and their contact information within the People tab.
As with their other options there is a wide variety of filtering and column customization capability to help you slice, dice, and keep track of key contacts within a specific project (or through an entire account).
You can add in pertinent contact info like their name, numbers, associated websites, social network information, and so on. You can also keep a history of calls, notes, and emails (more on emails in a minute) right inside the contact's information center:
IMAP Email Integration for Conversation Tracking
This is one of my favorite features. You can configure Buzzstream to automatically populate contact history on your link outreach campaigns:
If you are managing a team, or even just your own link campaign, this is a great feature to have. In addition to the other contact management features I mentioned above, it adds another layer of helpful contact management. Having CRM functionality inside of a link building tool is quite helpful when we talk about scaling link building campaigns and managing teams.
When you add your email account you can also send email from Buzzstream. You can select any number of "People" or contacts and work through them one by one by creating an email template (see below) and quickly customizing it for the specific person you are targeting.
Using canned responses in Gmail is similar but the difference here is the integration with Buzzstream and the ease of going right through a selected list of contacts (and having it saved in their contact history automatically).
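That template-plus-contact-list workflow is essentially a mail merge. If you wanted to prototype the same thing outside Buzzstream, Python's `string.Template` does the job (the template text and placeholder names here are my own, purely for illustration):

```python
from string import Template

# A hypothetical outreach template; $-placeholders get filled per contact.
outreach = Template(
    "Hi $first_name,\n\n"
    "I enjoyed your post on $topic over at $site and thought my recent\n"
    "guide might be a good fit for your readers.\n"
)

def render_all(contacts):
    """Fill the template once per contact dict."""
    return [outreach.substitute(c) for c in contacts]
```

The point of doing it inside Buzzstream rather than a script or Gmail canned responses is, as noted, that each sent message lands in the contact's history automatically.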
Lots of people use BuzzStream as a database of all their prospects/partners and then slice and dice them for campaigns. So, for example, suppose you are trying to secure guest posts. You go to All Contacts (contacts for your whole account, not just one project) and select everything tagged "finance" that's a "guest post" type and has linked to you in the past.
After that, you take those contacts of known finance guest post opportunities, copy them to a new project, and then work that list. Most of this is covered by the filter options described above. Essentially, use the tagging and filtering system to build your own database for rinse-and-repeat solutions.
You can also track Twitter activity (which can get out of hand quickly in terms of real-time, back-and-forth contact); it works the same way as Buzzstream's IMAP integration.
For the Twitter tracking you can basically import a bunch of Twitter lists into BuzzStream, start retweeting their content, and then filter to find everyone you retweeted three days ago (filter by: Communication History=tweet, contact modified=3 days ago).
Save this filter and you have a list of people to follow up with on a regular basis. You can then send a template-based email that refers to the retweet and use that as a quick in to perhaps securing a link opportunity.
Buzzstream's Buzzmarker gives you the ability to save a prospect's information from any browser. To set up the Buzzmarker you just go into your settings and drag the bookmarklet to your toolbar :D
Here is a snippet of the Buzzmarker dialog box:
Anytime you come across news stories, blog posts, or Twitter feeds that you want to store for future work inside of Buzzstream, all you do is click on the Buzzmarker.
The Buzzmarker pulls in lots of information and gives you options to do a variety of things like:
Add a task for the clipping
The ability to gather and note link information like acquisition method and link type; it also checks to see if the site is linking to you already
Add contact info and social media profiles
Links through to contact info search in Google, Pipl, as well as Twitter and Linkedin Profile search via Google, Twellow, and Linkedin
Give Buzzstream a Shot
If you are looking for a strong link building tool which incorporates any of the features below, you should give Buzzstream a try:
Built in Link Prospecting
Ease of Use
Permission and Access Control for Teams
Link Tracking and Reporting
Buzzstream is a quality link building and link management tool that is certainly worth trying out if you are engaged in link building activity. The reporting is stronger than most other options out there but I think they can do even better with it after seeing what they've done on the inside. If you do try them out let us know what you think in the comments!
Take it for a spin; they have free trials available over at Buzzstream.com.
For many years it was true that SEO = links, but due to the rise of rel=nofollow, fearmongering & social media, organic links have lost much of their relative importance in many verticals.
Links are still valuable in some areas of course, but where the search results are full of listings from Google.com, pushed below the fold from larger AdWords ads and/or heavily skewed by things like brand bias there is much less value in link building in numerous big money markets. After all, few care who ranks #1 if #1 is below the fold!
Some of Google's new search results look quite alarming in terms of every single link above the fold is either a paid ad, or links to yet another Google page wrapped in ads.
I have a huge monitor & it is impossible for me to click *anywhere* above the fold on some search results without going through Google's toll booth or clicking off to yet another Google ad wrapped page. (click on the image for the full sized view)
Some people have given Google the benefit of the doubt "well this is just vertical search" and "this is just for the consumer" but we see that in many cases it harms consumers by limiting choice:
Charlie Leocha, the director of the Consumer Travel Alliance, says Google Flight Search is "limiting consumers' knowledge." He explains, "this is a situation where Google is trusted as a 'search engine' that goes across the whole Web, but it is only going to a small select group of airlines and including them in Flight Search."
The bottom line?
According to Leocha, "Google and the airlines have a sweetheart deal with each other, and the consumers are getting screwed."
Those who coddled Google & gave Google the benefit of the doubt now have egg on their face, and the industry as a whole is poorer for their poor judgement & lack of stewardship.
As absurd as the above behavior is, it gets worse. When Google acquired DoubleClick, Larry Page wanted to keep Performics (an SEO/SEM company). But since it would have been a flagrant violation of law for him to run an SEO company, they now decide that nobody should run an SEO company...telling consumers to simply forget about SEO even when they specifically search out information about SEO!
Google recently ran AdWords ads with the following copy when consumers searched Google for SEO information:
You know Google's slogan: "maybe the best ads are just answers." And sometimes they are misdirection or scams that quite literally kill people.
You can't be 100% certain which is which until long AFTER you click. And by then Google's cash register has already rung & it is off to dupe the next person.
Comments turned off, as this is a conversation that NEEDS TO SPREAD. If you run a blog about SEO, you owe it to your readers & your industry to cover this topic. If this topic doesn't get broad coverage then pretty soon your career might be over & you will deserve it too.
There's been quite a few posts by Aaron lately about the things Google is doing wrong, so I figured I'd help Google out and give my boys running the most dominant tech company on Earth a couple of ideas on some things I'd love to do. Who am I? I'm just an anonymous blackhat with too many ideas. You see, I lack the scale and lobbyist army to pull off giant game-changing feats, so rather than just waste a fantasy I think Google could turn them into blackhat realities.
Sell illegal drugs. There's a reason people sell drugs: money, and lots of it. Rather than do the usual narcotics though, I think Google could specialize in flinging massive amounts of pharmaceutical grade contraband…you know, the kind of stuff you need to see three doctors, a pharmacist, and a priest for. And the best part is, if they continually sidestepped large pharma companies by pushing the product via misspellings of the brand name drugs, they could get away with it for like 5 years. No one would ever know! Oh, they did that? Yikes, the DOJ? Ok, moving on.
I'll just chalk that up to them getting pinched for selling a legitimate product, a big brand turf war if you will. If that's the case, Google should invest in figuring out all the top ecommerce KWs and give the list away to overseas peddlers of counterfeit goods. It isn't drugs, but that Gucci knock-off at close to Gucci prices sure has a good margin on it when you're artificially inflating CPC bids with phony quality score demotions. They should get right on that. Man, I am behind the curve again! Don't worry G-men, I'll wink and nod while you "aggressively" crack down on these searches that take less than 5 seconds to find.
Well, ok. They sold drugs and fake goods already. I suppose they could always profit from their Adwords customers multiple ways by interrupting the landing page destination process a few percentage points of the time and…I GOT IT…they could somehow use their ridiculously ubiquitous toolbar base to provide a "feature" that invites the end user to compare the price of the product the advertiser worked so hard to attract and paid Google directly for. Man these guys are good…er…bad. I'm getting jealous here. This is like Goldman Sachs execs in the extreme north 1% making a ton of money advising a client like Greece (the Adwords customer in this case) and then actively profit in the demise of that client by shorting its bonds (by using Google Related to earn that secondary revenue stream). HAHAHA. Oh man, the only way they could have done that any more beautifully is if the recommended pages were somehow funded by Google Ventures and crammed full of Adsense and Viglink.
Speaking of toolbars, I don't think they are leveraging that toolbar install base enough. Yeah yeah, it is a browser extension or plug-in technically, and is governed by a fairly narrow permissible use TOS. But still, wouldn't it be cool if they used it to hijack an install process onto various OS? That way they could push out all sorts of malware, spyware, and adware and maybe even circumvent the OS itself to push people into Chrome OS. Holy crap, that's so awesome – take that Apple!
Come on, like Apple is a saint. We were all thinking of doing it. An OS is nothing though; what really turned Apple around as a company is its iPhone. If Google could have gotten advance knowledge of its development behind a string of NDAs and maybe a seat on Apple's board in order to quickly produce a near-identical product, that would be something. Oh. My. Schmidt. What's even classier is refuting a dead man's words and calling his final dying passion a lie. Siri, get me a lawyer. LOL
Eric Schmidt and the crew do make awesome spies; I can't compete with that. I'm concerned that they aren't spying enough though. Hey, wouldn't it be swell if Google used those fancy street map cars that take naked pictures of me in the front yard and do something really special? I'm thinking grab EVERYTHING within signal range; the best way to make sure someone is using Google is to grab their router login, hack the logs, and check. My friends, I am in awe of your blackhatishness. Nmap is pretty cool huh G-men? Did you install some warez bots too while you're in there?
Warez and crackz shouldn't be scoffed at. Lots of traffic volume from China and Eastern Europe is from people looking for these things. Who cares if it's illegal; if the first 6 things listed didn't stop my law-skirting buddies at Google, I don't think silly little copyright laws should slow them down.
And nothing should slow down the progress of making our kids literate, for a nice cut of the profits of course. The way I see it, Google is good at getting other people's content; what if they just took all the books in the world and copied them? I bet the authors wouldn't even blink an eye, since they just want their works discovered anyhow. Wahhh, you stole something I worked on for 3 years and put it on the web for "free" until ads are wrapped around it and I'm completely cut out of the process. Wahhh.
If the Author's Guild didn't even put a chink in the armor, Google's Wolfram and Hart trained biz dev team may as well get more aggressive. Clearly no one has the teeth to make them obey any sort of law. Killing search dissenters is probably a little early in the game plan (table that for 2014), so why not just kill business models instead. Coupons? Nuke Groupon by launching your own product that uses Adwords data from Groupon's campaign to fuel offer intelligence. That isn't good enough though; what if they took a huge information repository, flat out scraped it, served it up as their own, and then penalized the guys they took it from with a duplicate content penalty. Wow man that'd be hilarious. Well, Matt Cutts did say roughly 40% of the DMCA complaints are phony. That's probably just the case. ;)
DMCA got me thinking. All us SEOs are saying video is the next big wave of spam, so what Google really needs to do is pirate the video web in order to get ahead of the curve. Well then, surf's up.
Killing is probably still out and coveting other people's oxen seems kind of low margin, so maybe they could just steal some more stuff. Scraping has been done to death, but maybe they could steal software from others, sell it as their own, and hope they don't notice. Too bad you got caught, but then 'oracle' does sort of imply they could see it happening in advance.
You know what…Google is doing way better than I ever could, mainly because being a blackhat mostly means doing boring things like buying links, not engaging in the kinds of criminal activities listed above. Kudos my dark arts brethren; you've taken this to a level that would leave me behind bars, and yet you STILL have people believing you are the stand against all that is evil. You truly are masters of deception; here, I have a new logo for you.
"Every single leading company is waiting for user-generated content or is licensing content" in order to reach advertisers, Rosenblatt said. "YouTube was tired of waiting. They told us that they needed a home and garden channel, a pets channel and a health/Livestrong channel. They are paying us up front, plus a rev share. This is the beginning of them funding professional content creators."
With some 1.4 million employees on its U.S. payroll, Walmart's world is about as large as the state of Maine. That's massive by any standard, but when you consider how social media amplifies that number, it's not simply a huge group but an influential one. No small wonder, then, that the earth's largest employer is taking greater measures to motivate and mobilize its people -- and opening up more opportunities for consumer brands to also reach them along the way.
These brands can not only leverage internal resources to further build off the boost Google offers them, but they can then take that attention and sell it back off to the highest bidder:
It's not clear how much ad revenue Walmart World has made or whether MyWalmart.com will become a profit center. But the former already takes in millions of dollars annually in ads from vendors seeking an audience with Walmart employees, according to people familiar with the matter.
If Google consolidates markets too aggressively then ultimately they create competition for themselves through vertical ad networks. In some cases (say travel) Google can buy out the market plumbing & then reassert control:
Wertheimer drew some criticism when he explained that "our airline partners were very clear" that they wouldn't participate in Google Flight Search if online travel agency booking links were included in the core flight-search results.
But Google doesn't have that same influence over retail & each time they put the big brands front and center the more they reinforce that 3rd party dominance.
In addition to leveraging their workforce, it is also quite easy for these brands to use customer incentives to dominate social media.
The above is another reason why Google is pushing so hard to control the second click. If they can taste the traffic again they add efficiency to their own model while introducing another layer of friction to other retailers.
When users finally manage to leave the Google click circus, Google tries to pull them back in with the Google Related toolbar.
In the above quoted AdAge article there is some skepticism around how much a company like Walmart can get out of underpaid wage slaves:
"It's really hard when you're a person making poverty-level wages, just had your health-care premiums raised 60%, and you can only get part-time hours, to be a good ambassador for the brand, no matter how much you love it," said Jennifer Stapleton, spokeswoman for Making Change at Walmart.
However I think that skepticism is misplaced, as the less a person has, the more thankful they tend to be for the little bits they do have. Most people who have nothing do not realize how systems are engineered to screw them over.
It is only when you have free time to think & are not clouded by arbitrary short-term stress that you can ponder the bigger & more uncomfortable questions in life. As long as you don't consider those uncomfortable questions it is far easier to push anything, because you don't know any better.
"The entire web has become full of garbage. The web has become almost a digital Detroit." - Roger McNamee.
If Walmart's strategy works then this ultimately will be why Google's brand-only approach to search will fall flat on its face. If this is successful I would then expect Google to put out some public relations drivel about celebrating the diversity of the web & move away from brand in the next 2 or 3 years.
In the meantime, I expect Google to keep increasing search complexity such that it's prohibitively expensive to make & market a small independent commercial website. That will force many smaller companies to live inside the Google ecosystem, with Google ranking the Google-hosted pages/products/locations for those companies, so that they can serve ads against them and get a bigger slice of the revenues.
Google's ad network is far more profitable than even the lowest waged employee, as it doesn't need to be fed & is designed to be an agnostic & amoral yield optimization tool. And it is effective enough that the biggest retailers are now becoming ad networks.
Average products for average people - with ads everywhere.
Welcome to the WorldWideMart. ;)
Then after the survey: "Thanks for your feedback. Candidate y supports your views on issue x."
Advertisers then get a report like: "in Ohio, 84% of the 289,319 swing voters with an average household income between $32,400 and $67,250 think issue x is vitally important and have a 6:1 bias toward option A. They respond to it more strongly if you phrase it as "a c b" and are twice as likely to share your view if you phrase it that way. The bias is even stronger amongst women & voters under 50, where they prefer option A by a factor of 9:1."
Couple that ability to flagrantly violate their own editorial guidelines with...
knowing user interests (and many other pieces of vital information)
... & Google is in an amazing position politically.
It is thus not surprising to see how politicians have a hard time being anything but pro-Google, as they are the new Western Union.
This isn't the first time Google experimented with cloaking either. Threadwatch had a post on Google cloaking their help files years ago & YouTube offers users a "screw you" screen if they are in a country where the content isn't licensed - yet they still show those cloaked pages ranking in the search results.
"The most perfidious way of harming a cause consists of defending it deliberately with faulty arguments." ― Friedrich Nietzsche
It is common knowledge that you shouldn't mix business and politics, however if one looks at history, many of those who gave us those sage words did precisely the opposite - and often illegally so - selling us down the river.
What is so obnoxious about Google's survey trial is that a big site hit by Panda was hit because it used scroll cloaking & didn't let users get to the content right away. Googlers suggested users didn't like it & voted against it, and then rolled out the same sort of "wait 1 moment please" stuff themselves as a custom beta ad unit.
And today Google just announced that they might create an algorithm which looks at ad placements on a website as a spam signal outside of Panda:
"If you have ads obscuring your content, you might want to think about it," asking publishers to consider, "Do they see content or something else that's distracting or annoying?"
On the one hand they tell you to optimize your ad placements & on the other they tell you that those were not optimal & are so aggressive that they are spam.
For a while you could use something like "would Google do this" as a rule of thumb for gray-area behavior.
In the current market that won't work.
"No man has the right to dictate what other men should perceive, create or produce, but all should be encouraged to reveal themselves, their perceptions and emotions, and to build confidence in the creative spirit." ― Ansel Adams
As ad units get more interactive & Google keeps eating more verticals the line between spam vs not will keep blurring.
Perception is everything.
"We are all in the gutter, but some of us are looking at the stars." ― Oscar Wilde
We are better off if we ignore what Google is saying and follow one thing: Google wants more money for Google. When we make this assumption, everything Google does makes sense. Deception and doublespeak are logical and expected rather than shocking and upsetting.
When it comes to scale, as pointed out with Groupon, all of these rules go out the window. If you look at the biggest advertisers, replace their account with one with no history and the brand "Geico" with "SEOBook auto insurance" and the campaign will simply not run. You are spam. In some cases larger advertisers are able to run ads which are clearly deceptive and go against guidelines which they actively enforce on smaller advertisers. I have a strong suspicion now that this is in fact institutionalized in Google's rating process rather than any employee going out of their way to overturn some sort of penalty.
Google will not disrupt a site or advertiser that will negatively impact their own quarterly earnings. When Google does disrupt one, it is because they have a backup in place. That backup may be their own internal project or a competitor of yours who sends 95% of their advertising through Google's ad platforms. When Google claimed they were going after content farms, and Demand Media's properties (which are explicitly spam) were spared, the reason was obvious, because it would have visibly impacted their bottom line.
Brand is a deceptive concept. A hairy, smelly drug addict that compulsively molests women is not a sex offender but rather a globally famous rock star. Much the same holds true for many of the biggest brands. As long as a brand spams, that spam is opaque to Google's customer base and their customers do not bring a negative association with Google's brand. However, when that same hairy, smelly drug addict is anonymous he is a nuisance who destroys your reputation when you publicly associate yourself with him.
Google is like an oil company which not only dictates the price of oil but also chooses where an oil field will exist. Google is now "too big to fail" as indicated by the recent DOJ investigation which could have resulted in a felony charge for their co-founder, and most certainly would have for a smaller firm without $500m of liquid cash. We should be thankful that visitors are still directed to our websites when they could simply receive excerpts of what they are searching for.
My conclusion: first, I monetize my existing sites with Google's own products as much as possible. Second: I no longer invest my time or money in new businesses that require Google's traffic. Google should expect more walled content gardens in their future. Google's biggest challengers such as Facebook and Apple recognize this, and their platforms are very much walled gardens. That is too bad for the web as we know it today.
As a consumer I want Google to have the best, most trustworthy experience possible. They can fight SEOs and affiliates all day long and it doesn't bother me. I fully expected the innovative waves that helped the web destroy old media to do the same again to the web itself. But when Google lies, and does things that in fact damage the consumer experience, I can no longer defend Google. (When eHow first started popping up in 50% of the searches I did I was shocked; I am absolutely appalled they still show up on page 1 for anything. The articles are obviously written by authors who re-hashed another article in 10 minutes & are often factually incorrect on top of it.)
--- Andrew Johnson submitted the above (less the image) as a comment here, but we thought it deserved to be its own post on the blog so more people get to see it.
Google collects a lot of information on individuals & can have some level of confidence if the person is a real person or not based on things like their history of email usage, if they have a credit card on file, how they interact with other high confidence real accounts, how many people are friends with them on Google+, usage of an Android cell phone, their search history, etc.
Google doesn't need all those signals on any individual, just some blend of them.
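As a rough illustration of "some blend of them," here is a hypothetical weighted scoring sketch. The signal names, weights, and threshold logic are all invented for illustration; nothing here reflects a disclosed Google model.

```python
# Hypothetical sketch: blending partial account signals into a single
# "is this a real person?" confidence score. Signal names and weights
# are invented; missing signals simply contribute nothing, so no single
# signal is required -- only some blend of them.

SIGNAL_WEIGHTS = {
    "email_history_years": 0.25,   # long, normal email usage
    "credit_card_on_file": 0.20,
    "trusted_connections": 0.25,   # friends who are high-confidence real accounts
    "android_device": 0.10,
    "search_history_depth": 0.20,
}

def realness_score(signals):
    """Return a 0..1 confidence that an account belongs to a real person.

    `signals` maps signal names to values already normalized to 0..1.
    Unknown signal names are ignored.
    """
    score = sum(SIGNAL_WEIGHTS[name] * value
                for name, value in signals.items()
                if name in SIGNAL_WEIGHTS)
    return min(score, 1.0)

# An account with only a long email history and trusted friends still
# accumulates a meaningful score, with no credit card or phone on record.
print(realness_score({"email_history_years": 0.9,
                      "trusted_connections": 0.8}))
```

The point of the sketch is the additive blend: confidence accrues from whichever signals happen to exist for a given account.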
From there they can create a lot of usage-based brand signals.
Query Volume + Click Distribution
For any keyword Google can see the search volume & the click distribution on the search results.
If a lot of people click on the top result & very few people click on the second or third result there is a strong chance the keyword is a brand. If the click distribution is spread more evenly across the search results then it is less likely to be a brand keyword.
The above was a hypothetical example, but the following image shows how lower volume branded navigational keywords can drive far more traffic than broader industry keywords. We get twice as much traffic for seobook & seo book as we do for seo.
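The click-concentration heuristic above can be sketched in a few lines. The click counts and the 0.6 concentration threshold are invented for illustration; if Google uses such a cutoff, its actual value is unknown.

```python
# Minimal sketch of the click-distribution heuristic: if clicks
# concentrate heavily on the top result, the query is likely a
# navigational/brand keyword; an even spread suggests a generic keyword.

def looks_like_brand_query(clicks_per_position, threshold=0.6):
    """clicks_per_position: click counts for results 1..N, in order."""
    total = sum(clicks_per_position)
    if total == 0:
        return False
    return clicks_per_position[0] / total >= threshold

print(looks_like_brand_query([9400, 300, 150, 100, 50]))       # concentrated -> True
print(looks_like_brand_query([2100, 1800, 1600, 1500, 1400]))  # even spread -> False
```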
When people search for a generic keyword they may (immediately or later) modify their search query to search for related keywords. In the past Microsoft offered a search funnels tool that would show common searches before & after a keyword. If someone searched for credit cards they might soon search for visa or mastercard.
Of course getting the user to click is just the first step. From there you must satisfy them. ;)
If you visit a page quickly & then jump right back to the search results, Google asks you for an explicit vote against that site.
And if you visit a page for a significant period of time, Google asks you for an explicit vote for that site.
That Google measures the time until users return to the search results to determine which explicit vote to request also implies that they can use the same aggregate data to create an implicit signal.
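A minimal sketch of how dwell time could be mapped to an implicit vote. The 30-second and 120-second cutoffs are invented for illustration; Google has not published any such thresholds.

```python
# Sketch: turning "time until the user returns to the results page"
# into an implicit vote, mirroring the explicit vote prompts described
# above. Cutoff values are hypothetical.

def implicit_vote(dwell_seconds, returned_to_serp):
    if not returned_to_serp:
        return "likely satisfied"   # user never came back to the results
    if dwell_seconds < 30:
        return "vote against"       # quick bounce back: pogo-sticking
    if dwell_seconds > 120:
        return "vote for"           # long engagement before returning
    return "neutral"

print(implicit_vote(8, True))    # -> vote against
print(implicit_vote(300, True))  # -> vote for
```

Aggregated across many searchers, such per-session labels would form exactly the kind of implicit quality signal discussed above.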
Where this measurement can get a bit fuzzy is that Panda can create a self-reinforcing impact (good or bad).
Self-Reinforcing Positive Impacts
Let's say your site got a ranking boost by Panda. It will rank higher across broader industry keywords, to where people may enter your site at the category level (say shoes or Nike shoes) and then surf around your site quite a bit. This equates to a longer time on site & a better user experience.
2 more factors on this front are branded navigation & familiarity.
On some search results Google shows branded search options.
If clicking those brand & store links feeds into the Vince relevancy signal, then any brand featured there has a huge wind at their back, building further brand signals. Eventually such suggestions can work their way into Google Instant keyword suggestions as well. Even if people do not click on those particular options, the various highlights in the search results act as advertisements for the brands, which drive incremental demand and search volume for those brands.
Amazon.com is responsible for roughly 1/3 of ecommerce spend in the United States (outside of travel), so many people might go and research product options generally & then conclude those search sessions by seeing if they can buy it off Amazon.com (due to getting free shipping & the high level of user trust Amazon has). As this becomes part of search relevancy algorithms this is the online equivalent of going to your local Borders store to find something to buy & then buying it on Amazon. In the short run you save a few Dollars, but in the long run stores like Borders go out of business.
Self-Reinforcing Negative Impacts
There are 2 bad ways a business can be impacted by Panda. One is missing out on the above promotional options that a large competitor may enjoy, which over time build more brand signals for them & leave your site stranded in no man's land until it is finally clipped by Panda for lacking "quality."
A second issue is a self-reinforcing issue with Panda. On WMW a user nicknamed Walkman described it as the "size 13 shoe problem." After you have been hit by Panda you are not likely to rank for broader category level searches. However you might still rank for some really obscure longtail keyword that is uneconomic to address directly (and thus only have a glancing mention of the user's intent). Your page might say we do not carry size 13 or size 13 out of stock and your Panda-hit site ranks for "Nike Carmelo Anthony size 13." Thus the user bounces, creating a self-reinforcing negative user experience signal.
A third (non-Panda) issue that can cause poor user experience metrics is when Google mutates the search query in a way that makes the organic results irrelevant.
More on User Votes
Google has long used reviews in their ranking algorithms & even made a tweak to demote businesses with negative reviews.
The above examples of +1 votes and blocks can be used (along with the time on site & repeat visits) to gauge user satisfaction, however if they can't get enough engagement then it will be very easy for big brands to buy that signal for pennies on the Dollar, as some social signals are easily bought by brands.
Not only does Amazon directly integrate promoting your wishlist on social media ...
... but they also have done interesting promotions like a "Tweet & get" ...
Imagine if/when a new local Wal-Mart store launches offering a free $10 coupon to everyone who Tweets their savings at the checkout counter!
One big issue I have with the +1 votes & blocks is that they apply across the board. I may dislike some craptastic videos hosted on YouTube, but there is also a lot of great content there. I love eBay for vintage video games, but it does not mean I love them for books.
Likewise some of the friend of friend stuff can be a bit off.
At some point Google should make +1 votes & blocks more granular.
Near the end of this article I will also further discuss some issues with ad votes.
Does Google measure repeat visitors? Yes.
They use that user interaction to ask for an explicit vote...
...and they can use it as an implicit vote as well.
They not only track how many times you visit a page or site, but also when you last visited it.
Once it is obvious Google is counting certain types of user metrics (just like they count links) there will be a race to the bottom to provide those signals. That race to the bottom will lead to such signals being sold by accounts with sketchy trust metrics (if done through automation) and/or by workers in markets with lower living costs.
In addition to AdSense & Google Analytics, Google has huge search market share, a widely distributed toolbar and their Chrome web browser. They can track where language is used in certain ways and where a site is popular.
And they can also track where the votes come from.
If your domain name matches your keyword that may be a brand signal. However, Google may also look at some other signals (like user engagement, repeat visits, relative CTR, etc.) as confirmation signals on this front.
Sometimes when a spammer builds links they trap themselves by using the same anchor text too much. Whereas when a branded website pulls in organic citations the anchor text tends to be mixed up, like...
http://www.paypal.com
www.paypal.com
Pay pal.com
paypal.com
pay pal
Paypal
paypal payments
etc.
Diversity in any sense (anchor text, linking sources, pages being linked to, links built across time, etc.) is generally considered a good thing.
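One simple way to quantify anchor-text diversity is Shannon entropy over the anchor distribution. This is an illustrative measure of the idea, not a known Google formula, and the sample anchor lists are made up.

```python
# Sketch: measuring anchor-text diversity with Shannon entropy. A spammer
# hammering one exact-match phrase scores near zero; organic citations
# with mixed anchors (PayPal, paypal.com, pay pal, ...) score higher.
import math
from collections import Counter

def anchor_entropy(anchors):
    """Shannon entropy (bits) of the case-folded anchor-text distribution."""
    counts = Counter(a.lower() for a in anchors)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

spammy = ["best payday loans"] * 100
organic = ["PayPal", "paypal.com", "www.paypal.com", "pay pal",
           "Paypal", "paypal payments", "http://www.paypal.com"] * 15

print(anchor_entropy(spammy) < anchor_entropy(organic))  # -> True
```

Entropy is just one candidate; counting distinct anchors, linking domains, or link velocity over time would capture the other flavors of diversity mentioned above.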
Other types of links might also be seen as potential brand signals. For instance, frequent exposure in trusted news sites, other trusted seed sites, or other known brand sites could pass additional karma. Some link spikes that are also associated with strong direct traffic spikes, strong referral traffic from the links, and strong brand searches might also boost the weight given to links.
In local search Google has long used the sites they displaced in the organic results as citations (even if they were in some cases unlinked).
In addition to offering branded filters in their internal navigation, many merchants submitting their products to Google product search may also be giving Google signals about which brands matter.
Google will be able to lean into Zagat ratings for business & other data sources (Google Wallet, Google Offers, etc.) will provide additional signals to Google.
Any type of non-search distribution you have (RSS subscribers, email newsletters, mobile applications, physical stores, membership loyalty programs, etc.) makes it easier to influence search engines.
If advertising with Google had a negative impact on search relevancy you can be sure that the relevancy algorithms would change. Whereas if there is a convenient positive spillover then Google won't complain. In fact, they will even go out of their way to advertise that spillover. Any sort of advertising you do increases brand awareness. And that leads to additional incremental brand searches (and thus brand signal).
More exposure also leads to more user experiences, which in turn leads to more opportunities for people to leave signals behind (be it links, social mentions, additional brand searches, and/or repeat visits). Here is State Farm buying *irrelevant* brand signal for pennies on the Dollar.
And of course there are all sorts of corporate advocacy ads as well.
Even if those votes don't influence rank directly, they still influence user perception. And what is so bad about that is that users are only voting on the content of the ad. This basically is the equivalent of cloaking.
If the landing page doesn't match the ad (free iPad anyone???) then people are going to see their friends vouching for scams & get duped by Google.
That is worse than a press release being advertised as though it were news.
You can also be certain that some clever spammers are integrating +1 buttons in display ads on other ad networks in ways that may automatically collect user clicks & so on, or have users pay for viewing their next porn video by clicking a +1 button (much like some old school email spammers used porn viewers as manual captcha breakers).
Google does offer the ability to vote against an ad as well, but if an ad looks great upfront & it's the landing page that scams you, then how exactly do you vote against it if you don't see the site until after you click the ad?
If you read any of Google's older guidelines that leaked over the years you would see a consistent disdain toward affiliate sites. This was also reflected in official advice at search engine conferences & whatnot.
A friend of mine went to Google's campus & Google offered to "optimize" their AdWords account. As soon as the word affiliate came up it was like spoiled meat. Replacing the word "affiliate" with some other idiotic made up phrase (I think it was "regional online distributor") suddenly made everything O.K. again. Other friends had similar stories.
Note that the difference between "affiliate" and "regional online distributor" is for all intents and purposes linguistic crap, however it can be the difference between life and death for an online business.
To be fair, the ready availability of feeds to quickly generate sites means that most affiliate sites will be garbage. At some point Google gets sick of fighting the same battles over and over again. Then again, most websites are garbage & only the top x% of anything is going to be great.
At Affiliate Summit last year Google's Frederick Vallaeys basically stated that they appreciated the work of affiliates, but as the brands have moved in the independent affiliates have largely become unneeded duplication in the AdWords ad system. To quote him verbatim, "just an unnecessary step in the sales funnel."
It is worth noting that Google doesn't consider itself "just an unnecessary step in the sales funnel" when they insert themselves as an affiliate.
Should information empires be allowed to discriminate based on nothing more than the business model of competitors?
Spam vs Not Spam
The most recently leaked Google rater document stated
Spammers create spam pages to make money. Sometimes, they make money directly, by placing moneymaking links on the spam page. Here are two types of moneymaking links:
Pay-Per-Click (PPC) ads: Spammers get paid each time ads are clicked on their webpages. Another term for PPC ads is "sponsored links".
Thin Affiliates: Spammers make money when a transaction is completed after the user has clicked through to the merchant's site from their webpages
PPC ads appear on many, many webpages. Some pages with PPC ads are spam, but many pages with PPC ads are not. Pages should not be assigned a Spam flag if they are created to provide information or help to users. Pages are spam if they exist only to make money and not to help users. Sometimes, spam pages do not have moneymaking links. These spam pages are created to change search engine rankings or even to do harm to users' computers with sneaky downloads.
So in essence, the difference between spam & not spam is if the page is helpful to users.
The rating document takes 130 pages to clearly articulate the difference between what is spam and what is not spam.
But the core ethos in categorization is if it is original & helpful it is not spam unless it is doing something deceptive.
A Minor Exception*
Google's rater guides also arbitrarily sneaked in the "what the hell, if it is affiliate, it is spam" card:
Note: Major cosmopolitan cities are preferred targets for spammers, especially hotel affiliates. Such results should be flagged as Spam, even if they are related to the query and helpful to users. For example, a hotel affiliate page with a list of Chicago hotels may be assigned a rating Relevant, but also receive a Spam flag.
Google is directly going out of its way to attack competing business models.
Even if the site is quality - any way you slice it - they still tell raters to label it as spam if it is a hotel affiliate.
Once again it is worth pointing out that the label "affiliate" is just an arbitrary label. It could just as well be a "commissioned salesperson."
An Example Market: Books
In our forums one of our members quoted a brilliant book by Karl Polanyi from 1944 which was full of gems like "A so-called self-regulating market economy may evolve into Mafia capitalism — and a Mafia political system"
I searched for that quote & guess what ranked #1?
Google Books of course.
Google's owned & operated affiliate offering in the niche.
The stolen version hosted on Google.com ranks #1...everything else is either spam, unneeded duplication in the marketplace, and/or conjecture that can float up and down as they tweak the algorithms.
To say that the book publishing industry is undergoing pains would be an understatement. But maybe in some weird way Google promoting Google helps the book industry by giving it more avenues to be seen? Maybe they are trying to help out book authors?
The structure of the book industry prevents the book author from getting anything but a small slice of the book's revenues (unless the author is well known and/or they self publish). Markets being what they are, most authors live in obscurity on the long tail. To help supplement their low cut of the revenue pie, some book authors use affiliate links to link to Amazon.com as a purchasing option on their official book websites.
Recently in our forums a member created a thread about a client site being blocked from AdWords because there was an affiliate link on the page for their own book!
Google is The Biggest Online Affiliate
So the author is not allowed to advertise her own work to give you multiple buying options & highlight options which offer her additional compensation, however...
Google is free to steal the copyright work & promote their looted version first
And yet the word "affiliate" is a bad word.
The word affiliate is arbitrarily tarnished in the same way that SEO is.
Use another label & if you do the exact same thing it is clean. Craigslist or eBay are not affiliates as they are marketplaces. Wal-Mart & Amazon.com might do drop shipping & have some affiliate promotions on their sites, but they are retailers.
These arbitrary label differences make a big difference to the stability of an online business.
Machine Learning vs a Small Business Killing Machine
Google can claim that they use artificial intelligence and machine learning and are unbiased, but their ranking systems need training sets. And if upon this alleged independent rating affiliates come up as "spam" then how can an affiliate build a sustainable business model?
I know what you are thinking: "Well, Aaron, they can stop being affiliates and move up the value chain."
The problem with that is that as an affiliate I can compare a lot of products in a condensed space, but if I accept payments for products then I likely need to have a page for each product. The issue there is that if you do not have a strong brand and you have lots of pages on your site there is a great chance that the Panda algorithm will torch your website.
At the same time, if you try to go big & thick you have to worry about competing against Google as they buy out vital pieces of the supply chain, create their own affiliate partnerships, steal your content & outrank you with their copy of it, and launch their own affiliate channels & affiliate stores on their websites.
Brand Sites Become Affiliates
One of the things Google mentioned to identify thin affiliates from other merchants is this:
Check to see if the address of the image is the same as the address of the page or if it is the address of a "real" merchant?
The new items on the website will mostly get to consumers through third-party sellers, which means B&N won't have to carry the expense of inventory. The bookseller will just take a sales commission of 8% to 15% on each item.
What's worse, when brands come under review for spamming, Google says that they already ranked #1 so there is no reason to penalize them. Which is precisely why you can now buy rugs on Barnes & Noble. And it is precisely why you can find dating offers, education offers, jobs, and automotive sections on Excite.com. There is no SEO risk in brand extension for large brands that can do no wrong.
Google puts weight on domain names then suggests that domains can be a spam tool. So in a sense, if you invest in whatever Google trusts and are small you are a spammer. Whereas if you invest in whatever Google trusts and are large you deserve the benefit of the doubt & further promotion.
Sometimes the only difference between the brand and spammer labels is that the brands spam harder.
Google put the +1 button in display ads & claims that if you click on it you are recommending the site in the search results (in spite of having only seen an ad & not actually having seen the landing page yet! how hard is it to advertise "free money" and then offer up a landing page which says "oh, but there's a catch"?)
So if you have brand & money you can just flat out buy the "relevancy" signals. Yet if you try to create similar signals without paying Google & without owning a billion Dollar brand you are shunned & labeled as a spammer.
This subjective circular nonsense is getting a bit out of hand.
In summary, we are not SEOs and we are not affiliates.
We are a brand & we will buy retargeting AdWords ads + up our AdWords budget appropriately.
If we rebrand to remove "SEO" from the domain name can we please be added to Google's whitelist? ;)
As the co-founder of an SEO Consultancy, my biggest hurdle in business is finding more staff. Clients are lining up at our door, we have no trouble there; it's finding the staff to work with them that becomes the issue. This may not sound like the worst dilemma for a business to face, especially during the current global economic decline, but the cause is a matter of great concern to me as both an SEO and a businessman. Ayima's company structure is such that only highly skilled SEOs make it through to our interview stage and yet even then, less than 5% meet our skill requirements. This isn't me being picky, misjudging characters or sourcing bad candidates - this is a knowledge pandemic that is spreading through our industry. We've started apprenticeship programs to teach eager candidates from the ground up, but this can take several years to generate the finished article.
After looking back at our past 30 interview candidates, my opinion for the reason behind this issue may not be a popular one. I believe that celebrity SEOs, brands and blogs are feeding a generation of untested and poorly trained search marketers, who pass themselves off as SEO experts. I will of course explain my positioning…
The Pander Update
Some high profile SEO bloggers recently ceased client work and personal projects, in order to appear impartial and trustworthy to their community. This makes sense at first; after all, who wants to use a link building tool operated by someone working for one of your client's competitors? It does however bring to light two much larger issues:
1) a reliance on tertiary information for SEO analysis, and 2) a reliance on search engineers to provide fresh and exclusive information/data.
Some SEO information sites may argue that they have access to the Web Analytics accounts of their partners and that they do study index changes, but nothing replaces the value of following a handful of websites every single day of the year. An absence of "boots on the ground" leads to misinformation and a distancing from the SEO practices and concerns that really matter. This in turn results in an information churn which newbies to the industry naturally perceive as important.
Moving away from servicing clients or running in-house/affiliate projects also causes a financial flux. Revenue no longer relies on problem solving, but on juicy search engine spoilers and interviews. Search engines are businesses too, though, and it's in their best interest to only reward and recommend the publishers/communities that toe their line. A once edgy and eager SEO consultancy must therefore transition into a best practice, almost vanilla, publisher in order to pander to the whims of over-eager search reps.
How do we expect the next generation of SEO consultants to analyse a website and its industry competitors, when all they've read about is how evil paid links are and how to tweak Google Analytics?
I could directly link the viewpoints and understandings of some recent SEO candidates back to a single SEO community, word for word. They would be horrified to see the kind of broken and malformed SEOs that their community has produced.
OMG, Check Out My Klout
It's true that social media metrics will become important factors for SEO in the future, but this certainly does not negate the need for a solid technical understanding of SEO. Getting 50 retweets and 20 +1's for a cute cat viral is the work of a 12-year-old schoolgirl, not an SEO. If you can't understand the HTML mark-up of a page and how on-page elements influence a search engine, pick up an HTML/SEO book from 2001 and get reading. If you don't know how to optimise site structure and internal linking, read a book on how the web works or even a "UNIX for Dummies" manual. If you're unable to completely map out a competitor website's linking practices, placement and sources, set up a test site and start finding out how people buy/sell/barter/blag/bait for links.
You may be thinking at this point, "Rob, I already know this - why are you telling me?". Well, the sad fact is that many SEOs, with several years of experience at major and minor agencies, fail to show any understanding of these basic SEO building blocks. There are SEOs who can't identify the H1 on a page and who seriously consider "Wordle" and "Link Diagnosis" to be business-class SEO tools. It used to be the case that candidates would read Aaron Wall's SEO Book or Dan Thies' big fat Search Engine Marketing Kit from cover-to-cover before even contemplating applying for an entry level SEO role. These days, major agencies are hiring people who simply say that "Content is King" and "Paid Links are Evil" (and who have at least 50 Twitter followers, of course).
"Certified SEO" is NOT the answer
In most other professional industries, the answer would be simple - regulate and certify. This simply does not work for SEO though. I die a little each time I see a "Certified SEO" proclamation on a résumé, with the examining board consisting of a dusty old SEO company, an online questionnaire or a snake-oil salesman. A complete SEO knowledge base cannot be taught or controlled by a single company or organisation. No one in their right mind would use Google's guide to SEO as their only source of knowledge, for instance, just as no self-respecting Paid Search padawan would allow Google to set up their PPC campaigns. Google's only interest is Google, not you. Popular SEO communities and training providers have their own agendas and opinions too.
I do however concede that some learning should be standardised, such as scientifically proven or verified ranking factors. Just the facts, no opinions, persuasions or ethical stances.
My Plea To You, The Industry
I plead with you, my fellow SEOs, to help fix this mess that we're in. Mentor young marketers, but let them make up their own minds. Put pressure on SEO communities to concentrate on facts/data and not to be scared of controversy or those with hidden agendas. Promote apprenticeship schemes in your company, so that SEOs learn on the job and not via a website. Encourage people to test ideas, rather than blindly believing the SEO teachings of industry celebs and strangers.
An experienced SEO with what I perceive to be basic skills isn't too much to ask for, is it?
Recently we had the pleasure of interviewing one of my favorite link building experts, Melanie Nathan. Melanie has been involved in online marketing since 2003 and is a wonderful writer on all things link building in addition to being a well-respected link builder by her peers.
Melanie runs CanadianSEO, an internet marketing company based in Canada. You can check out some of her posts from the web here, follow her on Twitter here, and follow her on Google Plus here.
We hope you enjoy the interview! So I see you started your career by running a successful e-commerce store, which you then sold off to a US company and then you moved into the client side of things. When did this all start and how did you decide to get into online, e-commerce stuff?
The e-commerce stuff started in 2003. My husband and I were operating a successful brick and mortar auto repair/aftermarket accessory store in Edmonton, where my husband's dad (a skilled mechanic) would fix the vehicles and we would bling them up with cool accessories like euro tail lights and HID lighting kits. When we found out that our main manufacturer would be willing to drop ship their products directly to our North American customers, starting an online store seemed like a no-brainer.
I fell in love with SEO shortly after that, mostly through experimentation with various e-commerce shopping carts and my frustration at not being able to find a decent one (at the time). Some SEOs love the idea of running their own sites rather than working on client sites, based on the difference between the ratio of profits to labor on your own sites versus client sites (relatively speaking). Some SEOs like doing both to help diversify their income streams, and some like pure client work. What led you to decide to get into the client side of things?
I'm happy working for clients because I have a genuine interest in helping people and it's extremely gratifying being able to impact someone's life in such a way. On top of that, the work is constantly changing and I can pick and choose my projects therefore it never gets boring.
If there's a downside, it's that I don't get many opportunities to experiment with different techniques or work on personal projects. This is why I've been slowly making time for the leap into the 'other' side of SEO (tool creation, affiliate marketing and yes, even some BHT) with some domains I own.
I figure, if I'm offering professional services, it's best to be as experienced as possible in order to best serve my clients. If this leads to me eventually moving away from the client side of SEO though, then I might be open to the possibility.
If you're interested in co-developing a link building tool or an affiliate site, ping me and we'll talk ;) You're well-known as a link building expert and you've written extensively on the subject. Can you walk us through how you approach/plan out a new client's plan (generally speaking) and talk about which tools you use and why?
Site owners mainly hire me in order to see measurable movement in the SERPs for their top keyphrases. This means, to help my clients stand out (where Google is concerned), I first need to see what they're up against. I therefore always start with competitive research.
Among the tools I use are SEOmoz's Open Site Explorer & Competitive Link Research Tool. I've also been using the SEOProfiler Competitive Backlink Intelligence tool lately. I also use Yahoo Site Explorer (I'll sure miss this when it's gone!) and, of course, Google itself.
I look for such things as rankings of the site, number of root domains linking, quality of backlinks, backlink velocity and social media mentions. Once I chart out what each competitor's link profile looks like, what I need to do in order to differentiate my client becomes pretty apparent.
After that, it's all about looking for prospects and then developing realistic ways to acquire links from them. I read, and actually have Evernoted (is that the new word for bookmarking?), your Search Engine Journal post on "6 Super Tips For Creating a Natural Link Profile", and some of the things you talk about there (back in 2010) might have helped sites weather parts of this latest Panda parade of updates. Those tips are logical and solid, but require a good amount of work. Do you find that link building failures are a result of trying to look for shortcuts too often, or just not being willing to really put a lot of natural effort into link building?
Thank you for Evernoting (love this) and mentioning that post.
In my experience, the majority of link building failures happen simply because the linkee was too busy thinking about THEIR needs rather than the needs of the linker. They also take shortcuts that often decrease their own chances, such as sending bad email pitches, using generic email subject lines and using poor grammar.
Link building offers awesome rewards, but it can be an incredible amount of effort. If you're unwilling or unable to put in that effort, I guarantee you'll be disappointed with the results. Of course, in some markets these kinds of natural links can be harder (sometimes much harder) to come by. Do you think link building opportunities exist in every market irrespective of the competition (big brands, strong sites, etc.)? Or is it more of a budget issue on the client side when it comes to being unable to compete for really competitive stuff?
I'm always up for a challenge and I have yet to encounter a niche or market where links weren't readily obtainable. Unfortunately, sometimes the techniques required to attract those links, just don't fit within the client's budget. In these cases, I recommend starting out small and, as the client sees more and more ROI, they're happy to increase their budget. After all, some link building is better than no link building.
As far as eventually competing on a large scale, I'll just say that most people grossly underestimate the power that high-quality links can have. What are the key points you look for when identifying link opportunities? Do you consider pure link value to rankings and/or consider links that might be no-follow if they have the potential to bring targeted traffic to the site?
The main thing I focus on when selecting link prospects is relevance. The link absolutely has to make sense or I won't waste my client's time on it.
After that, I look at the overall quality (How many links on the page? Is there any PR? Does it rank for anything?) and, to save a bit of time, I like to run it through the Raven Quality Analyzer (which tells me how many backlinks, indexed pages, age of domain etc). I do all of this in order to determine how much Google trusts the site and the likelihood of a link from the site directly affecting my client's rankings.
As for nofollow links, let's face it, clients don't pay me to get them links that aren't heavy hitting so I generally don't pursue them (unless there's a specific reason for doing so such as trying to help a paid link profile appear more natural). I don't build links in humongous quantities though, so it all evens out.
If you're building links for your own site though, I would never recommend turning down a link that makes sense…. even if it was nofollow. As a provider of services, I see that you also offer a full suite of services. Has that evolved over the years from being mostly a link building company to now being a full service company? Do you find this differentiates you from other providers and is that well-rounded approach one you'd recommend for someone starting a link building company today?
CanadianSEO has always offered a full line of SEO services, however over the years I've learned from experience that it's the LINKS that get you where you need to be in Google, which is why I've made link building my main focus. I now look at web design/site optimization and content creation as necessary steps in making sure your link efforts will have the desired effect.
Not sure if this sets me apart, but my clients are happy, so I would probably recommend this approach to anyone running an SEO company. You absolutely have to be capable of attracting/acquiring/sourcing valuable links though, and this is something that apparently not every SEO is willing (or able) to do. So let's say you are advising me on how to become a better link builder or a better manager of link building teams. What would your top 3 points be, and what are maybe the top 3 myths or over-hyped points I should avoid?
Become a better link builder/manager by a) developing a system for tracking progress b) learning how to be persuasive to get what you want and c) never sacrificing quality in order to meet a deadline or fill a quota.
As far as myths, it may surprise many people to learn that both paid and reciprocal links are still effective as part of an overall link building strategy. I'm always trying to emphasize that Google doesn't know as much about your links as you think it does. Especially when it comes to how your links are obtained. Yes, they do watch for certain obvious things (rate of links acquired, unnatural use of anchor text etc) but it's totally ok to be creative. In fact it's best. As long as you're being logical, you'll get the results you're looking for.
Other than that, I still roll my eyes at people who say PageRank doesn't matter when it comes to links. Hi, um, have you heard that Google still uses PR as a metric of quality? I'd like to offer those same peeps a link from a relevant PR0 or a link from a relevant PR7 and see which one they jump at.
Not that PR should be the ONLY metric you use when determining the value of a link prospect, but if you're interested in making any impact on your rankings, it should definitely be taken into consideration. For tracking link building efforts and for tracking the links you secure, do you use tools for that (like Raven or Buzzstream) or do you do that internally?
I still do it all internally/manually via custom Excel reports. Guess I'm still old-school in that regard.
A typical link report includes such data as link URL, link anchor text, Google cache date, Raven Quality Score, relevancy info, link type, PR and link status. It has everything my clients need in order to see the progress of their link campaigns, and it's also great for keeping me organized as I'm often building links for many sites at once. Please tell us what you think are going to be the most important aspects of link building going forward in this age of rapid algorithm changes and social signals?
Many people assume that link development is decreasing in importance, but this is far from the case. Links are still the simplest way for search engine spiders to judge the reliability of a webpage. However, the way that search engines view links is changing.
I've definitely seen (what I consider to be) evidence that Google is using social media mentions as a measure of quality. In an age where Facebook 'likes', Tweets and Google +1's can be readily bought and sold though, one has to wonder about the longevity of such a system.
I almost feel sorry for Google in that no matter what they try to use as a measure of quality, there will always be ways to game it. I think this is precisely why they're trying to move away from organic SERPs by diversifying them so much. It's an imperfect system and I seriously don't envy the position they've put themselves in.
As always, those that can keep up and adapt, will ultimately have the most success.
Thanks for your time Melanie! You can stay up to date with Melanie over at Twitter and Google Plus.
Melanie runs the show over at CanadianSEO.Com; a web marketing firm that offers web design, SEO, link building, and content creation services.
SEM Rush has long been one of my favorite SEO tools. We wrote a review of SEM Rush years ago. They were best of breed back then & they have only added more features since, including competitive research data for many local versions of Google outside of the core US results: UK, Russia, Germany, France, Spain, Italy, Brazil.
Recently they let me know that they started offering a free 2-week trial to new users. Set up a free account on their website & enter the promotional code "89MW-YR43-HFNJ-K94M"
For full disclosure, SEM Rush has been an SEO Book partner for years, as we have licensed their API to use in our competitive research tool. They also have an affiliate program & we are paid if you become a paying customer, however we do not get paid for recommending their free trial & their free trial doesn't even require giving them a credit card, so it literally is a no-risk free trial.
What is SEM Rush?
SEM Rush is a competitive research tool which helps you spy on how competing sites are performing in search. The big value add that SEM Rush has over a tool like Compete.com is that SEM Rush offers CPC estimates (from Google's Traffic Estimator tool) & estimated traffic volumes (from the Google AdWords keyword tool) near each keyword. Thus, rather than showing the traffic distribution to each site, this tool can list keyword value distribution for the sites (keyword value * estimated traffic).
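The value-distribution idea above is simple arithmetic. A hedged sketch of it in Python (the keywords, CPCs, and volumes below are invented for illustration, and this is not SEM Rush's actual formula):

```python
# Illustrative sketch of "keyword value distribution": each keyword's
# value is its estimated CPC times its estimated monthly search volume,
# and a site's total is the sum over the keywords it ranks for.
# All figures below are made up for the example.
keywords = {
    "seo tools":    {"cpc": 4.50, "volume": 12000},
    "rank checker": {"cpc": 2.50, "volume": 5400},
    "backlinks":    {"cpc": 3.25, "volume": 9900},
}

def keyword_value(cpc, volume):
    """Estimated monthly traffic value of one keyword."""
    return cpc * volume

total_value = sum(keyword_value(k["cpc"], k["volume"]) for k in keywords.values())
print(total_value)  # 99675.0
```

Ranking the per-keyword values rather than raw traffic is what surfaces the commercially interesting terms.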
As Google has started blocking some referral data from being passed, the value of using these 3rd party tools has increased.
Using these estimates generally does not provide overall traffic totals that are as accurate as Compete.com's data licensing strategy, but if you own a site and know what it earns, you can set up a ratio to normalize the differences (at least to some extent, within the same vertical, for sites of similar size, using a similar business model).
One of our sites that earns about $5,000 a month shows a Google traffic value of close to $20,000 a month: 5,000 / 20,000 = 1/4 = 0.25.
A similar site in the same vertical shows a traffic value of $10,000: $10,000 * 0.25 = $2,500 in estimated earnings.
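That normalization trick can be written down as a tiny sketch (same numbers as the example above; the variable names are mine):

```python
# Normalize a competitor's estimated traffic value using a site whose
# real earnings you know (figures from the example in the text).
known_earnings = 5000        # what your own site actually earns per month
known_estimate = 20000       # Google traffic value reported for that site

ratio = known_earnings / known_estimate   # 0.25

competitor_estimate = 10000  # similar site, same vertical, same model
competitor_earnings = competitor_estimate * ratio
print(competitor_earnings)   # 2500.0
```

As the text cautions, the ratio only holds to an extent, within the same vertical, for sites of similar size and business model.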
A couple big advantages over Compete.com and services like QuantCast for SEM Rush are that:
they focus exclusively on estimating search traffic
you get click volume estimates and click value estimates right next to each other
they help you spot valuable up-and-coming keywords where you might not yet get much traffic because you rank on page 2 or 3
Disclaimers With Normalizing Data
It is hard to monetize traffic as well as Google does, so in virtually every competitive market your profit per visitor (after expenses) will generally be less than Google's. Some reasons why:
In some markets people are losing money to buy marketshare, while in other markets people may overbid just to block out competition.
Some merchants simply have fatter profit margins and can afford to outbid affiliates.
It is hard to integrate advertising in your site anywhere near as aggressively as Google does while still creating a site that will be able to gather enough links (and other signals of quality) to take a #1 organic ranking in competitive markets...so by default there will typically be some amount of slippage.
A site that offers editorial content wrapped in light ads will not convert eyeballs into cash anywhere near as well as a lead generation oriented affiliate site would.
SEM Rush Features
Keyword Values & Volumes
As mentioned above, this data is scraped from the Google Traffic Estimator and the Google Keyword Tool. More recently Google combined their search-based keyword tool features into their regular keyword tool & this data has become much harder to scrape (unless you are already sitting on a lot of it like SEM Rush is).
Top Search Traffic Domains
A list of the top 100 domain names that are estimated to be the highest value downstream traffic sources from Google.
You could get a similar list from Compete.com's Referral Analytics by running a downstream report on Google.com, although I think that might also include traffic from some of Google's non-search properties like Reader. Since SEM Rush looks at both traffic volume and traffic value it gives you a better idea of the potential profits in any market than looking at raw traffic stats alone would.
Here is a list of sites that rank for many of the same keywords that SEO Book ranks for
Most competitors are quite obvious, however sometimes they will highlight competitors that you didn't realize, and in some cases those competitors are also working in other fertile keyword themes that you may have missed.
Here is a list of a few words where SEO Book and SEOmoz compete in the rankings
These sorts of charts are great for trying to show clients how site x performs against site y in order to help allocate more resources.
Compare AdWords to Organic Search
These are sites that rank for keywords that SEO Book is buying through AdWords. And these are sites that buy AdWords ads for keywords that this site ranks for.
Before SEM Rush came out there were not many (or perhaps any?) tools that made it easy to compare AdWords against organic search.
Start Your Free Trial Today
SEM Rush Pro costs $79 per month (or $69 if you sign up recurring), so this free trial is worth about $35 to $40.
Take advantage of SEMRush's free 2-week trial today. Set up a free account on their website & enter the promotional code "89MW-YR43-HFNJ-K94M"
If you have any questions about getting the most out of SEM Rush feel free to ask in the comments below. We have used their service for years & can answer just about any question you may have & offer a wide variety of tips to help you get the most out of this powerful tool.
So today Google announced that they have turned on SSL by default for logged in users, a feature that has been available for a while on encrypted.google.com. The way they set it up, as explained in this post, means that your search query will not be forwarded to the website you're visiting and that they can only see that you've come from an organic Google result. If you're buying AdWords however, you still get the query data.
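For site owners, the practical effect shows up in the Referer header. As a rough sketch (assuming server-side referrer parsing; the URLs below are illustrative), the organic referrer used to carry the search query in a q= parameter, and under SSL search it no longer does:

```python
from urllib.parse import urlparse, parse_qs

def search_query(referrer):
    """Return the Google search query from a referrer URL, or None."""
    parsed = urlparse(referrer)
    if "google." not in parsed.netloc:
        return None  # not a Google referrer at all
    return parse_qs(parsed.query).get("q", [None])[0]

old_style = "http://www.google.com/search?q=blue+widgets"
ssl_style = "https://www.google.com/url?sa=t"  # no query passed along

print(search_query(old_style))  # blue widgets
print(search_query(ssl_style))  # None
```

The referrer still identifies Google as the source, so you know the visit was organic search; you just can no longer see what was searched for.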
This is what I call hypocrisy at work. Google cares about your privacy, unless they make money on you, then they don't. The fact is that due to this change, AdWords gets favored over organic results. Once again, Google gets to claim that it cares about your privacy and pulls a major public "stunt". The issue is, they don't care about your privacy enough to not give that data to their advertisers.
That might also enlighten you to the real issue: Google still has all your search data. It's just not allowing website owners to see it anymore. It's giving website owners aggregated data through Google Webmaster Tools, which would be nice if that data hadn't proven to be so incredibly useless and inaccurate.
If Google really cared about your privacy, (delayed) retargeting wouldn't be available for advertisers. They wouldn't use your query data to serve you AdSense ads on pages, but I doubt they'll stop doing that; if they did, they would probably have said so and made a big fuss out of it.
If Google really cared, the keyword data that site owners now no longer receive from organic queries would no longer be available for advertisers either. But that would hit their bottom line, because it makes it harder to show ROI from AdWords, so they won't do that.
The Real Reason for killing organic referral data
So I think "privacy" is just a mere pretext; a "convenient" side effect that's used for PR. The real reason that Google might have decided to stop sending referral data is different. I think it is that its competitors in the online advertising space, like Chitika and Chango, are using search referral data to refine their (retargeted) ads, and they're getting some astonishing results. In some ways, you could therefore describe this as mostly an anti-competitive move.
In my eyes, there's only one way out. We've now determined that your search data is private information. If Google truly believes that, it will stop sharing it with everyone, including their advertisers. Not sharing vital data like that with third parties but using it solely for your own profit is evil and anti-competitive. In a country such as the Netherlands, where I live and where Google has a 99%+ market share (in other words, a monopoly), I'm hoping this will result in a bit of action from the European Union.
Joost is a freelance SEO consultant and WordPress developer. He blogs on yoast.com about both topics and maintains some of the most popular WordPress plugins for SEO and Google Analytics in existence.
Google would spin Performics out of DoubleClick, and sell it to holding firm Publicis. Only one major force inside of Google hated the plan. Guess who? Larry Page.
According to our source, Larry tried to sell the rest of Google's executive team on keeping Performics. "He wanted to see how those things work. He wanted to experiment."
A search engine selling SEO services? Yep.
And now they are aggressively entering the make money online niche. Both Prizes.org & YouTube are in the top 3 ad slots for "make money online"
And I am seeing some of those across portions of the content/display network as well. I just saw this in Gmail today.
How does this align with the Google AdWords TOS?
To protect the value and diversity of the ads running on Google, we don't generally permit advertisers to manage multiple accounts featuring the same business or keywords except in certain limited exceptions. Furthermore, Google doesn't permit multiple ads from the same or an affiliated company or person to appear on the same results page. We've found that pages with multiple text ads from the same company provide less relevant results and a lower quality experience for users. Over time, multiple ads from the same source also reduce overall advertiser performance and lower their return on investment.
Google doesn't allow advertisers or affiliates to have any of the following:
Ads across multiple accounts for the same or similar businesses
Ads across multiple accounts triggered by the same or similar keywords
Well, as it turns out, the Google AdWords TOS doesn't actually apply to Google.
Search is a zero sum game.
Google is just getting started with breakfast. I am afraid to see what the last meal looks like!
Google has recently begun refining search queries far more aggressively. In the past they would refine search queries if they thought there was a misspelling, but new refinements have taken to changing keywords that are spelled correctly to align them with more common (and thus profitable) keywords.
As one example, the search result [weight loss estimator] is now highly influenced by [weight loss calculator]. The below chart compares the old weight loss estimator SERP, the current weight loss estimator SERP & the current weight loss calculator SERP. Click on the image for a larger view.
There are two serious issues with this change:
disclosure: in the past refinement disclosures appeared at the top of the search results, but now it often ends up at the bottom
awful errors: a couple months after I was born my wife was born in Manila. When I was doing some searches about visiting & things like that, sometimes Google would take the word "Manila" out of the search query. (My guess is because the word "Manila" is also a type of envelope?)
Here is an example of an "awful error" in action. Let's say while traveling you find a great gift & want to send it to extended family. Search for [shipping from las vegas to manila] and you get the following
The search results contain irrelevant garbage like an Urban Spoon page for Las Vegas delivery restaurants.
How USELESS is that?
And now, with disclosure of changes at the bottom of the search results, there isn't even a strong & clean signal to let end users tell Google "hey you are screwing this up badly."
In some ways I am inspired by Google's willingness to test and tweak, but in others I wonder if their new model for search is to care less about testing and hope that SEOs will highlight where Google is punting it bad. In that case, they just roped me into offering free advice. ;)
Link Assistant offers SEOs a suite of tools, under an umbrella aptly named SEO Power Suite, which covers many aspects of an SEO campaign.
Link Assistant provides the following tools inside of their Power Suite:
Rank Tracker - rank tracking software
WebSite Auditor - on-page optimization tool
SEO Spy Glass - competitive link research tool
Link Assistant - their flagship link prospecting, management, and tracking tool
We'll be reviewing their popular Rank Tracking tool in this post. I've used their tools for a while now and have no issue in recommending them. They also claim to have the following companies as clients:
Rank Tracker is one of the more robust, fast, and reliable rank checking tools out there.
Is Rank Tracker a Worthy Investment?
Rank Tracker offers a few different pricing options:
All of the editions have the following features:
Customizable reports (you can only save and print with the Enterprise level however, which is kind of a drawback in my opinion; Pro accounts should have this functionality)
Human search emulation built in
User agent rotation
Google analytics integration
Multiple language support (English, German, Russian, French, Dutch, Spanish, Slovak)
Runs on Windows, Mac, Linux
All editions offer access to their keyword research features, with all the features included; the only difference here is that the free edition doesn't allow KEI updates.
Rank Tracker Feature Set
Rank Tracker offers a keyword research tool and a rank checking component within the application. A more thorough breakdown of the feature set is as follows:
I prefer to do my keyword research outside of tools like this. Generally specific tools seem to excel at their chosen discipline, in this case rank checking, but fall kind of short in areas they try to add-on. I like to use a variety of tools when doing keyword research and it's easier for me, personally, to create and merge various spreadsheets and various data points rather than doing research inside of an application.
However, Rank Tracker does offer a one-stop shop for cumbersome research options like various search suggest methods and unique offerings like estimated traffic based on ranking #1 for that specified term.
Overall, a nice set of keyword research features if you want to add on to the research you've already done.
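The "estimated traffic based on ranking #1" figure is presumably just search volume multiplied by an assumed top-position click-through rate. A hedged sketch of that idea (the 35% CTR is my illustrative assumption, not Rank Tracker's actual model):

```python
# Rough traffic estimate for ranking #1: monthly searches times an
# assumed CTR for the top organic position. The CTR figure here is a
# made-up assumption for illustration.
TOP_POSITION_CTR = 0.35

def estimated_traffic(monthly_searches, ctr=TOP_POSITION_CTR):
    """Rough monthly visits if the site ranked #1 for this term."""
    return round(monthly_searches * ctr)

print(estimated_traffic(10000))  # 3500
```

Estimates like this are most useful for comparing keywords against each other rather than as absolute visitor counts.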
Rank Tracker also gives you the option to factor in data from Google Trends as well as through Google Analytics (see current ranking for each keyword and actual traffic).
As this is the tool's core component, it's really no surprise that this part of Rank Tracker shines. Some of the interesting options here are in the ability to track multiple Google search areas like images, videos, and places.
In addition to the interesting features I mentioned above, Rank Tracker also includes a wide array of charting and design options to help you work with your data more directly and in a clearer way:
Usability is Top Notch
While the interfaces aren't the prettiest, this is one of the most user-friendly rank tracking tools that I've come across.
First you simply enter the URL you wish to track. Rank Tracker will automatically find the page AND sub-domain on the domain ranking for the keywords chosen, so you don't have to enter these separately.
You enter the site you want to check (remember, subpages and subdomains are automatically included)
Choose from a whole host of engines and select universal search if you wish to factor in places taken up by Google insertions into the SERPS:
Enter your keywords:
Let Rank Tracker go to work: (you can choose to display the running tasks as line views or tree views, a minor visual preference)
That's all there is to it. It is extremely easy to get a project up and running inside of this tool.
Working with Rank Tracker
Inside of Rank Tracker the data is displayed clearly, in an easy to understand format:
In the top part you'll get to see:
the keywords you selected
current rank compared to last rank
overall visibility (top rankings) in each search engine selected
custom tags you can apply to your keywords for tracking purposes
On the bottom chart you'll see three options for the selected search engine (bottom) and keyword (top):
ranking information for each search engine for the selected keyword
historical records (last check date and position)
progress graph (visual representation of rankings, customizable with sliders as shown in the picture)
The ranking chart shows the chart for the chosen keyword and search engine:
Within the ranking results page, you can select from these options to get a broader view of how your site is performing on the whole:
Customizing Rank Tracker
Inside of Rank Tracker's preferences you'll see the following options, most of which are self-explanatory:
This is where you can take advantage of some of their cooler features like:
adding competitors to track
adding in your Google Analytics account
customizing your reporting templates
changing up human emulation settings
adding in a captcha service
adding in multiple proxies to help with the speed of the tool as well as to prevent blocks
You can track up to 5 competitors per Rank Tracker profile (meaning, 5 competitors per one of your sites).
Key Configuration Options
Rank Tracker has a ton of options, as you can see from the screenshot above. Some of the more important ones to pay attention to are the reporting options.
You'll want to set up your company information as shown here: (this is what will show on your reports)
On a per-profile basis you can customize client-specific details like so:
You can create new and modify existing templates for multiple report types here as well:
Emulation settings are important: you want to make sure your requests look as normal and human as possible. It makes sense to check the "visit search engine home page" option, in addition to having delays between queries, to make your ranking checks appear more natural.
One thing that irks me about Rank Tracker is that emulation is turned off by default. If you don't adjust your settings and you try to run a moderately sized report, you'll get a Google automated ban in short order, so be careful!
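The core idea behind those delay settings can be sketched in a few lines: randomized pauses between queries, rather than fixed intervals, are much harder to flag as machine-generated. The min/max values below are placeholders, not Rank Tracker's actual defaults.

```python
import random
import time

def humanized_delays(num_queries, min_delay=5.0, max_delay=15.0):
    """Yield a randomized pause (in seconds) before each query so the
    request timing doesn't look machine-generated; fixed intervals are
    trivial for a search engine to detect."""
    for _ in range(num_queries):
        yield random.uniform(min_delay, max_delay)

# Usage sketch: pause between rank checks instead of firing them back to back.
# for delay in humanized_delays(len(keywords)):
#     time.sleep(delay)
#     check_ranking(...)  # hypothetical query function
```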
In addition to emulation, search approach is also worthy of a bit of tinkering as well. Given how often Google inserts things like images, products, and videos into search results you might want to consider using universal search when checking rankings.
Also, the result depth is important. Going deep here can help identify sites that have been torched, rather than sites that simply fell outside the top 20 or 50. 100 is a good default baseline.
Successive search gives you a more accurate view, as it goes page by page rather than grabbing 100 results at a time (double listings, for example, can throw off the count when not using successive search).
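To see why double listings matter, here's a simplified simulation (not Rank Tracker's actual implementation) of counting a SERP position two ways: raw, and with consecutive results from the same domain collapsed into one slot, as a page-by-page crawl can perceive them.

```python
def serp_position(results, target_domain, collapse_double_listings=False):
    """Return the 1-based position of target_domain's first result.

    When collapse_double_listings is True, consecutive results from the
    same domain count as a single slot, so the two counting methods can
    disagree whenever a double listing appears above your site.
    """
    position = 0
    prev_domain = None
    for url in results:
        domain = url.split("/")[2]  # crude host extraction for the sketch
        if not (collapse_double_listings and domain == prev_domain):
            position += 1
        prev_domain = domain
        if domain == target_domain:
            return position
    return None  # not found within the checked depth
```

With a double listing for `a.com` sitting above you, the raw count reports you a position lower than the collapsed count does, which is exactly the discrepancy successive search avoids.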
Finally, another important option is scheduling. You can schedule emails, FTP uploads, and so on (as well as rank checks) from this options panel. Your machine does have to be on for this to work (not in sleep mode for instance). In my experience Rank Tracker has been pretty solid on this front, with respect to executing the tasks you tell it to execute (consistently).
Software versus Cloud
There are some strong, cloud-based competitors to Rank Tracker. Our Rank Checker is a great solution for quick checks, and for ongoing tracking if you do not need graphical charts and such (though you can easily make those in Excel if needed).
Competitors and Options
Raven offers rank tracking as part of their package, and there are other cloud-based services like Authority Labs (who actually power Raven's tools) you can look into if you want to avoid using software for rank checking.
There are some drawbacks to cloud-based rank tracking though. Some of them do not have granular date-based comparisons as they typically run on the provider's schedule rather than yours.
Also, most cloud rank checking solutions limit how many keywords you can track. So if you are doing enterprise-level rank checking, it makes sense to use a software tool plus a proxy service like Trusted Proxies.
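The reason a proxy pool helps at enterprise scale is simple arithmetic: with N proxies rotated round-robin, each IP address carries only 1/N of the query volume. A minimal sketch of that rotation (the proxy addresses below are dummy values, not a real service's endpoints):

```python
from itertools import cycle

# Hypothetical proxy pool; with N proxies, the per-IP query rate drops
# to 1/N, which is what keeps large keyword sets from tripping blocks.
proxies = ["203.0.113.10:8080", "203.0.113.11:8080", "203.0.113.12:8080"]
rotation = cycle(proxies)

def next_proxy():
    """Round-robin through the pool so successive queries leave from
    different IP addresses."""
    return next(rotation)
```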
Pricing and Final Thoughts
Rank Tracker offers a generous discount if you grab all their tools in one bundle. If you want to customize, schedule, and print reports you'll need the enterprise edition.
I think requiring the purchase of the top tier edition for the basic functionality of printing reports is a mistake. I can see having that limitation on the free edition, but if you pay you should get access to reports.
You can find their bundle prices here and Rank Tracker's specific pricing here. Also, similar to competitors, they have an ongoing service plan which is required if you plan to continue to receive updates after the initial 6 months.
Despite my pricing concern regarding the reporting options, I think this is one of the top rank checkers out there. It has a ton of features and is very simple to use. I would recommend that you give this tool a shot if you are in the market for a robust rank checking solution. Oh I almost forgot, rank checking is still useful :)
One More Note of Caution
Be sure to read the complaints below about how unclear & sneaky the maintenance plan pricing is. This is something they should fix ASAP.