Does My Site Look Fat in this WordPress?

If your site runs on WordPress, it is highly possible – even likely – that your site needs a diet.  WordPress makes it mind-numbingly easy to create lots of different pages by recycling your content, or snippets of your content, into various related pages.  This has been grossly exacerbated by uninformed SEO consultants  pushing their clients to aggressively “tag” blog posts.


WordPress Often Generates Too Many Pages

First, understand that search engines don’t necessarily review all of the pages on a site, but instead use the site’s authority (from links etc.) to determine just how many pages they will both crawl (find) and index (add to the consideration set for search results).  Therefore, sites with low authority and lots of pages may find that most of their pages receive zero traffic and aren’t ever seen by search engines.

Let’s use Atticus Marketing as an example to show why all of these extra pages are problematic.  Yesterday, I published a great post on the differences between three mainstream CMS systems.  Not only have the search engines failed to send any traffic to my lovely content, they haven’t indexed or even crawled it – they don’t know it exists.  This despite the fact that I’ve done all of the requisite social media marketing: posts on LinkedIn, Facebook, Google Plus, Tweets and Retweets.

Here’s why:  Atticus is a very young site, with just 34 different sites linking to it AND I’ve built out lots of extra pages through WordPress’s Categories and Tagging functionality. Every time you create a category or a tag, WordPress generates a page to organize content with that category or tag.  These pages are optimized for the category/tag.  This functionality has the capacity to generate lots of pages with content that already has a home on your site – duplicate content that search engines eschew. (Note that tags are worse because the interface is very freeform, encouraging writers to generate multiple versions of similar tags).  You can see how this gets out of control:  on AtticusMarketing.com I have a paltry 10 pages and 25 blog posts – yet Google indexes 186 different pages for the site.  The reality is, the vast majority of these pages just contain content that exists elsewhere on the site.

The reason the search engines haven’t deigned to even look at my lovely new content is because my site’s authority and multiple duplicate content pages combine to convince them that much of my content just isn’t worth their time.

Tagging and SPAM

Look at this from a search engine’s perspective to understand why the combination of page volume and site authority determines how many pages are reviewed.  Let’s review a more extreme example from the Carter Law Firm, a site that uses WordPress’s tagging functionality to generate a litany of spammy pages.

The blog is well written, has some beautiful imagery, appropriately utilizes external and internal links and embraces edgy topics including Topless Day and revenge porn.  Unfortunately, at the end of every single post is a long list of entries for both “Filed Under” (i.e. categories) and “Tagged With” (i.e. tags).  Here’s the entry for the post “Ask the Hard Questions Before Starting a Business”:

tag spam

This one piece of (very good) content is now going to be replicated on 15 different pages across her domain – most of which will have nothing but a verbatim copy of this content.  And many of these pages are “optimized” (I use the term very loosely here – but optimized with on-page elements like Title Tags, H1s, URL etc.) for very similar content:

  • “How to start a business” vs. “How to start LLC” vs. “start LLC”.
  • “Arizona business attorney” vs. “Arizona small business attorney” vs. “Phoenix business attorney” vs. “Phoenix small business attorney”.
  • “Business operating agreement” vs. “Limited Liability Company operating agreement” vs. “Operating agreement for LLC” vs. “what is an operating agreement”.

This is a content spam tactic intended to capture variants of long tail search queries.  In reality, search engines figured this out years ago and the site owner is doing nothing other than artificially inflating her page count – most likely to the detriment of her search performance.

How to Avoid These Problems

Personally, I enjoy the tag clouds that are generated by tagging my posts, and there is definitely a user benefit to being able to see articles grouped along common threads.  To use tags and avoid an inflated page count, simply noindex your tags.  (Be careful about noindexing your categories as well: first make sure the URL structure for your posts doesn’t include the category folder.)  The Yoast SEO plugin has simple checkboxes for this, as does the All-In-One SEO pack (below):

Tag Spam 2
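If you want to verify that the noindex setting actually took effect, here’s a minimal diagnostic sketch in Python (the tag URL is a placeholder – swap in one of your own tag archive pages).  It fetches the page and checks for a robots meta tag containing “noindex”:

```python
# Check whether a tag archive page carries a "noindex" robots meta tag.
# Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

# Hypothetical tag archive URL -- replace with one of your own tag pages.
TAG_URL = "http://www.example.com/tag/divorce-law/"

response = requests.get(TAG_URL, timeout=10)
soup = BeautifulSoup(response.text, "html.parser")

robots = soup.find("meta", attrs={"name": "robots"})
content = (robots.get("content", "") if robots else "").lower()

if "noindex" in content:
    print("OK: tag page is set to noindex ->", content)
elif robots:
    print("Warning: robots meta tag found, but no noindex ->", content)
else:
    print("Warning: no robots meta tag found; this tag page is indexable.")
```

If the script reports no noindex, double-check the plugin’s settings for tag archives before assuming you’re covered.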

However, when implemented carefully, tags can still be effective in generating inbound search traffic.  Follow these best practices:

  • Limit your site to 5-8 general, broad categories.
  • Tags should have multiple posts of a similar topic associated with them.
  • Tags should be genuinely different, not replicated spam chasing verbal nuance, i.e. not “divorce laws” and “divorce law”.
  • The posts should be displayed as snippets (not the entire article).
  • The tag page should contain its own unique content – you can do this with the SEO Ultimate plugin.

If all of this sounds overly technical and confusing, buy a WordPress book or invest some time with someone experienced in both SEO and WordPress – leaving the tagging to the graffiti artists.

 

Google’s Penguin 5 SPAM Update Launches Today

Strap in, lawyers.  About an hour ago, Matt Cutts announced the launch of Penguin 2.1 (referred to in the SEO industry as Penguin 5 – don’t ask why).

Penguin 5

Today’s algo update is relatively minor (thus the “.1” instead of a full Penguin 3) but should impact roughly 1% of searches.  Those negatively impacted will most likely have, to date, gotten away with a dodgy, purchased backlink profile.  Beneficiaries may include sites that have been working on cleanup from past Penguin penalties.

Penguin Traffic Hit
Image from Hungry Piranah

How to Diagnose if You’ve Been Penguined

Penguin updates are immediate and severe.  Review your Google Analytics data for sudden and otherwise unexplained changes in (search) traffic associated with the dates of Penguin roll-outs (see below).  Additionally, you may receive notice in your Google Webmaster Tools account.  Past Penguin updates include:

  • April 24, 2012
  • May 25, 2012
  • October 9, 2012
  • May 22, 2013

So, tomorrow morning, wake up, pour some coffee and compare today’s traffic to last Friday’s.  Then breathe a sigh of relief if you’ve been doing all the right things, or panic and start reading Demystifying Link Disavowals, Penalties and More by Jenny Halasaz.
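If eyeballing the Analytics graph feels imprecise, here’s a rough sketch of that before/after comparison, assuming you’ve exported daily organic-search visits from Google Analytics into a CSV with “date” and “visits” columns (the filename, column names and window size are all placeholders):

```python
# Compare average daily organic visits before and after a Penguin roll-out date.
# Assumes a CSV exported from Google Analytics with "date" (YYYY-MM-DD) and "visits" columns.
import csv
from datetime import date

PENGUIN_DATE = date(2013, 10, 4)   # Penguin 2.1, the update announced in this post
WINDOW_DAYS = 14                   # days to average on either side of the roll-out

before, after = [], []
with open("organic_traffic.csv", newline="") as f:
    for row in csv.DictReader(f):
        d = date.fromisoformat(row["date"])
        visits = int(row["visits"])
        delta = (d - PENGUIN_DATE).days
        if -WINDOW_DAYS <= delta < 0:
            before.append(visits)
        elif 0 <= delta <= WINDOW_DAYS:
            after.append(visits)

avg_before = sum(before) / max(len(before), 1)
avg_after = sum(after) / max(len(after), 1)
change = (avg_after - avg_before) / max(avg_before, 1) * 100

print(f"Avg daily organic visits before: {avg_before:.1f}, after: {avg_after:.1f} ({change:+.0f}%)")
```

A sudden double-digit drop that lines up with a roll-out date is the signature you’re looking for.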

Sometimes the Technology (really) Matters

Want an immediate 56% increase in your natural search traffic?

In most of my SEO 101 talks I invariably gloss over the technology aspect of SEO on the grounds that the major platforms have evolved to adhere to technical best practices for search.  However, identifying and fixing major technical issues is the one search tactic (outside of enormous AdWords budgets) that delivers massive and immediate performance improvements.  For an agency, this has the added benefit of delighting clients by pairing confusing technical lexicon with increased phone calls from prospective clients.  SEO Agency Nirvana.  That’s what happened two months ago; and while I won’t share the client, the problem or the solution, I’m happy to share the end result:

Technical Fixes

That’s an immediate and persistent 56% increase in traffic – and if I squint really hard, it looks to me like a trend line that is continuing to grow.  Note also that we implemented the fixes at 2 am on a Saturday morning – futzing around with major technical changes is fraught with peril and best done well outside of regular traffic hours.

Was this cheap and easy?  No.  A full audit to weed out technical issues is time-consuming, technical, and requires access to (and a working knowledge of) advanced tools.

How to Tell if You Might Have a Major Technical Problem

This is by no means an exhaustive list . . .

  • Search engines can’t find most of your pages.  (When you do a Google search for site:mywebsite.com, more than a third of the pages on your site are not included.)
  • Search engines find more than all of your pages.  What?  (When you do a Google search for site:mywebsite.com, the number of pages returned is 2-10 times what you think you actually have – make sure you include “supplemental results” by clicking the “see omitted results” link after the last result in your site: search.  A quick way to count the pages you think you have is sketched after this list.)
  • You change something and your site traffic plummets.
  • Your site isn’t built on WordPress.
  • Your site was built more than 4 years ago and hasn’t been updated.
  • Your Google Webmaster Tools interface has anything under “Site Messages”.  (You do have GWT access don’t you?)
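As promised above, here’s a minimal sketch for counting the pages you think you have: it tallies the URLs declared in your XML sitemap (the sitemap URL is a placeholder, and the script assumes a standard sitemap or sitemap index), which you can then compare against what a site: search returns.

```python
# Count the URLs declared in an XML sitemap, for comparison against a site: search.
# Requires: pip install requests
import requests
import xml.etree.ElementTree as ET

# Hypothetical sitemap location -- most WordPress SEO plugins publish one like this.
SITEMAP_URL = "http://www.example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)

# A sitemap index points at child sitemaps; a plain sitemap lists <url> entries directly.
child_sitemaps = [loc.text for loc in root.findall("sm:sitemap/sm:loc", NS)]
urls = [loc.text for loc in root.findall("sm:url/sm:loc", NS)]

for child in child_sitemaps:
    child_root = ET.fromstring(requests.get(child, timeout=10).content)
    urls.extend(loc.text for loc in child_root.findall("sm:url/sm:loc", NS))

print(f"URLs declared in sitemap(s): {len(urls)}")
```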

Lots of lawyers are looking for the easy SEO solution.  This may be it.

Are You Qualified to Hire an SEO Agency? A Simple Test

Think you have the experience to hire a good SEO Agency?  Read the next two bullets before you go any further:

  • Search engines have difficulty distinguishing URLs in paginated results.  CMSs address pagination in different ways – from parameterized URLs to completely unique pages.  You  can solve this problem by using rel=canonical in the <head> or HTTP header of paginated pages that have a “view all” page.  If you don’t have a “view all” content page, use rel=next/prev to specify a paginated series.
  • NAP (name, address, phone number) consistency across  trusted directory sites is a key ranking factor in local search.  To address problems caused by tracking phone numbers, specify NAP values using the GREP command in your XML sitemap.  Alternatively use a 301 permanent redirect to proxy your canonical number.  Note:  The GREP command only works for single location businesses.

One of the paragraphs above explains a fairly simple concept in confusing technical terms.  The other is utter gibberish.  In English it makes as much sense as:  “helicopters use pancakes to shingle doghouses moonbeam excellent.”
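(And if you want to audit your own pagination handling: the first bullet describes a real technique.  Here’s a minimal sketch, with a placeholder URL, that reports which canonical/prev/next link elements a paginated page actually declares.)

```python
# Report the rel="canonical", rel="prev" and rel="next" link elements on a paginated page.
# Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

# Hypothetical paginated URL -- replace with page 2 of one of your own archives.
PAGE_URL = "http://www.example.com/blog/page/2/"

soup = BeautifulSoup(requests.get(PAGE_URL, timeout=10).text, "html.parser")

for rel in ("canonical", "prev", "next"):
    link = soup.find("link", rel=rel)
    target = link.get("href") if link else "(not declared)"
    print(f'rel="{rel}": {target}')
```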

This morning I’m doing some research in preparation for a kickoff call with a new client.  These guys have gone through 4 agencies in the past three years (in general, a huge red flag) – but as I look at their site, it seems it was created by a 12-year-old nephew 5 years ago and never touched since.  The technology is problematic, the link profile is anemic, the content is stale, and basic fundamentals have not been addressed.

Selling SEO services to lawyers is drop dead easy; it’s easy to confuse and intimidate with technical lexicon to make the sale.  Delivering results is entirely different.  As my new client today has demonstrated, the web is full of search charlatans eager to hook law firms on lucrative monthly contracts.

If you can’t identify the balderdash in the examples above, you shouldn’t hire an SEO agency without input from someone who can.

LawyerEdge Website Underperforming? A Cautionary Tale of Duplicate Content

Having trouble figuring out why your website isn’t getting more traffic?  It’s possible the content on your site has simply been cut and pasted from another site – rendering your SEO impotent.

Law Firm Website Almost Invisible

Initially, I couldn’t figure out why the law firm’s site was performing so badly – the technology was fine, the content seemed fairly well written and there was a reasonable link profile.  Despite this, the site was averaging fewer than 2 visitors a day from unbranded natural search – and very few of those visitors were landing on the practice area pages.  Digging deeper, I found that the actual content on the practice area pages was cut and pasted across other LawyerEdge clients.

In the example below – we can see that Google has identified 58 other pages with the exact same content as this law firm’s page for pedestrian knock down accidents.

Duplicate content

When I looked across the website’s landing pages, I found that almost all of them had content that was duplicated across the web.  In the graph below, the vertical axis shows the number of pages found on the web containing the exact same content as the law firm’s topic pages.

Duplicate content on legal websites

Of the 40 pages I reviewed, just 13 had unique content.
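If you suspect a specific page of yours has been cloned, here’s a rough sketch (both URLs are placeholders) that strips the HTML from two pages and reports how much of the visible paragraph text overlaps:

```python
# Compare the visible text of two pages and report how similar they are.
# Requires: pip install requests beautifulsoup4
import difflib

import requests
from bs4 import BeautifulSoup

def visible_text(url):
    """Fetch a page and return its paragraph text, whitespace-normalized."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    return " ".join(p.get_text(" ", strip=True) for p in soup.find_all("p"))

# Hypothetical URLs -- your page and a suspected copy.
MY_PAGE = "http://www.example.com/pedestrian-accidents/"
OTHER_PAGE = "http://www.another-firm-example.com/pedestrian-accidents/"

ratio = difflib.SequenceMatcher(None, visible_text(MY_PAGE), visible_text(OTHER_PAGE)).ratio()
print(f"Text similarity: {ratio:.0%}")  # anything close to 100% is effectively duplicate content
```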

Understanding Duplicate Content

Search engines hate duplicate content because it can generate a really bad user experience.  Here’s why: using the above example, imagine I do a Google search for “determining who is negligent in pedestrian cases”.  The first result I click doesn’t give me what I’m looking for, so I click back to the search engine and try the second result . . . which leads me to the exact same content on another site.  Now I’m annoyed, and instead of clicking back, I load up Bing to try to find something different.

The search engines minimize this poor user experience by identifying duplicate content across different pages and trying to identify the original version of the content (search geeks refer to this as the canonical).  Google and Bing hide the other pages away from searchers in what are called “supplemental results” – which is, of course, where I eventually found the law firm’s pages.  Supplemental results are shown here:

Supplemental Results

This is compounded when a large portion of a site’s content looks to be simply copied and pasted from other sites across the web.  Search engines reasonably deduce that the overall site is of pretty low quality with respect to unique, interesting content.  Google updated its algorithm to identify (and weed out) these sites with the Panda update.  From the Google blog:

“This update is designed to reduce rankings for low-quality sites—sites which are low-value add for users, copy content from other websites or sites that are just not very useful.”

Note that Panda is a site-wide penalty – which means that duplicate content on many pages will impact the performance of the entire site, even those deliciously well-written, unique and insightful pages.  The bar graph above, which shows the majority of the law firm’s pages having duplicate content, indicates they have most likely been hit by the Panda update.

In the pedestrian knockdown practice area example, all of the firms listed below are competing directly with each other with the exact same content:

  • Rochelle McCullough, LLP
  • Inkelaar Law
  • Eshelman Legal Group
  • Joshua D. Earwood
  • Saladino Oakes & Schaaf
  • Levenbaum Trachtenberg
  • Ellis, Ged & Bodden
  • Law Office of Bruce D. Schupp
  • Allen, Allen, Allen & Allen
  • Law Office of Kenneth G. Miller
  • The Law Firm of Kevin A. Moore, P.A.
  • Buchanan & Buchanan
  • S. Perry Penland, JR.
  • Ardoin Law Firm
  • McWard Law Office
  • LeBell Dobroski Morgan Meylink LLP
  • Cox & Associates, P.A.
  • The Gefen Law Firm
  • Echemendia Law Firm PA
  • McKinney Braswell Butler LLC
  • Law Office of Charney & Roberts
  • Johnson & Associates
  • Pistotnik Law Offices
  • Bledsoe Law Office
  • Law Offices of George A. Malliaros
  • Roberts, Miceli & Boileau, LLP
  • William E. Hymes
  • Law Office of Donald P. Edwards
  • Ferderigos & Lambe Attorneys at Law
  • The Law Offices of Fuentes & Berrio, L.L.P.
  • Robert B. French, Jr., P.C.
  • The Law Offices of Peck and Peck
  • Cherry Law Firm, P.C.
  • Dexter & Kilcoyne
  • Philip R. Cockerille
  • Brotman Nusbaum Fox
  • Stephen J. Knox Attorney at Law
  • Littman & Babiarz
  • The Law Offices of Weinstein & Scharf, P.A.
  • Friedman & Friedman
  • The Law Firm of Robert S. Windholz
  • Fahrendorf, Viloria, Oliphant & Oster L.L.P.
  • Conway Law Firm, P.L.L.C.
  • Head Thomas Webb & Willis
  • Charles B. Roberts & Associates, P.C.
  • Nordloh Law Office, PLLC
  • The Law Offices of Rosenberg, Kirby, Cahill & Stankowitz
  • Kerner & Kerner
  • McAdory Borg Law Firm P.C.
  • For a funny one, check out this: The Law Offices of This is Arizona – a template, presumably available for purchase, complete with ghost attorneys John and Joan Smith.

(To be fair, not all of these firms are LawyerEdge clients – there is a smattering of different agencies.  This does highlight the extent to which content gets cut and pasted around the web by website developers.)

How to Tell if You Have Duplicate Content Issues

The most obvious sign of duplicate content, of course, is zero to low inbound search traffic to specific pages.  You can diagnose this in Google Analytics using the “Landing Pages” tab under Content (make sure you filter for ONLY organic search traffic).

Another, more accurate approach is to take a unique-looking sentence from your page and do a search for it with quotation marks around the phrase:

Duplicate Content IV

If your search returns a ton of results . . . it’s time to start writing.
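To speed that check up, here’s a rough sketch (placeholder URL) that grabs the longest sentence on a page and prints a quoted Google search URL you can paste into your browser:

```python
# Pull a distinctive sentence from a page and build a quoted Google search URL for it.
# Requires: pip install requests beautifulsoup4
import re
import urllib.parse

import requests
from bs4 import BeautifulSoup

# Hypothetical practice area page -- replace with one of your own.
PAGE_URL = "http://www.example.com/practice-areas/pedestrian-accidents/"

soup = BeautifulSoup(requests.get(PAGE_URL, timeout=10).text, "html.parser")
text = " ".join(p.get_text(" ", strip=True) for p in soup.find_all("p"))

# Grab the longest sentence on the page as a reasonably unique fingerprint.
sentences = re.split(r"(?<=[.!?])\s+", text)
fingerprint = max(sentences, key=len, default="").strip()

query = urllib.parse.quote_plus(f'"{fingerprint}"')
print("Search this and count the results:")
print(f"https://www.google.com/search?q={query}")
```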

 

Are You Sending the Wrong Signals to Search Engines?

Looking at the screenshot below, it is very clear to any human that this is a blog post covering Drug Sniffing Dogs and Search Warrants.  Unfortunately, the underlying code does a very poor job of telling computers what the article is about – leading to this page (and all the other pages on this site) performing extremely poorly in search.  Here’s why . . .

At a very high level, search engines scan web page code for indicators to deduce the subject matter of the content on a page (reminds me of the old California Achievement Tests in 5th grade).  We’ll review three of the primary indicators: Title Tags, URL, and Headers (H1s etc.).  Why are these so important?  The content contained within these indicators is intended to describe what the page is about – i.e. if a page is titled “Fuzzy Bunny Slippers” and has a similar heading, it is most likely a page about fuzzy bunny slippers.

Justice Florida

Key On-Page Elements

Title Tag

The title tag defines the title of the page, shows up at the top of the browser, and is also the link that appears in search result pages.  In this case, the page gets it right:  “Drug-Sniffing Dogs and Search Warrants : West Palm Beach Criminal Lawyer Blog”

URL

Unfortunately, when this page was created, the URL was set to end with “drugsniffing-dogs-and-search-warrants”.  The failure to separate “drug” and “sniffing” in the URL optimizes the page for the never-searched-for word “drugsniffing”.

Heading

Heading tags define the heading of the page.  The primary heading is the H1, with subheadings H2, H3 etc. To a human, the heading of this page is pretty clear “Drug-Sniffing Dogs and Search Warrants”, but when we look into the code, we find that that heading is not identified with an H1 tag:

Justice Florida Code

In fact, the primary heading, the H1, tells search engines that this page is about:  “Palm Beach County Criminal & DUI Lawyer : Criminal & DUI Defense Attorney in West Palm Beach & Palm Beach | Criminal Attorney: DUI, Assault & Battery, Felonies”.  What a mouthful – that’s some ugly keyword stuffing and is only very tangentially related to drug-sniffing dogs and search warrants.  Note that the H2 and H3 above contain generic, templated content as well.  Predictably, we find that every single page on this website uses the exact same keyword-stuffed H1 – sending a strong signal to the search engines that every page on the site is about the exact same subject matter.

Why This All Matters

Not surprisingly, even with an exact search for the page title including the misspelling, the attorney’s content fails to surface.  I’ll bet dinner that his analytics also show zero inbound search traffic to this page.

Justice Florida Results

Why This Happens

Generally, modern website and blogging platforms have most of these technical problems ironed out.  You should never have to get your hands dirty in the code.  But this example highlights the importance of having a modern, up to date platform.  The justiceflorida.com site in this example is built on an outdated version of Movable Type.  The simple obvious solution:  a recent version of WordPress.

How to Diagnose your Own Pages

You know where to look for the URL and the Title Tag – heading tags are a little more hidden, but not too hard to find in the source code.  You can access the source code of a website by using the “view source” function in your web browser (usually under “View” or simply by right-clicking on the page).  Then search the page for “h1” and see if you have a unique description of the content of the page.  These tags come in opening/closing pairs, so a search should turn up just two instances of “h1” (there should really be only one primary heading per page); multiple H2s, H3s etc. are fine.
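If digging through view-source feels tedious, here’s a minimal sketch (placeholder URL) that pulls the title tag, URL slug and heading tags from a page so you can eyeball them side by side:

```python
# Print the title tag, URL slug and headings of a page for a quick on-page sanity check.
# Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

# Hypothetical blog post URL -- replace with one of your own pages.
PAGE_URL = "http://www.example.com/blog/drug-sniffing-dogs-and-search-warrants/"

soup = BeautifulSoup(requests.get(PAGE_URL, timeout=10).text, "html.parser")

print("Title tag:", soup.title.get_text(strip=True) if soup.title else "(missing)")
print("URL slug: ", PAGE_URL.rstrip("/").rsplit("/", 1)[-1])

h1s = soup.find_all("h1")
print(f"H1 count:  {len(h1s)} (there should be exactly one)")
for tag in soup.find_all(["h1", "h2", "h3"]):
    print(f"  <{tag.name}> {tag.get_text(strip=True)[:80]}")
```

If the title, slug and H1 don’t all describe the same subject, you’ve found your problem.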

Legal Linkbait

Content is King – we hear this all the time with respect to search engine optimization, yet all too frequently I hear from attorneys suffering from a lack of inspiration about content.

“What am I going to write about?”

Writing successful legal blog posts requires equal parts 1) a love of writing, 2) training/experience in writing well (law school does NOT count) and 3) creative inspiration.

Fortunately, the web is full of creative inspiration.  Amid all the royal baby news, yesterday was particularly ripe with legal-leaning news stories begging to be transformed into interesting, well-written linkbait legal blog posts.  For linkbait, think Cosmo magazine titles – sex always works, as do drugs and celebrities.  Sex and drugs and celebrities all wrapped in one is a pretty good bet.  Another approach is to take something popular in the news that is outside your jurisdiction and explain how it would apply (or not apply) within your jurisdiction.

So, to answer the perpetual question “what am I going to write about”, read on for some inspiration.  I’ve even rewritten the headlines with a fun, slightly salacious spin designed to draw clicks (and links).

NYC Legalizes Bare Boobs 

This was the story (and accompanying picture) that inspired my post today.  Quote from the original article:  “It also notes that, should a crowd form around a topless woman, the officer should instruct the crowd to disperse and then respond appropriately if it does not.”  Good luck with that, men in blue.

20,000 Babies Hit By Falling TV’s

Our love affair with increasingly large screen area colliding dangerously with babies’ fascination with brightly colored objects.

Marines Bomb Australia’s Great Barrier Reef

This raises lots of legal questions – environmental, military, international – and  to date, the news coverage has been devoid of any legal ramifications.

Jesse Ventura Sues Murdered Navy SEAL’s Wife

This is an easy one – explain the legal mechanism that enabled Ventura to transfer his lawsuit to Chris Kyle’s widow after the sniper was killed; feel free to take an editorial swipe at the utterly classless move from the former WWF showman turned Governor turned plaintiff.

Your iPhone as Breathalyzer?

Huge, widespread legal liability questions abound in this case.

Take me OUT of the Ballgame – Skydiver Lands on Baseball Player

Include the accompanying video for added benefit.

legal linkbait

Public Support for Boston Trooper who Leaked Tsarnaev Arrest Pics

The state trooper who shared his own photos of the bloody arrest of Tsarnaev with Boston Magazine is fighting for his career, amid growing public support for his actions.  This is a great opportunity to discuss the balance of individual legal rights, the professional conduct of public officials and employment law.

Pornless Britain?

Yesterday the Brits got a future king AND David Cameron’s plan to standardize an approach for limiting access to online porn.  Consider combining with the Topless New York story for added impact.

Billy Ray Cyrus and wife say “I don’t” to Divorce

Ahhh – People magazine, the eternal well of celebrity gossip, offers an example of how to get undivorced.  An interesting take on this would be. . . . “when is it too late to get undivorced?”  And if you get really desperate for celebrity smut inspiration, try US Weekly.

A Common Sense Law Firm Policy for Authorship

Authorship is the hottest new innovation in search.  And like many changes that preceded it, authorship has the legal community spinning in circles trying to figure out what to do.

But first . . .

A Quick Primer on Authorship

At a very high level, authorship is the association of an individual writer’s reputation with a piece of content.  This manifests itself in two important ways:  1. as a ranking factor – i.e. Conrad Saam has accrued a strong reputation for writing about search, and therefore his content about search will rank well regardless of where it is published; 2. as a click-through factor – to help searchers identify good content, Google includes a thumbnail of the author in search results.

Authorship for Lawyers

Authorship is a big deal.  At the latest SMX Advanced conference, a study (with what looked like an admittedly anemic dataset) claimed a 200% increase in clicks for results with authorship versus those without, regardless of position on the SERP.  Later in the same session, a major newspaper editor suggested that a writer’s Author Rank (uggg – our industry’s latest nauseating buzzword, soon to be misused by MBAs desperate to display some tech cred) would soon be a primary hiring factor.

 

Authorship has been written about ad nauseam – I’d recommend Ann Smarty’s cheat sheet overview if you need to quickly get caught up. For now, I want to focus on the perceived risks of authorship . . .
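As of this writing, authorship is typically declared by linking a post to the writer’s Google+ profile with rel=”author” markup.  Here’s a quick, minimal sketch (placeholder URL) for checking whether a post carries that markup – no substitute for Google’s own testing tools, but a fast first look:

```python
# Check whether a post declares authorship via a rel="author" link.
# Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

# Hypothetical post URL -- replace with one of your own articles.
POST_URL = "http://www.example.com/blog/some-post/"

soup = BeautifulSoup(requests.get(POST_URL, timeout=10).text, "html.parser")

author_links = [tag.get("href") for tag in soup.find_all(["a", "link"], rel="author")]
if author_links:
    print('rel="author" found, pointing at:')
    for href in author_links:
        print("  ", href)
else:
    print('No rel="author" markup found on this post.')
```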

Why Law Firms are Afraid of Authorship

Authorship does raise some genuine questions.  To capture search traffic, more and more law firms are increasing the velocity at which they publish content to the web – using more firm lawyers, or with ghost-written content.  This has raised policy issues around authorship.  The two most common concerns that have some law firms completely balking with respect to authorship are:

“What if I publish something that could be used against me down the road in a case?”

This red herring actually has nothing to do with authorship.  The logic is pretty simple – something published on your law firm’s blog that is damaging isn’t going to become more damaging just because a headshot appears next to it in search results.  I hear this concern mostly from law firms using outsourced third parties to vomit out a huge volume of low quality content onto their sites.  If this is a question you find yourself asking, consider fixing the content problem, not the authorship problem.

“What happens if Bill leaves my firm – can he take his authorship with him?”

This is a more nuanced question and mirrors a common firm partner concern:  I spend years building Mary’s reputation as a great lawyer, and then she goes and opens a firm across the street from me.  Allaying this concern requires an understanding of how authorship works.  Let’s go back to the central premise of authorship: content from reputable writers ranks well regardless of where it is published.  So, yes, a lawyer can build up their writing reputation on a firm’s blog, then put that reputation in their pocket, start a new firm with a new website and leverage that reputation to rank.  From an Author Rank perspective, two things happen here: 1) as their reputation builds on their new site, so does the value of that reputation to the original content, and 2) if they choose to disassociate themselves from the original blog, they lose the value of that reputation.  It is important to note that reputation isn’t built just because I’m writing on a specific blog, but also because of many additional associated signals (links, shares, etc.).  Just like in real life, the reputation of an author (or lawyer, or singer, or SEO consultant) transcends any individual publishing platform.  Therefore, there is no downside to using authorship to enhance your content’s ability to drive traffic.

This brings us to . . .

A Common Sense Law Firm Policy for Authorship:

Don’t publish anything you wouldn’t attach your name to.

Your Ranking Report is a Dangerous Waste of Time

Lawyer:  “We’re ranking really well, but our phone just isn’t ringing.”

Me:  “Well, how much traffic are you getting?”

Lawyer:  “I don’t know”

You are wasting your time if you are looking at ranking reports to assess the success of your SEO campaign.  Worse – if your agency sends you a regular ranking report (and no traffic report), they are probably deliberately trying to hide their poor performance.

Ranking reports are often used by agencies to suggest success while they are delivering very little of value (i.e. more traffic).  They distract from business goals and focus your search campaign on the wrong tactics.  They are used to rationalize exorbitant retainers that deliver little in the way of new business.

I recently talked to a lawyer who forwarded me her agency’s two most recent ranking reports showing 172 different terms that “ranked” between 1-3.  When we dug into the Google Analytics data, there were very few visits for those terms.  Additionally, each ranking report had a different set of terms.  I suspect her agency was simply using a third party rank checking tool, cherry-picking the “good” results and sending her a rosy picture every month along with her bill.  I drew her the following graph cross-referencing the ranking reports with her Google Analytics data to demonstrate why her agency’s glowing ranking reports weren’t driving inbound phone calls from prospective clients:

Ranking Reports for Lawyers

Why Good Ranking Reports Don’t Result in Traffic

So, how can a site rank for a term, yet fail to generate traffic?

Local

Remember that little thing called Google Local/Maps/Places that dominates the screen area for most localized searches (including legal searches)?  Ranking reports completely ignore Places results.  Legal SERPs very frequently integrate Places – so your glowing ranking report displays a very misleading picture of your site’s ability to generate traffic.

Personalization

Search engines are increasingly delivering personalized results based on the individual searcher’s geography, previous search history and social graph.

Geography

A “divorce lawyer” search from my office will generate a results page with Seattle-area divorce lawyers.

Previous Search History

My news-related searches disproportionately return CNN.com because the engines know I visit that site on a daily basis, whereas my father’s may return Fox News results.  Attorneys will frequently sit in their office, run a ranking check for a specific term they want business for, be pleased when their site shows up #1, yet be puzzled that their phone isn’t ringing with a flood of incoming prospects.  What they don’t realize is that the search engines are personalizing their results based on previous surfing history.  The ranking report is delivering a false positive because the single most frequented site for any attorney is their own.

The big picture: With the exception of the false positives from previous search behavior, personalization isn’t taken into account by ranking reports.

Social Graph

My searches include results from people with whom I’m connected via the social graph.  Google calls this Search Plus Your World.  (Bing functions in a similar fashion.)  Depending on the searcher and the subject matter, research has shown that up to 60% of results can be influenced by the social graph.

Long Tail Terms

Searches are increasingly specific – think “trial for my third DUI arrest” instead of “DUI Lawyer”.  This is known as the long tail.  Focusing on ranking reports misses all of the traffic within the long tail.  To get a feel for how the long tail works, look at all of the different terms that bring traffic to the profile page on your website.  If you are like most attorneys, you’ll see something like:  “William O’Smith”, “Billy osmith”, “Bill O. Smyth”, “Bill Smyth Avvo Rating”, “Bill Smith Lawyer”, “Bill Smyth phone number”, etc.

Additionally, it’s very easy to generate a positive rank for an obscure term.  Think “fuzzy bunny slipper lawyer”.  And as more and more consumers become accustomed to search engines automatically geo-targeting their queries, they are frequently dropping the geographic component of their search (think “personal injury lawyer” instead of “Poughkeepsie personal injury lawyer”).  A lot of the erroneous ranking reports I’ve seen include obscure geographic variations that are never searched by anyone (zip codes, townships etc.).

 

The Alternative To Ranking Reports

Instead of monitoring the search engines for how your site ranks for a finite set of terms, use metrics that really drive your business.  Look at the inbound traffic to a page or group of pages within a practice area.  This can be done easily in Google Analytics with the “Landing Pages” report.  Alternatively, use a keyword or a group of similar keywords – “divorce”, “implant” – to track inbound search traffic for a specific practice area.  Changes in traffic for these keywords demonstrate progress (or decline) in your site’s ability to generate business – not just rankings.
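As a rough sketch of the keyword-group approach – assuming you’ve exported an organic keyword report from Google Analytics into a CSV with “keyword” and “visits” columns (the filename, column names and keyword list are all placeholders):

```python
# Sum organic visits for queries containing any of a practice area's keywords.
# Assumes a CSV exported from Google Analytics with "keyword" and "visits" columns.
import csv

PRACTICE_AREA_KEYWORDS = ["divorce", "custody", "alimony"]  # placeholder keyword group

total = 0
matched = 0
with open("organic_keywords.csv", newline="") as f:
    for row in csv.DictReader(f):
        query = row["keyword"].lower()
        if any(kw in query for kw in PRACTICE_AREA_KEYWORDS):
            matched += 1
            total += int(row["visits"])

print(f"{matched} distinct queries contained a practice-area keyword, "
      f"driving {total} organic visits.")
```

Track that visit total month over month and you have a metric tied to business, not vanity rankings.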

The Final Word

I wrote a version of this post two and a half years ago for Search Engine Land – Excuse Me While I Have a Ranking Report Rant.  In the ensuing comments, a number of defensive agencies insisted that clients still demanded ranking reports.  Matt McGee, one of the best SEOs I know, responded to the anger:

I made a decision 3-4 years ago to never again provide a ranking report to clients. I tell prospects this before they commit to working with me and invite them to find another consultant if they want to track rankings, or to do it themselves. Best SEO decision I ever made . . . My clients hire me to help them make more money. The ones who seem more concerned with rankings than money get referred to other SEO consultants.