LSA now means Let's Screw Attorneys

Google’s Local Service Ads now include a feature that lets the end user message multiple law firms at once. See the image to the right (“Message multiple”) for an example. Google now looks exactly like the aggressive, often anonymized lead generation companies that sell a single lead to multiple firms. The catch: because of the opacity of LSA reporting, the law firm has NO idea how many other firms received the lead, nor how much it is paying per lead. And if history repeats itself (see below), it’s highly likely that law firms are paying roughly the same for a phone call that goes exclusively to their firm as they are for a message lead that gets simultaneously submitted to three competitors in the same market.

Hat tip to Josh Hodges for the heads up on this.   

Some Recent History of Google’s LSA Money Grab…

On February 12th of this year, Google rolled out (and automatically opted all advertisers into) a product called Direct Business Search. Listeners of the Lunch Hour Legal Marketing Pod will know that I’ve been highly skeptical of these because of Google’s opacity in LSA performance data. For context, PPC rates for branded campaigns (“Smith and Jones Law Firm”) run, across our client base, about $3.41 per click, compared to non-branded campaigns (“car accident lawyer Cleveland”), which are much more expensive. Because Google offers no granularity in its reporting, it’s very difficult to home in on what Google is charging for those branded queries. We’ve done some very blunt analysis on this and, in conjunction with some internal law firm studies, concluded it’s about $150–$200 for branded terms – roughly a 50x increase over what you’d pay in PPC (quick math below). As such, we’ve opted clients out of these. (Now the counterpoint, which Gyi raises in the latest LHLM podcast: does opting out of branded get you kicked out of non-branded? Another “what if” that’s impossible to answer without more granular data.) If you want the full discussion, have a listen to the pod: Google Local Service Ads: To Brand or Not To Brand.
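For what it’s worth, the “~50x” figure is just division – here’s the quick math using the numbers above (a sketch based on our own estimates, not anything pulled from Google’s reporting):

```python
# Quick sanity check on the "~50x" figure, using the numbers above.
branded_ppc_cpc = 3.41                         # average branded PPC cost per click (our client base)
branded_lsa_low, branded_lsa_high = 150, 200   # rough estimate of what Google charges for branded LSA leads

print(f"{branded_lsa_low / branded_ppc_cpc:.0f}x to "
      f"{branded_lsa_high / branded_ppc_cpc:.0f}x more than branded PPC")
# -> roughly 44x to 59x, i.e. about 50x
```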

Squeezing the Legal Industry

Google’s pattern of conflating three very different advertising models (brand search, lead gen, and direct response) ultimately leads to more law firms spending more money on a marketing channel that is deliberately designed to be economically inefficient. Note they did this in PPC as well, using close variants to blur branded terms and non-branded ones – for example, conflating “Morgan and Morgan” with “car accident lawyer”. This increased overall PPC spend without generating more clients for lawyers. Put differently: this change in LSAs is Google finding yet another way to squeeze more money out of law firms without providing incremental value.

What Should I Do About It?

Watch your economics carefully.  Any testing you’d like to do is going to be very blunt – e.g., turn off messaging for two weeks, then opt out of branded keywords for two weeks, and then try to compare the economics (cost per consult, not cost per lead) across these very broad tests taken during different time periods; a rough sketch of that math is below. With Google refusing to provide any insight into what makes up your overall LSA spend, this is the only way to get a handle on how LSAs are performing for your firm.
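Here’s a minimal sketch of that comparison. The spend, lead, and consult figures are hypothetical placeholders – swap in whatever your LSA dashboard and intake records actually show for each test period:

```python
# Rough cost-per-consult comparison across two blunt LSA test periods.
# All figures are hypothetical placeholders -- substitute your own spend,
# lead counts, and consultation counts from your LSA dashboard and intake data.

test_periods = {
    "messaging_on":  {"spend": 4200.00, "leads": 28, "consults": 9},
    "messaging_off": {"spend": 3900.00, "leads": 19, "consults": 8},
}

for name, p in test_periods.items():
    cost_per_lead = p["spend"] / p["leads"]
    cost_per_consult = p["spend"] / p["consults"]   # the number that actually matters
    print(f"{name}: ${cost_per_lead:,.2f} per lead, ${cost_per_consult:,.2f} per consult")
```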

ALERT: Google Business Profile Suspensions

In general, agencies who rely on fear for their marketing annoy me, but this is a situation in which Google’s moves have massive ramifications, so I’m sharing our experiences here so hopefully you can avoid these problems.

Transcript:

So I need to give you an urgent and super important update about something that’s going on with Google. Do not, under any circumstances, touch your Google My Business profile right now. Google is in a complete mess on this and they are suspending accounts like crazy. We recently had one of our clients’ accounts suspended.

We just added a UTM parameter to their link and boom – suspended. Not great. And there are a couple of other people who are experiencing the same thing. These are two people that I know in the agency world – neither of them is legal specific, but they are two of the best, starting with Blake Denman.

Until Google My Biz fixes their shit, it’s highly advisable to not edit a single thing in GMB. One of our clients just had their listing suspended for suspicious activities, along with Blake Denman. We also have the amazing queen of local, Joy Hawkins, hearing a lot of agencies report an increase in Google Business Profile suspensions in the last week.

Jason Brown has commented on this as well. So the people who are in the know are saying: right now, hands off on making any changes to your Google Business Profile, because getting suspended – or getting out of being suspended – is a huge pain in the ass.
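For context, the kind of edit that preceded the suspension above can be as small as appending standard tracking parameters to the profile’s website link – something like this (the firm domain and parameter values here are made up for illustration):

```
https://www.examplelawfirm.com/?utm_source=google&utm_medium=organic&utm_campaign=gbp-listing
```

Ordinarily that’s routine tracking hygiene, which is exactly what makes the current wave of suspensions so alarming.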

Google’s Helpful Content Update (Lawyer Edition)

It’s not often that Google announces an upcoming algorithm change, and when they do, it typically means major changes in overall SERP results. My prognostication is that the Helpful Content update, announced late last week and scheduled for a two-week roll-out starting this week, is no different.  While the legal industry is not specifically called out as problematic, I do anticipate large fluctuations due to the many overly aggressive tactics deployed in legal, especially as they pertain to content. Google directly described the upcoming change as “meaningful”.

The Helpful Content Update (seriously – why can’t we just call this HICUP?) in Google’s own words:

These launches are part of a broader, ongoing effort to reduce low-quality content and make it easier to find content that feels authentic and useful in Search.

What does this mean concretely for law firms?

Last week, I presciently posted this ad for a content development tool that uses AI to pump out pages and pages of content: “Let Jasper Write Your Marketing Copy for Free. Artificial intelligence makes it fast and easy to create content for your blog, social media, website, and more!” From what I can tell from Google’s announcement, tools like Jasper are squarely among the targets of the “Helpful Content” algo update.

This update is extremely similar to the Panda update from February 2011.  For those of you who weren’t in the SEO game at that point, what happened then may be instructive of what to expect over the coming months. Panda was designed to root out content farms that were vomiting out thousands of well-ranking pages a day and monetizing that traffic through advertising. While the key targets were Demand Media (eHow) and Answers.com, many, many other sites were caught up in the resulting content-focused algo update.  The key issue is that Panda had site-wide ramifications, which meant that a predominance of low-quality content on a domain would negatively impact traffic to the high-quality pages as well. Following Panda, Google noted that it impacted 12% of searches – meaningful indeed.

Prognostications

I spent the weekend reading through posts and prognostications from some of my favorite SEO nerds, as well as Google’s own announcement.  Based on my experience with Panda and my parsing of Google’s language, here are my expectations of what’s going to happen:

  1. Some legal sites are going to get utterly destroyed – especially those that have deployed AI-written content.  I’ve long been a critic of the blog-blog-blog mantra and believe this is going to come home to roost. There is a slew of legal marketing agencies who utilize AI-generated content and then mark it up to human rates for their clients… meaning the law firms (may) have no idea they’ve been deploying a steady diet of vapid, computer-generated content that will be targeted. If your relationship with your SEO vendor includes something along the lines of “post 11 pieces of content every month,” I’d be particularly concerned. This goes for the small consultants as well as some of the big-box providers.
  2. Given that there’s so much long-tail content out there, and much of it has been AI-driven, I’d expect to see large variability in ranking results for long-tail terms.  This is, by definition, statistically difficult to ascertain, especially for lower-traffic sites.
  3. In general, I’ve avoided word-count guidelines (i.e., Google likes to see X number of words on a page).  However, over the past 24 months, we’ve seen a trend toward longer-format content ranking – 1,000–1,500 words. It’s very possible that this trend reverses – Google specifically calls out word count as NOT being a ranking factor.  Google even flags a word-count focus as a particular concern: “Are you writing to a particular word count because you’ve heard or read that Google has a preferred word count?” From a precedent perspective, Panda specifically hit pages with inflated word counts that thematically recycled concepts in order to bolster keyword density.  From a user perspective, overly verbose, redundant and repetitive prose (see what I did there) doesn’t always serve to easily elucidate consumers as to their legal issue and options.
  4. I think there’s a mild warning for those sites that heavily utilize Practice Area + Geo pages to rank in nearby cities: “Rockville Criminal Defense Lawyer,” “North Rockville Criminal Defense Lawyer,” etc.  Those pages have always fallen into the (very) gray-hat area with respect to search guidelines; however, they do perform very well.  They are also painstaking to create with unique content. It’s possible poor executions of this tactic may be impacted as well.
  5. I don’t believe the sites that are plagued with thin, useless pages – either unintentionally generated through technology (think WordPress /tag pages) or particularly useless blog posts (two sentences in the “Mary Jones Won Superlawyers in 2014” post) – are going to be hit any harder than they already are (which would be a departure from how Panda impacted sites). But again, this is just my conjecture from parsing Google’s wording – “search engine-first content is less likely to perform” – and I believe Google has long abandoned the notion that posting frequently is the key to SEO (despite the fact that many agencies and content consultants still preach this garbage).
  6. While Google specifically doesn’t call this a penalty, that’s purely semantics, because it’s going to look a lot like a penalty (albeit not a manual one). This also means recovery is going to depend on Google’s algo deciding when things are better (not a human).
  7. This is English only (for now)… so those of you with already garbage Spanish pages and technical implementations have nothing to fear other than competitors doing it the right way.
  8. As a very loose construct: if you have over 1K indexed pages with extensive content, I’d be worried.  Check out my article on calculating the Useless Content Ratio to see how poorly Google already regards your content.

What to Expect (When You Are Expecting an SEO Nightmare)

  1. This algo update is going to take two weeks to roll out.  Starting roughly today(ish).  The next two weeks may showcase banana-boat-crazy fluctuations in your site’s ranking and traffic performance.  Ride it out.
  2. If you do get hit, expect a potentially long recovery – even assuming you can identify and solve the problems expediently. Panda recovery took many months because Google didn’t rerun the update for long stretches. Helpful Content is different from Panda in that it is constantly running; but I believe fixing this problem requires a human, non-scalable solution.
  3. Instead of doing a comprehensive Content Strategy Audit and review (which is painful, takes time, and gets exponentially harder the larger your site), consider just no-indexing a bunch of pages as a short-term band-aid; a minimal noindex snippet follows this list. John Mueller gives this approach a qualified endorsement: “noindex is fine. Consider if all we see are good signals for your site, that’s a good sign.”
  4. This won’t be ambiguous… In an interview with Glenn Gabe, Danny Sullivan noted “if a site is impacted by the Helpful Content Update, then that impact should be visible”.
  5. The worse your overall content is, the bigger the impact.  From Sullivan: “sites with a lot of unhelpful content on the site, like content created for search engines over humans, then you could see a stronger effect…”
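If you go the short-term noindex route mentioned in item 3 above, the mechanics are minimal. A sketch, assuming you can edit the templates (or headers) for the offending pages – how you actually inject it depends on your CMS or hosting setup:

```html
<!-- Added to the <head> of each low-value page you want pulled from the index.
     (The X-Robots-Tag: noindex HTTP response header does the same job for
     non-HTML files or for applying the directive at the server level.) -->
<meta name="robots" content="noindex">
```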

Human Content Alternatives

Get ready to yell at your agency about your content strategy. And if you are already panicking, it’s time to a) start looking into managing your low-end legacy content and b) start writing great stuff!  Great starting points for solid, legal-specific content are John Reed at Rain BDM and Allen Watson at Blue Seven Content. Yeah – they’re expensive, but if your site tanks, they’ll look like a bargain.

Calculating the Useless Content Ratio (UCR)

The Useless Content Ratio

This post is long, long overdue… I’ve been using and talking about the Useless Content Ratio for years.  Google’s upcoming “helpful content” algo update is forcing my hand, because it looks much like the Panda update of the past, during which we made extensive use of the Useless Content Ratio.

The UCR is very simple: the ratio of pages on a website that have generated traffic during a specific time period.  Its application is twofold. First, it determines, in aggregate, how Google views the overall content quality of a site. Simply put, if lots of pages aren’t generating traffic, it indicates that the site either lacks the authority to support the volume of pages it has and/or that the content is overwhelmingly garbage.  In the case of Panda (and the upcoming Helpful Content update), both of which have sitewide implications, a preponderance of pages that don’t generate traffic suggests these algo changes will negatively impact the good content that exists on the otherwise bloated site.

The second application of the UCR is to guide firms on how much they should be investing in additional content.  Put simply, if 90% of pages have generated traffic during the past 6 months, then fire up the keyboard, because those new posts are likely to generate traffic.  Conversely, if only 10% of a site’s pages have generated traffic during the past 6 months, why on earth would anyone continue to barf out vapid blog posts that no one is going to see? That barfing is typically done in search of elusive long-tail searches; what many don’t grok is that without the authority to support a huge page count, those long-tail pages will never surface in search results.

UCR is a Blunt Instrument

Now – it’s very important to note that the UCR does NOT provide a page-level analysis; we are looking at the site overall. So a high UCR doesn’t mean you shouldn’t publish that key piece of content for which you want to be hired.  Instead, it’s an overall indication of how Google views the quality of content on your site vis-à-vis the site’s overall backlink authority.  It not only provides guidance on how aggressive ongoing content development efforts should be, but also reveals the sites that may be solid candidates for content pruning – the process of going through content page by page and determining whether it should be kept, killed, or consolidated. The subsequent reduction in page count actually results in greater traffic and more conversions (leads), which turn into consultations, because ideally you are keeping the business-relevant content and jettisoning the irrelevant, useless content. I wrote extensive case studies on this for Search Engine Land back in 2017: More Content Less Traffic Part 1 and More Content Less Traffic Part 2. Here are two graphs from those articles that showcase how a reduction in page count correlates with an increase in traffic:

Calculating the UCR

Calculating the UCR is very simple.  First, the denominator: the number of pages on your site.  You can use a simple “site:example.com” query to find an index count; however, I find that number to be very inconsistent. Instead, you are better off using the page count data out of Google Search Console. I wouldn’t recommend using your sitemap for this, because in many cases sitemaps contain errors or, more frequently, omissions – either deliberate or unintentional. Next, you need to calculate the numerator of the fraction… this comes (relatively) simply straight out of Google Analytics through the Landing Pages report:

  1. Select a reasonably long timeframe.  The lower your overall traffic volume, the longer this timeframe should be.  At a minimum, look at 3 months of traffic if your site generates more than 2.5K users per month.  Below that volume, I’d probably look at 6 or even 9 months.
  2. Segment by Organic Traffic only
  3. Open the Landing Pages report under Behavior (don’t get misled by the other “Behavior” report which can be found under the Audience reports. Why Google has two different reports called Behavior is beyond me.)
  4. This will generate a list of landing pages (i.e. the first page someone saw on your site) from organic and local search queries over the specified timeframe.
  5. Scroll to the bottom right-hand corner to get the total number of these landing pages.

Now you know the percentage of your pages that drive traffic (or don’t); a rough sketch of the math is below. Use this to evaluate whether continued content posting should be a priority, or whether you should consider a content reduction as part of your Content Strategy (more likely the case for firms that have been aggressively chasing the SEO golden goose for years).
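If you’d rather not eyeball it, here’s a minimal sketch in Python. It assumes you’ve exported the organic-segmented Landing Pages report as a CSV (one row per landing page, summary rows trimmed) and pulled your total indexed page count from Search Console; the file name, column layout, and page count are placeholders:

```python
import csv

# Placeholders -- substitute your own GA export and the indexed-page count
# you pulled from Google Search Console.
LANDING_PAGES_CSV = "organic_landing_pages.csv"   # GA Landing Pages report, organic segment
TOTAL_INDEXED_PAGES = 1400                        # denominator, from Search Console

with open(LANDING_PAGES_CSV, newline="") as f:
    # Each data row is a landing page that received at least one organic
    # session during the chosen timeframe.
    pages_with_traffic = sum(1 for _ in csv.DictReader(f))

share_with_traffic = pages_with_traffic / TOTAL_INDEXED_PAGES
print(f"{pages_with_traffic} of {TOTAL_INDEXED_PAGES} indexed pages "
      f"({share_with_traffic:.0%}) earned organic entrances; "
      f"{1 - share_with_traffic:.0%} did not.")
```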

Martindale-Avvo’s Own Numbers Reveal Google is Eating Directories….

Based on numbers put out in Internet Brands’ press releases, my back-of-the-napkin calculations suggest Avvo has lost approximately 60% of its website traffic over the past four years.

I’ve long been (incorrectly) prognosticating that Google will start to remove directories from the SERPs, as directories are rarely more than a conduit to the end businesses anyway and deliver very little (if any) actual value to consumers. While the organic listings don’t seem to have fundamentally changed to remove or reduce directory presence, their overall impact has been massively reduced by the reworking of the SERPs – put simply, SEO has been depreciated, now hiding below LSAs, Google Ads, and Google Local.

Looking through Internet Brands’ press releases from when Avvo was acquired in 2018, there are two different releases that point to roughly 8 million monthly sessions for Avvo. The first, from January ’18, quotes Avvo as having “more than 100 million annual visits” – which translates to roughly 8 million monthly. The second, from October of that year, quotes the combined entities of IB (including Nolo, Lawyers.com, Attorneys.com, TotalAttorneys, AllLaw, and a smattering of other publishers) as having 25 million monthly sessions. Assuming Avvo accounts for roughly 1/3 of that traffic (I’m pulling that percentage out of the air, based on my recollection of relative market share while I was there), you again end up with roughly 8 million sessions.

Now fast forward to today… and the LinkedIn profile of the Chief Executive for Martindale-Avvo:

Martindale-Hubbell and Avvo – the largest online legal marketplace in the country… serving 10+ million consumers visiting avvo.com, lawyers.com, and martindale.com every month and tens of thousands of attorneys ready to serve them.

Assuming the makeup of the IB directory mix hasn’t changed, apply that 1/3 to the 10 million and you’re talking just over 3M monthly sessions – a 60% decline, and traffic back to roughly where it was when I was there over a decade ago (the arithmetic is sketched below). Now, it’s very possible we are comparing apples and oranges here, or that the expected PR puffery during the acquisition grossly overstated actual numbers. I simply don’t know. But this all comes back to my theory that Google will disintermediate the directories – which they have done by pushing the SERPs toward more of an ad-driven model and possibly reducing directory presence within organic itself.
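For transparency, here’s the back-of-the-napkin math laid out (the 1/3 share is, again, an assumption, not a reported figure):

```python
# Reproducing the back-of-the-napkin estimate above. The 1/3 Avvo share is,
# as noted, an assumption pulled from memory rather than a reported number.
annual_visits_2018 = 100_000_000                   # "more than 100 million annual visits" (Jan '18)
est_monthly_2018_a = annual_visits_2018 / 12       # ~8.3M sessions/month

ib_monthly_2018 = 25_000_000                       # combined IB directories (Oct '18)
avvo_share = 1 / 3
est_monthly_2018_b = ib_monthly_2018 * avvo_share  # ~8.3M sessions/month -- sanity check

ib_monthly_now = 10_000_000                        # "10+ million consumers ... every month"
est_monthly_now = ib_monthly_now * avvo_share      # ~3.3M sessions/month

decline = 1 - est_monthly_now / est_monthly_2018_b
print(f"Estimated decline: {decline:.0%}")         # ~60%
```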

For more on how and why I pieced these numbers together:

Prediction: Google Screened as Ranking Factor for Google Local.

Google has traditionally and aggressively separated paid from organic.  The firewall between those departments ensures there are no anti-competitive issues – i.e., you can’t spend more money on Google Ads and see your organic rankings skyrocket.  I’ve run into this over and over again with our awesome reps from the Google Premier Partnership program, who advise us and our clients on Google Ads.  These awesome peeps wouldn’t know the difference between an H1 and an Immigration Visa, and they think NAP is something their kids do after a particularly arduous virtual school day.

My prognostication:  Google’s separation between Search and Advertising may crumble in the near(ish) future.

Look what Erik Beatty spotted this morning in the support drop-down for LSAs: an “Upgraded GMB Profile” option.

This is very valid, useful data that, I would argue, should be used in determining who shows up in the Local results.  In fact, improving what shows up in Local for the legal industry is what I thought the original intention of Google Screened was: removing the spammy crap that litters Local results – non-law-firms masquerading as law firms, out-of-state or out-of-market lawyers faking offices, etc.

This would be a major adjustment for Google, crossing a very fine line between organic and paid.  While they’ve been reluctant to cross this Rubicon in the past, lawyers should welcome the development, as it would kick the bogus garbage out of Local – the stuff that filters real prospects through lead-selling agencies, extracting a ton of value from the legal profession while adding none.

What is a Manual Action and How Do I Fix It?

Google tries to be vigilant about spam. It really does. Link building schemes, black hat tactics, and malicious software are some of the main things Google looks for. When it finds them, it might respond with a Manual Action.

So What is a Manual Action?

A manual action is when an actual, real-life member of Google’s team reviews your website and penalizes it for going against best practices. Manual actions can take a variety of forms and can result from a range of issues.

Types of Manual Actions

  • Partial Matches (partial de-indexing)

If Google finds pages in violation of best practices, it might de-index those specific URLs, meaning they will no longer show up in search results. This can be done to a page, sub-domain, forum, or any section of a domain. A partial match is generally the best possible scenario for webmasters facing spam attacks, as the domain is still functioning and traffic can still find your site. It is still important to fix the issue and get the action lifted as soon as possible.

  • Whole Site Matches (total de-index)

If the problem is found to be larger than a few key URLs, Google may de-index the entire domain. This is a harsh penalty, but it can be reversed once the site complies with webmaster guidelines. Whole site matches are generally implemented when a site flagrantly ignores guidelines by cloaking content, redirecting users, and exposing users to malicious content. If your site is facing a whole site match, you need to consider what brought you there and if you need to change course.

What Might Cause a Manual Action

Google has a long list of reasons for invoking manual actions. Most of them involve spam links, as link-building schemes are among the most common ways webmasters break best practices. The list includes:

  • User-generated spam

User-generated spam is spam that comes not from the webmaster but from the users of the website. It typically shows up in forums and comment sections.

  • Unnatural links to and from your site

This refers to link-building schemes and spam attacks. If your site is suddenly sending thousands of links to a single low-authority site, is showing signs of spammy link exchanges, or has thousands of links coming from one low-authority site, Google might penalize the URL or domain.

  • Thin or duplicate content

This is more subjective, as some sites do not need large amounts of content. That being said, many sites have an unnecessary number of pages with practically duplicate content, which often draws penalties.

  • Cloaked content/images

This is a pretty old-school black hat technique, and Google is pretty good at catching people who try to implement it. Cloaking refers to showing different content to humans than to the GoogleBot. Sites do this by having one image cover another, writing paragraphs of keywords in the same color as the background of the page, or stuffing keywords into gibberish text. Google really doesn’t appreciate these techniques and comes down pretty hard on sites that use them.

  • Redirects

Redirects, whether desktop or mobile, refer to when a user clicks on a link to one website and then gets redirected to another, completely unrelated URL. Penalties are usually applied when the redirect goes to a site that is harmful or the redirect is malicious in its intent (e.g., sending a user looking for cartoons to a porn site).

How to Fix a Manual Action

Fixing a manual action starts with fixing the problem you were originally penalized for. If you were hit for displaying spam comments, you might want to delete those comments and block the IPs they were sent from. If you were hit with a spam link attack, go through the disavow process (a sample disavow file is sketched below) and clean up your referring links. Google has recommendations on how to fix your website after every type of manual action.
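For reference, the disavow file you upload through Search Console’s Disavow Tool is just a plain text list. A hypothetical example (the domains and URLs here are made up):

```
# Lines beginning with "#" are comments.
# Disavow every link from an entire domain:
domain:spammy-link-network.example

# Or disavow specific URLs one at a time:
https://low-quality-directory.example/page-with-spam-link.html
```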

Once you have made the necessary changes, you can submit a reconsideration request. This is a request for Google to re-review your website and lift the manual action.

Sometimes you do the work, write the request, and get a denial. This means you didn’t do all the work you needed to do. Get back to work and draft a new reconsideration request.

Final Thoughts

Don’t mess with Google. Even if they wrongly put a manual action against you, apologize and follow the recommendations they give you. Google holds all the power.

Competitor Ads in your Google My Business Profile…

Well, we seem to be moving closer and closer to an advertising-driven world, as Google has introduced competitor advertising directly on Google My Business listings. To the right is an example from Greg Sterling at Search Engine Land, which shows an ad for a competing car dealership showing up directly within the search results. Greg notes that the advertised dealership is located almost an hour away… which, at least in this example, flies in the face of the highlighted importance of “local” to consumers.

One important note – according to Greg’s review, businesses can’t pay for ad-free listings – which means any business may have competitor advertising embedded directly within its localized results. The “ad-free profile” business model has been widely utilized by directories inside (Avvo) and outside (Yelp) the legal market. In my experience this generates nasty backlash from prospective customers, and Google is clearly trying to avoid that, although I’m not certain that having competitor ads show up by default on branded queries is going to engender any goodwill either.

If you’ve got an example of one of these ads in legal…please send a screenshot over.

Page Indexing Issues Being Fixed by Google

UPDATE 4/10/19: Google has announced this issue has been fully resolved.

Last Thursday, webmasters started noticing issues with Google’s indexation of pages throughout the web. Google had been removing pages from its search results for no apparent reason.

Google acknowledged the issue on Saturday, while also incorrectly reporting that it had been fixed. They haven’t provided any specific information about what caused the problem in the first place.

[Image: Google SearchLiaison tweets about the indexing issue]

On Sunday, Danny Sullivan tweeted from the Google SearchLiaison account (@searchliaison) that they were actively working on completely resolving the issue and that it was mostly fixed.

In his tweet, Danny also stated that the problem is solely on Google’s end; however, if you notice that high-importance pages have been de-indexed, you can request re-indexing through Google Search Console’s URL Inspection Tool.

[Image: Requesting (re)indexing via the URL Inspection Tool]

John Mueller pointed out that even once the issue is fixed, webmasters shouldn’t expect all of their website’s pages to be added back to Google’s index. He also stated that “Awesome sites with minimal duplication help us recognize the value of indexing more of your pages.”

[Image: John Mueller’s comments on the Google indexing issues]