SEO Disaster: “This Site May Be Hacked”

Oh… it's so bad when a site gets hacked. Check out what people searching Google for attorney Steve Boyd see:

Note that under the listing for the website there's a Google warning: "This site may be hacked." This is Google's attempt to protect users from sites that may unwittingly serve malware or aren't what they purport to be. WordPress is a notoriously common target for hacks due to its ubiquity. Here's a close-up of that Google warning:

Further – it’s highly unlikely that Google will send anyone to any other pages on the site…. most likely, the only results you will get are for that flagrant brand queries.  And this is because the site has over 12,000 indexed pages, mostly in Japanese, peddling everything from Nike sneakers to Patagonia jackets.

But wait – there’s more! Go back to that original result and let your eyes land on the pictures to the right in the Knowledge Graph….. looks like not only Steve’s site was hacked, but someone also took the time to upload some new pictures for his office.  Either that, or Steve really likes galavanting in one-size-too-small football pants after taking a dip in the ocean and completing his morning’s 1,000th sit-up.

What to Do?

First off – don’t let this scare you away from WordPress – it is still the one and only website platform you should use.  But…

  1. Update it regularly.
  2. Host it on a Managed WordPress provider.  We recommend WPEngine – read more: Our Love Affair with WPEngine.
  3. Check results for brand searches regularly.
  4. Claim your Google My Business result.
  5. Monitor your site in Google Search Console.

And Steve – if you are reading this… my apologies (or admiration, if that really is you).

Title Tags, Meta Descriptions – the What and the Why

Title tags and meta descriptions. One of the first additions to any new SEO's on-page optimization arsenal. Although they're simple, it's important to have a strong understanding of what titles and descriptions are, why you should use them, and how to optimize them, because they're one of the most cost-effective means of SEO improvement.

What Are Title Tags and Meta Descriptions?

Title Tags

A title tag is an HTML element included in the <head> section of a page on a website. To read a page's title tag, right-click anywhere on the page and click "View page source". The title tag is the text between "<title>" and "</title>" (believe it or not):
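
For example, the relevant part of a page's source might look like this (the title text here is illustrative, not necessarily this site's actual tag):

    <head>
      <title>Mockingbird Marketing | SEO for Law Firms</title>
    </head>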

Go ahead and give it a try on this page!

The title tag never actually appears on the page itself. It gives search engines a boiled-down description of what a page is about. According to Moz's 2015 search engine ranking factors survey, title tags are still one of the most important on-page ranking factors. Title tags are helpful for search engines, and they're helpful for users. When a user performs a search for "Mockingbird Marketing", this is what shows up:

Mockingbird Marketing SERP

The title tag added to a page (in this case, the home page) is the first thing a user sees when they come across a website in the search results. For obvious reasons, you want this text to be inviting, informative, and accurate.

But search results aren’t the only place users encounter your title tag. Title tags show up in the text displayed on your browser tab:

Title tag shown in browser tab

and in social media:

Blog post in social media screenshot

Meta Descriptions

A meta description, similar to a title tag, is an HTML element that tells users what a page is about. It, too, can be found in the <head> section of a page:

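Here's a sketch of that element as it might appear in a page's source (the description text is made up for illustration):

    <head>
      <meta name="description" content="Mockingbird provides online marketing services exclusively for law firms.">
    </head>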

Meta descriptions, although not as big and bold as title tags in search results, provide users with a more detailed description of what a page is about. This text is found directly below the title in search results.

Why Should I Use Title Tags and Meta Descriptions?

There are two reasons to make sure that each page on your site has optimized title tags and descriptions:

  1. For search engines
  2. For users

Of (1), it’s unclear the extent to which this helps. In the good old days, Google would take a page’s title tag and use that as a primary ranking factor. Since then, search engines have added a multitude of ranking factors to consider alongside meta tags, reducing their clout. Currently, the exact influence of a title tag on page ranking is unclear.

Google has been more clear on meta descriptions. Matt Cutts of Google said in 2009 that meta descriptions are not used as ranking factors.

As for (2), this is where the definitive value of optimizing title tags and meta descriptions lies. Giving your pages clear titles and descriptions draws users in. If a user comes across the title of your page in search results and it does a good job of describing exactly what the page's content is about, the user will click, and stay, on your page.

How Do You Optimize Title Tags and Meta Descriptions?

There are a couple of things to keep in mind as you optimize your title tags and descriptions; a worked example follows the list.

  1. Length: Google will display the first 50-60 characters of your title tag. Keep your title within this length to ensure nothing gets cut off. Meta descriptions should fall between 150 and 160 characters.
  2. Keep Users in Mind: Spamming meta tags with keywords looks suspicious to search engines and users. When writing meta tags for a page, first go through the page and make sure you have a strong understanding of what the page is about. Boil this down to title tag and meta description length.
  3. Important Keywords First: As users scan a page filled with search results, their eye starts on the left side of the page. Place the most relevant words early in your title tag.
  4. Never repeat: duplicate titles and descriptions confuse everybody, search engines and users alike.
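
Putting these tips together, here's the promised worked example for a hypothetical personal injury firm's home page (the firm name and copy are made up): the title leads with the most relevant keywords and stays under 60 characters, and the description comes in at roughly 150.

    <title>Seattle Personal Injury Lawyer | Smith Law Firm</title>
    <meta name="description" content="Injured in an accident? Smith Law Firm has represented Seattle personal injury victims for over 20 years. Contact us today for a free consultation.">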

There You Have it

To see how all of this fits into the bigger picture, check out ahrefs' guide to on-page SEO. That guide does a good job of showing how much of an impact meta tags have on your on-page SEO.


What Google’s New Deal Means for Anti-Piracy Attorneys

As readers search for information on the web, counterfeit sites attempt to hijack their results. In a new deal with the UK, Google says 'not today.'

Described by the UK Intellectual Property Office (IPO) as a “landmark agreement”, the deal serves to reduce the visibility of infringing content by June 2017. This will result in many pirated sites disappearing from the first page of search results for Google and Bing when people look for content.

Initially, there was some question as to whether the deal between the IPO and Google would involve any algorithm changes. In a conversation with SearchEngineLand.com, Google confirmed that no algorithm changes are necessary. Google is confident that their current algorithms (namely their "Pirate" algorithm) will continue working to prevent bad content from showing up in search results.

Google voiced that their main goal is to provide high-quality content to readers that is relevant to their needs. It is important for readers to be referred to legitimate and helpful websites. The existing algorithm serves to prevent pirated content and spam from interfering with that process.

Although Google seemed to downplay the significance of this agreement, the deal is monumental for the British Phonographic Industry (BPI). To them, it was a much-needed move to reduce the visibility of pirated content and curb copyright theft.

Because no major algorithm changes are involved, websites that serve the needs of their customers and readers will not be negatively impacted. Instead, we will see a reduction in the visibility of sites that redirect readers to pirated content and spam.

In the US there has been additional pressure to reduce the visibility of pirated content. Google and Bing aim to provide the best information for readers, as well as ensure that content creators see their valuable content appear in the results.

Civil and white-collar anti-counterfeiting attorneys can work alongside Google's attempts to get pirated content off the web. Anti-piracy law is invaluable when it comes to securing the sanctity of original content, sources, and businesses. We encourage attorneys to focus on what their clients care about and to help protect those clients' original and unique content online.

Google’s Video on How to Hire an SEO Consultant [or Agency]

If you’re considering an investment with an SEO Consultant or SEO Agency, please watch this 11.5-minute video released by Google. Maile Ohye, Google’s Developer Programs Tech Lead, outlines important things to consider, tips on what to ask for, and even items to expect from technical audits.

A good SEO will try to prioritize what ideas can bring your business the most improvement for the least investment, and what improvements may take more time but help growth in the long term. – Maile Ohye

SEO Summary:

  • If you want long-term success, there are no silver bullets to get your site to rank #1
  • SEO takes time to implement and see benefits
  • A good SEO agency will recommend best practices for a search-friendly site, and back them up with documentation directly from Google
  • Putting more keywords in the meta-keywords tag and buying links don’t work to improve SEO

Hiring process summary:

  1. Interview your potential SEO consultant or agency and make sure they are genuinely interested in you and your business
  2. Check references
  3. Ask (and likely pay) for a technical search audit
  4. Decide if you want to hire


The Correlation Between Traffic & Leads

You might want to stop reading this right now, because the conclusion of this post is (at least to me) forehead-smackingly self-evident:

More law firm website traffic generates more law firm business.

I frankly wouldn’t even bother to write this post; other than a testy exchange last month between myself and LexBlog founder, Kevin O’Keefe debating if lawyers should focus on traffic when evaluating the efficacy of their marketing efforts.  In his post entitled Law Firm Publishers Screwing Up by Chasing Traffic, Kevin writes:

When publishing, you don’t have to follow all the other law firms off the traffic cliff.

I wouldn’t look at traffic and scaling up as measures of success.

As I’ve said before, and I’ll say again, I couldn’t disagree more – especially for firms interested in generating business. Ever since I ran marketing at Avvo – I’ve used traffic as a measure of success – and that holds true with my law firm clients today.  Last year, the study we conducted for the ABA showed a very high correlation between increased traffic and increased inquiries to law firms.

So now we have a great visual demonstrating the point. One of my Account Executives shared the graph below on our internal #humblebrag Slack channel. The reason I love this graph is that it shows a drastic increase in traffic and a correspondingly exceptional increase in inbound inquiries. The lines essentially move together. Note that the graph for this specific client doesn't look at just phone calls (as our ABA study did), but also includes both form fills and chat.

So, if you’ve ever wondered if you should consider traffic an important goal in evaluating the efficacy of your marketing efforts? This picture is worth a thousand words (or prospects):

So should you follow those other law firms off the traffic cliff?  Only if you don’t want them earning the business that used to be yours.

Focus on What Your Clients Care About

How do I decide what to include on my website?
When in doubt, consult the website content flowchart.

This might be one of the most obvious blog posts out there, but given the number of sites we run across that seem to be missing this critical piece of advice, it’s worth repeating. The way to turn prospects into clients is by focusing on what the client cares about and offering a solution to their problem.

Most clients aren’t overly concerned with where you went to school, whether you’re a Lawyer of Distinction, or what your Martindale Hubbell rating is. All they care about is whether you’re going to be able to help them. What exactly does that mean though?

How should you be selling yourself to prospective clients?

According to Avvo’s white paper on How to Adapt to the New Legal Consumer, “three out of five legal consumers go online at some point to investigate and/or try to resolve their legal issue.” This shouldn’t be shocking news, but that means 60% of your potential customers are coming to your website with a specific problem in mind and a hope you can help solve it.

As a result, tailoring your messaging, blog posts, and resources to what your clients are most concerned with will pay off in the form of increased traffic and conversions.

Your focus should be positioning yourself as an expert and doing everything you can to answer the questions facing your clients. If your messaging isn’t built around how you can help your clients solve THEIR problem, you’re doing it wrong.

The more resources you can provide to potential clients, the more likely they are to view you as an authority in your practice area. Proactively answering questions and addressing the issues your clients care about before they contact you is a great way to signal to visitors that you're the right person to hire for their case.

This doesn’t mean you need to build 1000s of resources to answer every possible question a prospective client might have. All it means is that if you’re not focusing on answering the four or five most common concerns facing your clients you’re missing an opportunity to start building a relationship with clients before ever speaking with them.

The same Avvo study on the “new legal consumer” also found that “37% of consumers try to resolve their situation themselves once their issue is triggered.” While some of those clients might be successful, and may never need to contact you, there’s still a significant percentage that will ultimately fill out a contact form or pick up the phone.

None of this is to suggest you stop sharing your achievements, showcasing your credentials, or even posting a few select badges on your website. All it’s suggesting is that you don’t lose sight of what’s driving your customers to contact you in the first place.

Given the limited attention spans of people visiting a website, your initial message should focus on what they’re looking for, address what they’re concerned about, and show them why there’s no need to look anywhere else.

If prospective clients landing on your website feel like you’re providing a solution instead of talking past their needs, you’ll be in a great position to convert your site’s traffic into actual revenue.

SPAM is for Eating, Not for the Internet

Google recently published a blog post with some tips on how to keep user/bot-generated SPAM from ending up on your website. I've italicized user/bot-generated because I don't want you to get your SPAM confused… We've written about Google Analytics SPAM numerous times. This stuff is a little different, though it can be related.

Protecting your website from user-generated SPAM is important because it can cause serious issues with your website in the eyes of Google. SPAM can be a source of malware or injected links. It can even go as far as your website being hijacked completely. Google doesn't want to show a malicious (or potentially malicious) website to any of its beloved users, so act accordingly!

The major source of user-generated SPAM on a law firm website is your blog comments. If you haven't already, you should enable email notifications for whenever someone comments on one of your blog posts. This way you can act quickly.

Here are the tips from Anouar Bendahou, Search Quality Strategist at Google, to fight this type of SPAM (I’ve bolded my favorites):

  • Keep your forum software updated and patched. Take the time to keep your software up-to-date and pay special attention to important security updates. Spammers take advantage of security issues in older versions of blogs, bulletin boards, and other content management systems.
  • Add a CAPTCHA. CAPTCHAs require users to prove they're human beings and not automated scripts. One way to do this is to use a service like reCAPTCHA, Securimage, or JCaptcha.
  • Block suspicious behavior. Many forums allow you to set time limits between posts, and you can often find plugins to look for excessive traffic from individual IP addresses or proxies, and other activity more common to bots than human beings. For example, phpBB, Simple Machines, myBB, and many other forum platforms enable such configurations.
  • Check your forum's top posters on a daily basis. If a user joined recently and has an excessive amount of posts, you should probably review their profile and make sure their posts and threads are not spammy.
  • Consider disabling some types of comments. For example, it's good practice to close very old forum threads that are unlikely to get legitimate replies. If you don't plan on monitoring your forum going forward and users are no longer interacting with it, turning off posting completely may prevent spammers from abusing it.
  • Make good use of moderation capabilities. Consider enabling features that require users to have a certain reputation before they can post links, or that send comments containing links to moderation. If possible, change your settings to disallow anonymous posting and require approval for posts from new users before they're publicly visible. Moderators, together with your friends/colleagues and other trusted users, can help you review and approve posts while spreading the workload. Keep an eye on your forum's new users by looking at their posts and activities on your forum.
  • Consider blacklisting obviously spammy terms. Block obviously inappropriate comments with a blacklist of spammy terms (e.g., illegal streaming or pharma-related terms). Add inappropriate and off-topic terms that are only used by spammers; learn from the spam posts you see on your forum or other forums. Built-in features or plugins can delete or mark comments as spam for you.
  • Use the "nofollow" attribute for links in the comment field. This will deter spammers from targeting your site. By default, many blogging platforms (such as Blogger) automatically add this attribute to any posted comments. (See the example just after this list.)
  • Use automated systems to defend your site. Comprehensive systems like Akismet, which has plugins for many blog and forum platforms, are easy to install and do most of the work for you.
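
As a quick sketch, a comment link carrying the nofollow attribute looks like this (the URL and anchor text are made up):

    <a href="http://example.com/" rel="nofollow">spammy anchor text</a>

With rel="nofollow" in place, search engines won't pass your site's credibility on to the linked destination, which removes most of the incentive to spam your comments in the first place.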

How to Find (And Fix) Orphan Pages

What is an Orphan Page?

An orphan page is a page on a website that is not linked to from any other page on the site. Think of the internet like a perfectly built spider web, each strand connected to another. Now imagine, a couple of feet away from the web, a strand of silk hanging in mid-air, all by itself. It's still a piece of web, and it would be helpful to a spider if the spider could reach it, but this spider can't jump, so the strand of silk is useless. That strand of silk is an orphan page.

Users rarely stumble upon orphan pages, because a user would have to access the page directly (by entering its URL) or find it via the sitemap, which doesn't tend to happen.

Some orphan pages are orphaned intentionally. These are private pages used by webmasters that aren’t intended for users to stumble upon. But we won’t worry about these pages in this post.

Why Should I Care?

At Mockingbird, checking for orphan pages is part of our technical audit. It's one of the many indicators we use at the very beginning of an engagement to assess a client's website health. Lots of orphan pages = website health could be improved. Why is this the case?

  1. You might have valuable pages orphaned. Sometimes this happens accidentally. It could mean you have great content on your site that, because nothing links to it, users will never find naturally. This is bad for users; on top of that, you're missing out on the potential online credibility your valuable content could earn, because people don't link to pages they can't find. Search engines won't have the opportunity to recognize you as an online authority on any subject if your best pages aren't getting seen, linked to externally, or talked about.
  2. Orphan pages might bring penalties. This is a debated point among SEOs. Some speculate that, upon discovering orphan pages on a site, search engines will treat them as doorway pages (unnatural pages intended to rank artificially high for certain search terms) and penalize the site. Most disagree, but it's worthwhile to err on the side of caution.

How Do I Identify Orphan Pages?

There are plenty of ways to identify orphan pages on your site, but no matter how you go about it, all you need is:

  1. A complete list of every page on your site.
  2. A complete list of every crawlable page on your site.

For (1.) I use the XML sitemap*. If the sitemap is working correctly, it should update automatically each time a page is added to your site, regardless of whether or not that page is orphaned.
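
For reference, the entries in an XML sitemap look something like this (the URLs are hypothetical):

    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
      </url>
      <url>
        <loc>https://www.example.com/everything-about-pistons/</loc>
      </url>
    </urlset>

Each <loc> entry is a page the sitemap says exists, whether or not anything actually links to it.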

For (2.) I use Screaming Frog. Screaming Frog crawls the site the way a Googlebot/Bingbot would. This means it starts at the homepage and works down, exploring each link it encounters on its way. Because Screaming Frog works this way, it excludes pages that aren't linked to from any other page. You called it: orphan pages.

Now that you have both a list of every page on your site and a list of every crawlable page on your site, it's time to compare. Bring both lists into an Excel spreadsheet and run a duplicate check (for example, Conditional Formatting > Highlight Cells Rules > Duplicate Values). Any page that doesn't appear in your spreadsheet twice (these should be the pages that appear in your sitemap but not in the Screaming Frog crawl) is an orphan page.

What Do I Do Once I Find Them?

This is the easy part. If you've found unintentionally orphaned pages on your site, assess their value. If an orphaned page has thin content or duplicate content, or is outdated, you're better off without it: noindex these pages. For valuable, relevant orphaned pages, link to them from a natural page. Put yourself in the user's shoes and imagine where your orphaned page would be the most helpful. If you discover an orphan page on your auto website called "Everything You Need to Know About Pistons", your "Engine Parts" page would be a great candidate to link from.
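
On the noindex option: one common way to noindex a page is a robots meta tag in its <head>, as sketched below (WordPress SEO plugins generally expose this as a per-page setting, so you may not need to touch the HTML yourself):

    <head>
      <meta name="robots" content="noindex">
    </head>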


*To access this, tack "/sitemap_index.xml" onto the end of your homepage URL (that's the usual location on WordPress sites running an SEO plugin like Yoast; many other sites use "/sitemap.xml" instead).