Google Introduces Core Web Vitals

In early May, Google announced it would begin tracking user experience through three new Core Web Vitals, or as Google puts it, “essential metrics for a healthy site.” In early July, Google held its annual web.dev Live conference, where the first day focused largely on explaining these new vitals and how to optimize for them. Measuring user experience will always be a moving target, and Google has stated that it will review the Web Vitals annually during its I/O conference. I will cover what these Web Vitals are, along with some takeaways from watching the conference videos.

Overview of the New Core Web Vitals

Largest Contentful Paint (LCP)

LCP measures loading speed by marking the point when the largest element (usually the primary content) is rendered in the viewport. A common example is the hero image found at the top of many websites. Your target LCP should be 2.5 seconds or less from when the page first starts loading. Google explains LCP more in depth.

Common Issues Affecting LCP

  • Slow server response times
    • Cheap hosting
    • Inefficient server-side code
    • Unoptimized database queries
  • Render-blocking JS/CSS – before a website can display content, it has to parse the HTML page. By default, CSS and JS files block the rendering of the page until they are loaded.
  • Slow resource times
    • Unoptimized images – usually the main culprit
    • Loading videos
  • Client-side rendering – using JavaScript to dynamically load content, such as API calls, without optimizing or caching those calls.

You can learn to optimize for LCP from Google.
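
Google publishes concrete thresholds for each vital; for LCP, a score is “good” at 2.5 seconds or less and “poor” above 4 seconds. Here is a minimal sketch of how you might collect and rate the value in the browser (rateLCP is my own illustrative helper, not an official API):

```javascript
// Rate an LCP value (in milliseconds) against Google's published thresholds:
// good <= 2.5s, needs improvement <= 4s, poor above that.
function rateLCP(ms) {
  if (ms <= 2500) return 'good';
  if (ms <= 4000) return 'needs improvement';
  return 'poor';
}

// In a browser, the value itself can be observed like this (guarded so the
// sketch is inert outside the browser):
if (typeof window !== 'undefined' && 'PerformanceObserver' in window) {
  new PerformanceObserver((list) => {
    const entries = list.getEntries();
    const last = entries[entries.length - 1]; // latest LCP candidate
    console.log('LCP:', last.startTime, rateLCP(last.startTime));
  }).observe({ type: 'largest-contentful-paint', buffered: true });
}
```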

First Input Delay (FID)

FID measures your site's interactivity and responsiveness by measuring how long the page takes to respond when a user first interacts with it as it loads. It captures the delay during which your web page is unresponsive to the user. The goal is a FID of 100 milliseconds or less. Google explains FID more in depth.

JavaScript: the Biggest Offender

The greatest impact on FID comes from JavaScript. Because JavaScript blocks the webpage from rendering until it is executed, it is known as “render blocking.” For your own JavaScript files, it's best to optimize and chunk your code. Third-party scripts affect your site's loading speed too.

You can learn more in depth ways to optimize FID from Google.
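
One common way to “chunk” JavaScript work, as mentioned above, is to split a long task into small batches and yield back to the event loop between them, so input events can be handled in the gaps. A minimal sketch (chunk and processInBatches are illustrative names, not a library API):

```javascript
// Split a long list of work items into small batches.
function chunk(items, size) {
  const batches = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// Process each batch in its own task so the main thread can respond to
// user input between batches (setTimeout yields back to the event loop).
function processInBatches(items, size, handle) {
  for (const batch of chunk(items, size)) {
    setTimeout(() => batch.forEach(handle), 0);
  }
}
```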

Cumulative Layout Shift (CLS)

CLS measures visual stability by quantifying any unexpected layout shifts of visible web content. Think of when a page loads and you start reading an article, then an image loads in and pushes the paragraph down, or a website banner loads in late and shifts the entire page lower; both of these are examples of layout shift. CLS is calculated as impact fraction * distance fraction.
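
That formula can be made concrete for the simple case of a full-width element that shifts straight down; clsScore below is an illustrative sketch of the calculation, not an official API:

```javascript
// Sketch of the CLS formula (score = impact fraction * distance fraction)
// for a full-width element that shifts vertically.
function clsScore(viewport, elementHeight, oldTop, newTop) {
  // Impact fraction: the union of the element's old and new positions,
  // as a fraction of the viewport area (full-width, so height ratio works).
  const unionTop = Math.min(oldTop, newTop);
  const unionBottom = Math.max(oldTop, newTop) + elementHeight;
  const visibleUnion = Math.min(unionBottom, viewport.height) - unionTop;
  const impactFraction = visibleUnion / viewport.height;
  // Distance fraction: how far it moved, relative to the viewport's
  // largest dimension.
  const distanceFraction =
    Math.abs(newTop - oldTop) / Math.max(viewport.width, viewport.height);
  return impactFraction * distanceFraction;
}

// Example: an element half the height of an 800px-tall viewport that
// shifts down by 200px scores 0.75 * 0.25 = 0.1875.
```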

Common Issues Affecting CLS

  • Images without dimensions – always include width and height attributes on media elements such as images and videos
  • Ads, embeds, iframes, etc, without dimensions
  • Dynamically loaded content (often from APIs) – don't forget to reserve space for any content being loaded dynamically. Avoid loading new content above existing content unless it is triggered by a user interaction, like a “load more” button.
  • Web fonts causing FOIT (Flash of Invisible Text) or FOUT (Flash of Unstyled Text)
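
To illustrate the first point, supplying intrinsic dimensions lets the browser reserve the right amount of space before the file downloads (the file names here are placeholders):

```html
<!-- Width and height let the browser compute the aspect ratio and
     reserve space before the image loads, so nothing shifts. -->
<img src="hero.jpg" alt="Hero image" width="1200" height="600">

<!-- The same idea applies to video embeds. -->
<video src="intro.mp4" width="1280" height="720" controls></video>
```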

You can learn more in depth ways to optimize CLS from Google.

Measuring Web Vitals

Along with the announcement of these new Core Web Vitals came some new tooling and updates to existing tools. Google measures these three metrics with two types of data. Lab data comes from simulated interactions in a controlled environment and is used to track down errors and bugs. Field data comes from real-world users and how they interact with your site. If 75% of page views meet the “good” threshold for a measurement, the website is classified as having good performance for that metric. First, let's look at Lighthouse, the main technology powering these tools.
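
That 75% rule is easy to express in code. A minimal sketch, assuming samples holds one metric value per page view (passesVital is an illustrative name, not part of any Google tooling):

```javascript
// A page passes a Core Web Vital when at least 75% of its page views meet
// the "good" threshold for that metric (lower values are better).
function passesVital(samples, goodThreshold) {
  const good = samples.filter((value) => value <= goodThreshold).length;
  return good / samples.length >= 0.75;
}
```

For example, with the 2,500 ms LCP threshold, three good page views out of four is exactly 75% and passes; one out of four does not.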

Lighthouse: the Underlying Technology

Google uses Lighthouse, a website auditing tool, to power the different tools that measure these vitals. Lighthouse 6.0 has been released with reporting on the three new Core Web Vitals. Largest Contentful Paint, Cumulative Layout Shift, and Total Blocking Time (lab data used to simulate First Input Delay) are now part of the scoring system. The performance scoring weights are broken down below.

Weight %  Audit
15        First Contentful Paint (FCP)
15        Speed Index
25        Largest Contentful Paint (LCP)
15        Time to Interactive (TTI)
25        Total Blocking Time (TBT)
5         Cumulative Layout Shift (CLS)
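
The table above translates directly into a weighted average. A sketch of how the overall performance score could be computed from the six audit scores (each normalized 0–100); this mirrors the weights only, not Lighthouse's exact internal scoring, which also maps raw metric values onto a curve first:

```javascript
// Lighthouse 6.0 performance weights from the table above.
const WEIGHTS = {
  FCP: 0.15,
  SpeedIndex: 0.15,
  LCP: 0.25,
  TTI: 0.15,
  TBT: 0.25,
  CLS: 0.05,
};

// Weighted average of the six normalized audit scores (0-100 each).
function performanceScore(auditScores) {
  let total = 0;
  for (const [audit, weight] of Object.entries(WEIGHTS)) {
    total += auditScores[audit] * weight;
  }
  return Math.round(total);
}
```

Note how LCP and TBT carry half the total weight between them, which is why optimizing loading and main-thread work moves the score the most.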

Tools Provided to Measure your Website

  • PageSpeed Insights – a tool that has been around for a long time and shouldn't be new to anyone working with websites. What is new is that it now leverages Lighthouse to measure the core vitals. Great for finding and diagnosing easily replicable errors.
  • Chrome UX Report (CrUX) – uses data from real users and can be set up with Data Studio or BigQuery to create reports. Developers can also leverage the CrUX API to pull in JSON data and visualize it however they'd like.
  • Search Console – has a new Web Vitals report built in, based on real user data. Awesome as a constant monitoring tool for your live website.
  • Chrome DevTools – has new features in the Performance tab for measuring Core Web Vitals. You can also run Lighthouse audits directly in DevTools. You can learn more from the web.dev Live video. Very useful for local debugging.
  • Web Vitals extension – you know Google had to create a Chrome extension.
  • Site Kit by Google – a WordPress plugin from Google that connects to Google services and displays an overview in your WP dashboard.

Laws of UX Series: Law of Proximity, Law of Similarity and the Law of Uniform Connectedness

Laws of UX are a collection of design heuristics created by Jon Yablonski to help designers leverage psychology to create more human-centered experiences. You can find explanations for each law on the website lawsofux.com, as well as an in-depth case study regarding his thought process on his website, jonyablonski.com.

This will be a series of blog posts briefly covering the many laws and how they can help designers create better experiences for law firms.

 

1) Law of Proximity

“Objects that are near, or proximate to each other, tend to be grouped together.”

The Law of Proximity describes how our eyes and brain form connections between visual elements. Elements that are close together are perceived as a group and related to each other. Elements with space between them feel disconnected and unrelated.

For example, paragraphs in a book or blog post follow the Law of Proximity by grouping specific sections of text together to form related ideas. Also, take the contact form in our footer. By placing the contact information and form fields within the boundary of a box, we have established that everything within that element is related and serves one specific function.


2) Law of Similarity

“The human eye tends to perceive similar elements in a design as a complete picture, shape, or group, even if those elements are separated.”

The Law of Similarity is similar (heh heh) to the Law of Proximity. While the Law of Proximity states that elements that are close together are perceived as a unified group, the Law of Similarity states that elements that resemble each other are perceived as a unified group, even if they are not grouped together. Similarity can be defined by shape, color, size, orientation, etc.


3) Law of Uniform Connectedness

“Elements that are visually connected are perceived as more related than elements with no connection.”

The Law of Uniform Connectedness states that elements connected by uniform visual properties are perceived as a unified group more than elements that are not connected. Similar to the Law of Proximity, if a group of identical circles is enclosed in a box OR connected by a line, they are more closely related than the other circles that are outside of the box or not connected by a line.


Stay tuned for the next post in this series where I go over Miller’s Law, Peak-End Rule, and the Serial Position Effect.

Mockingbird’s Approach to Building Websites

We are going to go over the tools and techniques used to build Mockingbird's custom websites and how they help us achieve our technical metrics and goals. We are constantly researching ways to improve our build process. As new techniques and tools come out, we learn how to incorporate them into our process.

What are our goals?

  1. Site Speed – we aim for all websites to load in 3 seconds or less (excluding third-party scripts)
  2. Accessibility – we clear the AA level of the Web Content Accessibility Guidelines (WCAG)
  3. Sexy and clean – sometimes, clients decide to leave us. When they do, we want to make sure whoever takes a look under the hood of the theme can easily do what they need to do.

What theme are we using?

  1. Sage 9 Theme – The sage theme comes with a lot of tools baked in for advanced WordPress development.
    • Blade Templating Engine – stay DRY (Don't Repeat Yourself) with Blade templates, which make it easy to organize code so developers can quickly find what they need, and which prevent unneeded code bloat.
    • Webpack – we can write JavaScript and SASS that can be easily compiled, minified, and concatenated to reduce the size of the theme.
    • JavaScript Routing – combined with Webpack, we can dynamically load JS files on different pages to reduce load size on each page.
    • Automatically optimize theme images – all image files within the theme get compressed and minified for production.
  2. Tailwind CSS – the never-ending debate over CSS structure and conventions can be tiresome. After long consideration, we landed on Tailwind, which takes a utility-first approach to writing CSS.
    • We don’t have to think of clever names for classes
    • Easier to scale than other methodologies, where you can easily repeat yourself when adding things like borders or shadows to elements.
  3. Blade SVG – a way to easily incorporate SVG files into the website.
  4. PurgeCSS – we run a script across the site to purge all the extra CSS classes that aren't being used, thereby reducing the file size.
  5. Lazyloading – we have created a custom implementation to enable lazy loading so pictures only load when they are needed.
  6. WP-CLI – installed on our local environments and hosting to easily manage things or run scripts on our projects.
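
Our lazy-loading implementation is custom, but the general pattern can be sketched with IntersectionObserver; the data-src convention and swapImage helper below are illustrative, not our actual theme code:

```javascript
// Lazy-loading pattern: store the real URL in data-src and only copy it to
// src once the image scrolls near the viewport.
function swapImage(img) {
  if (img.dataset.src) {
    img.src = img.dataset.src;
    delete img.dataset.src; // mark as loaded
  }
}

// Browser-only wiring (guarded so the sketch is inert elsewhere).
if (typeof IntersectionObserver !== 'undefined' && typeof document !== 'undefined') {
  const observer = new IntersectionObserver((entries, obs) => {
    for (const entry of entries) {
      if (entry.isIntersecting) {
        swapImage(entry.target);
        obs.unobserve(entry.target); // load each image only once
      }
    }
  }, { rootMargin: '200px' }); // start loading just before it's visible

  document.querySelectorAll('img[data-src]').forEach((img) => observer.observe(img));
}
```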

What plugins do we use?

  1. Soil – cleans up all the extra junk WP likes to add to websites when rendering.
  2. Advanced Custom Fields – this is the only way to easily extend your WordPress customization.
  3. Query Monitor – used during development only so we can watch our calls to the database and see anything that is being resource intensive.

What tools are used in QA?

We want to measure how we are doing with everything above so we use a few different tools to measure.

  1. Wave – an accessibility website or extension that scans your pages and displays any accessibility issues.
  2. GTMetrix – a website speed analysis tool
  3. Google Lighthouse/DevTools – another tool that rates your website on site speed and accessibility.

Sitemaps: What are They and Why Do I Need One?

There are a lot of features on websites you really don't think about as a user until you get a peek behind the scenes. Sitemaps are one of these features. Whether they're HTML sitemaps or XML sitemaps, there are conflicting ideas on whether or not they're actually necessary. So let's go into the benefits of having a sitemap.

 

So what is a sitemap?

A sitemap is a page on a website that contains links to every other page on the website. See, here’s ours. It’s usually designed for crawlers and search engines, which I’ll get back to. It’s pretty much what it says on the tin: a map of the site. It’s a one-stop-shop to get to all the other pages.

 

So what are the benefits of a sitemap?

There are many benefits to a sitemap, but the main ones fit into the groups of ease for search engines, ease for users, and organization. 

 

Ease for search engines

For the new kids in class, search engines like Google decide which pages to show searchers by looking at millions of websites. They utilize spiders (tools that follow links and build a web of pages from the connections they find) to understand how everything on a site links together. They can do this by simply following internal linking structures, but they have an easier time when they can go through one page. Hence the sitemap. Crawlers can go directly from the sitemap to every page, saving time and resources.

 

This would probably be a good time to touch on the differences between HTML sitemaps and XML sitemaps. XML sitemaps are designed solely for search engines. Humans don’t get to see much of it. HTML sitemaps are usually easy to find on a website; ours is linked to in our footer. You can see where all of our pages are and even find links to every single one of our blog posts. Every. Single. One.
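
For reference, an XML sitemap is just a machine-readable list of URLs in a fixed schema (the URLs and dates below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2020-07-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/</loc>
    <lastmod>2020-06-15</lastmod>
  </url>
</urlset>
```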

 

Ease for users

The user-oriented sitemap is extremely useful for finding pages that might be hidden in layers of internal linking. If you remember the name of one of our blog posts, you are just a click and Ctrl+F away from finding it. It sure beats scrolling back months or years to find it. 

 

Organization

Even nice websites can be sloppily organized. It happens. But a sitemap can help to visualize and show you where you might be able to correct linking structures. If your service pages are organized by type of service, but their URL structures don’t reflect that, your site might have an organization problem. A sitemap will show you how your pages are currently set up, and you can decide whether or not you want to fix that yourself.

 

Downsides of a sitemap

To be honest, there aren't really any, other than the time (not even a lot of time) it takes to build one. It's generally just good practice to have a sitemap, even if it isn't strictly necessary.

 

Creating a Sitemap

We actually have a blog post about how to create an HTML sitemap, so that's a good resource. As for creating an XML sitemap, there are numerous WordPress plugins for this very purpose. If you would like more information on building a sitemap for your law firm's website, contact a company that has experience in this area.

Laws of UX Series: Aesthetic Usability Effect, Doherty Threshold and Fitts’s Law

Laws of UX are a collection of design heuristics created by Jon Yablonski to help designers leverage psychology to create more human-centered experiences. You can find explanations for each law on the website lawsofux.com, as well as an in-depth case study regarding his thought process on his website, jonyablonski.com.

This will be a series of blog posts briefly covering the many laws and how they can help designers create better experiences for law firms.

 

1) Aesthetic Usability Effect

“Users often perceive aesthetically pleasing design as design that’s more usable.”

Users tend to assume that things that look better will work better, even if they aren't actually more effective. Users who visit your website may have a positive emotional response to its visual design, making them more tolerant of minor usability issues while using your site. By “minor usability issues” I mean things like low-contrast text, spelling errors, or inconsistent typography. The Aesthetic Usability Effect does have its limits, though: when a design puts aesthetics over usability, users will lose patience and leave your site.

For example, I have seen law firm websites with huge hero images on practice area pages that cover the entire screen, pushing all information below the fold. The page may look appealing at first with a large, beautiful image at the top; however, an image taking up the entire screen may become an annoyance once a user is trying to complete specific tasks.


2) Doherty Threshold

“Productivity soars when a computer and its users interact at a pace (<400ms) that ensures that neither has to wait on the other.”

Fast websites are fun to use. Laggy, slow-response websites suck. The longer it takes for your website to respond to a request, the more time your user has to lose their train of thought. If you keep your users waiting, they will find what they are looking for on another law firm's site. As a general rule, you want to provide feedback to a user's request within 400ms in order to keep their attention.

If your website has any loading screens that aren’t imperative to the functionality of the site, fancy page transitions, or anything else that may slow down their experience with your site, you are doing more harm than good with those “cool” features.


3) Fitts’s Law

“The time to acquire a target is a function of the distance to and size of the target.”

A touch target is an area that responds to user input. Make sure that all touch targets are large enough for users to understand their functionality and easily accessible for users to interact with.

Many law firm websites (and websites in general) have touch targets that aren't clearly visible or are located in places that are hard for a user's finger to reach (looking at you, hamburger menus at the top left or right of mobile screens). Make sure any touch target on your website is easily recognizable and accessible to avoid confusing your users.

 

Stay tuned for the next post in this series where I go over Hick’s Law, Jakob’s Law and the Law of Common Region.

How to Get The Most Out of Your Blog

Maintaining a blog can feel like a fruitless chore, believe me, I get it. You can add and add to it and see no returns on 95 percent of your blog posts. So how can you turn it around and how can you know if it’s even worth it to keep going?

 

Breaking Out of the Spiral of Dis-Content

If you feel like your blog is going nowhere, chances are you don’t have a clear idea of where it’s supposed to go. Without a goal in mind it’s impossible to write functional content.

Your first step is deciding what type of audience you’re writing for, and what part of that audience you want to become clients. Write about what they want to know and, more importantly, what they need to know.  This will help you figure out where your blog is going and will hopefully inspire some interesting posts that drive traffic.

 

Linking to Your Practice

One of the important things to remember about your blog is that it doesn’t have to be directly related to your practice area. You can write about a specific subsection of the law that you find particularly interesting. This will help you stay interested in it while also starting to rank for those long-tail keywords. 

Once you’ve figured out your focus, you can try to find places where it might link to your firm as a whole. Think of where it overlaps with your practice areas. This will help traffic flow to the rest of your site and into your firm.

 

Find a Schedule That Works for You

Daily blogs can honestly be a bit excessive. Annual blogs are a bit sparse. Try to find a schedule that makes sense given your bandwidth and list of ideas. This might mean once a week, twice a month, once a month, or any period. 

Once you have found a cycle that works for you be sure not to get married to it. If a current news story is incredibly relevant to your practice or your blog, write a surprise post. Staying flexible will be your friend.

 

Guest Post

As far as publicity and link building go, not much works better than guest-blogging. If you can write a post for a well-known publication, or get a well-known author to write a post for you, you are building the authority of both you and your blog.

 

Stop the Blog if it Isn’t Working

Not all law firms need blogs. They aren’t a requirement for your bar membership and if no one’s reading it, it’s just costing time. Try to take steps to improve it, but if it still isn’t driving any traffic after a year or 18 months, don’t feel bad for abandoning it. You can pick it up again at any point if you feel the desire, but don’t feel bad for not doing that.

 

If you are worried about your blog and feel like your content could be improved, consider our content development plan. Mockingbird can help you audit your blog, cut what’s slowing your site down, and make a plan for building on what you need. Contact us to learn more.

What is a Manual Action and How Do I Fix It?

Google tries to be vigilant about spam. It really does. Link building schemes, black hat tactics, and malicious software are some of the main things Google looks for. When it finds them, it might respond with a Manual Action.

 

So What is a Manual Action?

A manual action is when an actual, real-life member of Google's team checks in on your website and penalizes it for going against best practices. Manual actions can take a variety of forms and can be consequences of a variety of things.

 

Types of Manual Actions

 

  • Partial Matches (partial de-indexing)

If Google finds pages in violation of best practices it might de-index those specific URLs. This means that they will no longer show up in search results. This can be done to a page, sub-domain, forum, or any section of a domain. A partial match action is generally the best possible scenario for webmasters who are facing spam attacks, as the domain is still functioning and traffic can still find your site. It is still important to try and fix the issue and lift the action as soon as possible.

  • Whole Site Matches (total de-index)

If the problem is found to be larger than a few key URLs, Google may de-index the entire domain. This is a harsh penalty, but it can be reversed once the site complies with webmaster guidelines. Whole site matches are generally implemented when a site flagrantly ignores guidelines by cloaking content, redirecting users, and exposing users to malicious content. If your site is facing a whole site match, you need to consider what brought you there and if you need to change course.

 

What Might Cause a Manual Action

 

Google has a long list of reasons for invoking manual actions. Most of them involve spam links, as link building schemes are among the most common ways webmasters violate best practices. The complete list includes:

 

  • User-generated spam

User-generated spam is spam that comes not from the webmaster, but the users of the website. This happens in forums and comments sections of websites.

  • Unnatural links to and from your site

This refers to link building schemes and spam attacks. If your site is suddenly sending thousands of links to a single low-authority site, is showing signs of spammy link exchanges, or has thousands of links from one low-authority site, Google might penalize the URL or domain.

  • Thin or duplicate content

This is more subjective, as some sites do not need large amounts of content. That being said, many sites have unnecessary numbers of pages with practically duplicate content, which often sees penalties.

  • Cloaked content/images

This is a pretty old-school black hat technique, and Google is pretty good at finding when people try to implement it. Cloaking refers to showing different content to humans than to the GoogleBot. They can do this by having one image cover another, writing paragraphs of keywords in the same color as the background of the page, or stuffing keywords into gibberish text. Google really doesn’t appreciate these techniques and comes down pretty hard on those that do it.

  • Redirects

Redirects, whether on desktop or mobile, refer to when a user clicks a link to one website and gets redirected to another, completely unrelated URL. Penalties are usually applied when the redirect leads to a harmful site or is malicious in its intent (i.e., sending a user looking for cartoons to a porn site).

 

How to Fix a Manual Action

Fixing a manual action starts by fixing the problem you were originally penalized for. If you were hit for displaying spam comments you might want to delete those comments and block the IPs they were sent from. If you were hit with a spam link attack, go through the disavow process and clean up your referrals. Google has recommendations on how to fix your website after all types of manual actions. 

Once you have made the changes you need to make, you can make a reconsideration request. This is a request for Google to re-review your website and lift the manual action. 

Sometimes you do the work, write the request, and get a denial. This means you didn't fully complete the work you needed to do. Get back to work and draft a new reconsideration request.

 

Final Thoughts

Don’t mess with Google. Even if they wrongly put a manual action against you, apologize and follow the recommendations they give you. Google holds all the power.

In Defense of Location-Based Hero Images

We’ve all been there: you want to build your website, you have a homepage, and you don’t know what to put as the hero image. Should you go for your team, arms crossed and determined, all standing in front of your office? Maybe a picture of you chatting with a client? What about a photo of your infinite bookshelf full of leatherbound, embossed texts? Or maybe even a custom-shot video that includes all of these things and more?

I’m here to argue for the simple location image, or, if you really want it, a location image with the firm partners in it. Here are four reasons why:

 

1. Easy to Acquire

A professional photographer should have no problem taking stunning photos of your local scenery, all without the time commitment of a full in-house photo or video shoot. 

 

2. Improves Local Credentials

Local is everything in digital marketing. Consumers make decisions based on what is close and what benefits their communities. A reminder that you are a part of that community as soon as they get to the homepage doesn’t hurt. Local landmarks, scenic views, and sunrises rarely go wrong. 

 

3. Adaptable

So let’s go through some of the other options. That picture of your team? Inspirational now, a weird reminder in a few years after one or more of your team members have moved away. Isolating if someone new joins the team. 

Chatting with a client? Looks good now, kind of confusing if you ever look any different than you did when you took the photo. None can escape the hands of time; holding onto a photo won’t save you.

That custom-shot video? The same issues as the other options arise here: changing team members, changing images. You might even change offices. Not to mention how much an embedded video can slow your site down.

You know what’s reliably beautiful and consistent? A gosh dang sunrise behind a local landmark. 

 

4. Send a Message Without Overthinking It or Underthinking It

You’re here to look like a lawyer, and lawyer things are your priority, not spending hours deciding which image should grace your homepage. That being said, a simple stock image won’t cut it. We’ve all seen law firm websites that look like they were tossed together: out-of-proportion stock images, outdated fonts, and just plain weird color schemes. Be different. Be better. Show the world that you’re willing to make an effort, but won’t spend all of your time making an effort on things that aren’t your clients.

 

Obtaining Your Hero Image

So now you know what you want. How do you get it? Through a stock photo site like how we got the hero image for this blog post? Probably not. The homepage to your website is a bit more important than the top of one of our blog posts. 

Hire a photographer. They can help provide you with options while ensuring no one else has the exact photos you’re using. It also gives you the option of having photos of you and your team alongside your scenic locale. You will have more control and will likely get better results.

Find a photographer with good reviews or through someone you trust and schedule your shoot.

How to Properly Use Alt-Text

Images are some of the more misunderstood features of web pages. While they are important for conveying messages and often improve content as a whole, they can also lead to trouble when misused. 

Alt-text is your way of staying out of trouble and improving the indexability of your content. 

One of the worst ways images can get you in trouble is with ADA compliance regulations. When a website isn’t set up for accessibility, users with any level of visual impairment will struggle. These users often rely on software that reads the content out to them. This is tough when the content says “Refer to the above image” and the only thing the software can say about the image is that it is 1926374627.jpeg. Alt-text fixes that.

When you insert an informational image onto your page, chances are you’ve seen the box to insert alt-text for the image. You very easily might have ignored it. You don’t want to ignore it.

 

What to Write in the Alt-Text Box

Ok, let’s look at this chart from Search Engine Land:

Google 2019 ad revenue share pie chart. $98.1 billion from search and other products, 72.7%. $21.61 billion from Google network, 16%. $15.1 billion from YouTube, 11.2%.

What information does it give? It clearly shows that Google earned a majority of its 2019 revenue from Search and other products. You can see that at a glance. What would a computer read? 

Without alt-text, it would read google-ad-revenue-share-property-2019.jpeg.

Let’s fix that.

So it’s Google’s 2019 ad revenue share. We can start with that.

 

“Google 2019 ad revenue share pie chart…”

 

Now, what does it actually show? Let’s go through the data.

 

“Google 2019 ad revenue share pie chart. $98.1 billion from search and other products, 72.7%. $21.61 billion from Google network, 16%. $15.1 billion from YouTube, 11.2%.”

 

There you go. That’s an alt-texted informational image.
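
In the markup itself, that description goes in the image element's alt attribute (a sketch, reusing the file name from the example above):

```html
<img src="google-ad-revenue-share-property-2019.jpeg"
     alt="Google 2019 ad revenue share pie chart: $98.1 billion from search
          and other products, $21.61 billion from the Google network, and
          $15.1 billion from YouTube.">
```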

 

But what about decorative images?

 

So not all images are informational. They don’t all have clearly defined data. What if it’s just a random image? What if it’s just a descriptive image to show a picture of your product?

If it’s a random, decorative image (like the hero image here), detailed alt-text isn’t 100% necessary; the common practice is an empty alt attribute (alt="") so screen readers skip it. No one is missing out on not seeing this stock image of water.

If it’s a product, describe it as you would your product. 

Here is our Definitely Real product, Google Juice:

Google Juice. Green juice in a glass with two straws and kale.

How might we use alt-text to describe this Totally Not Fake Google Juice?

 

“Google Juice…”

 

Well, that was covered in the file name. Let’s go a bit more in-depth.

 

“Google Juice. Green juice in a glass with two straws and kale.”

 

That’s better. 

 

So now you know how to do alt-text. It does take a bit more time when creating a webpage, but it’s cheaper than an ADA lawsuit.