Technical SEO Ranking Factors for Law Firms

Technical SEO Ranking Factors

This is #2 of the in-depth versions of the 7 Key SEO Ranking Factors.

Technical SEO is like laying the foundation of your house. Not super-sexy, but nothing else happens without it and the house will crumble if it fails.

Technical SEO is different from on-page SEO (which deals with how the content on the site is optimized) and off-page SEO (which deals with how popular a page is through building relationships with other websites).

Technical SEO covers things like how a search engine can “crawl” the web and find your site; it is more about website load time and coding quality than about on- or off-page optimization.

Technical SEO Venn Diagram

With the constant updates from Google including Hummingbird and RankBrain, these technical aspects are forever changing and are often complex.

So while you can perform a technical SEO audit yourself, it may be helpful for your law firm to seek help from an experienced digital marketing agency. Without technical SEO in the right state, you won’t have the proper base for your on-page SEO or links to grow from.

Technical Analysis Overview

I’m giving you two lists, so that you can start with the most important elements and work your way up to a more comprehensive technical SEO audit, or hand it off to someone else.

If you really want to dive in deeper, you can set up a tool like SEMrush with an SEO site audit.

It will give you the errors, warnings, and notices and then show you your progress as you correct errors over time.

SEMrush SEO Site Audit

The Mega Technical SEO Action List

First, I will give you the mega list, so your eyes bleed. Just kidding, I mean, so you know how deep this stuff can get. Then I will give you a more manageable list you can take action on.

The second list is the one that’s more actionable for C-level folks and partners. 

Site Architecture – URL Structure – Getting Pages Indexed

  • Make sure your pages and assets are crawlable (don’t add a Noindex tag by mistake)
  • Make sure your content is actually indexed (Use Google Search Console)
  • Test your robots.txt file for “typos” and make sure you are not disallowing, or blocking, good content
  • Use an XML sitemap and keep it updated
  • Review blocked resources (hashbang URLs) with the Fetch as Google tool
  • Optimize your crawl budget (see Search Console > Crawl > Crawl Stats)
  • Set up permanent 301 redirects (not temporary 302s) when launching a new website
  • Do not use meta refresh for moving a site
  • Use hreflang for language and regional URLs
  • Use HTTPS, so your site is secure
  • Make your URLs simple for users and search engines vs. “index.php?p=367595.”
  • Add breadcrumbs for better navigation and indexing
  • Fix 404 errors, and fix broken links (internal and external)
  • Avoid indexing multiple versions of your homepage (e.g., www vs. non-www)
  • Use rel=canonical for multiple versions of the same content
  • Use structured data markup
  • Avoid canonicalizing blog pages to the root of the blog
  • Declare the default language of your site
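To make the hreflang item above concrete, here is a hedged sketch of what language and regional tags can look like in a page’s <head> (the example.com URLs are placeholders, not from this article):

```html
<!-- Point language/regional variants of a page at each other -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/family-law/">
<link rel="alternate" hreflang="es-us" href="https://www.example.com/es/family-law/">
<!-- Fallback for users whose language/region is not listed -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/family-law/">
```

Each variant of the page should list every other variant (including itself), and the tags go in the <head> of every version.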

Content Optimization (This is directly related to technical issues)

  • Make sure title tags and meta descriptions exist, are unique, and are the right length
  • Use H1 tags for main topic heading, H2 for sub-topics, etc.
  • Fix broken images
  • Add ALT tags to images
  • Get rid of duplicate content
  • Review pages with low text-to-HTML ratio and add more content
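To illustrate several of the items above in one place, here is a sketch of how a page’s markup might look (the firm name, text, and file paths are placeholders):

```html
<head>
  <!-- Unique title, roughly 55-60 characters -->
  <title>Family Law Attorneys in Chicago | Example Firm</title>
  <!-- Unique meta description, roughly 150-160 characters -->
  <meta name="description" content="Our Chicago family law attorneys help with divorce, custody, and support matters. Schedule a free consultation with Example Firm today.">
</head>
<body>
  <h1>Family Law Services</h1>      <!-- one H1 for the main topic -->
  <h2>Divorce and Separation</h2>   <!-- H2s for sub-topics -->
  <!-- Every image gets descriptive alt text -->
  <img src="/images/team.jpg" alt="The Example Firm family law team">
</body>
```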

Usability and Links

  • Improve poor mobile experience
  • Improve website navigation
  • Increase the number of helpful links to internal pages
  • Keep a reasonable number of links on each page
  • Remove spammy / bad links to your website

Website Loading Speed

  • Limit the number of resources
  • Improve server response time
  • Optimize images for size and quality
  • Reduce the number of redirects and remove redirect loops
  • Set a browser cache policy
  • Minimize render-blocking JavaScript and CSS

Need some aspirin? Here is a more manageable list to start with. For many of you who are not digital marketers or technical SEO experts, the first list might have been a lot to take in. That’s why we’ve included this checklist you can use each time you run a technical SEO audit on your law firm’s website.

Technical Analysis Checklist: Top 12 Actions

  1. Make sure URLs follow the same format and make them easy to understand.
  2. Implement an XML sitemap to get pages indexed.
  3. Use a robots.txt file to direct search engines.
  4. Set up secure browsing (HTTPS).
  5. Use breadcrumbs for improved navigation and indexing.
  6. Create a 404 error page to handle broken links.
  7. Make sure you use permanent 301 redirects versus temporary 302 redirects.
  8. Limit the number of redirects.
  9. Check for duplicate content and remove it (including having two home pages indexed).
  10. Correct any title and meta-tag issues (missing, duplicate, length).
  11. Improve mobile load time.
  12. Make your website’s user experience mobile-ready and responsive.

Site Architecture

The site architecture tells search engines how your site is put together. In order to make their job easier, you should ensure your law firm website has a clear and consistent URL structure; a sitemap; a robots.txt file; and secure browsing. We’ll go into detail about each of those first because no one will see your site in the search engines if the search engines don’t even have a chance to index it.

Making clean URLs

One thing both your users and the search engines want to see is a well-organized URL structure. This helps both parties understand what is on the page. For example, a URL ending in “index.php?p=367595” is far more confusing than a descriptive one like “/practice-areas/family-law/”.

You can use your URLs to add targeted keywords to any page, and using hyphens between words will make them easier to understand. Keep URLs under 2,048 characters; some browsers will not load longer ones.

XML sitemaps

XML sitemaps are useful if you want to correctly index your law firm’s website. When the search engine crawls your site, having an up-to-date XML sitemap is beneficial. The sitemap is what the search engine crawlers use to check and update their link database, so they can find pages on your website they might not have previously seen.

So if you’re currently going through an overhaul of your website content, or you have a load of updated pages, you should consider creating a sitemap.
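For reference, an XML sitemap is just a structured list of your URLs; a minimal single-entry sketch (with a placeholder URL and date) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/practice-areas/family-law/</loc>
    <lastmod>2018-01-15</lastmod>
  </url>
</urlset>
```

Add one <url> entry per page you want crawled; the generator tools below produce this file for you.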

Creating a sitemap is easy. Free generators such as the XML-Sitemaps.com tool will build one for you.

XML Sitemaps Free Tool

All you need to do is enter the URL of your law firm’s website and hit “Start.”

Once you have your sitemap downloaded, you’ll need to submit it to Google Search Console.

Google Sitemap

WordPress also has plugins for XML sitemaps. Using the Google Search Console, you’ll be able to upload new sitemaps, as well as see any previous sitemaps you might have created.

If you don’t keep your sitemap up to date, you risk indexing old or duplicate content that can cause ranking issues, and Google may miss some of your new content as well.


Use a robots.txt file to direct search engines

A robots.txt file is what webmasters use to instruct web robots (usually the crawlers we mentioned earlier) how they should crawl the pages on their website. In basic terms, a robots.txt file tells robots and crawlers which pages of the website they can and cannot access.

You do this through “allowing” or “disallowing” certain behavior by specific agents, robots, or crawlers. Here’s how a robots.txt file might look.


Blocking all web crawlers from all content

User-agent: *

Disallow: /

What the above means is that no web crawler should crawl any page on the site.

If this seems complicated, don’t worry. There are plenty of tools available that can help you generate your own robots.txt file.

SEOBOOK has a free tool that generates the files for you.

Default Robot Access

A robots.txt file is also used when you have pages you would rather keep out of search results, such as admin or internal pages. Keep in mind that it only asks crawlers to stay away; it does not actually secure private data.

Go to your competitors’ websites and add /robots.txt after their domain and you can often see how they’re using it.
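Building on the earlier example, a more typical robots.txt allows everything except a private folder and points crawlers at your sitemap (the folder and domain here are placeholders):

```
User-agent: *
Disallow: /wp-admin/

Sitemap: https://www.example.com/sitemap.xml
```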

Set up secure browsing (HTTPS)

Hypertext transfer protocol (HTTP) is the underlying protocol used by the World Wide Web that defines how messages are both formatted and transmitted. It includes letting web servers and browsers know what actions they should take.

For security reasons, you should upgrade your law firm website from plain HTTP to the more secure HTTPS. Having (or not having) HTTPS is a factor Google takes into account when ranking websites, and HTTPS is the much safer option because it encrypts the traffic between your visitors and your site.

Use breadcrumbs for better navigation and indexing

Your website’s “breadcrumbs” are a secondary navigation trail that shows users where they are located on your website. Breadcrumbs are most useful if you have a large law firm website with hundreds or thousands of pages.


Breadcrumbs help create a proper structure. However, don’t treat breadcrumbs as your entire navigation system; they should be a secondary addition.
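If you want your breadcrumbs to be eligible to appear in search results too, they can be marked up with schema.org structured data; a minimal sketch (placeholder names and URLs) looks like this:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home",
      "item": "https://www.example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Practice Areas",
      "item": "https://www.example.com/practice-areas/" },
    { "@type": "ListItem", "position": 3, "name": "Family Law" }
  ]
}
</script>
```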

Create a 404 error page to handle broken links

Most websites serve a 404 error page when a requested page cannot be found.

To combat users bouncing from your website when they land on a 404 error page, think about making a creative 404 page. Use it as an opportunity to direct visitors back to your home page, or another important service page on your website.

Just because you’re a law firm doesn’t mean you can’t inject some personality into your website, especially when things go wrong.

Fish & Richardson offers a great example of how to design a good 404 page.

404 Sample

And if you want to keep users from reaching a 404 page at all, you can find your error pages by using Google Search Console.

Google Search Console 404

Click “Crawl Errors” and you’ll be presented with a list of URLs that returned errors.

Use permanent 301 redirects versus temporary 302 redirects

If only I had a dollar for every company that came to me after building a website with massive search engine optimization problems. Advertising agencies, designers and programmers often fail to understand technical SEO, and they either don’t set up 301 redirects that permanently tell the search engines that the old pages now have a new name, or they do it improperly.

You need to make a spreadsheet of the old URLs and map them to the new URLs you intend to launch on your new website. Then you implement those redirects, either in an .htaccess file or through your server’s admin panel.

WordPress has some plugins for doing this as well, such as Redirection by John Godley.

You really should redirect the search engines on a page-by-page basis; at the very least, redirect old URLs to the homepage. Simply setting the domain to forward from GoDaddy or a similar registrar is not the same thing. Done wrong, you will not only lose a massive amount of revenue if you are getting lots of traffic from the search engines, but you might forever lose the backlinks to the old pages as well.
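On an Apache server, for example, page-by-page 301 redirects in an .htaccess file can be as simple as the sketch below (the old and new paths are placeholders; other server types use different syntax):

```apache
# Permanent (301) redirects: old URL -> new URL, one line per page
Redirect 301 /old-divorce-page.html /practice-areas/divorce/
Redirect 301 /attorneys.php /our-team/
```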

Improper use of redirects is one of the most common and costly mistakes in all of technical SEO. Getting people to link to your website and its pages is one of the hardest and most expensive things you can do in digital marketing. It is very sad to see companies throwing money and energy away on a regular basis and only employing the help of technical SEO people after the fact.

If you catch mistakes like this within a few weeks, you might be able to get everything back. But if you wait several months, other websites will see that your pages no longer exist and don’t forward properly, and they will delete their links to you.

Limit the number of redirects

Redirects, as we just outlined, are extremely useful, but they can also slow your website down. The more redirects a request passes through, the longer a user must wait for the page to load. Use a redirect checker tool to discover which of your pages have redirects, and consider whether each one is necessary.

Redirect Checker

Check for duplicate content and remove it

Duplicate content is when Google sees two or more versions of the same content. SEO audit tools can be used to diagnose what duplicate-content issues you have.

You can have issues with the following types of duplicate content:

Text that someone let you use on your website but that you didn’t realize was already indexed somewhere else on Google. This will count against you with Google, so if you want to make use of content from one of your partners, you need to keep your copy out of the index with a “noindex” tag.

One way to do this simply with a meta tag is detailed on Google’s support pages: 

“To prevent most search engine web crawlers from indexing a page on your site, place the following meta tag into the <head> section of your page:

<meta name="robots" content="noindex">

To prevent only Google web crawlers from indexing a page:

<meta name="googlebot" content="noindex">”

If you have a technical issue getting rid of duplicate content on your site, you can use a rel=canonical tag to tell Google which of the pages is the one you want indexed and to get credit for.
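A rel=canonical tag is a single line in the duplicate page’s <head>, pointing at the version you want indexed and credited (the URL is a placeholder):

```html
<!-- Tells Google which URL is the authoritative version of this content -->
<link rel="canonical" href="https://www.example.com/practice-areas/family-law/">
```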

You can even have duplicate content in terms of your title and meta-tags. Make sure that your page titles and meta descriptions are unique for each page. 

How to correct title and meta-tag issues (missing, duplicate, length)

In addition to avoiding duplicate titles and meta descriptions, it is a basic technical prerequisite that your title tags and meta descriptions are the proper length.

A title tag should be around 55 to 60 characters, depending on the font styling, which could make the pixel width wider or narrower.

A meta description should be 150 to 160 characters. We shoot for 150.

We will describe in much greater detail how to optimize these in the on-page optimization section. For now, you just need to know that search engine optimization experts consider these factors when looking at the various types of technical SEO issues you may have.

Optimizing website speed for your law firm’s website

Internet users have a short attention span. The same can be said for people looking for law firms. In fact, every 100 milliseconds of time it takes your page to load leads to a 1% decrease in potential leads.

In basic terms, the faster the page loads, the better for visitors and also for a site’s Google rankings.

Before you begin optimizing your page load time, you should first measure how quickly or slowly your pages load. Our Google rep called me a few weeks ago, pressed the issue of load time for our PPC clients, and suggested testing with page speed tools; we use several ourselves.

If you want to improve your page load speed, you should think about the assets you have on your website, as well as where your website is hosted. Page speed optimization is an art usually best done by developers. At the simpler level, it includes enabling compression, optimizing images, and leveraging browser caching.
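As one example of leveraging browser caching, on an Apache server you can set cache lifetimes for static assets in your .htaccess file (the lifetimes below are illustrative, not recommendations from this article):

```apache
# Ask visitors' browsers to cache static assets so repeat views load faster
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```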

We will review a few of the basics but know that if you want it done properly, people who specialize in this will get it done quickly, whereas novices will spin their wheels and waste your time and your money. Every day that goes by with a slow-loading website means you are losing money, visitors, and rankings in Google. Here are some important considerations.

Use simpler templates

If you can, limit the number of elements your website template or theme uses. Remember that when a user finds your site, their browser has to load each individual element, and using too many can slow your website down. This includes reviewing what plugins, widgets, or tracking codes your law firm currently uses on your website and considering whether they are all necessary.

Page design is important too, as is getting important information across to your audience. Don’t sacrifice the quality of your law firm website just to reduce page load time.

Optimize your images and videos

Images and videos take up a lot of space on your website and are often one of the leading causes for slow website speeds. Ideally, you should be looking to compress your images so that you maintain quality, while still reducing the file size. Use CompressJPG to compress your JPEG images.


If you can, save your images in JPEG format (.jpg) when dealing with photographs and PNG format (.png) when dealing with graphics.

Make your law firm’s website mobile-ready and responsive

“Mobilegeddon” is a term you might have seen around the internet if you’ve been looking to improve your own law firm SEO strategy. It refers to how Google ranks websites in the search results, with a preference given to those with mobile-friendly sites.

This is especially true for local mobile results. So when a customer types “family law firm Illinois,” nearly all the results on the first two pages of Google will be responsive and mobile-ready. If your site isn’t mobile-ready, there is a huge chance you won’t show up in the search rankings at all, and if you do, it’ll be so far down that no one will discover you.

Having a responsive, mobile-ready site involves ensuring all the elements load in a way conducive to easy reading.

Responsive vs Non Responsive

Look at this example from Designer websites. The website on the right is non-responsive and not mobile ready because the text is bunched together and will be difficult for users to navigate. The example on the left shows how the images and text have been spaced out, making it easier to navigate the site.

To see whether your site is ready for mobile, use the Google mobile test tool.

Google Mobile Test Tool

When you click “run test,” you’ll be presented with a “pass” or “fail.”

If your page fails the mobile-friendly test, fear not, as Google will outline things you can do to improve it.

Accelerated mobile pages (AMP) are also worth looking into.


In this section of the guide we’ve walked you through some of the core technical aspects of law firm SEO that you should be concerned with. As we mentioned at the start, if you don’t have your technical SEO set up correctly, this will hinder any work you’ll do on the on-page and off-page SEO.

Technical SEO, although it might seem confusing, is a process you should follow to ensure your site performs at its highest rate. When performing a technical SEO audit, use our checklist to make sure you have everything covered.

Now it’s time to cram your site with repetitive keywords that will destroy the text. Hahaha, just kidding! Let’s dig into how to do on-page keyword optimization the right way!
