
3 Important SEO Tweaks That Could Make a Huge Difference in Your Site’s Performance

by | Jun 28, 2018 | Actionable SEO Tips

One of the services we provide at Agora Integrated Marketing (AIM) is regular website audits. This involves testing SEO performance, analyzing your content, and finding gaps and opportunities that can generate organic traffic to your website.

During a recent audit of 54 different websites, our main objective was to look for any critical issues that could be affecting their visibility in Search and ultimately their bottom line.

Our audits found three key areas in which these 54 websites could all make improvements to their SEO.

Giving away premium content without any reward or return

Nearly 50% of the websites we audited had their PDF reports appear in Google Search. These are PDF reports that are offered free as lead magnets, but some of them were also part of frontend premiums, and even backend subscriptions. For two of these websites, we found that their paid newsletters were also appearing in Search each month.

Having your lead magnet and premium reports appear for free in Search could cost you new subscribers and make it far less likely that existing subscribers will open their wallets and become paid subscribers.

Obviously, you’d want your free lead magnet reports to serve the purpose they are made for: generating new leads. The same goes for your premium reports: introducing your best content as part of a paid service.

Having these reports float around as a free-for-all totally defeats their purpose.

How to find PDFs appearing in Google Search

Figuring out if your PDF reports show up in Search is easy using one of the Google Search operators: The Site Search Operator.

Head over to Google, and place the following into the search bar: site:yoursitename.com pdf.

This will conduct a Google Search within your website for the term PDF. It will bring up any PDFs that aren’t blocked from the Googlebot, as well as any other content where you mention the term PDF.
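If you want to limit the results to actual PDF files, rather than any page that mentions the word PDF, you can combine the site search operator with Google's standard filetype operator, like so:

site:yoursitename.com filetype:pdf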

In the example below, you can see that CNN.com potentially has 28 PDF files showing in Google Search:

CNN PDF files example
NOTE: Don’t include the https:// or www. prefix when using the Site Search Operator – just the bare domain name.

How to remove PDFs from Google Search

Unfortunately, you can’t add a noindex meta tag to .PDF files at page level in WordPress like you can for other pages on your site that you don’t want indexed. However, there are two ways to solve this issue.

The easiest way is to add a line to your robots.txt file disallowing search engines from crawling your .PDF files.

Here’s a sample robots.txt file:

Sample robots.txt file
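Based on the description below, a minimal sketch of that rule written out (assuming you want to block every crawler, not just Googlebot) would be:

User-agent: *
Disallow: /*.pdf$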

In this example, there are two special characters used alongside .pdf: an * and a $.

The * matches any sequence of characters and the $ marks the end of the URL, so together they tell search engines not to crawl any URL that ends in .pdf.

NOTE: Matching in the robots.txt file is case sensitive, so make sure the extension in the rule matches the case of your file names (typically lower-case .pdf).

The big disadvantage of the robots.txt method is that it stops search engines from crawling the .PDFs rather than telling them to noindex the .PDFs.

This means that going forward, you won’t have any new .PDFs appearing, as you’re telling Google not to crawl those links. However, the .PDFs already in the index will stay there for a long time. Eventually they’ll drop out, but it’ll take months, if not longer.

The robots.txt method takes less than 10 minutes to implement, and although it’s advisable to use a web developer if one is available to you, it’s definitely not a requirement. Just make sure you test your robots.txt in Google Search Console’s robots.txt tester afterwards.

The second method does need the help of a web developer, and maybe even DevOps, depending on how your website is set up.

It’s the x-robots tag method.

Remove PDFs from Google Search using the x-robots tag

The x-robots tag allows you to control how specific file types are indexed.

How you implement an x-robots tag to noindex .PDFs will depend on what type of server you use.

If you use an Apache server, you can add some code to your .htaccess file.

The code should look something like this:

<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex, nofollow"
</FilesMatch>

If you’re using an Nginx server, you’d need to add some code to your server configuration. The code should look something like this:

location ~* \.pdf$ {
    add_header X-Robots-Tag "noindex, nofollow";
}

NOTE: Please do not attempt to add any code to your .htaccess file unless you’re a web developer. Speaking from experience, it’s very easy to break your website, and very tough to fix it, when messing around in the .htaccess file. Yoast has published a great article on how to deploy an x-robots tag that can be read here.

Once you deploy the x-robots tag, you can do a fetch in Google Search Console for each .PDF appearing in search to speed up the process of removing your .PDFs from the index.
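Before re-fetching anything, it’s worth confirming the header is actually being served. One quick way is to request the response headers for one of your PDFs from the command line; the URL here is just a placeholder for one of your own files:

curl -I https://www.example.com/reports/sample-report.pdf

The output should include a line reading X-Robots-Tag: noindex, nofollow.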

Finally, if you do use the x-robots tag method, make sure you aren’t disallowing access to your .PDF files in your robots.txt file. If you are, search engines won’t crawl the .PDF files and will therefore never see the noindex directive.

Multiple Versions of a Website Live

The second biggest issue we found was that 18% of the 54 sites we audited had multiple versions of their website live. That is, there was either an HTTP and an HTTPS version of the site live, and/or a www and a non-www version of the site live.

For example, a site could have the following versions live:

  • http://xyz.com/
  • http://www.xyz.com/
  • https://xyz.com/
  • https://www.xyz.com/

To the naked eye, these will show the same page. However, to search engines, these are four completely different pages.

By leaving multiple versions of your domain open and crawlable to search engines, you “split” your ranking authority, which results in poorer visibility in search engines.

Here’s an example of how having more than one version of your site live can cause issues with your page authority and link metrics. Page authority predicts a page’s ranking potential in search engines, while the link metrics below show the number of backlinks to the page.

Here’s the HTTP version of the XYZ company website homepage:

xyz company http version

Here’s the HTTPS version of the XYZ company website homepage:

xyz company https version

Note that the page authority is different for each version, and the linking domains and number of inbound links to each are also different.

If you don’t have a forced redirect in place to the preferred version of your website, search engines will treat each version as a separate page, and each version will accumulate its own authority individually.

As well as denying your site the authority it has earned and deserves, having multiple versions of your site live will cause lots of duplicate content issues.

It means all your pages are competing with each other, and none of them rank as highly as they would if you consolidated all live versions into one.

If you’ve installed an SSL certificate on your website and have an HTTPS version of your site, you should always force redirect the other versions to your preferred HTTPS version.

It doesn’t matter if you choose https://example.com or https://www.example.com.

Consolidate your ranking power and present one uniform preferred URL to search engines and your users. This is a huge issue for your search engine visibility; however, it’s a relatively easy fix that any web developer can implement within a few minutes.
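On an Apache server, a minimal sketch of that forced redirect in the .htaccess file might look something like this, assuming https://www.example.com is the version you’ve chosen as preferred (swap in your own domain):

RewriteEngine On
# Redirect anything that isn't already on https://www.example.com to the preferred version
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [L,R=301]

The R=301 flag issues a permanent redirect, which is what signals search engines to consolidate authority onto the preferred version.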

Here’s a video by Joost de Valk giving some more tips on redirecting your website to the HTTPS version.

No SSL Certificate

In July 2017, we found that 26% of all audited sites weren’t HTTPS compliant. Thankfully, that figure has dropped to 19%. However, it’s still too high.

Not having your site secure may cause some visitors to leave, which will negatively impact your site’s bounce rate, advertising impressions, affiliate clicks, and e-commerce sales.

And if you think you’ll get away with not being compliant, think again. Starting in July 2018, Google Chrome will mark all non-HTTPS sites as “Not secure”, and from October 2018, the “Not secure” warning in the address bar will turn red when a user inputs an email address or credit card information on a non-HTTPS site.

You can read more about our findings, and why you should make your website HTTPS compliant, here.
