7 On-Page SEO Checks You Need to Conduct on a Day-to-Day Basis

When it comes to SEO, we know that link building is an ongoing process, but more often than not, we tend to neglect the on-page aspects.

Site updates, theme updates/changes, plugin upgrades, adding a new plugin or feature, and other changes like editing a file over FTP can introduce unintentional mistakes that lead to on-page SEO problems. If you don't proactively look for these mistakes, they'll go unnoticed and quietly hurt your rankings.

Ultimate Resource: On-Page SEO by Brian Dean.

For example, I recently realized I had been blocking images on a few of my websites for almost six months because of an old, neglected robots.txt file. Imagine the impact such a mistake can have on your rankings!

On-Page SEO Checkup

With the importance of SEO in mind, below are seven checks you should conduct on a regular basis to ensure your on-page SEO stays on point.

Note: Although these checks are aimed at people running a WordPress blog, they can be used by almost any blogger on any platform.

1. Check your site for broken links.

Pages with broken links (whether internal or external) can lose rankings in search results. And while you have control over your internal links, you have no control over external ones.

There is a good chance that a webpage or resource you linked to no longer exists or has moved to a different URL, leaving behind a broken link.

This is why it's recommended to check for broken links periodically.

There's a whole host of ways to look for broken links, but one of the simplest and most efficient is the Screaming Frog SEO Spider.

To find broken links on your site using Screaming Frog, enter your domain URL in the space provided and click the "Start" button. When the crawl is complete, select the Response Codes tab and filter the results by "Client Error (4xx)". You should now be able to see all the broken links.

Click each broken link and then select the Inlinks tab to see which page(s) actually contain that broken link.

If you are using WordPress, you can also use a plugin like Broken Link Checker. This plugin detects broken links and lets you fix them right from your dashboard.

Another way to look for broken links is through Google Search Console.

If you find 404 URLs there (under Crawl > Crawl Errors), click a URL and open the "Linked from" tab to see which page(s) contain the broken link.
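If you prefer a scriptable spot-check, here's a minimal Python sketch that reads a list of URLs and reports any that return a 4xx error. It uses the third-party requests library, and the urls.txt file name is just a placeholder for wherever you keep your list of links:

    # broken_link_check.py - report URLs that respond with a 4xx status.
    import requests

    # Hypothetical input file: one URL per line.
    with open("urls.txt") as f:
        urls = [line.strip() for line in f if line.strip()]

    for url in urls:
        try:
            # HEAD keeps the request lightweight; following redirects
            # prevents 301s from showing up as false positives.
            response = requests.head(url, allow_redirects=True, timeout=10)
            if 400 <= response.status_code < 500:
                print(f"BROKEN ({response.status_code}): {url}")
        except requests.RequestException as exc:
            print(f"ERROR: {url} -> {exc}")

A dedicated crawler like Screaming Frog is still more thorough, since it discovers the links for you instead of working from a prepared list.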

2. Use the site: operator to check for low-value pages in the Google index.

The search operator site:sitename.com displays all the pages on your website that have been indexed by Google.

By reviewing these results, you can assess whether all the indexed pages are of good quality or whether some low-value pages are present.

Quick tip: If your site has a lot of pages, change the Google Search settings to display 100 results at a time. This way you can easily scan through all the results at once.

An example of a low-value page is the 'search result' page. If you have a search box on your website, there's a chance that search result pages are being crawled and indexed. All these pages contain is a list of links, so they are of little to no value.

Another example is the presence of multiple versions of the exact same page in the index. This can happen if you run an online store and your search results can be sorted.

Here is an example of multiple variations of the same search page:
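For instance, a store with a sortable search page might expose URLs like these (hypothetical examples):

    example.com/shop/?s=shoes&orderby=price
    example.com/shop/?s=shoes&orderby=popularity
    example.com/shop/?s=shoes&orderby=rating

All three URLs show the same products, just sorted differently, yet each one can be crawled and indexed as a separate page.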

You can easily exclude such pages from being indexed by disallowing them in robots.txt, or by using the robots meta tag. You can also block certain URL parameters from being crawled in Google Search Console by going to Crawl > URL Parameters.
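For example, WordPress search results live under the ?s= parameter, so a couple of lines like these in robots.txt keep them from being crawled (a sketch — adjust the paths to match your own URL structure):

    User-agent: *
    Disallow: /?s=
    Disallow: /search/

Alternatively, add a robots meta tag to the pages themselves:

    <meta name="robots" content="noindex, follow">

Keep in mind that Disallow only stops crawling; the noindex meta tag is the more reliable way to keep a page that has already been discovered out of the index.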

3. Check robots.txt to see if you're blocking important resources.

When using a CMS like WordPress, it's easy to inadvertently block important content like images, JavaScript, CSS, and other resources that actually help Googlebot access and analyze your website.

For instance, blocking the wp-content folder in your robots.txt means blocking your images from being crawled. If Googlebot can't access the images on your site, your potential to rank higher because of those images drops. Your images also won't appear in Google Image Search, further reducing your traffic.

In the same way, if Googlebot cannot fetch your site's JavaScript or CSS, it cannot determine whether your site is responsive. So even if your site is responsive, Google may conclude that it isn't, and your site won't rank well in mobile search results.

To find out whether you're blocking important resources, log in to Google Search Console and go to Google Index > Blocked Resources. Here you can see all the resources you are blocking. You can then unblock them in robots.txt (or via .htaccess if need be).

For example, let's say you're blocking the following two resources:

  • /wp-content/uploads/2017/01/image.jpg
  • /wp-includes/js/wp-embed.min.js

You can unblock these resources by adding the following to your robots.txt file:

  • Allow: /wp-includes/js/
  • Allow: /wp-content/uploads/
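For context, Allow rules act as exceptions to broader Disallow rules, and Google honors the most specific (longest) matching rule. Assuming your robots.txt currently blocks the wp-includes and wp-content folders outright, the corrected file might look like this sketch:

    User-agent: *
    Disallow: /wp-includes/
    Allow: /wp-includes/js/
    Disallow: /wp-content/
    Allow: /wp-content/uploads/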

To double-check that these resources are now crawlable, go to Crawl > robots.txt Tester in Google Search Console, enter the URL in the space provided, and click "Test".


4. Check the HTML source of important posts and pages to ensure everything is correct.

It's one thing to use SEO plugins to optimize your website; it's another to ensure they are working correctly. Checking the HTML source is the best way to make sure all your SEO meta tags are being added to the right pages, and to catch errors that need fixing.

If you're using a WordPress blog, you usually only need to check the following types of pages: the homepage, a single post, a static page, and a category or tag archive.

You only need to check the source of one or two pages of each type to be sure everything is in order.

To inspect the source, do the following:

  1. Open the page you want to check in your browser.
  2. View the page source (right-click and choose "View Page Source", or press Ctrl+U in most browsers) and check the content within the <head>...</head> tags to make sure everything is correct.

Here are a few checks you can perform; a sample <head> section follows the list:

  1. Check whether the pages have multiple instances of the same meta tag, such as the title or meta description tag. This can happen when a plugin and your theme both insert the same tag into the header.
  2. Check whether the page has a meta robots tag, and make sure it's configured properly. In other words, make sure the robots tag isn't accidentally set to noindex or nofollow on important pages, and that it is set to noindex on low-value pages.
  3. Check whether pages (particularly single posts and the homepage) have proper Open Graph tags (especially the og:image tag), Twitter Cards and other social media meta tags, plus markup like Schema.org tags (if you're using them).
  4. Check whether the page has a rel="canonical" tag and make sure it points to the correct canonical URL.
  5. Check whether the pages have a viewport meta tag. (This tag is important for mobile responsiveness.)
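Putting those checks together, here is a sketch of what a healthy <head> section might look like on a single post. Every title, URL, and image path below is a placeholder — substitute your own values:

    <head>
      <title>Post Title - Site Name</title>
      <meta name="description" content="A unique, accurate summary of this post.">
      <meta name="robots" content="index, follow">
      <meta name="viewport" content="width=device-width, initial-scale=1">
      <link rel="canonical" href="https://example.com/post-slug/">
      <meta property="og:title" content="Post Title">
      <meta property="og:image" content="https://example.com/wp-content/uploads/cover.jpg">
      <meta name="twitter:card" content="summary_large_image">
    </head>

Each of these tags should appear exactly once; a duplicate usually means your theme and a plugin are both emitting it.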

5. Check for mobile usability errors.

Websites that aren't responsive don't rank well in Google's mobile search results. And even if your site is responsive, there's no telling how Googlebot will see it. A small change like blocking a resource can make your responsive website appear unresponsive in Google's eyes.

So even if you believe your website is responsive, make it a practice to check whether your pages are mobile-friendly or have mobile usability errors.

To do this, log in to Google Search Console and go to Search Traffic > Mobile Usability to see whether any of your pages show mobile usability errors.

You can also use Google's Mobile-Friendly Test to check individual pages.

6. Check for render-blocking scripts.

You may have added a new plugin or feature to your blog that adds calls to several JavaScript and CSS files on every page of your site. The plugin's functionality may be needed on just one page, yet its JavaScript and CSS get loaded everywhere.

For instance, you may have added a contact form plugin that is only used on a single page — your contact page. However, the plugin may have added its JavaScript files to every page.

The more JavaScript and CSS references a page has, the longer it takes to load. This reduces your page speed, which can negatively impact your search rankings.

The best way to make sure this doesn't happen is to check your site's post pages with Google's PageSpeed Insights tool on a regular basis. Look for render-blocking JavaScript files and figure out whether those scripts are necessary for the page to work properly.

If you discover unwanted scripts, restrict them to the pages that need them so they don't load where they aren't required. You can also consider adding a defer or async attribute to your JavaScript files.
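For example, a blocking script tag like the first one below can usually be rewritten with defer (the file path is a placeholder):

    <!-- Render-blocking: the browser pauses parsing until this file loads. -->
    <script src="/wp-content/plugins/example/script.js"></script>

    <!-- Deferred: downloads in parallel and runs after the HTML is parsed. -->
    <script src="/wp-content/plugins/example/script.js" defer></script>

Use defer for scripts that depend on the DOM or on other scripts running first; async is fine for fully independent scripts such as analytics snippets.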


7. Check and track website downtimes.

Frequent downtime not only drives visitors away, it also hurts your SEO. That's why it's important to monitor your website's uptime continuously, using an uptime monitoring service such as UptimeRobot or Pingdom.

Most of these services will send you an email or even a mobile notification to let you know about downtime. Some also send a monthly report of how your website performed.

If you find that your site experiences frequent downtime, it's time to consider changing your web host.
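If you'd rather roll a bare-bones monitor yourself, here's a minimal Python sketch that requests your homepage every five minutes and logs any failure. The URL and interval are placeholders, and a hosted monitoring service remains the more robust option:

    # uptime_check.py - log a line whenever the site fails to respond.
    import time
    import requests

    URL = "https://example.com/"   # placeholder: your homepage
    INTERVAL = 300                 # seconds between checks (5 minutes)

    while True:
        try:
            response = requests.get(URL, timeout=15)
            if response.status_code >= 500:
                print(f"{time.ctime()}: DOWN (HTTP {response.status_code})")
        except requests.RequestException as exc:
            print(f"{time.ctime()}: DOWN ({exc})")
        time.sleep(INTERVAL)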

Things To Monitor For Proper On-Page SEO

These are some of the most important things to check to keep your on-page SEO optimized.

Performing these checks on a regular basis will ensure your on-page SEO stays on point and that your rankings aren't being hurt without your knowledge.

Let me know what you check for when doing an SEO site audit. What do you do to make sure your on-page SEO stays optimized?
