On-Page SEO Checkup: 7 Important Checks to Keep Your Rankings Safe (a Beginner’s Guide)
When it comes to SEO, we know that link building is an ongoing process, but more often than not, we tend to neglect the on-page aspects.
Site updates, theme updates or changes, plugin upgrades, adding a new plugin or feature, and other changes like editing a file over FTP can introduce unintentional mistakes that lead to on-page SEO problems. If you don’t proactively look for these mistakes, they’ll go unnoticed and quietly hurt your rankings.
Ultimate Resource: On-Page SEO by Brian Dean.
For example, I recently realized I had been blocking images on a few of my websites for almost six months because of an old, neglected robots.txt file. Imagine the impact a mistake like that can have on your rankings!
On-Page SEO Checkup
With the importance of SEO in mind, below are seven important checks you should run on a regular basis to make sure your on-page SEO stays on point.
Note: Although these checks are written with WordPress blogs in mind, they apply to almost any blogger on any platform.
1. Check your site for broken links.
Pages with broken links (whether internal or external) can lose rankings in search results. And while you have full control over your internal links, you have no control over external ones.
There is a good chance that a webpage or resource you linked to no longer exists or has moved to a different URL, leaving you with a broken link.
That’s why it’s a good idea to check for broken links periodically.
There are plenty of ways to find broken links, but one of the simplest and most efficient is the Screaming Frog SEO Spider.
To find broken links on your site using Screaming Frog, enter your domain URL in the space provided and click the “Start” button. When the crawl is complete, select the Response Codes tab and filter the results by “Client Error (4xx)”. You should now be able to see all of your broken links.
Click on each broken link and then select the Inlinks tab to see which page(s) actually contain that broken link. (Refer to the image below.)
If you are using WordPress, you can also use a plugin like Broken Link Checker. This plugin will detect all of your broken links and help you fix them.
Another way to find broken links is through Google Search Console.
If you find 404 URLs, click the URL and open the Linked From tab to see which page(s) contain the broken URL.
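If you’d rather script a quick spot-check yourself, the sketch below (plain Python standard library; the function names are my own) shows the core of what any broken-link checker does: extract the links from a page’s HTML, then treat any URL that answers with a client error (4xx) as broken.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag in an HTML document."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    """Return all hyperlink targets found in the given HTML."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

def is_broken(status_code):
    """A link counts as 'broken' in the Screaming Frog sense
    if it returns a client error (4xx)."""
    return 400 <= status_code < 500
```

In practice you would fetch each extracted URL (for example with `urllib.request`) and pass its HTTP status code to `is_broken()`; a real crawler like Screaming Frog also handles redirects, queuing, and rate limiting.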
2. Use the site: operator to check for low-value pages in the Google index.
The search operator “site:sitename.com” displays all the pages of your website that Google has indexed.
By reviewing these results, you can assess whether all the indexed pages are of good quality or whether some low-value pages are present.
Quick tip: If your site has a lot of pages, change the Google Search settings to display 100 results at a time. That way you can scan through all the results at once.
An example of a low-value page is a ‘search result’ page. If you have a search box on your website, there is a chance that its search result pages are being crawled and indexed. These pages are just lists of links, and hence offer little to no value.
Another example is multiple versions of the same page in the index. This can happen if you run an online store and your search results can be sorted, since each sort order gets its own URL.
Here is an example of multiple variations of the same search page:
You can easily exclude such pages from being indexed by disallowing them in robots.txt, or by using the robots meta tag. You can also block certain URL parameters from being crawled in Google Search Console by going to Crawl > URL Parameters.
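To get a feel for how sort parameters multiply one page into many indexed variants, here is a small illustrative Python helper (the function name and URLs are made up) that groups URLs differing only by their query string — exactly the duplicates a Disallow rule or a URL-parameter setting would target.

```python
from urllib.parse import urlsplit
from collections import defaultdict

def find_parameter_duplicates(urls):
    """Group URLs that differ only by query string
    (e.g. ?sort=price vs ?sort=name on the same path)."""
    groups = defaultdict(list)
    for url in urls:
        parts = urlsplit(url)
        base = f"{parts.scheme}://{parts.netloc}{parts.path}"
        groups[base].append(url)
    # Keep only bases indexed under more than one variant.
    return {base: variants for base, variants in groups.items()
            if len(variants) > 1}
```

Feeding this the URL list from a `site:` search (or a crawl export) immediately shows which pages exist in multiple parameterized versions.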
3. Check robots.txt to see if you’re blocking important resources.
For instance, blocking the wp-content folder in your robots.txt means blocking your images from being crawled. When Google’s bots can’t access the images on your site, those images can’t help your pages rank. They also won’t appear in Google Image Search, further reducing your traffic.
To find out whether you’re blocking important resources, log in to Google Search Console and go to Google Index > Blocked Resources. Here you should be able to see all the resources you are blocking. You can then unblock them in robots.txt (or via .htaccess if need be).
For example, let’s say you’re blocking the following two resources:
You can unblock these resources by adding the following to your robots.txt file:
To double-check that these resources are now crawlable, go to Crawl > robots.txt Tester in Google Search Console, enter the URL in the space provided, and click “Test”.
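You can also sanity-check a robots.txt locally with Python’s standard `urllib.robotparser`. The rules below are an example, not your real file; note that Python’s parser applies rules in file order (Google’s crawler uses the most specific match instead), so keep `Allow` lines above the broader `Disallow` when testing this way.

```python
from urllib.robotparser import RobotFileParser

# Example rules: block wp-content, but keep uploaded images crawlable.
robots_txt = """\
User-agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Blocked: a theme file under /wp-content/
print(parser.can_fetch("*", "/wp-content/themes/style.css"))   # False
# Crawlable: an uploaded image, explicitly allowed
print(parser.can_fetch("*", "/wp-content/uploads/photo.jpg"))  # True
```

This is handy for catching mistakes like my forgotten image-blocking rule before they sit unnoticed for months.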
4. Check the HTML source of your important posts and pages to ensure everything is right.
It’s one thing to use SEO plugins to optimize your website; it’s another to make sure they’re actually working. Viewing the HTML source is the best way to confirm that all of your SEO meta tags are being added to the right pages. It’s also the best way to spot errors that need fixing.
If you’re using a WordPress blog, you usually only need to check the following pages:
As indicated, you only need to check the source of one or two pages of each type to make sure everything is in order.
To inspect the source, do the following:
Here are a few checks you can perform:
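As an illustration, here is a hypothetical helper (standard library only; the class and function names are my own) that pulls the three tags you will most often want to verify — the title, the meta description, and the canonical URL — straight out of a page’s HTML source:

```python
from html.parser import HTMLParser

class SEOTagParser(HTMLParser):
    """Collects <title>, <meta name="description">,
    and <link rel="canonical"> from an HTML document."""
    def __init__(self):
        super().__init__()
        self.tags = {}
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.tags["description"] = attrs.get("content", "")
        elif tag == "link" and attrs.get("rel") == "canonical":
            self.tags["canonical"] = attrs.get("href", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.tags["title"] = self.tags.get("title", "") + data

def audit_seo_tags(html):
    """Return whichever of the three tags the page actually declares."""
    parser = SEOTagParser()
    parser.feed(html)
    return parser.tags
```

A missing key in the result (no canonical, say) is exactly the kind of plugin misconfiguration this check is meant to catch.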
5. Check for mobile usability errors.
Websites that aren’t responsive don’t rank well in Google’s mobile search results. And even if your site is responsive, there’s no telling what Google’s bots will see. A small change like blocking a resource can make your responsive website appear unresponsive in Google’s eyes.
So even if you believe your website is responsive, make it a practice to check whether your pages are mobile friendly or have mobile usability errors.
To do this, log in to Google Search Console and go to Search Traffic > Mobile Usability to check whether any of your pages show mobile usability errors.
You can also use the Google Mobile-Friendly Test to check individual pages.
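For a crude offline pre-check, you can at least confirm that a page declares a responsive viewport meta tag. This is only one of many signals the Mobile-Friendly Test looks at, so treat the sketch below (an illustrative helper, not a real substitute for the test) as a first filter only.

```python
import re

def has_viewport_meta(html):
    """True if the page declares a viewport meta tag,
    the minimum requirement for a responsive layout."""
    pattern = r'<meta[^>]+name=["\']viewport["\']'
    return re.search(pattern, html, re.IGNORECASE) is not None
```

Pages that fail even this check will certainly fail Google’s mobile usability checks as well.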
6. Check for render-blocking scripts.
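Render-blocking scripts are external scripts loaded in the `<head>` without the `async` or `defer` attribute: the browser has to fetch and execute them before it can paint the page, which slows down rendering. As a rough illustration (the class and function names are my own), this sketch flags such scripts in a page’s source:

```python
from html.parser import HTMLParser

class RenderBlockingScriptFinder(HTMLParser):
    """Flags external <script> tags inside <head>
    that lack both async and defer."""
    def __init__(self):
        super().__init__()
        self._in_head = False
        self.blocking = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "head":
            self._in_head = True
        elif tag == "script" and self._in_head and "src" in attrs:
            if "async" not in attrs and "defer" not in attrs:
                self.blocking.append(attrs["src"])

    def handle_endtag(self, tag):
        if tag == "head":
            self._in_head = False

def find_render_blocking_scripts(html):
    finder = RenderBlockingScriptFinder()
    finder.feed(html)
    return finder.blocking
```

The usual fixes are to add `defer` (or `async`) to those script tags, or to move non-critical scripts to the end of the `<body>`.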
7. Check and track website downtimes.
Frequent downtimes not only drive visitors away, they also hurt your SEO. That’s why it’s important to monitor your website’s uptime on an ongoing basis.
Most uptime-monitoring services will send you an email or even a mobile notification to alert you to downtime. Some also send a monthly report of how your website performed.
If you find that your site experiences frequent downtimes, it may be time to consider changing your web host.
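Most monitoring services boil their reports down to an uptime percentage. As a toy illustration of the idea (the names are my own, and a real monitor would add scheduling, retries, and alerting), the sketch below runs one probe and computes uptime over a series of recorded checks:

```python
from urllib.request import urlopen
from urllib.error import URLError

def ping(url, timeout=10):
    """One monitoring probe: True if the site answers
    with an HTTP status below 400."""
    try:
        return urlopen(url, timeout=timeout).status < 400
    except URLError:
        return False

def uptime_percentage(checks):
    """checks: list of booleans, one per probe (True = site was up)."""
    if not checks:
        raise ValueError("no checks recorded")
    return 100.0 * sum(checks) / len(checks)
```

Run `ping()` on a schedule (say, every five minutes), append the results, and `uptime_percentage()` gives you the same headline number a monthly monitoring report would.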
These are some of the most important things to check to keep your on-page SEO optimized.
Running these checks on a regular basis will ensure your on-page SEO stays on point and that your rankings aren’t being hurt without your knowledge.
Let me know what you check for when doing an SEO site audit. What do you do to make sure your on-page SEO stays optimized?