Essential Onsite SEO Checks
When it comes to SEO, many business owners and “professional” marketers focus most of their attention on offsite SEO: keyword selection, content creation and distribution, social authority and so on. That is indeed very important, but SEO is more than that, and it is the most fundamental elements of SEO that most often get overlooked.
Onsite SEO is not glamorous. Monitoring site speed and hunting down broken links is nowhere near as interesting as researching a great blog post, creating marketing videos or snapping and sharing those great Pinterest and Instagram images. But search engines take this stuff very seriously, and bad technical onsite SEO can seriously derail all of your hard work on the offsite side.
One way to ensure that all is well is to schedule regular checkups for these oft forgotten SEO elements. How do you make that happen and what should you be checking? Consider a few of the tools and tips below.
Broken Link Checker
Broken links are not only bad for SEO; they are bad for user experience too, especially on a website that is in any way involved in retail or ecommerce. Given just how many links the average retail website contains, though, going through them all one by one every month would admittedly be a huge time sink.
Fortunately, running your site through an online utility like Broken Link Check does all of that hard work for you, displaying the exact location and nature of any problems within minutes. It can’t fix them for you, but at least you have a great place to start. And do bear in mind that links break without warning, for a huge number of different reasons, so this is a check that should be run every month or so, not just once or twice a year.
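If you like to script a quick spot check yourself, the first half of the job (pulling every link out of a page) can be sketched in a few lines of Python using only the standard library. This is an illustrative sketch, not the tool mentioned above:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag found in a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    """Return every link found in an HTML document, in page order."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links
```

From there you would pair it with `urllib.request` to fetch each collected URL and flag any 4xx or 5xx responses as broken.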
Checking Your Site’s Robots.txt
This is a check that may sound a little intimidating if you are not very code savvy. However, once again the Internet makes things far easier than you might imagine.
If you are not sure why you should care about robots and their text at all, allow us to explain. The robots.txt file controls which areas of a website Google’s searchbots (and their counterparts from other services) can and cannot access. Robots can be blocked without a website owner even being aware of it. WordPress sites, for example, are often set to block search engines while under development, and that setting has to be manually changed once the site is built and published, something that is easy to overlook.
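To make that concrete, here are two hypothetical robots.txt files (the paths are made up for illustration). The first is the “accidentally block everything” trap described above; the second only keeps crawlers out of an admin area while leaving the rest of the site open:

```
# Blocks every crawler from the entire site - the accidental setting:
User-agent: *
Disallow: /

# A healthier alternative that only fences off an admin area:
User-agent: *
Disallow: /wp-admin/
```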
Checking what is blocked (or not) on your site is easy though. All you need to do is open a new tab in any browser and type in “yourdomain.com/robots.txt”. You will then be presented with a list of any blocks, so if any part of your site that you would prefer was crawled is currently blocked, you can remedy that situation right away.
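If you have many URLs to check against those rules, Python’s standard library can read them programmatically. This sketch parses an inline rule set rather than fetching a live file, and the domain and paths are placeholders:

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# For a live site you would instead do:
#   rp.set_url("https://yourdomain.com/robots.txt"); rp.read()
# Here we parse example rules directly.
rp.parse([
    "User-agent: *",
    "Disallow: /checkout/",
])

# can_fetch() answers the same question a searchbot asks before crawling.
print(rp.can_fetch("*", "https://yourdomain.com/checkout/cart"))  # False: blocked
print(rp.can_fetch("*", "https://yourdomain.com/blog/new-post"))  # True: crawlable
```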
Checking for Crawl Errors
Once you have ensured that those little search bots have the right ‘invitations’ to crawl your site, you also need to make sure that what they are finding is actually what you want them to find. Crawl errors are pretty common, and just because everything is working when the site is first published does not mean that it will stay that way.
The most efficient way to check for crawl errors is to log into Google Search Console and make use of the ‘Fetch as Google’ tool there, which will not only point out any errors but also give you no-nonsense advice about how to fix them.
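You can also catch the most common crawl errors yourself by requesting your key URLs and watching the status codes, since the 4xx and 5xx responses are exactly what that report flags. A minimal sketch using Python’s standard library; the URLs are placeholders:

```python
import urllib.request
import urllib.error

def fetch_status(url, timeout=10):
    """Return the HTTP status code a crawler would see for this URL."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # 4xx/5xx responses raise, but still carry a code

def is_crawl_error(status):
    """4xx (not found, forbidden) and 5xx (server) codes count as crawl errors."""
    return status >= 400

if __name__ == "__main__":
    for url in ["https://example.com/", "https://example.com/old-page"]:
        if is_crawl_error(fetch_status(url)):
            print(f"{url}: fix the page or redirect it")
```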
Checking Your Site Speed
We have written about Google’s PageSpeed Insights tool several times before, but it is well worth mentioning again because it is such a useful little utility, and site speed is increasingly important, especially in the growing mobile space.
Not only will running your URL through the tool tell you immediately how fast your site runs across numerous different platforms and all of the major browsers, but it also offers, once again, easy-to-follow instructions for non-techies on fixing any issues that are slowing your site down.

Koehler Home Decor is a wholesaler of home decor accessories and unique gifts. Source quality wholesale merchandise at KoehlerHomeDecor.com and find tips for promoting your business on our blog.