How Technical Website Issues Impact Your Retail Site’s SEO
Most retail website owners are at least aware of the importance of SEO in 2019. And most make an effort to boost their search engine visibility by following its tenets. But while you might be blogging, writing product descriptions with the right keywords, and alt-tagging your images, what is going on in the depths of your website’s code may be more of a problem than you know.
Google will sometimes issue a penalty for certain onsite technical issues, and that will affect your search position. A SERPS penalty for a technical issue can seem exceptionally harsh to website owners who never knew there was a problem because they aren’t ‘tech-heads’ themselves. Google, however, like the law, rarely accepts ignorance as a defense.
Technical website issues and SEO are often not talked about, but your code is the foundation of your online presence. Be sure it’s running smoothly, and you’ll stand a much better chance of lasting – and ranking well – over time.
Here’s a look at some issues that could be affecting your site right now and could lead to a Google penalty.
It should be noted that most penalties for technical issues are manual, and Google is increasingly sending out warnings when they spot problems. This is a wonderful thing, but far too many website owners ignore them because they don’t understand them.
Don’t let that be you. If you don’t know what a notice means, find someone to help you make sense of it and show you how to fix the problems before a warning becomes a penalty.
Missing Site Map
An XML sitemap is not a requirement, but there’s no reason not to create one, as it helps inform Google every time you post new content, prompting it to be crawled more quickly than otherwise.
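If you don’t already have one, a sitemap is straightforward to generate. Here’s a minimal sketch in Python using only the standard library; the URLs are hypothetical examples, and a real sitemap would list every indexable page on your site.

```python
# Minimal sketch of generating an XML sitemap for a handful of pages.
# The URLs below are hypothetical examples.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return an XML sitemap string for the given list of page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        loc = ET.SubElement(url, "loc")
        loc.text = page
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    "https://example.com/",
    "https://example.com/products/widget",
])
print(sitemap)
```

Save the output as `sitemap.xml` at your site root and submit it through Google Search Console so new content gets picked up quickly.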
Server Downtime

Servers may crash from time to time, but if the problem is not handled quickly, or if it happens regularly, the search engines notice and take it to mean neglect. They can’t keep searchers happy by sending them to a 404 page, so if your site does crash, make getting it back up and running your number one priority.
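A regular crawl of your own site helps you catch broken pages before the search engines do. Assuming you’ve already collected (URL, HTTP status) pairs from a crawl, a quick triage might look like this sketch; the sample data is hypothetical.

```python
# Sketch of triaging crawl results: given (url, HTTP status) pairs from
# a site crawl, separate pages that need urgent attention.
# The sample data below is hypothetical.
def triage(crawl_results):
    """Split crawl results into 404s (broken links) and 5xx (server errors)."""
    not_found = [u for u, s in crawl_results if s == 404]
    server_errors = [u for u, s in crawl_results if 500 <= s < 600]
    return not_found, server_errors

results = [
    ("https://example.com/", 200),
    ("https://example.com/old-product", 404),
    ("https://example.com/cart", 503),
]
broken, down = triage(results)
print(broken)  # pages to redirect or remove
print(down)    # pages needing immediate server attention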
Slow Site Speed

In 2010, Google publicly announced that they look at the speed of a site when determining where to rank it.
Over time a site can be bogged down by extra code, plugin after plugin, and large files such as images and videos. To maintain fast loading times, keep your larger files on a separate cloud-based storage server. Slow site speeds will severely affect your indexation, especially on large sites.
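Oversized assets are usually the first thing to hunt down. This sketch flags the heaviest files in a list of page assets; the ~500 KB threshold and the sample asset list are hypothetical, and in practice you’d gather real file sizes from your server or a crawl.

```python
# Sketch of flagging oversized page assets that slow load times.
# The threshold and sample asset list are hypothetical.
THRESHOLD_BYTES = 500 * 1024  # flag anything over ~500 KB

def oversized_assets(assets):
    """Return (path, size) pairs exceeding the threshold, largest first."""
    heavy = [(path, size) for path, size in assets if size > THRESHOLD_BYTES]
    return sorted(heavy, key=lambda item: item[1], reverse=True)

assets = [
    ("img/hero-banner.png", 2_400_000),
    ("img/thumb.jpg", 40_000),
    ("video/demo.mp4", 18_000_000),
]
for path, size in oversized_assets(assets):
    print(f"{path}: {size / 1024:.0f} KB")
```

Anything the check flags is a candidate for compression or for moving to external storage or a CDN.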
Bad Domain History

Sins of the past can come back to haunt you if you’ve just picked up a new domain. Even if the domain had expired, there may be spammy links pointing to it, which can create problems further down the line.
Reported to Google
Anyone can report your website as spam if they want to. This can be a genuine report, or the malicious act of a competitor who doesn’t mind playing dirty. Either way, it doesn’t make a difference who does it; just know that it can happen.
It’s nothing to worry about if your site is squeaky clean, since the report will only flag your site, not penalize it. If, however, you’ve been scraping by under the radar while quietly breaking the rules, being reported could mean you’re flagged for a review.
Hacking

Perhaps the ultimate violation in the online world, hacking becomes a greater risk the more well-known and successful your site becomes. Using a popular CMS such as WordPress without keeping your software and plugins up to date can also make you a target.
Get your site security in order and prepare for it as best you can. If you’ve already been hacked, and the hacker has left spammy links, or hidden text, you may not know about it right away.
If you’re sure you’ve stuck to the rules in every way and your penalty could only be due to an SEO attack, the only thing to do is conduct a comprehensive on-site audit.
Cloaking

Cloaking is a technique by which the search engine bots are presented with different content than what appears in the user’s browser window.
There are two types:
- IP Delivery Cloaking
Delivering different content based on IP address is something the search engines do all the time to present users with geo-targeted content. However, it can also be used to differentiate human users from search bots.
- User-Agent Cloaking
Both browsers and crawlers have unique user-agents assigned to them. This helps make websites slightly better tailored to different browsers when necessary, and of course can also be used to present totally different content to browsers and crawlers.
Cloaking is an old practice. If it is affecting your site, it was likely put in place long ago. Again, you may not be able to see it, but a site audit will track it down.
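One simple check an audit can run is to request the same URL twice, once with a normal browser User-Agent and once with a crawler User-Agent, then compare the visible text of the two responses. Fetching is out of scope in this sketch; the two HTML snippets and the 0.8 similarity cut-off are hypothetical stand-ins for responses you’d capture yourself.

```python
# Sketch of a cloaking check: compare page text served to a browser
# against text served to a crawler user-agent. The HTML snippets and
# the 0.8 threshold below are hypothetical.
import difflib
import re

def visible_text(html):
    """Crude tag-stripper: reduce HTML to a list of visible words."""
    return re.sub(r"<[^>]+>", " ", html).split()

def similarity(browser_html, bot_html):
    """Ratio in [0, 1]; values well below 1.0 suggest cloaked content."""
    return difflib.SequenceMatcher(
        None, visible_text(browser_html), visible_text(bot_html)
    ).ratio()

browser_page = "<html><body><h1>Great shoes</h1><p>Buy our shoes.</p></body></html>"
bot_page = "<html><body><h1>keyword stuffed spam</h1><p>spam spam spam</p></body></html>"

score = similarity(browser_page, bot_page)
print(f"similarity: {score:.2f}")
if score < 0.8:
    print("Possible cloaking: bot and browser see different content")
```

A low score isn’t proof on its own, since legitimate personalization also varies content, but it tells you exactly which pages deserve a closer look.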