Google has once again changed its algorithm in a way that affects your CSS3 and JavaScript implementations. In effect, Google now wants to render your website in full and reacts poorly when robots.txt blocks the crawler from those resources. CSS3 and JavaScript files should also be consolidated to optimize how the site is crawled and searched.
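The fix is usually on the robots.txt side. A minimal sketch, assuming hypothetical /css/ and /js/ directories for stylesheets and scripts, might look like this:

    # Let Googlebot fetch the assets it needs to render the page
    User-agent: Googlebot
    Allow: /css/
    Allow: /js/
    # Sections you still want hidden can stay blocked
    Disallow: /private/

The point is simply not to Disallow the directories that hold stylesheets and scripts, since a blocked resource can keep Google from seeing the page the way a visitor does.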
As always with the clan that does not want to be evil, it’s a double-edged sword. On one side, it means content is better understood and more navigable. On the other, it means federation and affiliate systems suffer and are judged harshly if no extra content is provided.
The result is that content reused without modification is going to get a lower ranking. Obviously, by discouraging sites from blocking the bot, Google makes affiliate data that much easier for itself to find.
An important step, then, is to make sure a site offers its own text and commentary, so users get direct value from it rather than simply being redirected to other sites.
While this sounds perfectly reasonable, it also makes the curator’s role harder. The explanation suggests that a site could be penalized if it were, for instance, a Van Gogh virtual tour that took you to all the museums holding Van Gogh paintings and let you view the images without commentary.
Worse yet would be forcing commentary on something that might just need to be appreciated. You know, “art for art’s sake.” As YouTube’s role at Google grows, the algorithm seems to have taken a step backward.
On the bright side, you can use the Fetch as Google feature in Webmaster Tools to see how Google views your site, distinguishing between mobile and desktop rendering.
The bottom line is that paying attention to how your site is viewed by Google is going to require some active maintenance on your part… including some rewriting.
Edited by Alisen Downey