Technical Guidelines for Webmasters

According to our updated policy, you should allow Googlebot access to the JavaScript, CSS, and image files used by your pages. This allows your site to be rendered and indexed optimally. Disallowing the crawling of JavaScript or CSS files in your site's robots.txt file directly harms how well our algorithms render and index your content, and can result in lower rankings.
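A common cause of this problem is a robots.txt rule that blocks entire asset directories. A minimal sketch (the directory paths here are hypothetical, not taken from any real site):

```
# Hypothetical robots.txt. Rules like the commented-out lines below would
# prevent Googlebot from fetching stylesheets and scripts, and should be
# removed or narrowed so pages can be rendered correctly.
User-agent: *
# Disallow: /css/   <- would block stylesheets needed for rendering
# Disallow: /js/    <- would block scripts needed for rendering
Disallow: /private/
```

Blocking genuinely private areas is fine; the point is only that the resources a page needs in order to render must remain crawlable.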
In the past, Google's indexing systems resembled old text-based browsers such as Lynx, and our Webmaster Guidelines said as much. Now that indexing is based on page rendering, our indexing systems are no longer comparable to a text-based browser; instead, they behave more like a modern web browser.
Like modern browsers, our rendering engine may not support all of the technologies used on a page. Make sure your web design follows the principles of progressive enhancement: if certain web features are not supported, our systems (and many browsers) can still identify usable content and basic functionality.
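As an illustration of progressive enhancement (a sketch only; the file name is hypothetical), the core content lives in plain HTML and scripts merely enhance it:

```html
<!-- Core content is complete, plain HTML, readable without CSS or JavaScript. -->
<article id="story">
  <h1>Article headline</h1>
  <p>The full article text is present in the markup itself.</p>
</article>
<!-- enhance.js (hypothetical) adds optional behavior on top, e.g. interactive
     widgets; crawlers and older browsers still see the complete article above. -->
<script src="/js/enhance.js" defer></script>
```

The inverse pattern, an empty page whose content only appears after scripts run, is what degrades when a feature is unsupported.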
Pages that render quickly not only give users faster access to your content, they also make indexing those pages much more efficient. You can check a website's performance with various tools available on the web. Among the best practices for optimizing page performance, the main point to keep in mind is to avoid unnecessary downloads.

Tests and Troubleshooting

Optimize your CSS and JavaScript files by merging them and by configuring your web server to serve them compressed (usually with gzip compression). Make sure your server can handle the additional load of serving JavaScript and CSS files to Googlebot. Along with the introduction of rendering-based indexing, we have also updated the Fetch as Google feature in Webmaster Tools so that webmasters can see how our systems render a page. This makes it possible to identify a number of indexing problems: invalid robots.txt restrictions, redirects that Googlebot cannot follow, and much more.
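The effect of compression is easy to check locally before configuring the server. A minimal sketch using the standard gzip tool (the sample file and its contents are made up for the demonstration):

```shell
# Create a repetitive sample CSS file, then compare raw vs gzip-compressed size.
printf 'body { margin: 0; padding: 0; }\n%.0s' $(seq 1 200) > sample.css
raw=$(wc -c < sample.css)
gzip -kf -9 sample.css          # -k keeps the original, -9 is maximum compression
gz=$(wc -c < sample.css.gz)
echo "raw: ${raw} bytes, gzipped: ${gz} bytes"
```

Text assets such as CSS and JavaScript typically shrink dramatically under gzip, which is why serving them compressed reduces both user-perceived load time and crawl cost.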