Last week, Google made an official statement saying it no longer recommends the AJAX crawling proposal it made back in 2009. Google's deprecation of its AJAX crawling scheme doesn't just mean enhanced experiences for users; it also means greater transparency for Transifex users who are interested in sharing digital content with a global audience.
In the early 1990s, most websites were based entirely on HTML pages. When a user took any action, such as clicking from one page to the next, all the content had to be re-sent and the entire page reloaded from the server. This was the case even when only a small portion of the page's information had changed. As you might imagine, constantly reloading pages placed extra strain on the server, used excessive bandwidth, and caused slow load times.
Understanding AJAX Crawlability
In the above-mentioned 2009 proposal, Google described how search engines would crawl AJAX-based pages so that web developers could make the appropriate adjustments and ensure their content was rendered correctly and indexed.
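As a bit of context on how that proposal worked: it asked sites to use "hash-bang" URLs (`#!`), which crawlers would rewrite into a special `_escaped_fragment_` query parameter that the server could answer with a pre-rendered HTML snapshot. Below is a minimal sketch of that URL mapping; the function name is ours, but the rewrite rule follows the published scheme.

```python
from urllib import parse

def escaped_fragment_url(url: str) -> str:
    """Rewrite a hash-bang URL into the crawler-friendly form from
    Google's 2009 AJAX crawling proposal: the fragment after '#!'
    becomes an '_escaped_fragment_' query parameter."""
    base, sep, fragment = url.partition("#!")
    if not sep:
        return url  # no hash-bang fragment; nothing to rewrite
    # The scheme requires the fragment value to be percent-encoded.
    encoded = parse.quote(fragment, safe="")
    joiner = "&" if "?" in base else "?"
    return f"{base}{joiner}_escaped_fragment_={encoded}"

print(escaped_fragment_url("https://example.com/page#!state=profile"))
# → https://example.com/page?_escaped_fragment_=state%3Dprofile
```

A server participating in the scheme would detect the `_escaped_fragment_` parameter and serve a static HTML snapshot of the dynamic page, which is exactly the extra machinery Google's new guidance makes unnecessary.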
Google Recognizes That Times Have Changed
In today’s web-based world, developers want to make their applications as responsive as possible in order to satisfy the end user. Achieving this goal, however, has come at a cost: crawlers have been unable to see dynamically generated content. Because of this, Google says, “the most modern applications are also the ones that are often the least searchable.”
Google’s announcement is great news for Transifex Live users! Although translated website content generated through Transifex Live was already crawlable by search engines under the AJAX crawling specification, Google’s reversed stance makes it even easier for the translated versions of your website to be indexed.