Will making my AJAX app serve HTML snapshots improve SEO?

Problem :


Currently, a small AJAX web app of mine uses an anchor-hijacking method to display content. The user clicks an anchor, I cancel the default browser event and funnel the link into an AJAX call, then retrieve the content and insert/animate it in.
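For reference, the pattern described above usually looks something like the following sketch. The fragment URL scheme (`/fragments/*.html`) and the `#content` container are illustrative assumptions, not the asker's actual setup.

```javascript
// Map an internal link to the endpoint that returns its bare content fragment.
// (Hypothetical convention: "/about" -> "/fragments/about.html".)
function toFragmentUrl(href) {
  const path = new URL(href, "https://example.com").pathname;
  return "/fragments" + (path === "/" ? "/index" : path) + ".html";
}

// Browser-only wiring: intercept clicks on internal links, cancel the
// default navigation, and fetch/insert the content instead.
if (typeof document !== "undefined") {
  document.addEventListener("click", (event) => {
    const anchor = event.target.closest("a[href^='/']");
    if (!anchor) return;
    event.preventDefault(); // cancel the default browser event
    fetch(toFragmentUrl(anchor.getAttribute("href")))
      .then((res) => res.text())
      .then((html) => {
        // Insert (and optionally animate) the retrieved content.
        document.querySelector("#content").innerHTML = html;
        history.pushState(null, "", anchor.getAttribute("href"));
      });
  });
}
```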



A person without JavaScript enabled, or a crawler, would follow the anchor and land on a barebones version of the content: the raw fragment, ready to be retrieved and inserted into the main site. So the content is still visible to Google, just not really fit for human consumption (and if it does get indexed, that's fine, because I could redirect visitors to a different, acceptable human-ready page). That's OK with me, but only if it doesn't affect SEO negatively compared to:



HTML snapshots, where I reprogram the application to work with the browser and serve up a version of the site for each piece of content. Instead of seeing the barebones fragment every time Google requests a link, the crawler would see what the content plus surrounding page would look like had a user carried out that operation.



The problem is that the latter is fiddly and annoying to reprogram, and I'd like to know whether there's any significant improvement in SEO that would warrant the extra time and money spent reprogramming it.



https://developers.google.com/webmasters/ajax-crawling/



Link to Google's article on making AJAX web applications crawlable, for reference.


Solution :

We did something like this on a one-page site, where each section of the site was served up via AJAX and cacheable at its own URL. We did this by using the #! hashbang method and the escaped-fragment meta tag to let Google know about the section pages. When the bot hit our site with the _escaped_fragment_ query string, we served it only that section of the page.
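The URL mapping behind that scheme is mechanical: the crawler rewrites a `#!` URL into an `_escaped_fragment_` query string before fetching. A sketch of that rewrite (the scheme itself has since been deprecated by Google):

```javascript
// Rewrite "example.com/#!reviews" into
// "example.com/?_escaped_fragment_=reviews", the way the crawler did.
function toEscapedFragmentUrl(pageUrl) {
  const [base, fragment] = pageUrl.split("#!");
  if (fragment === undefined) return pageUrl; // no hashbang: leave untouched
  const sep = base.includes("?") ? "&" : "?";
  return base + sep + "_escaped_fragment_=" + encodeURIComponent(fragment);
}
```

Serving a distinct response at each rewritten URL is what let each section be indexed as its own "page".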



It was a nice, cheap way to get a single-page site to show up as a series of small pages on Google. As for SEO, we saw no measurable direct benefit other than getting more pages indexed (each with different schema markup). The added schema per page let us show stars under our reviews section and a video thumbnail under the video section.



We have since shut down the site, because it was a one-pager meant to sell a product (a vitamin) we no longer sell. But to answer your question: it's not worth the effort for SEO alone; it's worth the effort for the enhanced listings you get if you combine it with Schema.org markup.

