More and more web pages are becoming interactive. Even so, search engine crawlers can't interact with a page the way human visitors can.
Human vs. Crawler
With advancements in technology, web pages can now be tailored to each visitor, and many websites are taking advantage of this. But a problem arises for indexing whenever there is a step that visitors "must" take, such as closing an ad window or choosing their geographical region.
The best example I can think of is the top page of a website I visited recently, which was nothing but a site search page: a box for keywords and a search button. Unless visitors run a search, they won't see a single product on the website. Because the site's search feature is well developed, visitors can browse painlessly, but only if they already know exactly what they want. Unfortunately, search engine crawlers don't search, so only the top page of this site would ever be indexed. And since the site has no fixed URL for individual product pages, an XML sitemap couldn't be used to guide the crawler either.
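To see why the sitemap workaround fails, note that each entry in an XML sitemap must point to a stable, directly reachable URL. A site whose product pages exist only as transient search results has nothing to list. A minimal sitemap entry, with a hypothetical product URL, looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each <loc> must be a permanent URL a crawler can fetch on its own. -->
  <url>
    <loc>https://www.example.com/products/blue-widget</loc>
    <lastmod>2014-03-01</lastmod>
  </url>
</urlset>
```

If the only way to reach the blue widget is by typing "blue widget" into a search box, there is no such URL to put in the `<loc>` element, and the sitemap can't help.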
Invisible Web Pages
Entering keywords to search or closing an advertisement window are very simple tasks for human beings. But a search engine crawler, which is just a computer program, is incapable of these simple tasks. If the crawler can't see a web page, that page will never show up on search result pages. Once you have created an invisible website like the one described above, it will be extremely difficult to fix. So when you are renovating your website or creating a new one, keep crawlers' limitations in mind from the very beginning. Just one mandatory interactive feature on your site could lock out a search engine crawler completely.
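The limitation can be sketched in a few lines of code. A basic crawler discovers new pages only by extracting `href` targets from anchor tags; it never fills in a form or presses a button. The snippet below (a simplified illustration, with made-up page fragments and URLs) shows that a top page whose only navigation is a search form yields zero links for such a crawler, while the same products exposed as plain links are all discoverable:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href targets from <a> tags, the way a basic crawler
    discovers pages. Forms and buttons are ignored entirely."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Hypothetical top page whose only "navigation" is a search form.
search_only_page = """
<form action="/search" method="get">
  <input type="text" name="keywords">
  <input type="submit" value="Search">
</form>
"""

# The same products exposed through plain links at fixed URLs.
linked_page = """
<a href="/products/1">Widget</a>
<a href="/products/2">Gadget</a>
"""

for page in (search_only_page, linked_page):
    extractor = LinkExtractor()
    extractor.feed(page)
    print(extractor.links)  # [] for the form-only page
```

The form-only page leaves the crawler with an empty list of links, so from its point of view the rest of the site simply does not exist.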
© March, 2014