So I was reading through a PHP/MySQL book I have and ran across the touch() function. According to the book, calling touch() on a file updates its last modified and last accessed times.
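Just so we're on the same page, this is what I understand the book to mean (the path here is made up):

    <?php
    // touch() with no timestamp argument bumps the file's mtime (and atime) to "now".
    $file = '/var/www/html/cards/visa.php'; // hypothetical file on my site

    if (touch($file)) {
        clearstatcache(); // PHP caches stat info, so clear it before re-reading
        echo 'New mtime: ' . date('Y-m-d H:i:s', filemtime($file)) . "\n";
    }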
Along the same lines, I've heard that Googlebot's spiders check the lastmod time, and if it's been a while or it's the same as the last time they checked, they won't re-spider the page, which could lead to your page being considered "stale" and possibly have negative effects on your stats, etc.
Now, what would stop someone, like... I don't know... me, from setting up a cron job to run a PHP script that goes through a sitemap file and touch()es the file behind every URL in it to update the lastmod time?
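Something like this rough sketch is what I have in mind, assuming the sitemap sits in the web root and every &lt;loc&gt; URL maps directly onto a file under that root (site URL and paths are made up):

    <?php
    // Hypothetical cron script: read sitemap.xml, map each URL back to a
    // local file path, and touch() it so its mtime looks fresh.
    $webRoot = '/var/www/html'; // assumed document root

    $sitemap = simplexml_load_file($webRoot . '/sitemap.xml');

    foreach ($sitemap->url as $entry) {
        $loc  = (string) $entry->loc;                      // e.g. http://www.example.com/cards/visa.php
        $path = $webRoot . parse_url($loc, PHP_URL_PATH);  // -> /var/www/html/cards/visa.php

        if (is_file($path)) {
            touch($path); // update the file's last modified time to now
        }
    }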
I ask because I run a credit card site, and the credit cards don't get updated that often, so this could possibly get me around the "stale" pages problem...