I need to start automatically tracking thousands of keywords to monitor when they hit the front page. All of the keywords are local SEO terms (location + niche).
Now, I'm nowhere near an expert on this, but I know Google has been pushing to localize everything, so different datacenters can return different results depending on the searcher's location.
I'm planning to do this scraping through proxies to be safe (all US-based).
Are there any techniques to ensure I'm getting accurate local results? Can I target specific datacenters based on the keyword's location?
Is this even a concern?
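For context, the approach I've seen suggested is to control localization in the query itself rather than trying to chase specific datacenters: disable personalization and pin the country/language via URL parameters. A rough sketch of what I mean (`build_local_serp_url` is just a hypothetical helper; the parameter names `gl`, `hl`, and `pws` are ones commonly used for this, and a full local-pack simulation would also need a location signal like Google's `uule` parameter, which I'm leaving out here):

```python
import urllib.parse

def build_local_serp_url(keyword: str, gl: str = "us", hl: str = "en") -> str:
    """Build a Google SERP URL requesting depersonalized, country-scoped
    results. Sketch only -- accurate local-pack results would additionally
    need a location-encoding parameter (e.g. uule), not shown here."""
    params = {
        "q": keyword,   # the local keyword, e.g. "plumber chicago"
        "gl": gl,       # country to geo-target results for
        "hl": hl,       # interface language
        "pws": "0",     # disable personalized results
        "num": "10",    # first page only -- front-page tracking
    }
    return "https://www.google.com/search?" + urllib.parse.urlencode(params)

print(build_local_serp_url("plumber chicago"))
```

Since the location is already baked into each keyword, this at least keeps results consistent across proxies, which is really what I'm after.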