SERP Monitoring Tool

omgyams — WF Premium Member (joined Aug 18, 2008, Interwebs)
I'm interested in a tool that scans predefined queries across Google, Yahoo and Bing on a schedule and emails me a report. It can be as simple as scraping the top x results and sending them to me in an email. My primary goals are monitoring a SERP for competitors/bad reviews and tracking/monitoring a SERP takeover.

Question is, does this tool already exist? If not, can anyone give me a high-level overview of what would be involved in building it? I can live with only scraping Google if that drastically simplifies the tool.
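For what it's worth, the core loop is not much code. Here's a minimal, hedged sketch: the queries, email addresses, and SMTP relay are all made-up placeholders, and the crude regex scrape of Google's HTML will get blocked quickly without proxies/delays (a real tool would use a proper parser and an API or proxy rotation). It only shows the shape: scrape top x, format a report, email it.

```python
import re
import smtplib
import urllib.parse
import urllib.request
from email.message import EmailMessage

QUERIES = ["buy blue widgets", "blue widget reviews"]  # hypothetical queries
TOP_X = 10

def scrape_top_results(query, count=TOP_X):
    """Fetch a Google results page and crudely pull out result URLs.

    Naive on purpose: no proxies, no CAPTCHA handling. Google will block
    repeated automated requests, so treat this as illustrative only.
    """
    url = ("https://www.google.com/search?q="
           + urllib.parse.quote_plus(query) + "&num=" + str(count))
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    html = urllib.request.urlopen(req).read().decode("utf-8", "replace")
    return re.findall(r'href="(https?://[^"&]+)"', html)[:count]

def format_report(results_by_query):
    """Turn {query: [urls]} into the plain-text body of the email."""
    lines = []
    for query, urls in results_by_query.items():
        lines.append("Query: %s" % query)
        for pos, url in enumerate(urls, start=1):
            lines.append("  %2d. %s" % (pos, url))
        lines.append("")
    return "\n".join(lines)

def email_report(body, to_addr="you@example.com"):  # placeholder address
    msg = EmailMessage()
    msg["Subject"] = "Daily SERP report"
    msg["From"] = "serpbot@example.com"   # placeholder sender
    msg["To"] = to_addr
    msg.set_content(body)
    with smtplib.SMTP("localhost") as server:  # assumes a local mail relay
        server.send_message(msg)

# Typical run, scheduled from cron or Windows Task Scheduler:
#   report = {q: scrape_top_results(q) for q in QUERIES}
#   email_report(format_report(report))
```

The "schedule" part is best left to cron rather than the script itself; the script stays a dumb one-shot run.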

 


I'm looking for the same thing. AuthorityLabs doesn't look like it does that. If anyone has a good solution, please let me know.
 
I've been looking for the same thing for a while: something that tracks the top 20-30 results rather than just an individual site or two. If it could track the way a SERP changes over time, show graphs of how all the sites move around, etc., I'd subscribe immediately.

If anyone has anything like that then PM me.
 

It's hard to keep things coherent and do what you want. How do you graph the top 10 results if, one month from now, it's a completely different set of sites in the top ten?

I know of one member here who was building exactly this, but he's moved on to greener pastures. Making sense of all that moving data becomes a major headache quickly.
 

It wouldn't be easy, but the data would be a lot more useful than a regular SERP scraper's. Once you've got the results (avoiding IP bans etc.), instead of looking for the position of certain sites, scrape the position/URL of them all. If a site is already in the database, just update its position; if it's a new site, create a new record (I'm no coder, as you can probably tell...). If a site that was there yesterday is no longer in the top 10, update its record to reflect that.

If the top 10 completely changes in a month, then the graph of the last 30 days would show all 20 sites at one point or another, although only 10 on any one day.

I'm imagining that after a few months you could run a query to find out which sites have gone up a certain amount, which have dropped, etc. It would be really useful after a big SERP update, especially if you were tracking a lot of keywords: instantly run a query to find out which sites have taken a hit, then look for trends.
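The upsert logic described above is simple enough to sketch. This toy version uses an in-memory dict keyed by URL (a real tool would use SQL); the function names and the "movement" query are my own illustration, not anything from an existing tool.

```python
# db maps url -> {"history": {day: position}}; stands in for a real SQL table.
db = {}

def record_serp(day, results):
    """results is an ordered list of URLs, position 1 first.

    Upsert each site: existing sites get a new position for `day`,
    new sites get a fresh record. A site that dropped out simply has
    no entry for `day`, which is how "no longer in the top 10" shows up.
    """
    for pos, url in enumerate(results, start=1):
        db.setdefault(url, {"history": {}})["history"][day] = pos

def movement(url, start_day, end_day):
    """Positive = moved up (e.g. 7 -> 3 is +4); None if unseen on either day."""
    hist = db.get(url, {}).get("history", {})
    if start_day not in hist or end_day not in hist:
        return None
    return hist[start_day] - hist[end_day]
```

After a big update you'd loop `movement()` over every tracked URL per keyword and sort: biggest negative numbers are the sites that took a hit.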
 

Just thinking out loud here, but rather than trying to graph 10 or 20 SERP positions over time, you could instead measure a SERP's overall volatility over a time period by assigning values to changes in position. Say you've got a top-10 SERP that you're monitoring over a 10-day span.

<#s pulled from ass>
Positive movement into pos. 100-11: +0.5
pos. 10: +1
pos. 9-8: +2
pos. 7-5: +4
pos. 4: +6
pos. 3: +10
pos. 2: +25
pos. 1: +40
</#s pulled from ass>

SERP A
Day 1: Site X debuts at pos. 8... net volatility = 2
Day 2: no changes
Day 3: Site Y moves from pos. 5 to 3... net volatility = 18 (2+6+10)
Days 4-8: no changes
Day 9: Site Y moves to pos. 2... net volatility = 43 (18+25)
Day 10: Site X moves to 7, Site Z debuts at 10... net volatility = 48 (43+4+1)
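The scoring above can be pinned down in a few lines. One caveat: the exact rule isn't stated, so I'm inferring it from the worked example — a move seems to add the weight of every position gained (5 -> 3 scores pos. 4 plus pos. 3), while a debut from outside the range scores only the landing position. Under that reading, a sketch that reproduces SERP A's running totals:

```python
# Position weights from the post ("#s pulled from ass").
WEIGHTS = {1: 40, 2: 25, 3: 10, 4: 6, 5: 4, 6: 4, 7: 4, 8: 2, 9: 2, 10: 1}

def weight(pos):
    """Weight for attaining a position; 11-100 are all worth 0.5."""
    if pos in WEIGHTS:
        return WEIGHTS[pos]
    return 0.5 if pos <= 100 else 0

def move_score(old_pos, new_pos):
    """Score one upward move: sum the weight of every position gained.
    A debut (old_pos=None) scores only the landing position, which is
    my inference from the worked example, not a stated rule."""
    if old_pos is None:
        return weight(new_pos)
    return sum(weight(p) for p in range(new_pos, old_pos))

total = 0
total += move_score(None, 8)                       # Day 1: X debuts at 8 -> 2
total += move_score(5, 3)                          # Day 3: Y, 5 -> 3     -> 18
total += move_score(3, 2)                          # Day 9: Y, 3 -> 2     -> 43
total += move_score(8, 7) + move_score(None, 10)   # Day 10               -> 48
```

Summing per-day scores into a single number per SERP is what makes this comparable across niches.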

This might be another aspect of niche selection. If you had to decide between two niches with close SerpIQ scores, you could weigh relative volatility in with it. In other words, do you attack SERP A with a CI of 52 if it's easier to maintain position over time despite the initial effort, vs. an easy win on SERP B with a CI of 39 where you have to worry about constant encroachment?

Like I said, just brainstorming (or talking out my ass, as the case may be).
 
I use Rank Tracker, and it's great! If you have some extra $, just run it on a Windows VPS 24x7. It can output reports to HTML, or you can remote login and check the stats.
 
Making sense of aggregate ranking data over time is a difficult problem. There are very few people involved in SEO/IM who are also expert programmers (and in my opinion, they're the only people who could handle such a project).

Knowing PHP and cURL will not deliver the greatness that such a project would require. This shit is intensive and very challenging at the resource level as well as from a how-does-this-all-make-sense angle.

Justo nailed a lot of it really, good stuff.

If I were to attack it, I would not do aggregate graphing. I'd instead make it almost a breaking-news system for the SERPs. I want to punch in a keyword and, over time, receive alerts like "URL Blah has jumped 52 spots in the last week" and "URL Meh has built the most total backlinks this month with 522".

Even that, though, would be a full-blown startup in itself, not some weekend hacker project. Collecting the data is one thing; being able to consistently make sense of it is a whole different beast.
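The rank-jump half of those alerts is just a diff between two snapshots. Here's a minimal sketch under my own assumptions: snapshots are `{url: position}` dicts for the same keyword taken a week apart, and the jump threshold is arbitrary. The backlink alerts would need a separate data source entirely, so they're not shown.

```python
def serp_alerts(old, new, jump_threshold=10):
    """Compare two {url: position} ranking snapshots and emit alert strings.

    `old` and `new` are the same keyword's rankings at two points in time.
    The threshold is arbitrary; a real system would tune it per keyword.
    """
    alerts = []
    for url, pos in new.items():
        if url not in old:
            alerts.append("%s is new to the tracked range at #%d" % (url, pos))
        elif old[url] - pos >= jump_threshold:
            alerts.append("%s has jumped %d spots (%d -> %d)"
                          % (url, old[url] - pos, old[url], pos))
    for url in old:
        if url not in new:
            alerts.append("%s has dropped out of the tracked range" % url)
    return alerts
```

Run it per keyword on each scrape cycle and pipe any non-empty result into the email report, and you've got the skeleton of the breaking-news idea.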
 
This might be another aspect of niche selection. If you had to decide between two niches with close SerpIQ scores, you could weigh relative volatility in with it. In other words, do you attack SERP A with a CI of 52 if it's easier to maintain position over time despite the initial effort, vs. an easy win on SERP B with a CI of 39 where you have to worry about constant encroachment?

The correct answer is, and always will be, BOTH