Google throws a wrench in Market Samurai's business:

I imagine you don't have to do the scraping server-side even if you want to serve the content from there. It's a bit more complex, and you have to make sure you don't get fed malicious data, but if your users are willing to put your code on their PCs, why not use those resources?
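For the malicious-data problem, one obvious guard: hand the same keyword to several clients and only store a rank that most of them agree on. A rough Python sketch of that idea, with every name (validate_rank and so on) hypothetical, not taken from any real tool:

from collections import Counter

def validate_rank(reports: list[int], min_reports: int = 3) -> int | None:
    """Accept a client-reported rank only when most submissions agree.

    `reports` holds the rank each independent client claims to have
    scraped for the same keyword/URL pair. Returns the agreed rank,
    or None when there are too few reports or no clear majority.
    """
    if len(reports) < min_reports:
        return None
    rank, votes = Counter(reports).most_common(1)[0]
    # Require a strict majority so a single malicious client
    # can't poison the stored data.
    return rank if votes > len(reports) / 2 else None

print(validate_rank([4, 4, 4, 17]))  # -> 4 (one bad actor outvoted)
print(validate_rank([4, 9, 17]))     # -> None (no consensus, discard)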


The server can use proxies; if it wants, it can use 3-4 different IPs and cross-reference the stats to make sure the data is the same, and if not, average it, whatever.
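A minimal sketch of that cross-referencing, assuming the Python `requests` library. The proxy addresses are placeholders, and parse_rank is deliberately left unimplemented, since real SERP parsing depends on Google's markup and changes often:

import requests

# Placeholder proxy pool; a real deployment would use actual paid proxies.
PROXIES = ["http://10.0.0.1:8080", "http://10.0.0.2:8080", "http://10.0.0.3:8080"]

def parse_rank(html: str, site: str) -> int:
    raise NotImplementedError  # placeholder: extract site's position from SERP HTML

def check_rank(keyword: str, site: str) -> float | None:
    ranks = []
    for proxy in PROXIES:
        try:
            resp = requests.get(
                "https://www.google.com/search",
                params={"q": keyword},
                proxies={"http": proxy, "https": proxy},
                timeout=10,
            )
            ranks.append(parse_rank(resp.text, site))
        except requests.RequestException:
            continue  # one dead proxy shouldn't kill the whole check
    if not ranks:
        return None
    if len(set(ranks)) == 1:
        return ranks[0]             # all IPs agree: trust the value
    return sum(ranks) / len(ranks)  # they disagree: average, as suggested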

Definitely beats 1000 users scraping the same keywords. Also, many SEOs (me included) get OCD over rank checking, checking the same rank several times a day. There is no need for that, but the truth is the same query could be run 10-15 times a day by the same user.
 



MS users run the app on their machines but use MS's (very limited) proxies to make queries; it's a very poor architecture. Why not cache data and avoid redundant queries? Add to this the use of scripting languages like AS3 + JavaScript, and the picture is clear.
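Even on the client, a dirt-simple cache would kill most of that redundancy. A tiny sketch: memoize rank checks per calendar day, so the same user asking for the same keyword 10-15 times triggers only one real query. check_rank_uncached is a hypothetical stand-in for whatever the app actually does; nothing here is from Market Samurai's code:

import functools
from datetime import date

def check_rank_uncached(keyword: str, site: str) -> int:
    raise NotImplementedError  # placeholder: the one real scrape/query

@functools.lru_cache(maxsize=4096)
def _check_rank_on(keyword: str, site: str, day: str) -> int:
    return check_rank_uncached(keyword, site)

def check_rank(keyword: str, site: str) -> int:
    # Putting today's date in the cache key makes entries expire
    # naturally at midnight, so ranks still refresh daily.
    return _check_rank_on(keyword, site, date.today().isoformat())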
 
Market Samurai is a flawed business model, and it abuses Google too much.

They need a server-side solution. I completely stand by Google on this: what is the fucking point in 1000 users doing rank checking for the same keyword every day?

It bogs down Google's servers, and I bet if any one of you were running Google, you'd ban these dickwads straight away. A server-side solution would just scrape the SERP for that keyword once per day. Simples. The results are kept in an archive by the rank tracker, and everyone who wants to rank-track their site for that keyword gets the data from that archive.
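A toy sketch of that archive flow, with sqlite3 standing in for whatever datastore a real rank tracker would use, and scrape_serp as a hypothetical placeholder for the single daily fetch:

import sqlite3
from datetime import date

db = sqlite3.connect("serp_archive.db")
db.execute(
    "CREATE TABLE IF NOT EXISTS serps (keyword TEXT, day TEXT, html TEXT, "
    "PRIMARY KEY (keyword, day))"
)

def scrape_serp(keyword: str) -> str:
    raise NotImplementedError  # placeholder: the one real SERP fetch per day

def get_serp(keyword: str) -> str:
    """Return today's SERP for `keyword`, scraping at most once per day.

    The first user asking for a keyword triggers the single daily
    scrape; every later request, from any user, is served from the
    archive, so 1000 users cost Google one query instead of 1000.
    """
    today = date.today().isoformat()
    row = db.execute(
        "SELECT html FROM serps WHERE keyword = ? AND day = ?",
        (keyword, today),
    ).fetchone()
    if row:
        return row[0]
    html = scrape_serp(keyword)
    db.execute("INSERT INTO serps VALUES (?, ?, ?)", (keyword, today, html))
    db.commit()
    return html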

This post shows a profound lack of understanding of how web apps/scraping/server-side rank tracking work. What you proposed is not realistic, for a variety of reasons:

1) The overlap of keywords from user to user is less than 1%.
2) They have a server-side solution; they just haven't scaled it appropriately.
3) When was the last time you felt that Google's servers were "bogged down" while you were just casually searching?
 
And their solution to all of their competition data needs? Use Bing. Really?
Yeah, this is my biggest issue really. The PR issue I can get around with the Firefox Seoquake plugin, although it takes a few extra minutes of course. But the SEOC coming from Bing? Fucking useless. Every potential keyword I research I have to go and double-check with a Google search, and of course it is off by a long shot.
 
Link Assistant is right; Google hasn't changed a thing.

With that being said, 6 months is just a made-up number.
I have been so screwed by Market Samurai that I don't trust Link Assistant's word, MS's, or anybody else's promises. So did MS throw a monkey wrench into the functionality to justify moving their previous customers to month-to-month? Now THAT I would believe.
 
I was just about to buy MS too...

Anyone use link-assistant tools like SEOspyglass? Are those any good and still working?

MSM looks pretty good.
I've been using SERPFOX also.