I'm developing a dedicated web-based automation API

NathanRidley (London, UK)
I'm developing a full-featured automation API for you guys and anyone else who wants to use it. It's a full web-based REST API with the ability to store and run your own scripts, proxies, CAPTCHA details and so forth. Scripting provides access to a modern Chrome browser implementation with screenshot-taking, native (real) clicks, text typing and so forth. There's a great facility for extracting pretty much whatever content you want from wherever you want, and you can have it automatically send you HTTP push notifications on completion. There's a lot more coming for it, but right now I'm rushing to get the first version out for you guys to use, and I'm going to need some testers.
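To give you a rough idea of what a stored script could look like, here's a quick sketch. Don't take the object or method names literally; they're purely illustrative and the actual scripting interface may well look quite different:

```javascript
// Illustrative sketch only: object and method names are placeholders,
// not the actual scripting interface.
async function run(job, browser) {
  // Drive the hosted Chrome instance
  const page = await browser.open('http://example.com/login');

  // Native (real) typing and clicking rather than injected JS events
  await page.type('#username', job.params.username);
  await page.type('#password', job.params.password);
  await page.click('#login-button');

  // Take a screenshot and extract whatever content you're after
  const screenshot = await page.screenshot();
  const headlines = await page.extract('.headline');

  // Whatever you return here would come back with the completion notification
  return { screenshot, headlines };
}
```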

It would be handy if:
- you know some JavaScript (for the scripting)
- you have somewhere to receive push notifications (e.g. a server; see the quick sketch after this list)
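For the second point, a tiny HTTP endpoint is all you need. Here's a minimal sketch in Node; the exact notification payload format isn't final, so treat the JSON handling as a placeholder:

```javascript
// Minimal receiver for completion push notifications.
// Payload shape is a placeholder; the final format may differ.
const http = require('http');

http.createServer((req, res) => {
  let body = '';
  req.on('data', chunk => { body += chunk; });
  req.on('end', () => {
    const notification = JSON.parse(body);       // assumes a JSON body
    console.log('Job finished:', notification);  // e.g. job id, status, results
    res.writeHead(200);
    res.end('ok');                               // acknowledge receipt
  });
}).listen(8080);
```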

Also, I'd love to know what server technologies you're using for your automation, as I'd like to get a few client libraries developed for different languages such as PHP, Python, etc., depending on demand.

Here's the website, though it's just a splash page right now: http://systemizer.net

Got a lot planned for this one. Oh, one other thing: as this is a level of product which hasn't really been done in this space before (unless you count ifttt.com, which is simplistic in comparison), I haven't yet worked out what pricing I'll be looking at. Suffice to say I have already established a company for this and lined up a credit card payment processor, and the system itself is running on a dedicated server and is designed to be scaled up if needed.

I'm hoping to post some details in a few days (or early next week) for the first testers to give it a run, but note that phase 1 is all API-driven and thus has no user interface for the programmatically challenged.

Let me know if you're interested!
 


This does seem interesting; I'm in for testing it. Personally I use C#, though I'll probably start using some Python too.
 
Actually, the whole thing's written in .NET, so you'll have access to a .NET client library out of the box. I just mentioned Python and PHP because that seems to be what a lot of folks are using in these parts.
 
Cool, I'll add those to the list. The core system is coming along very well. Note that you will initially need to provide your own proxies in order to have the system make external calls, but later on I plan to provide a proxy tunnel application so the server can use your own computer's IP address as the proxy for your automation jobs.
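Purely as a sketch of what storing one of your own proxies might look like (the endpoint, field names and auth header are placeholders, not the final API):

```javascript
// Illustrative only: register one of your own proxies for your jobs to use.
// Endpoint, field names and auth scheme are placeholders, not the final API.
async function addProxyExample() {
  const res = await fetch('https://systemizer.net/api/proxies', {
    method: 'POST',
    headers: {
      'Authorization': 'Bearer YOUR_API_KEY',   // placeholder auth scheme
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({
      host: '203.0.113.10',                     // your proxy's address
      port: 8080,
      username: 'proxyuser',                    // if the proxy requires auth
      password: 'proxypass'
    })
  });
  console.log(await res.json());
}
```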
 
I should mention I originally promised an early beta within a week of the original post, but some problems with my main business, followed by Christmas, delayed it a bit. It's in progress though, and I need to use it soon to power part of my own product, so it'll be available for you guys to try out shortly. Stay tuned!
 
Put me down for the beta. I code in Python.

But, have you considered the privacy implications? You are basically asking someone to run their entire scraping through your server - that includes a LOT of stuff people would like to keep private. Keywords, private proxies, link sources, etc.

Also, maybe I'm misunderstanding, but how is this different from ruby/watir?

I'm not trying to be a hater. It sounds like a really cool idea and you seem more than capable of implementing it. I'm just pointing out some kinks I see that need to be smoothed out.
 
@chatmasta - fair points, and in reality no service suits everybody's needs. Personally, I've never really done any affiliate marketing properly, and I get a lot more satisfaction out of building the pickaxes for the gold miners than actually mining the gold, so to speak.

I am very open to any suggestions that would help people using the API feel more confident in the privacy of their data, beyond my simple word that I won't reveal the data to anyone or pry into it myself other than for API diagnostics and debugging. Realistically though, the idea is that you'll use the API and then delete your data and jobs once they're completed and you've pulled the results back locally. The API will automatically delete any data you've finished with after 30 days or so, though you can shorten or lengthen that expiry on a per-job basis.

Also, regarding the difference between this and Ruby/Watir: there's a lot more infrastructure surrounding web-based automation here than you'll get with Watir. For example, with Systemizer you'd be able to set a delayed job to process on a given date and then, if it completes correctly, queue up an HTTP push notification and another job to run on a later date. I suppose the main point to take away is that Watir isn't really "designed for" affiliate marketing, scraping, automated posting, etc., so you need to build a lot of your own code around it to make it work the way you need it to, whereas this is as simple as making a REST call and forgetting about the nitty-gritty. I'll be able to explain a lot more about what you can do with it soon.
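Purely as an illustration (the endpoint path and field names are made up for the example, not the final API), that kind of call might look something like this:

```javascript
// Illustrative only: schedule a delayed job, request a push notification on
// completion, and chain a follow-up job. Endpoints and fields are placeholders.
async function scheduleExample() {
  const res = await fetch('https://systemizer.net/api/jobs', {
    method: 'POST',
    headers: {
      'Authorization': 'Bearer YOUR_API_KEY',        // placeholder auth scheme
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({
      script: 'scrape-headlines',                    // a script you've stored previously
      runAt: '2012-02-20T09:00:00Z',                 // delayed execution date
      notifyUrl: 'http://your-server.example/hook',  // push notification target
      onSuccess: {                                   // queue another job on success
        script: 'post-summary',
        runAt: '2012-02-27T09:00:00Z'
      }
    })
  });
  console.log(await res.json());                     // assumed to return a job id/status
}
```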
 
I have an automated API for a few tasks: a rich API for SEO stats, proxies with XML output, and fully automated rankings, and I can build rich iPhone/iPad/BlackBerry/Android frontends... let me know if you want to partner up!
 
I'll have a butcher's at the beta. I mainly use PHP, but I can do Python-ish... and I know JavaScript pretty well.

Sounds very interesting if you can pull it off ;)
 
Cool, good to see more people interested. It's definitely happening, folks, but I will be away between the 15th of Jan and the 8th of Feb, so you won't hear anything from me during that time. I'll be back on it as soon as I return.
 
I'm still tuned in for this, but I have a question: does this API work solely via your servers? It wouldn't be cool if I deployed a whole bunch of scrapers and they stopped working one day due to downtime on your servers...