iMacros Help Desk



Tor - benefits for multi loads

Hi guerilla,

I see you use the Tor suite rather than a direct proxy.

I was wondering whether it's beneficial for sites you're hammering hard (for example creating logins and users etc.) that have a very effective IP-checking mechanism, e.g. Google Blogger.
Blogger is currently tripping me up because of that IP check: even with anon proxies, my Firefox macro keeps getting caught and the username keeps getting blocked.

Basically the script just creates Blogger accounts, but with certain proxies I pass through it seems Blogger still catches me. I've tried deleting cookies and whatever.

Would love to hear your views on using Tor with iMacros, speed-wise as well. I've heard Tor is quite slow, but I'll go check it out and report back.

Love an iMacros thread!

Thanks
 
Tor is slow, so it makes it harder to hammer sites. I usually multi-thread with Tor, running 2 or 3 macros at a time. I'm pretty sure my particular tactics are unique, so it will be different for everyone else depending on their application. I have one PC I use just to run macros on, so it doesn't affect my other browsing.

I don't use Tor for Blogger. Google is aware of Tor IPs. Most of the Alexa top 100 are.

I use LFE (see my sig) to create Blogger accounts.
 
I've created a bunch of macros to register on content submission sites like HubPages, Squidoo, etc. It goes to the registration page, fills out the details, prompts for the captcha (which I enter manually), clicks submit, then goes to the next site and does the same again, continuing until all the sites are done.

The problem is that sometimes I enter the captcha wrong and the macro just keeps running, so that site ends up not getting registered. I figured the best fix might be to add a WAIT statement, so that if I enter it wrong I can quickly re-enter the captcha and click submit manually before it moves on to the next site. Is that the best way to handle it, or is there a better way?

Cheers :)
 
WAIT sucks for this sort of thing.

You want to use PAUSE or !SINGLESTEP

PAUSE - iMacros
!SINGLESTEP - iMacros

!SINGLESTEP, iirc, can be turned on within the code to create a step-through sequence, then turned off to return the script to running normally:

Do some stuff with the form
PROMPT to complete captcha
SET !SINGLESTEP YES
Submit
(((STEP WAIT)))
Go to next form
SET !SINGLESTEP NO

I haven't tested this, but I think it will do what you want it to do.

You can also use PAUSE.

Do some stuff with the form
PROMPT to complete captcha
Submit
PAUSE
Go to next form


But WAIT is a hassle, because it is timed. So you have to rush it if you get the CAPTCHA wrong, and when you get the CAPTCHA right, you will have to sit through the WAIT counter.
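
Here's a rough, untested sketch of the PAUSE version in actual iMacros syntax. The URL, form name and field attributes are just placeholders; use whatever the recorder gives you for each site.

Code:
' go to the registration page (placeholder URL)
URL GOTO=http://www.example.com/register
' fill out the form (placeholder form/field names)
TAG POS=1 TYPE=INPUT:TEXT FORM=NAME:signupForm ATTR=NAME:username CONTENT=myusername
TAG POS=1 TYPE=INPUT:TEXT FORM=NAME:signupForm ATTR=NAME:email CONTENT=me@example.com
' reminder to type the captcha into the page by hand
PROMPT Enter<SP>the<SP>captcha<SP>in<SP>the<SP>browser<SP>then<SP>click<SP>OK
' submit the form
TAG POS=1 TYPE=INPUT:SUBMIT FORM=NAME:signupForm ATTR=*
' the macro halts here until you hit the continue button, so if the
' captcha was wrong you can fix it and resubmit by hand first
PAUSE
' next site carries on from here
URL GOTO=http://www.example.org/register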

hth
 
Awesome! Yeah, I figured there would be problems with the WAIT command: either having to rush to enter the captcha, or, if I entered it correctly, having to sit and wait for it to continue.

The PAUSE command looks like it will work perfectly for this, and it's a very simple addition.

Cheers dude, that's sorted that problem for me :)

Hmm, let's keep this thread alive, shall we? I love learning about this stuff; anything to make my life easier, lol.

The next thing I'm going to learn is how to automate email confirmation link clicking. I've gathered that I need to use wildcards to let iMacros recognise the link, so that shouldn't be a problem. Just a couple of questions on this:

1.) Am I right that iMacros recognises each email based on the subject line, so the actual position of each email won't matter? (If I have 20 confirmation emails from 20 different sites, arriving in a different order each time, it won't make a difference as long as the subject line is the same each time.)

2.) Are there any problems with spam folders? For example, if an email from HubPages goes to the spam folder one time, I set up a macro to log in, go to the spam folder and then click the link. What happens if the next time I register on HubPages the email goes to the inbox instead? I can see problems arising there, and it's not under my control.

I did a quick test today and it seemed to go OK. From what I can see, it runs by looking for the subject line I clicked during the test, then clicks the confirm link (which I can set up with wildcards in future, along the lines of the sketch below), then goes and looks for the next subject line, and so on. So the only possible problems I can see are:
* The sites change the subject line of their confirmation emails
* The emails randomly go to the inbox or the spam folder (iMacros won't know where to look)
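
Just to show what I mean by the wildcards, this is roughly what I'm picturing for the email step (the subject text and link address are made up):

Code:
' open the confirmation email by matching part of its subject line
TAG POS=1 TYPE=A ATTR=TXT:*Confirm<SP>your<SP>HubPages<SP>account*
' click the confirmation link inside the email by matching part of its address
TAG POS=1 TYPE=A ATTR=HREF:*hubpages.com/confirm*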

Cheers man, great thread!
 
Great thread guys. Would love to contribute (and hopefully I will) but atm I'm here to ask for help.

I have an iMacros sign-up script that reads from a CSV. Example:
Code:
TAG POS=1 TYPE=INPUT:TEXT FORM=NAME:registrationForm ATTR=ID:registerUserName CONTENT={{!COL1}}

After each loop the script goes through, is there a way to save the respective data (e.g. the info used in that loop to sign up) to a new file?

That way I can save all the logins, passwords, etc. associated with the accounts to a new file.

Muchas gracias for any assistance.
 
Yeah, you need to use the EXTRACT command.

Here is some sample code from another macro I wrote:

Code:
VERSION BUILD=6240709 RECORDER=FX
SET !TIMEOUT 60
SET !DATASOURCE melba.csv
SET !DATASOURCE_COLUMNS 6
SET !DATASOURCE_LINE {{!LOOP}}
SET !EXTRACT_TEST_POPUP NO
SET !ERRORIGNORE YES
SET !VAR1 *
TAB T=1
URL GOTO={{!col3}}
ADD !EXTRACT {{!col1}}
ADD !EXTRACT {{!col2}}
TAG POS={{!col4}} TYPE={{!col5}} ATTR={{!VAR1}} EXTRACT=TXT
ADD !EXTRACT {{!NOW:mm/dd/yyyy<SP>hh:nn}}
SAVEAS TYPE=EXTRACT FOLDER=* FILE=melba.csv
WAIT SECONDS=30

Ok, I am using the same name for my output and input, but they are stored in different folders, so I didn't care.

Line by line:

set the timeout to 60 seconds
set my datasource
set the total # of columns in the datasource
read the datasource line matching the current loop number
suppress the extraction test popup <<<< IMPORTANT
ignore errors so one failed tag doesn't stop the macro
ignore the SET !VAR1 * bit, it didn't work the way I wanted
first tab
go to the URL in col3
add col1 and col2 of my source file into my extraction array (they will be comma separated)
ignore this line, I was trying to feed it dynamic variables to extract a value
add the date into my extraction array
save my extraction array to a file
wait 30 seconds before looping again
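
Applied to your sign-up macro, it would go something like this (untested; the datasource, password field and output file names are just examples):

Code:
SET !DATASOURCE signups.csv
SET !DATASOURCE_COLUMNS 2
SET !DATASOURCE_LINE {{!LOOP}}
SET !EXTRACT_TEST_POPUP NO
TAB T=1
' your existing sign-up tags
TAG POS=1 TYPE=INPUT:TEXT FORM=NAME:registrationForm ATTR=ID:registerUserName CONTENT={{!COL1}}
TAG POS=1 TYPE=INPUT:PASSWORD FORM=NAME:registrationForm ATTR=ID:registerPassword CONTENT={{!COL2}}
' copy the username and password used on this loop into the extraction array
ADD !EXTRACT {{!COL1}}
ADD !EXTRACT {{!COL2}}
' append them as one comma-separated line to a file in your downloads folder
SAVEAS TYPE=EXTRACT FOLDER=* FILE=created-accounts.csv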

Hope that helps!
 
guerilla FTW

+rep

and thanks for the speedy response.
 
Here is something I whipped up tonight to check Alexa rankings for a research project.

Just feed it a text file with a list of root URLs (without the http:// or www.) from your datasources directory, and name the output file, which will be saved in your downloads directory. The output is a two-column CSV with the domain checked and its Alexa traffic rank.

I was going to do a UBot but time was tight and I was working on another web automation project at the time.

Code:
VERSION BUILD=6240709 RECORDER=FX
SET !TIMEOUT 60
SET !DATASOURCE list-of-urls.txt
SET !DATASOURCE_COLUMNS 1
SET !DATASOURCE_LINE {{!LOOP}}
SET !EXTRACT_TEST_POPUP NO
TAB T=1
URL GOTO=http://www.alexa.com/siteinfo/{{!col1}}
ADD !EXTRACT {{!col1}}
TAG POS=1 TYPE=DIV ATTR=CLASS:data<SP>* EXTRACT=TXT
SAVEAS TYPE=EXTRACT FOLDER=* FILE=output-filename.csv
WAIT SECONDS=45
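
For reference, list-of-urls.txt is just one root domain per line, something like this (made-up examples), and you play the macro in loop mode with the loop count set to the number of lines in the file:

Code:
example.com
another-example.org
somesite.net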

Enjoy.