PHP + hosting multitasking issue

enderz

New member
Jan 13, 2009
Hey...

I have created a simple script called sim.php to demo my problem:
PHP:
<?php

$fl = $_GET['file'];

$f = fopen($fl, 'w');

for ($i = 0; $i < 120; $i++)
{
    fwrite($f, $i);
    sleep(1);
}

fclose($f);

?>
So if I run this: sim.php?file=test1
a file called test1 will be created, and it will be updated every second for 2 minutes, right?

Now I open 10 browser windows and run one of these in each:
mydomain.com/sim.php?file=test1
mydomain.com/sim.php?file=test2
mydomain.com/sim.php?file=test3
.
.
mydomain.com/sim.php?file=test10

What I would expect is to see 10 files (test1..test10), but I see only 6 (test1..test6). After a while (2 minutes, actually), when the test1 script finishes, test7 is created, and so on.

So I guess that somewhere, somehow, it's defined that each script can run at most 6 times at the same time. How can I increase that?
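One thing worth ruling out, since the thread never names it: most browsers cap the number of simultaneous connections to a single host (commonly around 6), so tabs or windows pointed at the same domain will queue on the client side no matter what the server allows. A quick way to take the browser out of the picture is to fire the requests from a shell instead; a sketch, assuming curl is installed and using the placeholder domain from the post:

```shell
# Launch all ten requests in parallel from the shell, where no
# per-host browser connection cap applies. Adjust the URL for your server.
for i in $(seq 1 10); do
  curl -s -m 5 "http://mydomain.com/sim.php?file=test$i" > /dev/null 2>&1 &
  echo "launched request $i"
done
wait  # block until all ten background requests have finished
```

If all ten files appear on the server when launched this way, the "6 at a time" limit was the browser's, not the host's.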

Thx!
 


Can you explain better why you need to do that, and what kind of hosting you have? It could be one of two things: a lack of memory for every PHP script to run, or the maximum execution time for any script. PHP will drop a script if it doesn't finish within a certain time frame.

Have you bothered to look at the error log?

I think you need to figure out how to do what you want more intelligently, even if that means not relying on PHP to do it.
 
Can you explain better why you'd need to do that, and what kind of hosting you have.

Sure. I use GoDaddy's VPS. I have a script that scans my DB, selects a record, and then does some heavy stuff (scraping, creating autoblogs, etc.); it takes 1-4 hours to finish each record.

So I run the script several times (locking each handled record, of course), but 6 simultaneous runs are not enough (I have thousands of records to process).

It could be two things, lack of memory for every php script to run, and the maximum execution time for any script.

It can't be the memory, since the sim.php I wrote to debug this issue is extremely simple and should consume minimal memory. The max execution time is set to unlimited.
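For completeness, both settings can be printed from inside a running script; a minimal sketch using standard PHP functions (ini_get, set_time_limit, php_sapi_name):

```php
<?php
// Print the limits this script actually runs under.
set_time_limit(0);  // 0 = remove the execution time limit for this request
echo 'max_execution_time: ', ini_get('max_execution_time'), "\n";
echo 'memory_limit: ', ini_get('memory_limit'), "\n";
echo 'sapi: ', php_sapi_name(), "\n";  // e.g. cgi-fcgi under FastCGI
?>
```

Note that under FastCGI the web server can impose its own request timeout on top of PHP's, so "unlimited" in php.ini is not always the whole story.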

Have you bothered to look at the error log?

Why, of course; I wouldn't bother you without checking the basics first... (hmm... at least the basics I know of ;) ).

I think you need to figure out how to do what you want, more intelligently, even if that means not relying on PHP to do it.

Well... my "machine" is already coded in PHP (it took me several months to write and debug, and I'm quite an experienced programmer), so I'd be a bit suicidal if I had to code it in another language...
 
If I knew more specifically what the script was doing, rather than your example of multiple writes, I could help pinpoint the issue, and maybe check your PHP configuration too. Otherwise there's not much I can help you with. (And I kind of doubt that whatever you wrote took months if it's too simple to use memory; you'd be surprised, especially if you have a memory leak.)

Also, processing a bunch of things is best done strictly on the server side, rather than browser-initiated. A very experienced programmer would more likely write a bash script run by a cron job at the designated time, sending the PHP file directly to the PHP parser on the server itself; that way you don't have to worry about something as simple as client-side timeouts. It would also technically be more practical to use something like Python or Perl on the server side to process the data if you've already got it collected into records.
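The cron-driven approach described above could be sketched like this; the schedule, paths, and script name are placeholders, not details from the thread:

```
# Hypothetical crontab entry (installed via `crontab -e`): run the worker
# through the PHP CLI every night at 02:00, with no browser involved.
# /usr/bin/php and the file paths are example values.
0 2 * * * /usr/bin/php /home/user/worker.php >> /home/user/worker.log 2>&1
```

Because each CLI run is a separate server-side process, client-side timeouts never enter the picture.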
 
Well... it's always possible to optimize any piece of code, but for my current issue it doesn't matter what or how my original script does things, since sim.php reproduces the same problem!

I can't run it 10+ times simultaneously (or any other PHP script, for that matter); after 6 executions the rest sit in some queue waiting their turn... I guess it's a general problem, not something specific to my script.
 

Are you using mod_php or FastCGI?

Also, if you've only tried via a web browser, have you tried sending the script directly to the PHP parser?

Also, are you using a control panel such as cPanel/WHM?
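For context on why the first question matters: under FastCGI, the web server's process manager caps how many PHP workers can run at once, which could itself produce a hard ceiling on simultaneous scripts. With Apache's mod_fcgid, for example, the caps come from directives like these (real mod_fcgid directive names; the values are only examples, and older releases spell them MaxProcessCount and DefaultMaxClassProcessCount):

```apache
# httpd.conf sketch - mod_fcgid process limits (example values)
FcgidMaxProcesses 64            # global cap on FastCGI worker processes
FcgidMaxProcessesPerClass 20    # cap per script/wrapper class
```

mod_php, by contrast, runs inside the Apache worker itself, so its concurrency is bounded by Apache's MaxClients setting instead.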
 
Please check how PHP is set up on the server.

Load up a file with only this in it and run it:

Code:
<?php phpinfo(); ?>

See what's configured.

It'll tell you what memory limit is active; the default is anywhere between 5 MB and 25 MB, depending on the host. The timeout is usually 10-60 seconds, also host-dependent.

Both of those could hurt your performance, as could the PHP version.

If you are doing heavy scraping, you'll want to use cURL. That will improve performance, but again, it needs to be enabled.
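A minimal cURL fetch, for reference; the curl_* calls are PHP's standard cURL binding, and the URL is just a placeholder:

```php
<?php
// Fetch one page with cURL and return the body as a string.
$ch = curl_init('http://example.com/page-to-scrape');  // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // return body instead of printing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);  // follow HTTP redirects
curl_setopt($ch, CURLOPT_TIMEOUT, 30);           // don't hang forever on a dead host
$html = curl_exec($ch);
if ($html === false) {
    echo 'cURL error: ', curl_error($ch), "\n";
}
curl_close($ch);
?>
```

If phpinfo() shows no "curl" section, the extension isn't compiled in or enabled, and these calls will be undefined.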
 
Are you using mod_php or FastCGI?

FastCGI.

Also if you only tried via a webbrowser, have you tried directly sending the script to the php parser?

I run it through a cron job.

Also are you using a control panel such as Cpanel/Whm?

Sure am, why?

And thx for ur help man!



@Rexibit - do you really think I'd post here without checking the basic stuff in my php.ini first?? Plus, if you'd read the thread, you'd have seen the timeout was already discussed...
 
There's also a max execution time you may be running up against.

This sounds like something that would be better written as a script and run from cron or the command line... You can still do it in PHP; you're just not running it from a browser anymore.