Curl, fopen and saving images


BluuueJammm · Apr 19, 2008
Ok, so I've been using cURL and PHP for a couple of months now and I'm getting used to scraping stuff. Thing is, when I come to scrape images I grab them with cURL, then open a file handle with fopen(), then write the file and close the handle. Is this the best way to be doing it? It just feels a bit long-winded.
 


You could use
file_put_contents('image.jpg', file_get_contents('http://www.someimage.com/image.jpg'));
 
Thx for the replies.

Don't forget to set CURLOPT_BINARYTRANSFER.

Hmmm, I've just read up a little on this and it seems like I should be using it when I'm grabbing files with curl, thing is, I've not been using it and everything seems to have been working fine. Anyways, I'll start using it and see if there is any noticeable difference.
 
I have an idea for a scraper script I wanna try to make. I'm still really new to curl though so here's a question for all you curl guys; is it possible to have curl only fetch images of a certain dimension? For example, get all images from a URL that are 200x200 pixels in size?
 
I would think not.

The image size is stored in the file itself, not in the HTML... which means you've got to download the file before you can decide whether to keep or delete it.

::emp::
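To make that concrete, here's a rough sketch of the download-then-check approach. The URL and the 200x200 target are made up for illustration, and getimagesizefromstring() needs PHP 5.4+ (on older versions, write to a temp file and use getimagesize() instead):

```php
<?php
// Return true if the raw image bytes match the wanted dimensions.
function is_wanted_size($data, $wantWidth, $wantHeight)
{
    $info = @getimagesizefromstring($data);
    if ($info === false) {
        return false; // not a valid image
    }
    // $info[0] is width, $info[1] is height
    return $info[0] === (int)$wantWidth && $info[1] === (int)$wantHeight;
}

// Usage sketch (placeholder URL):
// $data = file_get_contents('http://www.example.com/some-image.jpg');
// if ($data !== false && is_wanted_size($data, 200, 200)) {
//     file_put_contents('kept/some-image.jpg', $data);
// } // otherwise just don't save it - nothing to delete from disk
```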
 
here's a condensed version:

$img = file_get_contents('http://www.yahoo.com/image.jpg') or die('Could not grab the file');
$fp = fopen('folder/whateveryouwanttosaveimageas.jpg', 'w+') or die('Could not create the file');
fputs($fp, $img) or die('Could not write to the file');
fclose($fp);
unset($img);

wget might be a better option if you don't need to do stuff like scan the page's HTML with a regex
 
I have an idea for a scraper script I wanna try to make. I'm still really new to curl though so here's a question for all you curl guys; is it possible to have curl only fetch images of a certain dimension? For example, get all images from a URL that are 200x200 pixels in size?

You have to retrieve the image, then determine its size.
 
here's a condensed version:

$img = file_get_contents('http://www.yahoo.com/image.jpg') or die('Could not grab the file');
$fp = fopen('folder/whateveryouwanttosaveimageas.jpg', 'w+') or die('Could not create the file');
fputs($fp, $img) or die('Could not write to the file');
fclose($fp);
unset($img);

wget might be a better option if you don't need to do stuff like scan the page's HTML with a regex

$img = file_get_contents('http://www.yahoo.com/image.jpg') or die('Could not grab the file');
file_put_contents('/path/to/image.jpg',$img);

Double post FTW.
 
I have an idea for a scraper script I wanna try to make. I'm still really new to curl though so here's a question for all you curl guys; is it possible to have curl only fetch images of a certain dimension? For example, get all images from a URL that are 200x200 pixels in size?

You can use a regex to grab images whose width and height are defined in the HTML.
Alternatively, you can call getimagesize() directly on a remote URL.
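As a rough sketch of the regex route (a hypothetical helper, not from this thread): regexes on HTML are brittle, and this only finds images whose dimensions are declared as tag attributes, but it avoids downloading anything:

```php
<?php
// Return the src of every <img> tag in $html whose width/height
// attributes match the target. Fragile by nature - attribute order
// and quoting vary, and many pages don't declare dimensions at all.
function find_sized_images($html, $width, $height)
{
    $urls = array();
    if (preg_match_all('/<img[^>]+>/i', $html, $tags)) {
        foreach ($tags[0] as $tag) {
            if (preg_match('/width=["\']?(\d+)/i', $tag, $w)
                && preg_match('/height=["\']?(\d+)/i', $tag, $h)
                && (int)$w[1] === (int)$width && (int)$h[1] === (int)$height
                && preg_match('/src=["\']?([^"\'\s>]+)/i', $tag, $src)) {
                $urls[] = $src[1];
            }
        }
    }
    return $urls;
}
```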
 
here, this is from my functions class

private function curl_get_image($ch, $url, $ref)
{
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_REFERER, $ref);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // return the body instead of printing it
    curl_setopt($ch, CURLOPT_BINARYTRANSFER, 1);
    curl_setopt($ch, CURLOPT_ENCODING, 'gzip,deflate');
    curl_setopt($ch, CURLOPT_HTTPHEADER, array(
        "Pragma:",
        "Accept: image/png,image/*;q=0.8,*/*;q=0.5",
        "Accept-Language: en-us,en;q=0.5",
        "Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7",
        "Keep-Alive: 300"
    ));
    $rawImage = curl_exec($ch);
    /*
    $fout = fopen("/an/absolute/path/system/captchas/test".rand(1,40).".png", 'w');
    fwrite($fout, $rawImage);
    fclose($fout);
    */
    return $rawImage;
}

you can then file_put_contents() the bulk of what it returns. keep in mind you need a cURL handle that's already been curl_init()'d, and you have to pass it to this function as well.
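For example, the calling pattern might look like this (a sketch: it assumes the method above is made callable from outside the class, and the URL and filename are placeholders):

```php
<?php
// Set up the handle once, reuse it for many requests, close it at the end.
$ch = curl_init();
curl_setopt($ch, CURLOPT_TIMEOUT, 10); // don't hang forever on a dead host

// Hypothetical call - $scraper is an instance of the functions class,
// with curl_get_image() made public (it's private as posted):
// $img = $scraper->curl_get_image($ch, 'http://www.example.com/img.png',
//                                 'http://www.example.com/');
// if ($img !== false) {
//     file_put_contents('saved.png', $img); // save the returned bytes
// }

curl_close($ch);
```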
 