Weird Errors – Fix Timeout Issues in CURL, PHP, and Apache.

Hitting strange errors when trying to execute long-running PHP processes, like large file reads, static HTML page generation, file uploads, or CURL calls? It might not just be bugs in your code.

Are you getting pages that seem to load, but then nothing shows up in the browser? When you go to a page, does your browser sometimes ask, "You have chosen to open something.php, which is a: PHP file. What should Firefox do with this file?" or possibly "File name: something.php File type: PHP File Would you like to open the file or save it to your computer?" Do you get internal server errors at random intervals?

Depending on what you are trying to do, you could be running into timeout issues, either in PHP, in a particular library, in Apache (or IIS or whatever web server you use), or even in the browser. Timeout issues can be a real pain because you don't run into them very often and they don't produce clear error messages.

Let's take a PHP script that makes a number of CURL calls as an example. PHP gives you access to libcurl, a really powerful tool for calling up other web pages, web services, RSS feeds, and whatever else you can dream up, right in your PHP code. This article is not a general introduction to CURL, so I won't go into detail, but basically the CURL functions allow your code to make requests and get responses from web sites just like a browser. You can then parse the results and use the data on your site.

Let's say you have a page on your site where you would like to display the latest posts from a few of your friends' websites, and they don't have RSS feeds set up. When a user comes to your site, you can make a series of CURL calls to get the data:

$curl_session = curl_init();
curl_setopt($curl_session, CURLOPT_HEADER, false);
curl_setopt($curl_session, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($curl_session, CURLOPT_RETURNTRANSFER, true);
curl_setopt($curl_session, CURLOPT_HTTPGET, true);
curl_setopt($curl_session, CURLOPT_URL, 'http://www.example.com');
$string = curl_exec($curl_session);

You can now parse the results in $string and hack out the most recent post. You would repeat these calls for each of your friends' web sites.
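If you are pulling from several sites, you can reuse the same CURL handle and just swap in each URL. Here is a minimal sketch of that loop (the URLs are made up, and parse_latest_post() is a placeholder for whatever parsing code you end up writing):

$friend_urls = array(
    'http://www.example.com/steve',  // made-up URLs for illustration
    'http://www.example.com/jill',
);

$curl_session = curl_init();
curl_setopt($curl_session, CURLOPT_HEADER, false);
curl_setopt($curl_session, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($curl_session, CURLOPT_RETURNTRANSFER, true);
curl_setopt($curl_session, CURLOPT_HTTPGET, true);

$latest_posts = array();
foreach ($friend_urls as $url) {
    curl_setopt($curl_session, CURLOPT_URL, $url);  // point the same handle at the next site
    $string = curl_exec($curl_session);
    if ($string !== false) {
        $latest_posts[$url] = parse_latest_post($string);  // your own parsing function
    }
}
curl_close($curl_session);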

You try running the page and everything seems to work at first, but then you hit reload and get some strange behavior, like the problems listed above. In the worst cases, you won't get the exact same error each time - sometimes the page will load, sometimes you'll get an empty $string or errors from CURL, sometimes a blank page will appear, and sometimes you will be asked to download the PHP file - which includes all your source code!

In this situation you could be timing out. CURL is going out to another web server, and your code has to wait for it to finish before moving on to anything else. In addition, your web server may be waiting on PHP to finish its work before sending anything to the browser.

Luckily, there are a few ways to control how long the CURL functions, PHP, and Apache will wait, and you can also do a little to ensure that the user's browser doesn't just give up.

CURL has two options worth looking at: CURLOPT_TIMEOUT and CURLOPT_CONNECTTIMEOUT. The former sets how long CURL will run before it gives up, and the latter sets how long CURL will wait just to connect to the site you want to pull data from. If you wanted to wait at most 4 seconds to connect and 8 seconds total, you would set them like this:

curl_setopt($curl_session, CURLOPT_CONNECTTIMEOUT, 4);
curl_setopt($curl_session, CURLOPT_TIMEOUT, 8);
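If you want to tell a timeout apart from some other failure, you can check curl_errno() and curl_error() after the call. A rough sketch (how you report the error is up to you):

$string = curl_exec($curl_session);
if ($string === false) {
    // libcurl reports error code 28 when an operation times out
    if (curl_errno($curl_session) == 28) {
        echo 'Request timed out: ' . curl_error($curl_session);
    } else {
        echo 'CURL error: ' . curl_error($curl_session);
    }
}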

Setting these timeouts can be very helpful if you are connecting to a large number of different web sites, or to sites that are not always available or are on slow hosts. You may wish to set the timeouts much higher, if you really need to get that data, or fairly low, if you have a lot of CURL calls and don't want PHP itself to time out. You can get an idea of how long things are taking by using curl_getinfo():

echo '<pre>';
print_r(curl_getinfo($curl_session));
echo '</pre>';
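Among other things, the array returned by curl_getinfo() includes connect_time and total_time, which you can compare directly against the limits you set above. For example:

$info = curl_getinfo($curl_session);
// connect_time and total_time are reported in seconds
echo 'Connected in ' . $info['connect_time'] . 's, finished in ' . $info['total_time'] . 's';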

PHP may also time out if it is running for too long. Luckily, you can control this to some extent by changing a setting in your php.ini or using the set_time_limit() function. If you can make changes to php.ini, it might be worth adding or adjusting the following lines:

max_execution_time = 300   ; Maximum execution time of each script, in seconds
max_input_time = 60        ; Maximum amount of time each script may spend parsing request data
memory_limit = 8M          ; Maximum amount of memory a script may consume (8MB)

If you don't have access to php.ini, you may be able to use set_time_limit() to change the max_execution_time on each page where it is needed. If you are in a shared hosting environment, don't monkey with these values too much or you might impact other users. If you raise the time limit too high, you may get an angry email from your admin. Some hosts have programs set up to look out for long-running processes and kill them - check with your admin if you raise the time limit and the script still dies an early death.
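As a minimal sketch, assuming five minutes is enough for the page that makes the CURL calls (note that set_time_limit() has no effect when PHP is running in safe mode):

// allow this script up to 300 seconds of execution time
set_time_limit(300);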

Your web server (Apache is used for this example) may also be running into timeout issues. If you have access to your httpd.conf, changing the timeout is pretty easy:

Timeout 300

Unfortunately, not everyone will be able to edit their httpd.conf and this is not something you can add to an .htaccess file to change for just the scripts in a particular directory. Luckily we can work around this limitation, so long as we are sending the webpage to the user in parts, rather than waiting for the entire PHP script to execute and then sending the response.

How do we do it? First, make sure mod_gzip is turned off in an .htaccess file:

mod_gzip_on no
mod_gzip_item_include mime ^text/.*
mod_gzip_item_exclude mime ^image/.*$

Mod_gzip is a great way to reduce bandwidth use and increase site performance, but it waits until PHP has completed executing before zipping and sending the web page to the user.

Second, take a look at your PHP code and make sure you are not output buffering the whole page, including output buffering to send gz-encoded (gzipped) output. Output buffering can give you a lot of control, but in this case it can cause problems. Look for something like this:

ob_start();
// ...
// a whole ton of time-consuming code here
// ...
ob_flush(); // or possibly ob_end_flush();
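If you need to keep output buffering around for some other reason, one option is to flush both PHP's output buffer and the buffer behind it whenever you want data to go out, instead of waiting until the end. A sketch, assuming nothing else (like mod_gzip) is buffering further downstream:

ob_start();
echo "Loading Steve's page ...";
// ... time-consuming code ...
ob_flush(); // push the contents of the output buffer along
flush();    // then tell PHP to send them to the web server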

Finally, if you have a number of time-intensive sections in your code, you can force some data out to the browser to keep Apache going and help make sure the browser doesn't lose interest either. It might look something like this:

echo "Loading Steve's page ..."; // ... // a time-consuming CURL call // ... //do a flush to keep browser interested... echo str_pad(" Loaded. ",8); sleep(1); flush(); echo "Loading Jill's page ..."; // ... // a time-consuming CURL call // ... echo str_pad(" Loaded ... ",8); sleep(1); flush();

The flush() function is the main trick - it tells PHP to send out what it has generated so far. The str_pad() and sleep() calls might not be necessary in this case, but the general idea is that some browsers need a minimum of 8 bytes to start displaying and the delay from the sleep(1) call seems to make IE happy.

This technique is not just useful for getting around timeout problems; it can also be used on long pages to give the user something to start looking at while the rest of the data loads. Also, some browsers might not handle content served as XML incrementally, in which case you might want to serve it as text/html:

header("Content-Type: text/html");

Hopefully this will help you track down those nasty timeout-related bugs. Have questions or some other tips? Post in the comments below.

  1. Great! Some must-have knowledge for beginners in web programming! Thanks.

    Maruf Hossain
    May 26th, 2007 at 6:03 am
  2. Thank you!!!
    Quality piece of high interest for those of us who cross paths with such tricky bugs; sincerely, thank you!!!

    El Demonio
    October 8th, 2007 at 3:40 pm
  3. Thanks a lot, this helped me run giant SQL scripts that import 1000s of products into my webstore without running into timeouts :)

    Karel Boek
    November 27th, 2007 at 3:55 am
  4. After doing all this, the only thing that really solved the problem was setting:

    IdleTimeout 3000

    in the httpd.conf file.
    Hope this helps somebody!

    Fernando
    February 6th, 2008 at 3:24 pm
  5. I have a script which is experiencing the same problems that you have described. It works fine without any errors until you increase the number of CURL requests; then it seems to give up halfway through processing the list of requests. I will be trying these suggestions first thing tomorrow when I get to work! Thanks for your help

    steve simpson
    September 10th, 2008 at 3:53 pm
  6. Hi

    I am executing a PHP script in which a lot of CURL calls (around 1500 in a loop) are made, which in turn access memcache.

    My problem is that my script is dying in the middle, and the Apache error logs are not showing anything either. My PHP timeout is set to 30 seconds. I am accessing my PHP file from the command line using CURL.

    Any help will be greatly appreciated.

    Thanks

    Anshul
    July 15th, 2009 at 2:52 am
  7. Flush is the key for this, but how am I going to flush in the middle of the CURL call?

    I am using CURL to tunnel files bigger than 50 MB, and I recently discovered that Apache is limiting it; after downloading for around 6 minutes, the file transfer through CURL just stops.

    All those variables in my php.ini like “max_input_time” are high, so that's not the problem; Apache is, but sadly I can't change the Apache configuration.
    Anyway, if I could, which line would I change?
    (Timeout? or KeepAliveTimeout?)

    I found some CURL features like “CURL_PROGRESSFUNCTION” that may be used to flush, but they are only available in PHP 5.3. Is there any other way I can flush the buffer without stopping the CURL call?

    If you have any solution, please email me (anderseta@gmail.com).

    Thanks!

    Seta
    January 10th, 2010 at 12:36 pm
  8. I'm also dealing with timeout problems executing ffmpeg from a PHP script. I'm using shell_exec and it works fine for small video files, but if the input file is larger than 20Mb I get a 500 Internal Server Error. The PHP script is being executed from a cronjob using lynx; is there any workaround for this issue? I've tried using max_execution_time and max_input_time with ini_set from my PHP script, but the result is the same…

    Plank
    January 11th, 2011 at 9:56 pm
