
The Best Firefox Plugins and Extensions

Firefox is a great web browser. If nothing else, the large number of people switching from Internet Explorer to Firefox convinced Microsoft to finally update IE. When Firefox added inline spell checking in version 2.0, it boosted the writing quality of every blogger, wiki contributor, and forum poster on the Internet. What more can you ask for? Actually, the best thing about Firefox is its extensibility. Anyone with some programming skill and some free time can add features and functionality by building plugins and extensions. There are well over 2000 extensions listed, so where do you start? You can find a lot of "top 10" lists around the web, but I thought I'd add my two cents as well. Here is a list of some of the best Firefox extensions.

1. Adblock Plus - Adblock is a controversial choice because it allows users to block the advertisements that many websites rely on for income. This website, for example. But again and again I find myself thanking the Flying Spaghetti Monster for Adblock. Some sites fill their pages with Flash-based ads that flash, flutter, crawl across the page, and so on. Those are the ads I inevitably block.

2. StumbleUpon - StumbleUpon lets you channel surf the web. Click the Stumble button and you'll get a new web site - give it a thumbs-up or a thumbs-down and StumbleUpon will suggest sites more to your liking. I should warn you, though, that this extension is very addictive and a terrible time-waster.

3. Procrastato - Now that I've ruined your productivity with StumbleUpon, I'll give you a little back. Procrastato watches for notorious time-wasting sites like Digg, MySpace, and YouTube and reminds you every few minutes to get back to work.

4. Firebug - If you are a web developer and you don't use some combination of these next four plugins, you might as well be writing code blindfolded. Firebug lets you inspect pages to find troublesome elements and edit HTML, CSS, and JavaScript inline.

5. Web Developer - The Web Developer Toolbar isn't quite as powerful as Firebug, but it has some nice features that are easy to get to in a pinch. For example, you can resize the browser window to make sure your site still works at 800x600. You can also kill all CSS styles, which actually makes MySpace tolerable.

6. Tamper Data - If you ever run into a tricky HTTP header problem, or want to see what is taking so long to load on a site, Tamper Data is the tool for you.

7. User Agent Switcher - You don't need to be a web developer to appreciate this add-on. There are still a lot of sites out there with buggy old code that looks for a certain version of IE and locks you out otherwise. Use User Agent Switcher to tell the site that Firefox is IE, and 99 times out of 100 everything runs perfectly well.

8. Bookmarks - I don't know about you, but I have been building my bookmarks list for 10 years, exporting and importing from one browser version to the next. The list is now way too large to be usable; this plugin makes my bookmarks taggable and searchable and integrates them back into the browser.

9. SiteAdvisor - I was a little worried when SiteAdvisor was bought by McAfee, since I'm not a huge fan of their anti-virus suite. But SiteAdvisor remains an absolutely necessary tool on the wild web. When you do a Google search, you'll see little green checkmarks next to well-behaved sites and red X's next to spammers and spyware purveyors. Go install this on your mom and dad's computers today.

Did I miss any? Let me know about your favorite extensions below.

Weird Errors – Fix Timeout Issues in CURL, PHP, and Apache.

Hitting strange errors when trying to execute long-running PHP processes, like large file reads, static HTML page generation, file uploads, or CURL calls? It might not just be bugs in your code.

Are you getting pages that seem to load, but then nothing shows up in the browser? When you go to a page, does your browser sometimes ask, "You have chosen to open something.php, which is a PHP file. What should Firefox do with this file?" or possibly "File name: something.php. File type: PHP File. Would you like to open the file or save it to your computer?" Do you get internal server errors at random intervals?

Depending on what you are trying to do, you could be running into timeout issues - in PHP, in a particular library, in Apache (or IIS or whatever web server you use), or even in the browser. Timeout issues can be a real pain because you don't run into them very often and they don't produce clear error messages.

Let's take a PHP script that makes a number of CURL calls as an example. PHP gives you access to libcurl, a really powerful tool for calling up other web pages, web services, RSS feeds, and whatever else you can dream up, right in your PHP code. This article is not a general introduction to CURL, so I won't go into detail, but basically the CURL functions allow your code to make requests and get responses from web sites just like a browser does. You can then parse the results and use the data on your site.

Let's say you have a page on your site where you would like to display the latest posts from a few of your friends' websites, and they don't have RSS feeds set up. When a user comes to your site, you can make a series of CURL calls to get the data:

$curl_session = curl_init();
curl_setopt($curl_session, CURLOPT_HEADER, false);        // don't include response headers in the output
curl_setopt($curl_session, CURLOPT_FOLLOWLOCATION, true); // follow any redirects
curl_setopt($curl_session, CURLOPT_RETURNTRANSFER, true); // return the page as a string instead of printing it
curl_setopt($curl_session, CURLOPT_HTTPGET, true);        // plain GET request
curl_setopt($curl_session, CURLOPT_URL, '');              // the original URL was stripped; put your friend's site here
$string = curl_exec($curl_session);


You can now parse the results in $string and hack out the most recent post. You would repeat these calls for each of your friends' web sites.
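What the parsing step looks like depends entirely on each friend's markup. As a hedged sketch, suppose the newest post title sits in an `<h2 class="post-title">` tag - the class name and the helper function here are assumptions for illustration, not a standard API:

```php
// Hypothetical helper: pull the first post title out of fetched HTML.
// The "post-title" class is an assumption -- adjust the pattern to match
// whatever markup your friends' sites actually use.
function extract_latest_post($html) {
    if (preg_match('/<h2 class="post-title">(.*?)<\/h2>/s', $html, $matches)) {
        return trim($matches[1]);
    }
    return null; // no recognizable post found
}

$sample = '<h2 class="post-title"> My Trip to the Zoo </h2><p>...</p>';
echo extract_latest_post($sample);
```

In practice you would run something like this against each $string that curl_exec() returns, and fall back gracefully when the pattern doesn't match.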

You try running the page and everything seems to work at first, but then you hit reload and get some strange behavior, like the problems listed above. In the worst cases, you won't get the same error each time - sometimes the page will load, sometimes you'll get an empty $string or errors from CURL, sometimes a blank page will appear, and sometimes you will be asked to download the PHP file - which includes all your source code!

In this situation you could be timing out. CURL is going out to another web server, and your code has to wait for it to finish before moving on to anything else. In addition, your web server may be waiting on PHP to finish its work before sending anything to the browser.

Luckily, there are a few ways to control how long the CURL functions, PHP, and Apache will wait, and you can do a little to ensure that the user's browser doesn't just give up either.

CURL has two options worth looking at: CURLOPT_TIMEOUT and CURLOPT_CONNECTTIMEOUT. The former sets how long CURL will run before it gives up, and the latter sets how long CURL will wait just to connect to the site you want to pull data from. If you wanted to wait at most 4 seconds to connect and 8 seconds total, you would set them like this:

curl_setopt($curl_session, CURLOPT_CONNECTTIMEOUT, 4);
curl_setopt($curl_session, CURLOPT_TIMEOUT, 8);

This can be very helpful if you are connecting to a large number of different web sites, or to sites that are not always available or are on slow hosts. You may wish to set the timeouts much higher if you really need that data, or fairly low if you have a lot of CURL calls and don't want PHP itself to time out. You can get an idea of how long things are taking by using curl_getinfo():

echo '<pre>';
print_r(curl_getinfo($curl_session)); // includes total_time, connect_time, and more
echo '</pre>';
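Putting the pieces together, you could wrap the whole fetch in a small helper that applies both timeouts and fails cleanly instead of hanging. The function name and default values here are just illustrative conventions, not a standard API:

```php
// Hypothetical wrapper: fetch a URL but give up cleanly on timeouts.
// Returns the page body as a string, or null if the request failed
// (including error 28, CURLE_OPERATION_TIMEDOUT).
function fetch_with_timeout($url, $connect_secs = 4, $total_secs = 8) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $connect_secs);
    curl_setopt($ch, CURLOPT_TIMEOUT, $total_secs);
    $body = curl_exec($ch);
    curl_close($ch);
    if ($body === false) {
        return null; // timed out or failed; the caller can skip this site
    }
    return $body;
}
```

With a wrapper like this, a friend's site that is down costs you at most a few seconds instead of stalling the whole page.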

PHP may also time out if it is running for too long. Luckily, you can control this to some extent by changing a setting in your php.ini or using the set_time_limit() function. If you can make changes to php.ini, it might be worth adding or adjusting the following lines:

max_execution_time = 300 ; Maximum execution time of each script, in seconds
max_input_time = 60      ; Maximum amount of time each script may spend parsing request data
memory_limit = 8M        ; Maximum amount of memory a script may consume (8MB)

If you don't have access to php.ini, you may be able to use set_time_limit() to change the max_execution_time setting on each page where it is needed. If you are in a shared hosting environment, don't monkey with these values too much or you might impact other users. If you raise the time limit too high, you may get an angry email from your admin. Some hosts have programs set up to look out for long-running processes and kill them - check with your admin if you raise the time limit and the script still dies an early death.
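As a hedged sketch of the per-script approach: set_time_limit() restarts the timer whenever it is called, so you can also give each slow step a fresh budget inside a loop. The $friend_urls list is hypothetical.

```php
// Give this one script up to five minutes, rather than raising
// max_execution_time for every script on the server.
set_time_limit(300);

// set_time_limit() restarts the timer, so inside a loop each slow
// step can get its own fresh budget. The URLs here are placeholders.
$friend_urls = array('http://example.com/steve', 'http://example.com/jill');
foreach ($friend_urls as $url) {
    set_time_limit(60); // restart the clock: 60 more seconds for this fetch
    // ... time-consuming CURL call for $url ...
}
```

One design note: restarting the clock per iteration means a page with ten friends never needs a single huge limit, just a sane one per fetch.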

Your web server (Apache is used for this example) may also be running into timeout issues. If you have access to your httpd.conf, changing the timeout is pretty easy:

Timeout 300

Unfortunately, not everyone will be able to edit their httpd.conf and this is not something you can add to an .htaccess file to change for just the scripts in a particular directory. Luckily we can work around this limitation, so long as we are sending the webpage to the user in parts, rather than waiting for the entire PHP script to execute and then sending the response.

How do we do it? First, make sure mod_gzip is turned off in an .htaccess file:

mod_gzip_on no
mod_gzip_item_include mime ^text/.*
mod_gzip_item_exclude mime ^image/.*$

Mod_gzip is a great way to reduce bandwidth use and increase site performance, but it waits until PHP has completed executing before zipping and sending the web page to the user.

Second, take a look at your PHP code and make sure you are not output buffering the whole page, including using output buffering to send gz-encoded (gzipped) output. Output buffering can give you a lot of control, but in this case it can cause problems. Look for something like this:

ob_start();
// ...
// a whole ton of time-consuming code here
// ...
ob_flush(); // or possibly ob_end_flush();
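If a framework or an auto_prepend file may have started a buffer before your code runs, one defensive sketch is to flush and close every open buffer before the slow section begins:

```php
// Flush and close any output buffers that are already open, sending
// their contents along, so that later calls to flush() can actually
// reach Apache and the browser.
while (ob_get_level() > 0) {
    ob_end_flush();
}
// From here on, echo followed by flush() goes straight to the client.
```

This is heavy-handed - it discards any buffering the rest of the page might have wanted - so use it only around the slow section that needs it.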

Finally, if you have a number of time-intensive sections in your code, you can force some data out to the browser to keep Apache going and help make sure the browser doesn't lose interest either. It might look something like this:

echo "Loading Steve's page ...";
// ...
// a time-consuming CURL call
// ...
// do a flush to keep the browser interested
echo str_pad(" Loaded. ", 8);
sleep(1);
flush();

echo "Loading Jill's page ...";
// ...
// a time-consuming CURL call
// ...
echo str_pad(" Loaded ... ", 8);
sleep(1);
flush();

The flush() function is the main trick - it tells PHP to send out what it has generated so far. The str_pad() and sleep() calls might not be necessary in this case, but the general idea is that some browsers need a minimum of 8 bytes before they start displaying anything, and the delay from the sleep(1) call seems to make IE happy.
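If you repeat that echo/pad/sleep/flush dance for every site, you could fold it into a tiny helper. The function name is my own invention, and the 8-byte pad just follows the rule of thumb above:

```php
// Hypothetical helper: push a progress message to the browser right away.
// Pads to at least 8 bytes so picky browsers start rendering, waits the
// requested number of seconds, then flushes PHP's output to the server.
function send_progress($message, $pause = 1) {
    echo str_pad($message, 8);
    sleep($pause);
    flush();
}

send_progress("Loading Steve's page ...", 0);
// ... time-consuming CURL call ...
send_progress(" Loaded. ", 0);
```

The $pause argument lets you drop the delay entirely once you have confirmed your target browsers render without it.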

This technique is not just useful for getting around timeout problems; it can also be used on long pages to give the user something to start looking at while the rest of the data loads. Also, some browsers might not handle content served as XML incrementally - in that case, you might want to serve it as text/html:

header("Content-Type: text/html");

Hopefully this will help you track down those nasty timeout-related bugs. Have questions or some other tips? Post in the comments below.