(PHP 4, PHP 5, PHP 7)
fpassthru — Output all remaining data on a file pointer
fpassthru ( resource $handle ) : int
Reads the given file pointer from its current position until EOF (end of file) and writes the result to the output buffer.
You may need to call rewind() to reset the file pointer to the beginning of the file if you have already written data to it.
If you just want to dump the contents of a file to the output buffer, without modifying it first or seeking to a particular offset, you may want to use readfile(), which saves you the fopen() call.
handle
The file pointer must be valid, and must point to a file successfully opened by fopen() or fsockopen() (and not yet closed by fclose()).
If an error occurs, fpassthru() returns FALSE. Otherwise, fpassthru() returns the number of characters read from handle and passed through to the output.
Example #1 Using fpassthru() with binary files
<?php
// open the file in binary mode
$name = './img/ok.png';
$fp = fopen($name, 'rb');
// send the right headers
header("Content-Type: image/png");
header("Content-Length: " . filesize($name));
// dump the picture and stop the script
fpassthru($fp);
exit;
?>
Note:
When using fpassthru() on a binary file on Windows, you must make sure that the file is opened in binary mode by adding a b to the mode used in the fopen() call.
You are encouraged to use the b flag when dealing with binary files, even if your system does not require it, so that your scripts will be more portable.
If you open a new file, write to it, and then call fpassthru(), it doesn't work. You need to call rewind() first to set the file pointer to the beginning of the file.
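A minimal sketch of that pattern (the temporary path here is just an illustration):
<?php
// Hypothetical example: write to a scratch file, then stream its contents back out.
$fp = fopen('/tmp/example.txt', 'w+');
fwrite($fp, "some generated content\n");

// Without this, fpassthru() would start at EOF and output nothing.
rewind($fp);

fpassthru($fp);
fclose($fp);
?>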
fpassthru() didn't work for me for files greater than about 5 MB. Just adding ob_end_clean() made everything work fine, including files over 50 MB.
$ToProtectedFile = $pathUnder.$filename; // path to the protected file
$handle = @fopen($ToProtectedFile, "rb");
@header("Cache-Control: no-cache, must-revalidate");
@header("Pragma: no-cache"); // keeps IE happy
@header("Content-Disposition: attachment; filename=".$NomFichier);
@header("Content-type: application/octet-stream");
@header("Content-Length: ".$SizeOfFile);
@header('Content-Transfer-Encoding: binary');
ob_end_clean(); // required here or large files will not work
@fpassthru($handle); // works fine now
If your downloaded files are getting corrupted, one of the scripts included/required in your download script or page may have whitespace around the <?php ?> tags. This is a common enough problem, most often recognized when header() fails because headers have already been sent, but one worth mentioning here.
This one bit me just recently with my download script. Somewhere along the way, while adding functionality to my website, I wound up with a space (not a blank line, which I usually spot right away, but a single space character) after the closing ?> tag in one of the require()'d files. Oddly enough, all the downloads seemed to work OK, but the files were corrupted: that space character wound up at the beginning of each file.
It is also possible to make your PHP script resume downloads. To do this you need to check $_SERVER['HTTP_RANGE'], which may contain something like this:
"bytes=10-" - resume from position 10 to the end of the file
When sending the response you also need to send these headers:
Accept-Ranges: bytes
Content-Length: {filesize}
Content-Range: bytes 10-{filesize-1}/{filesize}
Hope it's useful.
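Building on the note above, here is a rough sketch of that resume logic (simplified: only "bytes=N-" ranges are handled, multi-range requests are ignored, and the $path value is just an illustration):
<?php
$path = '/path/to/file.zip'; // assumed file to serve
$size = filesize($path);
$start = 0;

if (isset($_SERVER['HTTP_RANGE']) &&
    preg_match('/bytes=(\d+)-/', $_SERVER['HTTP_RANGE'], $m)) {
    $start = (int)$m[1]; // resume offset requested by the client
    header('HTTP/1.1 206 Partial Content');
    header("Content-Range: bytes $start-" . ($size - 1) . "/$size");
}

header('Accept-Ranges: bytes');
header('Content-Length: ' . ($size - $start));
header('Content-Type: application/octet-stream');

$fp = fopen($path, 'rb');
fseek($fp, $start); // skip to the requested offset before passing the rest through
fpassthru($fp);
fclose($fp);
?>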
This code works fine with a download manager... maybe not the best solution, but the only one that works with IE!
It forces the download, but GIF files don't want to be downloaded, so I simply display them in the browser...
NB: $file is the result of a query on the file table...
require_once("auth.inc.php");
$attachment = (strstr($HTTP_USER_AGENT, "MSIE")) ? "" : " attachment"; // IE 5.5 fix.
//Content of file
if (!headers_sent()){
$ficexp=explode('.',$file["orig_name"]);
$ext=$ficexp[sizeof($ficexp)-1];
if ($ext!='gif'){
header('Cache-Control: no-cache, must-revalidate');
header('Pragma: no-cache');
header("Content-Type: application/force-download");
header("Content-Length: ".filesize("files/".$file["save_name"]));
header("Content-Disposition: ".$attachment."; filename=".$file["orig_name"]);
}
$fn=fopen("files/".$file["save_name"], "rb");
fpassthru($fn);
}
else {
MessageBox('Headers already sent, cannot force download!');
}
Min's method above worked for me after trying everything else imaginable to get Explorer to download a file via PHP. However, I had to change the Content-Length line: there is no need to "stringify" the $size variable as in the above post. The method below works for both small and very large files (tested on files larger than 30 MB with no problems)...
<?php
$distribution = "/path/to/a/file.exe";
if ($fd = fopen($distribution, "rb")) {
$size=filesize($distribution);
$fname = basename ($distribution);
//This is some really weak code I used just to redirect to the file before I fixed
//this problem...it makes the browser handle the download via Apache instead of PHP
//but it would be really easy to then find out the true location of the file
//header("Location: $distribution");
//fclose ($fd);
//exit;
//below is a much better way to do it...
header("Pragma: ");
header("Cache-Control: ");
header("Content-type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"".$fname."\"");
header("Content-length: $size");
while(!feof($fd)) {
$buffer = fread($fd, 2048);
print $buffer;
}
fclose ($fd);
exit;
}
?>
Good luck.
Brett Brewer.
Here is a final working copy that won't freak out Microsoft Explorer if you are using sessions. Thanks to everyone else who came before. This is not as simple as I thought it would be.
The user would call the page like this:
http://mysite/getfile.php?file=products.pdf
include 'base.inc'; // include base code, start the session and manage users
// This loads the file global from the post/get variables
// For security reasons register globals is disabled
LoadPostGet('file');
$filename = '/data/files/' . $file;
if(file_exists($filename)){
$FILECMD = '/usr/bin/file';
$contentType = '';
$fp=popen("$FILECMD -bin $filename", 'r');
if (!$fp) $contentType='application/octet-stream';
else {
while($string=fgets($fp, 1024)) $contentType .= $string;
pclose($fp);
}
if(strpos($HTTP_SERVER_VARS['HTTP_USER_AGENT'], 'MSIE')){
// IE cannot download from sessions without a cache
header('Cache-Control: public');
}
header("Content-type: $contentType");
header("Content-Disposition:inline; filename=\"".$file."\"");
header("Content-length:".(string)(filesize($filename)));
$fd=fopen($filename,'rb');
while(!feof($fd)) {
print fread($fd, 4096);
}
fclose($fd);
}else{
print "File Not Found";
}
Update to the above. This also sets the correct mime type for the file you're sending. It's a small hack since it relies on the "file" system command but it should work well.
<?php
// full path to the file command
$FILECMD='/usr/bin/file';
// directory where the file resides
$fileDir='/home/mcaserta';
// full file name
$fileName='test.sh';
// END CONFIG
$completeFilePath=$fileDir.'/'.$fileName;
$contentType = '';
$fp = popen("$FILECMD -bin $completeFilePath", 'r');
if (!$fp) $contentType = 'application/octet-stream';
else {
while ($string = fgets($fp, 1024)) $contentType .= $string;
pclose($fp);
}
header('Content-type: '.trim($contentType)); // trim the trailing newline from the "file" command output
header('Content-Disposition: inline; filename="'.($fileName).'"');
header('Content-length: '.(string)(filesize($completeFilePath)));
$fd = fopen($completeFilePath, 'rb');
fpassthru($fd);
?>
Note that the above comment about the "Connection: close" header is incorrect: it does not guarantee that the connection will be closed immediately after the transfer is complete. Instead, it informs the client that it can no longer use the existing HTTP connection to perform other requests on the same server, and that it MUST close the connection as soon as it has finished handling the current request.
If the client (for example an old HTTP proxy) is using HTTP/1.0, it may not recognize this header and could keep the connection open; the web server should detect this, close the connection itself, and ignore any further request attempt on that connection.
HTTP/1.1 clients MUST honor this header and close their connection as soon as they detect the end of the response.
In any case, the web server will start a watchdog after script completion and will force the disconnection after about 15 to 30 seconds if the client does not honor this header.
The exact time to wait for the "socket closed by remote" event is configurable in the web server.
It is generally shorter when a "Connection: close" header has been sent than when it has not (in which case the connection persists longer, to let the client navigate the server without incurring new connection costs: connection delays, socket control blocks in final wait state, used ports).
Don't abuse "Connection: close" on your server for every hosted page: it creates more incoming TCP connection attempts than necessary and slows navigation on your site. Use it only if your script cannot generate an explicit content length in the response header, since the client will otherwise have difficulty determining the end of the response.
If you want to save connection resources on your server, always send an explicit "Content-Length" header from your script, or use the "chunked" transfer encoding to send the result in delimited fragments (an HTTP/1.1 client MUST support chunked transfer encoding, per the specification). See RFC 2616 for details.
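A minimal sketch of the Content-Length approach (the file path and MIME type are just illustrations):
<?php
$path = '/path/to/report.pdf'; // assumed file to serve
$fp = fopen($path, 'rb');

// An explicit Content-Length lets the client detect the end of the response
// without needing "Connection: close" or chunked transfer encoding.
header('Content-Type: application/pdf');
header('Content-Length: ' . filesize($path));

fpassthru($fp);
fclose($fp);
?>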
In relation to using sessions and fpassthru together.
Try adding: session_write_close()
somewhere near the top of the download script, before you start sending the video, and that should take care of it.
I've implemented and tested session_write_close() and it works like a dream. Other links can now be clicked and loaded whilst a big file is being passed using fpassthru.
Big thanks to Greg for this tip. What a helpful community we live in :0)
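A minimal sketch of that fix (the authentication check and file path are placeholders): release the session lock before starting the long-running output, so that other requests from the same session can proceed.
<?php
session_start();
if (empty($_SESSION['user_id'])) { // placeholder authentication check
    die('Not logged in');
}

// Release the session file lock so other pages can load for this user
// while the (possibly long) download is still in progress.
session_write_close();

$path = '/protected/videos/clip.wmv'; // assumed file below web root
header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($path));
header('Content-Disposition: attachment; filename="clip.wmv"');

$fp = fopen($path, 'rb');
fpassthru($fp);
fclose($fp);
?>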
One more small thing about ssharma's script (thanks to him for his great help):
Don't forget to call fopen() with the "rb" argument and not just "r",
or you won't be able to make the script work with every PDF file.
My final script (working for both Open and Save on a 1.9 MB complex PDF file):
<?php
//The filename is stored in the $produitFilename variable in my script (the only thing you need)
// You need to specify the REAL path for your file and not the URL
$fullPath = getcwd()."/directory_where_the_file_is/".$produitFilename;
if ($fd = fopen ($fullPath, "rb")) {
$fsize =filesize($fullPath);
$fname = basename ($fullPath);
header("Pragma: ");
header("Cache-Control: ");
header("Content-type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"".$fname."\"");
header("Content-length: $fsize");
fpassthru($fd);
}
?>
Have fun, and thanks to you all for your great help.
Simon (from Paris, France)
If you're trying to output a user-written file on a page for verifying, editing, etc., you'll want to use fopen(), fread() and htmlentities() to avoid malicious code. Text from fpassthru(), while not parsed per se, can still mess up the display of a page (or at least it did for me!) --mt.
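A minimal sketch of that approach (the file path is just an illustration):
<?php
$path = '/data/userfiles/notes.txt'; // assumed user-written file
$fp = fopen($path, 'rb');
$size = filesize($path);
$contents = $size > 0 ? fread($fp, $size) : '';
fclose($fp);

// Escape HTML special characters so the user's content cannot break the page markup.
echo '<pre>' . htmlentities($contents) . '</pre>';
?>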
If multiple levels of output buffering may be active, try running ob_end_clean() in a loop:
while (@ob_end_clean());
This helps, for example, when output is being gzip-compressed automatically.
While trying to "passthru" a file to the browser via PHP using a feof() loop, the script tried to buffer the entire file before passing it to the browser. This is my original script; when calling it with a 15 MB PHP memory limit and a 16 MB file, Apache killed the script.
<?php
$name = $tempDir . $_GET["file"];
$fd = fopen($name, 'rb');
if($fd == false)
die("<font color=red>ERROR: File not found.</font>");
// send the right headers
header("Cache-Control: ");// leave blank to avoid IE errors
header("Pragma: ");// leave blank to avoid IE errors
header("Content-type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"" . $_GET["file"] . "\"");
header("Content-length:".(string)(filesize($name)));
sleep(1);
session_write_close();
ob_flush();
flush();
while(!feof($fd)) {
$buffer = fread($fd, 2048);
print $buffer;
}
fclose ($fd);
exit;
?>
Apache error log read:
Allowed memory size of 15728640 bytes exhausted (tried to allocate 10240 bytes)
I tried everything, including a flush() inside the loop. But the solution was to force the flush another way, by reading larger chunks:
<?php
$buffer = fread($fd, 32 * 1024);
?>
voila... works just fine for me.
I believe the following problem is a result of using sessions and fpassthru together.
I have a subscription-based site which protects large video files (WMV format, between 100 and 120 MB) by storing them beneath the web root. Downloading a video file requires the user to click an HTML link which requests a PHP script, e.g. download-video.php?video_id=123. If the user is valid (session vars created from a successful login), the script then creates the necessary headers to trigger a 'Save As' download box, opens the file from beneath the web root, and sends it using fpassthru.
The problem is as follows:
The user should be able to click other links on the site whilst a file is downloading. But when they do so, the requested page won't load until the download is complete.
As this download script is a separate PHP request, the user should be able to load other pages on the site whilst the file is downloading.
At time of writing, I've tried almost everything to remove this bug. There must be a problem with using a PHP script rather than a direct web server link to download files.
I have also perused this list of examples, which I am sure work for their authors but, as others have mentioned here, do not work for me (or anyone else).
So what I did was try out all of these examples, check other sources of information, and put together what I think to be an example of what works on 'more than a few' systems. The following example works for me wherever I need to create a download using fpassthru(), which works with IE6 (among other browsers):
<?php
/*
Download a file using fpassthru()
*/
$fileDir = "/home/pathto/myfiles"; // supply a path name.
$fileName = "myfile.zip"; // supply a file name.
$fileString=$fileDir.'/'.$fileName; // combine the path and file
// translate file name properly for Internet Explorer.
if (strstr($_SERVER['HTTP_USER_AGENT'], "MSIE")){
$fileName = preg_replace('/\./', '%2e', $fileName, substr_count($fileName, '.') - 1);
}
// make sure the file exists before sending headers
if(!$fdl=@fopen($fileString,'rb')){
die("Cannot Open File!");
} else {
header("Cache-Control: ");// leave blank to avoid IE errors
header("Pragma: ");// leave blank to avoid IE errors
header("Content-type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"".$fileName."\"");
header("Content-length:".(string)(filesize($fileString)));
sleep(1);
fpassthru($fdl);
}
?>
All that should require editing is the $fileDir and $fileName variables. Upload the file and point to it with your browser to see if the script will prompt you for a download.
NOTE: Regarding file types: leaving the 'Content-type' header as-is should allow you to download pretty much any file. I have tested it on some of the more popular file types, including zip, css, php, inc, htm, png, gif and jpg. During these tests I noted that if I selected 'Cancel' or 'Open' when prompted to download a gif or jpg, it would indeed cancel or open in my image browser as it should, but subsequent attempts at 'downloading only' yielded a web page view of the image. Closing the window and opening a new one reset this, allowing me to save a jpeg or gif to the hard drive directly. I believe the problem lies in the way the caching headers are treated, since if any info is specified in the 'Cache-control' header, the browser download fails completely (in IE, anyway).
Enjoy! Mail me if it works! ;-)
Here is my code. I tried several combinations, but most of them didn't work and had all kinds of unnecessary headers in them. This one has some additional good features, such as stopping the sending of the file if the connection drops (hopefully, anyway), and it fixes IE filename problems when sending files that contain more than one dot by using a simple preg_replace (IE likes to truncate the filename and mess everything up):
<?php
function send_file($path) {
session_write_close();
ob_end_clean();
if (!is_file($path) || connection_status()!=0)
return(FALSE);
// prevent long files from getting cut off by max_execution_time
set_time_limit(0);
$name=basename($path);
// filenames containing dots will screw up the filename in IE
// unless we add this
if (strstr($_SERVER['HTTP_USER_AGENT'], "MSIE"))
$name = preg_replace('/\./', '%2e', $name, substr_count($name, '.') - 1);
// required, or it might try to send the serving document instead of the file
header("Cache-Control: ");
header("Pragma: ");
header("Content-Type: application/octet-stream");
header("Content-Length: " .(string)(filesize($path)) );
header('Content-Disposition: attachment; filename="'.$name.'"');
header("Content-Transfer-Encoding: binary\n");
if($file = fopen($path, 'rb')){
while( (!feof($file)) && (connection_status()==0) ){
print(fread($file, 1024*8));
flush();
}
fclose($file);
}
return((connection_status()==0) and !connection_aborted());
}
?>
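A possible way to call the function above (assuming send_file() is defined or included in the same script; the path is just an illustration):
<?php
// Stream a file to the client and report whether the whole transfer completed.
if (!send_file('/data/downloads/archive.zip')) {
    die('File transfer failed or was aborted.');
}
exit;
?>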
To throttle the download speed of specific files, this works fine in my board hosted on my local machine:
//#######################################
$big_file=filesize($completeFilePath)/1024; //size of file in kb
header('Content-Type: '.$mime_type);
header('Content-disposition: '.$content_disp.'filename="'.$attachment_name.'"');
header('Cache-Control: no-cache');
header('Pragma: no-cache');
header('Expires: 0');
header('Content-Length: '.(string)(filesize($completeFilePath)));
$fp=fopen($completeFilePath,'r');
while(!feof($fp)) {
$buffer = fread($fp, 1024*6); //speed-limit 6kb/s
if ($big_file>32 &&
$extension!="jpg" &&
$extension!="jpeg" &&
$extension!="gif" &&
$extension!="png" &&
$extension!="txt")
sleep(1); //if filesize>32kb and no smallfile like jpg,gif or so - wait 1 second
print $buffer;
}
fclose($fp);
header ("Connection: close");
//#######################################
I think it's the easiest way to slow down file downloads without a separate delay loop or for-next construct - it saves PHP performance and is quite exact, sending 1024*number_of_kb bytes per second...
That's all.
Greetings, omega2k.dynu.com
Here's a summary of the different headers you need to set to make downloads *always* work with IE and Mozilla:
[SNIP]
$disposition = "inline"; // "inline" to view file in browser or "attachment" to download to hard disk
$mime = "image/jpeg"; // or whatever the mime type is
$name = "foo.jpg"; // file name
$path = "/path/to/foo.jpg"; // full path and file name
if (isset($_SERVER["HTTPS"])) {
/**
* We need to set the following headers to make downloads work using IE in HTTPS mode.
*/
header("Pragma: ");
header("Cache-Control: ");
header("Expires: Mon, 26 Jul 1997 05:00:00 GMT");
header("Last-Modified: " . gmdate("D, d M Y H:i:s") . " GMT");
header("Cache-Control: no-store, no-cache, must-revalidate"); // HTTP/1.1
header("Cache-Control: post-check=0, pre-check=0", false);
}
else if ($disposition == "attachment") {
header("Cache-control: private");
}
else {
header("Cache-Control: no-cache, must-revalidate");
header("Pragma: no-cache");
}
header("Content-Type: $mime");
header("Content-Disposition:$disposition; filename=\"".trim(htmlentities($name))."\"");
header("Content-Description: ".trim(htmlentities($name)));
header("Content-Length: ".(string)(filesize($path)));
header("Connection: close");
[/SNIP]
This way all kinds of download work for me. Hope that helps
Note that if you use these two headers from a previous example:
header('Cache-Control: no-cache, must-revalidate');
header('Pragma: no-cache');
before sending a file to the browser, the "Open" option on Internet Explorer's file download dialog will not work properly. If the user clicks "Open" instead of "Save," the target application will open an empty file, because the downloaded file was not cached. The user will have to save the file to their hard drive in order to use it.
Make sure to leave these headers out if you'd like your visitors to be able to use IE's "Open" option.
A few notes on using fpassthru() to php-driven download links that pop up a "Save As.." dialog:
1. I found that the download progress dialog was remaining up for several seconds after the transfer was completed, before telling the user it was complete. This was fixed by adding the following header:
header ("Connection: close");
This will cause the connection to be closed as soon as the transfer is complete, rather than waiting for a timeout.
2. If you have multiple periods in the filename, you might wind up with a filename with numbers in brackets (such as myfile-[1][0]-windows.zip when you put myfile-1.0-windows.zip in the headers) with MSIE. According to Microsoft's KB, this is a "known" bug having to do with MSIE's cache, and there is no workaround that I was able to find.
3. Through no amount of futzing of headers was I able to get the filename to be set properly when the actual transfer was initiated via a refresh (META or via headers). I don't know if this is also an MSIE only issue or not. If 'download.php?dl=now' (for example) had a refresh back to 'download.php', such that it was intended to show some information (e.g. install instructions) as well as launch the download, then the MSIE insisted that the downloaded file was supposed to be named 'download.php?dl=now' or 'download.php', ignoring the filename in the headers.
I've tried all of these renditions of this elusive task. NONE of them have worked for me. And when I say work, I mean that I can click some sort of link and have a file Save As... dialog box come up in MSIE 6.0. Every other browser I've tried (Safari, Firebird, Netscape on PC and Mac) has worked: the file downloads to my desktop or I'm asked where to save it.
In MSIE 6.0 the file I'm trying to download appears in its own window. It's an image, but the only thing I can do with it is save it as a BMP. Ugh.
I'm using the fpassthru() function because I have files that must not be served directly by the webserver.
This might save someone some time. I created a program to list some rather large files and create links for the end user to click on in order to download them (using the php function fpassthru()).
The problem I was having was it would make it half way through the download (about 377 megs) and the script would terminate and the download would stop.
After doing some shotgun troubleshooting I discovered the php config option 'max_execution_time = 30'. Upon changing it to 'max_execution_time = -1' the files >370 megs can be downloaded without the script aborting.
Jon
Found a workaround for the MSIE cache bug that puts brackets around dotted items I posted about a while back (e.g. "somefile1.0-xyz.zip" becoming "somefile[1][0]-xyz.zip").
It turns out if you encode all but the last dot as %2e, then MSIE won't do this. If you encode all of them (including the last dot), then MSIE sticks an extra bracketed number at the end of the file (e.g. "somefile1.0-xyz.zip[1]"). Unfortunately, however, some other browsers then want to save the file with the %2e in the filename instead of the dots.
if (strstr($_SERVER['HTTP_USER_AGENT'], "MSIE"))
{
$fileName = preg_replace('/\./', '%2e', $fileName,
substr_count($fileName, '.') - 1);
}
Voila. Properly named files. This works at least with MSIE 6.0.
fpassthru() works best for small files. In download manager scripts, it's best to determine the URL of the file to download (you may generate it locally in your session data if you need to), and then use an HTTP temporary redirect (302 status code, with a "Location:" header specifying the effective download URL).
This saves your web server from keeping PHP scripts running for long periods during the file download; instead, the download is handled directly by the web server without scripting support (consequence: less memory used by parallel downloads)...
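A minimal sketch of the redirect approach (the target URL is just an illustration; this only works if the file is directly reachable by the web server):
<?php
// Redirect the client to a URL the web server can serve directly,
// so PHP does not stay busy streaming the file itself.
$downloadUrl = 'https://example.com/downloads/file.zip'; // assumed public URL
header('Location: ' . $downloadUrl, true, 302); // temporary redirect
exit;
?>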
I wrote a page which authenticates the user, then calls fpassthru() to download an Acrobat document. It worked great up to about 1MB, but for larger files, the script was dying in the middle. My ISP told me they were killing my script because it was a memory hog. I tried readfile() instead, to no avail.
I replaced the fpassthru() with this workaround. It works great:
while(!feof($fn)) {
$buffer = fread($fn, 4096);
print $buffer;
}
The way the PHP page is generated (buffered or not, and if buffered, how) has an impact on a download function built using fpassthru() (or fread(), ...). I mean that a download function may work just fine when it is called from a simple PHP file (no buffering here):
<?php
function download($file) { ... }
$filename = "/tmp/test.zip";
download($filename);
?>
but may fail "in real life" when the page is buffered:
<?php
ob_start("ob_gzhandler");
...
require_once("download.php");
...
$filename = "/files/file.zip";
download($filename);
?>
In my particular case, only Firefox 1.0 English did not perform the download, because of the ob_start("ob_gzhandler"). Replacing it by ob_start() solved the problem.
Hope that helps
Laurent from Paris, France
I've modified the example given by straz at -removethispart-mac dot com to count out each byte of the file. This can then be compared with the filesize once the sending is complete, to determine whether the file was sent successfully or not.
Of course, this doesn't guarantee that the user actually received the file successfully, but it will let us know if something goes wrong halfway through reading/sending the file at our end.
<?php
/* fpassthru() is apparently a memory hog. Use this instead. */
$bytesSent = 0;
while(!feof($fp)) {
$buf = fread($fp, 4096);
echo $buf;
$bytesSent += strlen($buf); /* track how many bytes were sent to the user */
}
?>
I've then got this code to update my database to say that the file was downloaded successfully.
<?php
if($bytesSent==filesize($file)) {
/* Do some cool stuff here! */
}
?>
I could not get the above examples to work. This is what I used instead:
header("Content-Disposition: attachment; filename=$file");
header("Content-Description: Image File");
$fd = fopen($file,'r');
fpassthru($fd);
In reply to spam at flatwan dot net
This might save someone some time. I created a program to list some rather large files and create links for the end user to click on in order to download them (using the php function fpassthru()).
The problem I was having was it would make it half way through the download (about 377 megs) and the script would terminate and the download would stop.
After doing some shotgun troubleshooting I discovered the php config option 'max_execution_time = 30'. Upon changing it to 'max_execution_time = -1' the files >370 megs can be downloaded without the script aborting.
The best way to do this would be:
<?php
@ignore_user_abort();
@set_time_limit(0);
?>
This only changes these settings for the script that calls them. (Thanks to whoever it was that wrote a form mail script using these two lines; I don't remember who.)
In reply to:
"3. Through no amount of futzing of headers was I able to get the filename to be set properly when the actual transfer was initiated via a refresh (META or via headers). I don't know if this is also an MSIE only issue or not. If 'download.php?dl=now' (for example) had a refresh back to 'download.php', such that it was intended to show some information (e.g. install instructions) as well as launch the download, then the MSIE insisted that the downloaded file was supposed to be named 'download.php?dl=now' or 'download.php', ignoring the filename in the headers."
I recently had the exact same issue. What I found is that it was due to my session initialization on the page. For some reason, doing a session_start() caused the script to try to download itself, not what I was indicating through the various header() calls.
The solution was to move the download portion above the session initialization. At first glance this may seem dangerous, but I only process it if there are POST vars and the script is reloading itself. This way I know the form was submitted by that page, and before they can submit it, they have to have a session! Adding an .htaccess rule to deny all access to the directory where the files are stored also helps, because then only my script can access the files.
Found a workaround to another headache that just cropped up tonight. Apparently Opera 6.1 on Linux (unsure of other versions/platforms) has problems downloading files using the above methods if you have enabled compression via zlib.output_compression in php.ini.
It seems that Opera sees that the actual transfer size is less than the size in the "Content-length" header for the download and decides that the transfer was incomplete or corrupted. It then either continuously retries the download or else leaves you with a corrupted file.
Solution: make sure your download script/section is off in its own directory, and add the following to your .htaccess file for that directory:
php_flag zlib.output_compression off
Interesting results using fpassthru() vs. fread() under UNIX.
Using fread($fp, $length) to read from a valid, open pointer to a file whose name contains a special character (single quote, comma, open paren, etc.) fails on the read (no debug statements are written after that point). However, using fpassthru() works like a champ.
Thanks for the helpful notes on IE session info, have seen this before but didn't know what was causing it.
(Do not delete this, it is NOT a bug report. It is a *follow up* to vague comments on the page about fpassthru() using excessive memory, and a usage tip that if you want to use pass-through processing, PHP 5 is strongly advised. It may LOOK like a bug report because unlike earlier tips, I've tried to clarify the situation. It is not a bug report because the problem is RESOLVED in PHP 5. Rather, anyone still using PHP 4 (for example, for compatibility reasons) should simply be aware that the problem is now resolved.)
In PHP 4 (4.4.4 tested as a CGI, Apache 2, Linux), use of both fpassthru() and fread() in a loop suffer the SAME memory "leakage". This is characterised by all data that is sent to the client also being kept inside PHP and not released. It would appear to be a failure to garbage collect the data.
In PHP 5 (5.2.1 tested as a CGI, Apache 2, Linux) this flaw is resolved in both cases. Neither fpassthru() nor fread() in a loop "leak" memory during execution.
The issue of which one to use appears not to be an issue of memory as both were equally flawed in PHP 4 and both are equally fixed in PHP 5.
Speed is left as an exercise for the reader to test in the latest PHP versions.