readfile

(PHP 4, PHP 5, PHP 7, PHP 8)

readfile - Outputs a file

Description

readfile(string $filename, bool $use_include_path = false, ?resource $context = null): int|false

Reads a file and writes it to the output buffer.

Parameters

filename

The filename being read.

use_include_path

You can use the optional second parameter and set it to true if you want to search for the file in the include_path, too.

context

A context stream resource.

Return Values

Returns the number of bytes read from the file on success, or false on failure.
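
For instance, a caller can distinguish success from failure by comparing the return value against false (a minimal sketch; the filename is only a placeholder):

<?php
// Hypothetical file; readfile() echoes its contents and returns the byte count
$bytes = readfile('report.pdf');

if ($bytes === false) {
    // The read failed; an E_WARNING has also been raised
    error_log('Could not read report.pdf');
} else {
    error_log("Sent $bytes bytes");
}
?>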

Errors/Exceptions

Upon failure, an E_WARNING is emitted.

Examples

Example #1 Forcing a download using readfile()

<?php
$file = 'monkey.gif';

if (file_exists($file)) {
    header('Content-Description: File Transfer');
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . basename($file) . '"');
    header('Expires: 0');
    header('Cache-Control: must-revalidate');
    header('Pragma: public');
    header('Content-Length: ' . filesize($file));
    readfile($file);
    exit;
}
?>

The above example will output something similar to:

Open/Save dialog

Notes

Note:

readfile() will not present any memory issues on its own, even when sending large files. If you encounter an out-of-memory error, ensure that output buffering is off with ob_get_level().
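
A minimal sketch of that check, closing any userland output buffers before streaming (the path is only a placeholder):

<?php
$file = '/path/to/large-file.bin'; // hypothetical path

// End all active output buffers so the file is streamed instead of collected in memory
while (ob_get_level() > 0) {
    ob_end_clean();
}

readfile($file);
?>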

Tip

A URL can be used as a filename with this function if the fopen wrappers have been enabled. See fopen() for more details on how to specify the filename. A list of supported URL protocols, the capabilities of the various wrappers, notes on their usage, and information on any predefined variables they may provide can be found under Supported Protocols and Wrappers.
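
As a sketch (assuming allow_url_fopen is enabled; the URL is only an example), a stream context can be passed as the third argument:

<?php
// Example only: read a remote resource with a custom timeout
$context = stream_context_create([
    'http' => [
        'method'  => 'GET',
        'timeout' => 10,
    ],
]);

readfile('http://www.example.com/robots.txt', false, $context);
?>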

See Also

User Contributed Notes (24 notes)

riksoft at gmail dot com (10 years ago)
Just a note for those who face problems on names containing spaces (e.g. "test test.pdf").

In the examples (99% of the time) you can find
header('Content-Disposition: attachment; filename='.basename($file));

but the correct way to set the filename is quoting it (double quote):
header('Content-Disposition: attachment; filename="'.basename($file).'"' );

Some browsers may work without the quotes, but Firefox will not, and as Mozilla explains, quoting the filename in the Content-Disposition header is what the RFC requires:
http://kb.mozillazine.org/Filenames_with_spaces_are_truncated_upon_download

yura_imbp at mail dot ru (16 years ago)
If you need to limit the download rate, use this code:

<?php
$local_file    = 'file.zip';
$download_file = 'name.zip';

// set the download rate limit (=> 20.5 KB/s)
$download_rate = 20.5;

if (file_exists($local_file) && is_file($local_file)) {
    header('Cache-control: private');
    header('Content-Type: application/octet-stream');
    header('Content-Length: ' . filesize($local_file));
    header('Content-Disposition: filename=' . $download_file);

    flush();
    $file = fopen($local_file, "r");
    while (!feof($file)) {
        // send the current file part to the browser
        print fread($file, round($download_rate * 1024));
        // flush the content to the browser
        flush();
        // sleep one second
        sleep(1);
    }
    fclose($file);
} else {
    die('Error: The file ' . $local_file . ' does not exist!');
}
?>

marro at email dot cz (16 years ago)
My script works correctly on IE6 and Firefox 2 with any type of file (I hope :))

function DownloadFile($file) { // $file = include path
    if (file_exists($file)) {
        header('Content-Description: File Transfer');
        header('Content-Type: application/octet-stream');
        header('Content-Disposition: attachment; filename=' . basename($file));
        header('Content-Transfer-Encoding: binary');
        header('Expires: 0');
        header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
        header('Pragma: public');
        header('Content-Length: ' . filesize($file));
        ob_clean();
        flush();
        readfile($file);
        exit;
    }
}

Run on Apache 2 (WIN32) PHP5

levhita at gmail dot com (16 years ago)
A note on the smartReadFile function from gaosipov:

Change the indexes on the preg_match matches to:

$begin = intval($matches[1]);
if( !empty($matches[2]) ) {
$end = intval($matches[2]);
}

Otherwise $begin would be set to the entire matched section, and $end to what should be $begin.

See preg_match for more details on this.

Hayley Watson (17 years ago)
To avoid the risk of users choosing for themselves which files to download by messing with the request and doing things like inserting "../" into the "filename", simply remember that URLs are not file paths; there is no reason why the mapping between them has to be so literal that "download.php?file=thingy.mpg" results in the download of the file "thingy.mpg".

It's your script and you have full control over how it maps file requests to file names, and which requests retrieve which files.

But even then, as ever, never trust ANYTHING in the request. Basic first-day-at-school security principle, that.
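
One possible sketch of such a mapping, where the request only carries an opaque key and the script alone decides which file that key refers to (all paths are hypothetical):

<?php
// The request never contains a path; it only selects an entry from this map
$downloads = [
    'video1' => '/srv/media/thingy.mpg',
    'manual' => '/srv/docs/manual.pdf',
];

$key = $_GET['file'] ?? '';

if (!isset($downloads[$key])) {
    header('HTTP/1.0 404 Not Found');
    exit;
}

$file = $downloads[$key];
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
header('Content-Length: ' . filesize($file));
readfile($file);
exit;
?>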

flobee at gmail dot com (19 years ago)
Regarding PHP 5: I found out that there is already a discussion @php-dev about readfile() and fpassthru() where only exactly 2 MB will be delivered.

So you may use this on PHP 5 to get larger files:
<?php
function readfile_chunked($filename, $retbytes = true) {
    $chunksize = 1 * (1024 * 1024); // how many bytes per chunk
    $buffer = '';
    $cnt = 0;
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        $buffer = fread($handle, $chunksize);
        echo $buffer;
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }
    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // return num. bytes delivered like readfile() does.
    }
    return $status;
}
?>

TimB (16 years ago)
To anyone who's had problems with readfile() reading large files into memory: the problem is not readfile() itself, it's that you have output buffering on. Just turn off output buffering immediately before the call to readfile(), using something like ob_end_flush().

Paulinator (6 years ago)
Always using MIME-Type 'application/octet-stream' is not optimal. Most if not all browsers will simply download files with that type.

If you use proper MIME types (and inline Content-Disposition), browsers will have better default actions for some of them. E.g. in the case of images, browsers will display them, which is probably what you'd want.

To deliver the file with the proper MIME type, the easiest way is to use:

header('Content-Type: ' . mime_content_type($file));
header('Content-Disposition: inline; filename="'.basename($file).'"');
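
A minimal sketch putting these two headers together ($file is a placeholder; mime_content_type() requires the fileinfo extension):

<?php
$file = 'picture.jpg'; // hypothetical path

if (file_exists($file)) {
    header('Content-Type: ' . mime_content_type($file));
    header('Content-Disposition: inline; filename="' . basename($file) . '"');
    header('Content-Length: ' . filesize($file));
    readfile($file);
    exit;
}
?>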

gaosipov at gmail dot com (16 years ago)
Send a file with HTTP Range support (partial download):

<?php
function smartReadFile($location, $filename, $mimeType = 'application/octet-stream')
{
    if (!file_exists($location)) {
        header("HTTP/1.0 404 Not Found");
        return;
    }

    $size = filesize($location);
    $time = date('r', filemtime($location));

    $fm = @fopen($location, 'rb');
    if (!$fm) {
        header("HTTP/1.0 500 Internal Server Error");
        return;
    }

    $begin = 0;
    $end   = $size;

    if (isset($_SERVER['HTTP_RANGE'])) {
        if (preg_match('/bytes=\h*(\d+)-(\d*)[\D.*]?/i', $_SERVER['HTTP_RANGE'], $matches)) {
            $begin = intval($matches[0]);
            if (!empty($matches[1])) {
                $end = intval($matches[1]);
            }
        }
    }

    if ($begin > 0 || $end < $size) {
        header('HTTP/1.0 206 Partial Content');
    } else {
        header('HTTP/1.0 200 OK');
    }

    header("Content-Type: $mimeType");
    header('Cache-Control: public, must-revalidate, max-age=0');
    header('Pragma: no-cache');
    header('Accept-Ranges: bytes');
    header('Content-Length:' . ($end - $begin));
    header("Content-Range: bytes $begin-$end/$size");
    header("Content-Disposition: inline; filename=$filename");
    header("Content-Transfer-Encoding: binary\n");
    header("Last-Modified: $time");
    header('Connection: close');

    $cur = $begin;
    fseek($fm, $begin, 0);

    while (!feof($fm) && $cur < $end && (connection_status() == 0)) {
        print fread($fm, min(1024 * 16, $end - $cur));
        $cur += 1024 * 16;
    }
}
?>

Usage:

<?php
smartReadFile("/tmp/filename", "myfile.mp3", "audio/mpeg");
?>

Reading with fread() can be slow for big files, but it is the only way to read the file within strict bounds. You could modify this and use fpassthru() instead of the fread() loop, but fpassthru() sends all data from $begin to the end of the file, which is not useful if the request is for bytes 100 to 200 of a 100 MB file.

jorensmerenjanu at gmail dot com (3 years ago)
For anyone having the problem of your HTML page being output into the downloaded file: call ob_clean() and flush() before readfile().

daren -remove-me- schwenke (13 years ago)
If you are lucky enough to not be on shared hosting and have apache, look at installing mod_xsendfile.
This was the only way I found to both protect and transfer very large files with PHP (gigabytes).
It's also proved to be much faster for basically any file.
Available directives have changed since the other note on this and XSendFileAllowAbove was replaced with XSendFilePath to allow more control over access to files outside of webroot.

Download the source.

Install with: apxs -cia mod_xsendfile.c

Add the appropriate configuration directives to your .htaccess or httpd.conf files:
# Turn it on
XSendFile on
# Whitelist a target directory.
XSendFilePath /tmp/blah

Then to use it in your script:
<?php
$file          = '/tmp/blah/foo.iso';
$download_name = basename($file);

if (file_exists($file)) {
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename=' . $download_name);
    header('X-Sendfile: ' . $file);
    exit;
}
?>

chrisputnam at gmail dot com (19 years ago)
In response to flowbee@gmail.com --

When using the readfile_chunked function noted here with files larger than 10MB or so, I am still having memory errors. It's because the writers have left out the all-important flush() after each read. So this is the proper chunked readfile (which isn't really readfile at all, and should probably be cross-posted to passthru(), fopen(), and popen() just so browsers can find this information):

<?php
function readfile_chunked($filename, $retbytes = true) {
    $chunksize = 1 * (1024 * 1024); // how many bytes per chunk
    $buffer = '';
    $cnt = 0;
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        $buffer = fread($handle, $chunksize);
        echo $buffer;
        ob_flush();
        flush();
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }
    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // return num. bytes delivered like readfile() does.
    }
    return $status;
}
?>

All I've added is the flush() (and ob_flush()) after the echo line. Be sure to include this!

simbiat at outlook dot com (3 years ago)
flobee.at.gmail.dot.com shared a "readfile_chunked" function. It does work, but you may encounter memory exhaustion using "fread". Meanwhile, "stream_copy_to_stream" seems to use about the same amount of memory as "readfile". At least, that was the case when I was testing the "download" function for my https://github.com/Simbiat/HTTP20 library on a 1.5G file with a 256M memory limit: with "fread" I got a peak memory usage of ~240M, while with "stream_copy_to_stream" it was ~150M.
It does not mean that you can fully escape memory exhaustion, though: if you are reading too much at a time, you can still encounter it. That is why in my library I use a helper function ("speedLimit") to calculate whether the selected speed limit will fit the available memory (while allowing some headroom).
You can read the comments in the code itself for more details and raise issues for the library if you think something is incorrect there (especially since it is a WIP at the moment of writing this), but so far I am able to get consistent behavior with it.
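
A rough sketch of the stream_copy_to_stream() approach described above (the path is only an example, not the library's actual code):

<?php
$file = '/path/to/large-file.iso'; // hypothetical path

$in  = fopen($file, 'rb');
$out = fopen('php://output', 'wb');

if ($in !== false && $out !== false) {
    header('Content-Type: application/octet-stream');
    header('Content-Length: ' . filesize($file));

    // Copy stream-to-stream instead of buffering whole chunks with fread()
    stream_copy_to_stream($in, $out);
}

if ($in !== false) {
    fclose($in);
}
if ($out !== false) {
    fclose($out);
}
?>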

mAu (18 years ago)
Instead of using

<?php
header('Content-Type: application/force-download');
?>

use

<?php
header('Content-Type: application/octet-stream');
?>

Some browsers have trouble with force-download.

antispam [at] rdx page [dot] com (19 years ago)
Just a note: If you're using bw_mod (current version 0.6) to limit bandwidth in Apache 2, it *will not* limit bandwidth during readfile events.

Brian (10 years ago)
If you are looking for an algorithm that will allow you to download (force download) a big file, maybe this one will help you.

$filename = "file.csv";
$filepath = "/path/to/file/" . $filename;

// Close sessions to prevent user from waiting until
// download will finish (uncomment if needed)
//session_write_close();

set_time_limit(0);
ignore_user_abort(false);
ini_set('output_buffering', 0);
ini_set('zlib.output_compression', 0);

$chunk = 10 * 1024 * 1024; // bytes per chunk (10 MB)

$fh = fopen($filepath, "rb");

if ($fh === false) {
    exit("Unable to open file");
}

header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $filename . '"');
header('Expires: 0');
header('Cache-Control: must-revalidate');
header('Pragma: public');
header('Content-Length: ' . filesize($filepath));

// Repeat reading until EOF
while (!feof($fh)) {
    echo fread($fh, $chunk);

    ob_flush(); // flush output
    flush();
}

exit;

Anonymous (5 years ago)
To avoid errors, just be careful about whether a slash "/" is allowed at the beginning of the $file_name parameter.

In my case, trying to send PDF files through PHP after access logging, the leading "/" had to be removed in PHP 7.1.

planetmaster at planetgac dot com (19 years ago)
Using pieces of the forced download script, adding in MySQL database functions, and hiding the file location for security was what we needed for downloading wmv files of our members' creations without prompting Media Player, as well as securing the file itself and using only database queries. Something to the effect below; it is very customizable for private access, remote files, and keeping order of your online media.

<?php
# Protect Script against SQL-Injections
$fileid = intval($_GET['id']);

# setup SQL statement
$sql = " SELECT id, fileurl, filename, filesize FROM ibf_movies WHERE id=' $fileid' ";

# execute SQL statement
$res = mysql_query($sql);

# display results
while ($row = mysql_fetch_array($res)) {
    $fileurl  = $row['fileurl'];
    $filename = $row['filename'];
    $filesize = $row['filesize'];

    $file_extension = strtolower(substr(strrchr($filename, "."), 1));

    switch ($file_extension) {
        case "wmv": $ctype = "video/x-ms-wmv"; break;
        default:    $ctype = "application/force-download";
    }

    // required for IE, otherwise Content-disposition is ignored
    if (ini_get('zlib.output_compression'))
        ini_set('zlib.output_compression', 'Off');

    header("Pragma: public");
    header("Expires: 0");
    header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
    header("Cache-Control: private", false);
    header("Content-Type: $ctype");
    header("Content-Disposition: attachment; filename=\"" . basename($filename) . "\";");
    header("Content-Transfer-Encoding: binary");
    header("Content-Length: " . @filesize($filename));
    set_time_limit(0);
    @readfile("$fileurl") or die("File not found.");
}

$donwloaded = "downloads + 1";

if ($_GET["hit"]) {
    mysql_query("UPDATE ibf_movies SET downloads = $donwloaded WHERE id=' $fileid'");
}

?>

While at it, I added a hit (download) counter into download.php. Of course you need to set up the DB, table, and columns. Email me for the full setup. A session marker is also a security/logging option.
Used in the context of linking:
http://www.yourdomain.com/download.php?id=xx&hit=1

[Edited by sp@php.net: Added Protection against SQL-Injection]

peavey at pixelpickers dot com (19 years ago)
A mime-type-independent forced download can also be conducted by using:

<?php
(...)
header("Expires: Mon, 26 Jul 1997 05:00:00 GMT"); // some day in the past
header("Last-Modified: " . gmdate("D, d M Y H:i:s") . " GMT");
header("Content-type: application/x-download");
header("Content-Disposition: attachment; filename={$new_name}");
header("Content-Transfer-Encoding: binary");
?>

Cheers,

Peavey

Thomas Jespersen (20 years ago)
Remember, if you make a "force download" script like the ones mentioned below, to SANITIZE YOUR INPUT!

I have seen a lot of download scripts that do not test the input, so you are able to download anything you want from the server.

Test especially for strings like "..", which make directory traversal possible. If possible, only permit the characters a-z, A-Z, and 0-9, and only allow downloads from a single "download folder".
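
A minimal sketch of that kind of validation (directory and character whitelist are only examples):

<?php
$downloadDir = '/var/www/downloads/'; // the single folder downloads may come from (example)

// Strip any path component, then allow only a conservative character set
// (the pattern also rejects names starting with a dot, including "..")
$name = basename($_GET['file'] ?? '');

if (!preg_match('/^[A-Za-z0-9_-][A-Za-z0-9._-]*$/', $name)) {
    header('HTTP/1.0 400 Bad Request');
    exit;
}

$path = $downloadDir . $name;

if (!is_file($path)) {
    header('HTTP/1.0 404 Not Found');
    exit;
}

readfile($path);
exit;
?>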

TheDayOfCondor (19 years ago)
Beware: the chunked readfile suggested by Rob Funk can easily exceed your maximum script execution time (30 seconds by default).

I suggest using the set_time_limit function inside the while loop to reset the PHP watchdog.
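
For example, the chunked loop can reset the limit on each iteration (a sketch; the chunk size is arbitrary and $filename is as in the other examples on this page):

<?php
$handle = fopen($filename, 'rb');

while (!feof($handle)) {
    set_time_limit(30);         // reset the watchdog for every chunk
    echo fread($handle, 8192);
    flush();
}

fclose($handle);
?>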

Zambz (14 years ago)
If you are using the procedures outlined in this article to force sending a file to a user, you may find that the "Content-Length" header is not being sent on some servers.

The reason this occurs is because some servers are setup by default to enable gzip compression, which sends an additional header for such operations. This additional header is "Transfer-Encoding: chunked" which essentially overrides the "Content-Length" header and forces a chunked download. Of course, this is not required if you are using the intelligent versions of readfile in this article.

A missing Content-Length header implies the following:

1) Your browser will not show a progress bar on downloads because it doesn't know their length
2) If you output anything (e.g. white space) after the readfile function (by mistake), the browser will add that to the end of the download, resulting in corrupt data.

The easiest way to disable this behaviour is with the following .htaccess directive.

SetEnv no-gzip dont-vary
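
If you cannot change the server configuration, a possible (not universally reliable) alternative is to switch compression off from PHP before sending any output; apache_setenv() only exists when running under Apache with mod_php:

<?php
// Try to disable transparent compression so Content-Length is honoured
if (ini_get('zlib.output_compression')) {
    ini_set('zlib.output_compression', 'Off');
}
if (function_exists('apache_setenv')) {
    apache_setenv('no-gzip', '1');
}
?>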

anon (8 years ago)
In the C source, this function simply opens the path in read+binary mode, without a lock, and uses fpassthru().

If you need a locked read, use fopen(), flock(), and then fpassthru() directly.
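
A small sketch of that locked variant (error handling kept minimal; $filename is as in the other examples):

<?php
$fp = fopen($filename, 'rb');

if ($fp !== false && flock($fp, LOCK_SH)) { // shared lock for reading
    fpassthru($fp);                         // stream the file to the output
    flock($fp, LOCK_UN);
}

if ($fp !== false) {
    fclose($fp);
}
?>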

TheDayOfCondor (19 years ago)
I think that readfile() suffers from the maximum script execution time limit. The readfile() call itself always completes, even if it exceeds the default 30-second limit, but the script is aborted afterwards.
Be warned that you can get very odd behaviour not only with large files, but also with small files if the user has a slow connection.

The best thing to do is to use

<?php
set_time_limit(0);
?>

just before the readfile(), to completely disable the watchdog, if you intend to use the readfile() call to transfer a file to the user.