[ale] OT: Perl and VERY large files
Jeff Rose
jojerose at mindspring.com
Thu Aug 7 09:22:19 EDT 2003
>Could you show us the line(s) of code around this ?
open IN,"<$ARGV[0]" or die $!;
open OUT,">$ARGV[1]" or die $!;
while (read IN,$data,1750) {
syswrite OUT, $data, 5, 181;
syswrite OUT, $data, 2, 179;
syswrite OUT, $data, 1, 34;
syswrite OUT, $data, 10, 50;
syswrite OUT, $data, 4, 79;
syswrite OUT, $data, 7, 72;
syswrite OUT, $data, 10, 50;
syswrite OUT, "\n", 1;
} # while
close IN;
close OUT;
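One thing to watch regardless of the 2GB issue: read can return a short
final record, and syswrite dies with "Offset outside string" if the
offset runs past the end of $data. A minimal guard, just a sketch and
not part of the original script:

while (my $got = read IN, $data, 1750) {
    if ($got < 1750) {
        warn "short trailing record ($got bytes), skipping\n";
        last;
    }
    # ... syswrite calls as above ...
}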
>The error looks more like it's trying to READ the file in, not just open
>it ...
Maybe so, but the error points to the line where the IN handle is
opened on $ARGV[0].
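If it helps narrow it down, the die can report the numeric errno as
well. On a perl (or libc) built without working large file support,
opening a >2GB file typically fails at open time with EOVERFLOW
("Value too large for defined data type"). A sketch, assuming Linux
errno names:

use Errno qw(EOVERFLOW);
unless (open IN, "<$ARGV[0]") {
    die "open $ARGV[0]: $! (2GB limit at open time)\n" if $! == EOVERFLOW;
    die "open $ARGV[0]: $!\n";
}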
The script works for files <2GB but fails for files >2GB. Large file
support has to be compiled into perl, and the version we have was
built with large file support.
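perl records its build options in the Config module, so that is easy
to double-check against the actual interpreter running the script
rather than the build notes. A quick check, my sketch:

use Config;
for my $k (qw(uselargefiles lseeksize fpossize)) {
    printf "%-14s %s\n", $k, defined $Config{$k} ? $Config{$k} : "(undef)";
}
# uselargefiles should be "define", and lseeksize/fpossize should be 8
# for a perl that can handle offsets past 2GB.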
>Is it using some object/datatype? AFAIK "open(FILE, "<filename");" just
>allocates a file handle to reference the file and isn't reading the file
>(and thus shouldn't overly CARE how big the file is, but I'll stipulate
>that there might be something that does quick checks)
I don't think $data is the problem, since the script works fine for files <2GB.
>OTOH, if something is then doing "@lines = <FILE>;", that's gonna blow up
>fast, since 17G of file probably has a few lines ;o (and I think slurpy
>file reading like that reads the whole file into memory and then splits,
>which could be causing more problems)
I don't think it reads the whole thing into memory; besides, we don't
have 2GB of memory in the machine, so that would fail long before 17GB.
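For comparison, the slurpy style being warned about versus the
streaming style the script uses (the handle and filename here are
placeholders):

# slurpy: pulls every line into memory at once (hopeless on a 17GB file)
open FILE, "<some.log" or die $!;   # hypothetical file
my @lines = <FILE>;
close FILE;

# streaming: holds one line (or one fixed-size record) at a time,
# so file size barely matters for memory
open FILE, "<some.log" or die $!;
while (my $line = <FILE>) {
    print $line;   # or whatever per-line work is needed
}
close FILE;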
Thanks for the help
Jeff Rose
_______________________________________________
Ale mailing list
Ale at ale.org
http://www.ale.org/mailman/listinfo/ale