[ale] OT: Perl and VERY large files
David Corbin
dcorbin at machturtle.com
Thu Aug 7 12:40:10 EDT 2003
Jeff Rose wrote:
> Fletch wrote:
>
>>
>> D'oh. Never mind me, that's the system error text for EOVERFLOW.
>> Shows how long it's been since I've touched Solaris. :/
>>
>>
>> Only other thing I can think of is to run truss and watch to make sure
>> it's calling the open64() syscall rather than the plain open() call.
>>
> I'm wondering if this is a problem with Solaris. I found a site that
> said open() is really a C library call, and that perl itself might
> need to be recompiled with large-file support. Another suggested that
> upgrading the OS would resolve the problem. I have the power to do
> neither. Our admin might be persuaded to do one or both if I can nail
> down the actual problem. We regularly work with files as large as
> ~30GB. We historically have used C, but recently found perl is sooooo
> much quicker to program in, and almost as fast. But the file
> limitation would seriously limit our ability to use perl.
Is it practical to restructure the code so that it reads from stdin
instead of opening the file itself? If the problem is in "open", that
might get around it, since all you're doing is linear reads. If not,
I'd ask on perlmonks.org.
David
>
> Thanks for the help
>
> Jeff Rose
>
_______________________________________________
Ale mailing list
Ale at ale.org
http://www.ale.org/mailman/listinfo/ale