[ale] Dealing with really big log files....
Kenneth Ratliff
lists at noctum.net
Sun Mar 22 09:41:49 EDT 2009
I need to extract data from a MySQL log in a 12 hour window. The
particular problem here is that the log file is 114 gigs and goes
back to November 2008 (yes, someone screwed the pooch with the log
rotation on this one; I already fixed *that* particular problem, but
I still have the resulting big log file!)
Now, my normal methods of parsing through a log file take a really,
really long time due to its size.
I know roughly which line number the data I want begins on. Is there an
easy way to just chop off all the lines before that and leave
everything else intact? Obviously, due to the size of the file, I
can't load it in vi to do my usual voodoo for this crap.
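One way to chop a file from a known line number without opening an editor is tail's `-n +NUM` form, which streams from line NUM to end of file. A minimal sketch on a small stand-in file (the filename and line number here are illustrative; on the real log you'd use mysql.log and the actual starting line):

```shell
# Small sample file standing in for the 114 GB log.
printf 'line1\nline2\nline3\nline4\nline5\n' > sample.log

# tail -n +N prints from line N through end of file, streaming the
# data rather than loading it into memory the way an editor would.
tail -n +3 sample.log > trimmed.log

cat trimmed.log   # line3, line4, line5
```

This still has to read past the unwanted prefix, but it never holds more than a buffer's worth of the file in memory.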
I'm thinking of running sed -e '1,<really big number>d' mysql.log
against it, but does anyone know of a better method to just chunk out
a big section of a text file? (And by better, I mean faster; it takes
upwards of 3 hours to process this damn thing.)
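Two possibly faster variants, sketched on a small stand-in file (the filename, line numbers, and offset below are illustrative, not from the post). First, sed's `q` command makes it quit as soon as the window's last line is printed, so it never scans the rest of the file; second, if you know a byte offset (e.g. from a prior grep -b), tail's `-c +NUM` form can seek rather than read:

```shell
# Sample file standing in for mysql.log; window is lines 2-3 here.
printf 'a\nb\nc\nd\ne\n' > sample.log

# -n suppresses default output; '2,3p' prints only the window and
# '3q' exits at line 3, so sed never reads beyond the window's end.
sed -n '2,3p;3q' sample.log > window.log

# If a byte offset is known instead, tail can seek straight to it
# without reading the prefix at all (offset is hypothetical):
# tail -c +1048576 mysql.log > window.log
```

The `q` trick helps most when the window sits near the front of the file; for a window near the end, the byte-offset seek is the only way to avoid reading everything before it.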