From: Darold Gilles
Date: Wed, 13 Feb 2013 09:33:52 +0000 (+0100)
Subject: Force multiprocess per file when files are compressed. Thanks to Julien Rouhaud for...
X-Git-Tag: v3.2~40
X-Git-Url: https://granicus.if.org/sourcecode?a=commitdiff_plain;h=bd9ffab2697dcc8e2267549305c78e27315d04fc;p=pgbadger

Force multiprocess per file when files are compressed. Thanks to Julien Rouhaud for the report.
---

diff --git a/pgbadger b/pgbadger
index 9e3ca5c..55921a1 100755
--- a/pgbadger
+++ b/pgbadger
@@ -6017,6 +6017,10 @@ sub get_log_file
 			$totalsize = `$cmd_file_size`;
 			chomp($totalsize);
 		}
+		if ($queue_size) {
+			$job_per_file = $queue_size;
+			$queue_size = 0;
+		}
 	}
 
 	# In list context returns the filehandle and the size of the file
@@ -6049,6 +6053,10 @@ sub split_logfile
 			$cmd_file_size =~ s/\%f/$logf/g;
 			$totalsize = `$cmd_file_size`;
 			chomp($totalsize);
+			if ($queue_size) {
+				$job_per_file = $queue_size;
+				$queue_size = 0;
+			}
 		} elsif ($logf =~ /\.bz2/i) {
 			$totalsize = 0;
 		}
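
The idea behind the change: a compressed log file cannot be split into byte ranges for parallel parsing within a single file, so the requested parallelism ($queue_size) is converted into per-file jobs ($job_per_file) instead. Below is a minimal standalone Perl sketch of that switch; everything around the if block (the sample file list, the extension test, the printed result) is illustrative scaffolding, not pgbadger code.

    #!/usr/bin/env perl
    use strict;
    use warnings;

    # Simplified stand-ins for the variables touched by the patch:
    # $queue_size   - workers requested for splitting one file
    # $job_per_file - number of files parsed in parallel
    my $queue_size   = 4;
    my $job_per_file = 1;

    my @log_files = ('postgresql-1.log.gz', 'postgresql-2.log');

    for my $logf (@log_files) {
        # A compressed file cannot be chunked, so spend the
        # parallelism on whole files rather than on chunks.
        if ($logf =~ /\.(gz|bz2|zip)$/i) {
            if ($queue_size) {
                $job_per_file = $queue_size;
                $queue_size   = 0;
            }
            last;
        }
    }

    print "queue_size=$queue_size, job_per_file=$job_per_file\n";

With the sample list above this prints "queue_size=0, job_per_file=4": the single-file split workers are disabled and the whole budget is moved to per-file jobs, which is exactly what the two patched hunks do when they detect a compressed file size via $cmd_file_size.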