From 660902067752ad7752f3b87326181baa938f0a8c Mon Sep 17 00:00:00 2001
From: josephmarlin
Date: Mon, 25 Jun 2012 12:01:16 -0400
Subject: [PATCH] Mirrored English changes in the doc

---
 doc/pgBadger.pod | 45 +++++++++++++++++++++++----------------------
 1 file changed, 23 insertions(+), 22 deletions(-)

diff --git a/doc/pgBadger.pod b/doc/pgBadger.pod
index bda4b28..19793e0 100644
--- a/doc/pgBadger.pod
+++ b/doc/pgBadger.pod
@@ -10,7 +10,7 @@ pgbadger [options] logfile [...]
 
 Arguments:
 
-    logfile can be a single log file, a list of files or a shell command
+    logfile can be a single log file, a list of files, or a shell command
     returning a list of file.
 
 Options:
@@ -18,32 +18,32 @@ Options:
     -l | --logfile filename: path to the PostgreSQL log file to parse. It
                              can be a plain text log or a gzip compressed
                              file with the .gz extension. Note that this option is
-                             DEPRECATED, set logfile as a command line argument
+                             DEPRECATED: set logfile as a command line argument
                              instead.
     -f | --format logtype : possible values: syslog,stderr,csv. Default: stderr
     -o | --outfile filename: define the filename for the output. Default depends
-                             of the output format: out.html or out.txt. To dump
+                             on the output format: out.html or out.txt. To dump
                              output to stdout use - as filename.
     -i | --ident name : programname used as syslog ident. Default: postgres
     -h | --help : show this message and exit.
-    -d | --dbname database : only report what concern the given database
-    -u | --dbuser username : only report what concern the given user
-    -t | --top number : number of query to store/display. Default: 20
-    -s | --sample number : number of query sample to store/display. Default: 3
+    -d | --dbname database : only report what concerns the given database
+    -u | --dbuser username : only report what concerns the given user
+    -t | --top number : number of queries to store/display. Default: 20
+    -s | --sample number : number of query samples to store/display. Default: 3
     -x | --extension : output format. Values: text or html. Default: html
-    -m | --maxlength size : maximum length of a query, it will be cutted above
+    -m | --maxlength size : maximum length of a query: it will be cut above
                             the given size. Default: no truncate
     -g | --graph : generate graphs using the Flotr2 javascript library
     -b | --begin datetime : start date/time for the data to be parsed in log.
     -e | --end datetime : end date/time for the data to be parsed in log.
     -q | --quiet : don't print anything to stdout.
-    -p | --progress : show a progress bar, quiet mode is automaticaly
+    -p | --progress : show a progress bar, quiet mode is automatically
                       enabled with this option.
     --pie-limit num : pie data lower than num% will show a sum instead.
     -w | -watch-mode : only report errors just like logwatch could do.
     --exclude-query regex : any query matching the given regex will be excluded
-                            from the report. For example: "^(VACUUM|COMMIT)"
-                            you can use this option multiple time.
+                            from the report. For example: "^(VACUUM|COMMIT)".
+                            You can use this option multiple times.
     --disable-error : do not generate error report.
     --disable-hourly : do not generate hourly reports.
     --disable-type : do not generate query type report.
@@ -60,24 +60,25 @@ Examples:
 
     pgbadger -p -g /var/log/postgresql.log
    pgbadger -p -g /var/log/postgres.log.2.gz /var/log/postgres.log.1.gz /var/log/postgres.log
     pgbadger -p -g /var/log/postgresql/postgresql-2012-05-*
-    pgbadger -p -g --exclude-query="^(COPY|COMMIT)" /var/log/postgresql.log
+    pgbadger -p -g --exclude-query="^(COPY|COMMIT)" /var/log/postgresql.log
+    pgbadger -p -g -b "2012-06-25 10:56:11" -e "2012-06-25 10:59:11" /var/log/postgresql.log
 
 Reporting errors every week by cron job:
 
     30 23 * * 1 /usr/bin/pgbadger -q -w /var/log/postgresql.log -o /var/reports/pg_errors.html
 
-this suppose that your log file and HTML report are also rotated every weeks.
+This assumes that your log file and HTML report are also rotated every week.
 =head1 DESCRIPTION
 
-pgBadger is a PostgreSQL log analyzer build for speed with fully detailed reports from your PostgreSQL log file. It's a single and small Perl script that aims to replace and outperform the old php script pgFouine.
+pgBadger is a PostgreSQL log analyzer built for speed with fully detailed reports from your PostgreSQL log file. It's a small, single Perl script that aims to replace and outperform the old PHP script pgFouine.
 
-By the way, we would like to thank Guillaume Smet for all the work he has done on this really nice tool. We've been using it a long time, it was a really great tool!
+By the way, we would like to thank Guillaume Smet for all the work he has done on this really nice tool. We've been using it for a long time; it is a really great tool!
 
-pgBadger is written in pure Perl language. It uses a javascript library to draw graphs so that you don't need additional Perl modules or any other package to install. Furthermore, this library gives us more features such as zooming.
+pgBadger is written in pure Perl. It uses a javascript library to draw graphs, so you don't need to install additional Perl modules or any other packages. Furthermore, this library gives us additional features, such as zooming.
 
-pgBadger is able to autodetect your log file format (syslog, stderr or csvlog). It is designed to parse huge log files as well as gzip compressed file. See a complete list of features below.
+pgBadger is able to autodetect your log file format (syslog, stderr or csvlog). It is designed to parse huge log files as well as gzip compressed files. See a complete list of features below.
 
 =head1 FEATURE
 
@@ -103,7 +104,7 @@ All charts are zoomable and can be saved as PNG images.
 
 =head1 REQUIREMENT
 
-PgBadger comes as a single Perl script, you do not need anything else than a modern Perl distribution. Charts are rendered using a Javascript library so you don't need anything, your browser will do all the work.
+PgBadger comes as a single Perl script: you do not need anything other than a modern Perl distribution. Charts are rendered using a Javascript library, so you don't need anything else; your browser will do all the work.
 
 If you planned to parse PostgreSQL CSV log files you might need some Perl Modules:
 
@@ -113,7 +114,7 @@ This module is optional, if you don't have PostgreSQL log in the CSV format you
 
 =head1 POSTGRESQL CONFIGURATION
 
-You must enable some configuration directives into your postgresql.conf before starting.
+You must enable some configuration directives in your postgresql.conf before starting.
 
 You must first enable SQL query logging to have something to parse:
 
@@ -125,7 +126,7 @@ With 'stderr' log format, log_line_prefix must be at least:
 
     log_line_prefix = '%t [%p]: [%l-1] '
 
-Log line prefix could add user and database name as follow:
+The log line prefix can also add the user and database name as follows:
 
     log_line_prefix = '%t [%p]: [%l-1] user=%u,db=%d '
 
@@ -158,13 +159,13 @@ Download the tarball from github and unpack the archive as follow:
 
     tar xzf pgbadger-1.x.tar.gz
     cd pgbadger-1.x/
     perl Makefile.PL
-    make && make install
+    make && sudo make install
 
 This will copy the Perl script pgbadger in /usr/local/bin/pgbadger directory by default and the man page into /usr/local/share/man/man1/pgbadger.1. Those are the default installation directory for 'site' install. If you want to install all under /usr/ location, use INSTALLDIRS='perl' as argument of Makefile.PL. The script will be installed into /usr/bin/pgbadger and the manpage into /usr/share/man/man1/pgbadger.1.
 
-For example to install everything just like Debian does, proceed as follow:
+For example, to install everything just like Debian does, proceed as follows:
 
     perl Makefile.PL INSTALLDIRS=vendor
-- 
2.40.0
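
A note for reviewers, outside the patch body (text after the `-- ` trailer is ignored when the patch is applied): the `--exclude-query` pattern documented above is an ordinary regular expression anchored at the start of the query. A minimal sketch of what the doc's example pattern matches, using `grep -E` as a stand-in for pgbadger's internal Perl regex matching (an assumption; the semantics coincide for this simple pattern):

```shell
# Hypothetical sample queries; the pattern comes from the doc's example.
pattern='^(VACUUM|COMMIT)'
printf 'VACUUM ANALYZE mytable;\nSELECT 1;\nCOMMIT;\n' | grep -E "$pattern"
# Prints the VACUUM and COMMIT lines; "SELECT 1;" does not match
# because the anchor ^ requires the keyword at the start of the query.
```

Passing the option several times, as the doc allows, simply accumulates independent exclusion patterns.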