Using CSV-Format Log Output

Source: Internet
Author: User
Tags: postgresql, syslog, table definition

Including csvlog in the log_destination list provides a convenient way to import log files into a database table. This option emits log lines in comma-separated values (CSV) format, with these columns: timestamp with milliseconds, user name, database name, process ID, client host:port number, session ID, per-session line number, command tag, session start time, virtual transaction ID, regular transaction ID, error severity, SQLSTATE code, error message, error message detail, hint, internal query that led to the error (if any), character count of the error position therein, error context, user query that led to the error (if any and enabled by log_min_error_statement), character count of the error position therein, location of the error in the PostgreSQL source code (if log_error_verbosity is set to verbose), and application name. Here is a sample table definition for storing CSV-format log output:

CREATE TABLE postgres_log
(
  log_time timestamp(3) with time zone,
  user_name text,
  database_name text,
  process_id integer,
  connection_from text,
  session_id text,
  session_line_num bigint,
  command_tag text,
  session_start_time timestamp with time zone,
  virtual_transaction_id text,
  transaction_id bigint,
  error_severity text,
  sql_state_code text,
  message text,
  detail text,
  hint text,
  internal_query text,
  internal_query_pos integer,
  context text,
  query text,
  query_pos integer,
  location text,
  application_name text,
  PRIMARY KEY (session_id, session_line_num)
);

To import a log file into this table, use the COPY FROM command:

COPY postgres_log FROM '/full/path/to/logfile.csv' WITH csv;
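
Once the rows are loaded, the log can be analyzed with ordinary SQL. The query below is only an illustrative sketch: the column names come from the table definition above, while the one-day window and the severity filter are arbitrary choices for the example.

SELECT error_severity, count(*) AS occurrences
FROM postgres_log
WHERE log_time > now() - interval '1 day'            -- example window, adjust as needed
  AND error_severity IN ('ERROR', 'FATAL', 'PANIC')  -- example filter on severe entries
GROUP BY error_severity
ORDER BY occurrences DESC;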

There are a few things you need to do to simplify importing CSV log files (a sample configuration excerpt follows the list):

  1. Set log_filename and log_rotation_age to provide a consistent, predictable naming scheme for your log files. This lets you predict what the file name will be and know when an individual log file is complete and therefore ready to be imported.

  2. Set log_rotation_size to 0 to disable size-based log rotation, as it makes the log file name difficult to predict.

  3. Set log_truncate_on_rotation to on so that old log data isn't mixed with the new in the same file.

  4. The table definition above includes a primary key specification. This is useful to protect against accidentally importing the same information twice. The COPY command commits all of the data it imports at one time, so any error will cause the entire import to fail. If you import a partial log file and later import the file again when it is complete, the primary key violation will cause the import to fail. Wait until the log is complete and closed before importing. This procedure will also protect against accidentally importing a partial line that hasn't been completely written, which would also cause COPY to fail.
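
As a rough sketch, the settings from items 1 through 3 might look like this in postgresql.conf; the per-weekday file name pattern and the one-day rotation age are example values, not requirements.

# postgresql.conf (example values only)
log_destination = 'csvlog'          # include csvlog to emit CSV-format log lines
logging_collector = on              # required for csvlog output
log_filename = 'postgresql-%a.log'  # example: predictable per-weekday file names
log_rotation_age = 1d               # example: rotate once per day
log_rotation_size = 0               # disable size-based rotation (item 2)
log_truncate_on_rotation = on       # overwrite old data when a file name is reused (item 3)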

logging_collector (boolean)

This parameter enables the logging collector, which is a background process that captures log messages sent to stderr and redirects them into log files. This approach is often more useful than logging to syslog, since some types of messages might not appear in syslog output. (One common example is dynamic-linker failure messages; another is error messages produced by scripts such as archive_command.) This parameter can only be set at server start.
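
To check how these parameters are currently set on a running server, you can query pg_settings (or use SHOW); this is just an illustrative check, not part of the documentation text above.

SELECT name, setting
FROM pg_settings
WHERE name IN ('logging_collector', 'log_destination', 'log_directory', 'log_filename');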

Note: It is possible to log to stderr without using the logging collector; the log messages will just go to wherever the server's stderr is directed. However, this method is only suitable for low log volumes, since it provides no convenient way to rotate log files. Also, on some platforms not using the logging collector can result in lost or garbled log output, because multiple processes writing concurrently to the same log file can overwrite each other's output.

Note: The logging collector is designed to never lose messages. This means that, at extremely high load, server processes could be blocked while trying to send additional log messages when the collector has fallen behind. In contrast, syslog prefers to drop messages if it cannot write them, which means it may fail to log some messages in such cases but it will not block the rest of the system.

Reference:

http://www.postgresql.org/docs/current/interactive/runtime-config-logging.html
