PostgreSQL 9.5.9 related log file description



One. PostgreSQL has three types of logs: pg_log (the database run-time log), pg_xlog (the WAL / redo log), and pg_clog (the transaction commit log, which records transaction metadata).


pg_log is off by default and must be enabled through configuration. pg_xlog and pg_clog are always on and cannot be disabled.
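As a quick orientation, the following psql sketch (illustrative only; the output depends on your installation) shows where these areas live and that WAL and the transaction counter, whose commit status pg_clog tracks, are advancing:

-- data directory that contains the pg_xlog and pg_clog subdirectories
SHOW data_directory;

-- current WAL (pg_xlog) write location; this function name applies to the 9.x series
SELECT pg_current_xlog_location();

-- current transaction ID; its commit status is recorded in pg_clog
SELECT txid_current();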


1. Enable pg_log and configure the logging parameters

log_destination = 'csvlog'
logging_collector = on
log_directory = 'pg_log'
log_filename = 'postgresql-%Y-%m-%d_%H%M%S.log'
log_rotation_age = 1d
log_rotation_size = 100MB
log_min_messages = warning
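After editing postgresql.conf, a rough way to check these settings and see whether a reload is enough or a full restart is required (a sketch; pg_settings and pg_reload_conf are standard in 9.5):

-- context 'postmaster' (e.g. logging_collector) means a server restart is required;
-- context 'sighup' means a configuration reload is sufficient
SELECT name, setting, context
FROM pg_settings
WHERE name IN ('logging_collector', 'log_destination', 'log_directory',
               'log_filename', 'log_rotation_age', 'log_rotation_size',
               'log_min_messages');

-- apply reloadable changes without restarting the server
SELECT pg_reload_conf();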


Parameter explanation:

log_directory = '/data/pgsql086/log'

This parameter can only be set in postgresql.conf. It determines the directory where the database run-time log files are written. The default value is pg_log. It can be an absolute path, or a path relative to the data directory.


log_filename = 'postgresql-%Y-%m-%d_%H%M%S.log'

This parameter determines the name of the database run-time log file. The default value is postgresql-%Y-%m-%d_%H%M%S.log. The value may contain the escapes %Y, %m, %d, %H, %M, and %S, which represent the year, month, day, hour, minute, and second respectively. If the value contains no time escape, the system automatically appends the epoch of the file's creation time to the end of the log_filename value; for example, if log_filename is server_log, a log file started at Sun Aug 29 19:02:33 2004 MST will be named server_log.1093827753, where 1093827753 is the database's internal (epoch) representation of that timestamp. This parameter can only be set in postgresql.conf.
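To see the files actually produced under this naming scheme, you can list the log directory from within the database (a sketch assuming the relative default log_directory = 'pg_log'; pg_ls_dir is superuser-only and resolves relative paths against the data directory):

-- list the generated log files; adjust the argument if log_directory is set to an absolute path
SELECT pg_ls_dir('pg_log');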


log_rotation_age = 1d

The maximum lifetime of a single log file; the default is 1 day. If the log file does not reach log_rotation_size within that time, only one log file is generated per day.


log_rotation_size = 10MB

This parameter can only be set in postgresql.conf. It limits the size of a single log file: if log_rotation_age has not yet elapsed, a log file can grow to at most 10 MB before a new log file is started.


log_min_messages = warning

The default is warning. This parameter controls which message levels are written to the server log. Valid values are DEBUG5, DEBUG4, DEBUG3, DEBUG2, DEBUG1, INFO, NOTICE, WARNING, ERROR, LOG, FATAL, and PANIC. Each level includes all the levels that follow it; the later the level, the fewer messages are written to the log.
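As a small illustration of the severity hierarchy (a sketch using a PL/pgSQL DO block; with log_min_messages = warning only the last message should reach the server log):

DO $$
BEGIN
  RAISE DEBUG 'debug message';      -- below warning: not written to the server log
  RAISE NOTICE 'notice message';    -- sent to the client, but not logged at the warning level
  RAISE WARNING 'warning message';  -- at or above warning: written to the server log
END
$$;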


Restart the PostgreSQL service:

$ pg_ctl status
pg_ctl: no server running
$ pg_ctl start
server starting
$ LOG:  redirecting log output to logging collector process
HINT:  Future log output will appear in directory "/data/pgsql086/log".


$ ll /data/pgsql086/log/*
-rw-------. 1 postgres postgres 30477 Nov  2 12:01 /data/pgsql086/log/postgres.log
-rw-------. 1 postgres postgres   794 Nov  2 15:14 /data/pgsql086/log/postgresql-2017-11-02_151410.csv
-rw-------. 1 postgres postgres       Nov  2 15:14 /data/pgsql086/log/postgresql-2017-11-02_151410.log
$ tailf /data/pgsql086/log/postgresql-2017-11-02_151410.csv

2017-11-02 15:14:10.385 CST,,,21315,,59fac5c2.5343,1,,2017-11-02 15:14:10 CST,,0,LOG,00000,"ending log output to stderr",,"Future log output will go to log destination ""csvlog"".",,,,,,,""
2017-11-02 15:14:10.387 CST,,,21317,,59fac5c2.5345,1,,2017-11-02 15:14:10 CST,,0,LOG,00000,"database system was shut down at 2017-11-02 15:13:29 CST",,,,,,,,,""
2017-11-02 15:14:10.389 CST,,,21317,,59fac5c2.5345,2,,2017-11-02 15:14:10 CST,,0,LOG,00000,"MultiXact member wraparound protections are now enabled",,,,,,,,,""
2017-11-02 15:14:10.392 CST,,,21321,,59fac5c2.5349,1,,2017-11-02 15:14:10 CST,,0,LOG,00000,"autovacuum launcher started",,,,,,,,,""
2017-11-02 15:14:10.392 CST,,,21315,,59fac5c2.5343,2,,2017-11-02 15:14:10 CST,,0,LOG,00000,"database system is ready to accept connections",,,,,,,,,""
2017-11-02 15:14:43.211 CST,,,21338,"127.0.0.1:33506",59fac5e3.535a,1,"",2017-11-02 15:14:43 CST,,0,LOG,00000,"connection received: host=127.0.0.1 port=33506",,,,,,,,,""
2017-11-02 15:14:43.212 CST,"postgres","postgres",21338,"127.0.0.1:33506",59fac5e3.535a,2,"authentication",2017-11-02 15:14:43 CST,2/5,0,LOG,00000,"connection authorized: user=postgres database=postgres",,,,,,,,,""
2017-11-02 15:14:55.416 CST,,,21339,"127.0.0.1:33507",59fac5ef.535b,1,"",2017-11-02 15:14:55 CST,,0,LOG,00000,"connection received: host=127.0.0.1 port=33507",,,,,,,,,""
2017-11-02 15:14:55.417 CST,"postgres","testdb03",21339,"127.0.0.1:33507",59fac5ef.535b,2,"authentication",2017-11-02 15:14:55 CST,3/1,0,LOG,00000,"connection authorized: user=postgres database=testdb03",,,,,,,,,""
2017-11-02 15:15:26.873 CST,"postgres","testdb03",21339,"127.0.0.1:33507",59fac5ef.535b,3,"SELECT",2017-11-02 15:14:55 CST,3/4,0,ERROR,42P01,"relation ""weather1"" does not exist",,,,,,"select * from weather1;",,,"psql.bin"

The lines above are an excerpt from a CSV log file. Because the raw CSV output is hard to read, you can import the CSV log into a database table using the following method. Details are as follows.


Import a CSV log into a database table

1--Adjust parameters
log_destination = 'csvlog'
logging_collector = on

After these two parameters are modified, the PostgreSQL server needs to be restarted.

2--Create a log record table
CREATE TABLE postgres_log
(
  log_time timestamp(3) with time zone,
  user_name text,
  database_name text,
  process_id integer,
  connection_from text,
  session_id text,
  session_line_num bigint,
  command_tag text,
  session_start_time timestamp with time zone,
  virtual_transaction_id text,
  transaction_id bigint,
  error_severity text,
  sql_state_code text,
  message text,
  detail text,
  hint text,
  internal_query text,
  internal_query_pos integer,
  context text,
  query text,
  query_pos integer,
  location text,
  application_name text,
  PRIMARY KEY (session_id, session_line_num)
);

NOTICE:  CREATE TABLE / PRIMARY KEY will create implicit index "postgres_log_pkey" for table "postgres_log"
CREATE TABLE

Note: the postgres_log table is created to hold the CSV log data.
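Optionally (not part of the original example), an index on log_time speeds up the time-range queries used in the parsing step below:

-- schema-qualify as needed, e.g. skytf.postgres_log
CREATE INDEX ON postgres_log (log_time);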

3--Import the operating-system CSV log files into the postgres_log table
skytf=# copy skytf.postgres_log from '/var/applog/pg_log/postgresql-2011-03-14_000000.csv' with csv;
COPY 26031

skytf=# copy skytf.postgres_log from '/var/applog/pg_log/postgresql-2011-03-15_000000.csv' with csv;
COPY 1297

Note: using COPY to import from or export to a file on the server requires connecting to the target database as the superuser (postgres).
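If a superuser connection is not available, a client-side alternative is psql's \copy meta-command, which reads the file from the machine psql runs on rather than from the server (a sketch; the same file layout is assumed):

skytf=> \copy skytf.postgres_log from '/var/applog/pg_log/postgresql-2011-03-14_000000.csv' with csv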

4--Common log-parsing SQL
skytf=# select min(log_time), max(log_time) from skytf.postgres_log;
            min             |            max
----------------------------+----------------------------
 2011-03-14 14:04:07.275+08 | 2011-03-16 05:04:34.427+08
(1 row)

skytf=> select log_time, database_name, user_name, application_name, message from postgres_log where message like '%duration%';
          log_time          | database_name | user_name | application_name |                         message
----------------------------+---------------+-----------+------------------+---------------------------------------------------------
 2011-03-15 00:23:38.957+08 | db_lbs        | lbs       |                  | duration: 1297.440 ms  execute <unnamed>: SELECT cityname, province, the_geom as the_geom from china_city
.......

For ease of display, only one record is shown.
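A couple of further parsing queries in the same spirit (sketches against the postgres_log table defined above):

-- distribution of log entries by severity
select error_severity, count(*)
from skytf.postgres_log
group by error_severity
order by count(*) desc;

-- most recent errors together with the statement that triggered them
select log_time, user_name, database_name, sql_state_code, message, query
from skytf.postgres_log
where error_severity = 'ERROR'
order by log_time desc
limit 20;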


Tips:

A new pair of run-time log files (.csv and .log) is generated each time the database is restarted. With log_destination = 'csvlog', the actual log entries, including errors, are written to the .csv file (e.g. postgresql-2017-11-02_151410.csv), while the companion .log file stays almost empty.

$ pg_ctl stop
waiting for server to shut down.... done
server stopped
$ pg_ctl start
server starting
$ LOG:  redirecting log output to logging collector process
HINT:  Future log output will appear in directory "/data/pgsql086/log".

$ ll /data/pgsql086/log/
total 56
-rw-------. 1 postgres postgres 30477 Nov  2 12:01 postgres.log
-rw-------. 1 postgres postgres  4971 Nov  2 15:37 postgresql-2017-11-02_151410.csv
-rw-------. 1 postgres postgres       Nov  2 15:14 postgresql-2017-11-02_151410.log
-rw-------. 1 postgres postgres   794 Nov  2 15:37 postgresql-2017-11-02_153751.csv
-rw-------. 1 postgres postgres       Nov  2 15:37 postgresql-2017-11-02_153751.log



2. Log slow SQL execution

log_min_duration_statement = 60
log_checkpoints = on
log_connections = on
log_disconnections = on
log_duration = on
log_line_prefix = '%m '

# Monitor long lock waits in the database
log_lock_waits = on

# Log DDL operations
log_statement = 'ddl'
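With these settings in place (log_min_duration_statement is interpreted in milliseconds when no unit is given), a quick way to confirm that slow statements and DDL are being captured is to run something deliberately slow plus a throwaway DDL statement, then check the current log file; a minimal sketch:

-- runs for about 200 ms, longer than the 60 ms threshold, so it should be logged
select pg_sleep(0.2);

-- logged because log_statement = 'ddl'
create table log_statement_test (id int);
drop table log_statement_test;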


Reference blogs:

http://www.cnblogs.com/alianbog/p/5596921.html

http://blog.csdn.net/shanzhizi/article/details/47616645
