Stop PostgreSQL from logging statements on error

Question:

My PostgreSQL 9.3 logs are sprinkled with entries like these (redacted to protect the guilty):

STATEMENT:  INSERT INTO table (key, value) VALUES ($1, $2)
ERROR:  duplicate key value violates unique constraint "table_pkey"
DETAIL:  Key (key)=(xyz) already exists.
STATEMENT:  INSERT INTO table (key, value) VALUES ('xyz', '{...json...}')

I know the source of the problem (a race condition where two app servers do the same work) and that code gracefully handles the insertion failure, but for the life of me I can’t seem to get PostgreSQL to stop logging the statement which generated the error to begin with.

I’ve set log_statement for this database to ‘none’ (which persists across connections):

show log_statement;
 log_statement 
---------------
none
(1 row)
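For reference, a per-database setting that persists across connections like this is typically put in place with something along these lines (the database name here is just a placeholder):

ALTER DATABASE mydb SET log_statement = 'none';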

But to no avail: these entries still keep hitting the logs. I wouldn’t mind so much, except that the “values” in the logged statement are JSON blobs which aren’t tiny. How can I keep PostgreSQL from logging the statement whenever an insert violates the unique constraint?

Answer:

The amount of detail logged is controlled by log_error_verbosity. Setting it to TERSE excludes the DETAIL, HINT, QUERY, and CONTEXT parts of error messages:

ALTER DATABASE db_name SET log_error_verbosity to 'TERSE';
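Note that a value set with ALTER DATABASE ... SET only takes effect for sessions opened after the change, so confirm it from a fresh connection the same way log_statement was checked above:

show log_error_verbosity;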

To apply this instance-wide instead, set log_error_verbosity to terse and log_min_error_statement to log in postgresql.conf. With log_min_error_statement at log, statements that only cause ERROR-level messages (such as the duplicate-key violation above) are no longer written to the log. Then reload the configuration with

pg_ctl -D <data_dir> reload
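As a rough sketch, the relevant postgresql.conf lines would look like this:

log_error_verbosity = terse          # drop DETAIL, HINT, QUERY and CONTEXT
log_min_error_statement = log        # log statements only for LOG/FATAL/PANIC, not plain ERRORs

If you prefer not to edit postgresql.conf, log_min_error_statement can also be scoped to the affected database, just like log_error_verbosity above (both are superuser-only settings):

ALTER DATABASE db_name SET log_min_error_statement to 'LOG';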
