PostgreSQL Bugs

Collected from the PG bugs email list.

Bug ID:      15730
PG Version:  9.6.6
OS:          RedHat Enterprise Linux 7.3
Opened:      2019-04-03 03:58:15+00
Reported by: ShuLin Du
Status:      New

Body of first available message related to this bug follows.

The following bug has been logged on the website:

Bug reference:      15730
Logged by:          ShuLin Du
Email address:      (redacted)
PostgreSQL version: 9.6.6
Operating system:   RedHat Enterprise linux 7.3
Description:        

I use pg_bulkload (3.1.14) to load file data into a table.
The input data must be processed by a filter function (defined in the CTL
file) before it is inserted into the table.
However, I found that when a filter function is used to process the input
data before it is written to the table, the maximum length limit declared
on the table's columns is not enforced.

The filter function is as below (input data: 2 items, output data: 3 items):

create function FVUA001(varchar, varchar) returns record as $BODY$
  select row($1, $2, null)
$BODY$
language sql volatile
cost 100;

Even when a field in the input data is longer than the length declared in
the table definition, the data is still inserted successfully into the
table. So I wonder whether this is a PostgreSQL bug.
Please help me solve this problem. Thanks a lot.
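One possible workaround is to apply the length constraints inside the filter function itself. In PostgreSQL, an explicit cast to varchar(n) truncates an over-length value to n characters, so the filter's output can no longer exceed the table's declared lengths. The column lengths below (varchar(10), varchar(20)) are hypothetical placeholders; substitute the actual lengths from the target table's definition. This is a sketch, not a confirmed fix for the pg_bulkload behavior:

```sql
-- Sketch: enforce column lengths inside the filter function by casting
-- each output field to the target column's declared varchar(n) type.
-- Note: an explicit cast truncates silently; it does NOT raise the
-- "value too long for type character varying(n)" error that a plain
-- INSERT would raise on over-length input.
create function FVUA001(varchar, varchar) returns record as $BODY$
  select row($1::varchar(10), $2::varchar(20), null)
$BODY$
language sql volatile
cost 100;
```

If an error on over-length input is preferred to silent truncation, the filter function could instead check length($1) explicitly and raise an exception.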

Messages

Date                    Author                  Subject
2019-04-03 03:58:15+00  PG Bug reporting form   BUG #15730: Using filter function to interpolate data, the length limit of table field is invalid