
Hi, I have a dynamic setup where a PostgreSQL FeatureReader reads the schema, and the results are written to another database using that schema information to create the new tables. However, a table with columns of unqualified type varchar is regenerated with varchar(80), and if records longer than 80 characters are loaded, the workspace crashes with:


Error was 'ERROR: value too long for type character varying(80)'

I attached the workspace. Is this a bug in FME, or am I doing something unbelievably silly? 😕 I'm using FME(R) 2020.2.3.0 (20210129 - Build 20820 - WIN64) on PostgreSQL 12.9 (Ubuntu 12.9-1.pgdg18.04+1).


@Helmoet de Zwijger​ Thank you for including the nice reproduction workspace. I was able to reproduce your issue. This does look like a deficiency in the PostgreSQL reader: FME doesn't fully support the unbounded 'varchar', and the PostgreSQL writer feature type doesn't support it either.

But... it looks like the dynamic writer can be made to support this. On the schema feature, the column's data type is stored as:

attribute{0}.native_data_type = varchar(80)

If you replace 'varchar(80)' with 'varchar', FME will create the unbounded varchar (character varying) column as expected.


You can do this using ListSearcher and an AttributeCreator - see the attached workspace. But if you're likely to have more than one 'varchar' column in your data then a short PythonCaller script is probably more robust.
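A minimal sketch of what such a PythonCaller could do (this is not the attached workspace's code; the helper name and the `FeatureProcessor` skeleton in the comment are illustrative): strip the length qualifier from every `attribute{N}.native_data_type` on the schema feature so the dynamic writer creates unbounded varchar columns.

```python
import re

def unbound_varchar(native_data_type):
    """Map 'varchar(80)' -> 'varchar'; leave any other type string unchanged."""
    return re.sub(r'^varchar\(\d+\)$', 'varchar', native_data_type)

# Inside FME this logic would live in a PythonCaller, roughly like:
#
# import fmeobjects
#
# class FeatureProcessor(object):
#     def input(self, feature):
#         i = 0
#         while True:
#             attr = 'attribute{%d}.native_data_type' % i
#             value = feature.getAttribute(attr)
#             if value is None:          # no more schema attributes
#                 break
#             feature.setAttribute(attr, unbound_varchar(value))
#             i += 1
#         self.pyoutput(feature)
```

Unlike the ListSearcher/AttributeCreator approach, this handles any number of varchar columns without hard-coding list indices.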


Thanks! This is indeed the workaround we applied.


Hi,
I have the same issue with FME 2023.2.2.0 (23781). I can't find the attached workspace, @markatsafe.

Best regards

Felipe Verdú

