Hi @badams,
Have you tried setting Scan for Schema Type to Yes on the CSV reader? By default it is set to No. You can find the option when adding the CSV reader, under Advanced > Schema Generation.
With this setting set to Yes, FME will attempt to determine the correct type for each attribute.
https://www.screencast.com/t/l5w3qRBkHKQ

Alternatively, you can also set the data types in the Attribute Definition setting when Manual mode is selected.
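To illustrate the idea behind Scan for Schema Type, here is a minimal Python sketch (not FME's actual implementation) of how a reader might scan the first few rows of a CSV and pick the narrowest type that fits every value seen in each column:

```python
import csv
from io import StringIO

def infer_type(values):
    """Pick the narrowest type name that fits every scanned value."""
    for cast, name in ((int, "integer"), (float, "real")):
        try:
            for v in values:
                cast(v)
            return name
        except ValueError:
            continue
    return "string"

def scan_schema(csv_text, max_rows=10):
    """Read up to max_rows rows and infer a type per column."""
    rows = list(csv.DictReader(StringIO(csv_text)))[:max_rows]
    return {col: infer_type([row[col] for row in rows]) for col in rows[0]}

sample = "id,height,name\n1,1.5,a\n2,2.25,b\n"
print(scan_schema(sample))  # {'id': 'integer', 'height': 'real', 'name': 'string'}
```

Note that the result depends entirely on the rows that happen to be scanned, which is why the number of rows scanned matters (more on that below in the thread).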
Thanks @AndreaAtSafe, the numerical fields came out numerical!
Something else is going on now - when I look at the attribute tables in Arc, all the numerical fields in some of the feature classes are blank (<null>). In the Excel sheet, each cell has a value.
atttab.jpg
Any idea what might be going on here?
Glad it worked @badams!
With the <null> values: the data type that FME automatically selected for that field is likely not large enough to hold some of the values. If a value falls outside the range of the field's data type, FME sets it to null. This can happen when the out-of-range value occurs in a row that was not included in the schema scan. You would have seen a warning in the log file similar to:
CSV reader: The string '283' read in file 'data file' for field 'num_measures' is not a valid value for type 'UInt8'. The value will be set to <null>. ...
In order to read the value correctly, recreate the source feature type and set a larger value for the 'Maximum Rows to Scan' option to scan additional rows for fields.
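As a quick sanity check of what is happening here, this small Python sketch (an illustration, not FME code) mimics reading a value into a UInt8 field: anything outside 0-255 is replaced with null, which matches the '283' / 'UInt8' warning above:

```python
def coerce_uint8(raw):
    """Mimic reading a CSV value into a UInt8 field.

    UInt8 can hold 0..255; any value outside that range is
    replaced with None (FME's <null>), as the log warning describes.
    """
    value = int(raw)
    if 0 <= value <= 255:
        return value
    print(f"Warning: '{raw}' is not a valid value for type 'UInt8'. "
          "The value will be set to <null>.")
    return None

print(coerce_uint8("200"))  # 200 - fits in UInt8
print(coerce_uint8("283"))  # None - out of range, warning logged
```

Increasing Maximum Rows to Scan makes it more likely the scan sees a value like 283 up front, so FME would choose a wider type (e.g. a larger integer) instead of UInt8 in the first place.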
This should help solve it!