How do I prepare CSV files for use in ArcGIS Desktop?
I ask because I have been having trouble with CSV files: ArcGIS assigns the wrong field types to my columns and also misinterprets special characters such as á or ê.
I have read in the Esri forums that there is a so-called schema.ini file that defines the field types, e.g. "Col22=V002 Text"; see here: http://forums.esri.com/Thread.asp?c=93&f=1149&t=64464
That is kind of funny, because I have often seen these .ini files on my disk but never actually wondered what they were for. It also seems odd that Excel stores such metadata in a separate file, since other programs like R don't do so.
I have already tried to manipulate this .ini file, with little success, since I couldn't figure out how to apply, for example, a "string" (text) type. There is some information on the Microsoft site, see here: http://msdn.microsoft.com/en-us/library/windows/desktop/ms709353%28v=vs.85%29.aspx but I couldn't find a solution.
I also don't really like the idea of working with this .ini file, because it is quite a bit of work to define and type out all the field names when I have, say, 50 columns, and the .ini file might get lost, etc.
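For illustration, the column definitions follow a regular pattern, so a short Python 3 script could probably write the whole schema.ini instead of typing 50 lines by hand. In the sketch below, the file name data.csv, the blanket Text type, and the CharacterSet line are my assumptions and would need checking against the Microsoft documentation linked above.

    import csv

    CSV_FILE = "data.csv"  # hypothetical file name; schema.ini must sit in the same folder

    # Read the header row to get the column names.
    with open(CSV_FILE, newline="", encoding="utf-8") as f:
        header = next(csv.reader(f))

    lines = [
        "[{}]".format(CSV_FILE),   # section name must match the CSV file name
        "Format=CSVDelimited",
        "ColNameHeader=True",
        "CharacterSet=UNICODE",    # assumption: meant to keep characters like á or ê intact
    ]
    # Declare every column as Text; swap in Long/Double/DateTime where appropriate.
    for i, name in enumerate(header, start=1):
        lines.append("Col{}={} Text Width 255".format(i, name))

    with open("schema.ini", "w") as f:
        f.write("\n".join(lines) + "\n")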
Answer
My quick fix is to create the first row with all dummy values, and then delete this row/record after bringing the data into ArcGIS.
This first row contains representative, or often deliberately different, values (e.g. alphabetic characters even if the column contains numbers that I want stored as a text data type), padded to the largest number of characters needed for that column (because text fields tend to get truncated).
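A minimal sketch of that trick in Python 3, assuming a hypothetical input.csv; the dummy record is just a long alphabetic string repeated for every column, which should push each field to a wide text type:

    import csv

    # Hypothetical file names; delete the dummy record again after import.
    with open("input.csv", newline="", encoding="utf-8") as src, \
         open("input_with_dummy.csv", "w", newline="", encoding="utf-8") as dst:
        reader = csv.reader(src)
        writer = csv.writer(dst)
        header = next(reader)
        writer.writerow(header)
        writer.writerow(["X" * 100] * len(header))  # dummy first record: long text in every column
        writer.writerows(reader)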
Date/time values are subject to import errors (especially between Canadian and U.S. default date formats), so my workaround is to split the date/time parts into separate columns (e.g. year, month, day, hour, minute) and then concatenate these in a new field calculation after successfully bringing the data into ArcGIS.
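A sketch of the splitting step, again in Python 3; the column name timestamp and the format string are assumptions and would need to match the actual data:

    import csv
    from datetime import datetime

    with open("input.csv", newline="", encoding="utf-8") as src, \
         open("input_split.csv", "w", newline="", encoding="utf-8") as dst:
        reader = csv.DictReader(src)
        fields = reader.fieldnames + ["year", "month", "day", "hour", "minute"]
        writer = csv.DictWriter(dst, fieldnames=fields)
        writer.writeheader()
        for row in reader:
            # Assumed column name and format, e.g. "2013-05-17 14:30".
            dt = datetime.strptime(row["timestamp"], "%Y-%m-%d %H:%M")
            row.update(year=dt.year, month=dt.month, day=dt.day,
                       hour=dt.hour, minute=dt.minute)
            writer.writerow(row)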
The geographic coordinates tip from Jamie is also necessary: specify negative values for western-hemisphere longitudes and southern-hemisphere latitudes. And saving the file with Unicode encoding takes care of special characters.
Lastly, if a field's data type is still misinterpreted after bringing it into ArcGIS, I add a new field of the correct data type and calculate/convert the values from the original field, but usually the dummy row/record takes care of most, if not all, of the problems.
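For that last step, here is a sketch using the standard Add Field and Calculate Field geoprocessing tools; the table name and field names are placeholders, and the example assumes a numeric column that should have been text:

    import arcpy

    table = "mytable"  # placeholder for the imported table or layer

    # Add a text field and copy the misread numeric values into it as strings.
    arcpy.AddField_management(table, "zipcode_txt", "TEXT", field_length=10)
    arcpy.CalculateField_management(table, "zipcode_txt",
                                    "str(!zipcode!)", "PYTHON_9.3")
    # The original "zipcode" field could then be dropped with Delete Field if desired.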