Question

Intergraph GeoMedia SQL Server Warehouse

  • 29 January 2013
  • 4 replies
  • 5 views

Dear FME People,

I've recently changed jobs to Intergraph, hoping to continue working with FME. For that, I will need the 'Intergraph GeoMedia SQL Server Warehouse' writer, which doesn't seem to be infallible.

However, here is my first issue. I read *.shp data, do some attribute renaming and write to SQL Server. According to the FME log file, everything goes well; nevertheless, no table is created. Since FME does connect to the database, this seems a bit odd to me.

What I then did was create the table with Management Studio, to check whether the DROP defined in the Warehouse writer gets executed or not. It doesn't seem to be the case. Yet, this is copied from the FME Workbench log file:

  Successfully connected to source dataset 'MapStaging_Gent'
  GeoMedia SQL Writer: Dropped table `RoadNetwork4BCN_GeoMedia'

Although the log file states that the table is dropped, it is still there after the Workbench run has finished. Nor is it recreated by FME: I defined different field names, so a recreated table would be easy to recognise.

  =-=-=-=-=-=-=Features Written Summary=-=-=-=-=-=-=-=-=-=-
  RoadNetwork4BCN_GeoMedia 14576
  ================
  Total Features Written 14576
  =-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
  Translation was SUCCESSFUL with 0 warning(s) (14576 feature(s) output)

Does anybody have a clue why FME gives feedback that everything went right, when in reality nothing happens? This was happening in FME 2012, and the issue remained after upgrading to FME 2013.

best regards,

Jelle

And if you open the data in FME Universal Viewer? And if, instead of writing the data to GeoMedia, you write to an FFS file and open it in FME Universal Viewer - is it displayed correctly?

Try to narrow it down by reading, for instance, only the first 10 features and see if these are displayed correctly in FFS / GeoMedia.

If there are attributes with long names, or list attributes, I have encountered that when writing to, for instance, MSSQL, Total Features Written is OK - yet not all features were actually written. When FME hands the features to some writers, it has no control over whether they actually got written, only that they were successfully delivered to that writer.
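One way to verify this on the database side is to compare the actual row count in the table with the count FME reports. A minimal sketch, assuming the table name from the log and that it lands in the dbo schema:

```sql
-- Compare the actual row count with the 14576 features FME claims it wrote.
-- Schema (dbo) is an assumption; the table name is taken from the log.
SELECT COUNT(*) AS rows_in_table
FROM dbo.RoadNetwork4BCN_GeoMedia;
```

If this count is lower than the Features Written summary, the writer silently dropped features along the way.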
Writing to FFS works fine. All the features are processed.

If I write to MS SQL [MSSQL_ADO], the table gets dropped properly and all the attributes are written. Unfortunately, this writer is not eligible, since the geometry cannot be stored.

If I understood the 'How to Improve Data Exchange in Intergraph using FME' webinar well, the [FM0_SQL] writer takes care of the metadata tables needed to read the geometry from an MS SQL database using a GeoMedia Warehouse. I checked that all the connections are actually closed, so that can't be the issue.

I do not have the feeling that this problem has to do with the features I try to insert; in fact, I think FME doesn't even get to that point. Why would the log file confirm the connection to the DB and state that the table is dropped, when that is not the case? Any ideas?

Could it have anything to do with the fact that I am writing to a SQL Server 2005 database?

I've checked the permissions of the user. These seem all right as well. Besides, the standard SQL writer does drop the table and writes all the records to the DB.
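For reference, such permissions can be checked directly from Management Studio. This is a sketch: the table name comes from the log, the dbo schema is an assumption, and CREATE TABLE rights are checked at database level:

```sql
-- Does the current login have INSERT rights on the target table?
SELECT HAS_PERMS_BY_NAME('dbo.RoadNetwork4BCN_GeoMedia', 'OBJECT', 'INSERT') AS can_insert;

-- Can it create tables in the current database?
SELECT HAS_PERMS_BY_NAME(DB_NAME(), 'DATABASE', 'CREATE TABLE') AS can_create_table;
```

A result of 1 means the permission is granted; 0 means it is denied.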
Such a relief - I found the cause and solved the issue. I hope sharing it with you will help somebody in the future.

FME was writing all the time, just not to the right dataset. The default database of the user in MS SQL was set to 'master'. Changing this default database to the database FME should write to solved the issue.

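For anyone hitting the same thing: the check and the fix can both be done in Management Studio. The login name below is a hypothetical placeholder; the database name is the one from my log:

```sql
-- Hypothetical names: fme_login is the SQL Server login FME connects with,
-- MapStaging_Gent is the database the workspace is supposed to write to.

-- 1. See which database the login currently defaults to.
--    If this says 'master', the GeoMedia tables may have ended up there.
SELECT name, default_database_name
FROM sys.server_principals
WHERE name = 'fme_login';

-- 2. Point the login's default database at the intended warehouse database.
ALTER LOGIN [fme_login] WITH DEFAULT_DATABASE = [MapStaging_Gent];
```
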
In my opinion, there is some inconsistency between the FME writers for SQL and for the GeoMedia SQL Warehouse. I connect with exactly the same user name and password to exactly the same dataset. The standard SQL writer writes to the correct dataset, but the GeoMedia SQL Warehouse writer uses the default database of the user instead of the one defined in the workbench. Moreover, the metadata is written to the right dataset (so not the user's default one in MS SQL, but the one defined in FME). So there is an extra inconsistency within the Warehouse writer itself.

Unfortunately, I can't attach screenshots to this message; they would make things clearer. Just ask if you want them.

Could I ask for the opinion of Safe on this matter? Maybe there is something I misunderstood in the parameters of the writers; otherwise, I think I have encountered a bug in the software.