
Hello everybody,

I hope this is the right place for an FME Realize question, as this is about the mobile app (or is there a Realize subforum I missed?). If not, please don't hesitate to move this thread.

 

I am just starting out with FME Realize and noticed some weird behaviour. I was wondering whether it is only due to GPS position fluctuations or caused by something else.

I placed spheres at these two points:

When I place one sphere via rotating/moving and lock the model, the other one is offset quite a bit.
I doubt it is because of my data, as it was measured by a trusted surveyor.

 

Correctly placed sphere and line.

When I now walk over to the point where the other sphere is located it looks like this:

Green circle showing where it should be, red where it is

Weirdly, when I now go back to the first sphere, it is also offset:
 

 

I tried it with several coordinate systems and with anchor/georef.anchor on and off, but it is never correct. At the moment I am using scale-free UTM ETRS89 (EPSG:25832), though that shouldn't matter as my area is only around 50 m across.

Also: even though the model is locked, it does weird jumps from time to time.


Has anyone else noticed something like this before, or does anyone have suggestions for what I could test?
It feels like the offset starts once I move more than about 3 metres. Though it is weird that the spheres are offset in just one direction and not completely off, as I would expect from GPS error.

Hi ​@max_h,

 

A couple of things come to mind. Are your spheres placed at 0 elevation? And if so, is your surface flat (that is, the surface you point your device at when loading the model does not change elevation)? Even small differences in elevation may lead to a visible offset in AR object locations when viewed from some distance or at an angle (rather than directly over the objects). If elevation is the cause, you can address it by draping the dataset relative to the point where you load the model. I do it the following way: I drape the location point that FME Realize sends to FME Flow, and then pass its elevation on to all the other objects. I then convert the absolute elevation values into values relative to the loading point (Z of an object minus Z of the submitted location point). This works well even on pretty steep terrain.
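The "Z of an object minus Z of the loading point" step above can be sketched in plain Python (this is just an illustration of the arithmetic, not FME transformer code; the function name and inputs are made up):

```python
# Sketch of the relative-elevation trick described above.
# Assumption: we have the draped Z of the point where the model is
# loaded (sent from FME Realize to FME Flow) and the draped Zs of the
# other features on the same terrain. Names are illustrative only.

def to_relative_elevations(loading_point_z, feature_zs):
    """Convert absolute draped elevations to elevations relative
    to the model loading point (object Z minus loading-point Z)."""
    return [z - loading_point_z for z in feature_zs]

# Loading point draped at 123.4 m; a feature at the same elevation
# ends up at relative Z = 0, i.e. on the surface in AR.
relative = to_relative_elevations(123.4, [123.4, 124.1, 122.9])
```

In a real workspace this subtraction would typically happen per feature (for example in an AttributeManager or PythonCaller) after draping everything onto the same surface model.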

 

About model jumps: we actually don't use GPS once the model is placed; we rely on IMU sensors and surface tracking. So even if your GPS reports the location with a large error, once you have adjusted and locked the model, GPS won't influence your experience. The model may shift slightly as you walk and point the device in different directions, but in my experience the shift is not significant. Some minor jumps are possible, but that is more a device limitation: iPhones and iPads are no match (at least for now) for headsets, which usually have multiple cameras pointing in all directions. Do you have a LiDAR sensor in your device? With one, I would expect better results (I don't have a device without LiDAR, so it is hard for me to test).

 

In general, we should always move the device smoothly, avoid fast device/camera movements, and ideally keep the surface in our field of view. It is possible to point the device up, but only for brief periods; longer views upward can lead to losing the surface tracking.

 

I can have a look at your workspace and your data on my side; feel free to send a small test template directly to me at dmitri@safe.com. And if you have more questions about building AR apps, feel free to ask.

 

Dmitri

 


Hi ​@dmitribagh,

thank you for this detailed answer!
My surface is flat and I set the heights to 0, though I did extrude the lines a few centimetres.
So I don't think this is the issue; I looked from above and from the side to double-check, as I wanted to make sure it's not simply some perspective error.

That is good to know! We have quite a lot of line-of-sight blockage around here, but that makes sense;
I was just puzzled why it jumped even though I had locked it.
I did walk slowly (the GIF is sped up so I could upload it) and sometimes tried facing only downwards,
but to no avail. As it turns out, my iPad doesn't have a LiDAR sensor.

I will have an iPad Pro to test in a few weeks, so I will check again then and compare the results!


My workspace is aligned to my area, in that I placed the lines/spheres directly on measured fixed points,
which are around 50 metres away from each other.
So I just have: VertexCreator -> Bufferer -> CoordinateSystemSetter -> FME-AR-Writer
And an export of my geodatabase, which I transformed to GeoJSON (and one to GML) -> Reprojector -> 3DForcer -> Bufferer -> Extruder -> FME-AR-Writer
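For anyone reading along, the second pipeline does something like the following, sketched here in plain Python on GeoJSON-style coordinates (this is only a conceptual illustration of the 3DForcer/Extruder steps, not how FME represents geometry internally; names are made up):

```python
# Illustrative sketch only: force 2D coordinates to 3D with Z = 0
# (as 3DForcer does) and record an extrusion height on the feature
# (as Extruder does, here just stored as an attribute).

def force_3d(coords, z=0.0):
    """Append a Z value to each (x, y) coordinate pair."""
    return [(x, y, z) for x, y in coords]

def extrude(feature, height_m):
    """Attach an extrusion height (e.g. a few centimetres) to a copy
    of the feature dict, leaving the original untouched."""
    feature = dict(feature)
    feature["extrusion_height"] = height_m
    return feature

line = {"type": "LineString",
        "coordinates": force_3d([(0.0, 0.0), (50.0, 0.0)])}
extruded = extrude(line, 0.05)  # extrude the line a few cm, as in the post
```
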

Thanks again for all the information. I will try again and give feedback once I am able to test with a LiDAR iPad.