FME Data Express: Add more sensors (compass, accelerometer...)

Related products: FME Form

Add more sensors (compass, accelerometer...)

Hi @antoine


What sensors would you like to be able to access through FME Data Express? Compass and accelerometer? Anything else? What are your precision requirements for the compass? As far as I know, phone compass precision is usually within 15 degrees - is this sufficient?


Also, could you please tell us more about your scenarios? We can discuss the details here or, if you prefer, please e-mail me at Lena.Bagh@safe.com. This information will help us define and prioritize the new functionality better.


Hi!


I am building a demo app to collect data in the field, especially through pictures. The user can point at the target using the "gps"/map component in the app, but sometimes they can be wrong (or lazy), so getting the orientation would allow more robust results (depending on the target distance) without much work from the user.

I do not know about other scenarios involving sensors, but I think sensor data and phone interaction are the strong points of the app compared to the webpage for now, so the more sensors, the better.

To collect data in the field, the ability to draw polygons or lines on a map would be a plus (I created a different post for this). It can be done "easily" through a Leaflet page. We could also imagine optionally allowing voice recording to help with reporting for incident/emergency report apps.

I wish we could also use the app to download the forms, store the request data offline in an SQLite database or a zipped folder, and push it when the network is back, but I guess that is a lot to implement.



It is an interesting scenario - and I believe it will eventually become very popular. I was thinking about extracting GPSDestBearing from image EXIF headers, but for whatever reason photos from my phone do not have any GPS tags in their EXIFs (despite Location and Geotagging turned on). I will look further into this option. Are you going to use iOS or Android devices?


We do plan to add support for more sensors. Compass is definitely the #1 candidate. We rely on users' feedback and requests, so your input is very much appreciated.



To collect data in the field, the ability to draw polygons or lines on a map would be a plus (I created a different post for this). It can be done "easily" through a Leaflet page. We could also imagine optionally allowing voice recording to help with reporting for incident/emergency report apps.

Drawing on the map as another input type sounds like an awesome idea: https://knowledge.safe.com/idea/97756/fme-data-express-allow-user-to-draw-geometries-as.html It is not trivial to implement, but I hope we will get this feature.


Voice recording might happen reasonably soon. I filed an idea https://knowledge.safe.com/idea/98453/fme-data-express-voiceaudio-recorder-for-parameter.html? based on your suggestion. Please vote for it - the more votes this idea gets, the higher its priority will be.



I wish we could also use the app to download the forms, store the request data offline in an SQLite database or a zipped folder, and push it when the network is back, but I guess that is a lot to implement.

This is indeed a lot to implement, but we are considering this functionality. Would you mind posting this as a separate idea? It would be interesting to know exactly how users are going to use the offline mode, to make sure what we implement meets expectations.
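To make sure we are talking about the same thing: the pattern I understand you are asking for is essentially a store-and-forward queue, roughly like the sketch below (hypothetical table and endpoint names, not app internals).

```python
# Rough store-and-forward sketch - hypothetical names, not FME Data Express code.
import json
import sqlite3
import urllib.request

DB_PATH = "pending_submissions.db"          # assumed local offline store
SUBMIT_URL = "https://example.com/webhook"  # placeholder upload endpoint

def init_db(path: str = DB_PATH) -> sqlite3.Connection:
    con = sqlite3.connect(path)
    con.execute("CREATE TABLE IF NOT EXISTS queue (id INTEGER PRIMARY KEY, payload TEXT)")
    con.commit()
    return con

def queue_submission(con: sqlite3.Connection, form_data: dict) -> None:
    """Store a filled-in form locally until it can be uploaded."""
    con.execute("INSERT INTO queue (payload) VALUES (?)", (json.dumps(form_data),))
    con.commit()

def push_pending(con: sqlite3.Connection) -> None:
    """Try to upload every queued submission; anything that fails stays queued."""
    for row_id, payload in con.execute("SELECT id, payload FROM queue").fetchall():
        req = urllib.request.Request(
            SUBMIT_URL,
            data=payload.encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        try:
            urllib.request.urlopen(req, timeout=10)
        except OSError:
            return  # still offline - retry on the next call
        con.execute("DELETE FROM queue WHERE id = ?", (row_id,))
        con.commit()
```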


"...I was thinking about extracting GPSDestBearing from image EXIF headers, but for whatever reason photos from my phone do not have any GPS tags in their EXIFs (despite Location and Geotagging turned on)..."

Not sure what phone you tested with, but my iPhone (SE) does record bearing in the image (jpeg_exif_gpsdestbearing / jpeg_exif_gpsimgdirection).


Thanks for the fast answers.


I collect all the EXIF (btw AttributeJSONPacker is handy, and you still need to play with ASCII codes for some EXIF tags), but they depend too much on the app/phone you use. Using FME Data Express directly would be "safer" and would allow usage without the camera. Of course, the quality still varies a lot (being inside/outside, hardware, whether or not the user calibrated...), without much quality metadata (GPSProcessingMethod and GPSDifferential might be present in EXIF).
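For example, something along these lines (a sketch using Pillow 9.2+, not my exact workspace) is the kind of decoding I mean; FME's JPEG reader exposes the same values as jpeg_exif_* attributes without any code:

```python
# Sketch of pulling the compass-related GPS tags out of a photo's EXIF with Pillow.
from PIL import ExifTags, Image

def gps_exif(path: str) -> dict:
    gps_ifd = Image.open(path).getexif().get_ifd(ExifTags.IFD.GPSInfo)
    out = {}
    for tag_id, value in gps_ifd.items():
        name = ExifTags.GPSTAGS.get(tag_id, str(tag_id))  # e.g. 'GPSImgDirection'
        if isinstance(value, bytes) and len(value) > 8:
            # Tags such as GPSProcessingMethod start with an 8-byte character-code
            # prefix (b'ASCII\x00\x00\x00') - the "ASCII codes" fiddling mentioned above.
            value = value[8:].decode("ascii", errors="replace")
        out[name] = value
    return out

# tags = gps_exif("photo.jpg")
# heading = float(tags["GPSImgDirection"])   # direction the camera was pointing
# bearing = float(tags.get("GPSDestBearing", heading))
```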


From this perspective, I like that in the app you can point anywhere (like a draw-point function) to get the FMEEXPRESS_LOCATION, but it makes it impossible to know whether the person pointed there or used the GPS location (there is no key/value for this in the JSON). You could also include more sensor info in the JSON keys/values.
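For example, keys along these lines would remove the ambiguity (the names below are invented just to illustrate; they are not actual FME Data Express output):

```python
# Hypothetical submission payload - key names invented for illustration only.
submission = {
    "FMEEXPRESS_LOCATION": "<location value as sent today>",
    "location_source": "user_pointed",   # vs. "device_gps": the missing flag
    "compass_heading_deg": 212.5,        # extra sensor readings could live here
    "heading_accuracy_deg": 15.0,
}
```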


Living in Europe, I have been using Android for testing (way more popular here).


The fewer restrictions on the phone/camera app, the greater the chance the app gets used, as we all tend to be lazy and impatient.

@nielsgerrits: I was testing with Android (LG G6). I do expect to have location and bearing in EXIFs... will give it another try. Thank you for the confirmation.


@antoine: our goal is to support all devices. Right now, FME Data Express has pretty much the same functionality on iOS and Android devices. With FME AR, iOS users have access to more advanced functionality. This is a temporary situation, and we hope to bring the Android version of FME AR up to the standard of the iOS version.


Regarding the compass functionality: would you need to capture the bearing, OR whether it was the device location or a user-selected location?



Using FME Data Express directly would be "safer" and would allow usage without the camera.

100% agree



The fewer restrictions on the phone/camera app, the greater the chance the app gets used, as we all tend to be lazy and impatient.

Absolutely!


@lenaatsafe I do not know exactly how those are stored in EXIF, but I need the heading of the target (heading of the person holding the camera + bearing).
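Concretely, this is the kind of downstream computation I have in mind - a rough sketch with assumed variable names and a flat-earth approximation that is fine at typical photo distances:

```python
import math

def target_position(lat: float, lon: float, heading_deg: float, distance_m: float):
    """Estimate the target's lat/lon from the observer's fix, the compass
    heading of the shot (0 = north, clockwise) and a guessed distance."""
    heading = math.radians(heading_deg)
    dlat = distance_m * math.cos(heading) / 111_320.0  # metres per degree of latitude
    dlon = distance_m * math.sin(heading) / (111_320.0 * math.cos(math.radians(lat)))
    return lat + dlat, lon + dlon

# Observer at (43.2965, 5.3698) pointing at 212.5 deg, target roughly 50 m away:
# target_position(43.2965, 5.3698, 212.5, 50.0)
```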


Thank you for the clarification. I've documented all the details. We are definitely considering adding compass support; so far I don't have a timeline for this functionality, though. If it becomes critical for you, please let me know. More votes on this idea will also be helpful.


Hi!

As AI becomes more accessible, good sensor input becomes more and more relevant. For example, a user could talk to their phone as input to the app. Other sensor inputs are also very relevant for sensor fusion analysis. I saw that Lena left - who might be in charge of FME Mobile now, and in what direction is the product going?
As written by other people, I fear that users will just use other apps for input/surveys and a webhook to their FME Automations/workflows if nothing is done. @lenaatsafe