Question

Problem handling JSON from a table column in the same way as the JSON reader

  • December 4, 2019
  • 4 replies
  • 49 views

rob_m_esriuk
Contributor

I have a workbench which takes a file of JSON using the JSON reader, and this automatically formats the data array in the JSON into an FME list which I can then explode and process. In doing this I've not had to provide any map of the JSON. I now need that JSON text to be accessed from a column in a database table. I therefore have a SQL Server reader to read that table, and I have experimented with the JSONFlattener and JSONFragmenter to push the attribute into. Neither seems to give the required result, and I'm thinking that to use either I would need to describe the entire file in some way, which is not practical.
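The difference described here can be sketched outside FME: once the JSON arrives as a string attribute from a table column rather than as a file, it still parses the same way, and the top-level array can be iterated much like an exploded FME list. A minimal Python sketch, using an invented one-row payload (the attribute name and values are hypothetical):

```python
import json

# Hypothetical example: the JSON text arrives as a string value
# read from a SQL Server column, not as a file on disk.
row_json = (
    '{"version": "1.02", "timeZone": "0.0", '
    '"timeSeries": [{"header": {"locationId": "A"}, "events": []}]}'
)

# Parsing the string gives the same structure the file-based
# JSON reader would produce automatically.
doc = json.loads(row_json)

# Iterating the array is the plain-Python analogue of exploding
# the FME list that the JSON reader builds.
for series in doc["timeSeries"]:
    print(series["header"]["locationId"])  # prints A
```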

I've tried a couple of the examples that others have provided, and I cannot yet see any that give me what the JSON reader provides.

Help!

 

Rob

This post is closed to further activity.

4 replies

  • 1891 replies
  • December 6, 2019

@rob_m_esriuk It should be doable with the JSONFlattener or JSONFragmenter. If you can attach a small example of the JSON, and the part of the workspace where you're reading the JSON as a file, then someone in the community might be able to chip in from there.


rob_m_esriuk
Contributor
  • Author
  • 26 replies
  • December 10, 2019

@markatsafe, I'm sure it is doable, but I really can't get the same result. I've obfuscated a bit of the JSON (it's for a customer, so I'm not sure I can just share it) and the structure looks like this:

{
  "version" : "#.##",
  "timeZone" : "0.0",
  "timeSeries" : [ {
    "header" : {
      "type" : "?????????",
      "locationId" : "????????",
      "parameterId" : "????????????",
      "ensembleId" : "??????",
      "ensembleMemberIndex" : "?",
      "timeStep" : {
        "unit" : "second",
        "multiplier" : "???"
      },
      "startDate" : {
        "date" : "????-??-??",
        "time" : "??:??:??"
      },
      "endDate" : {
        "date" : "????-??-??",
        "time" : "??:??:??"
      },
      "missVal" : "?????",
      "stationName" : "????????????",
      "lat" : "??.?????????????",
      "lon" : "??.?????????????",
      "x" : "??????",
      "y" : "??????",
      "z" : "????",
      "units" : "m"
    },
    "events" : [ {
      "date" : "????-??-??",
      "time" : "??:??:??",
      "value" : "?????",
      "flag" : "?"
    }, {
      "date" : "????-??-??",
      "time" : "??:??:??",
      "value" : "?????",
      "flag" : "?"
    }, {
      "date" : "????-??-??",
      "time" : "??:??:??",
      "value" : "?????",
      "flag" : "?"
    }, {
      "date" : "????-??-??",
      "time" : "??:??:??",
      "value" : "?????",
      "flag" : "?"
    }, {
      "date" : "????-??-??",
      "time" : "??:??:??",
      "value" : "?????",
      "flag" : "?"
    },

and when I run it through the JSON reader it instantly interprets the timeSeries as shown in the attached image. I've tried the JSONFragmenter (picture) and then used the AttributeExposer, importing the time series as described by @deanatsafe, but the subsequent ListExploder appears to have trouble finding the list.
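For a structure like the one above, the effect of fragmenting on a JSON query such as json["timeSeries"][*] and then exploding the events list can be sketched in plain Python. The data below is invented to match the obfuscated shape, and the field names are taken from the sample:

```python
import json

# A reduced stand-in for the posted structure: one timeSeries
# element with a header and two events (values are made up).
doc = json.loads('''
{
  "timeSeries": [ {
    "header": { "locationId": "loc1", "parameterId": "p1" },
    "events": [
      { "date": "2019-01-01", "time": "00:00:00", "value": "1.0", "flag": "0" },
      { "date": "2019-01-01", "time": "00:15:00", "value": "2.0", "flag": "0" }
    ]
  } ]
}
''')

# Equivalent of fragmenting each timeSeries element and then
# exploding its events list: one flat record per (header, event) pair.
rows = []
for series in doc["timeSeries"]:
    header = series["header"]
    for event in series["events"]:
        rows.append({**header, **event})

print(len(rows))  # prints 2
```

Each resulting record carries both the header attributes and the event attributes, which is the flat output the file-based reader produces after the list is exploded.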

 


nielsgerrits
VIP
  • 2938 replies
  • December 10, 2019

Why not just write the JSON you read from the database (FeatureWriter) to a temp file (TempPathnameCreator) and read the file like you are used to (FeatureReader)?
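This temp-file round trip is easy to sketch outside FME. Assuming the column value is plain JSON text, writing it to a temporary file and re-reading it recovers exactly the file-based behaviour (the column value below is a made-up example):

```python
import json
import os
import tempfile

# The JSON text as it would arrive from the database column.
column_value = '{"timeSeries": [{"header": {"locationId": "loc1"}, "events": []}]}'

# Write the attribute value out to a temporary file...
fd, path = tempfile.mkstemp(suffix=".json")
with os.fdopen(fd, "w") as f:
    f.write(column_value)

# ...then read that file back exactly as a file-based reader would.
with open(path) as f:
    doc = json.load(f)

print(doc["timeSeries"][0]["header"]["locationId"])  # prints loc1

# Clean up the temporary file when done.
os.remove(path)
```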


rob_m_esriuk
Contributor
  • Author
  • 26 replies
  • December 10, 2019

Why not just write the JSON you read from the database (FeatureWriter) to a temp file (TempPathnameCreator) and read the file like you are used to (FeatureReader)?

This has occurred to me, and I guess it may be the end result; it just seemed a bit clunky. However, it now appears that I need to get a list of locations in the JSON to predicate a reader, so pre-writing and pre-processing the JSON may be the way I have to go.