Solved

How to extract the medium (middle) value of X, Y, Z in FME

  • May 18, 2023
  • 7 replies
  • 30 views

vimva679
Supporter

I have data as below:

 

AssetID  X  Y  Z
A        0  0  1
B        1  0  0
C        0  1  0
D        1  0  1
E        4  5  6
F        9  8  7
G        6  9  5

 

I am able to extract:

the max of each row
the min of each row

How can I obtain the medium (middle) value of each row as simply as possible?

 

I am using:

@max(value of X, value of Y, value of Z)
@min(value of X, value of Y, value of Z)

Medium? (Please note it is not the average or the median of the 3 values; it is exactly the same value as it appears in the Excel file.)
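For reference, the accepted answer below treats the "medium" as the middle element of the three values once they are sorted. A minimal sketch of that idea in plain Python (outside FME, numeric values only):

# "medium" here means the middle element after sorting the three values
def medium(x, y, z):
    return sorted([x, y, z])[1]

print(medium(4, 5, 6))  # 5
print(medium(9, 8, 7))  # 8
print(medium(0, 0, 1))  # 0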

 

 

Best answer by caracadrian (see the accepted answer below).


7 replies

caracadrian
Contributor
  • Contributor
  • 571 replies
  • May 18, 2023

You can put them in a list:

l{0}=@Value(X), l{1}=@Value(Y), l{2}=@Value(Z)

then you can either use the ListStatisticsCalculator custom transformer from the FME Hub, or use a ListSorter followed by an AttributeManager that creates a new attribute with its value equal to the sorted l{1}.
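If it helps to see that logic written out, here is a rough PythonCaller-style sketch of the same idea (a sketch only, not the transformers themselves; the attribute names X, Y, Z are taken from the question):

import fmeobjects

class FeatureProcessor(object):
    # Collect X, Y, Z into a list, sort it, and take the middle element,
    # i.e. the equivalent of the sorted l{1}.
    def input(self, feature):
        values = [float(feature.getAttribute(name)) for name in ("X", "Y", "Z")]
        values.sort()
        feature.setAttribute("medium", values[1])
        self.pyoutput(feature)

    def close(self):
        pass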


vimva679
Supporter
  • Author
  • Supporter
  • 96 replies
  • May 19, 2023


@caracadrian hello, I am not really getting this right. Is it possible for you to create a demo, please?


vimva679
Supporter
  • Author
  • Supporter
  • 96 replies
  • May 19, 2023


Test file (attached)

This is the input data:

[image: input data]

This is my expectation:

[image: expected result]

Any help/support is appreciated.


caracadrian
Contributor
  • Contributor
  • 571 replies
  • Best Answer
  • May 21, 2023


Here you go:


vimva679
Supporter
  • Author
  • Supporter
  • 96 replies
  • May 22, 2023


@caracadrian I didn't realize that the cell values are a combination of numeric and alphanumeric characters.

Irrespective of the alphanumeric characters (ignoring the letters), the max, min and medium values are what I am aiming for.

[image]


caracadrian
Contributor
  • Contributor
  • 571 replies
  • May 22, 2023


A little StringReplacer and some conditional voodoo and you can easily achieve this.

I created copies of each attribute (AttributeCopier), eliminated the whitespace and letter part (StringReplacer with the regex \s?[a-zA-Z]?), got the statistics (ListStatisticsCalculator) and mapped the initial values back (AttributeManager).
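As a rough illustration of that workflow outside the transformers (the "9 m" style values are hypothetical, since the real cell contents are only shown in the attached images):

import re

# Sketch of the described steps: make a numeric copy of each cell by
# stripping whitespace and letters, order the cells by that numeric copy,
# then report the original (alphanumeric) values as min / medium / max.
def stats_keeping_originals(row):
    numeric = {k: float(re.sub(r"\s?[a-zA-Z]", "", v)) for k, v in row.items()}
    ordered = sorted(row, key=lambda k: numeric[k])
    return {"min": row[ordered[0]], "medium": row[ordered[1]], "max": row[ordered[2]]}

print(stats_keeping_originals({"X": "9 m", "Y": "8 m", "Z": "7 m"}))
# {'min': '7 m', 'medium': '8 m', 'max': '9 m'}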


vimva679
Supporter
  • Author
  • Supporter
  • 96 replies
  • May 22, 2023


Perfect, that works so well. Thank you!

[image]

@caracadrian, I am so grateful once again for your quick response as well.