I have run a Hadoop job with multiple reducers in the reduce phase, and I have customized a counter to sum the results across the reducers. However, the result in each reducer is a double, while the counter's increment method only accepts a long. Forcing the conversion from double to long loses precision. How can I solve this problem?
I'm wondering if you can share your workspace, or a simplified version of it, demonstrating how you've set up the custom counter?
The custom counter is as follows:
The source code of the counter's increment method is as follows:
The relevant code in the reduce phase is shown in the figure. The value 'f' is a double, so I have to call Math.round to convert it to a long, which causes the loss of accuracy.
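A common workaround for long-only counters is fixed-point scaling: multiply the double by a fixed power of ten before incrementing, let the framework sum the scaled longs across reducers, and divide by the same factor when reading the final counter value. A minimal sketch, assuming a plain Java model of the counter arithmetic (the class name, `SCALE` factor, and sample values are illustrative, not from the original post):

```java
// Sketch: carry a fixed-point value in a long counter instead of
// rounding the raw double, so only digits beyond the chosen scale are lost.
public class ScaledCounter {
    // Illustrative scale factor: 1_000_000 preserves six decimal places.
    static final long SCALE = 1_000_000L;

    // What each reducer would pass to counter.increment(long).
    static long toScaled(double f) {
        return Math.round(f * SCALE);
    }

    // Applied once to the final aggregated counter value after the job.
    static double fromScaled(long counterValue) {
        return counterValue / (double) SCALE;
    }

    public static void main(String[] args) {
        // Two hypothetical per-reducer results; the framework sums the
        // long increments across reducers automatically.
        long a = toScaled(3.141592);
        long b = toScaled(2.718281);
        System.out.println(fromScaled(a + b));
    }
}
```

With this approach the precision loss is bounded by the scale you choose (here, half of 1e-6 per increment) rather than rounding every double to a whole number; pick the factor so the total cannot overflow a long.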
I'm sorry, unless you are trying to do this in FME, this might be outside my expertise. Are you trying to implement this using FME? If not, hopefully other FME users with Hadoop experience have some hints, or you can post to a forum dedicated to Hadoop.