Conditional Average (average if)

Solved!
nshapir2
Level 1

I want to calculate an average of all entries greater than 0 in a recipe. I am trying to do this in a group recipe. How would I do this? Is there a better way?

1 Solution
HarizoR
Developer Advocate

It treats them as blank, meaning that they won't be taken into account in your average computation.

10 Replies
tgb417

@nshapir2 ,

You don't say whether you want to calculate this average over a group of rows or across columns within a row, so I can't be very specific here.

Second, you didn't say anything about your backend server.   

If your backend is a SQL-based server, you might be able to do something with a custom aggregation using a SQL CASE statement to count the non-zero values and a SQL SUM, then do the division to get the average.
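To illustrate the CASE-statement idea outside of SQL, here is a minimal Python sketch of the same logic (a conditional sum divided by a conditional count); the function name is made up for illustration:

```python
def conditional_average(values):
    """Average of the values strictly greater than 0, mirroring
    SUM(CASE WHEN v > 0 THEN v END) / COUNT(CASE WHEN v > 0 THEN v END)."""
    positives = [v for v in values if v > 0]
    if not positives:
        return None  # no qualifying values, analogous to SQL returning NULL
    return sum(positives) / len(positives)

print(conditional_average([2, -1, 4]))  # averages only 2 and 4 -> 3.0
```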

If you are working with "tall" or tidy style data the window recipe might give you an option to do what you want.  It already knows how to do averages across groups of rows.

Hope that helps a bit. If not, and you are willing to share a bit more about your data and use case, we might be able to help you a bit better.

 

--Tom
nshapir2
Level 1
Author

I have multiple columns that I want to compute averages for, grouped on timestep. It's economic scenario data, so it looks like:

Trial | Timestep | Asset A | Asset B | Asset C

I want to create columns that take the average at each timestep across all trials for Asset A where A is greater than 0, then again for B where B > 0, and again for C where C > 0. I can take an average in the Group recipe easily, but it's not "conditional". I have the option of using Spark or DSS. I tried 'averageif' in Spark, but that did not work.
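For reference, the computation being described can be sketched in plain Python (illustrative data and column names only, not DSS or Spark code):

```python
from collections import defaultdict

# Each row: (trial, timestep, asset_a, asset_b, asset_c) -- made-up sample data
rows = [
    (1, 0, 0.02, -0.02, 0.01),
    (2, 0, 0.04, 0.02, -0.01),
    (1, 1, -0.01, 0.03, 0.02),
    (2, 1, 0.03, 0.01, 0.04),
]

# Collect, per timestep and per asset column, only the values > 0
grouped = defaultdict(lambda: defaultdict(list))
for trial, t, *assets in rows:
    for name, value in zip(("A", "B", "C"), assets):
        if value > 0:
            grouped[t][name].append(value)

# Conditional average per timestep: positives only, across all trials
averages = {
    t: {name: sum(vals) / len(vals) for name, vals in cols.items()}
    for t, cols in grouped.items()
}
print(averages)
```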

 

tgb417

@nshapir2 ,

Thanks for sharing that you are using Spark.

I don't have access to a Spark setup at the moment.

However, I'm wondering whether your visual recipe is set to use the Spark engine or whether it is running in the DSS engine. See below for how you can check.

[Screenshot: Choose Spark Engine]

 

If Spark provides an averageif function, then this might work.

As I've said, I don't have access to a Spark server. However, based on a quick Google search, I'm not sure Spark actually provides an averageif function; searching for averageif, I'm only finding references to MS Excel.

Let us know how you get on with this. I'd like to invite anyone else here in the community with access to Spark to pick up this thread and provide some additional help.

 

--Tom
nshapir2
Level 1
Author

So, I can change it back and forth from DSS to Spark (as in your photo). averageif does not work on Spark, and I am not sure what to do here. Do you know how I can accomplish my goal in DSS?

HarizoR
Developer Advocate

Hi,

If you want to apply a specific filter on your input data before performing the aggregation, you can use the "pre-filter" option on the left and select the "filter" category.

[Screenshot: the pre-filter option in the Group recipe]

 

Best,

Harizo

nshapir2
Level 1
Author

But would this reject rows where the condition is not met in other columns?

In the example below, I want to compute the average across all trials at a given timestep. I want to reject the negative values, BUT since the second row has a negative entry for B, would it throw away that row for all the other columns?

 

Trial | Time | A | B
1 | 0 | .02 | -.02
1 | 1 | .02 | .01
1 | 2 | -.01 | .03
HarizoR
Developer Advocate

Yes, the pre-filter operation removes entire rows.

If you want to retain the other values that are valid in a given row, you should consider running an upstream Prepare recipe on your Dataset and clearing the cells that match your filtering criterion.
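A plain-Python sketch of the difference (illustrative values, not the Prepare recipe itself): dropping whole rows discards valid values from other columns, while clearing only the offending cells keeps them:

```python
rows = [
    {"Trial": 1, "Time": 0, "A": 0.04, "B": -0.02},
    {"Trial": 1, "Time": 1, "A": 0.02, "B": 0.01},
    {"Trial": 1, "Time": 2, "A": -0.01, "B": 0.03},
]

def avg(values):
    """Average that skips missing (None) values."""
    present = [v for v in values if v is not None]
    return sum(present) / len(present) if present else None

# Pre-filter style: drop any row containing a negative value.
# A loses its valid 0.04 from the first row because B was negative there.
kept = [r for r in rows if r["A"] > 0 and r["B"] > 0]
avg_a_prefilter = avg([r["A"] for r in kept])

# Prepare-recipe style: clear only the offending cells; valid values survive.
cleared = [
    {k: (v if k in ("Trial", "Time") or v > 0 else None) for k, v in r.items()}
    for r in rows
]
avg_a_cleared = avg([r["A"] for r in cleared])
print(avg_a_prefilter, avg_a_cleared)
```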

 

Best,

Harizo

nshapir2
Level 1
Author

How could I do that? Does it clear them (treat them as blank) or make them 0 (which would lower the average)?

HarizoR
Developer Advocate

It treats them as blank, meaning that they won't be taken into account in your average computation.
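A quick numeric check of that behavior, sketched in plain Python (not DSS itself): a cleared cell drops out of both the numerator and the denominator, while a 0 would stay in the denominator and pull the average down:

```python
values = [4, -2, 6]

# Cleared cells behave like missing values: they are simply skipped.
cleared = [v if v > 0 else None for v in values]
present = [v for v in cleared if v is not None]
avg_blank = sum(present) / len(present)       # (4 + 6) / 2 -> 5.0

# Zeroing instead keeps the value in the denominator.
zeroed = [v if v > 0 else 0 for v in values]
avg_zero = sum(zeroed) / len(zeroed)          # (4 + 0 + 6) / 3

print(avg_blank, avg_zero)
```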

nshapir2
Level 1
Author

I found the function to do this. Thank you!
