
How to use available variables in the reporters section / "Send to dataset" in scenarios?

Solved!
BlueRabit
Level 2

Hi, I would like to create a scenario that gives me a kind of history dataset built from the outputs of flow checks. The idea is to trigger the scenario once every week, build some datasets in the flow, run checks, etc., and at the end, if all went well, add a new row to a dataset with the date and hour of the scenario execution, the scenario outcome, and more.

I have noticed that we have some variables available in the reporter, that's good!

Now I would like to use them, but first of all understand their JSON structure so I can work autonomously.

1 Solution
dgraham
Dataiker

Hi @BlueRabit ,

The "Send to dataset" reporter adds a new row to an existing dataset at the beginning or end of the scenario run. Note that the dataset (as specified in the "Dataset name" field) must be on a writable connection (i.e. "Allow write" must be enabled under "Usage params" in the connection settings). Additionally, the Timestamp column must be a date column (i.e. storage type Date) in the dataset.

As for your question, the JSON specified in the input text area should be a JSON string with one or more key-value pairs that correspond to the <column_name>: <value> for the row that is added to the dataset. For example, suppose we had a dataset with the following three columns: Timestamp, Scenario, Status.

 

[Screenshot: example_dataset.png]

 

Then we could configure the send to dataset reporter with the following JSON:

[Screenshot: example_send_to_dataset.png]

{
  "Scenario": "${scenarioName}",
  "Status": "${outcome}"
}
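As a side note (not from the original answer, and worth double-checking in your DSS version): other standard scenario variables such as ${projectKey} and ${scenarioId} can be mapped to extra columns in the same way, assuming those columns exist in the target dataset:

{
  "Scenario": "${scenarioName}",
  "Status": "${outcome}",
  "Project": "${projectKey}",
  "ScenarioId": "${scenarioId}"
}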

10 Replies
BlueRabit
Level 2
Author

Thanks a lot for the information. I figured out myself in the meantime what you told me, but I still appreciate it!

BlueRabit
Level 2
Author

Hi again, I solved my problem, thanks to you!
I have another problem now: I'm using a variable, as we saw earlier, in the "Timestamp" column so I can save the date and hour of the scenario run.

My problem is that the hours saved do not correspond to the actual hours; there is a one-hour difference between the actual execution time and the time recorded in my dataset.

For example, when a scenario runs at 8 AM, I'll get a new row in my dataset with a timestamp of 7 AM.

Do you know why? @dgraham

Thanks for any answers

dgraham
Dataiker

Hi @BlueRabit ,

Timestamps are added to the Timestamp column of the dataset in UTC. Depending on your geographic location, this may explain the one-hour difference you observe.
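To illustrate (a minimal Python sketch, not DSS code): Paris is UTC+1 in winter (UTC+2 during summer time), so a run starting at 08:00 local Paris time corresponds to a 07:00 UTC instant, which is exactly the one-hour gap described above.

```python
# Why a scenario run at 08:00 Paris time is logged as 07:00:
# the reporter stores the UTC instant, and Paris is UTC+1 in winter.
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# A winter run starting at 08:00 local Paris time...
local_run = datetime(2023, 1, 9, 8, 0, tzinfo=ZoneInfo("Europe/Paris"))

# ...corresponds to this UTC instant, which is what gets written.
utc_logged = local_run.astimezone(timezone.utc)
print(utc_logged.strftime("%Y-%m-%d %H:%M %Z"))  # 2023-01-09 07:00 UTC
```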

BlueRabit
Level 2
Author

Yes, I've been told about this.
Do you know how I can solve the problem?

I'm in Paris, and I have the same issue even with Europe/Paris set... :'(
Thx @dgraham

Yasmine
Level 2

Hello, 

I have the same problem, @dgraham @BlueRabit. Could you please tell us how to fix this?

Thank you in advance for your reply,

Kind regards,

Yasmine

dgraham
Dataiker

Hi @BlueRabit @Yasmine ,

While it isn't possible to change the timestamp representation from UTC in the "Send to dataset" scenario reporter, you can reformat the Timestamp column afterwards using the format date processor of the Prepare recipe. For example, we could specify "yyyy-MM-dd'T'HH:mm:ss.SSSZ" as the date format in the format date processor, then select the desired local timezone in which to reformat the date, as shown in the attached screenshots.
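For readers who prefer to see the conversion spelled out, here is a rough Python equivalent of what the format date processor does (a sketch, not DSS code; note that Python's %f parses microseconds, so it only approximates the Java-style .SSS millisecond pattern used in DSS):

```python
# Parse a UTC timestamp like the one the reporter writes, then render it
# in a chosen local timezone: roughly what the "format date" processor
# does with the pattern yyyy-MM-dd'T'HH:mm:ss.SSSZ and a target timezone.
from datetime import datetime
from zoneinfo import ZoneInfo

utc_string = "2023-01-09T07:00:00.000+0000"
parsed = datetime.strptime(utc_string, "%Y-%m-%dT%H:%M:%S.%f%z")
paris_time = parsed.astimezone(ZoneInfo("Europe/Paris"))
print(paris_time.isoformat())  # 2023-01-09T08:00:00+01:00
```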

tgb417
Neuron

@dgraham 

When reformatting in this way, how do charts and pivot tables treat the reformatted values: as strings to be enumerated, or as continuous time values that get date-time treatment in the charts? In the past I think I've had some mixed results in this area.

--Tom
dgraham
Dataiker
Dataiker

Hi @tgb417 ,

Since the meaning and storage type of the reformatted timestamp column is Date, the values are treated as regular DSS dates, so the date ranges can be plotted on either a fixed timeline (e.g. "Year", "Day", "Hour", etc.) or regrouped into larger ranges (e.g. "Quarter of year", "Day of month", "Hour of day", etc.).

Yasmine
Level 2

Thank you very much for your reply @dgraham 

 

Kind regards,

Yasmine
