
Sharepoint plugin - scenario trigger

Solved!
e_pap
Level 1

Hi all, 

I have the following case:

I have 3 different files in SharePoint whose filenames consist of a title and a timestamp. I have created the connection with SharePoint and created the datasets, but I want to ask:

1) Is there a way for DSS to ignore the timestamp and create the dataset based only on the title?

2) Can I trigger a scenario to update the dataset in DSS when the file on the SharePoint site gets updated?

Thank you!

4 Replies
AlexT
Dataiker

Hi @e_pap ,

I am not sure I understand what you mean for (1). Could you explain, or perhaps provide a screenshot of what you want to include?

2) You can use a file modified trigger (https://doc.dataiku.com/dss/latest/scenarios/triggers.html#dataset-modification-triggers) on SharePoint files datasets. File modified triggers on SharePoint Lists datasets are not supported.



e_pap
Level 1
Author

Hi @AlexT ,

Thank you for your response!

To clarify my first question, I have filenames like the one in the screenshot below, where the "Sample" part is always the same but the timestamp changes. Is there a way to connect to the files in SharePoint with a regex or something similar?

Screenshot 2023-05-17 at 5.55.01 PM.png

Regarding the second question I created a scenario like the below

Screenshot 2023-05-17 at 5.57.28 PM.png

but what I observed is that I had to go to the dataset I created from the SharePoint file and update the path in the Settings tab to trigger the scenario. When I just dropped an updated file on the SharePoint site, the scenario didn't seem to recognise the change. Maybe my configuration is not correct, or I missed a step.

AlexT
Dataiker

1) You can add a folder pointing to SharePoint where the files will be present, then use a Files in Folder dataset, where you can use glob/regex patterns.

Screenshot 2023-05-17 at 5.33.54 PM.png
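The glob/regex idea above can be sketched in plain Python. The filenames here are hypothetical (the actual title and timestamp format come from your own files), but the pattern illustrates how a rule can select files by their fixed title while ignoring the timestamp part:

```python
import re

# Hypothetical filenames: a fixed title ("Sample") followed by a timestamp.
filenames = [
    "Sample_2023-05-17_17-55-01.csv",
    "Sample_2023-05-16_09-12-44.csv",
    "OtherReport_2023-05-17_10-00-00.csv",
]

# A pattern like this selects every file whose name starts with the
# fixed title, regardless of the timestamp that follows.
pattern = re.compile(r"^Sample_.*\.csv$")

matching = [name for name in filenames if pattern.match(name)]
print(matching)  # only the two "Sample" files
```

A regex in this spirit can go into the Files in Folder dataset's file selection rule, so all timestamped versions of the same title feed one dataset.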

If this doesn't suffice, you could use a Python recipe to read from the folder.
Another option is to use a partitioned folder/dataset and sync the latest available partition if you just need the last file.

https://doc.dataiku.com/dss/latest/partitions/index.html
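If you only need the most recent file, a Python recipe can pick it by parsing the timestamp out of the filename. This is a minimal sketch assuming a hypothetical `Sample_YYYY-MM-DD_HH-MM-SS.csv` naming scheme; in a real recipe the list of names would come from the managed folder's file listing rather than being hard-coded:

```python
from datetime import datetime

# Hypothetical filenames following a "title_YYYY-MM-DD_HH-MM-SS.csv" layout.
filenames = [
    "Sample_2023-05-16_09-12-44.csv",
    "Sample_2023-05-17_17-55-01.csv",
]

def file_timestamp(name):
    # Strip the fixed title prefix and the extension, then parse the rest.
    stamp = name.removeprefix("Sample_").removesuffix(".csv")
    return datetime.strptime(stamp, "%Y-%m-%d_%H-%M-%S")

# The file with the greatest parsed timestamp is the latest one.
latest = max(filenames, key=file_timestamp)
print(latest)  # Sample_2023-05-17_17-55-01.csv
```

Sorting on the parsed timestamp (rather than on the raw string) keeps this correct even if the naming scheme later changes to one that doesn't sort lexicographically.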

If the dataset is pointing to a specific path, newly added files won't be detected. You can use a folder instead of a dataset; a folder should detect any changes in that folder's path.

Thanks

e_pap
Level 1
Author

It worked!

Thank you @AlexT 
