I have a Python script that scrapes an internal website and writes the results to CSV files (two per product). It only writes CSVs because I was running it manually on my desktop. Ultimately, I just want to load the data from these CSV files into two distinct Snowflake tables. What is the best approach?
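Since there are two CSVs per product destined for two tables, a first step inside a Python recipe could be to collect each family of CSVs into a single pandas DataFrame per target table. This is a minimal sketch; the file sources here are simulated in-memory CSVs, and in practice you would pass the actual file paths produced by the scraper:

```python
import io
import pandas as pd

def combine_csvs(files):
    """Concatenate CSV sources (paths or file-like objects) into one DataFrame."""
    return pd.concat((pd.read_csv(f) for f in files), ignore_index=True)

# Simulated per-product CSVs; in practice these would be the scraped files on disk.
product_a = io.StringIO("product,price\nwidget,9.99\n")
product_b = io.StringIO("product,price\ngadget,4.50\n")

summary_df = combine_csvs([product_a, product_b])
print(summary_df.shape)  # (2, 2)
```

Each combined DataFrame can then be written to a Dataiku dataset whose connection points at the corresponding Snowflake table.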
Operating system used: Windows 10
If you search for Selenium in this community, you will find a few relevant posts, including this one: https://community.dataiku.com/t5/Using-Dataiku/Web-scrapping-any-tuto-to-explain-how-to-sart-with-da...
I hope this helps.
I used a Python recipe and got the DataFrames I needed, but I'm having issues with the write_with_schema function. I've opened a separate ticket for that, so this thread can be closed.
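For anyone landing here later: one common cause of write_with_schema problems with scraped data is mixed-type columns (e.g. numeric strings with stray "N/A" values), which confuse schema inference. Below is a hedged sketch of normalizing dtypes with pandas before writing; the column names and the cause of the original issue are assumptions, not confirmed details from the ticket:

```python
import pandas as pd

# Hypothetical scraped data with messy string columns.
df = pd.DataFrame({"price": ["9.99", "4.50", "N/A"], "qty": ["3", "7", ""]})

# Coerce string columns to numeric; unparseable values become NaN, so the
# inferred schema is a consistent numeric type instead of a mixed object column.
df["price"] = pd.to_numeric(df["price"], errors="coerce")
df["qty"] = pd.to_numeric(df["qty"], errors="coerce")

# In a Dataiku Python recipe, the cleaned frame would then be written with:
# import dataiku
# dataiku.Dataset("my_output_dataset").write_with_schema(df)
print(df.dtypes)
```

Doing the type cleanup before the write keeps the Snowflake column types stable across runs.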