Hey everyone! Beginner here using W&B, and I'm excited to showcase my work through this platform. For my side project, I want to run Python code on a weekly basis to update an external MongoDB database containing the dataset for my ML project. I have two W&B-related questions:
1. What is the best way to schedule a job in W&B that runs Python code on a recurring basis? Can this be done through W&B Launch, using a Docker container that holds my Python script and following the steps here? Launch with Docker | Weights & Biases Documentation
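For context on (1), here is roughly the kind of container I had in mind pointing Launch at. This is just a sketch: the script name, base image, and requirements file are placeholders for my actual project files.

```dockerfile
# Sketch only -- update_dataset.py is a placeholder for my weekly ETL script
FROM python:3.11-slim
WORKDIR /app

# Install dependencies (pymongo, requests, wandb, etc.)
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy in the script that pulls from the API and updates MongoDB
COPY update_dataset.py .
ENTRYPOINT ["python", "update_dataset.py"]
```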
2. Do W&B Artifacts support JSON-like data structures? Would I have to save all of the API responses as .json files and then upload those files to W&B Artifacts? Or is there a way to link a MongoDB database to W&B and use it directly?
The main reason for these questions is to see whether I can centralize all of my processes in W&B. For (1), my initial plan was to run a scheduled job through Heroku, and for (2), I already have a free-tier MongoDB database holding all of the data from the API extract performed by the Python code mentioned in (1).
Thanks!