Hi W&B Support Team,
I am currently working on research involving hyperparameter searches and require downloading a substantial number of artifacts from W&B. However, I have encountered a 429 Client Error: Too Many Requests for the URL: https://api.wandb.ai/graphql, along with the following message:
wandb: Network error (HTTPError), entering retry loop.
This issue significantly hampers my workflow. Could you please assist me in increasing the rate limit for my account? Additionally, I would appreciate it if you could inform me of the current rate limit to better understand and manage my usage.
Thank you for your assistance and support.
wandb: WARNING A graphql request initiated by the public wandb API timed out (timeout=19 sec). Create a new API with an integer timeout larger than 19, e.g.,
api = wandb.Api(timeout=29) to increase the graphql timeout.
I'm also getting this warning.
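The warning above already names the fix for the timeout: construct the client with a larger timeout, e.g. `api = wandb.Api(timeout=60)` (the exact value is your choice). For the 429 retry loops, a client-side retry with exponential backoff and jitter can smooth bursts of artifact downloads. A minimal stdlib-only sketch; the helper name, parameters, and the example lambda in the docstring are illustrative, not part of the wandb API:

```python
import random
import time


def retry_with_backoff(fn, max_retries=5, base_delay=1.0, max_delay=60.0):
    """Call fn(), retrying on exception with exponential backoff plus jitter.

    In practice fn might wrap a rate-limited call, e.g.
        lambda: api.artifact("entity/project/name:v0").download()
    (hypothetical names; adjust to your own project).
    """
    for attempt in range(max_retries):
        try:
            return fn()
        except Exception:
            if attempt == max_retries - 1:
                raise  # out of retries: surface the original error
            # double the delay each attempt, capped at max_delay
            delay = min(max_delay, base_delay * (2 ** attempt))
            # small random jitter so parallel workers don't retry in lockstep
            time.sleep(delay + random.uniform(0, delay * 0.1))
```

Wrapping each download in a helper like this keeps transient 429s from aborting a long hyperparameter-search script, at the cost of slower worst-case runs.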
I’m also encountering the following error while using WandB:
wandb: 429 encountered (Filestream rate limit exceeded, retrying in 4.3 seconds.), retrying request
Could you please provide more details on this error? Specifically, I’d like to understand what triggers this rate limit and any best practices or optimizations I can implement to prevent it in the future.
Thank you for your help!
Hi!
The W&B SaaS Cloud API enforces rate limits to maintain system integrity and availability. This prevents any single user from monopolizing resources in the shared infrastructure, so the service remains accessible to all users. Your effective limit can be lower for a variety of reasons.
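On the client side, a common way to stay under a shared rate limit like this is to pace your own requests. A minimal stdlib-only sketch; the class name and interval are assumptions for illustration, not a wandb feature:

```python
import time


class RequestPacer:
    """Ensure at least min_interval seconds elapse between successive calls."""

    def __init__(self, min_interval=1.0):
        self.min_interval = min_interval
        self._last = 0.0  # monotonic timestamp of the previous call

    def wait(self):
        """Sleep just long enough to honor min_interval, then record the call."""
        now = time.monotonic()
        sleep_for = self._last + self.min_interval - now
        if sleep_for > 0:
            time.sleep(sleep_for)
        self._last = time.monotonic()
```

Calling `pacer.wait()` before each API request spreads a batch of downloads out over time instead of bursting them, which is usually enough to avoid tripping a server-side 429.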
For more info, please review our comprehensive docs here!
Let me know if there is anything else I can do for you.
Hi Sang,
We wanted to follow up with you regarding your support request as we have not heard back from you. Please let us know if we can be of further assistance or if your issue has been resolved.
Best,
Weights & Biases
Hi Sang, since we have not heard back from you we are going to close this request. If you would like to re-open the conversation, please let us know!