Could someone help me understand best practices for hyperparameter tuning?

Hello there,

I am relatively new to machine learning and have been using the Weights & Biases platform to track my experiments. Right now I am working on a project where I need to optimize the hyperparameters of my model for better performance.

I have read some tutorials and documentation on hyperparameter tuning, but I’m still a bit unsure about best practices. I have a few questions:

How do you decide which hyperparameters to tune? Are there any guidelines or heuristics to follow?

What are some common strategies for searching the hyperparameter space (grid search, random search, Bayesian optimization), and how do you choose between them?

How do you prevent overfitting during hyperparameter tuning? Are there any techniques or tricks that have worked well for you?

Once you’ve found the optimal hyperparameters, how do you validate them? Do you use a separate validation set or cross-validation?
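For context, here is roughly how I split my data at the moment so that the test set never influences tuning (a minimal sketch with scikit-learn; the toy dataset and split sizes are just stand-ins for my actual setup):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Toy data standing in for my real dataset.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# Hold out a test set first, so it is never touched while tuning...
X_trainval, X_test, y_trainval, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# ...then carve a validation set out of the remainder to tune against.
X_train, X_val, y_train, y_val = train_test_split(
    X_trainval, y_trainval, test_size=0.25, random_state=0
)
```

Is this a reasonable setup, or should I be running k-fold cross-validation on the train+validation portion instead?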

I have also been referring to this page: https://aws.amazon.com/what-is/hyperparameter-tuning/aws

I would appreciate any insights, tips, or resources you can share on these topics.

Also, if you have any specific examples or experiences with hyperparameter tuning using W&B, I would love to hear about them.
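To make that concrete, here is roughly how I have been setting up sweeps so far (a minimal runnable sketch using the wandb sweep API; the project name, metric name, and toy objective are placeholders, and in my real code train() would contain the actual training loop):

```python
import math
import random

import wandb

# Sweep configuration: Bayesian search over two hyperparameters,
# minimizing the validation loss logged by each run.
sweep_config = {
    "method": "bayes",  # could also be "grid" or "random"
    "metric": {"name": "val_loss", "goal": "minimize"},
    "parameters": {
        "learning_rate": {
            "distribution": "log_uniform_values",
            "min": 1e-5,
            "max": 1e-1,
        },
        "batch_size": {"values": [16, 32, 64]},
    },
}

def train():
    # wandb.agent fills run.config with the sampled hyperparameters.
    with wandb.init() as run:
        lr = run.config.learning_rate
        # Toy objective standing in for a real training loop: a noisy
        # bowl whose minimum sits near lr = 1e-3.
        val_loss = (math.log10(lr) + 3) ** 2 + 0.1 * random.random()
        run.log({"val_loss": val_loss})

sweep_id = wandb.sweep(sweep_config, project="hparam-tuning-demo")
wandb.agent(sweep_id, function=train, count=10)
```

Is this the right general shape, or would you structure it differently (for example, with early termination)?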

Thank you in advance.