
Hyperparameter Tuning

Hyperparameters are settings that you choose before training (e.g., tree depth, learning rate, number of neighbors). Hyperparameter tuning systematically searches for the best combination to improve model performance.
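To make the distinction concrete, here is a minimal sketch using scikit-learn's RandomForestClassifier: hyperparameters are passed in when the model is constructed, before any data is seen, while the model's ordinary parameters (here, the trees themselves) are learned later during fitting.

```python
from sklearn.ensemble import RandomForestClassifier

# Hyperparameters are fixed at construction time, before training
model = RandomForestClassifier(n_estimators=100, max_depth=10)
print(model.get_params()['max_depth'])  # 10

# The learned parameters (the individual trees) only exist after calling
# model.fit(X_train, y_train)
```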

Grid Search

Exhaustively tries every combination from a grid of hyperparameter values.
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

param_grid = {
    'n_estimators': [50, 100, 200],
    'max_depth': [None, 10, 20],
    'min_samples_split': [2, 5, 10]
}
# 3 x 3 x 3 = 27 combinations, each evaluated with 5-fold CV (135 fits)
grid_search = GridSearchCV(RandomForestClassifier(), param_grid, cv=5)
grid_search.fit(X_train, y_train)
print(grid_search.best_params_)

Random Search

Samples hyperparameter combinations randomly from specified distributions. More efficient than grid search when there are many hyperparameters, because you control the total number of combinations tried rather than evaluating every cell of a grid.
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV
from scipy.stats import randint

param_dist = {
    'n_estimators': randint(50, 500),  # samples integers in [50, 500)
    'max_depth': randint(5, 50)
}
# n_iter=50: try 50 random combinations instead of an exhaustive grid
random_search = RandomizedSearchCV(RandomForestClassifier(), param_dist,
                                   n_iter=50, cv=5)
random_search.fit(X_train, y_train)
print(random_search.best_params_)

Important: Use Validation Set

Never tune hyperparameters on the test set. Use cross‑validation (as above) or a separate validation set; the test set should be touched only once, at the very end, to estimate final performance.
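If you prefer an explicit validation set over cross‑validation, a common pattern is a three‑way split. The sketch below uses a synthetic dataset from make_classification and a 60/20/20 split; both the dataset and the exact ratios are illustrative assumptions, not requirements.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, random_state=42)  # synthetic data

# First carve off the test set (20%) and lock it away until the very end
X_temp, X_test, y_temp, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

# Then split the remainder into train (60%) and validation (20%);
# 0.25 of the remaining 80% equals 20% of the original data
X_train, X_val, y_train, y_val = train_test_split(
    X_temp, y_temp, test_size=0.25, random_state=42)

print(len(X_train), len(X_val), len(X_test))  # 600 200 200
```

Tune on the validation set (or via cross‑validation within X_train), then report the chosen model's score on X_test exactly once.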


Two Minute Drill
  • Hyperparameters control model training (not learned from data).
  • Grid search tries all combinations.
  • Random search samples randomly – more efficient.
  • Always tune using cross‑validation, not test set.

Need more clarification?

Drop us an email at career@quipoinfotech.com