Hyperparameter Tuning
Optimize model settings for best performance.
Robert Anderson
December 18, 2025
Fine-tune your models.
What are Hyperparameters?
Hyperparameters are settings you choose BEFORE training.
Model parameters: learned from the data during training (e.g., tree splits, network weights)
Hyperparameters: set by you before training starts (e.g., number of trees, learning rate)
Examples
Random Forest:
- n_estimators (number of trees)
- max_depth (tree depth)
- min_samples_split
Neural Network:
- learning_rate
- batch_size
- number of layers
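To make the distinction concrete, here's a minimal scikit-learn sketch using the Random Forest hyperparameters above (the built-in iris data stands in for your own):
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

X, y = load_iris(return_X_y=True)  # example data; swap in your own X, y

# Hyperparameters: chosen by you, before training
model = RandomForestClassifier(n_estimators=100, max_depth=10)

# Parameters: learned from the data when you call fit()
model.fit(X, y)
print(model.feature_importances_)  # derived from the learned trees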
Grid Search
Try all combinations:
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

# Define the grid of values to try
param_grid = {
    'n_estimators': [50, 100, 200],
    'max_depth': [5, 10, 15],
    'min_samples_split': [2, 5, 10]
}

model = RandomForestClassifier()

# Try every combination, scoring each with 5-fold cross-validation
grid_search = GridSearchCV(model, param_grid, cv=5)
grid_search.fit(X, y)

print(f"Best params: {grid_search.best_params_}")
print(f"Best score: {grid_search.best_score_:.2f}")
Random Search
Try a fixed number of random combinations instead of all of them (much faster for large grids):
from sklearn.model_selection import RandomizedSearchCV

random_search = RandomizedSearchCV(
    model,
    param_grid,
    n_iter=20,  # try 20 random combinations
    cv=5
)
random_search.fit(X, y)
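Random search pays off most when you hand it distributions to sample from rather than fixed lists. A minimal sketch using scipy.stats (same model and data as above; the ranges are illustrative):
from scipy.stats import randint

# Sample integers from ranges instead of fixed lists
param_distributions = {
    'n_estimators': randint(50, 201),    # any integer in [50, 200]
    'max_depth': randint(5, 16),         # any integer in [5, 15]
    'min_samples_split': randint(2, 11)  # any integer in [2, 10]
}

random_search = RandomizedSearchCV(
    model,
    param_distributions,
    n_iter=20,
    cv=5
)
random_search.fit(X, y)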
Bayesian Optimization
A smarter search that uses the results of earlier trials to decide what to try next:
# Using the optuna library
import optuna
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def objective(trial):
    # Optuna suggests values; earlier trials guide later suggestions
    n_estimators = trial.suggest_int('n_estimators', 50, 200)
    max_depth = trial.suggest_int('max_depth', 5, 15)
    model = RandomForestClassifier(
        n_estimators=n_estimators,
        max_depth=max_depth
    )
    return cross_val_score(model, X, y, cv=5).mean()

study = optuna.create_study(direction='maximize')
study.optimize(objective, n_trials=50)
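When the study finishes, study.best_params holds the winning configuration and study.best_value its cross-validated score.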
Best Practices
- Start with default parameters
- Use cross-validation
- Random search first (fast)
- Grid search for fine-tuning
- Don't overtune on test data! Keep a held-out test set for one final check (see the sketch below)
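To keep the test set honest, split first and tune only on the training portion. A minimal sketch, reusing X, y, and param_grid from above:
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

# Hold out a test set that the search never sees
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Tune with cross-validation on the training portion only
grid_search = GridSearchCV(RandomForestClassifier(), param_grid, cv=5)
grid_search.fit(X_train, y_train)

# Evaluate the chosen model exactly once on the held-out test set
print(f"Test score: {grid_search.score(X_test, y_test):.2f}")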
Common Parameters to Tune
Tree Models: max_depth, n_estimators
SVM: C, gamma, kernel (see the sketch below)
Neural Networks: learning_rate, batch_size
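As a concrete illustration for the SVM row, a small grid over C, gamma, and kernel might look like this (the ranges are illustrative, not recommendations):
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Illustrative ranges; good values depend on your data
svm_grid = {
    'C': [0.1, 1, 10],
    'gamma': ['scale', 0.01, 0.1],
    'kernel': ['rbf', 'linear']
}

svm_search = GridSearchCV(SVC(), svm_grid, cv=5)
svm_search.fit(X_train, y_train)
print(svm_search.best_params_)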
Remember
- Good features > Perfect hyperparameters
- Start simple
- Use random search for many parameters
#AI #Intermediate #Tuning