MatCraft's defaults work well for most problems, but tuning key hyperparameters can improve convergence speed and solution quality. Here is a guide to the most impactful settings.
```yaml
surrogate:
  hidden_layers: [64, 64]  # Default.
```

For small datasets, shrink to [32, 32] or even [32] to avoid overfitting. For complex objectives, increase capacity to [128, 128] or [128, 64, 32], with [128, 64] as a good first step up.

```yaml
surrogate:
  learning_rate: 0.001         # Default. Decrease to 0.0005 for noisy data.
  epochs: 200                  # Default. Increase to 500 for complex objectives.
  early_stopping_patience: 20  # Prevents overfitting.
```

Using an ensemble of surrogates provides better uncertainty estimates for the acquisition function:
```yaml
surrogate:
  ensemble_size: 5  # Train 5 independent MLPs (default: 1 with MC Dropout).
```

Ensembles are more expensive to train but significantly improve active-learning performance on small datasets.
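A minimal sketch of the deep-ensemble idea, using scikit-learn MLPs as stand-ins for MatCraft's surrogates (the dataset, model sizes, and variable names are illustrative, not MatCraft's API):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(60, 2))  # small dataset, where ensembles help most
y = (X ** 2).sum(axis=1) + 0.05 * rng.normal(size=60)

# Train ensemble_size independent MLPs; diversity comes from random init alone.
ensemble = [
    MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=500, random_state=seed).fit(X, y)
    for seed in range(5)
]

X_query = rng.uniform(-1, 1, size=(10, 2))
preds = np.stack([m.predict(X_query) for m in ensemble])  # shape (5, 10)

# Member disagreement acts as the uncertainty the acquisition function consumes.
mu, sigma = preds.mean(axis=0), preds.std(axis=0)
print(mu.shape, sigma.shape)
```

The key property: where the ensemble members disagree (high `sigma`), the surrogate is unsure, and the acquisition function can direct sampling there.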
```yaml
optimizer:
  sigma0: 0.3          # Start broad; decrease to 0.1 if you have a good prior.
  population_size: 20  # Increase for noisy objectives or high dimensions.
```

For population_size, 4 + floor(3 * ln(n_params)) is usually good; double it for very noisy problems.

```yaml
acquisition:
  type: expected_improvement  # Best default.
  # type: upper_confidence_bound
  # exploration_weight: 2.0   # Higher = more exploration.
```

A higher exploration_weight (2.0-5.0) is useful when you suspect the design space has multiple local optima and want to explore more broadly.
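For intuition, here is a sketch of the two acquisition functions under a minimization convention (MatCraft's exact formulas may differ; `mu` and `sigma` stand for the surrogate's predictive mean and uncertainty):

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, best_f, xi=0.01):
    """EI for minimization: expected amount by which we beat best_f."""
    sigma = np.maximum(sigma, 1e-12)  # guard against zero uncertainty
    imp = best_f - mu - xi
    z = imp / sigma
    return imp * norm.cdf(z) + sigma * norm.pdf(z)

def upper_confidence_bound(mu, sigma, exploration_weight=2.0):
    """Confidence-bound score for minimization; larger is more promising."""
    return -(mu - exploration_weight * sigma)

mu = np.array([0.2, 0.5, 0.1])
sigma = np.array([0.05, 0.4, 0.01])
print(expected_improvement(mu, sigma, best_f=0.15))
print(upper_confidence_bound(mu, sigma))
```

Note how EI rewards the candidate whose mean already beats the incumbent, while UCB with a large exploration_weight favors the high-uncertainty candidate, which is exactly the exploration behavior described above.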
```yaml
campaign:
  batch_size: 5  # Candidates per iteration.
```

If a campaign converges prematurely, increase sigma0 or switch to UCB with a higher exploration_weight.
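One simple way to realize batch_size is greedy top-k selection on acquisition values; production batch strategies typically add a diversity term so the batch does not cluster, so treat this as an illustrative sketch rather than MatCraft's actual selection logic:

```python
import numpy as np

def select_batch(candidates, acq_values, batch_size=5):
    """Greedy top-k: return the batch_size candidates with the highest acquisition value."""
    order = np.argsort(acq_values)[::-1]  # descending by acquisition score
    return candidates[order[:batch_size]]

candidates = np.linspace(0, 1, 20).reshape(-1, 1)
acq_values = np.sin(6 * candidates[:, 0])  # stand-in acquisition scores
batch = select_batch(candidates, acq_values, batch_size=5)
print(batch.shape)  # (5, 1)
```

The first element of the batch is always the acquisition maximizer; the rest are runners-up, which is why larger batches trade some per-candidate quality for evaluation throughput.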