# Documentation


## Command-Line Tuning

The grbtune command-line tool provides a very simple way to invoke parameter tuning on a model (or a set of models). You specify a list of parameter=value arguments first, followed by the name of the file containing the model to be tuned. For example, you can issue the following command (in a Windows command window, or in a Linux/Mac terminal window)...

> grbtune TuneTimeLimit=10000 c:\gurobi900\win64\examples\data\misc07

(substituting the appropriate path to a model, stored in an MPS or LP file). The tool will try to find parameter settings that reduce the runtime on the specified model. When the tuning run completes, it writes a set of .prm files in the current working directory that capture the best parameter settings that it found. It also writes the Gurobi log files for these runs (in a set of .log files).

You can also invoke the tuning tool through our programming language APIs. That will be discussed shortly.
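As a preview, a tuning run from Python might look like the following sketch (assuming gurobipy is installed and licensed; the file names are illustrative):

```python
import gurobipy as gp

# Read the model to be tuned (path is illustrative).
model = gp.read("misc07.mps")

# Bound the total tuning time, as TuneTimeLimit does on the command line.
model.setParam("TuneTimeLimit", 100)

# Run the tuner; improved parameter sets are stored on the model.
model.tune()

# Write each improved set to a .prm file, best set first.
for i in range(model.tuneResultCount):
    model.getTuneResult(i)   # load the i-th best parameter set into the model
    model.write(f"tune{i}.prm")
```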

If you specify multiple model files at the end of the command line, the tuning tool will try to find settings that minimize the total runtime for the listed models.

### Running the Tuning Tool

The first thing the tuning tool does is to perform a baseline run. The parameters for this run are determined by your choice of initial parameter values. If you set a parameter, it will take the chosen value throughout tuning. Thus, for example, if you set the Method parameter to 2, then the baseline run and all subsequent tuning runs will include this setting. In the example above, you'd do this by issuing the command:

> grbtune Method=2 TuneTimeLimit=100 misc07


For a MIP model, you will note that the tuning tool actually performs several baseline runs and captures the mean runtime over these trials. In fact, the tool performs multiple runs for each parameter set it considers. This is done to limit the impact of random effects on the results, as discussed earlier. Use the TuneTrials parameter to adjust the number of trials performed.

Once the baseline run is complete, the time for that run becomes the time to beat. The tool then starts its search for improved parameter settings. Under the default value of the TuneOutput parameter, the tool prints output for each parameter set that it tries...

Testing candidate parameter set 7...

Method 2
MIPFocus 1

Solving with random seed #1 ... runtime 3.63s
Solving with random seed #2 ... runtime 4.12s+

Progress so far: baseline runtime 3.38s, best runtime 2.88s
Total elapsed tuning time 34s (66s remaining)

This output indicates that the tool has tried 7 parameter sets so far. For the seventh set, it changed the value of the MIPFocus parameter (the Method parameter was changed in our initial parameter settings, so this change appears in every parameter set that the tool tries). The first trial solved the model in 3.63 seconds, while the second hit a time limit that was set by the tuning tool (as indicated by the + after the runtime output). If any trial hits a time limit, the corresponding parameter set is considered worse than any set that didn't hit a time limit. The output also shows that the best parameter set found so far gives a runtime of 2.88s. Finally, it shows elapsed and remaining tuning time.
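The ranking rule described above can be sketched in a few lines of Python (a simplified illustration of the rule, not Gurobi's actual implementation): each parameter set is scored by its trials, and any time-limit hit outranks mean runtime.

```python
# Simplified sketch of ranking tuning trials; an illustration of the
# rule described in the text, not Gurobi's internal code.

def rank_key(trials):
    """trials: list of (runtime_seconds, hit_time_limit) pairs.

    Returns a sort key: any set with a time-limit hit sorts after every
    set without one; ties are broken by mean runtime over the trials.
    """
    hit_limit = any(hit for _, hit in trials)
    mean_runtime = sum(t for t, _ in trials) / len(trials)
    return (hit_limit, mean_runtime)

# Trials from the example output: the second trial hit the limit ("+").
candidate = [(3.63, False), (4.12, True)]
baseline = [(3.38, False), (3.40, False)]

best = min([candidate, baseline], key=rank_key)
# baseline wins: the candidate's time-limit hit makes it rank worse
```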

Tuning normally proceeds until the elapsed time exceeds the tuning time limit. However, hitting CTRL-C will also stop the tool.

When the tuning tool finishes, it prints a summary...

Tested 20 parameter sets in 97.89s

Baseline parameter set: runtime 3.38s

Improved parameter set 1 (runtime 1.62s):

Method 2
Heuristics 0
VarBranch 1
CutPasses 3
GomoryPasses 0

Improved parameter set 2 (runtime 2.03s):

Method 2
Heuristics 0
VarBranch 1
CutPasses 3

Improved parameter set 3 (runtime 2.38s):

Method 2
VarBranch 1

Wrote parameter files tune1.prm through tune3.prm
Wrote log files: tune1.log through tune3.log

The summary shows the number of parameter sets it tried, and provides details on a few of the best parameter sets it found. It also shows the names of the .prm and .log files it writes. You can change the names of these files using the ResultFile parameter. If you set ResultFile=model.prm, for example, the tool would write model1.prm through model3.prm and model1.log through model3.log.

The number of sets that are retained by the tuning tool is controlled by the TuneResults parameter. The default behavior is to keep the sets that achieve the best trade-off between runtime and the number of changed parameters. In other words, we report the set that achieves the best result when changing one parameter, when changing two parameters, and so on. We actually report a Pareto frontier, so, for example, we won't report a result for three parameter changes if it is worse than the result for two parameter changes.
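The Pareto-frontier selection just described can be sketched as follows (a simplified illustration, not the tool's actual code): keep the best runtime for each count of changed parameters, then drop any entry that a cheaper entry already beats.

```python
# Illustrative sketch of Pareto-frontier filtering over tuning results;
# each result is a (num_changed_params, runtime_seconds) pair.

def pareto_results(results):
    # Best runtime seen for each count of changed parameters.
    best = {}
    for n_changed, runtime in results:
        if n_changed not in best or runtime < best[n_changed]:
            best[n_changed] = runtime
    # Keep a result only if it strictly beats every result with fewer changes.
    frontier = []
    best_so_far = float("inf")
    for n_changed in sorted(best):
        if best[n_changed] < best_so_far:
            frontier.append((n_changed, best[n_changed]))
            best_so_far = best[n_changed]
    return frontier

# Runtimes loosely modeled on the summary above, plus one dominated entry:
# 3 changes at 2.50s is dropped because 2 changes already achieve 2.38s.
results = [(2, 2.38), (4, 2.03), (5, 1.62), (3, 2.50)]
print(pareto_results(results))
```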

### Other Tuning Parameters

So far, we've only talked about using the tuning tool to minimize the time to find an optimal solution. For MIP models, you can also minimize the optimality gap after a specified time limit. You don't have to take any special action to do this; you just set a time limit. Whenever a baseline run hits this limit, the tuning tool will automatically try to minimize the MIP gap. To give an example, the command...

> grbtune TimeLimit=100 glass4

...will look for a parameter set that minimizes the optimality gap achieved after 100s of runtime on model glass4. If the tool happens to find a parameter set that solves the model within the time limit, it will then try to find settings that minimize mean runtime.

For models that don't solve to optimality within the specified time limit, you can gain more control over the criterion used to choose a winning parameter set with the TuneCriterion parameter. This parameter allows you to tell the tuning tool to search for parameter settings that produce the best incumbent solution or the best lower bound, rather than always minimizing the MIP gap.

You can modify the TuneOutput parameter to produce more or less output. The default value is 2. A setting of 0 produces no output; a setting of 1 only produces output when an improvement is found; a setting of 3 produces a complete Gurobi log for each run performed.

If you would like to use a MIP start with your tuning run, you can include the name of the start file immediately after the model name in the argument list. For example:

> grbtune misc07.mps misc07.mst

You can also use MIP starts when tuning over multiple models; any model that is immediately followed by a start file in the argument list will use the corresponding start. For example:

> grbtune misc07.mps misc07.mst p0033.mps p0548.mps p0548.mst