
Optimization Time

This section explains optimization time for Levenberg-Marquardt, Random, and Hybrid (Random/LM) optimization.

Levenberg-Marquardt Optimization

Most of the optimizer's time is spent performing function evaluations (the number of these is specified as Function_eval). Each function evaluation calls the model a number of times to generate the data points defined by the setup.

The following equations illustrate the relationship between the number of calls to the model (specified as Model_eval) and the other factors involved in the optimization. Model_eval is an integer that is proportional to the time it takes to perform an optimization.

I = number of iterations

A = number of parameters optimized when calculating Jacobian

A = 0 when updating Jacobian by rank 1

Function_eval = I + 1 + (I • A)

Model_eval = total number of data points per Setup • Function_eval
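As a quick check, the two counting formulas above can be expressed directly. The example values (iterations, parameters, data points) below are assumptions chosen for illustration, not IC-CAP output:

```python
def function_evals(iterations: int, jacobian_params: int) -> int:
    """Function_eval = I + 1 + (I * A).
    jacobian_params (A) is 0 when updating the Jacobian by rank 1."""
    return iterations + 1 + iterations * jacobian_params

def model_evals(points_per_setup: int, iterations: int,
                jacobian_params: int) -> int:
    """Model_eval = total data points per Setup * Function_eval."""
    return points_per_setup * function_evals(iterations, jacobian_params)

# Assumed example: 10 iterations, 4 optimized parameters, 50 data points.
print(function_evals(10, 4))   # 10 + 1 + (10 * 4) = 51
print(model_evals(50, 10, 4))  # 50 * 51 = 2550
```

Note that the Jacobian term (I • A) dominates as the parameter count grows, which is why the tips below emphasize optimizing fewer parameters at once.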

To help reduce optimization time, use good initial values for parameters, use the least practical number of data points, and avoid optimizing large numbers of parameters at the same time.

For relatively simple functions that can be represented by a few equations, you can write a transform that evaluates the function directly by solving the equations with the specified parameter values and bias values. Enter the name of this function transform in the Simulated field of the optimizer Inputs table, and the name of the measured data set in the Measured field; the optimization then uses the transform's equation playback instead of a simulation for each iteration. The advantage is that a function executes in much less time than a simulation. The cj setup in the bjt_npn model provides an example of this.
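As an illustration of the idea (not the actual cj transform), a junction-capacitance equation of the standard SPICE form can be evaluated directly from parameter and bias values; the parameter names and example values here are assumptions:

```python
def cj_transform(vbias, cj0, vj, m):
    """Evaluate cj(V) = CJ0 / (1 - V/VJ)**M at each bias point.
    A closed-form evaluation like this replaces a full circuit
    simulation at every optimizer iteration."""
    return [cj0 / (1.0 - v / vj) ** m for v in vbias]

# Assumed reverse-bias sweep and parameter values for illustration.
biases = [-2.0, -1.0, 0.0]
print(cj_transform(biases, 1e-12, 0.75, 0.33))
```

Because each optimizer iteration calls this function instead of a simulator, the per-iteration cost drops from a simulation run to a handful of arithmetic operations.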

Random Optimization

There is no Jacobian calculation in a Random optimization, so the optimization time is not affected by the number of parameters optimized. A random guess and function evaluation occur at the start of each iteration. If the guess reduces the error, the next iteration begins. If the error is not reduced, the parameter values are moved in the opposite direction of the guess and the function is reevaluated. As a result, each iteration contains either one or two function evaluations, regardless of the number of parameters being optimized.
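The guess-and-reverse scheme described above can be sketched as follows; the function names, step sizing, and bounding details are assumptions for illustration, not the optimizer's actual implementation:

```python
import random

def clip(x, lo, hi):
    return max(lo, min(hi, x))

def random_optimize(error_fn, params, bounds, max_iters=100, step=0.1):
    """Random search: one guess per iteration; if the guess fails,
    step in the opposite direction and reevaluate. Each iteration
    therefore costs one or two function evaluations, independent of
    the number of parameters."""
    best = list(params)
    best_err = error_fn(best)          # initial function evaluation
    evals = 1
    for _ in range(max_iters):
        delta = [random.uniform(-step, step) * (hi - lo)
                 for lo, hi in bounds]
        guess = [clip(p + d, lo, hi)
                 for p, d, (lo, hi) in zip(best, delta, bounds)]
        err = error_fn(guess)          # first evaluation this iteration
        evals += 1
        if err < best_err:
            best, best_err = guess, err
            continue                   # guess reduced error: done in one
        opposite = [clip(p - d, lo, hi)
                    for p, d, (lo, hi) in zip(best, delta, bounds)]
        err = error_fn(opposite)       # second evaluation this iteration
        evals += 1
        if err < best_err:
            best, best_err = opposite, err
    return best, best_err, evals
```

Note that the per-iteration cost is fixed at one or two evaluations no matter how many parameters are in `params`; only the number of iterations grows with the search space, which motivates the tight bounding advised below.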

However, the additional degrees of freedom that result from additional parameters generally result in a larger number of iterations to achieve the specified exit condition. This effect can be controlled by bounding the parameters as tightly as possible.

Hybrid (Random/LM) Optimization

In Hybrid (Random/LM) optimization, each function evaluation of Random optimization is replaced by a full Levenberg-Marquardt optimization. As a result, Hybrid (Random/LM) optimization begins with a Levenberg-Marquardt optimization. If the initial optimization is successful, no Random optimization is performed and the optimization time is identical to that of performing a standard Levenberg-Marquardt optimization. If the initial optimization is not successful, the optimization time depends on how many iterations are required.
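The control flow described above might be sketched like this, treating each Levenberg-Marquardt run as a black box; the interfaces here are assumptions for illustration, not IC-CAP's API:

```python
def hybrid_optimize(lm_optimize, random_guess, params, target_error,
                    max_random_iters=20):
    """Hybrid (Random/LM): each 'function evaluation' of the random
    scheme is a complete LM optimization.
    lm_optimize(params) -> (optimized_params, error) is an assumed
    interface for a full Levenberg-Marquardt run; random_guess(params)
    perturbs the starting point."""
    best, best_err = lm_optimize(params)   # initial LM optimization
    if best_err <= target_error:
        return best, best_err              # cost == one standard LM run
    for _ in range(max_random_iters):
        candidate, err = lm_optimize(random_guess(best))
        if err < best_err:
            best, best_err = candidate, err
        if best_err <= target_error:
            break
    return best, best_err
```

When the initial LM run already meets the exit condition, the loop never executes, matching the statement above that the cost is then identical to a standard Levenberg-Marquardt optimization.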

One way to speed up Hybrid (Random/LM) optimization is to reduce the maximum number of function evaluations allowed for the Levenberg-Marquardt optimization (Max Evals). This causes the optimizer to take another random guess when it cannot reduce the error quickly. The same effect can also be achieved by increasing the Function Tol parameter.

