
# Implementation

Preliminary notes:
• for both procedures you may set the global variable ALIAS_Has_Optimum to 1 to indicate that you have already determined a possible optimum value. This value must be given in the global interval variable ALIAS_Optimum. Note that this variable stores the minimum or the maximum of the function to be minimized or maximized. If no a-priori information on the extremum has been given, ALIAS_Has_Optimum will be set to 1 as soon as the algorithm has a current estimation of the extremum. If both the minimum and the maximum are looked for, this variable stores the minimum while ALIAS_Optimum_Max stores the maximum. If you have indicated an a-priori estimation of the extremum, it may happen that the algorithm is unable to find a better extremum. In that case the global variables ALIAS_Algo_Optimum and ALIAS_Algo_Optimum_Max will be set to the extremum found by the algorithm (hence, if no a-priori information has been given, ALIAS_Algo_Optimum and ALIAS_Optimum will be identical). The flag ALIAS_Algo_Has_Optimum will be set to 1 as soon as the algorithm has a current estimation of the extremum (which may be worse than the optimum given a priori). If the algorithm finds a better optimum than the one given a priori, the flag ALIAS_Algo_Find_Optimum will be set to 1.

The location of the minimum and maximum can be found in the interval matrix ALIAS_Vector_Optimum, the first line indicating the location of the minimum.
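The bookkeeping described above can be sketched as follows. This is only a toy model of the behaviour of the ALIAS_Has_Optimum / ALIAS_Optimum / ALIAS_Algo_* variables for a minimization: the struct and function names are hypothetical, not part of ALIAS, and intervals are reduced to plain doubles for brevity.

```cpp
#include <cassert>
#include <limits>

// Hypothetical mirror of the ALIAS flags described above (not the real
// library): tracks an optional a-priori minimum and the algorithm's own best.
struct MinimizerState {
    bool has_optimum = false;        // a-priori estimate supplied?
    double optimum = std::numeric_limits<double>::infinity();
    bool algo_has_optimum = false;   // algorithm produced its own estimate
    double algo_optimum = std::numeric_limits<double>::infinity();
    bool algo_find_optimum = false;  // algorithm beat the a-priori estimate
};

// Record a function value found during the search: the algorithm's own best
// is always tracked, while the shared optimum only improves when the new
// value beats the current (possibly a-priori) bound.
inline void record_value(MinimizerState& s, double value) {
    if (!s.algo_has_optimum || value < s.algo_optimum) {
        s.algo_has_optimum = true;
        s.algo_optimum = value;
    }
    if (!s.has_optimum) {            // no a-priori bound: adopt immediately
        s.has_optimum = true;
        s.optimum = value;
    } else if (value < s.optimum) {  // improved on the existing bound
        s.optimum = value;
        s.algo_find_optimum = true;
    }
}
```

Note how, with an a-priori bound of 5, a sampled value of 7 updates only the algorithm's own estimate, while a later value of 3 updates both and raises the "found better" flag.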

• the flag ALIAS_Optimize indicates which type of optimum you are looking for: -1 for a minimum, 1 for a maximum and 10 for both. This flag is set automatically by the optimization procedures
• the accuracy with which the optimum will be computed is available in the global variable ALIAS_Accuracy_Extremum
• the function number of the expression to be optimized is available in the global variable ALIAS_Opt_Func
• you may follow the progress of the optimization process by setting the debug flag to at least 1 and setting the flag ALIAS_Allow_Storage to 1. In that case, each time the algorithm finds a better approximation than the current one, it will write the new value to the file ALIAS_Allow_Storage_File (whose default name is .opti)
• you may also use these procedures simply to test whether a given function has a minimum lower (or a maximum greater) than a given value. If you set the flag ALIAS_Stop to 1 and the double ALIAS_Extremum to the desired value, the optimization procedure will return as soon as a better minimum or maximum is found. Note that the optimization procedure always resets ALIAS_Stop to 0.
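The early-exit test described in this item can be sketched as follows. This is not the ALIAS API but a minimal stand-in showing how a bisection search can return as soon as it finds a value better than a given threshold (the role played by ALIAS_Extremum); the function name and signature are hypothetical.

```cpp
#include <deque>
#include <utility>

// Sketch (not ALIAS code) of the early-exit behaviour: a plain bisection
// search over [lo, hi] that stops as soon as a sample of f drops below
// `threshold`, instead of running the full minimization.
template <typename F>
bool find_below(F f, double lo, double hi, double threshold,
                double min_width, double* where) {
    std::deque<std::pair<double, double>> boxes{{lo, hi}};
    while (!boxes.empty()) {
        auto [a, b] = boxes.front();
        boxes.pop_front();
        double mid = 0.5 * (a + b);
        if (f(mid) < threshold) {   // better value found: return immediately
            *where = mid;
            return true;
        }
        if (b - a > min_width) {    // otherwise keep bisecting
            boxes.emplace_back(a, mid);
            boxes.emplace_back(mid, b);
        }
    }
    return false;                   // no point below the threshold was found
}
```

For example, asking whether x² drops below 0.5 on [-2, 2] succeeds at the very first midpoint, while asking for a value below -1 exhausts the boxes and reports failure.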
• these procedures are only specific cases of the general solving procedures; hence the computation time may be largely influenced by the bisection mode
• another factor which may be important for efficiency is the storage mode. We have seen that two storage modes are available in ALIAS: the direct and the reverse mode. The underlying principle of the interval analysis method is to examine the leaves of a tree, the terminal leaves satisfying a decision criterion: either they are a solution of the problem or they are not. In the reverse storage mode we follow all the branches starting from a given leaf until we reach every terminal leaf obtained from it, while in the direct storage mode we skip from one branch to another. The direct mode has the drawback of an exponential growth in the needed storage, while the reverse mode has the advantage of going directly to the smallest leaves, which may be deleted, thereby freeing storage space. When solving a system the tree is static, i.e. we can construct it (this is what the algorithm is doing) and we explore all its leaves. In an optimization problem the tree is dynamic: as soon as the exploration of a leaf enables us to update the current estimation of the extremum, we can in fact discard all the branches of the tree that cannot lead to an improvement of the extremum. In other words, it is important to determine the best estimation as quickly as possible. In an optimization problem we cannot determine beforehand which branches have to be followed to find the leaf that is the solution. The reverse mode therefore has the drawback that if we follow the wrong branches we will never eliminate branches, and we will explore a large number of leaves that could have been avoided. With the direct mode we skip from one branch to another and thus have a good probability of updating the extremum quickly. It is therefore good policy to use the direct mode, but we then have to deal with the storage problem. A mixed strategy can be used: we start with the direct mode and, as soon as we expect a storage problem, we switch to the reverse mode.
This may be done by setting the flag Switch_Reverse_Storage to a value S larger than 1: in that case, as soon as the number of stored unexplored boxes exceeds S, the algorithm will switch to the reverse mode. S may, for example, be chosen as half the number of storage spaces that have been allocated. In this optimization mode we may also use a mixed strategy between the direct and the reverse storage mode: if the flag Reverse_Storage is set to m, then if the bisection of the current box leads to k new boxes, the first m boxes (as ordered using the ordering described in the Order section) are stored in the list starting at the current position of the list, while the remaining k - m boxes are stored at the end of the list. Thus setting m to 1 is equivalent to using the direct storage mode, while setting m to a value at least equal to the number of boxes that a bisection may produce is equivalent to using the reverse storage mode.
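The mixed storage rule can be illustrated with a toy model (the function below is hypothetical, not ALIAS code): after bisection, the first m children are reinserted at the current position of the box list and the remainder are appended at the end.

```cpp
#include <algorithm>
#include <cstddef>
#include <deque>
#include <vector>

// Toy model of the mixed storage rule described above: the first m of the
// `children` produced by a bisection go back to the front of the list (the
// current position), the rest are appended at the end. m = 1 behaves like
// the direct mode; m >= children.size() like the reverse (depth-first) mode.
template <typename Box>
void store_children(std::deque<Box>& list,
                    const std::vector<Box>& children, std::size_t m) {
    std::size_t n_front = std::min(m, children.size());
    // push in reverse so the first n_front children keep their order in front
    for (std::size_t i = n_front; i-- > 0;)
        list.push_front(children[i]);
    for (std::size_t i = n_front; i < children.size(); ++i)
        list.push_back(children[i]);
}
```

With a pending box list {9} and three children {1, 2, 3}, m = 1 yields {1, 9, 2, 3} (one child examined next, the rest deferred), while m = 3 yields {1, 2, 3, 9} (all children examined before the older box).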
• if you use a mixed bisection mode there is a convenient way to indicate some dependencies between the variables; see note 8.3.5
• the 2B method (see section 2.17) may be used to improve the efficiency of the optimization algorithms. The ALIAS-Maple procedures HullIConsistency and HullConsistency implement the 2B method and allow one to deal with an equation or an inequality that involves the current optimum.


Jean-Pierre Merlet 2012-12-20