Simulated Annealing - 23 August 2011 - BioInformatics Pakistan

Simulated Annealing

1. Description:
   Simulated annealing is a generalization of a Monte Carlo method for examining the equations of state and frozen states of n-body systems [Metropolis et al. 1953]. The concept is based on the manner in which liquids freeze or metals recrystallize in the process of annealing. In an annealing process a melt, initially at high temperature and disordered, is slowly cooled so that the system is at all times approximately in thermodynamic equilibrium. As cooling proceeds, the system becomes more ordered and approaches a "frozen" ground state at T = 0. The process can therefore be thought of as an adiabatic approach to the lowest energy state. If the initial temperature of the system is too low, or cooling proceeds too quickly, the system may be quenched, forming defects or freezing out in meta-stable states (i.e., trapped in a local minimum energy state).

   In the original Metropolis scheme, an initial state of a thermodynamic system is chosen at energy E and temperature T. Holding T constant, the configuration is perturbed and the change in energy dE is computed. If the change in energy is negative, the new configuration is accepted. If the change in energy is positive, it is accepted with a probability given by the Boltzmann factor exp(-dE/T). This process is repeated enough times to give good sampling statistics for the current temperature; the temperature is then decremented and the entire procedure repeated until a frozen state is reached at T = 0.

   By analogy, the generalization of this Monte Carlo approach to combinatorial problems is straightforward [Kirkpatrick et al. 1983, Cerny 1985]. The current state of the thermodynamic system is analogous to the current solution of the combinatorial problem, the energy equation of the thermodynamic system is analogous to the objective function, and the ground state is analogous to the global minimum. The major difficulty (art) in implementing the algorithm is that there is no obvious analogue of the temperature T among the free parameters of the combinatorial problem. Furthermore, avoidance of entrainment in local minima (quenching) depends on the "annealing schedule": the choice of initial temperature, how many iterations are performed at each temperature, and how much the temperature is decremented at each step as cooling proceeds.

2. Key Points:
  1. Simulated annealing exploits an analogy between the annealing process and the search for the optimum in a more general system.
  2. Annealing Process
    • When the temperature is raised to a very high level (the melting temperature, for example), the atoms have a high energy state and a high probability of re-arranging the crystalline structure.
    • As the material cools slowly, the atoms have lower and lower energy states and a smaller and smaller probability of re-arranging the crystalline structure.
  3. Analogy:
    • Metal ↔ Problem
    • Energy State ↔ Cost Function
    • Temperature ↔ Control Parameter
    • A completely ordered crystalline structure ↔ the optimal solution for the problem
    • Global optimal solution can be achieved as long as the cooling process is slow enough. 
  4. Metropolis Loop:
    1. The essential characteristic of simulated annealing.
    2. Determines how to randomly explore new solutions and whether to accept or reject each new solution at a constant temperature T.
    3. Repeated until equilibrium is achieved.
  5. Metropolis Criterion:
    • Let
      1. X be the current solution and X' be the new solution
      2. C(X) (resp. C(X')) be the energy state (cost) of X (resp. X')
    • Acceptance probability Paccept = exp[(C(X) - C(X')) / T]
    • Let N = Random(0, 1)
    • Unconditionally accepted if
      1. C(X') < C(X), i.e., the new solution is better
    • Probabilistically accepted if
      1. C(X') >= C(X), i.e., the new solution is worse; accepted only when N < Paccept
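The criterion above can be sketched as a small function (Python; this is an illustrative sketch, and the name `metropolis_accept` is my own):

```python
import math
import random

def metropolis_accept(cost_current, cost_new, temperature):
    """Metropolis criterion: always accept an improvement; accept a
    worse solution with probability exp((C(X) - C(X')) / T)."""
    if cost_new < cost_current:           # unconditional acceptance
        return True
    p_accept = math.exp((cost_current - cost_new) / temperature)
    return random.random() < p_accept     # N < Paccept
```

Note that as T grows the acceptance probability for a worse move approaches 1, and as T shrinks it approaches 0, which is exactly the exploration-to-exploitation transition the annealing schedule controls.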
3.  Algorithm Structure:
4.  Algorithm:
Initialize an initial solution X, the highest temperature Th, and the lowest temperature Tl
T = Th
While the temperature is higher than Tl:
    While not in equilibrium:
        Search for a new solution X'
        Accept or reject X' according to the Metropolis criterion
    Decrease the temperature T

  The following must be defined for each specific problem:
  • Definition of a solution
  • Search mechanism, i.e., the definition of a neighborhood
  • Cost function
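Putting the loop structure and the Metropolis criterion together, a generic skeleton might look like the following (Python sketch; the `cost` and `neighbor` functions are the problem-specific pieces, and the default parameter values are arbitrary assumptions):

```python
import math
import random

def simulated_annealing(initial, cost, neighbor,
                        t_high=100.0, t_low=1e-3, rd=0.9, loops=100):
    """Generic SA skeleton: geometric cooling T' = T * Rd, with a fixed
    number of Metropolis iterations per temperature as 'equilibrium'."""
    x = initial
    c = cost(x)
    t = t_high
    best, best_c = x, c
    while t > t_low:
        for _ in range(loops):              # Metropolis loop at fixed T
            x_new = neighbor(x)
            c_new = cost(x_new)
            # Accept better moves always, worse moves with Boltzmann probability
            if c_new < c or random.random() < math.exp((c - c_new) / t):
                x, c = x_new, c_new
                if c < best_c:              # track the best solution seen
                    best, best_c = x, c
        t *= rd                             # cool down
    return best, best_c
```

For example, minimizing f(x) = x² with a neighborhood of random steps in [-1, 1] would be `simulated_annealing(10.0, lambda x: x * x, lambda x: x + random.uniform(-1.0, 1.0))`.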
5. Control Parameters:
  1. Definition of equilibrium
    • No significant improvement after a certain number of loops
    • A constant number of loops
  2. Annealing schedule (i.e. How to reduce the temperature)
    • A constant value, T’ = T - Td
    • A constant scale factor, T’= T * Rd
      1. A scale factor usually can achieve better performance
  3. Temperature determination
    • Artificial, with no physical significance
    • Initial temperature
      1. 80-90% acceptance rate
    • Final temperature
      1. A constant value, i.e., based on the total number of solutions searched
      2. No improvement during the entire Metropolis loop
      3. Acceptance rate falling below a given (small) value
    • Problem specific and may need to be tuned
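One common way to apply the 80-90% acceptance-rate rule of thumb above is to sample some random uphill moves and solve exp(-avg_dE / T0) = p for T0 (Python sketch; the function name and the idea of averaging sampled cost increases are my own illustration, not a prescription from the text):

```python
import math

def initial_temperature(uphill_deltas, target_rate=0.85):
    """Pick T0 so that a typical uphill move of size avg_dE is accepted
    with probability target_rate: exp(-avg_dE / T0) = target_rate."""
    avg_de = sum(uphill_deltas) / len(uphill_deltas)
    return -avg_de / math.log(target_rate)
```

The `uphill_deltas` would be gathered by generating random neighbors of a few random solutions and recording the positive cost changes.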

6. Example:
  1. Traveling Salesman Problem (TSP)
    • Given 6 cities and the traveling cost between any two cities
    • A salesman needs to start from city 1, visit all other cities, and then return to city 1
    • Minimize the total traveling cost
  2. Solution Representation
    • An integer list, i.e., (1,4,2,3,6,5)
  3. Search Mechanism
    • Swap any two integers (except for the first one)
    • (1,4,2,3,6,5) ↔ (1,4,3,2,6,5)
  4. Cost Function
    • The total traveling cost of the tour, i.e., the sum of the costs of consecutive city pairs, including the return to city 1
  5. Temperature
    • Initial temperature determination
      1. Around an 80% acceptance rate for "bad" moves
      2. Determine an acceptable cost increase (Cnew - Cold)
    • Final temperature determination
      1. Stop criteria
      2. Solution space coverage rate
    • Annealing schedule
      1. A constant scale factor (90%, for example)
      2. Depending on solution space coverage rate
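As a concrete sketch of the TSP setup above (Python; the city coordinates and parameter values are invented for illustration), a tour is a list that keeps the first city fixed, the neighborhood swaps two other positions, and the cost is the sum of the edge costs around the closed loop:

```python
import math
import random

def tour_cost(tour, dist):
    """Total cost of visiting the cities in order and returning to the start."""
    n = len(tour)
    return sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))

def swap_neighbor(tour):
    """Swap two positions, keeping the first city fixed."""
    i, j = random.sample(range(1, len(tour)), 2)
    new = tour[:]
    new[i], new[j] = new[j], new[i]
    return new

def anneal_tsp(dist, t=100.0, t_low=1e-3, rd=0.9, loops=200):
    """Simulated annealing over tours with geometric cooling T' = T * Rd."""
    tour = list(range(len(dist)))
    cost = tour_cost(tour, dist)
    best, best_cost = tour[:], cost
    while t > t_low:
        for _ in range(loops):
            cand = swap_neighbor(tour)
            c = tour_cost(cand, dist)
            # Metropolis criterion: better always, worse with Boltzmann probability
            if c < cost or random.random() < math.exp((cost - c) / t):
                tour, cost = cand, c
                if cost < best_cost:
                    best, best_cost = tour[:], cost
        t *= rd
    return best, best_cost
```

Running this on six cities placed on a 2x3 grid, for instance, recovers the perimeter tour as the minimum-cost solution.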
7. Others:
  1. A global optimum is possible in theory, but a near-optimal solution is what is achieved in practice
  2. Parameter Tuning
    • Aarts, E. and Korst, J. (1989). Simulated Annealing and Boltzmann Machines. John Wiley & Sons. 
  3. Not easy for parallel implementation
  4. Quality of the random number generator matters

Category: Algorithms | Views: 27333 | Added by: Ansari | Rating: 0.0/0