From Wikipedia, the free encyclopedia

Simulated annealing can be used to solve combinatorial problems. Here it is applied to the travelling salesman problem to minimize the length of a route that connects all 125 points.
Travelling salesman problem in 3D for 120 points solved with simulated annealing.

Simulated annealing (SA) is a probabilistic technique for approximating the global optimum of a given function. Specifically, it is a metaheuristic for approximating global optimization in a large search space. For problems with many local optima, SA can often find a good approximation to the global optimum.[1] It is often used when the search space is discrete (for example the traveling salesman problem, the boolean satisfiability problem, protein structure prediction, and job-shop scheduling). For problems where a fixed amount of computing resource is available, finding an approximate global optimum may be more relevant than attempting to find a precise local optimum. In such cases, SA may be preferable to alternatives such as gradient descent or branch and bound.

The name of the algorithm comes from annealing in metallurgy, a technique involving heating and controlled cooling of a material to alter its physical properties. Heating and cooling the material affects both its temperature and its thermodynamic free energy (Gibbs energy), and the material's physical properties depend on that free energy. Simulated annealing can be used for very hard computational optimization problems where exact algorithms fail; even though it usually achieves only an approximate solution to the global minimum, this is sufficient for many practical problems.

The problems solved by SA are typically formulated as the minimization of an objective function of many variables, subject to several mathematical constraints. In practice, the constraints can be penalized as part of the objective function.

Similar techniques have been independently introduced on several occasions, including Pincus (1970),[2] Khachaturyan et al. (1979,[3] 1981[4]), Kirkpatrick, Gelatt and Vecchi (1983), and Černý (1985).[5] In 1983, this approach was used by Kirkpatrick, Gelatt Jr., and Vecchi[6] for a solution of the traveling salesman problem. They also proposed its current name, simulated annealing.

This notion of slow cooling implemented in the simulated annealing algorithm is interpreted as a slow decrease in the probability of accepting worse solutions as the solution space is explored. Accepting worse solutions allows for a more extensive search for the global optimal solution. In general, simulated annealing algorithms work as follows. The temperature progressively decreases from an initial positive value to zero. At each time step, the algorithm randomly selects a solution close to the current one, measures its quality, and moves to it according to temperature-dependent acceptance probabilities: the probability of moving to a better solution stays at 1 (or at least positive) throughout the search, while the probability of moving to a worse solution decreases toward zero as the temperature falls.
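A minimal sketch of this loop in Python; the toy objective, neighbour move, and linear cooling schedule are illustrative placeholders rather than prescriptions from the algorithm's definition:

```python
import math
import random

def simulated_annealing(s0, energy, neighbour, t_initial=1.0, k_max=10_000):
    """Generic SA loop: always accept improving moves, accept worsening
    moves with probability exp(-delta/T), while T decays toward zero."""
    s = s0
    for k in range(k_max):
        t = t_initial * (1 - k / k_max)  # linear cooling, stays positive here
        s_new = neighbour(s)
        delta = energy(s_new) - energy(s)
        if delta <= 0 or random.random() < math.exp(-delta / t):
            s = s_new
    return s

# Toy usage: minimize a bumpy 1-D function over integers in [-100, 100].
f = lambda x: 0.1 * x * x + 10 * math.cos(x)
step = lambda x: max(-100, min(100, x + random.choice([-3, -2, -1, 1, 2, 3])))
print(simulated_annealing(50, f, step))
```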

The simulation can be performed either by a solution of kinetic equations for probability density functions,[7][8] or by using a stochastic sampling method.[6][9] The method is an adaptation of the Metropolis–Hastings algorithm, a Monte Carlo method to generate sample states of a thermodynamic system, published by N. Metropolis et al. in 1953.[10]

Overview


Simulated annealing treats each candidate solution as a state s of some physical system, and the function E(s) to be minimized as the internal energy of the system in that state. The goal is to bring the system, from an arbitrary initial state, to a state with the minimum possible energy.

Simulated annealing searching for a maximum. The objective here is to get to the highest point. In this example, it is not enough to use a simple hill climbing algorithm, as there are many local maxima. By cooling the temperature slowly, the global maximum is found.

The basic iteration


At each step, the simulated annealing heuristic considers some neighboring state s* of the current state s, and probabilistically decides between moving the system to state s* or staying in state s. These probabilities ultimately lead the system to move to states of lower energy. Typically this step is repeated until the system reaches a state that is good enough for the application, or until a given computation budget has been exhausted.

The neighbors of a state


Optimization of a solution involves evaluating the neighbors of a state of the problem, which are new states produced by conservatively altering a given state. For example, in the traveling salesman problem each state is typically defined as a permutation of the cities to be visited, and the neighbors of any state are the set of permutations produced by swapping any two of these cities. The well-defined way in which the states are altered to produce neighboring states is called a "move", and different moves give different sets of neighboring states. These moves usually result in minimal alterations of the last state, in an attempt to progressively improve the solution through iteratively improving its parts (such as the city connections in the traveling salesman problem). An even better move is to reverse the order of an interval of cities: this is a finer-grained move, since swapping two cities can be achieved by two successive interval reversals.
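Both kinds of move might be sketched in Python as follows; the list-of-cities tour representation and the function names are my own, chosen for illustration:

```python
import random

def swap_two_cities(tour):
    """Neighbour obtained by exchanging two positions in the permutation."""
    i, j = random.sample(range(len(tour)), 2)
    t = list(tour)
    t[i], t[j] = t[j], t[i]
    return t

def reverse_segment(tour):
    """Neighbour obtained by reversing an interval of the tour,
    the more conservative move described above."""
    i, j = sorted(random.sample(range(len(tour)), 2))
    return tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
```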

Simple heuristics like hill climbing, which move from better neighbor to better neighbor and stop when they reach a solution with no better neighbors, cannot guarantee finding the best solution: their outcome may easily be just a local optimum, while the actual best solution is a global optimum that could lie elsewhere. Metaheuristics use the neighbors of a solution as a way to explore the solution space, and although they prefer better neighbors, they also accept worse neighbors in order to avoid getting stuck in local optima; they can find the global optimum if run for a long enough amount of time.

Acceptance probabilities


The probability of making the transition from the current state s to a candidate new state s′ is specified by an acceptance probability function P(e, e′, T), which depends on the energies e = E(s) and e′ = E(s′) of the two states, and on a global time-varying parameter T called the temperature. States with a smaller energy are better than those with a greater energy. The probability function P must be positive even when e′ is greater than e. This feature prevents the method from becoming stuck at a local minimum that is worse than the global one.

When T tends to zero, the probability P(e, e′, T) must tend to zero if e′ > e and to a positive value otherwise. For sufficiently small values of T, the system will then increasingly favor moves that go "downhill" (i.e., to lower energy values), and avoid those that go "uphill." With T = 0 the procedure reduces to the greedy algorithm, which makes only the downhill transitions.

In the original description of simulated annealing, the probability P(e, e′, T) was equal to 1 when e′ < e—i.e., the procedure always moved downhill when it found a way to do so, irrespective of the temperature. Many descriptions and implementations of simulated annealing still take this condition as part of the method's definition. However, this condition is not essential for the method to work.

The function P is usually chosen so that the probability of accepting a move decreases when the difference e′ − e increases—that is, small uphill moves are more likely than large ones. However, this requirement is not strictly necessary, provided that the above requirements are met.

Given these properties, the temperature T plays a crucial role in controlling the evolution of the state s of the system with regard to its sensitivity to the variations of system energies. To be precise, for a large T, the evolution of s is sensitive only to coarser energy variations, while it becomes sensitive to finer energy variations when T is small.
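A sketch of the standard choice with these properties, the exponential (Metropolis-style) form discussed under "Acceptance probabilities" further below:

```python
import math

def acceptance_probability(e, e_new, t):
    """Always accept downhill moves; accept uphill moves with
    probability exp(-(e_new - e) / t), which shrinks as t -> 0."""
    if e_new < e:
        return 1.0
    if t <= 0:
        return 0.0  # at T = 0 the rule degenerates to greedy descent
    return math.exp(-(e_new - e) / t)
```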

The annealing schedule

Example illustrating the effect of cooling schedule on the performance of simulated annealing. The problem is to rearrange the pixels of an image so as to minimize a certain potential energy function, which causes similar colors to attract at short range and repel at a slightly larger distance. The elementary moves swap two adjacent pixels. These images were obtained with a fast cooling schedule (left) and a slow cooling schedule (right), producing results similar to amorphous and crystalline solids, respectively.

The name and inspiration of the algorithm demand that temperature variation be embedded in its operational characteristics; this necessitates a gradual reduction of the temperature as the simulation proceeds. The algorithm starts with T set to a high value (or infinity), which is then decreased at each step following some annealing schedule—which may be specified by the user but must end with T = 0 towards the end of the allotted time budget. In this way, the system is expected to wander initially towards a broad region of the search space containing good solutions, ignoring small features of the energy function; then drift towards low-energy regions that become narrower and narrower, and finally move downhill according to the steepest descent heuristic.

For any given finite problem, the probability that the simulated annealing algorithm terminates with a global optimal solution approaches 1 as the annealing schedule is extended.[11] This theoretical result, however, is not particularly helpful, since the time required to ensure a significant probability of success will usually exceed the time required for a complete search of the solution space.[12]
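Two schedules often seen in practice, sketched in Python; t0 and alpha are tuning parameters the user must pick, and the geometric form only approaches zero asymptotically, so in practice it is cut off by the step budget:

```python
def linear_schedule(r, t0=1.0):
    """Temperature for remaining-budget fraction r in [0, 1]: T = t0 * r,
    which reaches exactly zero when the budget is exhausted."""
    return t0 * r

def geometric_schedule(k, t0=1.0, alpha=0.995):
    """Multiply the temperature by a constant alpha < 1 at each step k."""
    return t0 * alpha ** k
```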

Pseudocode


The following pseudocode presents the simulated annealing heuristic as described above. It starts from a state s0 and continues until a maximum of kmax steps have been taken. In the process, the call neighbour(s) should generate a randomly chosen neighbour of a given state s; the call random(0, 1) should pick and return a value in the range [0, 1], uniformly at random. The annealing schedule is defined by the call temperature(r), which should yield the temperature to use, given the fraction r of the time budget that has been expended so far.

  • Let s = s0
  • For k = 0 through kmax (exclusive):
    • T ← temperature( 1 - (k+1)/kmax )
    • Pick a random neighbour, snew ← neighbour(s)
    • If P(E(s), E(snew), T) ≥ random(0, 1):
      • s ← snew
  • Output: the final state s
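A direct Python transcription of this pseudocode, as a sketch; note that random.random() samples [0, 1), which stands in for the uniform [0, 1] draw the pseudocode asks for:

```python
import random

def anneal(s0, E, neighbour, P, temperature, k_max):
    """Run the annealing loop above: s0 initial state, E energy function,
    neighbour candidate generator, P acceptance probability, k_max steps."""
    s = s0
    for k in range(k_max):
        T = temperature(1 - (k + 1) / k_max)
        s_new = neighbour(s)
        if P(E(s), E(s_new), T) >= random.random():
            s = s_new
    return s
```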

Selecting the parameters


In order to apply the simulated annealing method to a specific problem, one must specify the following parameters: the state space, the energy (goal) function E(), the candidate generator procedure neighbour(), the acceptance probability function P(), the annealing schedule temperature(), and the initial temperature init_temp. These choices can have a significant impact on the method's effectiveness. Unfortunately, there are no choices of these parameters that will be good for all problems, and there is no general way to find the best choices for a given problem. The following sections give some general guidelines.

Sufficiently near neighbour


Simulated annealing may be modeled as a random walk on a search graph, whose vertices are all possible states, and whose edges are the candidate moves. An essential requirement for the neighbour() function is that it must provide a sufficiently short path on this graph from the initial state to any state which may be the global optimum – the diameter of the search graph must be small. In the traveling salesman example above, for instance, the search space for n = 20 cities has n! = 2,432,902,008,176,640,000 (2.4 quintillion) states; yet each vertex has only n(n − 1)/2 = 190 neighbors (one per unordered pair of cities that can be swapped), and the diameter of the graph is n − 1.
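These counts are easy to verify; a quick check in Python (the neighbor-count and diameter expressions are the reconstruction used in the paragraph above, assuming the two-city-swap move):

```python
from math import factorial

n = 20
print(factorial(n))      # 2432902008176640000 states (permutations of cities)
print(n * (n - 1) // 2)  # 190 neighbours per state under two-city swaps
print(n - 1)             # at most n - 1 swaps reach any permutation (diameter)
```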

Transition probabilities


To investigate the behavior of simulated annealing on a particular problem, it can be useful to consider the transition probabilities that result from the various design choices made in the implementation of the algorithm. For each edge (s, s′) of the search graph, the transition probability is defined as the probability that the simulated annealing algorithm will move to state s′ when its current state is s. This probability depends on the current temperature as specified by temperature(), on the order in which the candidate moves are generated by the neighbour() function, and on the acceptance probability function P(). (Note that the transition probability is not simply P(e, e′, T), because the candidates are tested serially.)

Acceptance probabilities


The specification of neighbour(), P(), and temperature() is partially redundant. In practice, it is common to use the same acceptance function P() for many problems and adjust the other two functions according to the specific problem.

In the formulation of the method by Kirkpatrick et al., the acceptance probability function P(e, e′, T) was defined as 1 if e′ < e, and exp(−(e′ − e)/T) otherwise. This formula was superficially justified by analogy with the transitions of a physical system; it corresponds to the Metropolis–Hastings algorithm in the case where T = 1 and the proposal distribution of Metropolis–Hastings is symmetric. However, this acceptance probability is often used for simulated annealing even when the neighbour() function, which is analogous to the proposal distribution in Metropolis–Hastings, is not symmetric, or not probabilistic at all. As a result, the transition probabilities of the simulated annealing algorithm do not correspond to the transitions of the analogous physical system, and the long-term distribution of states at a constant temperature need not bear any resemblance to the thermodynamic equilibrium distribution over states of that physical system, at any temperature. Nevertheless, most descriptions of simulated annealing assume the original acceptance function, which is probably hard-coded in many implementations of SA.

In 1990, Moscato and Fontanari,[13] and independently Dueck and Scheuer,[14] proposed that a deterministic update (i.e. one that is not based on the probabilistic acceptance rule) could speed up the optimization process without impacting the final quality. Moscato and Fontanari conclude, from observing the analog of the "specific heat" curve of the "threshold updating" annealing in their study, that "the stochasticity of the Metropolis updating in the simulated annealing algorithm does not play a major role in the search of near-optimal minima". Instead, they proposed that "the smoothening of the cost function landscape at high temperature and the gradual definition of the minima during the cooling process are the fundamental ingredients for the success of simulated annealing." The method was subsequently popularized under Dueck and Scheuer's name for it, "threshold accepting". In 2001, Franz, Hoffmann and Salamon showed that the deterministic update strategy is indeed the optimal one within the large class of algorithms that simulate a random walk on the cost/energy landscape.[15]
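A sketch of the threshold-accepting rule; for brevity this takes one candidate per threshold value, whereas real implementations typically take many steps at each threshold:

```python
def threshold_accepting(s0, energy, neighbour, thresholds):
    """Deterministic variant: accept any candidate whose energy increase
    stays below the current threshold; the decreasing threshold sequence
    plays the role that temperature plays in standard SA."""
    s, e = s0, energy(s0)
    for th in thresholds:
        s_new = neighbour(s)
        e_new = energy(s_new)
        if e_new - e < th:  # no randomness in the acceptance decision
            s, e = s_new, e_new
    return s
```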

Efficient candidate generation


When choosing the candidate generator neighbour(), one must consider that after a few iterations of the simulated annealing algorithm, the current state is expected to have much lower energy than a random state. Therefore, as a general rule, one should skew the generator towards candidate moves where the energy of the destination state is likely to be similar to that of the current state. This heuristic (which is the main principle of the Metropolis–Hastings algorithm) tends to exclude very good candidate moves as well as very bad ones; however, the former are usually much less common than the latter, so the heuristic is generally quite effective.

In the traveling salesman problem above, for example, swapping two consecutive cities in a low-energy tour is expected to have a modest effect on its energy (length); whereas swapping two arbitrary cities is far more likely to increase its length than to decrease it. Thus, the consecutive-swap neighbor generator is expected to perform better than the arbitrary-swap one, even though the latter could provide a somewhat shorter path to the optimum (with n − 1 swaps, instead of n(n − 1)/2).

A more precise statement of the heuristic is that one should try first the candidate states s′ for which P(E(s), E(s′), T) is large. For the "standard" acceptance function P above, this means that E(s′) − E(s) is on the order of T or less. Thus, in the traveling salesman example above, one could use a neighbour() function that swaps two random cities, where the probability of choosing a city-pair vanishes as their distance increases beyond a scale set by the current temperature T.
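One way to implement such a skew, sketched under my own assumptions; the exponential distance cutoff and the bounded rejection loop are illustrative choices, not prescriptions from the text:

```python
import math
import random

def nearby_pair_swap(tour, coords, T, max_tries=100):
    """Swap two cities, preferring pairs whose geometric separation is
    small relative to the current temperature T; distant pairs are kept
    with exponentially vanishing probability."""
    n = len(tour)
    for _ in range(max_tries):
        i, j = random.sample(range(n), 2)
        d = math.dist(coords[tour[i]], coords[tour[j]])
        if random.random() < math.exp(-d / max(T, 1e-12)):
            break  # keep this pair; otherwise resample, up to max_tries
    t = list(tour)
    t[i], t[j] = t[j], t[i]
    return t
```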

Barrier avoidance


When choosing the candidate generator neighbour(), one must also try to reduce the number of "deep" local minima—states (or sets of connected states) that have much lower energy than all their neighboring states. Such "closed catchment basins" of the energy function may trap the simulated annealing algorithm with high probability (roughly proportional to the number of states in the basin) and for a very long time (roughly exponential in the energy difference between the surrounding states and the bottom of the basin).

As a rule, it is impossible to design a candidate generator that will satisfy this goal and also prioritize candidates with similar energy. On the other hand, one can often vastly improve the efficiency of simulated annealing by relatively simple changes to the generator. In the traveling salesman problem, for instance, it is not hard to exhibit two tours A and B, with nearly equal lengths, such that (1) A is optimal, (2) every sequence of city-pair swaps that converts A to B goes through tours that are much longer than both, and (3) A can be transformed into B by flipping (reversing the order of) a set of consecutive cities. In this example, A and B lie in different "deep basins" if the generator performs only random pair-swaps; but they will be in the same basin if the generator performs random segment-flips.

Cooling schedule


The physical analogy that is used to justify simulated annealing assumes that the cooling rate is low enough for the probability distribution of the current state to be near thermodynamic equilibrium at all times. Unfortunately, the relaxation time—the time one must wait for the equilibrium to be restored after a change in temperature—strongly depends on the "topography" of the energy function and on the current temperature. In the simulated annealing algorithm, the relaxation time also depends on the candidate generator, in a very complicated way. Note that all these parameters are usually provided as black box functions to the simulated annealing algorithm. Therefore, the ideal cooling rate cannot be determined beforehand and should be empirically adjusted for each problem. Adaptive simulated annealing algorithms address this problem by connecting the cooling schedule to the search progress. Other adaptive approaches, such as thermodynamic simulated annealing,[16] automatically adjust the temperature at each step based on the energy difference between the two states, according to the laws of thermodynamics.
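One simple adaptive scheme, as an illustrative sketch of connecting the schedule to search progress (this acceptance-rate rule is my own toy variant, not the thermodynamic rule from the citation):

```python
def adaptive_temperature(t, accept_rate, target=0.4, cool=0.95, warm=1.05):
    """Nudge the temperature so the observed acceptance rate tracks a
    target: cool when moves are accepted too freely, warm slightly when
    the search has stalled."""
    return t * (cool if accept_rate > target else warm)
```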

Restarts


Sometimes it is better to move back to a solution that was significantly better rather than always moving from the current state. This process is called restarting of simulated annealing. To do this we set s and e to sbest and ebest and perhaps restart the annealing schedule. The decision to restart could be based on several criteria. Notable among these are restarting after a fixed number of steps, restarting when the current energy is too high compared to the best energy obtained so far, and restarting at random.
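A sketch of the bookkeeping this requires; the patience counter used here as the restart trigger is an illustrative choice:

```python
import random

def anneal_with_restarts(s0, E, neighbour, P, temperature, k_max, patience=1000):
    """Annealing loop that remembers the best state seen (sbest, ebest)
    and jumps back to it after `patience` steps without improvement."""
    s = s_best = s0
    e_best = E(s0)
    stale = 0
    for k in range(k_max):
        T = temperature(1 - (k + 1) / k_max)
        s_new = neighbour(s)
        if P(E(s), E(s_new), T) >= random.random():
            s = s_new
        e = E(s)
        if e < e_best:
            s_best, e_best, stale = s, e, 0
        else:
            stale += 1
            if stale >= patience:
                s, stale = s_best, 0  # restart from the best state so far
    return s_best
```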

Related methods

  • Interacting Metropolis–Hastings algorithms (a.k.a. sequential Monte Carlo[17]) combine simulated annealing moves with an acceptance-rejection step for the best-fitted individuals, equipped with an interacting recycling mechanism.
  • Quantum annealing uses "quantum fluctuations" instead of thermal fluctuations to get through high but thin barriers in the target function.
  • Stochastic tunneling attempts to overcome the increasing difficulty simulated annealing runs have in escaping from local minima as the temperature decreases, by 'tunneling' through barriers.
  • Tabu search normally moves to neighbouring states of lower energy, but will take uphill moves when it finds itself stuck in a local minimum; and avoids cycles by keeping a "taboo list" of solutions already seen.
  • Dual-phase evolution is a family of algorithms and processes (to which simulated annealing belongs) that mediate between local and global search by exploiting phase changes in the search space.
  • Reactive search optimization focuses on combining machine learning with optimization, by adding an internal feedback loop to self-tune the free parameters of an algorithm to the characteristics of the problem, of the instance, and of the local situation around the current solution.
  • Genetic algorithms maintain a pool of solutions rather than just one. New candidate solutions are generated not only by "mutation" (as in SA), but also by "recombination" of two solutions from the pool. Probabilistic criteria, similar to those used in SA, are used to select the candidates for mutation or combination, and for discarding excess solutions from the pool.
  • Memetic algorithms search for solutions by employing a set of agents that both cooperate and compete in the process; sometimes the agents' strategies involve simulated annealing procedures for obtaining high-quality solutions before recombining them.[18] Annealing has also been suggested as a mechanism for increasing the diversity of the search.[19]
  • Graduated optimization digressively "smooths" the target function while optimizing.
  • Ant colony optimization (ACO) uses many ants (or agents) to traverse the solution space and find locally productive areas.
  • The cross-entropy method (CE) generates candidate solutions via a parameterized probability distribution. The parameters are updated via cross-entropy minimization, so as to generate better samples in the next iteration.
  • Harmony search mimics musicians in improvisation where each musician plays a note to find the best harmony together.
  • Stochastic optimization is an umbrella set of methods that includes simulated annealing and numerous other approaches.
  • Particle swarm optimization is an algorithm modeled on swarm intelligence that finds a solution to an optimization problem in a search space, or models and predicts social behavior in the presence of objectives.
  • The runner-root algorithm (RRA) is a meta-heuristic optimization algorithm for solving unimodal and multimodal problems inspired by the runners and roots of plants in nature.
  • Intelligent water drops algorithm (IWD) mimics the behavior of natural water drops to solve optimization problems.
  • Parallel tempering is a simulation of model copies at different temperatures (or Hamiltonians) to overcome the potential barriers.
  • Multi-objective simulated annealing algorithms have been used in multi-objective optimization.[20]


References

  1. ^ "What is Simulated Annealing?". www.cs.cmu.edu. Retrieved 2025-08-06.
  2. ^ Pincus, Martin (Nov–Dec 1970). "A Monte-Carlo Method for the Approximate Solution of Certain Types of Constrained Optimization Problems". Journal of the Operations Research Society of America. 18 (6): 967–1235. doi:10.1287/opre.18.6.1225.
  3. ^ Khachaturyan, A.; Semenovskaya, S.; Vainshtein, B. (1979). "Statistical-Thermodynamic Approach to Determination of Structure Amplitude Phases". Soviet Physics Crystallography. 24 (5): 519–524.
  4. ^ Khachaturyan, A.; Semenovskaya, S.; Vainshtein, B. (1981). "The Thermodynamic Approach to the Structure Analysis of Crystals". Acta Crystallographica. A37 (5): 742–754. Bibcode:1981AcCrA..37..742K. doi:10.1107/S0567739481001630.
  5. ^ Laarhoven, P. J. M. van (Peter J. M.) (1987). Simulated annealing : theory and applications. Aarts, E. H. L. (Emile H. L.). Dordrecht: D. Reidel. ISBN 90-277-2513-6. OCLC 15548651.
  6. ^ a b Kirkpatrick, S.; Gelatt Jr, C. D.; Vecchi, M. P. (1983). "Optimization by Simulated Annealing". Science. 220 (4598): 671–680. Bibcode:1983Sci...220..671K. CiteSeerX 10.1.1.123.7607. doi:10.1126/science.220.4598.671. JSTOR 1690046. PMID 17813860. S2CID 205939.
  7. ^ Khachaturyan, A.; Semenovskaya, S.; Vainshtein, B. (1979). "Statistical-Thermodynamic Approach to Determination of Structure Amplitude Phases". Sov.Phys. Crystallography. 24 (5): 519–524.
  8. ^ Khachaturyan, A.; Semenovskaya, S.; Vainshtein, B. (1981). "The Thermodynamic Approach to the Structure Analysis of Crystals". Acta Crystallographica. 37 (A37): 742–754. Bibcode:1981AcCrA..37..742K. doi:10.1107/S0567739481001630.
  9. ^ Černý, V. (1985). "Thermodynamical approach to the traveling salesman problem: An efficient simulation algorithm". Journal of Optimization Theory and Applications. 45: 41–51. doi:10.1007/BF00940812. S2CID 122729427.
  10. ^ Metropolis, Nicholas; Rosenbluth, Arianna W.; Rosenbluth, Marshall N.; Teller, Augusta H.; Teller, Edward (1953). "Equation of State Calculations by Fast Computing Machines". The Journal of Chemical Physics. 21 (6): 1087. Bibcode:1953JChPh..21.1087M. doi:10.1063/1.1699114. OSTI 4390578. S2CID 1046577.
  11. ^ Granville, V.; Krivanek, M.; Rasson, J.-P. (1994). "Simulated annealing: A proof of convergence". IEEE Transactions on Pattern Analysis and Machine Intelligence. 16 (6): 652–656. doi:10.1109/34.295910.
  12. ^ Nolte, Andreas; Schrader, Rainer (1997), "A Note on the Finite Time Behaviour of Simulated Annealing", Operations Research Proceedings 1996, vol. 1996, Berlin, Heidelberg: Springer Berlin Heidelberg, pp. 175–180, doi:10.1007/978-3-642-60744-8_32, ISBN 978-3-540-62630-5, retrieved 2025-08-06
  13. ^ Moscato, P.; Fontanari, J.F. (1990), "Stochastic versus deterministic update in simulated annealing", Physics Letters A, 146 (4): 204–208, Bibcode:1990PhLA..146..204M, doi:10.1016/0375-9601(90)90166-L
  14. ^ Dueck, G.; Scheuer, T. (1990), "Threshold accepting: A general purpose optimization algorithm appearing superior to simulated annealing", Journal of Computational Physics, 90 (1): 161–175, Bibcode:1990JCoPh..90..161D, doi:10.1016/0021-9991(90)90201-B, ISSN 0021-9991
  15. ^ Franz, A.; Hoffmann, K.H.; Salamon, P. (2001), "Best optimal strategy for finding ground states", Physical Review Letters, 86 (3): 5219–5222, doi:10.1103/PhysRevLett.86.5219, PMID 11384462
  16. ^ De Vicente, Juan; Lanchares, Juan; Hermida, Román (2003). "Placement by thermodynamic simulated annealing". Physics Letters A. 317 (5–6): 415–423. Bibcode:2003PhLA..317..415D. doi:10.1016/j.physleta.2003.08.070.
  17. ^ Del Moral, Pierre; Doucet, Arnaud; Jasra, Ajay (2006). "Sequential Monte Carlo samplers". Journal of the Royal Statistical Society, Series B. 68 (3): 411–436. arXiv:cond-mat/0212648. doi:10.1111/j.1467-9868.2006.00553.x. S2CID 12074789.
  18. ^ Moscato, Pablo (June 1993). "An introduction to population approaches for optimization and hierarchical objective functions: A discussion on the role of tabu search". Annals of Operations Research. 41 (2): 85–121. doi:10.1007/BF02022564. S2CID 35382644.
  19. ^ Moscato, P. (1989). "On Evolution, Search, Optimization, Genetic Algorithms and Martial Arts: Towards Memetic Algorithms". Caltech Concurrent Computation Program (report 826).
  20. ^ Bandyopadhyay, S.; Saha, S.; Maulik, U.; Deb, K. (June 2008). "A Simulated Annealing-Based Multiobjective Optimization Algorithm: AMOSA". IEEE Transactions on Evolutionary Computation. 12 (3): 269–283. doi:10.1109/TEVC.2007.900837. S2CID 12107321.
