
Literature Review

Dynamic market trends, such as consumer demand for variety, ever-shorter product
life cycles and competitive pressure, have forced manufacturers to cut costs to remain
relevant in the market. The conflicting requirements of responsiveness and minimum
inventory demand efficient, effective and accurate scheduling, creating a great need for
good scheduling algorithms and heuristics. Scheduling is a Constraint Optimisation
Problem (COP): finding a sequential allocation of scarce resources that optimises a
particular objective function.
Research in scheduling theory has evolved over the last sixty years, ranging from
unrefined dispatching rules to highly sophisticated parallel branch-and-bound
algorithms and bottleneck heuristics. The field has drawn inspiration from biology,
genetics and neurophysiology, making it a multidisciplinary area of study.
The deterministic job-shop scheduling problem is the most general of the classical
scheduling problems: the time by which all operations of all jobs are completed, termed
the makespan (Cmax), is minimised while satisfying all precedence and capacity constraints.
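The makespan of a given dispatching order can be computed directly from these two constraint types. Below is a minimal sketch; the tiny 2 × 2 instance, the function name and the semi-active scheduling scheme are illustrative assumptions, not taken from the review:

```python
# Minimal sketch of makespan (Cmax) computation for a job-shop schedule.
# Jobs are lists of (machine, duration) operations; the operations of each
# job run in order (precedence) and each machine handles one operation at
# a time (capacity).

def makespan(jobs, order):
    """Build a semi-active schedule by dispatching operations in `order`
    (a sequence of job indices, one entry per operation) and return Cmax."""
    next_op = [0] * len(jobs)       # next operation index per job
    job_ready = [0] * len(jobs)     # completion time of each job's last op
    mach_ready = {}                 # completion time per machine
    for j in order:
        machine, duration = jobs[j][next_op[j]]
        start = max(job_ready[j], mach_ready.get(machine, 0))
        job_ready[j] = start + duration
        mach_ready[machine] = start + duration
        next_op[j] += 1
    return max(job_ready)

# A tiny 2-job, 2-machine instance:
jobs = [[(0, 3), (1, 2)],   # job 0: machine 0 for 3, then machine 1 for 2
        [(1, 4), (0, 1)]]   # job 1: machine 1 for 4, then machine 0 for 1
print(makespan(jobs, [0, 1, 0, 1]))  # → 6
```

Any feasible schedule corresponds to some dispatching order, which is why the problem reduces to searching over sequences.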
The first recognised effort in scheduling theory comes from Johnson (1954), who developed
an efficient algorithm for a simple two-machine flow shop that minimises the maximum flow
time. This problem is described as n/2/F/Fmax (Conway, 1967), where each job has to be
processed by both machines. This early effort shaped subsequent research, which adopted the
criterion of minimising the makespan (Cmax). Other efficient methods developed for the
job shop include those of Akers (1956) for the 2 × m problem and Jackson (1956) for the
n × 2 instance, where there are no more than two operations per job. More recently, Hefetz
and Adiri (1982) developed an efficient approach for the n × 2 problem in which all
operations have unit processing time, while Williamson et al. (1997) proved that determining
the existence of a schedule with a makespan of three can be done in polynomial time as long
as the total processing time required by the operations on each machine is no more than three.
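Johnson's two-machine rule is simple enough to sketch: jobs whose first-machine time does not exceed their second-machine time go first, in increasing order of first-machine time; the remaining jobs go last, in decreasing order of second-machine time. The helper names and the example data below are illustrative assumptions:

```python
# A sketch of Johnson's rule for the two-machine flow shop (n/2/F/Fmax).
# Each job i has processing times (p1, p2) on machines 1 and 2.

def johnson_sequence(times):
    """times: list of (p1, p2); returns job indices in Johnson order."""
    first = sorted((i for i, (a, b) in enumerate(times) if a <= b),
                   key=lambda i: times[i][0])          # increasing p1
    last = sorted((i for i, (a, b) in enumerate(times) if a > b),
                  key=lambda i: times[i][1], reverse=True)  # decreasing p2
    return first + last

def flow_shop_makespan(times, seq):
    """Cmax of a permutation schedule on two machines."""
    m1 = m2 = 0
    for i in seq:
        m1 += times[i][0]                # machine 1 runs back to back
        m2 = max(m2, m1) + times[i][1]   # machine 2 waits for machine 1
    return m2

times = [(3, 6), (5, 2), (1, 2)]
seq = johnson_sequence(times)
print(seq, flow_shop_makespan(times, seq))  # → [2, 0, 1] 12
```

For this special case the rule is exact, which is what makes it a landmark result: later, more general job-shop variants admit no comparably simple optimal rule.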

In the 1960s the focus shifted from crude heuristics to finding exact solutions through
enumerative algorithms that adopted more elaborate and sophisticated mathematical methods.
From the 1970s until the late 1980s the emphasis was on establishing the complexity of the
problem, and general approximation methods became suitable alternatives as they provided
faster solutions to larger problems at the cost of optimality. Various rules were created
involving linear or randomised combinations of several priority dispatch rules (Fisher and
Thompson, 1963; Panwalkar and Iskander, 1977, 1984), as well as fuzzy logic (Grabot and
Geneste, 1994) and genetic local search (Dorndorf and Pesch, 1995). It was consequently
realised that no single rule is superior for makespan minimisation, as these rules consider
only the current state of the machine and its immediate surroundings, and solution quality
decreases as problem dimensionality increases. The emphasis later shifted towards solving
the problem with approximation methods; in this context Fisher and Rinnooy Kan (1988)
proposed solutions to meet this requirement.
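The myopia of priority dispatch rules is easy to demonstrate. The sketch below applies a shortest-processing-time (SPT) rule, in a simplified global form, to a small job-shop instance; the instance, function name and this particular variant of the rule are illustrative assumptions:

```python
# A sketch of a myopic priority dispatch rule: repeatedly schedule, among
# jobs with a pending operation, the one whose next operation is shortest
# (SPT). The rule looks only at the immediate machine state, which is why
# such rules degrade as problem dimensionality grows.

def spt_dispatch(jobs):
    """jobs: list of jobs, each a list of (machine, duration); returns Cmax."""
    next_op = [0] * len(jobs)
    job_ready = [0] * len(jobs)
    mach_ready = {}
    remaining = sum(len(j) for j in jobs)
    while remaining:
        # pick the job whose next operation has the shortest duration
        j = min((i for i in range(len(jobs)) if next_op[i] < len(jobs[i])),
                key=lambda i: jobs[i][next_op[i]][1])
        machine, duration = jobs[j][next_op[j]]
        start = max(job_ready[j], mach_ready.get(machine, 0))
        job_ready[j] = start + duration
        mach_ready[machine] = start + duration
        next_op[j] += 1
        remaining -= 1
    return max(job_ready)

jobs = [[(0, 3), (1, 2)], [(1, 4), (0, 1)]]
print(spt_dispatch(jobs))  # → 10
```

On this instance SPT yields a makespan of 10, although a makespan of 6 is achievable by interleaving the two jobs: a small illustration of why no single rule dominates.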

Local search techniques have improved in recent years and have reached a certain degree of
sophistication. Future work in the field lies in hybrid systems, where some attention has
returned to the issues of complexity and the derivation of efficient algorithms.
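The basic move underlying such local search techniques can be sketched with a simple hill climb over adjacent pairwise swaps of a two-machine flow-shop permutation; the instance, helper names and the adjacent-swap neighbourhood are illustrative assumptions:

```python
# A sketch of local search: accept any adjacent swap of the job
# permutation that reduces the two-machine flow-shop makespan, and stop
# when no swap improves (a local optimum).

def makespan2(times, seq):
    """Cmax of a permutation schedule on two machines."""
    m1 = m2 = 0
    for i in seq:
        m1 += times[i][0]
        m2 = max(m2, m1) + times[i][1]
    return m2

def local_search(times, seq):
    """Hill-climb over adjacent pairwise swaps."""
    seq = list(seq)
    improved = True
    while improved:
        improved = False
        for k in range(len(seq) - 1):
            cand = seq[:]
            cand[k], cand[k + 1] = cand[k + 1], cand[k]
            if makespan2(times, cand) < makespan2(times, seq):
                seq, improved = cand, True
    return seq

times = [(3, 6), (5, 2), (1, 2)]
seq = local_search(times, [0, 1, 2])
print(seq, makespan2(times, seq))  # → [0, 1, 2] 13
```

Here the climb is already stuck at a local optimum with Cmax 13, although the order [2, 0, 1] achieves 12; escaping such local optima is precisely what the more sophisticated techniques (and the hybrid systems mentioned above) address.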

Reference:
Jain, A.S. and Meeran, S. (1998). Deterministic job-shop scheduling: Past, present and
future. European Journal of Operational Research, Vol. 113, pp. 390-434.
