The formal start of Operational Research took place in England in late 1939. The aim was to achieve the highest possible efficiency. Thus, in August 1940, the physicist P. M. S. Blackett of the University of Manchester was charged with forming a task force to study the radar-controlled anti-aircraft defense system.

One of the first efforts of this group was directed at studying air attacks on submarines. Although the reasoning behind the initial policy was valid, the results obtained with it were very limited.

In short, a depth setting of thirty meters was adequate when the submarine had spotted the bomber well in advance, but the lack of precision at that depth prevented results.

It was concluded that the most suitable alternative was to attack while the submarine was still on the surface. This episode illustrates the aspects that characterize Operations Research studies:

  1. Direct data collection.
  2. Use of mathematical models.
  3. Obtaining optimal policies.
  4. Modification of those policies in accordance with real factors not considered in the model.

In the United States, funds for research in the military field increased, so most groups consolidated and grew in number and size.

In Britain, by contrast, many members of the groups that had developed in the military environment moved into civil society.

Another important aspect in this context is that the development of traditional Industrial Organization in Great Britain had been more limited and, with the exception of work study, it was still a novelty in industrial circles. By the mid-1950s, operational research was entrenched in the industrial world.

Operational Research draws on results from many scientific areas, although its fundamental base lies in mathematics, economics, probability calculus and statistics.

In the first studies labeled as Operational Research, the most characteristic technical aspect was the statistical structuring of the data and the use of descriptive models of a probabilistic type.

The mathematical foundations of discrete linear models are found in the theory of linear inequalities developed in the nineteenth century. During the rest of the 1950s, Linear Programming was fully established with the work of Charnes on degeneracy, Lemke on duality, and Dantzig, Orden and Wolfe on the compact form and the decomposition of large programs.
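As a minimal illustration, consider maximizing 3x + 5y subject to x ≤ 4, 2y ≤ 12, 3x + 2y ≤ 18 and x, y ≥ 0 (the figures are illustrative, not taken from the works cited above). Since the optimum of a linear program lies at a vertex of the feasible region, a two-variable problem can be solved by enumerating the intersections of constraint pairs:

```python
from itertools import combinations

# Constraints written as a*x + b*y <= c; the last two encode x >= 0, y >= 0.
cons = [(1, 0, 4), (0, 2, 12), (3, 2, 18), (-1, 0, 0), (0, -1, 0)]

def solve2(c1, c2):
    """Intersection point of two constraint boundaries, or None if parallel."""
    (a1, b1, r1), (a2, b2, r2) = c1, c2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-9:
        return None
    return (r1 * b2 - r2 * b1) / det, (a1 * r2 - a2 * r1) / det

def feasible(p):
    return all(a * p[0] + b * p[1] <= c + 1e-9 for a, b, c in cons)

vertices = []
for c1, c2 in combinations(cons, 2):
    p = solve2(c1, c2)
    if p is not None and feasible(p):
        vertices.append(p)

best = max(vertices, key=lambda p: 3 * p[0] + 5 * p[1])
```

The best vertex is (2, 6) with value 36; the simplex method reaches the same vertex without enumerating every intersection.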

However, Integer Linear Programming did not receive attention until the end of that decade, when Gomory obtained the general expression of his cutting-plane method.

Despite early hopes, general integer programming remains a field with limited and unsatisfactory methods.

In non-linear models, the fundamental results come from the development of mathematical analysis in the eighteenth century, the basic concept being that of the Lagrangian.

Nonlinear Programming progressed during the 1960s and 1970s, when medium-sized problems, with several dozen constraints and a few hundred variables, could be attacked.

The beginning and basic development of Dynamic Programming are due to Richard Bellman in the early 1950s. This methodology is not limited to Operational Research but is also of great importance in Optimal Control Theory. Many authors still consider Dynamic Programming a conceptual point of view and a theoretical framework for the analysis of problems, rather than a method.
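Bellman's principle can be sketched on a toy multi-stage network (node names and arc costs are invented for illustration): the minimum cost from any node is the minimum, over its successors, of the arc cost plus the minimum cost from that successor.

```python
from functools import lru_cache

# Arc costs of a small directed acyclic network; 'T' is the terminal state.
arcs = {
    'A': {'B': 2, 'C': 4},
    'B': {'C': 1, 'T': 7},
    'C': {'T': 3},
    'T': {},
}

@lru_cache(maxsize=None)
def shortest_cost(node):
    """Bellman recursion: optimal cost-to-go from `node` to 'T'."""
    if node == 'T':
        return 0
    return min(cost + shortest_cost(nxt) for nxt, cost in arcs[node].items())
```

Here `shortest_cost('A')` returns 6, via the route A → B → C → T.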

Queuing Theory begins with the work of the Danish engineer A. K. Erlang in the telephone industry at the beginning of the twentieth century. The most usual models are those in which both the distribution of arrivals to the system and the distribution of service time belong to well-established categories. The existence of a multitude of simulation languages available to computer users from the most important companies in the sector should also be highlighted.
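For the simplest of these well-established models, the M/M/1 queue (Poisson arrivals, exponential service times, one server), the classical steady-state formulas give the main performance measures. A minimal sketch:

```python
def mm1_metrics(lam, mu):
    """Steady-state measures of an M/M/1 queue.

    lam: arrival rate, mu: service rate (lam < mu for stability).
    Returns (L, Lq, W, Wq): mean number in system, mean number waiting,
    mean time in system, mean waiting time.
    """
    rho = lam / mu                 # server utilization
    assert rho < 1, "queue is unstable"
    L = rho / (1 - rho)
    Lq = rho ** 2 / (1 - rho)
    W = 1 / (mu - lam)
    Wq = rho / (mu - lam)
    return L, Lq, W, Wq
```

For example, with 2 arrivals and 3 services per hour, the mean number in the system is 2 and the mean time in the system is 1 hour.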


Game Theory begins with von Neumann's first results on the minimax theorem in 1926.

In any case, the influence of this theory on the organization of production has been very limited.

Decision Theory is based on Bayesian statistics and the subjective estimation of event probabilities. It is currently considered a valid instrument for structuring strategic decision-making under uncertainty, when information is incomplete.
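A minimal sketch of the expected-value criterion with subjective probabilities (the actions, states and payoffs are hypothetical figures chosen for illustration):

```python
# Subjective probabilities for the states of nature.
p = {'high_demand': 0.6, 'low_demand': 0.4}

# Payoff of each action under each state (hypothetical figures).
payoff = {
    'expand': {'high_demand': 100, 'low_demand': -40},
    'hold':   {'high_demand': 40,  'low_demand': 10},
}

def expected_value(action):
    """Probability-weighted payoff of an action over all states."""
    return sum(p[s] * payoff[action][s] for s in p)

best = max(payoff, key=expected_value)
```

Under these numbers, 'expand' has expected payoff 44 versus 28 for 'hold', so the criterion selects expansion despite the possible loss.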

Since its inception, Operational Research has faced problems for which no analytical method can obtain the theoretical optimum reliably and in reasonable time.

For these reasons, Operations Research has established so-called heuristic methods: incapable of providing the formal optimum, but capable of reaching good solutions, all the more reliable when they allow the determination of a bound (upper or lower) on the theoretical optimum against which the solution obtained can be compared.

Optimization software has spread widely, thanks to the increase in the computing power of computers and the lower cost of applications and hardware.

A number of methods have emerged in recent years. Among them we can list genetic algorithms, simulated annealing, tabu search and neural networks.

Genetic algorithms were introduced by Holland to mimic some of the mechanisms observed in the evolution of species. Holland created an algorithm that generates new solutions from the union of progenitor solutions, using operators similar to those of reproduction, without needing to know the type of problem to be solved.
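A minimal sketch of such an algorithm, applied to the classic OneMax problem (maximize the number of 1-bits in a string), with truncation selection, one-point crossover and bit-flip mutation; all parameter values are arbitrary choices, not Holland's originals:

```python
import random

random.seed(1)
GENES, POP, GENERATIONS = 20, 30, 60

def fitness(ind):
    """OneMax: the count of 1-bits in the individual."""
    return sum(ind)

def crossover(a, b):
    """One-point crossover of two parent bit lists."""
    cut = random.randint(1, GENES - 1)
    return a[:cut] + b[cut:]

def mutate(ind, rate=0.02):
    """Flip each bit independently with a small probability."""
    return [g ^ 1 if random.random() < rate else g for g in ind]

pop = [[random.randint(0, 1) for _ in range(GENES)] for _ in range(POP)]
for _ in range(GENERATIONS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:POP // 2]               # truncation selection (elitist)
    children = [mutate(crossover(*random.sample(parents, 2)))
                for _ in range(POP - len(parents))]
    pop = parents + children

best = max(pop, key=fitness)
```

Note that the algorithm never inspects the structure of `fitness`; it only compares values, which is what makes the scheme problem-independent.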

Simulated annealing algorithms do not search for the best solution in the neighborhood of the current one; instead, they randomly generate a nearby solution and accept it if it has a lower cost, or otherwise with a certain probability. This acceptance probability decreases with the number of iterations and is related to how much the cost worsens.
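This acceptance rule can be sketched as follows, minimizing a simple quadratic function; the starting point, temperature and cooling rate are arbitrary illustrative choices:

```python
import math
import random

random.seed(0)

def cost(x):
    """Toy objective with its minimum at x = 0."""
    return x * x

x = 10.0          # initial solution
T = 10.0          # initial temperature
for _ in range(5000):
    candidate = x + random.uniform(-1, 1)          # random nearby solution
    delta = cost(candidate) - cost(x)
    # Accept improvements always; accept worsenings with a probability
    # that shrinks as the temperature T decreases over the iterations.
    if delta < 0 or random.random() < math.exp(-delta / T):
        x = candidate
    T *= 0.999                                     # geometric cooling
```

After the run, `x` has drifted close to the minimum at 0, even though many worsening moves were accepted early on.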

The tabu search algorithm, unlike other algorithms based on random techniques for exploring nearby solutions, uses a strategy based on memory structures to escape the local optima into which one can fall when moving from one solution to another through the solution space. Unlike local search, moves to neighboring solutions are allowed even if they worsen the objective function.
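A minimal sketch on a one-dimensional objective with a local optimum at x = 10 and the global optimum at x = 100 (the function, bounds and tabu tenure are chosen purely for illustration): the tabu list forbids recently visited points, so the search is pushed past the local optimum even while f temporarily worsens.

```python
from collections import deque

def f(x):
    """Objective with a local maximum at x = 10 and global maximum at x = 100."""
    return x ** 3 - 60 * x ** 2 + 900 * x

def tabu_search(start=0, iters=200, tenure=5):
    current = best = start
    tabu = deque(maxlen=tenure)        # short-term memory of visited points
    for _ in range(iters):
        neighbors = [n for n in (current - 1, current + 1)
                     if 0 <= n <= 100 and n not in tabu]
        if not neighbors:
            break
        # Move to the best admissible neighbor, even if it worsens f.
        current = max(neighbors, key=f)
        tabu.append(current)
        best = max(best, current, key=f)
    return best
```

Plain hill climbing from 0 would stop at x = 10; `tabu_search()` returns 100 because the memory structure keeps the search from sliding back.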

Neural networks are analog models that aim to reproduce, as far as possible, the characteristics and information-processing capacity of the sets of neurons present in the brains of living beings.

In summary, the use of these techniques makes it possible to solve, in a practical way, highly complex problems that were intractable by exact techniques.

Operational Research.

The so-called Quantitative Methods of Management are a specially applied view of the discipline known as Operational Research.

The objectives of quantitative methods are clearly limited to the study of decision-making problems. The phases of the method follow naturally.

  • The first phase, formulation of the problem, fulfills a primary function, since it determines which aspects must be analyzed.
  • The second phase consists of the formulation of a mathematical model that describes the situation to be studied. A model is an abstraction or simplified representation of a part or segment of reality.

Two parts can be distinguished in the model. The representation itself is generally supported by a more or less sophisticated mathematical language, according to the characteristics of the study being carried out. Once the construction of the model is completed, the selection of the specific criteria for evaluating alternatives is addressed.

Knowledge of the methods and techniques is of primary importance: on the one hand, it suggests possibilities for the mathematical expression of the relationships; on the other, it indicates what can be asked of the model and what it can be expected to provide.

  • In the third phase, deduction of solutions, a sufficient technical background is required to obtain the solutions of the model, if it is normative, or the fundamental characteristics of the process, if it is predictive, knowing on which aspects the modification of these characteristics depends.

The inherent complexity of some problems makes it impossible to obtain optimal solutions. In such cases, the generation of heuristic rules can reveal new ways of acting in practice.

Indispensable in this case is the knowledge associated with the analysis, design and coding of algorithms.

  • In the fourth phase, it is necessary to choose among the solutions obtained in the previous phase, selecting one of them or a synthesis of several. The last phase involves characterizing the chosen decision in all its details.


The Quantitative Methods of Management course aims to train the student in the basic concepts and techniques of Operational Research, in the use of mathematical models to solve management and engineering problems, and in the analysis and development of basic algorithms and tools for optimization.


Linear Programming was born during the Second World War as a technique dedicated to solving certain types of resource-allocation problems among different activities.


This module focuses on the transportation problem, closing the part dedicated to linear programming in general and beginning the analysis of problems with special structures. The module is completed with the study of distribution problems and their analysis using the primal-dual method.


The next module introduces integer linear programming by modeling situations where there are binary decision variables, logical implications or disjunctive relationships.
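A logical implication such as "if any units are produced, a setup cost is incurred" is typically encoded with a binary variable y and a big-M constraint x ≤ M·y. The toy fixed-charge model below (all figures invented) checks that formulation by brute force over the small feasible set:

```python
from itertools import product

# Produce x units (0..10) at 12 profit per unit; a setup cost of 50 is
# paid only if production occurs. The constraint x <= M*y forces the
# binary y to 1 whenever x > 0, encoding the implication x > 0 -> y = 1.
M = 10
profit_per_unit, setup_cost = 12, 50

best = max(
    ((x, y) for x, y in product(range(11), (0, 1)) if x <= M * y),
    key=lambda s: profit_per_unit * s[0] - setup_cost * s[1],
)
```

With these numbers the optimum is (x, y) = (10, 1): producing at capacity yields 120 - 50 = 70, which beats not producing at all.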


The fourth module, Game theory, addresses a set of situations characterized by the fight or confrontation between two or more opponents.
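For a 2×2 zero-sum game without a saddle point, the minimax theorem yields a closed-form mixed-strategy solution, sketched below (the matching-pennies payoffs in the usage note are a standard textbook example):

```python
def solve_2x2(a, b, c, d):
    """Mixed-strategy solution of a 2x2 zero-sum game with no saddle point.

    The row player's payoff matrix is [[a, b], [c, d]].
    Returns (p, v): the probability of playing the first row, and the
    value of the game, from the classical closed-form expressions.
    """
    denom = a + d - b - c
    p = (d - c) / denom
    v = (a * d - b * c) / denom
    return p, v
```

For matching pennies, `solve_2x2(1, -1, -1, 1)` gives p = 0.5 and value 0: each player randomizes evenly and neither can gain.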


In the fifth module, an introduction is made to the analysis of alternatives in different decision environments. Decision theory is presented as a convenient instrument for approaching decision-making under conditions of uncertainty, in which complete information is not available. The value of information in this context is also analyzed.


The sixth module is devoted to the study of sequential or multi-stage decision problems. The variables that describe them are governed by transformations over time.


The modeling techniques module describes a general modeling system based on the following stages:

  • verbal description of the identified problem;
  • specification of the horizon to which the analysis refers;
  • evaluation of the availability and existence of data;
  • identification of variables;
  • specification of the structure and limitations through the construction of restrictions, expressed in terms of the available data and the identified variables;
  • selection of evaluation criteria for alternatives and of the approach used to solve the model.



Basically it consists in the construction of a model that describes the essential part of the behavior of a system of interest, as well as in the design of experiments with the model and the extraction of conclusions from their results.


Specifically, this course studies the most innovative techniques for solving continuous and integer linear problems, expands the techniques already presented from a computational point of view, and generalizes the knowledge of optimization to the more general case of non-linear problems, reviewing the methods that allow solving them.


It begins with the analysis, from a computational point of view, of the simplex algorithm as a method of resolution. Subsequently, the decomposition and partitioning methods are studied. The third topic focuses on so-called interior point methods and their application in the field of linear programming.


The necessary and sufficient optimality conditions in each type of problem are studied and other optimization methods for constrained problems are introduced.

Dual methods do not attack the original problem directly, but rather its dual.


In particular, various types of selection, crossover and mutation operators are shown, as well as dynamic ways to determine their respective frequencies of use.


The basic idea is not only to move from one point to a better one, which would be the reasonable thing to do, but also to allow the sporadic, probabilistic acceptance of backward steps, that is, of worsenings in the value of the objective function.


The idea is that forbidding moves that reverse those recorded in the tabu list minimizes the probability that the search enters a dead-end cycle. The short-term memory effect of the tabu list is complemented by medium- and long-term memory mechanisms, called intensification and diversification respectively.


They are systems made up of a large number of highly interrelated elemental processing units that are capable of performing tasks such as classification, generalization, optimization, abstraction, etc.


It introduces the study, from an analytical point of view, of the waiting phenomena so common in the productive environment. Among the practical applications of queuing theory, those related to the design and analysis of production and service units stand out.

I am a dreamer and in my dreams I believe that a better world is possible, that no one knows more than anyone, we all learn from everyone. I love gastronomy, numbers, teaching and sharing all the little I know, because by sharing I also learn. "Let's all go together from foundation to success"
MBA Yosvanys R Guerra Valverde