50 results for Linear programming models
Abstract:
This paper extends the minimax disparity approach for determining ordered weighted averaging (OWA) operator weights based on linear programming. It introduces a minimax disparity criterion over all distinct pairs of weights and uses linear programming duality to prove the feasibility of the extended OWA operator weights model. The paper closes with an open problem. © 2006 Elsevier Ltd. All rights reserved.
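Concretely, the extended model can be posed as a single LP: minimise the largest pairwise disparity delta subject to a prescribed orness level. The sketch below, in Python with SciPy, is one plausible rendering of that formulation; the function name, the orness level alpha, and the exact constraint set are assumptions rather than the paper's definitive model.

```python
# A minimal sketch of the extended minimax disparity OWA model, assuming the
# usual formulation: minimize the largest |w_i - w_j| over all pairs, subject
# to a given orness level alpha, sum(w) = 1, and w >= 0.
import numpy as np
from scipy.optimize import linprog

def minimax_disparity_owa(n, alpha):
    # Decision vector: [w_1, ..., w_n, delta]; objective: minimize delta.
    c = np.zeros(n + 1)
    c[-1] = 1.0

    # Equality constraints: sum(w) = 1 and orness(w) = alpha, where
    # orness(w) = (1/(n-1)) * sum_i (n - i) * w_i  (1-indexed).
    A_eq = np.zeros((2, n + 1))
    A_eq[0, :n] = 1.0
    A_eq[1, :n] = [(n - i) / (n - 1) for i in range(1, n + 1)]
    b_eq = np.array([1.0, alpha])

    # Inequalities: w_i - w_j <= delta and w_j - w_i <= delta for all i < j.
    rows = []
    for i in range(n):
        for j in range(i + 1, n):
            for sign in (1.0, -1.0):
                row = np.zeros(n + 1)
                row[i], row[j], row[-1] = sign, -sign, -1.0
                rows.append(row)
    A_ub = np.array(rows)
    b_ub = np.zeros(len(rows))

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (n + 1))
    return res.x[:n]

print(minimax_disparity_owa(n=5, alpha=0.7))
```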
Abstract:
In May 2006, the Ministers of Health of all the countries on the African continent, at a special session of the African Union, undertook to institutionalise efficiency monitoring within their respective national health information management systems. The specific objectives of this study were: (i) to assess the technical efficiency of the National Health Systems (NHSs) of African countries in producing male and female life expectancies, and (ii) to assess changes in health productivity over time, with a view to analysing changes in efficiency and changes in technology. The analysis was based on five-year panel data (1999-2003) from all 53 countries of continental Africa. Data Envelopment Analysis (DEA), a non-parametric linear programming approach, was employed to assess technical efficiency. The Malmquist Total Factor Productivity (MTFP) index was used to analyse efficiency and productivity change over time among the 53 countries' national health systems. The data consisted of two outputs (male and female life expectancies) and two inputs (per capita total health expenditure and adult literacy). The DEA revealed that 49 (92.5%) countries' NHSs were run inefficiently in 1999 and 2000, while 50 (94.3%), 48 (90.6%) and 47 (88.7%) operated inefficiently in 2001, 2002 and 2003 respectively. All 53 countries' national health systems registered improvements in total factor productivity, attributable mainly to technical progress. Fifty-two countries did not experience any change in scale efficiency, while thirty (56.6%) countries' national health systems had a Pure Efficiency Change (PEFFCH) index of less than one, signifying that those countries' NHSs' pure efficiency contributed negatively to productivity change. African countries may need to critically evaluate the utility of institutionalising Malmquist TFP-type analyses to monitor changes in health systems' economic efficiency and productivity over time. Keywords: African national health systems, per capita total health expenditure, technical efficiency, scale efficiency, Malmquist indices of productivity change, DEA
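For reference, an output-oriented, constant-returns DEA envelopment LP of the kind used for such efficiency scores can be sketched as follows; the toy data, the CRS orientation, and the function name are illustrative assumptions, not the study's actual panel or model.

```python
# A minimal sketch of an output-oriented, constant-returns-to-scale DEA
# envelopment LP with 2 inputs and 2 outputs, as in the study's layout.
import numpy as np
from scipy.optimize import linprog

def dea_output_efficiency(X, Y, k):
    """Efficiency of unit k. X: (m inputs x n units), Y: (s outputs x n units)."""
    m, n = X.shape
    s = Y.shape[0]
    # Decision vector: [phi, lambda_1..lambda_n]; maximize phi => minimize -phi.
    c = np.zeros(1 + n)
    c[0] = -1.0

    # Input constraints: sum_j lambda_j * x_ij <= x_ik.
    A_in = np.hstack([np.zeros((m, 1)), X])
    b_in = X[:, k]
    # Output constraints: phi * y_rk - sum_j lambda_j * y_rj <= 0.
    A_out = np.hstack([Y[:, [k]], -Y])
    b_out = np.zeros(s)

    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(0, None)] * (1 + n))
    return res.x[0]  # phi >= 1; unit k is technically efficient iff phi == 1

# Toy data: rows = inputs (expenditure, literacy), columns = 4 countries.
X = np.array([[120.0, 80.0, 150.0, 60.0], [55.0, 70.0, 65.0, 40.0]])
Y = np.array([[52.0, 58.0, 50.0, 55.0], [55.0, 61.0, 53.0, 57.0]])
for k in range(4):
    print(f"unit {k}: phi = {dea_output_efficiency(X, Y, k):.3f}")
```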
Abstract:
The generalised transportation problem (GTP) is an extension of the linear Hitchcock transportation problem. However, it lacks the unimodularity property, so a linear programming solution (such as one obtained by the simplex method) is not guaranteed to be integer. This is a major difference between the GTP and the Hitchcock transportation problem. Although some special algorithms, such as the generalised stepping-stone method, have been developed, they are based on the linear programming model and relax the integer solution requirement of the GTP. This paper proposes a genetic algorithm (GA) to solve the GTP and presents a numerical example to demonstrate the algorithm and its efficiency.
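As a rough illustration of the approach, a penalty-based GA for an integer transportation-type problem might look like the sketch below; the encoding, operators, multipliers e[i][j], and all figures are assumptions for illustration, not the paper's algorithm.

```python
# A minimal genetic-algorithm sketch for an integer transportation-type
# problem, using a penalty for constraint violation.
import random

random.seed(0)
m, n = 3, 4                                   # sources x destinations
cost = [[4, 6, 9, 2], [5, 3, 7, 8], [6, 5, 4, 3]]
e = [[1, 2, 1, 1], [1, 1, 2, 1], [2, 1, 1, 1]]   # assumed GTP multipliers
supply, demand = [20, 30, 25], [15, 20, 25, 15]
MAX_X, PENALTY = 30, 1000.0

def fitness(x):
    total = sum(cost[i][j] * x[i][j] for i in range(m) for j in range(n))
    viol = sum(abs(sum(e[i][j] * x[i][j] for j in range(n)) - supply[i])
               for i in range(m))
    viol += sum(abs(sum(x[i][j] for i in range(m)) - demand[j]) for j in range(n))
    return total + PENALTY * viol             # lower is better

def random_solution():
    return [[random.randint(0, MAX_X) for _ in range(n)] for _ in range(m)]

def crossover(a, b):                          # uniform cell-wise crossover
    return [[random.choice((a[i][j], b[i][j])) for j in range(n)] for i in range(m)]

def mutate(x, rate=0.1):
    return [[max(0, x[i][j] + random.choice((-1, 1))) if random.random() < rate
             else x[i][j] for j in range(n)] for i in range(m)]

pop = [random_solution() for _ in range(100)]
for _ in range(500):
    pop.sort(key=fitness)
    elite = pop[:20]                          # truncation selection
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(80)]
best = min(pop, key=fitness)
print("best fitness:", fitness(best))
```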
Abstract:
This paper re-assesses three independently developed approaches aimed at solving the problem of zero weights or non-zero slacks in Data Envelopment Analysis (DEA): weights-restricted, non-radial and extended facet DEA models. Weights-restricted DEA models are dual to envelopment DEA models with restrictions on the dual variables (DEA weights) aimed at avoiding zero values for those weights; non-radial DEA models are envelopment models which avoid non-zero slacks in the input-output constraints. Finally, extended facet DEA models recognize that only projections on facets of full dimension correspond to well-defined rates of substitution/transformation between all inputs/outputs, which in turn correspond to non-zero weights in the multiplier version of the DEA model. We demonstrate how these methods are equivalent, not only in their aim but also in the solutions they yield. In addition, we show that the aforementioned methods modify the production frontier by extending existing facets or creating unobserved facets. Further, we propose a new approach that uses weight restrictions to extend existing facets. This approach has some computational advantages, because extended facet models normally make use of mixed integer programming models, which are computationally demanding.
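To make the multiplier-side mechanics concrete, here is a minimal weights-restricted CCR multiplier LP with a simple assurance-region restriction on the two input weights. The bounds L and U, the toy data, and the function name are assumptions for illustration; the paper's own facet-extending restrictions would be derived from the data rather than fixed a priori.

```python
# A minimal sketch of a multiplier-form CCR DEA model with an assurance-
# region weight restriction (L <= v1/v2 <= U) to avoid zero input weights.
import numpy as np
from scipy.optimize import linprog

def dea_multiplier(X, Y, k, L=0.2, U=5.0):
    m, n = X.shape
    s = Y.shape[0]
    # Decision vector: [u_1..u_s, v_1..v_m]; maximize u'y_k => minimize -u'y_k.
    c = np.concatenate([-Y[:, k], np.zeros(m)])
    A_eq = np.concatenate([np.zeros(s), X[:, k]]).reshape(1, -1)  # v'x_k = 1
    b_eq = [1.0]
    # u'y_j - v'x_j <= 0 for every unit j.
    A_ub = np.hstack([Y.T, -X.T])
    b_ub = np.zeros(n)
    # Assurance region on the two input weights: L*v2 <= v1 <= U*v2.
    ar1 = np.zeros(s + m); ar1[s], ar1[s + 1] = 1.0, -U     # v1 - U*v2 <= 0
    ar2 = np.zeros(s + m); ar2[s], ar2[s + 1] = -1.0, L     # L*v2 - v1 <= 0
    A_ub = np.vstack([A_ub, ar1, ar2])
    b_ub = np.concatenate([b_ub, [0.0, 0.0]])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (s + m))
    return -res.fun  # efficiency of unit k in [0, 1]

X = np.array([[2.0, 3.0, 4.0], [5.0, 4.0, 2.0]])   # 2 inputs x 3 units
Y = np.array([[10.0, 12.0, 9.0]])                  # 1 output x 3 units
print([round(dea_multiplier(X, Y, k), 3) for k in range(3)])
```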
Abstract:
In this work the solution of a class of capital investment problems is considered within the framework of mathematical programming. Based on the net present value criterion, the problems in question are mainly characterized by the fact that the cost of capital is defined as a non-decreasing function of the investment requirements. Capital rationing and some cases of technological dependence are also included; this approach leads to zero-one non-linear programming problems, for which specifically designed solution procedures, supported by a general branch-and-bound development, are presented. In the context of both this development and the relevant mathematical properties of the aforementioned zero-one programs, a generalized zero-one model is also discussed. Finally, a variant of the scheme, connected with the sequencing of the search for optimal solutions, is presented as an alternative with reduced storage requirements.
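A small branch-and-bound sketch for a problem of this shape, where a non-decreasing cost-of-capital function makes the objective non-linear in the selected set, could look like the following; the project data, rate function, and bounding rule are illustrative assumptions, not the paper's procedures.

```python
# A branch-and-bound sketch for a zero-one investment problem in which the
# cost of capital is a non-decreasing function of the total investment.
def solve(returns, capital, budget, rate):
    n = len(returns)
    best = {"value": float("-inf"), "plan": None}

    def value(cap, ret):
        # NPV-style objective: returns minus financing cost at the
        # (non-decreasing) rate implied by the committed capital.
        return ret - rate(cap) * cap

    def bound(i, cap, ret):
        # Optimistic bound: add every remaining affordable project's return
        # but charge only the already-committed capital; valid because the
        # rate is non-decreasing in total capital.
        opt = ret + sum(returns[j] for j in range(i, n)
                        if capital[j] <= budget - cap)
        return opt - rate(cap) * cap

    def branch(i, cap, ret, plan):
        if value(cap, ret) > best["value"]:
            best["value"], best["plan"] = value(cap, ret), plan[:]
        if i == n or bound(i, cap, ret) <= best["value"]:
            return                               # prune this subtree
        if cap + capital[i] <= budget:           # branch: take project i
            branch(i + 1, cap + capital[i], ret + returns[i], plan + [i])
        branch(i + 1, cap, ret, plan)            # branch: skip project i

    branch(0, 0.0, 0.0, [])
    return best

rate = lambda t: 0.05 + 0.001 * t                # non-decreasing cost of capital
print(solve([12.0, 9.0, 15.0, 7.0], [40.0, 30.0, 60.0, 20.0], 100.0, rate))
```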
Abstract:
This paper explores the use of the optimization procedures in SAS/OR software with application to the ordered weighted averaging (OWA) operators of decision-making units (DMUs). OWA, originally introduced by Yager (IEEE Trans Syst Man Cybern 18(1):183-190, 1988), has gained much interest among researchers, and many applications have been proposed in areas such as decision making, expert systems, data mining, approximate reasoning, fuzzy systems and control. SAS, on the other hand, is powerful software capable of running various optimization tools, such as linear and non-linear programming with all types of constraints. To facilitate the use of the OWA operator by SAS users, a code was implemented. The SAS macro developed in this paper selects the criteria and alternatives from a SAS dataset and calculates a set of OWA weights. An example is given to illustrate the features of SAS/OWA software. © Springer-Verlag 2009.
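The aggregation step such a macro performs is Yager's OWA operator: sort the criterion scores in descending order and take the weighted sum. A minimal Python equivalent is sketched below; the sample scores and weights are invented.

```python
# Yager's OWA aggregation: scores are sorted descending and combined
# with a fixed weight vector that sums to one.
def owa(scores, weights):
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(w * s for w, s in zip(weights, sorted(scores, reverse=True)))

# Three alternatives rated on four criteria; "or-like" weights favour
# the higher scores.
weights = [0.4, 0.3, 0.2, 0.1]
for name, scores in {"A": [0.7, 0.9, 0.6, 0.8],
                     "B": [0.5, 0.95, 0.7, 0.4],
                     "C": [0.8, 0.8, 0.8, 0.8]}.items():
    print(name, round(owa(scores, weights), 3))
```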
Abstract:
This research is concerned with the application of operational research techniques in the development of a long-term waste management policy by an English waste disposal authority. The main aspects which have been considered are the estimation of future waste production and the assessment of the effects of proposed systems. Only household and commercial wastes have been dealt with in detail, though suggestions are made for the extension of the effect assessment to cover industrial and other wastes. Similarly, the only effects considered in detail have been costs, but possible extensions are discussed. An important feature of the study is that it was conducted in close collaboration with a waste disposal authority, and so pays more attention to the actual needs of the authority than is usual in such research. A critical examination of previous waste forecasting work leads to the use of simple trend extrapolation methods, with some consideration of seasonal effects. The possibility of relating waste production to other social and economic indicators is discussed. It is concluded that, at present, large uncertainties in predictions are inevitable; waste management systems must therefore be designed to cope with this uncertainty. Linear programming is used to assess the overall costs of proposals. Two alternative linear programming formulations of this problem are used and discussed. The first is a straightforward approach, which has been implemented as an interactive computer program. The second is more sophisticated and represents the behaviour of incineration plants more realistically. Careful attention is paid to the choice of appropriate data and the interpretation of the results. Recommendations are made on methods for immediate use, on the choice of data to be collected for future plans, and on the most useful lines for further research and development.
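As an indication of what such a cost-assessment formulation involves, the following is a minimal allocation LP under invented data; the real models described above would add incinerator behaviour, seasonal waste profiles and other details.

```python
# A minimal cost-assessment LP: allocate forecast waste from collection
# districts to disposal facilities at minimum transport-plus-disposal cost,
# subject to facility capacities. All names and figures are invented.
import numpy as np
from scipy.optimize import linprog

waste = np.array([120.0, 80.0, 150.0])            # kt/year from 3 districts
capacity = np.array([200.0, 180.0])               # kt/year at 2 facilities
cost = np.array([[14.0, 22.0],                    # cost per kt: haul +
                 [18.0, 12.0],                    # disposal, district x
                 [25.0, 16.0]])                   # facility

# Decision variables x[d, f] flattened row-wise: tonnage from d to f.
c = cost.ravel()
# Each district's waste must all be disposed of somewhere.
A_eq = np.kron(np.eye(3), np.ones(2))
b_eq = waste
# Facility capacity limits.
A_ub = np.kron(np.ones((1, 3)), np.eye(2))
b_ub = capacity

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
print(res.x.reshape(3, 2), " total cost:", res.fun)
```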
Abstract:
This paper provides the most comprehensive evidence to date on whether monetary aggregates are valuable for forecasting US inflation in the early to mid 2000s. We explore a wide range of definitions of money, including different methods of aggregation and different collections of included monetary assets. In our forecasting experiment we use two non-linear techniques, namely recurrent neural networks and kernel recursive least squares regression, techniques that are new to macroeconomics. Recurrent neural networks operate with potentially unbounded input memory, while the kernel regression technique is a finite memory predictor. The two methodologies compete to find the best-fitting US inflation forecasting models and are then compared to forecasts from a naive random walk model. The best models were non-linear autoregressive models based on kernel methods. Our findings do not provide much support for the usefulness of monetary aggregates in forecasting inflation.
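For a flavour of the kernel-based, finite-memory approach, the sketch below fits a plain kernel ridge regression on lagged values and compares it with a naive random walk. This is a simplified stand-in for kernel recursive least squares, not the paper's estimator, and the series, lags and hyperparameters are invented.

```python
# A kernel-based nonlinear autoregressive forecaster: kernel ridge
# regression on a lag embedding, benchmarked against a random walk.
import numpy as np

rng = np.random.default_rng(0)
series = np.cumsum(rng.normal(0.2, 1.0, 200))    # stand-in "inflation" series

def embed(y, p):
    """Lag-embed: rows are [y_{t-p}..y_{t-1}], target is y_t."""
    X = np.array([y[t - p:t] for t in range(p, len(y))])
    return X, y[p:]

def rbf(A, B, gamma=0.1):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

p, lam = 4, 1e-2
X, y = embed(series, p)
X_tr, y_tr, X_te, y_te = X[:150], y[:150], X[150:], y[150:]

K = rbf(X_tr, X_tr)
alpha = np.linalg.solve(K + lam * np.eye(len(K)), y_tr)   # ridge solution
pred = rbf(X_te, X_tr) @ alpha

rw = X_te[:, -1]                                  # naive random-walk forecast
print("kernel RMSE:", np.sqrt(np.mean((pred - y_te) ** 2)))
print("random-walk RMSE:", np.sqrt(np.mean((rw - y_te) ** 2)))
```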
Abstract:
Background/Aim - People of south Asian origin have an excessive risk of morbidity and mortality from cardiovascular disease. We examined the effect of ethnicity on known risk factors and analysed the risk of cardiovascular events and mortality in UK south Asian and white European patients with type 2 diabetes over a 2 year period. Methods - A total of 1486 south Asian (SA) and 492 white European (WE) subjects with type 2 diabetes were recruited from 25 general practices in Coventry and Birmingham, UK. Baseline data included clinical history, anthropometry and measurements of traditional risk factors: blood pressure, total cholesterol and HbA1c. Multiple linear regression models were used to examine ethnic differences in individual risk factors. Ten-year cardiovascular risk was estimated using the Framingham and UKPDS equations. All subjects were followed up for 2 years. Cardiovascular disease (CVD) events and mortality in the two groups were compared. Findings - Significant differences were noted in risk profiles between the two groups. After adjustment for clustering and confounding, a significant ethnicity effect remained only for higher HbA1c (0.50 [0.22 to 0.77]; P = 0.0004) and lower HDL (-0.09 [-0.17 to -0.01]; P = 0.0266). Baseline CVD history was predictive of CVD events during follow-up for SA (P < 0.0001) but not WE (P = 0.189). Mean age at death was 66.8 (11.8) years for SA vs. 74.2 (12.1) years for WE, a difference of 7.4 years (95% CI 1.0 to 13.7 years), P = 0.023. The adjusted odds ratio of a CVD event or death from CVD was greater, but not significantly so, in SA than in WE (OR 1.4 [0.9 to 2.2]). Limitations - The small number of events in both groups and the short follow-up period are key limitations; longer follow-up is required to see whether the observed differences between the ethnic groups persist. Conclusion - South Asian patients with type 2 diabetes in the UK have a higher cardiovascular risk and present with cardiovascular events at a significantly younger age than white Europeans. Enhanced, ethnicity-specific targets and effective treatments are needed if these inequalities are to be reduced.
Abstract:
Data envelopment analysis (DEA), as introduced by Charnes, Cooper, and Rhodes (1978), is a linear programming technique that has been widely used to evaluate the relative efficiency of a set of homogeneous decision making units (DMUs). In many real applications, the input-output variables cannot be precisely measured. This is particularly important in assessing the efficiency of DMUs using DEA, since the efficiency scores of inefficient DMUs are very sensitive to possible data errors. Hence, several approaches have been proposed to deal with imprecise data. Perhaps the most popular fuzzy DEA model is based on the α-cut. One drawback of the α-cut approach is that it cannot include all information about uncertainty. This paper aims to introduce an alternative linear programming model that can include some uncertainty information from the intervals within the α-cut approach. We introduce the concept of a "local α-level" to develop a multi-objective linear programming model for measuring the efficiency of DMUs under uncertainty. An example is given to illustrate the use of this method.
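One common way to operationalise an α-cut is interval DEA: at a chosen α level each fuzzy datum becomes an interval, and an upper-bound efficiency pairs the evaluated DMU's most favourable realisation with its peers' least favourable one. The sketch below follows that convention; the data, the single-input/single-output layout, and the function name are assumptions, not this paper's local α-level model.

```python
# Interval (alpha-cut style) DEA: upper-bound efficiency of DMU k from a
# CCR multiplier LP with the evaluated unit at its best-case data.
import numpy as np
from scipy.optimize import linprog

def upper_efficiency(XL, XU, YL, YU, k):
    """XL/XU: (m x n) input bounds; YL/YU: (s x n) output bounds."""
    m, n = XL.shape
    s = YL.shape[0]
    # Evaluated unit at best case (low input, high output); peers at worst.
    Xk, Yk = XL[:, k], YU[:, k]
    c = np.concatenate([-Yk, np.zeros(m)])             # maximize u'Yk
    A_eq = np.concatenate([np.zeros(s), Xk]).reshape(1, -1)  # v'Xk = 1
    rows = []
    for j in range(n):
        xj, yj = (Xk, Yk) if j == k else (XU[:, j], YL[:, j])
        rows.append(np.concatenate([yj, -xj]))          # u'y_j - v'x_j <= 0
    res = linprog(c, A_ub=np.array(rows), b_ub=np.zeros(n),
                  A_eq=A_eq, b_eq=[1.0], bounds=[(0, None)] * (s + m))
    return -res.fun

# Interval data at some alpha level: 3 DMUs, 1 input, 1 output.
XL = np.array([[1.8, 2.7, 3.5]]); XU = np.array([[2.2, 3.3, 4.5]])
YL = np.array([[9.0, 11.0, 8.0]]); YU = np.array([[11.0, 13.0, 10.0]])
print([round(upper_efficiency(XL, XU, YL, YU, k), 3) for k in range(3)])
```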
Abstract:
Integer-valued data envelopment analysis (DEA) with alternative returns to scale technology has been introduced and developed recently by Kuosmanen and Kazemi Matin. The proportionality assumption of their introduced "natural augmentability" axiom in constant and nondecreasing returns to scale technologies makes it possible to achieve feasible decision-making units (DMUs) of arbitrarily large size. In many real world applications it is not possible to achieve such production plans, since some of the input and output variables are bounded above. In this paper, we extend the axiomatic foundation of integer-valued DEA models to include bounded output variables. Some model variants are obtained by introducing a new axiom of "boundedness" over the selected output variables. A mixed integer linear programming (MILP) formulation is also introduced for computing efficiency scores in the associated production set. © 2011 The Authors. International Transactions in Operational Research © 2011 International Federation of Operational Research Societies.
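A MILP in this spirit, with integer reference targets and an explicit upper bound on the selected output, can be sketched with PuLP as follows; the data, the bound, and the output orientation are illustrative assumptions rather than the paper's exact axiomatic formulation.

```python
# Integer-valued, output-oriented DEA sketch: reference targets must be
# integer and must not exceed a stated upper bound on the output.
import pulp

X = [[2, 3, 4], [5, 4, 2]]        # m inputs x n DMUs (integer data)
Y = [[10, 12, 9]]                 # s outputs x n DMUs (integer data)
UB = [14]                         # assumed upper bound on the output
m, n, s, k = 2, 3, 1, 0           # evaluate DMU k

prob = pulp.LpProblem("integer_dea", pulp.LpMaximize)
phi = pulp.LpVariable("phi", lowBound=1)
lam = [pulp.LpVariable(f"lam_{j}", lowBound=0) for j in range(n)]
xt = [pulp.LpVariable(f"xt_{i}", lowBound=0, cat="Integer") for i in range(m)]
yt = [pulp.LpVariable(f"yt_{r}", lowBound=0, cat="Integer") for r in range(s)]

prob += phi                                        # output expansion factor
for i in range(m):                                 # integer input targets
    prob += xt[i] <= X[i][k]
    prob += pulp.lpSum(lam[j] * X[i][j] for j in range(n)) <= xt[i]
for r in range(s):                                 # integer, bounded outputs
    prob += yt[r] >= phi * Y[r][k]
    prob += pulp.lpSum(lam[j] * Y[r][j] for j in range(n)) >= yt[r]
    prob += yt[r] <= UB[r]                         # the "boundedness" axiom

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("phi =", pulp.value(phi), " targets:", [pulp.value(v) for v in yt])
```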
Abstract:
The non-linear programming algorithms for the minimum weight design of structural frames are presented in this thesis. The first, which is applied to rigidly jointed and pin jointed plane frames subject to deflexion constraints, consists of a search in a feasible design space. Successive trial designs are developed so that the feasibility and the optimality of the designs are improved simultaneously. It is found that this method is restricted to the design of structures with few unknown variables. The second non-linear programming algorithm is presented in a general form. This consists of two types of search, one improving feasibility and the other optimality. The method speeds up the 'feasible direction' approach by obtaining a constant weight direction vector that is influenced by dominating constraints. For pin jointed plane and space frames this method is used to obtain a 'minimum weight' design which satisfies restrictions on stresses and deflexions. The matrix force method enables the design requirements to be expressed in a general form and the design problem is automatically formulated within the computer. Examples are given to explain the method and the design criteria are extended to include member buckling. Fundamental theorems are proposed and proved to confirm that structures are inter-related. These theorems are applicable to linear elastic structures and facilitate the prediction of the behaviour of one structure from the results of analysing another, more general, or related structure. It becomes possible to evaluate the significance of each member in the behaviour of a structure, and the problem of minimum weight design is extended to include shape. A method is proposed to design structures of optimum shape with stress and deflexion limitations. Finally, a detailed investigation is carried out into the design of structures to study the factors that influence their shape.
Abstract:
The Multiple Pheromone Ant Clustering Algorithm (MPACA) models the collective behaviour of ants to find clusters in data and to assign objects to the most appropriate class. It is an ant colony optimisation approach that uses pheromones to mark paths linking objects that are similar and potentially members of the same cluster or class. Its novelty lies in using a separate pheromone for each descriptive attribute of an object, rather than a single pheromone representing the whole object. Ants that encounter other ants frequently enough can combine the attribute values they are detecting, which enables the MPACA to learn influential variable interactions. This paper applies the model to real-world data from two domains. One is logistics, focusing on resource allocation rather than the more traditional vehicle-routing problem; the other is mental-health risk assessment. The task for the MPACA in each domain was to predict class membership: the classes for the logistics domain were levels of demand on haulage company resources, and the mental-health classes were levels of suicide risk. Results on these noisy real-world data were promising, demonstrating the ability of the MPACA to find patterns in the data with accuracy comparable to more traditional linear regression models. © 2013 Polish Information Processing Society.
Abstract:
This paper provides the most comprehensive evidence to date on whether monetary aggregates are valuable for forecasting US inflation in the early to mid 2000s. We explore a wide range of definitions of money, including different methods of aggregation and different collections of included monetary assets. We use non-linear, artificial intelligence techniques, namely recurrent neural networks, evolution strategies and kernel methods, in our forecasting experiment. In the experiment, these three methodologies compete to find the best-fitting US inflation forecasting models and are then compared to forecasts from a naive random walk model. The best models were non-linear autoregressive models based on kernel methods. Our findings do not provide much support for the usefulness of monetary aggregates in forecasting inflation. There is evidence in the literature that evolutionary methods can be used to evolve kernels; our future work should therefore combine the evolutionary and kernel methods to gain the benefits of both.
Abstract:
In the contemporary customer-driven supply chain, maximization of customer service plays an equally important role as minimization of costs in a company's efforts to retain and increase its competitiveness. This article develops a multiple-criteria optimization approach, combining the analytic hierarchy process (AHP) and an integer linear programming (ILP) model, to aid the design of an optimal logistics distribution network. The proposed approach outperforms traditional cost-based optimization techniques because it considers both quantitative and qualitative factors and also aims at maximizing the benefits to both the deliverer and the customers. In the approach, the AHP is used to determine the relative importance weightings, or priorities, of alternative warehouses with respect to some critical customer-oriented criteria. The results of the AHP prioritization are used as input to the ILP model, the objective of which is to select the best warehouses at the lowest possible cost. Two commercial packages are used in this article: Expert Choice and LINDO.
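The AHP stage can be made concrete with the classical eigenvector calculation: priorities come from the principal eigenvector of the pairwise comparison matrix, checked by a consistency ratio. The sketch below uses an invented 3x3 judgment matrix; in the article's approach these priorities would then enter the ILP as benefit coefficients for the candidate warehouses.

```python
# AHP priorities via the principal eigenvector of a pairwise comparison
# matrix, with Saaty's consistency ratio as a sanity check.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])     # invented judgments for 3 warehouses

vals, vecs = np.linalg.eig(A)
i = np.argmax(vals.real)
w = np.abs(vecs[:, i].real)
w /= w.sum()                        # AHP priority weights

n = A.shape[0]
CI = (vals[i].real - n) / (n - 1)   # consistency index
RI = 0.58                           # Saaty's random index for n = 3
print("weights:", np.round(w, 3), " CR =", round(CI / RI, 3))
```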