936 results for Linear programming models


Relevance: 80.00%

Abstract:

Neural networks can be regarded as statistical models and analysed in a Bayesian framework. Generalisation is measured by performance on independent test data drawn from the same distribution as the training data; such performance can be quantified by the posterior average of the information divergence between the true and model distributions. Averaging over the Bayesian posterior guarantees internal coherence; using information divergence guarantees invariance with respect to representation. The theory generalises the least-mean-squares theory for linear Gaussian models to general problems of statistical estimation. The main results are: (1) the ideal optimal estimate is always given by the average over the posterior; (2) the optimal estimate within a computational model is given by the projection of the ideal estimate onto the model. This incidentally shows that some currently popular methods for dealing with hyperpriors are in general unnecessary and misleading. The extension of information divergence to positive normalisable measures reveals a remarkable relation between the dual affine geometry of statistical manifolds and the geometry of a dual pair of Banach spaces, and therefore offers a conceptual simplification of information geometry. The general conclusion on evaluating neural network learning rules and other statistical inference methods is that such evaluations are only meaningful under three assumptions: the prior P(p), describing the environment of all the problems; the divergence D, specifying the requirement of the task; and the model Q, specifying the available computing resources.
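In symbols (my notation, not necessarily the thesis's), writing P(p | D) for the posterior over true distributions p and D(p, q) for the information divergence, the two main results can be read as:

```latex
\hat{q} \;=\; \arg\min_{q} \int D(p, q)\, P(p \mid D)\, dp
        \;=\; \langle p \rangle_{P(p \mid D)},
\qquad
\hat{q}_{Q} \;=\; \arg\min_{q \in Q} D(\hat{q}, q),
```

i.e. the ideal estimate is the posterior average, and the best estimate a computational model Q can offer is the divergence projection of that ideal estimate onto Q.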

Relevance: 80.00%

Abstract:

We investigate the dependence of Bayesian error bars on the distribution of data in input space. For generalized linear regression models we derive an upper bound on the error bars which shows that, in the neighbourhood of the data points, the error bars are substantially reduced from their prior values. For regions of high data density we also show that the contribution to the output variance due to the uncertainty in the weights can exhibit an approximate inverse proportionality to the probability density. Empirical results support these conclusions.
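The quantity being bounded can be sketched as follows. For a generalized linear model y(x) = wᵀφ(x) with Gaussian weight prior (precision α) and noise precision β — symbols are my assumptions, not necessarily the paper's notation — the predictive error bar at input x decomposes as:

```latex
\sigma^{2}(x) \;=\;
\underbrace{\beta^{-1}}_{\text{noise}}
\;+\;
\underbrace{\phi(x)^{\top} A^{-1}\, \phi(x)}_{\text{weight uncertainty}},
\qquad
A \;=\; \alpha I \;+\; \beta \sum_{n=1}^{N} \phi(x_n)\,\phi(x_n)^{\top}.
```

Near observed data, A accumulates outer products φ(xₙ)φ(xₙ)ᵀ, so the second term shrinks there — consistent with the abstract's claim that error bars are substantially reduced in the neighbourhood of the data points.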

Relevance: 80.00%

Abstract:

Iterative multiuser joint decoding based on exact Belief Propagation (BP) is analyzed in the large system limit by means of the replica method. It is shown that performance can be improved by appropriate power assignment to the users. The optimum power assignment can be found by linear programming in most technically relevant cases. The performance of BP iterative multiuser joint decoding is compared to suboptimum approximations based on Interference Cancellation (IC). While IC receivers show a significant loss for equal-power users, they yield performance close to BP under optimum power assignment.

Relevance: 80.00%

Abstract:

Physical distribution plays an important role in contemporary logistics management: both customer satisfaction and company competitiveness can be enhanced if the distribution problem is solved optimally. The multi-depot vehicle routing problem (MDVRP) is a practical logistics distribution problem consisting of three critical issues: customer assignment, customer routing, and vehicle sequencing. According to the literature, existing solution approaches for the MDVRP are unsatisfactory because unrealistic assumptions were made in the first sub-problem of the MDVRP, the customer assignment problem. To refine the approaches, the focus of this paper is confined to this sub-problem only. The paper formulates the customer assignment problem as a minimax-type integer linear programming model, with the objective of minimizing the cycle time of the depots, in which setup times are explicitly considered. Since the model is proven to be NP-complete, a genetic algorithm is developed for solving it. The efficiency and effectiveness of the genetic algorithm are illustrated by a numerical example.
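The customer-assignment step lends itself to a compact genetic-algorithm sketch. The instance data, operators and parameters below are illustrative assumptions, not the paper's formulation: a chromosome assigns each customer to a depot, and fitness is the maximum depot cycle time (setup plus service), which the GA minimises.

```python
import random

# Hypothetical instance (illustrative numbers only): 8 customers, 3 depots.
service = [4, 7, 3, 5, 6, 2, 8, 4]   # service time per customer
setup = [3, 2, 4]                     # setup time per depot
N_DEPOTS = len(setup)

def cycle_times(assign):
    """Workload (setup + assigned service times) of each depot."""
    t = list(setup)
    for customer, depot in enumerate(assign):
        t[depot] += service[customer]
    return t

def fitness(assign):
    """Minimax objective: the cycle time of the busiest depot."""
    return max(cycle_times(assign))

def ga(pop_size=30, generations=200, mutation_rate=0.2, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randrange(N_DEPOTS) for _ in service] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)                 # elitist selection
        parents = pop[:pop_size // 2]
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, len(service))
            child = a[:cut] + b[cut:]         # one-point crossover
            if rng.random() < mutation_rate:  # reassign one random customer
                child[rng.randrange(len(child))] = rng.randrange(N_DEPOTS)
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = ga()
print(best, fitness(best))
```

For this toy instance the total workload is 48 time units over three depots, so no assignment can achieve a makespan below 16; the GA converges to or near that bound.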

Relevance: 80.00%

Abstract:

Group decision making is the study of identifying and selecting alternatives based on the values and preferences of the decision maker. Making a decision implies that there are several alternative choices to be considered. This paper uses the concept of Data Envelopment Analysis to introduce a new mathematical method for selecting the best alternative in a group decision making environment. The introduced model is a multi-objective function which is converted into a multi-objective linear programming model from which the optimal solution is obtained. A numerical example shows how the new model can be applied to rank the alternatives or to choose a subset of the most promising alternatives.

Relevance: 80.00%

Abstract:

Over 60% of the recurrent budget of the Ministry of Health (MoH) in Angola is spent on the operations of fixed health care facilities (health centres plus hospitals). However, to date, no study has attempted to investigate how efficiently those resources are used to produce health services. The objectives of this study were therefore to assess the technical efficiency of public municipal hospitals in Angola; to assess changes in productivity over time, with a view to analyzing changes in efficiency and technology; and to demonstrate how the results can be used in pursuit of the public health objective of promoting efficiency in the use of health resources. The analysis was based on 3-year panel data from all 28 public municipal hospitals in Angola. Data Envelopment Analysis (DEA), a non-parametric linear programming approach, was employed to assess technical and scale efficiency, and productivity change over time was measured using the Malmquist index. The results show that, on average, productivity of municipal hospitals in Angola increased by 4.5% over the period 2000-2002; that growth was due to improvements in efficiency rather than innovation. © 2008 Springer Science+Business Media, LLC.
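For reference, the output-oriented Malmquist index used in such studies is conventionally written in terms of period-t distance functions D^t (notation follows the standard DEA literature, not necessarily this paper):

```latex
M\!\left(x^{t+1}, y^{t+1}, x^{t}, y^{t}\right)
=
\left[
\frac{D^{t}\!\left(x^{t+1}, y^{t+1}\right)}{D^{t}\!\left(x^{t}, y^{t}\right)}
\cdot
\frac{D^{t+1}\!\left(x^{t+1}, y^{t+1}\right)}{D^{t+1}\!\left(x^{t}, y^{t}\right)}
\right]^{1/2}
=
\underbrace{\frac{D^{t+1}\!\left(x^{t+1}, y^{t+1}\right)}{D^{t}\!\left(x^{t}, y^{t}\right)}}_{\text{efficiency change}}
\times
\underbrace{\left[
\frac{D^{t}\!\left(x^{t+1}, y^{t+1}\right)}{D^{t+1}\!\left(x^{t+1}, y^{t+1}\right)}
\cdot
\frac{D^{t}\!\left(x^{t}, y^{t}\right)}{D^{t+1}\!\left(x^{t}, y^{t}\right)}
\right]^{1/2}}_{\text{technical change}}
```

The decomposition is what lets the study attribute the 4.5% productivity growth to efficiency improvement rather than to frontier shift (innovation).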

Relevance: 80.00%

Abstract:

This paper extends the minimax disparity approach for determining ordered weighted averaging (OWA) operator weights based on linear programming. It introduces the minimax disparity between any distinct pair of weights and uses linear programming duality to prove the feasibility of the extended OWA operator weights model. The paper finishes with an open problem. © 2006 Elsevier Ltd. All rights reserved.
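One plausible way to write such an extended model (a sketch assuming the disparity bound is applied to every distinct pair of weights rather than only adjacent ones; α ∈ [0, 1] is the prescribed orness level):

```latex
\begin{aligned}
\min\ & \delta \\
\text{s.t.}\ & \frac{1}{n-1}\sum_{i=1}^{n}(n-i)\,w_i = \alpha, \\
& \sum_{i=1}^{n} w_i = 1, \\
& -\delta \;\le\; w_i - w_j \;\le\; \delta \qquad (1 \le i < j \le n), \\
& w_i \ge 0 \quad (i = 1, \dots, n).
\end{aligned}
```

This is a linear program in (w, δ), which is what makes a duality-based feasibility argument natural.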

Relevance: 80.00%

Abstract:

In May 2006, the Ministers of Health of all the countries on the African continent, at a special session of the African Union, undertook to institutionalise efficiency monitoring within their respective national health information management systems. The specific objectives of this study were: (i) to assess the technical efficiency of the National Health Systems (NHSs) of African countries in producing male and female life expectancy, and (ii) to assess changes in health productivity over time, with a view to analysing changes in efficiency and changes in technology. The analysis was based on five-year panel data (1999-2003) from all 53 countries of continental Africa. Data Envelopment Analysis (DEA), a non-parametric linear programming approach, was employed to assess technical efficiency, and the Malmquist Total Factor Productivity (MTFP) index was used to analyse efficiency and productivity change over time among the 53 national health systems. The data consisted of two outputs (male and female life expectancy) and two inputs (per capita total health expenditure and adult literacy). The DEA revealed that 49 (92.5%) countries' NHSs were run inefficiently in 1999 and 2000, while 50 (94.3%), 48 (90.6%) and 47 (88.7%) operated inefficiently in 2001, 2002 and 2003 respectively. All 53 national health systems registered improvements in total factor productivity, attributable mainly to technical progress. Fifty-two countries experienced no change in scale efficiency, while thirty (56.6%) countries' national health systems had a Pure Efficiency Change (PEFFCH) index of less than one, signifying that pure efficiency contributed negatively to their productivity change. African countries may need to critically evaluate the utility of institutionalising Malmquist TFP-type analyses to monitor changes in health system economic efficiency and productivity over time.
Keywords: African national health systems; per capita total health expenditure; technical efficiency; scale efficiency; Malmquist indices of productivity change; DEA

Relevance: 80.00%

Abstract:

The generalised transportation problem (GTP) is an extension of the linear Hitchcock transportation problem. However, it lacks the unimodularity property, so a linear programming solution (e.g. by the simplex method) is not guaranteed to be integer. This is a major difference between the GTP and the Hitchcock transportation problem. Although some special algorithms, such as the generalised stepping-stone method, have been developed, they are based on the linear programming model and relax the integer solution requirement of the GTP. This paper proposes a genetic algorithm (GA) to solve the GTP, and a numerical example is presented to show the algorithm and its efficiency.

Relevance: 80.00%

Abstract:

This paper re-assesses three independently developed approaches that are aimed at solving the problem of zero-weights or non-zero slacks in Data Envelopment Analysis (DEA). The methods are weights restricted, non-radial and extended facet DEA models. Weights restricted DEA models are dual to envelopment DEA models with restrictions on the dual variables (DEA weights) aimed at avoiding zero values for those weights; non-radial DEA models are envelopment models which avoid non-zero slacks in the input-output constraints. Finally, extended facet DEA models recognize that only projections on facets of full dimension correspond to well defined rates of substitution/transformation between all inputs/outputs which in turn correspond to non-zero weights in the multiplier version of the DEA model. We demonstrate how these methods are equivalent, not only in their aim but also in the solutions they yield. In addition, we show that the aforementioned methods modify the production frontier by extending existing facets or creating unobserved facets. Further we propose a new approach that uses weight restrictions to extend existing facets. This approach has some advantages in computational terms, because extended facet models normally make use of mixed integer programming models, which are computationally demanding.

Relevance: 80.00%

Abstract:

In this work the solution of a class of capital investment problems is considered within the framework of mathematical programming. On the basis of the net present value criterion, the problems in question are mainly characterized by the fact that the cost of capital is defined as a non-decreasing function of the investment requirements. Capital rationing and some cases of technological dependence are also included, this approach leading to zero-one non-linear programming problems, for which specifically designed solution procedures, supported by a general branch and bound development, are presented. In the context of both this development and the relevant mathematical properties of the previously mentioned zero-one programs, a generalized zero-one model is also discussed. Finally, a variant of the scheme, connected with the sequencing of the search for optimal solutions, is presented as an alternative with reduced storage requirements.
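The flavour of such a zero-one branch-and-bound scheme can be sketched on a toy capital-rationing instance. All data and the simple additive bound below are illustrative assumptions; the thesis's procedures, with a cost of capital depending on the investment level, are more elaborate.

```python
# Hypothetical capital-rationing instance (figures are illustrative):
# choose projects (0-1 decisions) to maximise total net present value
# without exceeding the capital budget.
npv = [12.0, 9.0, 7.5, 5.0]     # net present value per project
outlay = [6.0, 5.0, 4.0, 3.0]   # capital required per project
BUDGET = 10.0

best = {"value": 0.0, "choice": None}

def bound(i, value):
    """Optimistic bound: pretend every remaining project fits the budget."""
    return value + sum(npv[i:])

def branch(i=0, spent=0.0, value=0.0, choice=()):
    if i == len(npv):                      # leaf: a complete 0-1 vector
        if value > best["value"]:
            best["value"], best["choice"] = value, choice
        return
    if bound(i, value) <= best["value"]:
        return                             # prune: cannot beat the incumbent
    if spent + outlay[i] <= BUDGET:        # branch 1: accept project i
        branch(i + 1, spent + outlay[i], value + npv[i], choice + (1,))
    branch(i + 1, spent, value, choice + (0,))   # branch 0: reject project i

branch()
print(best)   # optimum: NPV 19.5 from projects 0 and 2
```

Tighter bounds (e.g. the fractional relaxation over projects sorted by NPV per unit of outlay) prune more of the tree; the additive bound is used here only to keep the sketch short.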

Relevance: 80.00%

Abstract:

This paper explores the use of the optimization procedures in SAS/OR software, with application to the ordered weighted averaging (OWA) operators of decision-making units (DMUs). OWA, originally introduced by Yager (IEEE Trans Syst Man Cybern 18(1):183-190, 1988), has gained much interest among researchers, and many applications have been proposed in areas such as decision making, expert systems, data mining, approximate reasoning, fuzzy systems and control. SAS, on the other hand, is powerful software capable of running various optimization tools, such as linear and non-linear programming with all types of constraints. To facilitate the use of the OWA operator by SAS users, a code was implemented. The SAS macro developed in this paper selects the criteria and alternatives from a SAS dataset and calculates a set of OWA weights. An example is given to illustrate the features of the SAS/OWA software. © Springer-Verlag 2009.
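The OWA aggregation itself is simple to state in code (a generic sketch, not the paper's SAS macro): the arguments are sorted in descending order and then combined with the fixed weight vector, so a weight attaches to a rank position rather than to a particular criterion.

```python
def owa(values, weights):
    """Ordered weighted averaging: sort the arguments in descending
    order, then take the weighted sum with the fixed weight vector."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    ordered = sorted(values, reverse=True)
    return sum(w * v for w, v in zip(weights, ordered))

# With w = (0.5, 0.3, 0.2) the operator emphasises the larger scores:
print(owa([0.6, 0.9, 0.3], [0.5, 0.3, 0.2]))  # 0.9*0.5 + 0.6*0.3 + 0.3*0.2
```

The familiar special cases fall out of the weight vector: w = (1, 0, …, 0) gives the maximum, w = (0, …, 0, 1) the minimum, and uniform weights the plain average.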

Relevance: 80.00%

Abstract:

This research is concerned with the application of operational research techniques in the development of a long-term waste management policy by an English waste disposal authority. The main aspects which have been considered are the estimation of future waste production and the assessment of the effects of proposed systems. Only household and commercial wastes have been dealt with in detail, though suggestions are made for the extension of the effect assessment to cover industrial and other wastes. Similarly, the only effects considered in detail have been costs, but possible extensions are discussed. An important feature of the study is that it was conducted in close collaboration with a waste disposal authority, and so pays more attention to the actual needs of the authority than is usual in such research. A critical examination of previous waste forecasting work leads to the use of simple trend extrapolation methods, with some consideration of seasonal effects. The possibility of relating waste production to other social and economic indicators is discussed. It is concluded that, at present, large uncertainties in predictions are inevitable; waste management systems must therefore be designed to cope with this uncertainty. Linear programming is used to assess the overall costs of proposals. Two alternative linear programming formulations of this problem are used and discussed. The first is a straightforward approach, which has been implemented as an interactive computer program. The second is more sophisticated and represents the behaviour of incineration plants more realistically. Careful attention is paid to the choice of appropriate data and the interpretation of the results. Recommendations are made on methods for immediate use, on the choice of data to be collected for future plans, and on the most useful lines for further research and development.
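A minimal sketch of the kind of trend extrapolation described, assuming a plain ordinary-least-squares line fitted to an annual series (the tonnage figures are invented for illustration):

```python
def linear_trend(series):
    """Fit y = a + b*t by ordinary least squares, with t = 0, 1, 2, ..."""
    n = len(series)
    t_mean = (n - 1) / 2
    y_mean = sum(series) / n
    b = (sum((t - t_mean) * (y - y_mean) for t, y in enumerate(series))
         / sum((t - t_mean) ** 2 for t in range(n)))
    a = y_mean - b * t_mean
    return a, b

def forecast(series, steps):
    """Extrapolate the fitted trend line `steps` periods ahead."""
    a, b = linear_trend(series)
    n = len(series)
    return [a + b * (n + k) for k in range(steps)]

# Hypothetical annual waste tonnages (thousands of tonnes, illustrative only).
waste = [100.0, 104.0, 107.0, 111.0, 115.0]
print(forecast(waste, 3))
```

As the text notes, point forecasts like these carry large uncertainty, which is why the study pairs them with systems designed to cope with that uncertainty rather than treating the trend line as definitive.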

Relevance: 80.00%

Abstract:

This paper provides the most comprehensive evidence to date on whether monetary aggregates are valuable for forecasting US inflation in the early to mid-2000s. We explore a wide range of definitions of money, including different methods of aggregation and different collections of included monetary assets. In our forecasting experiment we use two non-linear techniques, namely recurrent neural networks and kernel recursive least squares regression, techniques that are new to macroeconomics. Recurrent neural networks operate with potentially unbounded input memory, while the kernel regression technique is a finite-memory predictor. The two methodologies compete to find the best-fitting US inflation forecasting models and are then compared to forecasts from a naive random walk model. The best models were non-linear autoregressive models based on kernel methods. Our findings do not provide much support for the usefulness of monetary aggregates in forecasting inflation.

Relevance: 80.00%

Abstract:

Background/Aim - People of south Asian origin have an excessive risk of morbidity and mortality from cardiovascular disease. We examined the effect of ethnicity on known risk factors and analysed the risk of cardiovascular events and mortality in UK south Asian and white European patients with type 2 diabetes over a 2-year period. Methods - A total of 1486 south Asian (SA) and 492 white European (WE) subjects with type 2 diabetes were recruited from 25 general practices in Coventry and Birmingham, UK. Baseline data included clinical history, anthropometry and measurements of traditional risk factors: blood pressure, total cholesterol and HbA1c. Multiple linear regression models were used to examine ethnic differences in individual risk factors. Ten-year cardiovascular risk was estimated using the Framingham and UKPDS equations. All subjects were followed up for 2 years, and cardiovascular (CVD) events and mortality in the two groups were compared. Findings - Significant differences were noted in risk profiles between the two groups. After adjustment for clustering and confounding, a significant ethnicity effect remained only for higher HbA1c (0.50 [0.22 to 0.77]; P = 0.0004) and lower HDL (-0.09 [-0.17 to -0.01]; P = 0.0266). Baseline CVD history was predictive of CVD events during follow-up for SA (P