995 results for Bootstrap weights approach


Relevance:

30.00%

Publisher:

Abstract:

In multi-attribute utility theory, it is often not easy to elicit precise values for the scaling weights representing the relative importance of criteria. A very widespread approach is to gather incomplete information. A recent approach for dealing with such situations is to use information about each alternative's intensity of dominance, known as dominance measuring methods. Different dominance measuring methods have been proposed, and simulation studies have been carried out to compare these methods with each other and with other approaches, but only when ordinal information about weights is available. In this paper, we use Monte Carlo simulation techniques to analyse the performance of such methods and to adapt them to deal with weight intervals, weights fitting independent normal probability distributions, or weights represented by fuzzy numbers. Moreover, the performance of dominance measuring methods is also compared with a widely used methodology for dealing with incomplete information on weights, stochastic multicriteria acceptability analysis (SMAA). SMAA is based on exploring the weight space to describe the evaluations that would make each alternative the preferred one.
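
A minimal sketch of the kind of weight-space exploration SMAA relies on, assuming an additive value model, illustrative alternative values and interval bounds on the weights (none of which are taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

values = np.array([            # rows: alternatives, cols: criteria (already scaled to [0, 1])
    [0.9, 0.3, 0.6],
    [0.5, 0.8, 0.4],
    [0.6, 0.6, 0.7],
])
lower = np.array([0.2, 0.1, 0.2])   # assumed interval constraints on the weights
upper = np.array([0.6, 0.5, 0.6])

def sample_weights():
    """Rejection-sample a weight vector that sums to one and lies in the intervals."""
    while True:
        w = rng.dirichlet(np.ones(len(lower)))   # uniform on the simplex
        if np.all(w >= lower) and np.all(w <= upper):
            return w

n_iter = 10_000
wins = np.zeros(values.shape[0])
for _ in range(n_iter):
    w = sample_weights()
    overall = values @ w                 # additive value of each alternative
    wins[np.argmax(overall)] += 1

acceptability = wins / n_iter            # share of the weight space where each alternative ranks first
print(acceptability)
```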

Relevance:

30.00%

Publisher:

Abstract:

The objective of this study was to propose a multi-criteria optimization and decision-making technique to solve food engineering problems. This technique was demonstrated using experimental data obtained on osmotic dehydration of carrot cubes in a sodium chloride solution. The Aggregating Functions Approach, the Adaptive Random Search Algorithm, and the Penalty Functions Approach were used in this study to compute the initial set of non-dominated or Pareto-optimal solutions. Multiple non-linear regression analysis was performed on a set of experimental data in order to obtain particular multi-objective functions (responses), namely water loss, solute gain, rehydration ratio, three different colour criteria of the rehydrated product, and sensory evaluation (organoleptic quality). Two multi-criteria decision-making approaches, the Analytic Hierarchy Process (AHP) and the Tabular Method (TM), were used simultaneously to choose the best alternative among the set of non-dominated solutions. The multi-criteria optimization and decision-making technique proposed in this study can facilitate the assessment of criteria weights, giving rise to a fairer, more consistent, and adequate final compromise solution or food process. This technique can be useful to food scientists in research and education, as well as to engineers involved in the improvement of a variety of food engineering processes.
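
As a companion to the optimization step, a small sketch of filtering a table of responses down to its non-dominated (Pareto-optimal) set; the response values and the choice of which objectives are maximised or minimised are assumptions made for the example:

```python
import numpy as np

# rows: candidate process conditions; columns: water loss, solute gain, rehydration ratio
responses = np.array([
    [55.0, 8.2, 4.1],
    [60.0, 9.5, 3.8],
    [58.0, 7.9, 4.3],
    [52.0, 8.0, 3.9],
])
maximise = np.array([True, False, True])   # e.g. maximise water loss, minimise solute gain

# Convert everything to a "larger is better" orientation so dominance checks are uniform.
scores = np.where(maximise, responses, -responses)

def is_dominated(i):
    others = np.delete(scores, i, axis=0)
    return np.any(np.all(others >= scores[i], axis=1) &
                  np.any(others > scores[i], axis=1))

pareto = [i for i in range(len(scores)) if not is_dominated(i)]
print("non-dominated alternatives:", pareto)
```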

Relevance:

30.00%

Publisher:

Abstract:

This paper introduces the rank-dependent quality-adjusted life-years (QALY) model, a new method to aggregate QALYs in economic evaluations of health care. The rank-dependent QALY model permits the formalization of influential concepts of equity in the allocation of health care, such as the fair innings approach, and it includes as special cases many of the social welfare functions that have been proposed in the literature. An important advantage of the rank-dependent QALY model is that it offers a straightforward procedure to estimate equity weights for QALYs. We characterize the rank-dependent QALY model and argue that its central condition has normative appeal. (C) 2003 Elsevier B.V. All rights reserved.
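
A toy sketch of rank-dependent aggregation of QALY gains, assuming a simple decreasing rank-weighting scheme and made-up QALY values rather than the characterisation given in the paper:

```python
import numpy as np

qalys = np.array([4.0, 9.5, 7.2, 2.1])          # QALY gains of four individuals
order = np.argsort(qalys)                       # rank individuals, worst-off first
ranked = qalys[order]

n = len(qalys)
ranks = np.arange(1, n + 1)
equity_weights = (n - ranks + 1).astype(float)  # assumed decreasing rank weights (equity concern)
equity_weights /= equity_weights.sum()          # normalise so the weights sum to one

social_value = float(equity_weights @ ranked)   # rank-dependent social value of the allocation
unweighted = qalys.mean()                       # utilitarian benchmark: plain average
print(social_value, unweighted)
```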

Relevance:

30.00%

Publisher:

Abstract:

This article presents an array antenna with beam-steering capability in azimuth over a wide frequency band using real-valued weighting coefficients that can be realized in practice by amplifiers or attenuators. The described beamforming scheme relies on a 2D (instead of 1D) array structure in order to make sure that there are enough degrees of freedom to realize a given radiation pattern in both the angular and frequency domains. In the presented approach, weights are determined using an inverse discrete Fourier transform (IDFT) technique while neglecting the mutual coupling between array elements. Because of the presence of mutual coupling, the actual array produces a radiation pattern with increased side-lobe levels. In order to counter this effect, the design aims to realize the initial radiation pattern with a lower side-lobe level. This strategy is demonstrated in the design example of a 4 x 4 element array. (C) 2005 Wiley Periodicals, Inc.
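
A highly simplified sketch of IDFT-based weight synthesis for a small planar array, assuming a crude broadside target pattern and ignoring mutual coupling, which the paper addresses explicitly:

```python
import numpy as np

n = 4                                        # 4 x 4 element array (illustrative size)
desired = np.zeros((n, n), dtype=complex)
desired[0, 0] = 1.0                          # crude target: a single broadside beam

weights = np.real(np.fft.ifft2(desired))     # IDFT gives real-valued element excitations

# Re-evaluate the pattern on a denser grid; level_db can be inspected for side-lobe levels.
pattern = np.fft.fft2(weights, s=(64, 64))
level_db = 20 * np.log10(np.abs(pattern) / np.abs(pattern).max() + 1e-12)
print(weights)
print("pattern grid for inspection:", level_db.shape)
```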

Relevance:

30.00%

Publisher:

Abstract:

This work has demonstrated, for the first time, that a single RAFT agent (i.e., a difunctional one) can be used in conjunction with a radical initiator to obtain a desired Mn and PDI with controlled rates of polymerization. Simulations were used not only to verify the model but also to provide us with a predictive tool to generate other MWDs. It was also shown that all the MWDs prepared in this work could be translated to higher molecular weights through chain extension experiments with little or no compromise in the control of end group functionality. The ratio of monofunctional to difunctional S=C(CH2Ph)S- end groups, XPX and XP (where X = S=C(CH2Ph)S-), can be controlled by simply changing the concentration of the initiator, AIBN. Importantly, the amount of dead polymer is extremely low and fulfils the criterion suggested by Szwarc (Nature, 1956) that, to meet living requirements, nonfunctional polymeric species formed by side reactions in the process should be undetectable by analytical techniques. In addition, this novel methodology will allow AB, ABA, and statistical multiblock copolymers with predetermined block ratios to be synthesized in a one-pot reaction.

Relevance:

30.00%

Publisher:

Abstract:

This paper extends the minimax disparity approach for determining ordered weighted averaging (OWA) operator weights based on linear programming. It introduces a minimax disparity formulation over all distinct pairs of weights and uses the duality of linear programming to prove the feasibility of the extended OWA operator weights model. The paper finishes with an open problem. © 2006 Elsevier Ltd. All rights reserved.
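
A hedged sketch of an extended minimax disparity model for OWA weights, minimising the largest absolute difference between any pair of weights under the usual normalisation and orness constraints; the orness level and problem size are arbitrary choices, and the formulation is not claimed to be the paper's exact model:

```python
import numpy as np
from scipy.optimize import linprog
from itertools import combinations

n, orness = 5, 0.7
# decision vector: [w_1, ..., w_n, delta]; objective: minimise delta
c = np.zeros(n + 1)
c[-1] = 1.0

# equality constraints: the weights sum to one and attain the required orness level
A_eq = np.zeros((2, n + 1))
A_eq[0, :n] = 1.0
A_eq[1, :n] = [(n - i) / (n - 1) for i in range(1, n + 1)]
b_eq = [1.0, orness]

# |w_i - w_j| <= delta for every distinct pair (i, j)
rows = []
for i, j in combinations(range(n), 2):
    r = np.zeros(n + 1); r[i], r[j], r[-1] = 1.0, -1.0, -1.0
    rows.append(r)
    r = np.zeros(n + 1); r[i], r[j], r[-1] = -1.0, 1.0, -1.0
    rows.append(r)
A_ub = np.vstack(rows)
b_ub = np.zeros(len(rows))

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, 1)] * n + [(0, None)])
print(res.x[:n])       # the resulting OWA weight vector
```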

Relevance:

30.00%

Publisher:

Abstract:

This paper re-assesses three independently developed approaches that are aimed at solving the problem of zero weights or non-zero slacks in Data Envelopment Analysis (DEA). The methods are weights restricted, non-radial and extended facet DEA models. Weights restricted DEA models are dual to envelopment DEA models with restrictions on the dual variables (DEA weights) aimed at avoiding zero values for those weights; non-radial DEA models are envelopment models which avoid non-zero slacks in the input-output constraints. Finally, extended facet DEA models recognize that only projections on facets of full dimension correspond to well defined rates of substitution/transformation between all inputs/outputs, which in turn correspond to non-zero weights in the multiplier version of the DEA model. We demonstrate how these methods are equivalent, not only in their aim but also in the solutions they yield. In addition, we show that the aforementioned methods modify the production frontier by extending existing facets or creating unobserved facets. Further, we propose a new approach that uses weight restrictions to extend existing facets. This approach has some advantages in computational terms, because extended facet models normally make use of mixed integer programming models, which are computationally demanding.
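
To make the idea of restricting DEA weights concrete, a sketch of an input-oriented CCR multiplier model with a simple assurance-region restriction; the data and the particular restriction (v1/v2 >= 0.5) are invented for the example and are not taken from the paper:

```python
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 4.0], [3.0, 2.0], [4.0, 5.0]])   # inputs, one row per DMU
Y = np.array([[1.0], [1.0], [1.0]])                   # single unit output
n_dmu, m, s = X.shape[0], X.shape[1], Y.shape[1]

def ccr_multiplier(o, ratio_lb=0.5):
    # decision vector: [u_1..u_s, v_1..v_m]; maximise u'y_o  <=>  minimise -u'y_o
    c = np.concatenate([-Y[o], np.zeros(m)])
    # normalisation: v'x_o = 1
    A_eq = np.concatenate([np.zeros(s), X[o]]).reshape(1, -1)
    b_eq = [1.0]
    # efficiency constraints: u'y_j - v'x_j <= 0 for every DMU j
    A_ub = np.hstack([Y, -X])
    b_ub = np.zeros(n_dmu)
    # weight restriction v_1 / v_2 >= ratio_lb  <=>  -v_1 + ratio_lb * v_2 <= 0
    wr = np.zeros(s + m); wr[s] = -1.0; wr[s + 1] = ratio_lb
    A_ub = np.vstack([A_ub, wr])
    b_ub = np.append(b_ub, 0.0)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (s + m))
    return -res.fun, res.x

for o in range(n_dmu):
    eff, weights = ccr_multiplier(o)
    print(f"DMU {o}: efficiency = {eff:.3f}, weights = {np.round(weights, 3)}")
```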

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this study is to develop econometric models to better understand the economic factors affecting inbound tourist flows from each of six origin countries that contribute to Hong Kong's international tourism demand. To this end, we test alternative cointegration and error correction approaches to examine the economic determinants of tourist flows to Hong Kong, and to produce accurate econometric forecasts of inbound tourism demand. Our empirical findings show that permanent income is the most significant determinant of tourism demand in all models. The variables of own price, weighted substitute prices, trade volume, the share price index (as an indicator of changes in wealth in origin countries), and a dummy variable representing the Beijing incident (1989) are also found to be important determinants for some origin countries. The average long-run income and own price elasticities were measured at 2.66 and -1.02, respectively. It was hypothesised that permanent income is a better explanatory variable of long-haul tourism demand than current income. A novel approach (a grid search process) was used to empirically derive the weights to be attached to the lagged income variable for estimating permanent income. The results indicate that permanent income, estimated with empirically determined and relatively small weighting factors, was capable of producing better results than the current income variable in explaining long-haul tourism demand. This finding suggests that the use of current income in previous empirical tourism demand studies may have produced inaccurate results. The share price index, as a measure of wealth, was also found to be significant in two models. Studies of tourism demand rarely include wealth as an explanatory variable when forecasting long-haul tourism demand; however, finding a satisfactory proxy for wealth common to different countries is problematic. This study indicates that error correction models (ECMs) based on the Engle-Granger (1987) approach produce more accurate forecasts than ECMs based on the Pesaran and Shin (1998) and Johansen (1988, 1991, 1995) approaches for all of the long-haul markets and Japan. Overall, ECMs produce better forecasts than the OLS, ARIMA and naïve models, indicating the superiority of the cointegration approach for tourism demand forecasting. The results show that permanent income is the most important explanatory variable for tourism demand from all countries, but there are substantial variations between countries, with the long-run income elasticity ranging between 1.1 for the U.S. and 5.3 for the U.K. Price is the next most important variable, with long-run elasticities ranging between -0.8 for Japan and -1.3 for Germany and short-run elasticities ranging between -0.14 for Germany and -0.7 for Taiwan. The fastest growing market is Mainland China. The findings have implications for policies and strategies on investment, marketing promotion and pricing.
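
A hedged sketch of the kind of grid search used to turn lagged income into a permanent income proxy, assuming a geometric weighting scheme, synthetic data and a plain OLS fit rather than the paper's exact specification:

```python
import numpy as np

rng = np.random.default_rng(1)
T, max_lag = 80, 4
income = np.cumsum(rng.normal(0.02, 0.05, T)) + 10.0       # synthetic log income series
arrivals = 2.5 * income + rng.normal(0.0, 0.3, T)           # synthetic log tourist arrivals

def permanent_income(decay):
    """Weighted sum of current and lagged income with geometric weights."""
    w = decay ** np.arange(max_lag + 1)
    w /= w.sum()
    lags = np.column_stack([np.roll(income, k) for k in range(max_lag + 1)])
    return lags[max_lag:] @ w                                # drop rows whose lags wrapped around

best = None
for decay in np.linspace(0.05, 0.95, 19):                    # the grid over the decay factor
    x = permanent_income(decay)
    y = arrivals[max_lag:]
    X = np.column_stack([np.ones_like(x), x])
    beta, resid, *_ = np.linalg.lstsq(X, y, rcond=None)
    ssr = float(resid[0]) if resid.size else float(((y - X @ beta) ** 2).sum())
    if best is None or ssr < best[1]:
        best = (decay, ssr)
print("best decay factor:", best[0])
```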

Relevance:

30.00%

Publisher:

Abstract:

Zambia and many other countries in Sub-Saharan Africa face a key challenge of sustaining high levels of coverage of AIDS treatment under prospects of dwindling global resources for HIV/AIDS treatment. Policy debate on HIV/AIDS is increasingly focused on efficiency in the use of available resources. In this chapter, we apply Data Envelopment Analysis (DEA) to estimate the short-term technical efficiency of 34 HIV/AIDS treatment facilities in Zambia. The data consist of input variables such as human resources, medical equipment, building space, drugs, medical supplies, and other materials used in providing HIV/AIDS treatment. Two main outputs, namely the numbers of ART-years (anti-retroviral therapy years) and pre-ART-years, are included in the model. Results show the mean technical efficiency score to be 83%, with great variability in efficiency scores across the facilities. Scale inefficiency is also shown to be significant. About half of the facilities were on the efficiency frontier. We also construct bootstrap confidence intervals around the efficiency scores.
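
A naive illustration of bootstrapping confidence intervals for DEA efficiency scores by resampling the reference set of facilities; the data are synthetic, and this simple scheme is only a stand-in for the more refined smoothed bootstrap usually applied to DEA:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(2)
n, m, s = 12, 2, 1
X = rng.uniform(1.0, 5.0, (n, m))          # inputs (e.g. staff, building space)
Y = rng.uniform(1.0, 3.0, (n, s))          # outputs (e.g. ART-years)

def ccr_score(x_o, y_o, X_ref, Y_ref):
    """Input-oriented CCR envelopment model: minimise theta."""
    k = X_ref.shape[0]
    c = np.concatenate([[1.0], np.zeros(k)])               # [theta, lambda_1..lambda_k]
    rows, rhs = [], []
    for i in range(m):                                      # sum_j lambda_j x_ij <= theta * x_oi
        rows.append(np.concatenate([[-x_o[i]], X_ref[:, i]])); rhs.append(0.0)
    for r in range(s):                                      # sum_j lambda_j y_rj >= y_or
        rows.append(np.concatenate([[0.0], -Y_ref[:, r]])); rhs.append(-y_o[r])
    res = linprog(c, A_ub=np.array(rows), b_ub=rhs,
                  bounds=[(0, None)] * (k + 1))
    return res.fun

scores = np.array([ccr_score(X[o], Y[o], X, Y) for o in range(n)])

B = 200
boot = np.empty((B, n))
for b in range(B):
    idx = rng.integers(0, n, n)                             # resample facilities with replacement
    for o in range(n):
        boot[b, o] = ccr_score(X[o], Y[o], X[idx], Y[idx])
ci = np.percentile(boot, [2.5, 97.5], axis=0)               # percentile interval per facility
print(np.round(np.column_stack([scores, ci.T]), 3))
```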

Relevance:

30.00%

Publisher:

Abstract:

This paper seeks to advance the theory and practice of the dynamics of complex networks in relation to direct and indirect citations. It applies social network analysis (SNA) and the ordered weighted averaging operator (OWA) to study a patent citation network. So far, SNA studies investigating long chains of patent citations have rarely been undertaken, and the importance of a node in a network has been associated mostly with its number of direct ties. In this research, OWA is used to analyse complex networks, assess the role of indirect ties, and provide guidance to reduce complexity for decision makers and analysts. An empirical example of a set of European patents published in 2000 in the renewable energy industry is provided to show the usefulness of the proposed approach for the preference ranking of patent citations.
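
A small sketch of scoring patents by aggregating direct and indirect citation counts with an OWA operator; the toy citation graph, the depth cut-off and the OWA weight vector are assumptions for illustration only:

```python
import networkx as nx
import numpy as np

G = nx.DiGraph()                                   # edge (a, b): patent a cites patent b
G.add_edges_from([("P1", "P0"), ("P2", "P0"), ("P3", "P1"),
                  ("P4", "P1"), ("P5", "P3"), ("P6", "P2")])

max_depth = 3
owa_weights = np.array([0.6, 0.3, 0.1])            # more emphasis on larger counts after reordering

def citation_counts(node):
    """Number of citing patents at each distance 1..max_depth (direct, then indirect)."""
    lengths = nx.single_source_shortest_path_length(G.reverse(copy=False), node,
                                                    cutoff=max_depth)
    return np.array([sum(1 for d in lengths.values() if d == k)
                     for k in range(1, max_depth + 1)], dtype=float)

scores = {}
for node in G.nodes:
    counts = np.sort(citation_counts(node))[::-1]   # OWA: reorder the arguments descending
    scores[node] = float(owa_weights @ counts)

for node, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(node, score)
```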

Relevance:

30.00%

Publisher:

Abstract:

An effective aperture approach is used as a tool for the analysis and parameter optimization of the most common ultrasound imaging systems - phased array systems, compounding systems and synthetic aperture imaging systems. Both characteristics of an imaging system, the effective aperture function and the corresponding two-way radiation pattern, provide information about two of the most important parameters of the images produced by an ultrasound system - lateral resolution and contrast. Therefore, at the design stage, optimization of the effective aperture function leads to an optimal choice of the imaging-system parameters that influence the lateral resolution and contrast of the images it produces. It is shown that the effective aperture approach can be used for the optimization of a sparse synthetic transmit aperture (STA) imaging system. A new two-stage algorithm is proposed for optimizing both the positions of the transmit elements and the weights of the receive elements. The proposed system employs a 64-element array with only four active elements used during transmit. The numerical results show that Hamming apodization gives the best compromise between the contrast of images and the lateral resolution.
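
A minimal sketch of the effective aperture idea: the effective aperture is the convolution of the transmit and receive aperture functions, and its Fourier transform approximates the two-way radiation pattern. The sparse transmit positions and the Hamming receive apodization below are illustrative values, not the optimized design from the paper:

```python
import numpy as np

n = 64
transmit = np.zeros(n)
transmit[[0, 21, 42, 63]] = 1.0                  # 4 active transmit elements (sparse, assumed positions)
receive = np.hamming(n)                          # Hamming apodization weights on receive

effective = np.convolve(transmit, receive)       # effective aperture function
pattern = np.fft.fftshift(np.fft.fft(effective, 1024))
pattern_db = 20 * np.log10(np.abs(pattern) / np.abs(pattern).max() + 1e-12)

print(len(effective), "effective-aperture samples; inspect pattern_db for side-lobe levels")
```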

Relevance:

30.00%

Publisher:

Abstract:

* Supported by INTAS 2000-626, INTAS YSF 03-55-1969, INTAS INNO 182, and TIC 2003-09319-c03-03.

Relevance:

30.00%

Publisher:

Abstract:

Well-prepared, adaptive and sustainably developing specialists are an important competitive advantage, but also one of the main challenges for businesses. One option available to the education system for creating and developing staff adequate to these needs is the development of projects with topics drawn from the real economy ("Practical Projects"). Objective assessment is an essential driver and motivator, and it is based on a system of well-chosen, well-defined and specific criteria and indicators. An approach to a more objective evaluation of practical projects is to find more objective weights for the criteria. A natural and reasonable approach is to accumulate the opinions of proven experts and subsequently derive the weights from the accumulated data. The preparation and conduct of a survey among recognized experts in the field of project-based learning in mathematics, informatics and information technologies is described. Processing the accumulated data with AHP allowed us to objectively determine the weights of the evaluation criteria and hence to achieve the desired objectivity. ACM Computing Classification System (1998): K.3.2.
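
A sketch of deriving criteria weights from an aggregated pairwise-comparison matrix with AHP, including the standard consistency check; the comparison matrix is a made-up aggregate of expert judgements, not the survey data described above:

```python
import numpy as np

A = np.array([                    # reciprocal pairwise-comparison matrix (Saaty scale)
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                          # normalised priority vector (criteria weights)

n = A.shape[0]
lambda_max = eigvals.real[k]
ci = (lambda_max - n) / (n - 1)                   # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]               # random index (Saaty's tabulated values)
cr = ci / ri                                      # consistency ratio, acceptable below ~0.10
print(np.round(weights, 3), round(cr, 3))
```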

Relevance:

30.00%

Publisher:

Abstract:

One of the major challenges in measuring efficiency in terms of resources and outcomes is the assessment of the evolution of units over time. Although Data Envelopment Analysis (DEA) has been applied to time series datasets, DEA models, by construction, form the reference set for inefficient units (lambda values) based on their distance from the efficient frontier, that is, in a spatial manner. However, when dealing with temporal datasets, the proximity in time between units should also be taken into account, since it reflects the structural resemblance among time periods of a unit that evolves. In this paper, we propose a two-stage spatiotemporal DEA approach, which captures both the spatial and the temporal dimension through a multi-objective programming model. In the first stage, DEA is solved iteratively, allowing each unit only previous DMUs as peers in its reference set. In the second stage, the lambda values derived from the first stage are fed into a multi-objective mixed-integer linear programming model, which filters peers in the reference set based on weights assigned to the spatial and temporal dimensions. The approach is demonstrated on a real-world example drawn from software development.

Relevance:

30.00%

Publisher:

Abstract:

There is a long debate (going back to Keynes) about how to interpret the concept of probability in economics, in business decisions, and in finance. Iván Bélyácz suggested that the Black–Scholes–Merton analysis of financial derivatives has a contribution to this risk vs. uncertainty debate. This article tries to interpret this suggestion from the viewpoint of traded options, real options, the Arrow–Debreu model, the Heath–Jarrow–Morton model, and the insurance business. The article suggests making a clear distinction and using different naming
● when the frequentist approach and statistics are relevant,
● when we just use consequent relative weights during no-arbitrage pricing, and these weights are merely interpreted as probabilities,
● when we simply lack the necessary information, and there is a basic uncertainty in the business decision-making process.
The paper suggests making a sharp distinction between financial derivatives used for market risk management and credit risk type derivatives (CDO, CDS, etc.) in the re-regulation process of the financial markets.
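
A toy illustration of relative weights used in no-arbitrage pricing being interpreted as probabilities: the risk-neutral weight in a one-period binomial model, with all numbers invented for the example:

```python
import math

s0, u, d = 100.0, 1.2, 0.9         # current price and assumed up/down factors
r, dt = 0.05, 1.0                   # risk-free rate and period length
strike = 105.0

q = (math.exp(r * dt) - d) / (u - d)             # risk-neutral "weight" on the up move
payoff_up = max(s0 * u - strike, 0.0)            # call payoffs in the two states
payoff_down = max(s0 * d - strike, 0.0)

price = math.exp(-r * dt) * (q * payoff_up + (1 - q) * payoff_down)
print(round(q, 4), round(price, 4))              # q is a pricing weight, not an empirical frequency
```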