995 results for Decision procedure


Relevance: 30.00%

Abstract:

This paper presents a procedure that allows us to determine the preference structures (PS) associated with each of the different groups of actors that can be identified in a group decision making problem with a large number of individuals. To that end, it makes use of the Analytic Hierarchy Process (AHP) (Saaty, 1980) as the technique to solve discrete multicriteria decision making problems. This technique permits the resolution of multicriteria, multienvironment and multiactor problems in which subjective aspects and uncertainty have been incorporated into the model, constructing ratio scales corresponding to the priorities relative to the elements being compared, normalised in a distributive manner (Σwᵢ = 1). On the basis of the individuals' priorities we identify different clusters of decision makers and, for each of these, the associated preference structure, using, to that end, tools analogous to those of Multidimensional Scaling. The resulting PS will be employed to extract knowledge for the subsequent negotiation processes and, should it be necessary, to determine the relative importance of the alternatives being compared using any one of the existing procedures.
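As background to the AHP step described above, the following minimal sketch (not from the paper; the function and data are illustrative) shows the standard eigenvector prioritisation of a pairwise comparison matrix, with the distributive normalisation Σwᵢ = 1 mentioned in the abstract:

```python
import numpy as np

def ahp_priorities(pairwise: np.ndarray) -> np.ndarray:
    """Standard AHP prioritisation: principal right eigenvector of a
    positive reciprocal pairwise-comparison matrix, normalised to sum 1."""
    eigvals, eigvecs = np.linalg.eig(pairwise)
    principal = eigvecs[:, np.argmax(eigvals.real)].real
    weights = np.abs(principal)
    return weights / weights.sum()  # distributive normalisation: sum(w) == 1

# Example: three alternatives compared on one criterion (Saaty 1-9 scale).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
print(ahp_priorities(A))  # approx. [0.648, 0.230, 0.122]
```

Clustering decision makers on such priority vectors, as the paper describes, can then proceed with any standard distance-based method.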

Relevance: 30.00%

Abstract:

This special report analyses legislative activity in the European Union and coalition formation in the European Parliament (EP) during the first half of the 7th legislative term, 2009-14. Co-decision is now the ordinary legislative procedure, and not in name only: it was deployed on 90% of new proposals in 2010 and 86% in 2011, which suggests that the EP is now more influential than ever. There are differences in the degree of empowerment across committees, however. This report looks at the legislative workload of selected committees as an indicator of change in their influence, identifying which of them won and which lost out in terms of the quantity and type of legislation they tackle.

Relevance: 30.00%

Abstract:

The purpose of this paper is to present two multi-criteria decision-making models, an Analytic Hierarchy Process (AHP) model and an Analytic Network Process (ANP) model, for the assessment of deconstruction plans, and to compare the two models through an experimental case study. Deconstruction planning is under pressure to reduce operation costs, adverse environmental impacts and duration, while improving productivity and safety in accordance with structure characteristics, site conditions and past experience. To achieve these targets in deconstruction projects, there is a pressing need to develop a formal procedure for contractors to select the most appropriate deconstruction plan. Because a number of factors influence the selection of deconstruction techniques, engineers need effective tools to conduct the selection process. In this regard, multi-criteria decision-making methods such as AHP have been adopted to support deconstruction technique selection in previous research, in which it has been shown that the AHP method can help decision-makers make informed decisions on deconstruction technique selection based on a sound technical framework. In this paper, the authors present the application and comparison of two decision-making models, the AHP model and the ANP model, for deconstruction plan assessment. The paper concludes that both AHP and ANP are viable and capable tools for deconstruction plan assessment under the same set of evaluation criteria. However, although the ANP can measure relationships among the selection criteria and their sub-criteria, which are normally ignored in the AHP, the authors also indicate that whether the ANP model can provide a more accurate result should be examined in further research.
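For readers unfamiliar with how ANP differs computationally from AHP, here is a minimal sketch (illustrative, not from the paper) of the limit-supermatrix step that lets ANP capture interdependencies among criteria; W is assumed to be a column-stochastic weighted supermatrix:

```python
import numpy as np

def anp_limit_priorities(W: np.ndarray, tol: float = 1e-10,
                         max_iter: int = 10_000) -> np.ndarray:
    """Raise a column-stochastic weighted supermatrix to successive powers
    until it stabilises; any column of the limit matrix then holds the
    global priorities. (Cyclic structures need Cesaro averaging instead.)"""
    M = W.copy()
    for _ in range(max_iter):
        M_next = M @ W
        if np.max(np.abs(M_next - M)) < tol:
            return M_next[:, 0]
        M = M_next
    raise RuntimeError("supermatrix powers did not converge")

# Example: three interdependent elements (each column sums to 1).
W = np.array([[0.0, 0.6, 0.4],
              [0.7, 0.0, 0.6],
              [0.3, 0.4, 0.0]])
print(anp_limit_priorities(W))
```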

Relevance: 30.00%

Abstract:

The rules and the principles of the common law are formed from the cases decided in courts of common law. The unique nature of the evolution of the common law has long been the subject of study. Less frequently studied has been the impact of procedure upon the development of substantive law. This paper examines how the procedures applicable to the trial of a case can affect the substance of the resulting decision. The focus of the examination is the decision in Bell v Lever Bros [1932] AC 161. While the case has long been regarded as a leading, albeit confusing, contract law case, it is also greatly concerned with the conduct of litigation. This paper argues that the substantive decision was largely determined by the civil procedure available. Different rules of civil procedure, it is suggested, would have resulted in a better decision in the English law of contract.

Relevance: 30.00%

Abstract:

The aim of this work was to design a set of rules for levodopa infusion dose adjustment in Parkinson's disease based on simulation experiments. Using a simulator, the optimal infusion dose under different conditions was calculated. There are seven conditions (-3 to +3) in a rating scale for Parkinson's disease patients. By finding the mean of the differences between each condition and the optimal dose, two sets of rules were designed and then refined through repeated testing. Their usefulness for optimising the titration procedure for new infusion patients by rule-based reasoning was investigated. The results show that the new rules reduced both the number of steps and the error in finding the optimal dose. Finally, in the simulation experiments the new rules predicted the dose well on each single occasion for the majority of patients.
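A minimal sketch of rule-based dose titration of the kind described (the rule values and the proportional-adjustment form below are hypothetical illustrations, not the rules derived in the paper):

```python
# Hypothetical rule table: maps the seven rating-scale conditions
# (-3 = severe parkinsonism ... +3 = severe dyskinesia) to a relative
# dose adjustment. Values are illustrative only.
DOSE_RULES = {-3: +0.30, -2: +0.20, -1: +0.10,
               0:  0.00,
              +1: -0.10, +2: -0.20, +3: -0.30}

def titrate(current_dose: float, condition: int) -> float:
    """One titration step: adjust the infusion dose according to the
    patient's observed rating-scale condition."""
    if condition not in DOSE_RULES:
        raise ValueError("condition must be an integer in [-3, 3]")
    return current_dose * (1.0 + DOSE_RULES[condition])

# Example: a patient rated -2 (underdosed) on a 1.2 ml/h infusion.
print(titrate(1.2, -2))  # -> 1.44
```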

Relevance: 30.00%

Abstract:

In this dissertation, different ways of combining neural predictive models or neural-based forecasts are discussed. The proposed approaches mostly consider Gaussian radial basis function networks, which can be efficiently identified and estimated through recursive/adaptive methods. Two different ways of combining are explored to obtain a final estimate, model mixing and model synthesis, with the aim of achieving improvements in terms of both efficiency and effectiveness. In the context of model mixing, the usual framework for linearly combining estimates from different models is extended to deal with the case where the forecast errors from those models are correlated. In the context of model synthesis, and to address the problems raised by heavily nonstationary time series, we propose hybrid dynamic models for more advanced time series forecasting, composed of a dynamic trend regressive model (or even a dynamic harmonic regressive model) and a Gaussian radial basis function network. Additionally, using the model mixing procedure, two approaches to decision-making from forecasting models are discussed and compared: either inferring decisions from combined predictive estimates, or combining prescriptive solutions derived from different forecasting models. Finally, the application of some of the models and methods proposed previously is illustrated with two case studies, based on time series from finance and from tourism.
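As an illustration of the model-mixing step under correlated forecast errors, the classical minimum-variance combination weights can be computed from the error covariance matrix Σ as w = Σ⁻¹1 / (1ᵀΣ⁻¹1); the sketch below (a textbook result, not code from the dissertation) shows this in a few lines:

```python
import numpy as np

def combination_weights(error_cov: np.ndarray) -> np.ndarray:
    """Minimum-variance weights for linearly combining unbiased forecasts
    whose errors have covariance matrix error_cov (weights sum to 1)."""
    ones = np.ones(error_cov.shape[0])
    w = np.linalg.solve(error_cov, ones)
    return w / (ones @ w)

# Example: two forecasters with positively correlated errors.
Sigma = np.array([[1.0, 0.6],
                  [0.6, 2.0]])
w = combination_weights(Sigma)          # approx. [0.778, 0.222]
combined = w @ np.array([102.0, 98.0])  # combined point forecast
```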

Relevance: 30.00%

Abstract:

The aim of this study was to develop a procedure based on the Gompertz function to determine the efficiency of amino acid utilization. The procedure was applied to determine the efficiency of utilization of dietary lysine, methionine+cystine and threonine by growing pullets and, based on these efficiencies, the amino acid requirements for the growth phase of the birds were estimated. The Gompertz function was fitted to data on feed intake, body weight, feather-free body protein weight and feather protein weight from four strains of laying hens in the growth phase. The rates of consumption and daily protein deposition (PD) were calculated. Amino acid deposition was obtained by multiplying the PD by the amino acid concentration in feather protein and feather-free body protein. The results showed that the efficiency of amino acid utilization decreased with maturity and, conversely, that the requirement per kg of weight gain increased proportionally. The procedure based on the Gompertz function proved suitable for evaluating the efficiency of amino acid utilization and can be a useful tool for diagnosing the effectiveness of nutritional management, aiding related decision-making.
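A minimal sketch of the curve-fitting step, using a common parameterisation of the Gompertz function, W(t) = Wₘ·exp(−exp(−b·(t − t*))); the data values and starting parameters below are illustrative, not the study's:

```python
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, w_max, b, t_star):
    """Gompertz growth curve: mature weight w_max, rate b, inflection t_star."""
    return w_max * np.exp(-np.exp(-b * (t - t_star)))

# Illustrative body-weight data (age in days, weight in g).
t_obs = np.array([7, 21, 35, 49, 63, 84, 105, 126], dtype=float)
w_obs = np.array([60, 180, 420, 700, 950, 1250, 1420, 1500], dtype=float)

params, _ = curve_fit(gompertz, t_obs, w_obs, p0=[1600.0, 0.03, 50.0])
w_max, b, t_star = params

# The daily deposition rate is the derivative of the fitted curve:
rate = lambda t: b * np.exp(-b * (t - t_star)) * gompertz(t, w_max, b, t_star)
```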

Relevance: 30.00%

Abstract:

Many engineering sectors are challenged by multi-objective optimization problems. Although the idea behind these problems is simple and well established, implementing any procedure to solve them is not a trivial task. The use of evolutionary algorithms to find candidate solutions is widespread; usually they supply a discrete picture of the non-dominated solutions, a Pareto set. Although it is very interesting to know the non-dominated solutions, an additional criterion is needed to select one solution to be deployed. To better support the design process, this paper presents a new method of solving non-linear multi-objective optimization problems by adding a control function that guides the optimization process over the Pareto set, which does not need to be found explicitly. The proposed methodology differs from the classical methods that combine the objective functions on a single scale, and is based on a single run of non-linear single-objective optimizers.
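For contrast with the paper's single-run approach, a minimal sketch of the classical scalarisation it departs from (illustrative objective functions; each weight vector requires its own single-objective run):

```python
import numpy as np
from scipy.optimize import minimize

# Two illustrative conflicting objectives over x in R^2.
f1 = lambda x: (x[0] - 1.0) ** 2 + x[1] ** 2
f2 = lambda x: x[0] ** 2 + (x[1] - 1.0) ** 2

def weighted_sum(w1, w2):
    """Classical approach: combine the objectives on a single scale and
    solve one single-objective problem per weight vector."""
    res = minimize(lambda x: w1 * f1(x) + w2 * f2(x), x0=np.zeros(2))
    return res.x

# Sweeping the weights traces a discrete picture of the Pareto set,
# one optimizer run per point.
front = [weighted_sum(w, 1.0 - w) for w in np.linspace(0.05, 0.95, 7)]
```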

Relevance: 30.00%

Abstract:

Smoke spikes occurring during transient engine operation have detrimental health effects and increase fuel consumption by requiring more frequent regeneration of the diesel particulate filter. This paper proposes a decision tree approach to real-time detection of smoke spikes for control and on-board diagnostics purposes. A contemporary, electronically controlled heavy-duty diesel engine was used to investigate the deficiencies of smoke control based on the fuel-to-oxygen-ratio limit. With the aid of transient and steady state data analysis and empirical as well as dimensional modeling, it was shown that the fuel-to-oxygen ratio was not estimated correctly during the turbocharger lag period. This inaccuracy was attributed to the large manifold pressure ratios and low exhaust gas recirculation flows recorded during the turbocharger lag period, which meant that the engine control module correlations for the exhaust gas recirculation flow and the volumetric efficiency had to be extrapolated. The engine control module correlations were based on steady state data, and it was shown that, unless the turbocharger efficiency is artificially reduced, the large manifold pressure ratios observed during the turbocharger lag period cannot be achieved at steady state. Additionally, the cylinder-to-cylinder variations during this period were shown to be sufficiently significant to make the average fuel-to-oxygen ratio a poor predictor of the transient smoke emissions. The steady state data also showed higher smoke emissions with higher exhaust gas recirculation fractions at constant fuel-to-oxygen-ratio levels. This suggests that, even if the fuel-to-oxygen ratios were to be estimated accurately for each cylinder, they would still be ineffective as smoke limiters. A decision tree trained on snap throttle data and pruned with engineering knowledge was able to use the inaccurate engine control module estimates of the fuel-to-oxygen ratio, together with information on the engine control module estimate of the exhaust gas recirculation fraction, the engine speed, and the manifold pressure ratio, to predict 94% of all spikes occurring over the Federal Test Procedure cycle. The advantages of this non-parametric approach over other commonly used parametric empirical methods such as regression were described. Finally, an application of accurate smoke spike detection, in which the injection pressure is increased at points with high opacity to reduce the cumulative particulate matter emissions substantially with a minimum increase in the cumulative nitrogen oxide emissions, was illustrated with dimensional and empirical modeling.
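A minimal sketch of the decision-tree detection step (the feature set follows the abstract; the training values and labels below are fabricated placeholders, not the paper's snap-throttle data):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Features per sample: ECM fuel-to-oxygen-ratio estimate, ECM EGR-fraction
# estimate, engine speed [rpm], manifold pressure ratio.
X_train = np.array([
    [0.8, 0.05, 1800, 1.6],   # turbocharger-lag-like operating point
    [0.5, 0.20, 1200, 1.1],   # steady-state, low load
    [0.9, 0.02, 2000, 1.8],
    [0.4, 0.25, 1000, 1.0],
])
y_train = np.array([1, 0, 1, 0])  # 1 = smoke spike, 0 = no spike

# A shallow tree stands in here for pruning with engineering knowledge.
tree = DecisionTreeClassifier(max_depth=3).fit(X_train, y_train)
print(tree.predict([[0.85, 0.03, 1900, 1.7]]))  # -> [1]
```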

Relevance: 30.00%

Abstract:

Humans and animals face decision tasks in an uncertain multi-agent environment where an agent's strategy may change in time due to the co-adaptation of others' strategies. The neuronal substrate and the computational algorithms underlying such adaptive decision making, however, are largely unknown. We propose a population coding model of spiking neurons with a policy gradient procedure that successfully acquires optimal strategies for classical game-theoretical tasks. The suggested population reinforcement learning reproduces data from human behavioral experiments for the blackjack and the inspector game. It performs optimally according to a pure (deterministic) and a mixed (stochastic) Nash equilibrium, respectively. In contrast, temporal-difference (TD) learning, covariance learning, and basic reinforcement learning fail to perform optimally for the stochastic strategy. Spike-based population reinforcement learning, shown to follow the stochastic reward gradient, is therefore a viable candidate to explain automated decision learning of a Nash equilibrium in two-player games.
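As a much-simplified illustration of policy-gradient learning of a stochastic strategy (a plain REINFORCE sketch on matching pennies against a co-adapting opponent, not the paper's spiking population model; the mixed Nash equilibrium here is to play each action with probability 0.5):

```python
import numpy as np

rng = np.random.default_rng(0)
theta, alpha = 0.0, 0.05           # policy logit and learning rate
opp_count = np.array([1.0, 1.0])   # opponent's counts of our past actions

for _ in range(20_000):
    p = 1.0 / (1.0 + np.exp(-theta))              # P(learner plays action 0)
    a = 0 if rng.random() < p else 1
    b = 1 if opp_count[0] >= opp_count[1] else 0  # opponent mismatches our habit
    opp_count[a] += 1.0
    r = 1.0 if a == b else -1.0                   # learner wins on a match
    grad_logp = (1.0 - p) if a == 0 else -p       # d log pi(a) / d theta
    theta += alpha * r * grad_logp                # REINFORCE update

print(1.0 / (1.0 + np.exp(-theta)))  # oscillates near the mixed Nash p = 0.5
```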

Relevance: 30.00%

Abstract:

OBJECTIVES To evaluate prosthetic parameters in the edentulous anterior maxilla for decision making between fixed and removable implant prostheses using virtual planning software. MATERIAL AND METHODS CT or DVT scans of 43 patients (mean age 62 ± 8 years) with an edentulous maxilla were analyzed with the NobelGuide software. Implants (≥3.5 mm diameter, ≥10 mm length) were virtually placed in the optimal three-dimensional prosthetic position of all maxillary front teeth. Anatomical and prosthetic landmarks, including the cervical crown point (C-Point), the acrylic flange border (F-Point), and the implant-platform buccal end (I-Point), were defined in each middle section to determine four measuring parameters: (1) acrylic flange height (FLHeight), (2) mucosal coverage (MucCov), (3) crown-implant distance (CID) and (4) buccal prosthesis profile (ProsthProfile). Based on these parameters, all patients were assigned to one of three classes: (A) MucCov ≤ 0 mm and ProsthProfile ≥ 45°, allowing for a fixed prosthesis; (B) MucCov = 0-5 mm and/or ProsthProfile = 30°-45°, probably allowing for a fixed prosthesis; and (C) MucCov ≥ 5 mm and/or ProsthProfile ≤ 30°, where a removable prosthesis is favorable. Statistical analyses included descriptive methods and non-parametric tests. RESULTS Mean values were 10.0 mm for FLHeight, 5.6 mm for MucCov, 7.4 mm for CID, and 39.1° for ProsthProfile. Seventy percent of patients fulfilled class C criteria (removable), 21% class B (probably fixed), and 2% class A (fixed), while in 7% (three patients) bone volume was insufficient for implant planning. CONCLUSIONS The proposed classification and virtual planning procedure simplify the decision-making process regarding the type of prosthesis and increase the predictability of esthetic treatment outcomes. It was demonstrated that in the majority of cases, the space between the prosthetic crown and the implant platform had to be filled with prosthetic materials.
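The stated class criteria can be read as a simple decision rule; a minimal sketch (illustrative naming; the overlapping boundary cases are resolved by checking classes from most to least favourable):

```python
def prosthesis_class(muc_cov_mm: float, prosth_profile_deg: float) -> str:
    """Assign class A (fixed), B (probably fixed) or C (removable)
    from mucosal coverage and buccal prosthesis profile, following the
    cut-offs given in the abstract."""
    if muc_cov_mm <= 0 and prosth_profile_deg >= 45:
        return "A"  # fixed prosthesis possible
    if muc_cov_mm <= 5 and prosth_profile_deg >= 30:
        return "B"  # fixed prosthesis probably possible
    return "C"      # removable prosthesis favourable

print(prosthesis_class(5.6, 39.1))  # the mean-value patient -> "C"
```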

Relevance: 30.00%

Abstract:

OBJECTIVES Valve-sparing root replacement (VSRR) is thought to reduce the rate of thromboembolic and bleeding events compared with mechanical aortic root replacement (MRR) with a composite graft, by avoiding oral anticoagulation. But as VSRR carries a certain risk of subsequent reinterventions, decision-making in the individual patient can be challenging. METHODS Of 100 Marfan syndrome (MFS) patients who underwent 169 aortic surgeries and have been followed at our institution since 1995, 59 consecutive patients without a history of dissection or prior aortic surgery underwent elective VSRR or MRR and were retrospectively analysed. RESULTS VSRR was performed in 29 patients (David n = 24, Yacoub n = 5) and MRR in 30 patients. The mean age was 33 ± 15 years. The mean follow-up after VSRR was 6.5 ± 4 years (180 patient-years), compared with 8.8 ± 9 years (274 patient-years) after MRR. Reoperation rates after root remodelling (Yacoub) were significantly higher than after the reimplantation (David) procedure (60 vs 4.2%, P = 0.01). The need for reintervention after the reimplantation procedure (0.8% per patient-year) was not significantly higher than after MRR (P = 0.44), but follow-up after VSRR was significantly shorter (P = 0.03). There was neither significant morbidity nor mortality associated with root reoperations. There were no neurological events after VSRR, compared with four stroke/intracranial bleeding events in the MRR group (log-rank, P = 0.11), translating into an event rate of 1.46% per patient-year following MRR. CONCLUSION The calculated annual failure rate after VSRR using the reimplantation technique was lower than the annual risk of thromboembolic or bleeding events. Since the perioperative risk of reintervention following VSRR is low, patients might benefit from VSRR even if redo surgery becomes necessary during follow-up.

Relevance: 30.00%

Abstract:

Background: Consensus-based approaches provide an alternative to evidence-based decision making, especially in situations where high-level evidence is limited. Our aim was to demonstrate a novel source of information: objective consensus based on recommendations in decision tree format from multiple sources. Methods: Based on nine sample recommendations in decision tree format, a representative analysis was performed. The most common (mode) recommendations for each eventuality (each permutation of parameters) were determined. The same procedure was applied to real clinical recommendations for primary radiotherapy for prostate cancer. Data were collected from 16 radiation oncology centres, converted into decision tree format and analyzed in order to determine the objective consensus. Results: Based on information from multiple sources in decision tree format, treatment recommendations can be assessed for every parameter combination. An objective consensus can be determined by means of mode recommendations without compromise or confrontation among the parties. In the clinical example involving prostate cancer therapy, three parameters were used with two cut-off values each (Gleason score, PSA, T-stage), resulting in a total of 27 possible combinations per decision tree. Despite significant variations among the recommendations, a mode recommendation could be found for specific combinations of parameters. Conclusion: Recommendations represented as decision trees can serve as a basis for objective consensus among multiple parties.
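A minimal sketch of the mode-consensus computation (parameter levels and recommendation labels are illustrative; the actual analysis used decision trees from 16 centres):

```python
from itertools import product
from statistics import mode

# Three parameters with two cut-off values each give three levels per
# parameter, hence 3**3 = 27 combinations per decision tree.
LEVELS = {"gleason": ("low", "mid", "high"),
          "psa":     ("low", "mid", "high"),
          "t_stage": ("low", "mid", "high")}

def objective_consensus(trees):
    """trees: list of dicts mapping each parameter combination to a
    recommendation. Returns the mode recommendation per combination."""
    return {combo: mode(tree[combo] for tree in trees)
            for combo in product(*LEVELS.values())}

# Toy example: three trees, two of which agree everywhere.
flat_tree = lambda rec: {c: rec for c in product(*LEVELS.values())}
consensus = objective_consensus(
    [flat_tree("RT alone"), flat_tree("RT alone"), flat_tree("RT + ADT")])
print(consensus[("low", "low", "low")])  # -> "RT alone"
```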

Relevance: 30.00%

Abstract:

Libraries of learning objects may serve as a basis for deriving course offerings that are customized to the needs of different learning communities or even individuals. Several ways of organizing this course composition process are discussed. Course composition needs a clear understanding of the dependencies between the learning objects. Therefore we discuss the metadata for object relationships proposed in different standardization projects, especially those suggested in the Dublin Core Metadata Initiative. Based on these metadata we construct adjacency matrices and graphs. We show how Gozinto-type computations can be used to determine direct and indirect prerequisites for certain learning objects. The metadata may also be used to define integer programming models, which can be applied to support the instructor in formulating his specifications for selecting objects or which allow a computer agent to select learning objects automatically. Such decision models could also be helpful for a learner navigating through a library of learning objects. We also sketch a graph-based procedure for manual or automatic sequencing of the learning objects.
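A minimal sketch of the Gozinto-type computation on an adjacency matrix (the objects and the edge-direction convention below are illustrative assumptions): for an acyclic prerequisite graph, the total-requirements matrix T = (I − A)⁻¹ = I + A + A² + … counts all direct and indirect prerequisite paths:

```python
import numpy as np

# A[i, j] = 1 if learning object i is a direct prerequisite of object j.
# Objects: 0 = "sets", 1 = "relations", 2 = "functions", 3 = "calculus".
A = np.array([[0, 1, 1, 0],
              [0, 0, 1, 0],
              [0, 0, 0, 1],
              [0, 0, 0, 0]])

# Gozinto-type total-requirements matrix (valid for acyclic graphs):
# T[i, j] counts the direct and indirect prerequisite paths from i to j.
T = np.round(np.linalg.inv(np.eye(4) - A)).astype(int)

prereqs_of_calculus = np.nonzero(T[:, 3])[0]
print(prereqs_of_calculus)  # -> [0 1 2 3] (object 3 itself via the diagonal)
```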

Relevance: 30.00%

Abstract:

The purpose of this study was to gain an understanding of the assistive technology decision-making process at four regional school districts in Pennsylvania. A qualitative case study research method involving the triangulation of data sources was implemented to collect and analyze data. Through an analysis of the data, three major topics emerged that will be addressed in the body of this paper: (a) the procedure for determining assistive technology needs and the dynamics of the decision-making process, (b) the cohesiveness of Special Education and General Education programs, and (c) major concerns that impact the delivery of assistive technology services.