919 results for Superiority and Inferiority Multi-criteria Ranking (SIR) Method


Relevance: 100.00%

Abstract:

This paper introduces a compact form for the maximum value of the non-Archimedean epsilon in Data Envelopment Analysis (DEA) models applied to technology selection, without the need to solve a linear program (LP). Using this method, the computational performance of the common-weight multi-criteria decision-making (MCDM) DEA model proposed by Karsak and Ahiska (International Journal of Production Research, 2005, 43(8), 1537-1554) is improved. The improvement is significant when computational issues and complexity analysis are a concern.

Relevance: 100.00%

Abstract:

Liquid-liquid extraction has long been known as a unit operation that plays an important role in industry. The process is well known for its complexity and sensitivity to operating conditions. This thesis presents an attempt to explore the dynamics and control of this process using a systematic approach and state-of-the-art control system design techniques. The process was first studied experimentally under carefully selected operating conditions, which resemble the ranges employed practically under stable and efficient conditions. Data were collected at steady state using adequate sampling techniques for the dispersed and continuous phases, as well as during column transients, with the aid of a computer-based online data logging system and online concentration analysis. A stagewise single-stage backflow model was improved to mimic the dynamic operation of the column. The developed model accounts for the variation in hydrodynamics, mass transfer, and physical properties throughout the length of the column. End effects were treated by adding stages at the column entrances. Two parameters were incorporated in the model: a mass transfer weight factor to correct for the assumption of no mass transfer in the settling zones at each stage, and backmixing coefficients to handle the axial dispersion encountered in the course of column operation. The parameters were estimated by minimizing the differences between the experimental and model-predicted concentration profiles at steady state using a non-linear optimisation technique. The estimated values were then correlated as functions of operating parameters and incorporated in the model equations. The model equations comprise a stiff differential-algebraic system, which was solved using the GEAR ODE solver. The calculated concentration profiles were compared to those measured experimentally.

A very good agreement between the two profiles was achieved, within a relative error of ±2.5%. The developed rigorous dynamic model of the extraction column was used to derive linear time-invariant reduced-order models that relate the input variables (agitator speed, solvent feed flowrate and concentration, feed concentration and flowrate) to the output variables (raffinate concentration and extract concentration) using the asymptotic method of system identification. The reduced-order models were shown to be accurate in capturing the dynamic behaviour of the process, with a maximum modelling prediction error of 1%. The simplicity and accuracy of the derived reduced-order models allow for control system design and analysis of such complicated processes. The extraction column is a typical multivariable process, with agitator speed and solvent feed flowrate considered as manipulated variables, raffinate concentration and extract concentration as controlled variables, and feed concentration and feed flowrate as disturbance variables. The control system design of the extraction process was tackled as a multi-loop decentralised SISO (Single Input Single Output) as well as a centralised MIMO (Multi-Input Multi-Output) system, using both conventional and model-based control techniques such as IMC (Internal Model Control) and MPC (Model Predictive Control). The control performance of each scheme was studied in terms of stability, speed of response, sensitivity to modelling errors (robustness), setpoint tracking, and load rejection. For decentralised control, multiple loops were assigned to pair each manipulated variable with each controlled variable according to interaction analysis and other pairing criteria such as the relative gain array (RGA) and singular value decomposition (SVD). The loops Rotor speed-Raffinate concentration and Solvent flowrate-Extract concentration showed weak interaction.

Multivariable MPC showed more effective performance than the conventional techniques, since it accounts for loop interactions, time delays, and input-output variable constraints.
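The loop-pairing step mentioned above relies on the relative gain array. As a minimal sketch (with invented gain values, not those identified in the thesis), the RGA of a steady-state gain matrix K is the element-wise product of K with the transpose of its inverse; elements near 1 on the diagonal indicate weakly interacting pairings:

```python
import numpy as np

def relative_gain_array(K):
    """Relative Gain Array: element-wise product of K and the transpose of inv(K)."""
    return K * np.linalg.inv(K).T

# Hypothetical 2x2 steady-state gain matrix (illustrative numbers only):
# rows = outputs (raffinate conc., extract conc.),
# cols = inputs (rotor speed, solvent flowrate)
K = np.array([[2.0, 0.5],
              [0.4, 1.5]])
rga = relative_gain_array(K)
```

Each row and column of the RGA sums to 1, so a 2x2 system is fully characterised by its (1,1) element.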

Relevance: 100.00%

Abstract:

This research involves a study of the questions of what is considered safe, how safety levels are defined or decided, and according to whom. Questions of tolerable or acceptable risk raise various issues: about the values and assumptions inherent in such levels; about decision-making frameworks at the highest level of policy making as well as at the individual level; and about the suitability and competence of decision-makers to decide and to communicate their decisions. The wide-ranging topics, covering the philosophical and practical concerns examined in the literature review, reveal the multidisciplinary scope of this research. To support the theoretical study, empirical research was undertaken at the European Space Research and Technology Centre (ESTEC) of the European Space Agency (ESA). ESTEC is a large, multinational, high-technology organisation, which presented an ideal case study for exploring how decisions are made with respect to safety, from a personal as well as an organisational perspective. A qualitative methodology was employed to gather, analyse, and report the findings of this research. Significant findings reveal how experts perceive risks, and the prevalence of informal decision-making processes, partly due to the inadequacy of formal methods for deciding risk tolerability. In the field of occupational health and safety, this research has highlighted the importance of, and need for, criteria to decide whether a risk is great enough to warrant attention when setting standards and priorities for risk control and resources. From a wider perspective, and recognising that risk is an inherent part of life, the establishment of risk tolerability levels can be viewed as a cornerstone indicating our progress, expectations, and values of life and work in an increasingly litigious, knowledgeable, and global society.

Relevance: 100.00%

Abstract:

This paper introduces a new mathematical method for improving the discrimination power of data envelopment analysis and completely ranking the efficient decision-making units (DMUs). Fuzzy concepts are utilised. For this purpose, all DMUs are first evaluated with the CCR model. Thereafter, the resulting weights for each output are treated as fuzzy sets and converted to fuzzy numbers. The introduced model is a multi-objective linear model whose endpoints are the highest and lowest of the weighted values. An added advantage of the model is its ability to handle the infeasibility sometimes faced by previously introduced models.
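In general the CCR evaluation step requires solving one linear program per DMU. As a hedged illustration only (not the paper's fuzzy model), the single-input, single-output special case of CCR reduces to each DMU's output-input ratio normalised by the best ratio:

```python
def ccr_single_io(inputs, outputs):
    """CCR efficiency for the single-input, single-output special case:
    each DMU's output/input ratio divided by the best observed ratio."""
    ratios = [y / x for x, y in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Three hypothetical DMUs (made-up data): ratios 2.0, 1.5, 1.0
scores = ccr_single_io([2, 4, 3], [4, 6, 3])
```

DMUs with score 1.0 lie on the efficient frontier; discriminating among several such ties is exactly the ranking problem the paper addresses.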

Relevance: 100.00%

Abstract:

In the face of global population growth and the uneven distribution of water supply, better knowledge of the spatial and temporal distribution of surface water resources is critical. Remote sensing provides a synoptic view of ongoing processes, which addresses the intricate nature of water surfaces and allows an assessment of the pressures placed on aquatic ecosystems. However, the main challenge in identifying water surfaces from remotely sensed data is the high variability of spectral signatures, both in space and time. In the last 10 years only a few operational methods have been proposed to map or monitor surface water at continental or global scale, and each of them shows limitations. The objective of this study is to develop and demonstrate the adequacy of a generic multi-temporal and multi-spectral image analysis method for detecting water surfaces automatically and monitoring them in near-real time. The proposed approach, based on a transformation of the RGB color space into HSV, provides dynamic information at the continental scale. Validation of the algorithm showed very few omission errors and no commission errors, demonstrating its ability to perform as effectively as human interpretation of the images. Validation of the permanent water surface product against an independent dataset derived from high-resolution imagery showed an accuracy of 91.5% and few commission errors. Potential applications of the proposed method are identified and discussed. The methodology that has been developed is generic: it can be applied to sensors with similar bands with good reliability and minimal effort. Moreover, this experiment at continental scale showed that the methodology is efficient across a large range of environmental conditions. Additional preliminary tests over other continents indicate that the proposed methodology could also be applied at the global scale without major difficulty.
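The RGB-to-HSV transformation at the core of the approach can be sketched with the Python standard library; the hue band and darkness threshold below are illustrative assumptions, not the study's calibrated values:

```python
import colorsys

def looks_like_water(r, g, b, hue_range=(0.5, 0.75), max_value=0.6):
    """Classify a pixel (RGB components in [0, 1]) as water if its hue falls
    in a blue-cyan band and it is relatively dark. Thresholds are purely
    illustrative, not those used in the study."""
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    return hue_range[0] <= h <= hue_range[1] and v <= max_value

dark_blue = looks_like_water(0.05, 0.15, 0.40)    # deep-water-like pixel
bright_sand = looks_like_water(0.85, 0.75, 0.55)  # bright dry-land pixel
```

Working in HSV decouples chromaticity (hue) from brightness (value), which is what makes a single rule usable across varying illumination.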

Relevance: 100.00%

Abstract:

Data mining projects that rely on decision trees for classifying test cases usually use the probabilities provided by these trees to rank the classified cases. A better method is needed for ranking test cases that have already been classified by a binary decision tree, because these probabilities are not always accurate and reliable enough. One reason is that the probability estimates computed by existing decision tree algorithms are identical for all cases falling in the same leaf of the tree. This is only one reason why the probability estimates given by decision tree algorithms cannot be used as an accurate means of deciding whether a test case has been correctly classified. Isabelle Alvarez has proposed a new method for ranking the test cases classified by a binary decision tree [Alvarez, 2004]. In this paper we report the results of a comparison of different ranking methods based on the probability estimate, the sensitivity of a particular case, or both.
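The uniform within-leaf probability noted above is often mitigated by smoothing the raw leaf frequency. A minimal sketch of the common Laplace correction (a standard remedy, not Alvarez's sensitivity-based method):

```python
def leaf_probability(positives, total, classes=2):
    """Raw and Laplace-smoothed estimates of P(positive) for a decision-tree
    leaf containing `total` training cases, `positives` of them positive."""
    raw = positives / total
    laplace = (positives + 1) / (total + classes)
    return raw, laplace

# A small pure leaf: 3 of 3 training cases positive
raw, smoothed = leaf_probability(3, 3)
```

The raw estimate is an overconfident 1.0, while the smoothed 0.8 reflects that three cases are weak evidence; note that both remain identical for every test case in the leaf, which is the limitation the paper's sensitivity-based rankings address.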

Relevance: 100.00%

Abstract:

Global connectivity for anyone, at any place, at any time, providing high-speed, high-quality, and reliable communication channels for mobile devices, is now becoming a reality. The credit goes mainly to recent technological advances in wireless communications, which comprise a wide range of technologies, services, and applications to fulfill the particular needs of end-users in different deployment scenarios (Wi-Fi, WiMAX, and 3G/4G cellular systems). In such a heterogeneous wireless environment, one of the key ingredients for efficient ubiquitous computing with guaranteed quality and continuity of service is the design of intelligent handoff algorithms. Traditional single-metric handoff decision algorithms, such as those based on Received Signal Strength (RSS), are not efficient and intelligent enough to minimize the number of unnecessary handoffs, decision delays, and call-dropping and/or blocking probabilities. This research presents a novel approach to the design and implementation of a multi-criteria vertical handoff algorithm for heterogeneous wireless networks. Several parallel Fuzzy Logic Controllers were utilized in combination with different types of ranking algorithms and metric-weighting schemes to implement two major modules: the first estimates the necessity of handoff, and the second selects the best network as the handoff target. Simulations based on different traffic classes and various types of wireless networks were carried out on a wireless test-bed inspired by the concept of the Rudimentary Network Emulator (RUNE). Simulation results indicated that the proposed scheme performs better in terms of minimizing unnecessary handoffs and call-dropping, call-blocking, and handoff-blocking probabilities.

When subjected to conversational traffic and compared against the RSS-based reference algorithm, the proposed scheme, utilizing the FTOPSIS ranking algorithm, reduced the average outage probability of MSs moving at high speeds by 17%, the new-call blocking probability by 22%, the handoff blocking probability by 16%, and the average handoff rate by 40%. The significant reduction in handoff rate gives the MS lower power consumption and longer battery life. These percentages indicate a higher probability of guaranteed session continuity and quality for the currently utilized service, resulting in higher user satisfaction.
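The FTOPSIS variant used here adds fuzzy metric values, but its core is the classical TOPSIS ranking: normalise the decision matrix, weight it, and score each alternative by closeness to the ideal solution. A minimal crisp sketch with hypothetical network metrics:

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Classical TOPSIS: rank alternatives by relative closeness to the
    ideal solution. matrix is alternatives x criteria; benefit[j] is True
    when larger values of criterion j are better."""
    m = np.asarray(matrix, dtype=float)
    norm = m / np.linalg.norm(m, axis=0)           # vector-normalise each criterion
    v = norm * np.asarray(weights, dtype=float)    # apply criteria weights
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)                 # closeness coefficient in [0, 1]

# Hypothetical candidate networks scored on (bandwidth, delay); delay is a cost.
scores = topsis([[100, 20], [80, 10], [60, 40]], [0.5, 0.5], [True, False])
```

The network with the highest closeness coefficient would be chosen as the handoff target; here the second candidate wins because its low delay outweighs its moderate bandwidth.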

Relevance: 100.00%

Abstract:

Planning for complex ecosystem restoration projects involves integrating ecological modeling with analysis of performance trade-offs among restoration alternatives. The authors used the Everglades Landscape Model and Multi-Criteria Decision Analysis to explore the effect of simulated ecosystem performance, risk preferences, and criteria weights on the ranking of three alternatives for restoring overland sheet flow in the Everglades. The ecological model outputs included both hydrologic and water-quality criteria. Results were scored in the decision analysis framework, highlighting the trade-offs between hydrologic restoration and water-quality constraints. Given equal weighting of the performance measures, the alternative with more homogeneous sheet flow was preferred over the others, despite evidence of some localized eutrophication risk.
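Scoring alternatives under criteria weights, as in the decision-analysis step above, can be sketched with a simple additive model; the performance numbers below are invented for illustration, not the study's model outputs:

```python
def weighted_score(performance, weights):
    """Simple additive multi-criteria score: each criterion is min-max
    normalised to [0, 1] (larger is better), then combined with the weights."""
    cols = list(zip(*performance))
    lo = [min(c) for c in cols]
    hi = [max(c) for c in cols]
    normed = [[(v - l) / (h - l) if h > l else 1.0
               for v, l, h in zip(row, lo, hi)] for row in performance]
    return [sum(w * v for w, v in zip(weights, row)) for row in normed]

# Three hypothetical restoration alternatives scored on
# (sheet-flow uniformity, water-quality compliance), equal weights.
scores = weighted_score([[0.9, 0.6], [0.5, 0.9], [0.4, 0.4]], [0.5, 0.5])
```

With equal weights the first alternative ranks best, illustrating how a strong showing on one criterion can carry an alternative despite a middling score on the other.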

Relevance: 100.00%

Abstract:

The successful performance of a hydrological model is usually challenged by the quality of the sensitivity analysis, calibration, and uncertainty analysis carried out in the modeling exercise and the subsequent simulation results. This is especially important under changing climatic conditions, where additional uncertainties associated with climate models and downscaling processes increase the complexity of the hydrological modeling system. In response to these challenges, and to improve the performance of hydrological models under changing climatic conditions, this research proposes five new methods for supporting hydrological modeling. First, a design-of-experiment aided sensitivity analysis and parameterization (DOE-SAP) method was proposed to investigate the significant parameters and provide more reliable sensitivity analysis for improving parameterization during hydrological modeling. In the case study, it achieved better calibration results along with improved sensitivity analysis of the significant parameters and their interactions. Second, a comprehensive uncertainty evaluation scheme was developed to evaluate three uncertainty analysis methods: sequential uncertainty fitting version 2 (SUFI-2), generalized likelihood uncertainty estimation (GLUE), and parameter solution (ParaSol). The results showed that SUFI-2 performed better than the other two methods on the basis of calibration and uncertainty analysis results, demonstrating that the proposed evaluation scheme is capable of selecting the most suitable uncertainty method for a case study. Third, a novel sequential multi-criteria based calibration and uncertainty analysis (SMC-CUA) method was proposed to improve the efficiency of calibration and uncertainty analysis and to control equifinality.

The results showed that the SMC-CUA method provided better uncertainty analysis results with higher computational efficiency than the SUFI-2 and GLUE methods, and controlled parameter uncertainty and the equifinality effect without sacrificing simulation performance. Fourth, an innovative response-based statistical evaluation method (RESEM) was proposed for estimating uncertainty propagation effects and providing long-term predictions of hydrological responses under changing climatic conditions. Using RESEM, the uncertainty propagated from statistical downscaling to hydrological modeling can be evaluated. Fifth, an integrated simulation-based evaluation system for uncertainty propagation analysis (ISES-UPA) was proposed for investigating the effects and contributions of different uncertainty components to the total uncertainty propagated from statistical downscaling. Using ISES-UPA, the uncertainty from statistical downscaling, the uncertainty from hydrological modeling, and the total uncertainty from the two sources can be compared and quantified. The feasibility of all the methods was tested using hypothetical and real-world case studies. The proposed methods can also be integrated into a hydrological modeling system to better support hydrological studies under changing climatic conditions. The results from the proposed integrated hydrological modeling system can serve as a scientific reference for decision makers seeking to reduce the potential risk of damage caused by extreme events in long-term water resource management and planning.
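Of the uncertainty methods compared above, GLUE is the simplest to sketch: sample parameter sets, score each with a likelihood measure such as the Nash-Sutcliffe efficiency, and retain the "behavioural" sets above a threshold. A toy illustration with a one-parameter linear model and invented data, not any of the thesis's case studies:

```python
import random

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 minus error variance over observed variance."""
    mean = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    var = sum((o - mean) ** 2 for o in obs)
    return 1.0 - sse / var

def glue_behavioural(obs, model, n=1000, threshold=0.5, seed=42):
    """GLUE-style screening: sample a parameter uniformly and keep the
    'behavioural' sets whose likelihood (here NSE) exceeds the threshold."""
    rng = random.Random(seed)
    kept = []
    for _ in range(n):
        k = rng.uniform(0.0, 2.0)  # candidate runoff coefficient
        sim = [model(k, t) for t in range(len(obs))]
        score = nash_sutcliffe(obs, sim)
        if score > threshold:
            kept.append((k, score))
    return kept

# Synthetic 'observations' generated with k = 1.2 (purely illustrative)
obs = [1.2 * t for t in range(1, 11)]
behavioural = glue_behavioural(obs, lambda k, t: k * (t + 1))
```

The spread of the retained parameter values is GLUE's picture of parameter uncertainty; many distinct values surviving the threshold is exactly the equifinality effect the SMC-CUA method aims to control.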

Relevance: 100.00%

Abstract:

Background and Objectives: Schizophrenia is a severe chronic disease. Endpoint variables lack objectivity, and the diagnostic criteria have evolved over time. To guide the development of new drugs, the European Medicines Agency (EMA) issued a guideline on the clinical investigation of medicinal products for the treatment of schizophrenia. Methods: The authors reviewed and discussed the efficacy-trial part of the Guideline. Results: The Guideline divides clinical efficacy trials into short-term and long-term trials. The short-term three-arm trial is recommended to replace the short-term two-arm active-controlled non-inferiority trial because the latter has sensitivity issues; the Guideline ultimately makes that three-arm trial a superiority trial. The Guideline discusses four types of long-term trial design. The randomized withdrawal design has some disadvantages. The long-term two-arm active-controlled non-inferiority trial is not recommended due to the sensitivity issue. Extension of the short-term trial is only suitable for the short-term two-arm active-controlled superiority trial. The Guideline suggests that a hybrid design, in which a randomized withdrawal trial is incorporated into a long-term parallel trial, might be optimal; however, such a design has some disadvantages and might be too complex to carry out. The authors instead suggest a three-group long-term trial design, which could provide a comparison between the test drug and the active comparator along with a comparison between the test drug and placebo. This alternative could arguably be much easier to carry out than the hybrid design. Conclusions: The three-group long-term design merits further discussion and evaluation.

Relevance: 100.00%

Abstract:

Background: The Analytic Hierarchy Process (AHP), developed by Saaty in the late 1970s, is one of the methods for multi-criteria decision making. The AHP disaggregates a complex decision problem into different hierarchical levels. The weights for each criterion and alternative are judged in pairwise comparisons, and priorities are calculated by the eigenvector method. The growing application of the AHP was the motivation for this study to explore the current state of its methodology in the healthcare context. Methods: A systematic literature review was conducted by searching the PubMed and Web of Science databases for articles with the following keywords in their titles or abstracts: "Analytic Hierarchy Process," "Analytical Hierarchy Process," "multi-criteria decision analysis," "multiple criteria decision," "stated preference," and "pairwise comparison." In addition, we developed reporting criteria to indicate whether the authors reported important aspects, and evaluated the resulting studies' reporting. Results: The systematic review yielded 121 articles. The number of studies applying the AHP has increased since 2005. Most studies were from Asia (almost 30%), followed by the US (25.6%). On average, the studies used 19.64 criteria across their hierarchical levels. Furthermore, we restricted a detailed analysis to the articles published within the last 5 years (n = 69). The mean number of participants in these studies was 109, and we identified major differences in how the surveys were conducted. The evaluation of reporting showed that the mean number of reported elements was about 6.75 out of 10; 12 of the 69 studies reported less than half of the criteria. Conclusion: The AHP has been applied inconsistently in healthcare research. A minority of studies described all the relevant aspects, so the statements in this review may be biased, as they are restricted to the information available in the papers.

Hence, further research is required to determine who should be interviewed and how, how inconsistent answers should be dealt with, and how the outcome and stability of the results should be presented. In addition, new insights are needed to determine which target group can best handle the challenges of the AHP. © 2015 Schmidt et al.
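The pairwise-comparison and eigenvector steps described in the Background can be sketched in a few lines. The comparison matrix below is invented and perfectly consistent (criterion 1 is judged twice as important as criterion 2 and four times as important as criterion 3), so the consistency ratio comes out near zero:

```python
import numpy as np

def ahp_priorities(A, iters=100):
    """Priorities from a pairwise-comparison matrix via power iteration on the
    principal eigenvector, plus Saaty's consistency ratio for n = 3 criteria."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    w = np.ones(n) / n
    for _ in range(iters):
        w = A @ w
        w /= w.sum()                  # keep priorities summing to 1
    lam = (A @ w / w).mean()          # estimate of the principal eigenvalue
    ci = (lam - n) / (n - 1)          # consistency index
    cr = ci / 0.58                    # Saaty's random index RI = 0.58 for n = 3
    return w, cr

# Hypothetical 3-criteria comparison matrix (reciprocal and consistent)
A = [[1, 2, 4],
     [1 / 2, 1, 2],
     [1 / 4, 1 / 2, 1]]
w, cr = ahp_priorities(A)
```

A consistency ratio above about 0.1 is conventionally taken to mean the pairwise judgements should be revisited; here it is effectively zero by construction.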

Relevance: 100.00%

Abstract:

Object recognition has long been a core problem in computer vision. To improve object spatial support and speed up object localization, generating high-quality, category-independent object proposals as input for recognition systems has drawn attention recently. Given an image, we generate a limited number of high-quality, category-independent object proposals in advance, to be used as inputs for many computer vision tasks. We also present an efficient dictionary-based model for image classification, and further extend the work to a discriminative dictionary learning method for tensor sparse coding. In the first part, a multi-scale greedy-based object proposal generation approach is presented. Based on the multi-scale nature of objects in images, our approach is built on top of a hierarchical segmentation. We first identify the representative and diverse exemplar clusters within each scale. Object proposals are obtained by selecting a subset from the multi-scale segment pool via maximizing a submodular objective function, which consists of a weighted coverage term, a single-scale diversity term, and a multi-scale reward term. The weighted coverage term forces the selected set of object proposals to be representative and compact; the single-scale diversity term encourages choosing segments from different exemplar clusters so that they cover as many object patterns as possible; the multi-scale reward term encourages the selected proposals to be discriminative and drawn from multiple layers of the hierarchical image segmentation. Experimental results on the Berkeley Segmentation Dataset and the PASCAL VOC2012 segmentation dataset demonstrate the accuracy and efficiency of our object proposal model. Additionally, we validate our object proposals in simultaneous segmentation and detection, outperforming the state of the art.

To classify the object in the image, we design a discriminative, structural low-rank framework for image classification. We use a supervised learning method to construct a discriminative and reconstructive dictionary. By introducing an ideal regularization term, we perform low-rank matrix recovery for contaminated training data from all categories simultaneously without losing structural information. A discriminative low-rank representation of images with respect to the constructed dictionary is obtained. With semantic structure information and strong identification capability, this representation is well suited to classification tasks, even with a simple linear multi-class classifier.

Relevance: 100.00%

Abstract:

The study of Quality of Life (QoL) has been conducted on various scales throughout the years, with a focus on assessing the overall quality of living among citizens. The main focus of these studies has been on economic factors, with the purpose of creating a Quality of Life Index (QLI). When the focus narrows to the environment and factors such as Urban Green Spaces (UGS) and air quality, the question becomes how well each alternative meets specific criteria. With the benefits of UGS and a healthy environment in focus, a new Environmental Quality of Life Index (EQLI) is proposed by combining Multi-Criteria Analysis (MCA) and Geographical Information Systems (GIS). Applying MCA to complex environmental problems and integrating it with GIS is a challenging but rewarding task, and has proven to be an efficient approach among environmental scientists. Background information is given on three MCA methods: the Analytical Hierarchy Process (AHP), Regime Analysis, and PROMETHEE. A survey based on a previous study of the status of UGS within European cities was sent to 18 municipalities in the study area. The survey evaluates the current status of UGS, as well as the planning and management of UGS within municipalities, to obtain criteria material for the selected MCA method. The current situation of UGS is assessed using GIS software, and change detection over a 10-year period is performed using the NDVI for comparison against one of the MCA criteria. To extend the criteria, interpolation of nitrogen dioxide levels was performed with ordinary kriging and the results transformed into indicator values. The final outcome is an EQLI map indicating environmentally attractive municipalities, ranked against the predefined MCA criteria using PROMETHEE I pairwise comparison and PROMETHEE II complete ranking of alternatives.

The proposed methodology is applied to the Lisbon Metropolitan Area, Portugal.
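The NDVI used for the 10-year change detection is a simple band ratio of near-infrared and red reflectance; a sketch with illustrative reflectance values:

```python
def ndvi(nir, red):
    """Normalised Difference Vegetation Index: (NIR - red) / (NIR + red).
    Values near +1 indicate dense green vegetation; values near 0 or below
    indicate bare soil, built surfaces, or water."""
    return (nir - red) / (nir + red)

veg = ndvi(0.50, 0.08)   # healthy-vegetation-like reflectances (illustrative)
soil = ndvi(0.30, 0.25)  # bare-soil-like reflectances (illustrative)
```

Differencing NDVI maps from two dates then highlights where green space has been gained or lost over the period.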

Relevance: 100.00%

Abstract:

Fishing trials with monofilament gill nets and longlines using small hooks were carried out in Algarve waters (southern Portugal) over a one-year period. Four hook sizes of "Mustad" brand, round bent, flatted sea hooks (Quality 2316 DT, numbers 15, 13, 12 and 11) and four mesh sizes of 25, 30, 35 and 40 mm (bar length) monofilament gill nets were used. Commercially valuable sea breams dominated the longline catches, while small pelagics were relatively more important in the gill nets. Significant differences in the catch size-frequency distributions of the two gears were found for all the most important species caught by both gears (Boops boops, Diplodus bellottii, Diplodus vulgaris, Pagellus acarne, Pagellus erythrinus, Spondyliosoma cantharus, Scomber japonicus and Scorpaena notata), with longlines catching larger fish and a wider size range than the nets. Whereas the longline catch size-frequency distributions for most species overlapped substantially across hook sizes, suggesting little or no difference in size selectivity, the gill net catch size-frequency distributions clearly showed size selection. A variety of models were fitted to the gill net and hook data using the SELECT method, while the parameters of the logistic model were estimated by maximum likelihood for the longline data. The bi-normal model gave the best fits for most of the species caught with gill nets, while the logistic model adequately described hook selectivity. The results of this study show that the two static gears compete for many of the same species and have different impacts in terms of catch composition and size selectivity. This information will be useful for the improved management of these small-scale fisheries, in which many different gears compete for scarce resources.
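The logistic model fitted to the hook data has the standard retention-curve form; a sketch parameterised by L50 (length at 50% retention) and the selection range SR = L75 - L25 (the parameter values below are illustrative, not the study's fitted estimates):

```python
import math

def logistic_selectivity(length, l50, sr):
    """Probability that a fish of the given length is retained, using the
    standard logistic selection curve parameterised by L50 and SR."""
    k = 2 * math.log(3) / sr  # slope implied by the selection range
    return 1.0 / (1.0 + math.exp(-k * (length - l50)))

p50 = logistic_selectivity(20.0, 20.0, 4.0)  # at L50, retention is 0.5
p75 = logistic_selectivity(22.0, 20.0, 4.0)  # at L50 + SR/2, it is 0.75
```

The sigmoid shape is why longlines retain fish above a size threshold rather than around a preferred size, in contrast to the bell-shaped (bi-normal) gill net curves.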

Relevance: 100.00%

Abstract:

Some polycyclic aromatic hydrocarbons (PAHs) are ubiquitous in air and have been implicated as carcinogenic materials. The literature is therefore replete with studies of their occurrence and profiles in indoor and outdoor air samples. However, because the relative potencies of individual PAHs vary widely, the health risks associated with the presence of PAHs in a particular environment cannot be extrapolated directly from the concentrations of individual PAHs in that environment. In addition, data on the potency of PAH mixtures are currently limited. In this paper, we have utilized multi-criteria decision-making methods (MCDMs) to simultaneously correlate PAH-related health risk in several microenvironments with the concentration levels, ethoxyresorufin-O-deethylase (EROD) activity induction equivalency factors, and toxic equivalency factors (TEFs) of the PAHs found in those microenvironments. The results showed that the relative risk associated with PAHs in different air samples depends on the index used. Nevertheless, this approach offers a promising tool that could help identify microenvironments of concern and assist the prioritisation of control strategies.
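One of the indices combined above, the TEF approach, expresses a mixture's potency as a benzo[a]pyrene-equivalent concentration: each PAH's concentration weighted by its toxic equivalency factor and summed. A sketch with invented concentrations and commonly quoted illustrative TEF values:

```python
def toxic_equivalent(concentrations, tefs):
    """Benzo[a]pyrene-equivalent concentration: the sum of each PAH's
    concentration multiplied by its toxic equivalency factor (TEF)."""
    return sum(c * t for c, t in zip(concentrations, tefs))

# Hypothetical sample (ng/m3): B[a]P (TEF 1.0), B[a]A (TEF 0.1),
# pyrene (TEF 0.001); concentrations and TEFs are illustrative only.
teq = toxic_equivalent([1.0, 5.0, 20.0], [1.0, 0.1, 0.001])
```

Note how the abundant but weakly potent pyrene contributes almost nothing to the total, which is why raw concentrations alone misrank microenvironments.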