44 results for Multi Criteria Analysis
at Université de Lausanne, Switzerland
Abstract:
Within Data Envelopment Analysis, several alternative models allow for an environmental adjustment, and the majority of them deliver divergent results. Decision makers therefore face the difficult task of selecting the most suitable model. This study addresses that difficulty and thereby fills a research gap. First, a two-step web-based survey is conducted. It aims (1) to identify the selection criteria, (2) to prioritize and weight the selection criteria with respect to the goal of selecting the most suitable model and (3) to collect preferences about which model best fulfils each selection criterion. Second, the Analytic Hierarchy Process is used to quantify the preferences expressed in the survey. Results show that the understandability, the applicability and the acceptability of the alternative models are valid selection criteria. The selection of the most suitable model depends on the preferences of the decision makers with regard to these criteria.
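The Analytic Hierarchy Process step described in this abstract can be sketched as follows. This is a minimal illustration, not the study's implementation: the pairwise judgement values are invented, and only the three criteria named in the abstract are assumed.

```python
def ahp_weights(A, iters=200):
    """Approximate the principal eigenvector of a pairwise comparison
    matrix by power iteration; returns (weights, consistency index)."""
    n = len(A)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    # Estimate lambda_max from Aw / w, averaged over components.
    Aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam = sum(Aw[i] / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)  # consistency index; 0 for a consistent matrix
    return w, ci

# Illustrative judgements on Saaty's 1-9 scale for the three criteria:
# understandability vs applicability vs acceptability (hypothetical values).
A = [[1, 2, 4],
     [0.5, 1, 2],
     [0.25, 0.5, 1]]
weights, ci = ahp_weights(A)
```

With this perfectly consistent example matrix, the weights come out as 4/7, 2/7 and 1/7 and the consistency index is zero; survey-derived matrices would normally be slightly inconsistent, which the consistency index flags.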
Abstract:
PURPOSE: Intraoperative adverse events significantly influence morbidity and mortality of laparoscopic colorectal resections. This study assessed changes in the occurrence of such intraoperative adverse events over an 11-year period. METHODS: An analysis of 3,928 patients undergoing elective laparoscopic colorectal resection was performed, based on the prospective database of the Swiss Association of Laparoscopic and Thoracoscopic Surgery. RESULTS: Overall, 377 intraoperative adverse events occurred in 329 patients (overall incidence of 8.4 %). Of 377 events, 163 (43 %) were surgical complications and 214 (57 %) were nonsurgical adverse events. Surgical complications were iatrogenic injury to solid organs (n = 63; incidence of 1.6 %), bleeding (n = 62; 1.6 %), lesion by puncture (n = 25; 0.6 %), and intraoperative anastomotic leakage (n = 13; 0.3 %). Of note, 11 % of intraoperative organ/puncture lesions requiring re-intervention were missed intraoperatively. Nonsurgical adverse events were problems with equipment (n = 127; 3.2 %), anesthetic problems (n = 30; 0.8 %), and various other events (n = 57; 1.5 %). Over time, the rate of intraoperative adverse events decreased, but not significantly. Bleeding complications significantly decreased (p = 0.015), and equipment problems increased (p = 0.036). However, the rate of adverse events requiring conversion significantly decreased with time (p < 0.001). Patients with an intraoperative adverse event had a significantly higher rate of postoperative local and general morbidity (41.2 and 32.9 % vs. 18.0 and 17.2 %, p < 0.001 and p < 0.001, respectively). CONCLUSIONS: Intraoperative surgical complications and adverse events in laparoscopic colorectal resections did not change significantly over time and are associated with increased postoperative morbidity.
Abstract:
This contribution introduces Data Envelopment Analysis (DEA), a performance measurement technique. DEA helps decision makers for the following reasons: (1) By calculating an efficiency score, it indicates if a firm is efficient or has capacity for improvement; (2) By setting target values for input and output, it calculates how much input must be decreased or output increased in order to become efficient; (3) By identifying the nature of returns to scale, it indicates if a firm has to decrease or increase its scale (or size) in order to minimise the average total cost; (4) By identifying a set of benchmarks, it specifies which other firms' processes a firm should analyse in order to improve its own practices. This contribution presents the essentials of DEA, alongside a case study to intuitively understand its application. It also introduces Win4DEAP, a software package that conducts efficiency analysis based on DEA methodology. The methodological background of DEA is presented for more demanding readers. Finally, four advanced topics of DEA are treated: adjustment to the environment, preferences, sensitivity analysis and time series data.
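Points (1) and (2) above can be illustrated in the simplest DEA setting. This sketch is independent of the Win4DEAP package mentioned in the abstract and uses invented data: with a single input and a single output under constant returns to scale, the efficiency score reduces to each firm's productivity ratio divided by the best observed ratio.

```python
# Hypothetical (input, output) data for four firms.
firms = {"A": (10, 20), "B": (8, 24), "C": (12, 18), "D": (6, 12)}

# Productivity ratio of each firm, and the best observed ratio.
ratios = {name: out / inp for name, (inp, out) in firms.items()}
best = max(ratios.values())

# Efficiency score: 1.0 for the benchmark, below 1.0 otherwise.
scores = {name: r / best for name, r in ratios.items()}

# Input targets (point 2): how far each firm could shrink its input and
# still produce its current output if it matched the benchmark.
targets = {name: firms[name][0] * scores[name] for name in firms}
```

Here firm B is the benchmark; firm C scores 0.5, meaning it could in principle produce its current output with half its input. The general multi-input, multi-output case requires solving a linear programme per firm, which is what dedicated DEA software automates.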
Abstract:
Measuring school efficiency is a challenging task. First, a performance measurement technique has to be selected. Within Data Envelopment Analysis (DEA), one such technique, alternative models have been developed in order to deal with environmental variables. The majority of these models lead to diverging results. Second, the choice of input and output variables to be included in the efficiency analysis is often dictated by data availability. The choice of the variables remains an issue even when data is available. As a result, the choice of technique, model and variables is probably, and ultimately, a political judgement. Multi-criteria decision analysis methods can help the decision makers to select the most suitable model. The number of selection criteria should remain parsimonious and not be oriented towards the results of the models in order to avoid opportunistic behaviour. The selection criteria should also be backed by the literature or by an expert group. Once the most suitable model is identified, the principle of permanence of methods should be applied in order to avoid a change of practices over time. Within DEA, the two-stage model developed by Ray (1991) is the most convincing model allowing for an environmental adjustment. In this model, an efficiency analysis is conducted with DEA, followed by an econometric analysis to explain the efficiency scores. An environmental variable of particular interest, tested in this thesis, is whether a school's operations are held on multiple sites. Results show that being located on more than one site has a negative influence on efficiency. A likely way to mitigate this negative influence would be to improve the use of ICT in school management and teaching. The planning of new schools should also consider the advantages of a single site, which allows a critical size in terms of pupils and teachers to be reached.
The fact that underprivileged pupils perform worse than privileged pupils has been well documented since Coleman et al. (1966). As a result, underprivileged pupils have a negative influence on school efficiency. This is confirmed by this thesis for the first time in Switzerland. Several countries have developed priority education policies in order to compensate for the negative impact of disadvantaged socioeconomic status on school performance. These policies have failed. As a result, other actions need to be taken. In order to define these actions, one has to identify the social-class differences which explain why disadvantaged children underperform. Childrearing and literacy practices, health characteristics, housing stability and economic security influence pupil achievement. Rather than allocating more resources to schools, policymakers should therefore focus on related social policies. For instance, they could define pre-school, family, health, housing and benefits policies in order to improve the conditions for disadvantaged children.
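The second stage of the Ray (1991) two-stage model described above regresses DEA efficiency scores on environmental variables. A minimal sketch, with made-up efficiency scores and a hypothetical 0/1 multi-site indicator (not the thesis data):

```python
def ols(x, y):
    """Ordinary least squares for a single regressor: (slope, intercept)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

# Made-up DEA efficiency scores; x = 1 if the school operates on several sites.
multi_site = [0, 0, 0, 1, 1, 1]
efficiency = [0.92, 0.88, 0.90, 0.78, 0.74, 0.76]
effect, base = ols(multi_site, efficiency)
```

With a binary regressor, the slope is simply the difference in mean efficiency between multi-site and single-site schools; a negative slope, as in this toy example, corresponds to the negative multi-site effect reported in the abstract.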
Abstract:
We analysed the relationship between changes in land cover patterns and Eurasian otter occurrence over the course of about 20 years (1985-2006) using multi-temporal Species Distribution Models (SDMs). The study area includes five river catchments covering most of the otter's Italian range. Land cover and topographic data were used as proxies of the ecological requirements of the otter within a 300-m buffer around river courses. We used species presence, pseudo-absence data, and environmental predictors to build past (1985) and current (2006) SDMs by applying an ensemble procedure through the BIOMOD modelling package. The performance of each model was evaluated by measuring the area under the curve (AUC) of the receiver-operating characteristic (ROC). Multi-temporal analyses of species distribution and land cover maps were performed by comparing the maps produced for 1985 and 2006. The ensemble procedure provided a good overall modelling accuracy, revealing that elevation and slope affected the otter's distribution in the past; in contrast, land cover predictors, such as cultivations and forests, were more important in the present period. During the transition period, 20.5% of the area became suitable, with 76% of the new otter presence data being located in these newly available areas. The multi-temporal analysis suggested that the quality of otter habitat improved in the last 20 years owing to the expansion of forests and to the reduction of cultivated fields in riparian belts. The evidence presented here stresses the great potential of riverine habitat restoration and environmental management for the future expansion of the otter in Italy.
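The AUC evaluation mentioned above has a simple probabilistic reading: it is the probability that a randomly chosen presence receives a higher suitability score than a randomly chosen pseudo-absence. A minimal sketch with invented scores (BIOMOD computes this internally; this is only the underlying idea):

```python
def auc(presence_scores, absence_scores):
    """Area under the ROC curve, computed as the probability that a random
    presence is scored above a random (pseudo-)absence; ties count 0.5."""
    wins = 0.0
    for p in presence_scores:
        for a in absence_scores:
            if p > a:
                wins += 1.0
            elif p == a:
                wins += 0.5
    return wins / (len(presence_scores) * len(absence_scores))

# Illustrative habitat-suitability scores from a fitted SDM (made up).
presence = [0.9, 0.8, 0.4]
pseudo_absence = [0.7, 0.3, 0.2]
score = auc(presence, pseudo_absence)
```

An AUC of 1.0 means perfect separation of presences from pseudo-absences and 0.5 means no better than chance; "good overall modelling accuracy" in the abstract corresponds to values well above 0.5.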
Abstract:
Due to the existence of free software and pedagogical guides, the use of Data Envelopment Analysis (DEA) has been further democratized in recent years. Nowadays, it is quite usual for practitioners and decision makers with no or little knowledge in operational research to run their own efficiency analysis. Within DEA, several alternative models allow for an environmental adjustment. Four alternative models, each user-friendly and easily accessible to practitioners and decision makers, are performed using empirical data of 90 primary schools in the State of Geneva, Switzerland. Results show that the majority of alternative models deliver divergent results. From a political and a managerial standpoint, these diverging results could lead to potentially ineffective decisions. As no consensus emerges on the best model to use, practitioners and decision makers may be tempted to select the model that is right for them, in other words, the model that best reflects their own preferences. Further studies should investigate how an appropriate multi-criteria decision analysis method could help decision makers to select the right model.
Abstract:
Ipilimumab and tremelimumab are human monoclonal antibodies (Abs) against cytotoxic T-lymphocyte antigen-4 (CTLA-4). Ipilimumab was the first agent to show a statistically significant overall survival benefit in advanced melanoma patients. Currently, there is no proven association between the BRAFV600 mutation and the disease control rate in response to ipilimumab. This analysis was carried out to assess whether BRAFV600 and NRAS mutation status affects the clinical outcome of anti-CTLA-4-treated melanoma patients. This is a retrospective multi-center analysis of 101 patients, with confirmed BRAF and NRAS mutation status, treated with anti-CTLA-4 antibodies from December 2006 until August 2012. The median overall survival (OS), defined from the start of anti-CTLA-4 Ab treatment to death or last follow-up, of BRAFV600- or NRAS-mutant patients (n = 62) was 10.12 months (95% CI 6.78-13.2) compared to 8.26 months (95% CI 6.02-19.9) in the BRAFV600/NRASwt subpopulation (n = 39) (p = 0.67). The median OS of NRAS-mutated patients (n = 24) was 12.1 months; although prolonged compared to the median OS of BRAF-mutated patients (n = 38, mOS = 8.03 months) or BRAFV600/NRASwt patients (n = 39, mOS = 8.26 months), the difference did not reach statistical significance (p = 0.56). Sixty-nine patients were able to complete 4 cycles of anti-CTLA-4 treatment. Of the 24 patients treated with selective BRAF or MEK inhibitors, 16 received anti-CTLA-4 Abs following either a BRAF or a MEK inhibitor, with only 8 of them able to finish 4 cycles of treatment. Based on our results, there is no difference in median OS in patients treated with anti-CTLA-4 Abs, implying that BRAF/NRAS mutation status alone is not sufficient to predict the outcome of patients treated with anti-CTLA-4 Abs.
Abstract:
1. Digital elevation models (DEMs) are often used in landscape ecology to retrieve elevation or first-derivative terrain attributes such as slope or aspect in the context of species distribution modelling. However, DEM-derived variables are scale-dependent and, given the increasing availability of very high-resolution (VHR) DEMs, their ecological relevance must be assessed for different spatial resolutions. 2. In a study area located in the Swiss Western Alps, we computed VHR DEM-derived variables related to morphometry, hydrology and solar radiation. Based on an original spatial resolution of 0.5 m, we generated DEM-derived variables at 1, 2 and 4 m spatial resolutions, applying a Gaussian pyramid. Their associations with local climatic factors, measured by sensors (direct and ambient air temperature, air humidity and soil moisture), as well as ecological indicators derived from species composition, were assessed with multivariate generalized linear models (GLM) and mixed models (GLMM). 3. Specific VHR DEM-derived variables showed significant associations with climatic factors. In addition to slope, aspect and curvature, the underused wetness and ruggedness indices modelled measured ambient humidity and soil moisture, respectively. Remarkably, the spatial resolution of VHR DEM-derived variables had a significant influence on the models' strength, with coefficients of determination decreasing with coarser resolutions or showing a local optimum with a 2 m resolution, depending on the variable considered. 4. These results support the relevance of using multi-scale DEM variables to provide surrogates for important climatic variables such as humidity, moisture and temperature, offering suitable alternatives to direct measurements for evolutionary ecology studies at a local scale.
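The multi-resolution workflow above can be sketched with a toy DEM. This is a simplification under stated assumptions: plain 2x2 block averaging stands in for the Gaussian-pyramid smoothing-and-decimation step, and slope is taken from central differences; the elevations are synthetic, not the study's 0.5 m data.

```python
def coarsen(dem, cell):
    """Halve the resolution by averaging 2x2 blocks (a crude stand-in for
    the Gaussian-pyramid smoothing and decimation step)."""
    n = len(dem) // 2
    coarse = [[(dem[2*i][2*j] + dem[2*i][2*j+1] +
                dem[2*i+1][2*j] + dem[2*i+1][2*j+1]) / 4.0
               for j in range(n)] for i in range(n)]
    return coarse, cell * 2

def slope(dem, cell, i, j):
    """Slope (rise over run) from central differences at interior cell (i, j)."""
    dzdx = (dem[i][j+1] - dem[i][j-1]) / (2 * cell)
    dzdy = (dem[i+1][j] - dem[i-1][j]) / (2 * cell)
    return (dzdx ** 2 + dzdy ** 2) ** 0.5

# Synthetic 0.5 m DEM: a plane rising 0.1 m per metre eastwards.
cell = 0.5
dem = [[0.1 * cell * j for j in range(8)] for i in range(8)]
coarse, coarse_cell = coarsen(dem, cell)
```

On this perfectly planar surface the slope is identical at both resolutions; on real terrain, coarsening smooths out local relief, which is exactly why the abstract finds resolution-dependent model strength.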
Abstract:
In this thesis, we study the use of prediction markets for technology assessment. We particularly focus on their ability to assess complex issues, the design constraints required for such applications and their efficacy compared to traditional techniques. To achieve this, we followed a design science research paradigm, iteratively developing, instantiating, evaluating and refining the design of our artifacts. This allowed us to make multiple contributions, both practical and theoretical. We first showed that prediction markets are adequate for properly assessing complex issues. We also developed a typology of design factors and design propositions for using these markets in a technology assessment context. Then, we showed that they are able to solve some issues related to the R&D portfolio management process and we proposed a roadmap for their implementation. Finally, by comparing the instantiation and the results of a multi-criteria decision method and a prediction market, we showed that the latter are more efficient, while offering similar results. We also proposed a framework for comparing forecasting methods, to identify the constraints based on contingency factors. In conclusion, our research opens a new field of application of prediction markets and should help hasten their adoption by enterprises.
Abstract:
This study was designed to check the equivalence of the ZKPQ-50-CC (Spanish and French versions) across Internet on-line (OL) and paper-and-pencil (PP) answer formats. Differences in means and deviations were significant in some scales, but effect sizes were minimal except for Sociability in the Spanish sample. Alpha reliabilities are also very similar in both versions, with no significant differences between formats. A robust factorial structure was found for the two formats and the average congruency coefficients were 0.98. The goodness-of-fit indexes obtained by confirmatory factor analysis are very similar to those obtained in the ZKPQ-50-CC validation study and do not differ between the two formats. The multi-group analysis confirms the equivalence of the OL and PP formats in both countries. Overall, these results support the validity and reliability of the Internet as a method in investigations using the ZKPQ-50-CC.
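Two of the statistics named above, the factor congruency coefficient and alpha reliability, have short closed forms. A minimal sketch with invented loadings and item scores (not the ZKPQ-50-CC data):

```python
def congruence(a, b):
    """Tucker's congruence coefficient between two factor-loading vectors."""
    num = sum(x * y for x, y in zip(a, b))
    den = (sum(x * x for x in a) * sum(y * y for y in b)) ** 0.5
    return num / den

def cronbach_alpha(items):
    """Cronbach's alpha; `items` is a list of per-item score lists over
    the same respondents (sample variance, ddof = 1)."""
    k = len(items)
    var = lambda xs: sum((x - sum(xs) / len(xs)) ** 2 for x in xs) / (len(xs) - 1)
    totals = [sum(col) for col in zip(*items)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Hypothetical loadings of the same factor in the OL and PP formats.
ol_loadings = [0.7, 0.6, 0.5]
pp_loadings = [0.72, 0.58, 0.51]
c = congruence(ol_loadings, pp_loadings)
```

A congruence coefficient near 1 (0.98 in the abstract) indicates that the same factor emerges in both formats; proportional loading vectors give exactly 1.0.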
Abstract:
In this thesis we present the design of a systematic, integrated, computer-based approach for detecting potential disruptions from an industry perspective. Following the design science paradigm, we iteratively develop several multi-actor, multi-criteria artifacts dedicated to environment scanning. The contributions of this thesis are both theoretical and practical. We demonstrate the successful use of multi-criteria decision-making methods for technology foresight. Furthermore, we illustrate the design of our artifacts using build-and-evaluate loops supported by a field study of the Swiss mobile payment industry. To increase the relevance of this study, we systematically interview key Swiss experts for each design iteration. As a result, our research provides a realistic picture of the current situation in the Swiss mobile payment market and reveals previously undiscovered weak signals of future trends. Finally, we suggest a generic design process for environment scanning.
Abstract:
This paper presents the current state and development of a prototype web-GIS (Geographic Information System) decision support platform intended for application in natural hazard and risk management, mainly for floods and landslides. The platform uses open-source geospatial software and technologies, particularly the Boundless (formerly OpenGeo) framework and its client-side software development kit (SDK). Its main purpose is to assist experts and stakeholders in the decision-making process for the evaluation and selection of risk management strategies through an interactive participation approach, integrating a web-GIS interface with a decision support tool based on a compromise programming approach. The access rights and functionality of the platform vary depending on the roles and responsibilities of stakeholders in managing the risk. The application of the prototype platform is demonstrated with an example case study site: the Malborghetto Valbruna municipality of north-eastern Italy, where flash floods and landslides are frequent, with major events having occurred in 2003. The preliminary feedback collected from stakeholders in the region is discussed to understand their perspectives on the proposed prototype platform.
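The compromise programming approach mentioned above ranks alternatives by their weighted distance to an ideal point. A minimal sketch under stated assumptions: the strategy names, criteria and scores are invented, all criteria are treated as to-be-maximised, and values are range-normalised per criterion.

```python
def compromise_rank(scores, weights, p=2):
    """Rank alternatives by weighted L_p distance to the ideal point.
    `scores` maps alternative -> list of criterion values (all maximised);
    each criterion is normalised by its ideal-worst range."""
    columns = list(zip(*scores.values()))
    ideal = [max(c) for c in columns]
    worst = [min(c) for c in columns]

    def distance(vals):
        return sum((w * (i - v) / (i - a)) ** p
                   for v, i, a, w in zip(vals, ideal, worst, weights)) ** (1 / p)

    return sorted(scores, key=lambda alt: distance(scores[alt]))

# Hypothetical strategy scores on (risk reduction, cost acceptability).
strategies = {
    "dike raising":    [0.9, 0.2],
    "retention basin": [0.6, 0.8],
    "relocation":      [0.3, 0.9],
}
ranking = compromise_rank(strategies, weights=[0.5, 0.5])
```

With equal weights, the balanced "retention basin" strategy ranks first because it is closest to the ideal point on both criteria at once; changing the weights to reflect stakeholder roles changes the ranking, which is the participatory element the platform exposes.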
Abstract:
BACKGROUND: Patient-centered care (PCC) has been recognized as a marker of quality in health service delivery. In policy documents, PCC is often used interchangeably with other models of care. There is a wide literature about PCC, but there is a lack of evidence about which model is the most appropriate for maternity services specifically. AIM: We sought to critically appraise the literature to identify which definition of PCC is most relevant for maternity services. METHODS: The four-step approach used to identify definitions of PCC was to 1) search electronic databases using key terms (1995-2011), 2) cross-reference key papers, 3) search specific journals, and 4) search the grey literature. Four papers and two books met our inclusion criteria. ANALYSIS: A four-criteria critical appraisal tool developed for the review was used to appraise the papers and books. MAIN RESULTS: Of the six identified definitions, Shaller's definition met the majority of the four criteria outlined and seems to be the most relevant to maternity services because it includes physiologic conditions as well as pathology, psychological aspects, a nonmedical approach to care, the greater involvement of family and friends, and strategies to implement PCC. CONCLUSION: This review highlights Shaller's definition of PCC as the one that would be the most inclusive of all women using maternity services. Future research should concentrate on evaluating programs that support PCC in maternity services, and on testing and validating this model of care.
Abstract:
Sampling issues represent a topic of ongoing interest to the forensic science community, essentially because of their crucial role in laboratory planning and working protocols. For this purpose, the forensic literature has described thorough (Bayesian) probabilistic sampling approaches, which are now widely implemented in practice. They allow one, for instance, to obtain probability statements that parameters of interest (e.g., the proportion of a seizure of items that present particular features, such as an illegal substance) satisfy particular criteria (e.g., a threshold or an otherwise limiting value). Currently, many approaches allow one to derive probability statements relating to a population proportion, but questions on how a forensic decision maker - typically a client of a forensic examination or a scientist acting on behalf of a client - ought actually to decide about a proportion or a sample size have remained largely unexplored to date. The research presented here addresses methodology from decision theory that may help to cope usefully with the wide range of sampling issues typically encountered in forensic science applications. The procedures explored in this paper enable scientists to address a variety of concepts, such as the (net) value of sample information, the (expected) value of sample information and the (expected) decision loss. All of these aspects relate directly to questions that are regularly encountered in casework. Besides probability theory and Bayesian inference, the proposed approach requires some additional elements from decision theory that may increase the effort needed for practical implementation. In view of this challenge, the present paper emphasises the merits of graphical modelling concepts, such as decision trees and Bayesian decision networks, which can support forensic scientists in applying the methodology in practice. How this may be achieved is illustrated with several examples. The graphical devices invoked here also serve the purpose of supporting the discussion of the similarities, differences and complementary aspects of existing Bayesian probabilistic sampling criteria and the decision-theoretic approach proposed throughout this paper.
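The Bayesian probability statements about a seizure proportion described above rest on beta-binomial conjugacy: a Beta(a, b) prior updated with k positives out of n sampled items gives a Beta(a + k, b + n - k) posterior. A minimal Monte Carlo sketch with invented numbers (the paper's decision-theoretic layer, losses and value-of-information, is not reproduced here):

```python
import random

def posterior_prob_exceeds(k, n, theta0, a=1.0, b=1.0, draws=200_000, seed=42):
    """Posterior probability that the population proportion exceeds theta0,
    under a Beta(a, b) prior after observing k positives among n sampled
    items, estimated by sampling from the Beta(a + k, b + n - k) posterior."""
    rng = random.Random(seed)
    hits = sum(rng.betavariate(a + k, b + n - k) > theta0
               for _ in range(draws))
    return hits / draws

# All 10 inspected items tested positive: with a uniform prior, the exact
# value of P(theta > 0.5 | data) is 1 - 0.5**11, roughly 0.9995.
p = posterior_prob_exceeds(k=10, n=10, theta0=0.5)
```

Statements of this form ("the probability that more than half the seizure is illicit exceeds 0.999") are exactly the probability statements the sampling literature provides; the decision-theoretic extension discussed in the paper then weighs such probabilities against losses to choose a sample size.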