61 results for Methods for Multi-criteria Evaluation
at Université de Lausanne, Switzerland
Abstract:
The aim of this study was to extract multi-parametric measures characterizing different features of sit-to-stand (Si-St) and stand-to-sit (St-Si) transitions in older persons, using a single inertial sensor attached to the chest. The investigated parameters were transition duration, range of trunk tilt, smoothness of the transition pattern assessed by its fractal dimension, and trunk movement dynamics described by local wavelet energy. A measurement protocol with a Si-St followed by a St-Si postural transition was performed by two groups of participants: the first group (N=79) included frail elderly subjects admitted to a post-acute rehabilitation facility, and the second group (N=27) consisted of healthy community-dwelling elderly persons. Subjects were also evaluated with Tinetti's POMA scale. Compared to the healthy elderly persons, the frail group at baseline had significantly longer Si-St (3.85±1.04 vs. 2.60±0.32, p=0.001) and St-Si (4.08±1.21 vs. 2.81±0.36, p=0.001) transition durations. Frail older persons also had significantly decreased smoothness of the Si-St transition pattern (1.36±0.07 vs. 1.21±0.05, p=0.001) and reduced trunk movement dynamics. Measurements after three weeks of rehabilitation in frail older persons showed that smoothness of the transition pattern had the largest improvement effect size (0.4) and the best discriminative performance. These results demonstrate the potential of such parameters to distinguish older subjects with different functional and health conditions.
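The abstract does not specify how the fractal dimension of the transition pattern was estimated; as an illustration only, a minimal sketch using Higuchi's method on a synthetic trunk-tilt signal (the function name and data are hypothetical) might look like this:

```python
import numpy as np

def higuchi_fd(signal, k_max=10):
    """Estimate the fractal dimension of a 1-D signal with Higuchi's method.

    A smoother transition pattern gives a value closer to 1; a more
    irregular pattern gives a value closer to 2.
    """
    x = np.asarray(signal, dtype=float)
    n = len(x)
    ks = range(1, k_max + 1)
    lengths = []
    for k in ks:
        lk = []
        for m in range(k):
            idx = np.arange(m, n, k)                 # sub-sampled curve starting at m
            if len(idx) < 2:
                continue
            dist = np.abs(np.diff(x[idx])).sum()
            norm = (n - 1) / ((len(idx) - 1) * k)    # rescale to full signal length
            lk.append(dist * norm / k)
        lengths.append(np.mean(lk))
    # fractal dimension = slope of log L(k) versus log(1/k)
    slope, _ = np.polyfit(np.log(1.0 / np.array(list(ks))), np.log(lengths), 1)
    return slope

# illustrative use on a synthetic 3-second trunk-tilt trace sampled at 100 Hz
t = np.linspace(0, 3, 300)
tilt = np.sin(np.pi * t / 3) + 0.05 * np.random.randn(t.size)
print(round(higuchi_fd(tilt), 2))
```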
Abstract:
Drug-eluting microspheres are used for embolization of hypervascular tumors and allow for local, controlled drug release. Although drug release from the microspheres relies on fast ion exchange, so far only slow-releasing in vitro dissolution methods have been correlated with in vivo data. Three in vitro release methods are assessed in this study for their potential to predict the slow in vivo release of sunitinib from chemoembolization spheres to the plasma, and the fast local in vivo release obtained in an earlier study in rabbits. Release in an orbital shaker was slow (t50% = 4.5 h, 84% release) compared with fast release in USP 4 flow-through implant cells (t50% = 1 h, 100% release). Sunitinib release in saline from microspheres enclosed in dialysis inserts was prolonged and incomplete (t50% = 9 days, 68% release) because of low drug diffusion through the dialysis membrane. The slow-release profile fitted best to the low sunitinib plasma AUC following injection of sunitinib-eluting spheres. Although limited by a lack of standardization, release in the orbital shaker fitted best to local in vivo sunitinib concentrations. Drug release in USP flow-through implant cells was too fast to correlate with local concentrations, although this method is preferred for discriminating between different sphere types.
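As a minimal illustration of how a t50% value can be read off a measured cumulative release profile, the sketch below interpolates the 50%-release time from invented sampling points (not the study's data):

```python
import numpy as np

def t50(time_h, released_pct):
    """Interpolate the time at which 50% of the drug load has been released.

    time_h: sampling times in hours; released_pct: cumulative release in percent.
    Returns None if the profile never reaches 50%.
    """
    time_h = np.asarray(time_h, dtype=float)
    released_pct = np.asarray(released_pct, dtype=float)
    if released_pct.max() < 50:
        return None
    return float(np.interp(50.0, released_pct, time_h))

# hypothetical orbital-shaker profile (invented numbers, not the study's data)
hours   = [0, 1, 2, 4, 6, 8, 24]
release = [0, 18, 30, 47, 58, 66, 84]
print(round(t50(hours, release), 1))   # ~4.5 h for this made-up profile
```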
Abstract:
Within Data Envelopment Analysis, several alternative models allow for an environmental adjustment, and most of them deliver divergent results. Decision makers therefore face the difficult task of selecting the most suitable model. This study addresses this difficulty and thereby fills a research gap. First, a two-step web-based survey is conducted. It aims (1) to identify the selection criteria, (2) to prioritize and weight the selection criteria with respect to the goal of selecting the most suitable model, and (3) to collect preferences about which model best fulfils each selection criterion. Second, the Analytic Hierarchy Process is used to quantify the preferences expressed in the survey. Results show that the understandability, the applicability and the acceptability of the alternative models are valid selection criteria. The selection of the most suitable model depends on the preferences of the decision makers with regard to these criteria.
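As a sketch of how the Analytic Hierarchy Process can turn survey judgments into criterion weights, the example below derives priority weights and a consistency ratio from an invented pairwise comparison matrix (the numbers are illustrative, not the survey's results):

```python
import numpy as np

# Pairwise comparison matrix for three selection criteria
# (understandability, applicability, acceptability) on Saaty's 1-9 scale.
# The judgments are illustrative, not the survey's actual data.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# priority weights = normalised principal eigenvector of A
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# consistency ratio (CR < 0.1 is conventionally considered acceptable)
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
cr = ci / 0.58                     # 0.58 = Saaty's random index for n = 3
print(np.round(weights, 3), round(cr, 3))
```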
Abstract:
This paper presents the current state and development of a prototype web-GIS (Geographic Information System) decision support platform intended for application in natural hazards and risk management, mainly for floods and landslides. The platform uses open-source geospatial software and technologies, particularly the Boundless (formerly OpenGeo) framework and its client-side software development kit (SDK). Its main purpose is to assist experts and stakeholders in the decision-making process for evaluating and selecting risk management strategies through an interactive participation approach, integrating a web-GIS interface with a decision support tool based on compromise programming. Access rights and functionality of the platform vary depending on the roles and responsibilities of stakeholders in managing the risk. The application of the prototype is demonstrated on an example case study site, the Malborghetto Valbruna municipality of north-eastern Italy, where flash floods and landslides are frequent and major events occurred in 2003. The preliminary feedback collected from stakeholders in the region is discussed to understand their perspectives on the proposed prototype platform.
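The decision support tool is based on compromise programming; a hypothetical sketch of ranking risk management strategies by their weighted Lp-distance to the ideal point could look like this (the scores and weights are invented):

```python
import numpy as np

def compromise_ranking(scores, weights, p=2):
    """Rank alternatives by their weighted Lp-distance to the ideal point.

    scores: (alternatives x criteria) matrix where higher is better;
    weights: criterion weights summing to 1.
    """
    scores = np.asarray(scores, dtype=float)
    ideal, anti = scores.max(axis=0), scores.min(axis=0)
    # normalised distance to the ideal on each criterion
    regret = (ideal - scores) / np.where(ideal > anti, ideal - anti, 1.0)
    dist = ((weights * regret) ** p).sum(axis=1) ** (1.0 / p)
    return np.argsort(dist), dist      # best alternative first

# three hypothetical risk management strategies scored on cost-effectiveness,
# environmental impact and social acceptance (invented numbers)
scores = [[0.7, 0.4, 0.9],
          [0.5, 0.8, 0.6],
          [0.9, 0.3, 0.4]]
order, dist = compromise_ranking(scores, weights=np.array([0.5, 0.3, 0.2]))
print(order, np.round(dist, 3))
```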
Abstract:
BACKGROUND: Magnetic resonance imaging (MRI) of pacemakers is a relative contraindication because of the risks to the patient from potentially hazardous interactions between the MRI and the pacemaker system. Chest scans (i.e., cardiac magnetic resonance scans) are of particular importance and higher risk. The previously Food and Drug Administration-approved magnetic resonance conditional system includes positioning restrictions, limiting the powerful utility of MRI. OBJECTIVE: To confirm the safety and effectiveness of a pacemaker system designed for safe whole-body MRI without MRI scan positioning restrictions. METHODS: Primary eligibility criteria included standard dual-chamber pacing indications. Patients (n = 263) were randomized in a 2:1 ratio to undergo 16 chest and head scans at 1.5 T between 9 and 12 weeks post-implant (n = 177) or to not undergo MRI (n = 86) post-implant. Evaluation of the pacemaker system occurred immediately before, during (monitoring), and after MRI, at 1 week post-MRI, and at 1 month post-MRI, and at matching intervals for controls. Primary end points measured the MRI-related complication-free rate for safety and compared pacing capture thresholds between MRI and control subjects for effectiveness. RESULTS: There were no MRI-related complications during or after MRI in subjects undergoing MRI (n = 148). Differences in pacing capture threshold values from pre-MRI to 1 month post-MRI were minimal and similar between the MRI and control groups. CONCLUSIONS: This randomized trial demonstrates that the Advisa MRI pulse generator and CapSureFix MRI 5086MRI lead system is safe and effective in the 1.5 T MRI environment, without positioning restrictions for MRI scans or limitations on the body parts scanned.
Abstract:
Nowadays, the joint exploitation of images acquired daily by remote sensing instruments and of images available from archives allows a detailed monitoring of the transitions occurring at the surface of the Earth. These modifications of the land cover generate spectral discrepancies that can be detected via the analysis of remote sensing images. Independently of the origin of the images and of the type of surface change, correct processing of such data requires flexible, robust and possibly nonlinear methods to account for the complex statistical relationships characterizing the pixels of the images. This thesis deals with the development and the application of advanced statistical methods for multi-temporal optical remote sensing image processing tasks. Three different families of machine learning models have been explored and fundamental solutions for change detection problems are provided. In the first part, change detection with user supervision is considered. In a first application, a nonlinear classifier is applied with the intent of precisely delineating flooded regions from a pair of images. In a second case study, the spatial context of each pixel is injected into another nonlinear classifier to obtain a precise mapping of new urban structures. In both cases, the user provides the classifier with examples of what they believe has or has not changed. In the second part, a completely automatic and unsupervised method for precise binary detection of changes is proposed. The technique allows a very accurate mapping without any user intervention, which is particularly useful when readiness and reaction times of the system are a crucial constraint. In the third part, the problem of statistical distributions shifting between acquisitions is studied. Two approaches are studied to transform the pair of bi-temporal images and reduce the differences that are unrelated to changes in land cover. The methods align the distributions of the images, so that the pixel-wise comparison can be carried out with higher accuracy. Furthermore, the second method can deal with images from different sensors, regardless of the dimensionality of the data or the spectral information content. This opens the door to possible solutions for a crucial problem in the field: detecting changes when the images have been acquired by two different sensors.
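As a deliberately simplified stand-in for the unsupervised binary change detection described above (not the thesis' actual method), the sketch below clusters a per-pixel change magnitude into 'change' and 'no change' with a two-class k-means:

```python
import numpy as np

def unsupervised_change_map(img_t1, img_t2, iters=20):
    """Binary change detection from two co-registered multispectral images.

    img_t1, img_t2: arrays of shape (rows, cols, bands).
    The per-pixel spectral change magnitude is split into two clusters;
    the cluster with the larger centre is labelled 'change'.
    """
    diff = np.asarray(img_t2, float) - np.asarray(img_t1, float)
    mag = np.sqrt((diff ** 2).sum(axis=-1)).ravel()
    c = np.array([mag.min(), mag.max()])          # initial cluster centres
    for _ in range(iters):
        labels = np.abs(mag[:, None] - c[None, :]).argmin(axis=1)
        c = np.array([mag[labels == k].mean() if (labels == k).any() else c[k]
                      for k in (0, 1)])
    change_label = int(np.argmax(c))
    return (labels == change_label).reshape(diff.shape[:2])

# tiny synthetic example: one block of pixels brightens between the two dates
t1 = np.random.rand(64, 64, 4)
t2 = t1.copy()
t2[20:40, 20:40] += 0.8
print(unsupervised_change_map(t1, t2).sum())      # 400 changed pixels
```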
Abstract:
This contribution introduces Data Envelopment Analysis (DEA), a performance measurement technique. DEA helps decision makers in the following ways: (1) by calculating an efficiency score, it indicates whether a firm is efficient or has capacity for improvement; (2) by setting target values for inputs and outputs, it calculates how much input must be decreased, or output increased, for a firm to become efficient; (3) by identifying the nature of returns to scale, it indicates whether a firm has to decrease or increase its scale (or size) in order to minimise average total cost; (4) by identifying a set of benchmarks, it specifies which other firms' processes should be analysed so that a firm can improve its own practices. This contribution presents the essentials of DEA, alongside a case study that gives an intuitive understanding of its application. It also introduces Win4DEAP, a software package that conducts efficiency analysis based on the DEA methodology. The methodological background of DEA is presented for more demanding readers. Finally, four advanced topics of DEA are treated: adjustment to the environment, preferences, sensitivity analysis and time series data.
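As an illustration of the optimisation underlying DEA efficiency scores (Win4DEAP itself is not used here), the sketch below solves an input-oriented, constant-returns-to-scale envelopment model with scipy on invented data:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Input-oriented CCR efficiency scores under constant returns to scale.

    X: (firms x inputs) matrix, Y: (firms x outputs) matrix.
    For each firm o, minimise theta subject to
      X' @ lam <= theta * x_o,   Y' @ lam >= y_o,   lam >= 0,
    over the variables (theta, lam).
    """
    X, Y = np.asarray(X, float), np.asarray(Y, float)
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        c = np.zeros(1 + n)
        c[0] = 1.0                                        # objective: minimise theta
        A_in = np.hstack([-X[o].reshape(m, 1), X.T])      # inputs:  X' lam - theta x_o <= 0
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])       # outputs: -Y' lam <= -y_o
        res = linprog(c,
                      A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.concatenate([np.zeros(m), -Y[o]]),
                      bounds=[(0, None)] * (1 + n))
        scores.append(res.x[0])
    return np.array(scores)

# four hypothetical firms, two inputs each, one unit of output each
X = [[2, 4], [3, 3], [4, 2], [5, 5]]
Y = [[1], [1], [1], [1]]
print(np.round(dea_ccr_input(X, Y), 3))   # frontier firms score 1.0, the last about 0.6
```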
Abstract:
In this thesis, we study the use of prediction markets for technology assessment. We particularly focus on their ability to assess complex issues, the design constraints required for such applications and their efficacy compared to traditional techniques. To achieve this, we followed a design science research paradigm, iteratively developing, instantiating, evaluating and refining the design of our artifacts. This allowed us to make multiple contributions, both practical and theoretical. We first showed that prediction markets are adequate for properly assessing complex issues. We also developed a typology of design factors and design propositions for using these markets in a technology assessment context. Then, we showed that they are able to solve some issues related to the R&D portfolio management process and we proposed a roadmap for their implementation. Finally, by comparing the instantiation and the results of a multi-criteria decision method and a prediction market, we showed that the latter are more efficient, while offering similar results. We also proposed a framework for comparing forecasting methods, to identify the constraints based on contingency factors. In conclusion, our research opens a new field of application of prediction markets and should help hasten their adoption by enterprises.
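The abstract does not describe the trading mechanism of the markets; as a purely illustrative sketch, the snippet below prices a two-outcome market with Hanson's logarithmic market scoring rule (LMSR), a common automated market maker for prediction markets (the market question and the liquidity parameter b are invented):

```python
import math

def lmsr_cost(q, b=100.0):
    """Cost function of Hanson's logarithmic market scoring rule (LMSR)."""
    return b * math.log(sum(math.exp(qi / b) for qi in q))

def lmsr_prices(q, b=100.0):
    """Current price (implied probability) of each outcome."""
    z = sum(math.exp(qi / b) for qi in q)
    return [math.exp(qi / b) / z for qi in q]

# two-outcome market, e.g. "technology X reaches mass adoption" vs "it does not"
q = [0.0, 0.0]                                   # outstanding shares per outcome
print([round(p, 3) for p in lmsr_prices(q)])     # [0.5, 0.5] before any trade
cost_before = lmsr_cost(q)
q[0] += 50                                       # a trader buys 50 'adoption' shares
print(round(lmsr_cost(q) - cost_before, 2))      # amount the trader pays
print([round(p, 3) for p in lmsr_prices(q)])     # updated implied probabilities
```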
Abstract:
Measuring school efficiency is a challenging task. First, a performance measurement technique has to be selected. Within Data Envelopment Analysis (DEA), one such technique, alternative models have been developed to deal with environmental variables, and the majority of these models lead to diverging results. Second, the choice of input and output variables to be included in the efficiency analysis is often dictated by data availability, and the choice of variables remains an issue even when data are available. As a result, the choice of technique, model and variables is probably, and ultimately, a political judgement. Multi-criteria decision analysis methods can help decision makers select the most suitable model. The number of selection criteria should remain parsimonious and should not be oriented towards the results of the models, in order to avoid opportunistic behaviour. The selection criteria should also be backed by the literature or by an expert group. Once the most suitable model is identified, the principle of permanence of methods should be applied in order to avoid a change of practices over time. Within DEA, the two-stage model developed by Ray (1991) is the most convincing model allowing for an environmental adjustment. In this model, an efficiency analysis is conducted with DEA, followed by an econometric analysis to explain the efficiency scores. An environmental variable of particular interest, tested in this thesis, is whether a school operates on multiple sites. Results show that being located on more than one site has a negative influence on efficiency. A likely way to mitigate this negative influence would be to improve the use of ICT in school management and teaching. Planning of new schools should also consider the advantages of a single site, which allows a critical size in terms of pupils and teachers to be reached. The fact that underprivileged pupils perform worse than privileged pupils has been public knowledge since Coleman et al. (1966). As a result, underprivileged pupils have a negative influence on school efficiency; this is confirmed by this thesis for the first time in Switzerland. Several countries have developed priority education policies in order to compensate for the negative impact of disadvantaged socioeconomic status on school performance. These policies have failed. As a result, other actions need to be taken. To define these actions, one has to identify the social-class differences which explain why disadvantaged children underperform. Childrearing and literacy practices, health characteristics, housing stability and economic security all influence pupil achievement. Rather than allocating more resources to schools, policymakers should therefore focus on related social policies. For instance, they could define pre-school, family, health, housing and benefits policies in order to improve the conditions for disadvantaged children.
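A minimal sketch of the second stage of a Ray (1991)-style analysis, in which first-stage DEA efficiency scores are regressed on an environmental variable such as a multi-site dummy, could look like this (the scores and dummy values are invented for illustration):

```python
import numpy as np

# Second stage of a Ray (1991)-style analysis: regress first-stage DEA
# efficiency scores on an environmental variable.  The efficiency scores
# and the multi-site dummy below are invented for illustration.
eff = np.array([1.00, 0.92, 0.85, 0.78, 0.95, 0.70, 0.88, 0.74])
multi_site = np.array([0, 0, 1, 1, 0, 1, 0, 1])   # 1 = school operating on several sites

# OLS regression: eff = a + b * multi_site + error
X = np.column_stack([np.ones_like(eff), multi_site])
(a, b), *_ = np.linalg.lstsq(X, eff, rcond=None)
print(f"intercept = {a:.3f}, multi-site effect = {b:.3f}")
# a negative coefficient is consistent with multi-site operation
# lowering measured efficiency
```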
Abstract:
In this thesis we present the design of a systematic, integrated, computer-based approach for detecting potential disruptions from an industry perspective. Following the design science paradigm, we iteratively develop several multi-actor, multi-criteria artifacts dedicated to environment scanning. The contributions of this thesis are both theoretical and practical. We demonstrate the successful use of multi-criteria decision-making methods for technology foresight. Furthermore, we illustrate the design of our artifacts using build-and-evaluate loops supported by a field study of the Swiss mobile payment industry. To increase the relevance of this study, we systematically interview key Swiss experts for each design iteration. As a result, our research provides a realistic picture of the current situation in the Swiss mobile payment market and reveals previously undiscovered weak signals for future trends. Finally, we suggest a generic design process for environment scanning.
Abstract:
Aim: To assess the geographical transferability of niche-based species distribution models fitted with two modelling techniques. Location: Two distinct geographical study areas in Switzerland and Austria, in the subalpine and alpine belts. Methods: Generalized linear and generalized additive models (GLM and GAM) with a binomial probability distribution and a logit link were fitted for 54 plant species, based on topoclimatic predictor variables. These models were then evaluated quantitatively and used for spatially explicit predictions within (internal evaluation and prediction) and between (external evaluation and prediction) the two regions. Comparisons of evaluations and spatial predictions between regions and models were conducted in order to test whether species and methods meet the criteria of full transferability. By full transferability, we mean that: (1) the internal evaluation of models fitted in regions A and B must be similar; (2) a model fitted in region A must retain at least a comparable external evaluation when projected into region B, and vice versa; and (3) internal and external spatial predictions have to match within both regions. Results: The measures of model fit are, on average, 24% higher for GAMs than for GLMs in both regions. However, the differences between internal and external evaluations (AUC coefficient) are also higher for GAMs than for GLMs (a difference of 30% for models fitted in Switzerland and 54% for models fitted in Austria). Transferability, as measured with the AUC evaluation, fails for 68% of the species in Switzerland and 55% in Austria for GLMs (and for 67% and 53% of the species, respectively, for GAMs). For both GAMs and GLMs, the agreement between internal and external predictions is rather weak on average (Kulczynski's coefficient in the range 0.3-0.4), but varies widely among individual species. The dominant pattern is an asymmetrical transferability between the two study regions (a mean decrease of 20% in the AUC coefficient when models are transferred from Switzerland and 13% when they are transferred from Austria). Main conclusions: The large inter-specific variability observed among the 54 study species underlines the need to consider more than a few species to properly test the transferability of species distribution models. The pronounced asymmetry in transferability between the two study regions may be due to peculiarities of these regions, such as differences in the ranges of environmental predictors or the varied impact of land-use history, or to species-specific reasons such as differential phenotypic plasticity, the existence of ecotypes, or a varied dependence on biotic interactions that are not properly incorporated into niche-based models. The lower variation between internal and external evaluation of GLMs compared to GAMs further suggests that overfitting may reduce transferability. Overall, limited geographical transferability calls for caution when projecting niche-based models to assess the fate of species in future environments.
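As an illustrative analogue of the internal and external evaluation described above, the sketch below fits a logistic regression (a GLM with a logit link) on synthetic presence/absence data for one 'region' and computes the AUC both within that region and in a second synthetic region with a different species response; all data and coefficients are invented:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

def synthetic_region(n, coef):
    """Presence/absence of a species driven by two topoclimatic predictors."""
    X = rng.normal(size=(n, 2))
    logit = coef[0] * X[:, 0] + coef[1] * X[:, 1]
    y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))
    return X, y.astype(int)

X_a, y_a = synthetic_region(500, coef=(1.5, -1.0))   # "region A"
X_b, y_b = synthetic_region(500, coef=(0.5, -1.5))   # "region B": different response

# GLM analogue: logistic regression fitted in region A only
model = LogisticRegression().fit(X_a, y_a)

auc_internal = roc_auc_score(y_a, model.predict_proba(X_a)[:, 1])
auc_external = roc_auc_score(y_b, model.predict_proba(X_b)[:, 1])
print(f"internal AUC = {auc_internal:.2f}, external AUC = {auc_external:.2f}")
# comparing internal and external AUC quantifies geographical transferability
```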
Abstract:
Aortic stenosis mostly occurs among old-old patients. Once symptoms appear, prognosis is guarded, with 2-year mortality as high as 50%. Transcatheter Aortic Valve Implantation (TAVI) is a new therapeutic option in patients at very high surgical risk, who are mostly older persons. However, TAVI is associated with some complications, and patient selection remains a challenge. Comprehensive geriatric assessment (CGA) identifies patients with medical and functional problems likely to affect the TAVI post-operative course. Collaboration between cardiologists and geriatricians will likely become a standard approach to enhance the assessment of these frail patients and identify those most likely to benefit from TAVI.
Abstract:
BACKGROUND: Sunitinib (SU) is a multitargeted tyrosine kinase inhibitor with antitumor and antiangiogenic activity. The objective of this trial was to demonstrate the antitumor activity of continuous SU treatment in patients with hepatocellular carcinoma (HCC). PATIENTS AND METHODS: Key eligibility criteria included unresectable or metastatic HCC, no prior systemic anticancer treatment, measurable disease, and Child-Pugh class A or mild Child-Pugh class B liver dysfunction. Patients received 37.5 mg SU daily until progression or unacceptable toxicity. The primary endpoint was progression-free survival at 12 weeks (PFS12). RESULTS: Forty-five patients were enrolled. The median age was 63 years; 89% had Child-Pugh class A disease and 47% had distant metastases. PFS12 was rated successful in 15 patients (33%; 95% confidence interval, 20%-47%). Over the whole trial period, one complete response was achieved, and stable disease was the best response in 40% of patients. The median PFS duration, disease stabilization duration, time to progression, and overall survival time were 1.5, 2.9, 1.5, and 9.3 months, respectively. Grade 3 and 4 adverse events were infrequent. None of the 33 deaths were considered drug related. CONCLUSION: Continuous SU treatment with 37.5 mg daily is feasible and has moderate activity in patients with advanced HCC and mildly to moderately impaired liver function. Under this trial design (>13 PFS12 successes), the therapy is considered promising. This is the first trial describing the clinical effects of continuous dosing of SU in HCC patients on a schedule that is used in an ongoing, randomized, phase III trial comparing it with the current treatment standard, sorafenib (ClinicalTrials.gov identifier, NCT00699374).
Abstract:
BACKGROUND: This study assessed the incidence of perioperative in-stent thrombosis associated with myocardial infarction in patients undergoing major lung resection within 3 months of coronary stenting. METHODS: Retrospective multi-institutional study including all patients undergoing major lung resection (lobectomy or pneumonectomy) within 3 months of coronary stenting with non-drug-eluting stents between 1999 and 2004. RESULTS: There were 32 patients (29 men and 3 women), aged 46 to 82 years. One, two or four coronary stents were deployed in 72%, 22% and 6% of the patients, respectively. The time interval between stenting and lung surgery was <30 days, 30-60 days and 61-90 days in 22%, 53% and 25% of the patients, respectively. All patients received dual antiplatelet therapy after stenting. Perioperative medication consisted of heparin alone or heparin plus aspirin in 34% and 66% of the patients, respectively. Perioperative in-stent thrombosis with myocardial infarction occurred in three patients (9%), with a fatal outcome in one (3%). Twenty patients underwent lung resection after 4 weeks of dual antiplatelet therapy as recommended by the ACC/AHA Guideline Update; however, two of the three perioperative in-stent thromboses occurred in this group of patients. CONCLUSIONS: Major lung resection performed within 3 months of coronary stenting may be complicated by perioperative in-stent thrombosis despite 4 weeks of dual antiplatelet therapy after stenting, as recommended by the ACC/AHA Guideline Update.
Abstract:
Trimethyltin (TMT) is a neurotoxicant known to induce early microglial activation. The present study was undertaken to investigate the role played by these microglial cells in TMT-induced neurotoxicity. The effects of TMT were investigated in monolayer cultures of isolated microglia or in neuron-enriched cultures, and in neuron-microglia and astrocyte-microglia cocultures. The end points used were morphological criteria; evaluation of cell death and cell proliferation; and measurements of tumor necrosis factor-alpha (TNF-alpha), interleukin-6 (IL-6), and nitric oxide (NO) release in culture supernatant. The results showed that, in cultures of microglia, TMT (10⁻⁶ M) caused, after a 5-day treatment, an increased release of TNF-alpha, without affecting microglial shape or cell viability. When microglia were cocultured with astrocytes, TNF-alpha release was decreased to undetectable levels. In contrast, in neuron-microglia cocultures, TNF-alpha levels were found to increase at lower concentrations of TMT (i.e., 10⁻⁸ M). Moreover, at 10⁻⁶ M TMT, microglia displayed further morphological activation, as suggested by process retraction and a decrease in cell size. No morphological activation was observed in cultures of isolated microglial cells or in astrocyte-microglia cocultures. With regard to neurons, 10⁻⁶ M TMT induced about 30% cell death when applied to neuron-enriched cultures, whereas close to 100% neuronal death was observed in neuron-microglia cocultures. In conclusion, whereas astrocytes may dampen microglial activation by decreasing microglial TNF-alpha production, neuronal-microglial interactions lead to enhanced microglial activation. This microglial activation, in turn, exacerbates the neurotoxic effects of TMT. TNF-alpha may play a major role in such cell-cell communications.