898 results for Multi Kidney Exchange Problem KEP
Abstract:
This work studies the multi-label classification of turns in Simple English Wikipedia talk pages into dialog acts. The dataset used was created and multi-labeled by Ferschke et al. (2012). The first part analyses the dependencies between labels in order to examine the coherence of the annotation and to choose a classification method. A multi-label classification is then computed after transforming the problem into binary relevance. Regarding features, whereas Ferschke et al. (2012) use features such as uni-, bi- and trigrams, the time distance between turns, or the indentation level of a turn, other features are considered here: lemmas, part-of-speech tags and the meaning of verbs (according to WordNet). The dataset authors applied approaches such as Naive Bayes or Support Vector Machines. As an alternative, the present paper proposes to use and extend linear discriminant analysis with Schoenberg transformations which, like kernel methods, transform the original Euclidean distances into other Euclidean distances in a space of high dimensionality.
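The binary-relevance step mentioned above can be illustrated with a short sketch: each dialog-act label is treated as an independent binary classification problem. The feature matrix, labels and classifier below are placeholders, not the authors' actual pipeline (which couples discriminant analysis with Schoenberg transformations).

```python
# Minimal sketch of the binary-relevance decomposition: one independent
# binary classifier per dialog-act label. Data and classifier are toy
# placeholders, not the features or method used in the paper.
import numpy as np
from sklearn.linear_model import LogisticRegression

def binary_relevance_fit(X, Y):
    """X: (n_samples, n_features); Y: (n_samples, n_labels) binary matrix."""
    models = []
    for k in range(Y.shape[1]):
        clf = LogisticRegression(max_iter=1000)
        clf.fit(X, Y[:, k])          # one binary problem per label
        models.append(clf)
    return models

def binary_relevance_predict(models, X):
    return np.column_stack([m.predict(X) for m in models])

# toy usage with random data
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))
Y = (rng.random(size=(100, 5)) > 0.7).astype(int)
models = binary_relevance_fit(X, Y)
Y_hat = binary_relevance_predict(models, X)
```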
Abstract:
In fetal brain MRI, most high-resolution reconstruction algorithms rely on brain segmentation as a preprocessing step. Manual brain segmentation, however, is highly time-consuming and therefore not a realistic solution. In this work, we assess the performance of Multiple Atlas Fusion (MAF) strategies on a large dataset to address this problem automatically. Firstly, we show that MAF significantly increases the accuracy of brain segmentation compared with a single-atlas strategy. Secondly, we show that MAF compares favorably with the most recent approach (Dice above 0.90). Finally, we show that MAF could in turn improve reconstruction quality.
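The Dice score quoted above is a standard overlap measure between a predicted and a reference segmentation. A minimal sketch, with illustrative masks rather than data from the study:

```python
# Minimal sketch of the Dice overlap used to score segmentation quality.
# `pred` and `ref` are illustrative binary brain masks, not study data.
import numpy as np

def dice(pred, ref):
    """Dice coefficient between two binary masks of identical shape."""
    pred = pred.astype(bool)
    ref = ref.astype(bool)
    intersection = np.logical_and(pred, ref).sum()
    total = pred.sum() + ref.sum()
    return 1.0 if total == 0 else 2.0 * intersection / total

# toy usage
pred = np.zeros((4, 4, 4), dtype=bool); pred[1:3, 1:3, 1:3] = True
ref  = np.zeros((4, 4, 4), dtype=bool); ref[1:3, 1:3, :3]  = True
print(round(dice(pred, ref), 3))
```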
Abstract:
The multi-element determination of Al, Cr, Mn, Ni, Cu, Zn, Cd, Ba, Pb, SO4(2-) and Cl- in riverine water samples was accomplished by inductively coupled plasma mass spectrometry (ICP-MS). The sample was passed through a column containing the anionic resin AG1-X8 and the metals were determined directly. The retained anionic species were eluted, and SO4(2-) and Cl- were determined at m/z 48 and 35, corresponding to the SO+ and Cl+ ions formed in the plasma. Accuracy for the metals was assessed by analysing the certified reference material TM-26 (National Water Research Institute of Canada). Results for SO4(2-) and Cl- were in agreement with those obtained by turbidimetry and spectrophotometry. LODs of 0.1 µg l-1 for Cd, Ba and Pb; 0.2 µg l-1 for Al, Mn and Cu; 0.5 µg l-1 for Cr; 0.9 µg l-1 for Zn; 2.0 µg l-1 for Ni; 60 µg l-1 for S; and 200 µg l-1 for Cl were attained.
Abstract:
Simultaneous localization and mapping (SLAM) is a very important problem in mobile robotics. Many solutions have been proposed over the last two decades; nevertheless, few studies have considered the use of multiple sensors simultaneously. The solution proposed here combines several data sources with the aid of an Extended Kalman Filter (EKF). Two approaches are proposed. The first is to run the ordinary EKF SLAM algorithm for each data source separately, in parallel, and then, at the end of each step, fuse the results into one solution. The second is to use multiple data sources simultaneously in a single filter. A comparison of the computational complexity of the two methods is also presented: the first method is almost four times faster than the second.
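A minimal sketch of the second approach (several measurement sources fused within one filter), using a linear Kalman update for brevity; the actual nonlinear EKF SLAM models and landmark state augmentation are not reproduced, and all numbers are illustrative:

```python
# Minimal sketch of fusing two measurement sources in a single Kalman
# filter by applying sequential updates. A linear filter stands in for
# the nonlinear EKF SLAM formulation; values are toy numbers.
import numpy as np

def kf_update(x, P, z, H, R):
    """Standard Kalman measurement update."""
    S = H @ P @ H.T + R                       # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# toy state: 2-D robot position observed by two sensors with different noise
x = np.array([0.0, 0.0]); P = np.eye(2)
H = np.eye(2)
z_a, R_a = np.array([1.0, 0.9]), 0.5 * np.eye(2)   # e.g. odometry-based fix
z_b, R_b = np.array([1.1, 1.0]), 0.1 * np.eye(2)   # e.g. laser-based fix

x, P = kf_update(x, P, z_a, H, R_a)   # update with the first source
x, P = kf_update(x, P, z_b, H, R_b)   # then with the second source
print(x)
```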
Abstract:
Global warming mitigation has recently become a priority worldwide. A large body of literature dealing with energy-related problems has focused on reducing greenhouse gas emissions at an engineering scale. In contrast, the minimization of climate change at a wider macroeconomic level has so far received much less attention. We investigate here how to mitigate global warming by making changes to an economy. To this end, we make use of a systematic tool that combines three methods: linear programming, environmentally extended input-output models, and life cycle assessment principles. The problem of identifying key economic sectors that contribute significantly to global warming is posed in mathematical terms as a bi-criteria linear program that seeks to simultaneously optimize the total economic output and the total life cycle CO2 emissions. We have applied this approach to the European Union economy, finding that significant reductions in global warming potential can be attained by regulating specific economic sectors. Our tool is intended to aid policymakers in the design of more effective public policies for achieving the environmental and economic targets sought.
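The bi-criteria linear program can be sketched with a weighted-sum scalarization over a toy input-output model; the matrices, intensities, capacities and weights below are invented for illustration and are not the EU data used in the study.

```python
# Illustrative weighted-sum scalarization of a bi-criteria LP over an
# environmentally extended input-output (Leontief) model. All numbers
# are made up; the study's actual sectors and data are not reproduced.
import numpy as np
from scipy.optimize import linprog

A = np.array([[0.10, 0.20, 0.05],      # inter-sector technical coefficients
              [0.15, 0.05, 0.10],
              [0.05, 0.10, 0.08]])
d = np.array([10.0, 15.0, 8.0])        # minimum final demand per sector
e = np.array([0.8, 0.3, 1.5])          # CO2 intensity per unit of output
cap = np.array([60.0, 60.0, 60.0])     # sector capacity (keeps the LP bounded)

w_econ, w_co2 = 1.0, 0.5               # scalarization weights (one Pareto point)

# maximize  w_econ * sum(x) - w_co2 * e.x  ==  minimize the negative
c = -(w_econ * np.ones(3) - w_co2 * e)
# Leontief feasibility (I - A) x >= d  ->  -(I - A) x <= -d
A_ub = -(np.eye(3) - A)
b_ub = -d
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, ci) for ci in cap])
print(res.x)                            # sector outputs at this Pareto point
```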
Abstract:
We investigated the prognostic effects of high-flux hemodialysis (HFHD) and low-flux hemodialysis (LFHD) in patients with chronic kidney disease (CKD). Both an electronic and a manual search were performed, based on rigorous inclusion and exclusion criteria, to retrieve high-quality, relevant clinical studies from various scientific literature databases. Comprehensive Meta-Analysis 2.0 (CMA 2.0) was used for the quantitative analysis. We initially retrieved 227 studies from the database search. Following a multi-step screening process, eight high-quality studies were selected for our meta-analysis. These eight studies included 4967 patients with CKD (2416 in the HFHD group, 2551 in the LFHD group). The results of our meta-analysis showed that the all-cause death rate in the HFHD group was significantly lower than that in the LFHD group (OR = 0.704, 95% CI = 0.533-0.929, P = 0.013). Additionally, the cardiovascular death rate in the HFHD group was significantly lower than that in the LFHD group (OR = 0.731, 95% CI = 0.616-0.866, P < 0.001). These results clearly show that HFHD decreases all-cause and cardiovascular death rates in patients with CKD and that HFHD can therefore be adopted as one of the first-line therapy choices for CKD.
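The pooled odds ratios quoted above are the kind of quantity produced by inverse-variance (fixed-effect) pooling of study-level log odds ratios. A minimal sketch with invented 2x2 counts (the actual analysis was run in CMA 2.0):

```python
# Minimal sketch of inverse-variance (fixed-effect) pooling of study-level
# odds ratios. The per-study counts are invented for illustration only.
import numpy as np

# (events_HFHD, n_HFHD, events_LFHD, n_LFHD) per study -- toy values
studies = [(30, 300, 45, 310), (22, 250, 35, 260), (18, 200, 25, 210)]

log_ors, weights = [], []
for a, n1, c, n2 in studies:
    b, d = n1 - a, n2 - c
    log_or = np.log((a * d) / (b * c))
    var = 1/a + 1/b + 1/c + 1/d            # Woolf variance of the log OR
    log_ors.append(log_or)
    weights.append(1 / var)

log_ors, weights = np.array(log_ors), np.array(weights)
pooled = np.sum(weights * log_ors) / np.sum(weights)
se = np.sqrt(1 / np.sum(weights))
or_, lo, hi = np.exp(pooled), np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se)
print(f"pooled OR = {or_:.3f} (95% CI {lo:.3f}-{hi:.3f})")
```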
Abstract:
Vitamin D deficiency is common in the chronic kidney disease (CKD) population. CKD has been recognized as a significant public health problem, and CKD patients are at increased risk of total and cardiovascular morbidity and mortality. There are increasing epidemiological data suggesting that vitamin D deficiency may play a role in the overall morbidity and mortality associated with CKD. The vitamin D hormonal system is classically implicated in the regulation of calcium homeostasis and bone metabolism, but there is ample evidence to support the claim that the extrarenal conversion of 25(OH)D to 1,25(OH)2D has significant biological roles beyond those traditionally ascribed to vitamin D. Based on the current state of evidence, this review gives an update on novel biological and clinical insights relevant to the steroid hormone vitamin D, specifically in patients with kidney disease.
Abstract:
This thesis investigates the pricing of liquidity risks on the London Stock Exchange. The Liquidity-Adjusted Capital Asset Pricing Model (LCAPM) developed by Acharya and Pedersen (2005) is applied to test the influence of various liquidity risks on stock returns on the London Stock Exchange; it provides a unified framework for testing liquidity risks. All common stocks listed and delisted over the period 2000 to 2014 are included in the data sample. The study incorporates three different measures of liquidity (Percent Quoted Spread, Amihud (2002) and Turnover), chosen because of the multi-dimensional nature of liquidity. A firm fixed-effects panel regression is applied for the estimation of the LCAPM, and the results are robust to Fama-MacBeth regressions. The results indicate that liquidity risks in the form of (i) the level of liquidity, (ii) commonality in liquidity, (iii) flight to liquidity, and (iv) the depressed wealth effect and market return, as well as aggregate liquidity risk, are priced on the London Stock Exchange. However, the results are sensitive to the choice of liquidity measure.
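Of the three liquidity proxies, the Amihud (2002) measure is the simplest to illustrate: the average ratio of absolute daily return to daily traded value. A minimal sketch with illustrative series rather than LSE data:

```python
# Minimal sketch of the Amihud (2002) illiquidity measure.
# `returns` and `dollar_volume` are illustrative series, not LSE data.
import numpy as np
import pandas as pd

def amihud_illiquidity(returns: pd.Series, dollar_volume: pd.Series) -> float:
    """Mean of |r_t| / traded value over days with non-zero volume."""
    valid = dollar_volume > 0
    return (returns[valid].abs() / dollar_volume[valid]).mean()

# toy usage over one month of daily observations
rng = np.random.default_rng(1)
r = pd.Series(rng.normal(0, 0.01, 21))        # daily returns
v = pd.Series(rng.uniform(1e6, 5e6, 21))      # daily traded value
print(amihud_illiquidity(r, v))
```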
Abstract:
A research project submitted to the Faculty of Extension, University of Alberta in partial fulfillment of the requirements for the degree of Master of Arts in Communications and Technology in 2005.
Abstract:
The Picture Exchange Communication System (PECS) is an augmentative and alternative communication system that improves communication and decreases problem behaviors in children with developmental disabilities and autism. The mediator model is a validated approach that clinicians use to train parents to deliver evidence-based interventions, but parental non-adherence to treatment recommendations is a documented problem. This qualitative study investigated clinician-perceived factors that influence parental adherence to PECS recommendations. Three focus groups (n=8) were conducted with Speech-Language Pathologists and Behavior Therapists experienced in providing parents with PECS recommendations, and constant comparison analysis was used. In general, clinicians believed that PECS was complex to implement. Thirty-one bridges were identified to overcome this complexity; twenty-two barriers and six other factors also affected parental adherence. Strategies to address these factors were proposed based on a review of the literature. Future research will validate these findings with parents and a larger sample size.
Characterizing Dynamic Optimization Benchmarks for the Comparison of Multi-Modal Tracking Algorithms
Abstract:
Population-based metaheuristics, such as particle swarm optimization (PSO), have been employed to solve many real-world optimization problems. Although it is often sufficient to find a single solution to these problems, there are cases where identifying multiple, diverse solutions can be beneficial or even required. Some of these problems are further complicated by a change in their objective function over time; this type of optimization is referred to as dynamic, multi-modal optimization. Algorithms that exploit multiple optima in a search space are known as niching algorithms. Although numerous dynamic niching algorithms have been developed, their performance is often measured solely on their ability to find a single, global optimum. Furthermore, the comparisons often use synthetic benchmarks whose landscape characteristics are generally limited and unknown. This thesis provides a landscape analysis of the dynamic benchmark functions commonly developed for multi-modal optimization. The benchmark analysis reveals that the mechanisms responsible for dynamism in the current dynamic benchmarks do not significantly affect landscape features, suggesting a lack of representation of problems whose landscape features vary over time. This analysis is used in a comparison of current niching algorithms to identify the effects that specific landscape features have on niching performance. Two performance metrics are proposed to measure both the scalability and the accuracy of the niching algorithms. The algorithm comparison demonstrates which algorithms are best suited to a variety of dynamic environments. The comparison also examines each algorithm's niching behaviour and analyzes the range of, and trade-off between, scalability and accuracy when tuning the algorithms' respective parameters. These results contribute to the understanding of current niching techniques as well as the problem features that ultimately dictate their success.
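The niching algorithms compared in the thesis build on a population-based search loop such as global-best PSO. A minimal sketch of that base loop, with an illustrative objective and without any of the thesis's dynamic benchmarks or niching mechanisms:

```python
# Minimal global-best PSO loop. Objective, bounds and coefficients are
# illustrative; no niching or dynamic-benchmark machinery is included.
import numpy as np

def pso(f, dim=2, n_particles=20, iters=100, lo=-5.0, hi=5.0,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, (n_particles, dim))        # positions
    v = np.zeros_like(x)                               # velocities
    pbest, pbest_f = x.copy(), np.apply_along_axis(f, 1, x)
    gbest = pbest[np.argmin(pbest_f)]
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        fx = np.apply_along_axis(f, 1, x)
        improved = fx < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], fx[improved]
        gbest = pbest[np.argmin(pbest_f)]
    return gbest, f(gbest)

print(pso(lambda p: np.sum(p**2)))  # toy unimodal objective
```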
Abstract:
We study the problem of testing the error distribution in a multivariate linear regression (MLR) model. The tests are functions of appropriately standardized multivariate least squares residuals whose distribution is invariant to the unknown cross-equation error covariance matrix. Empirical multivariate skewness and kurtosis criteria are then compared to simulation-based estimates of their expected value under the hypothesized distribution. Special cases considered include testing multivariate normal, Student t, normal mixture and stable error models. In the Gaussian case, finite-sample versions of the standard multivariate skewness and kurtosis tests are derived. To do this, we exploit simple, double and multi-stage Monte Carlo test methods. For non-Gaussian distribution families involving nuisance parameters, confidence sets are derived for the nuisance parameters and the error distribution. The procedures considered are evaluated in a small simulation experiment. Finally, the tests are applied to an asset pricing model with observable risk-free rates, using monthly returns on New York Stock Exchange (NYSE) portfolios over five-year subperiods from 1926 to 1995.
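The empirical multivariate skewness and kurtosis criteria are Mardia-type statistics computed from standardized residuals. A minimal sketch on an illustrative residual matrix rather than the standardized least-squares residuals of the MLR model:

```python
# Minimal sketch of Mardia-type multivariate skewness and kurtosis
# statistics, computed here on simulated residuals for illustration.
import numpy as np

def mardia_stats(U):
    """U: (n, p) residual matrix; returns (skewness b1p, kurtosis b2p)."""
    n, p = U.shape
    Uc = U - U.mean(axis=0)
    S_inv = np.linalg.inv(np.cov(Uc, rowvar=False, bias=True))
    G = Uc @ S_inv @ Uc.T                   # Mahalanobis cross-products
    b1p = (G ** 3).sum() / n**2             # multivariate skewness
    b2p = (np.diag(G) ** 2).sum() / n       # multivariate kurtosis
    return b1p, b2p

# toy usage with simulated Gaussian residuals
rng = np.random.default_rng(0)
print(mardia_stats(rng.normal(size=(200, 3))))
```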
Abstract:
Ethical debates on architecture have traditionally addressed three recurring themes: the beauty, the solidity and the usefulness of the architectural work. More recently, new knowledge from the fields of project management and sustainable development has made important contributions to the understanding of project governance. However, the process of carrying out architectural projects is shaped by the specific characteristics of the building industry, an industry that operates through temporary teams formed by highly specialized organizations. A systemic analysis of case studies makes it possible to identify the complexity of the teams involved in architectural projects. In this article we examine three characteristics of the building industry: (i) the organizational complexity of the client, (ii) the influence of stakeholders, and (iii) the varying degrees of proximity between the architect and the users. Identifying the various organizational configurations highlights the effects of these characteristics on the formal and informal relationships between the architect and the clients, as well as those among all the stakeholders. The architect is compelled to work on a project that increasingly becomes an object of negotiation among the various stakeholders. Faced with this challenge, the architect must take into account the complexity of the relationships among all the actors within the project's social system and create scenarios conducive to participation, negotiation and exchange among them.