97 results for Portlet-based application


Relevance: 30.00%

Abstract:

The application of support vector machine (SVM) classification to combined information from magnetic resonance imaging (MRI) and [F18]fluorodeoxyglucose positron emission tomography (FDG-PET) has been shown to improve detection and differentiation of Alzheimer's disease dementia (AD) and frontotemporal lobar degeneration. To validate this approach for the most frequent dementia syndrome, AD, and to test its applicability to multicenter data, we randomly extracted FDG-PET and MRI data of 28 AD patients and 28 healthy control subjects from the database provided by the Alzheimer's Disease Neuroimaging Initiative (ADNI) and compared them with data of 21 AD patients and 13 control subjects from our own Leipzig cohort. SVM classification using combined volume-of-interest information from FDG-PET and MRI, based on comprehensive quantitative meta-analyses of dementia syndromes, revealed higher discrimination accuracy than single-modality classification. Accuracy rates of up to 88% were obtained for the ADNI dataset and up to 100% for the Leipzig cohort. Classifiers trained on the ADNI data discriminated the Leipzig cohort with an accuracy of 91%. In conclusion, our results suggest that SVM classification based on quantitative meta-analyses of multicenter data is a valid method for individual AD diagnosis. Furthermore, combining imaging information from MRI and FDG-PET might substantially improve the accuracy of AD diagnosis.
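
As a rough illustration of the combined-modality classification idea, the sketch below concatenates hypothetical MRI and FDG-PET volume-of-interest features and cross-validates a linear SVM with scikit-learn; the data, feature counts and kernel choice are placeholders, not the study's actual setup.

```python
# Minimal sketch: volume-of-interest (VOI) features from MRI and FDG-PET are
# concatenated and fed to a linear SVM. All shapes and values are hypothetical.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_subjects = 56                               # e.g., 28 patients + 28 controls
mri_voi = rng.normal(size=(n_subjects, 10))   # hypothetical MRI VOI means
pet_voi = rng.normal(size=(n_subjects, 10))   # hypothetical FDG-PET VOI means
y = np.repeat([0, 1], n_subjects // 2)        # 0 = control, 1 = AD

X = np.hstack([mri_voi, pet_voi])             # combined-modality feature vector
clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
acc = cross_val_score(clf, X, y, cv=5).mean()
print(f"cross-validated accuracy: {acc:.2f}")
```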

Relevance: 30.00%

Abstract:

Species distribution model (SDM) studies suggest that, without control measures, the distribution of many alien invasive plant species (AIS) will increase under climate and land-use changes. Due to limited resources and the large areas colonised by invaders, management and monitoring resources must be prioritised. Choices depend on the conservation value of the invaded areas and can be guided by SDM predictions. Here, we use a hierarchical SDM framework, complemented by a connectivity analysis of AIS distributions, to evaluate current and future conflicts between AIS and areas of high conservation value. We illustrate the framework with three Australian wattle (Acacia) species and patterns of conservation value in Northern Portugal. Results show that protected areas will likely suffer higher pressure from all three Acacia species under future climatic conditions. Due to this higher predicted conflict in protected areas, management might be prioritised for Acacia dealbata and Acacia melanoxylon. Connectivity of suitable areas for AIS inside protected areas is currently lower than across the full study area, but this would change under future environmental conditions. Coupled SDM and connectivity analysis can support resource prioritisation for the anticipation and monitoring of AIS impacts. However, further tests of this framework over a wide range of regions and organisms are still required before wide application.
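
A minimal sketch of the SDM step only (the hierarchical structure and connectivity analysis are not reproduced): a presence/absence model is fitted on environmental predictors and projected onto current and hypothetically shifted future conditions, then summarised inside an invented protected-area mask.

```python
# Hypothetical SDM sketch; all data, predictors and the "climate shift" are invented.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
n_sites = 500
env_now = rng.normal(size=(n_sites, 4))            # e.g., temperature, rainfall, ...
presence = (env_now[:, 0] + rng.normal(scale=0.5, size=n_sites) > 0).astype(int)

sdm = RandomForestClassifier(n_estimators=200, random_state=1).fit(env_now, presence)

env_future = env_now + np.array([1.0, -0.2, 0.0, 0.0])   # crude illustrative shift
suit_now = sdm.predict_proba(env_now)[:, 1]
suit_future = sdm.predict_proba(env_future)[:, 1]

protected = rng.random(n_sites) < 0.2              # hypothetical protected-area mask
print("mean suitability in protected areas, now vs. future:",
      suit_now[protected].mean(), suit_future[protected].mean())
```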

Relevance: 30.00%

Abstract:

To assess the preferred methods to quit smoking among current smokers. Cross-sectional, population-based study conducted in Lausanne between 2003 and 2006, including 988 current smokers. Preference was assessed by questionnaire. Evidence-based (EB) methods were nicotine replacement, bupropion, and physician or group consultations; non-EB methods were acupuncture, hypnosis and autogenic training. EB methods were frequently (physician consultation: 48%, 95% confidence interval (45-51); nicotine replacement therapy: 35% (32-38)) or rarely (bupropion and group consultations: 13% (11-15)) preferred by the participants. Non-EB methods were preferred by a third (acupuncture: 33% (30-36)), a quarter (hypnosis: 26% (23-29)) or a seventh (autogenic training: 13% (11-15)) of respondents. On multivariate analysis, women preferred both EB and non-EB methods more frequently than men (odds ratio and 95% confidence interval: 1.46 (1.10-1.93) and 2.26 (1.72-2.96) for any EB and non-EB method, respectively). Preference for non-EB methods was higher among highly educated participants, whereas no such relationship was found for EB methods. Many smokers are unaware of the full variety of methods to quit smoking. Better information regarding these methods is necessary.
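
For readers unfamiliar with how odds ratios and 95% confidence intervals like those above are obtained, the sketch below derives them from a logistic regression with statsmodels; the variables and simulated responses are placeholders, not the Lausanne survey data.

```python
# Hypothetical illustration of multivariate odds ratios with 95% CIs.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 988
df = pd.DataFrame({
    "female": rng.integers(0, 2, n),
    "high_education": rng.integers(0, 2, n),
})
# Invented outcome: prefers at least one non-evidence-based method.
logit_p = -0.5 + 0.8 * df["female"] + 0.4 * df["high_education"]
df["prefers_non_eb"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

model = sm.Logit(df["prefers_non_eb"],
                 sm.add_constant(df[["female", "high_education"]])).fit(disp=0)
odds_ratios = np.exp(model.params)
ci = np.exp(model.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})
print(pd.concat([odds_ratios.rename("OR"), ci], axis=1))
```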

Relevance: 30.00%

Abstract:

Robust Huber-type regression and testing of linear hypotheses are adapted to the statistical analysis of parallel-line and slope-ratio assays. They are applied in the evaluation of results of several experiments carried out to compare and validate alternatives to animal experimentation based on embryo and cell cultures. The computational procedures needed to apply these robust methods of analysis used the conversational statistical package ROBSYS. Special commands for the analysis of parallel-line and slope-ratio assays have been added to ROBSYS.
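
The ROBSYS commands themselves are not reproducible here, but a hedged sketch of the underlying idea is: a Huber M-estimator fits a parallel-line assay with a common slope, and the log relative potency is read off the intercept shift between the standard and test preparations. Doses, responses and noise levels below are invented.

```python
# Robust (Huber) fit of a simple parallel-line assay; all data are synthetic.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
log_dose = np.tile(np.log([1, 2, 4, 8]), 2)
is_test = np.repeat([0, 1], 4)                      # 0 = standard, 1 = test preparation
response = 2.0 + 1.5 * log_dose + 0.6 * is_test + rng.normal(scale=0.1, size=8)

X = sm.add_constant(np.column_stack([log_dose, is_test]))
fit = sm.RLM(response, X, M=sm.robust.norms.HuberT()).fit()
slope, prep_shift = fit.params[1], fit.params[2]
log_relative_potency = prep_shift / slope           # horizontal shift between parallel lines
print("estimated log relative potency:", log_relative_potency)
```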

Relevance: 30.00%

Abstract:

BACKGROUND: Pharmacy-based case-mix measures are an alternative source of information to the relatively scarce outpatient diagnosis data. However, most published tools use national drug nomenclatures and offer no head-to-head comparisons between drugs-based and diagnoses-based categories. The objective of the study was to test the accuracy of drugs-based morbidity groups derived from the World Health Organization Anatomical Therapeutic Chemical Classification of drugs by checking them against diagnoses-based groups. METHODS: We compared drugs-based categories with their diagnoses-based analogues using anonymous data on 108,915 individuals insured with one of four companies. They were followed throughout 2005 and 2006 and hospitalized at least once during this period. The agreement between the two approaches was measured by weighted kappa coefficients. The reproducibility of the drugs-based morbidity measure over the two years was assessed for all enrollees. RESULTS: Eighty percent used a drug associated with at least one of the 60 morbidity categories derived from drug dispensation. After accounting for inpatient under-coding, fifteen conditions agreed sufficiently with their diagnoses-based counterparts to be considered alternative strategies to diagnoses. In addition, they exhibited good reproducibility and yielded prevalence estimates in accordance with national estimates. For 22 conditions, drugs-based information accurately identified a subset of the population defined by diagnoses. CONCLUSIONS: Most categories provide insurers with health status information that could be exploited for healthcare expenditure prediction or ambulatory cost control, especially when ambulatory diagnoses are not available. However, due to insufficient concordance with their diagnoses-based analogues, their use as morbidity indicators is limited.
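
A minimal sketch of the agreement measure named in the METHODS (weighted kappa between a drugs-based category and its diagnoses-based analogue); the category codes below are invented placeholders, not the insurers' data.

```python
# Weighted kappa between two hypothetical ordinal category assignments.
from sklearn.metrics import cohen_kappa_score

# 0 = condition absent, 1 = probable, 2 = definite (illustrative ordinal scale)
drugs_based     = [0, 1, 2, 2, 0, 1, 0, 2, 1, 0]
diagnoses_based = [0, 1, 2, 1, 0, 1, 0, 2, 2, 0]

kappa = cohen_kappa_score(drugs_based, diagnoses_based, weights="linear")
print(f"weighted kappa: {kappa:.2f}")
```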

Relevance: 30.00%

Abstract:

How can the implementation of national and international guidelines be summarized and facilitated in clinical practice? To address these issues, we present a summary of dyslipidemia management for general practitioners. To achieve these aims, we adopted strategies based on international and national guidelines and focused on clinical applications, which implies choosing specific options, such as the use of a cardiovascular risk score and of specific therapies as first options.

Relevance: 30.00%

Abstract:

Landscape is an example of a non-market good for which no metrics exist to measure its quality. The paper proposes an original methodology to nevertheless estimate scope variables in those circumstances, which then allows a better test of whether people's willingness to pay for such a good is sensitive to the scope. The methodology is based on techniques developed in the context of multicriteria decision analysis. It is applied to assess the quality of the landscape of several Swiss alpine resorts. This assessment is then used as an explanatory variable in a hedonic price function to explain the rents of apartments and to derive an implicit price of landscape quality.
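
As a hedged sketch of the hedonic step, the example below enters an MCDA-style landscape quality score into an OLS rent equation and reads its coefficient as the implicit price; all variables and values are invented, not the Swiss resort data.

```python
# Hypothetical hedonic rent regression with a landscape quality index.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 300
df = pd.DataFrame({
    "surface_m2": rng.uniform(40, 140, n),
    "rooms": rng.integers(1, 6, n),
    "landscape_score": rng.uniform(0, 1, n),   # MCDA-derived quality index (placeholder)
})
df["rent"] = (300 + 12 * df["surface_m2"] + 80 * df["rooms"]
              + 250 * df["landscape_score"] + rng.normal(scale=50, size=n))

hedonic = smf.ols("rent ~ surface_m2 + rooms + landscape_score", data=df).fit()
print("implicit price of landscape quality:", hedonic.params["landscape_score"])
```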

Relevance: 30.00%

Abstract:

Purpose: To develop and evaluate a practical method for the quantification of signal-to-noise ratio (SNR) on coronary MR angiograms (MRA) acquired with parallel imaging. Materials and Methods: To quantify the spatially varying noise due to parallel imaging reconstruction, a new method has been implemented incorporating image data acquisition followed by a fast noise scan during which radio-frequency pulses, cardiac triggering and navigator gating are disabled. The performance of this method was evaluated in a phantom study where SNR measurements were compared with those of a reference standard (multiple repetitions). Subsequently, SNR of myocardium and posterior skeletal muscle was determined on in vivo human coronary MRA. Results: In a phantom, the SNR measured using the proposed method deviated less than 10.1% from the reference method for small geometry factors (<= 2). In vivo, the noise scan for a 10 min coronary MRA acquisition was acquired in 30 s. Higher signal and lower SNR, due to spatially varying noise, were found in myocardium compared with posterior skeletal muscle. Conclusion: SNR quantification based on a fast noise scan is a validated and easy-to-use method when applied to three-dimensional coronary MRA obtained with parallel imaging as long as the geometry factor remains low.
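
A hedged sketch of the basic SNR computation implied above: signal is taken from a region of interest in the reconstructed image, and the local noise level from the same region of the separate noise-only scan. The arrays, ROI and noise model are synthetic placeholders, not the paper's reconstruction pipeline.

```python
# ROI-based SNR estimate using a separate noise scan (synthetic data).
import numpy as np

rng = np.random.default_rng(5)
image = 100 + 5 * rng.standard_normal((64, 64))      # reconstructed magnitude image (placeholder)
noise_scan = 5 * rng.standard_normal((64, 64))       # RF pulses off, no triggering/gating

roi = (slice(20, 40), slice(20, 40))                 # hypothetical myocardial ROI
signal = image[roi].mean()
noise_sd = noise_scan[roi].std(ddof=1)               # local (spatially varying) noise estimate

snr = signal / noise_sd
print(f"ROI SNR estimate: {snr:.1f}")
```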

Relevance: 30.00%

Abstract:

The threat of punishment usually promotes cooperation. However, punishing itself is costly, rare in nonhuman animals, and humans who punish often finish with low payoffs in economic experiments. The evolution of punishment has therefore been unclear. Recent theoretical developments suggest that punishment has evolved in the context of reputation games. We tested this idea in a simple helping game with observers and with punishment and punishment reputation (experimentally controlling for other possible reputational effects). We show that punishers fully compensate their costs as they receive help more often. The more likely defection is punished within a group, the higher the level of within-group cooperation. These beneficial effects perish if the punishment reputation is removed. We conclude that reputation is key to the evolution of punishment.

Relevance: 30.00%

Abstract:

The 2009-2010 Data Fusion Contest organized by the Data Fusion Technical Committee of the IEEE Geoscience and Remote Sensing Society was focused on the detection of flooded areas using multi-temporal and multi-modal images. Both high spatial resolution optical and synthetic aperture radar data were provided. The goal was not only to identify the best algorithms (in terms of accuracy), but also to investigate the further improvement derived from decision fusion. This paper presents the four awarded algorithms and the conclusions of the contest, investigating both supervised and unsupervised methods and the use of multi-modal data for flood detection. Interestingly, a simple unsupervised change detection method provided accuracy similar to that of the supervised approaches, and a digital elevation model-based predictive method yielded a comparable projected change detection map without using post-event data.
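
To make "simple unsupervised change detection" concrete, here is a minimal sketch under invented data: the pre- and post-event images are differenced and the difference magnitude is thresholded (Otsu's method here). It is an illustration of the generic idea, not any contestant's algorithm.

```python
# Unsupervised change detection by image differencing and automatic thresholding.
import numpy as np
from skimage.filters import threshold_otsu

rng = np.random.default_rng(6)
pre = rng.normal(size=(100, 100))
post = pre.copy()
post[30:60, 30:60] += 3.0                  # simulated flooded patch

diff = np.abs(post - pre)
change_map = diff > threshold_otsu(diff)   # True where change is detected
print("changed pixels:", int(change_map.sum()))
```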

Relevance: 30.00%

Abstract:

Diagnosis of several neurological disorders is based on the detection of typical pathological patterns in the electroencephalogram (EEG). This is a time-consuming task requiring significant training and experience. Automatic detection of these EEG patterns would greatly assist in quantitative analysis and interpretation. We present a method that automatically detects epileptiform events and discriminates them from eye blinks, based on features derived using a novel application of independent component analysis (ICA). The algorithm was trained and cross-validated using seven EEGs with epileptiform activity. For epileptiform events with compensation for eye blinks, the sensitivity was 65 +/- 22% at a specificity of 86 +/- 7% (mean +/- SD). With feature extraction by PCA or classification of raw data, specificity was reduced to 76% and 74%, respectively, for the same sensitivity. On exactly the same data, the commercially available software Reveal had a maximum sensitivity of 30% and a concurrent specificity of 77%. Our algorithm performed well at detecting epileptiform events in this preliminary test and offers a flexible tool that is intended to be generalized to the simultaneous classification of many waveforms in the EEG.
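
The sketch below shows only the general shape of such a pipeline (ICA-derived features feeding a classifier) on synthetic epochs and random labels; the paper's actual feature definitions, eye-blink compensation and detector are more involved and are not reproduced here.

```python
# ICA-based features + SVM classifier on synthetic EEG epochs (illustrative only).
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n_epochs, n_channels, n_samples = 100, 8, 128
epochs = rng.standard_normal((n_epochs, n_channels, n_samples))
labels = rng.integers(0, 2, n_epochs)        # 0 = background, 1 = epileptiform (random here)

# Unmix each epoch into independent components, then use simple per-component
# statistics as features.
features = []
for ep in epochs:
    ica = FastICA(n_components=n_channels, random_state=0, max_iter=500)
    sources = ica.fit_transform(ep.T)        # shape: (n_samples, n_components)
    features.append(np.hstack([sources.max(axis=0), sources.std(axis=0)]))
features = np.array(features)

print("cross-validated accuracy:",
      cross_val_score(SVC(), features, labels, cv=5).mean())
```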

Relevance: 30.00%

Abstract:

The availability of high-resolution Digital Elevation Models (DEMs) at a regional scale enables the analysis of topography with a high level of detail. Hence, a DEM-based geomorphometric approach becomes more accurate for detecting potential rockfall sources. Potential rockfall source areas are identified according to the slope angle distribution deduced from the high-resolution DEM, crossed with other information extracted from geological and topographic maps in GIS format. The slope angle distribution can be decomposed into several Gaussian distributions that can be considered characteristic of morphological units: rock cliffs, steep slopes, footslopes and plains. Terrain is considered a potential rockfall source when its slope angle lies above an angle threshold, defined where the Gaussian distribution of the morphological unit "Rock cliffs" becomes dominant over that of "Steep slopes". In addition to this analysis, the cliff outcrops indicated by the topographic maps were added. However, these contain "flat areas", so only slope angle values above the mode of the Gaussian distribution of the morphological unit "Steep slopes" were considered. An application of this method is presented over the entire Canton of Vaud (3200 km²), Switzerland. The results were compared with rockfall sources observed in the field and identified through orthophoto analysis in order to validate the method. Finally, the influence of the cell size of the DEM is examined by applying the methodology to six different DEM resolutions.
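
A hedged sketch of the decomposition step: a Gaussian mixture is fitted to the slope-angle distribution and the source-area threshold is taken where the steepest ("rock cliff") component starts to dominate the "steep slopes" component. The slope angles, component count and parameters below are synthetic, not derived from a real DEM.

```python
# Decompose a synthetic slope-angle distribution into Gaussian components and
# locate the angle where the "rock cliff" component becomes dominant.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(8)
slopes = np.concatenate([
    rng.normal(3, 2, 4000),     # plains
    rng.normal(15, 5, 3000),    # footslopes
    rng.normal(32, 6, 2000),    # steep slopes
    rng.normal(55, 7, 1000),    # rock cliffs
]).clip(0, 90).reshape(-1, 1)

gmm = GaussianMixture(n_components=4, random_state=0).fit(slopes)
order = np.argsort(gmm.means_.ravel())
steep, cliff = order[-2], order[-1]

# Scan slope angles and find where the cliff component's responsibility first
# exceeds that of the steep-slope component.
grid = np.linspace(0, 90, 901).reshape(-1, 1)
resp = gmm.predict_proba(grid)
threshold = grid[np.argmax(resp[:, cliff] > resp[:, steep])][0]
print(f"slope-angle threshold for potential rockfall sources: {threshold:.1f} deg")
```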

Relevance: 30.00%

Abstract:

Nowadays, the joint exploitation of images acquired daily by remote sensing instruments and of images available from archives allows a detailed monitoring of the transitions occurring at the surface of the Earth. These modifications of the land cover generate spectral discrepancies that can be detected via the analysis of remote sensing images. Independently of the origin of the images and of the type of surface change, correct processing of such data implies the adoption of flexible, robust and possibly nonlinear methods, to correctly account for the complex statistical relationships characterizing the pixels of the images. This thesis deals with the development and the application of advanced statistical methods for multi-temporal optical remote sensing image processing tasks. Three different families of machine learning models have been explored and fundamental solutions for change detection problems are provided. In the first part, change detection with user supervision has been considered. In a first application, a nonlinear classifier has been applied with the intent of precisely delineating flooded regions from a pair of images. In a second case study, the spatial context of each pixel has been injected into another nonlinear classifier to obtain a precise mapping of new urban structures. In both cases, the user provides the classifier with examples of what he believes has changed or not. In the second part, a completely automatic and unsupervised method for precise binary detection of changes has been proposed. The technique allows a very accurate mapping without any user intervention, which is particularly useful when the readiness and reaction time of the system are a crucial constraint. In the third part, the problem of statistical distributions shifting between acquisitions is studied. Two approaches to transform the pair of bi-temporal images and reduce their differences unrelated to changes in land cover are studied. The methods align the distributions of the images, so that the pixel-wise comparison can be carried out with higher accuracy. Furthermore, the second method can deal with images from different sensors, regardless of the dimensionality of the data or the spectral information content. This opens the door to possible solutions for a crucial problem in the field: detecting changes when the images have been acquired by two different sensors.
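
As a toy illustration of the third line of work (aligning the statistical distributions of two acquisitions before comparing them), the sketch below applies plain histogram matching to synthetic single-band images; the thesis methods are more general (multi-sensor, multi-dimensional) and are not reproduced here.

```python
# Align the distribution of a second acquisition to the first via histogram
# matching, then compare pixel-wise (synthetic single-band images).
import numpy as np
from skimage.exposure import match_histograms

rng = np.random.default_rng(9)
image_t1 = rng.normal(loc=0.0, scale=1.0, size=(100, 100))
image_t2 = 1.3 * rng.normal(loc=0.5, scale=1.2, size=(100, 100))   # shifted statistics

image_t2_aligned = match_histograms(image_t2, image_t1)
diff = np.abs(image_t2_aligned - image_t1)      # pixel-wise comparison after alignment
print("mean absolute difference after alignment:", diff.mean())
```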

Relevance: 30.00%

Abstract:

This thesis proposes a set of adaptive broadcast solutions and an adaptive data replication solution to support the deployment of P2P applications. P2P applications are an emerging type of distributed application running on top of P2P networks. Typical P2P applications are video streaming, file sharing, etc. While interesting because they are fully distributed, P2P applications suffer from several deployment problems due to the nature of the environment in which they operate. Indeed, defining an application on top of a P2P network often means defining an application where peers contribute resources in exchange for their ability to use the P2P application. For example, in a P2P file sharing application, while the user is downloading some file, the P2P application is in parallel serving that file to other users. Such peers could have limited hardware resources, e.g., CPU, bandwidth and memory, or the end user could decide a priori to limit the resources dedicated to the P2P application. In addition, a P2P network is typically immersed in an unreliable environment, where communication links and processes are subject to message losses and crashes, respectively. To support P2P applications, this thesis proposes a set of services that address some underlying constraints related to the nature of P2P networks. The proposed services include a set of adaptive broadcast solutions and an adaptive data replication solution that can be used as the basis of several P2P applications. Our data replication solution makes it possible to increase availability and to reduce the communication overhead. The broadcast solutions aim at providing a communication substrate encapsulating one of the key communication paradigms used by P2P applications: broadcast. Our broadcast solutions typically aim at offering reliability and scalability to some upper layer, be it an end-to-end P2P application or another system-level layer, such as a data replication layer. Our contributions are organized in a protocol stack made of three layers. In each layer, we propose a set of adaptive protocols that address specific constraints imposed by the environment. Each protocol is evaluated through a set of simulations. The adaptiveness of our solutions relies on the fact that they take into account the constraints of the underlying system in a proactive manner. To model these constraints, we define an environment approximation algorithm that allows us to obtain an approximate view of the system or part of it. This approximate view includes the topology and the reliability of the components expressed in probabilistic terms. To adapt to the underlying system constraints, the proposed broadcast solutions route messages through tree overlays so as to maximize the broadcast reliability. Here, the broadcast reliability is expressed as a function of the reliability of the selected paths and of the use of available resources. These resources are modeled in terms of quotas of messages reflecting the receiving and sending capacities at each node. To allow deployment in a large-scale system, we take into account the memory available at processes by limiting the view they have to maintain about the system. Using this partial view, we propose three scalable broadcast algorithms, which are based on a propagation overlay that tends toward the global tree overlay and adapts to some constraints of the underlying system.
At a higher level, this thesis also proposes a data replication solution that is adaptive both in terms of replica placement and in terms of request routing. At the routing level, this solution takes the unreliability of the environment into account in order to maximize the reliable delivery of requests. At the replica placement level, the dynamically changing origin and frequency of read/write requests are analyzed in order to define a set of replicas that minimizes the communication cost.
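
A hedged sketch of the probabilistic reliability view described above: in a tree overlay, the probability that a node receives a broadcast from the root can be modeled as the product of the reliabilities of the links on its path. The topology, link probabilities and helper function below are invented for illustration and do not reproduce the thesis protocols or quota handling.

```python
# Per-node delivery probability along a tree overlay with probabilistic links.
children = {"root": ["a", "b"], "a": ["c", "d"], "b": ["e"],
            "c": [], "d": [], "e": []}
link_reliability = {("root", "a"): 0.95, ("root", "b"): 0.9,
                    ("a", "c"): 0.8, ("a", "d"): 0.85, ("b", "e"): 0.7}

def delivery_probabilities(node="root", prob=1.0, acc=None):
    """Walk the tree, multiplying link reliabilities along each root-to-node path."""
    if acc is None:
        acc = {}
    acc[node] = prob
    for child in children[node]:
        delivery_probabilities(child, prob * link_reliability[(node, child)], acc)
    return acc

probs = delivery_probabilities()
print(probs)
print("expected fraction of nodes reached:", sum(probs.values()) / len(probs))
```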

Relevance: 30.00%

Abstract:

The integration of the differential equation of Fick's second law, applied to the diffusion of chemical elements in a semi-infinite solid, made it easier to estimate the residence time of olivine megacrysts in contact with the host lava. The results of this research show the existence of two groups of olivine. The first remained in contact with the magmatic liquid for 19 to 22 days, while the second remained so for only 5 to 9 days. This distinction is correlative with the one based on qualitative observation.
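
A worked sketch of the residence-time estimate implied above: the error-function solution of Fick's second law for a semi-infinite solid, C(x, t) = C_s + (C_0 - C_s) erf(x / (2*sqrt(D*t))), is inverted for t. The diffusivity, depth and concentrations below are illustrative placeholders, not the paper's measurements.

```python
# Invert the erf solution of Fick's second law for the diffusion time t.
import numpy as np
from scipy.special import erfinv

D = 1.0e-15          # hypothetical diffusivity, m^2/s
x = 50e-6            # depth of the measured composition, m
C0, Cs, Cx = 0.90, 0.80, 0.86    # initial, surface (rim) and measured concentrations

ratio = (Cx - Cs) / (C0 - Cs)              # = erf(x / (2 * sqrt(D * t)))
t = x**2 / (4 * D * erfinv(ratio)**2)      # seconds
print(f"estimated residence time: {t / 86400:.1f} days")
```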