899 results for "information bottleneck method"


Relevance: 30.00%

Abstract:

Neuroimaging studies typically compare experimental conditions using average brain responses, thereby overlooking the stimulus-related information conveyed by the distributed spatio-temporal patterns of single-trial responses. Here, we take advantage of this rich single-trial information to decode stimulus-related signals in two event-related potential (ERP) studies. Our method models the statistical distribution of the voltage topographies with a Gaussian Mixture Model (GMM), which reduces the dataset to a small number of representative voltage topographies. The degree to which these topographies are present across trials at specific latencies is then used to classify experimental conditions. We tested the algorithm with a cross-validation procedure in two independent EEG datasets. In the first ERP study, we classified left- versus right-hemifield checkerboard stimuli for the upper and lower visual hemifields. In the second ERP study, in which functional differences cannot be assumed, we classified initial versus repeated presentations of visual objects. With minimal a priori information, the GMM provides neurophysiologically interpretable features (the voltage topographies themselves) as well as dynamic information about brain function. The method can in principle be applied to any ERP dataset to test the functional relevance of specific time periods for stimulus processing, the predictability of subjects' behavior and cognitive states, and the discrimination between healthy and clinical populations.
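The core idea, fitting a GMM to single-trial voltage topographies and using component assignments as classification features, can be sketched with scikit-learn on synthetic data (the electrode count, template maps and noise level below are illustrative assumptions, not the study's data):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic single-trial "voltage topographies": n_trials x n_electrodes.
# Two hypothetical template maps stand in for condition-specific topographies.
templates = np.array([[1.0, -1.0, 0.5, -0.5],
                      [-1.0, 1.0, -0.5, 0.5]])
labels = rng.integers(0, 2, size=200)
X = templates[labels] + 0.1 * rng.standard_normal((200, 4))

# Reduce the dataset to a small number of representative topographies.
gmm = GaussianMixture(n_components=2, random_state=0).fit(X)

# Each trial is assigned the topography (component) most present in it;
# these assignments can then feed a classifier of experimental conditions.
assignments = gmm.predict(X)
print(gmm.means_.shape)  # (2, 4): two representative 4-electrode maps
```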

Relevance: 30.00%

Abstract:

OBJECTIVE: Although intracranial hypertension is one of the important prognostic factors after head injury, increased intracranial pressure (ICP) may also be observed in patients with a favourable outcome. We studied whether the value of ICP monitoring can be augmented by indices describing cerebrovascular pressure-reactivity and pressure-volume compensatory reserve derived from the ICP and arterial blood pressure (ABP) waveforms. METHOD: 96 patients with intracranial hypertension were studied retrospectively: 57 with fatal outcome and 39 with favourable outcome. ABP and ICP waveforms were recorded. The index of cerebrovascular reactivity (PRx) and the index of cerebrospinal compensatory reserve (RAP) were calculated as moving correlation coefficients, between slow waves of ABP and ICP and between slow waves of ICP pulse amplitude and mean ICP, respectively. The magnitude of the 'slow waves' was derived using low-pass spectral filtration of the ICP signal. RESULTS: The most significant difference was found in the magnitude of slow waves, which was persistently higher in patients with a favourable outcome (p < 0.00004). In patients who died, ICP was significantly higher (p < 0.0001) and cerebrovascular pressure-reactivity (described by PRx) was compromised (p < 0.024). In the same patients, the pressure-volume compensatory reserve showed a gradual deterioration over time, with a sudden drop of RAP when ICP started to rise, suggesting an overlapping disruption of the vasomotor response. CONCLUSION: Indices derived from ICP waveform analysis can be helpful for the interpretation of progressive intracranial hypertension in patients after brain trauma.
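A PRx-style index, a moving correlation between slow waves of ABP and ICP, can be sketched as follows. This is a simplified non-overlapping-window variant on synthetic signals; the window length, sampling rate and coupling strength are illustrative assumptions, not the study's recording protocol:

```python
import numpy as np

def moving_correlation(x, y, window):
    """Moving Pearson correlation between two signals, computed over
    consecutive non-overlapping windows (clinical PRx uses a sliding
    window; non-overlapping windows keep the sketch simple)."""
    n = len(x) // window
    out = []
    for i in range(n):
        xs = x[i * window:(i + 1) * window]
        ys = y[i * window:(i + 1) * window]
        out.append(np.corrcoef(xs, ys)[0, 1])
    return np.array(out)

# Toy slow-wave signals: here ICP follows ABP positively, mimicking the
# compromised (positive) PRx seen in the patients with fatal outcome;
# intact reactivity would yield near-zero or negative correlations.
t = np.linspace(0, 600, 1200)          # 10 min at 2 Hz
abp_slow = np.sin(2 * np.pi * t / 60)  # ~1-per-minute slow wave
icp_slow = 0.8 * abp_slow + 0.1 * np.random.default_rng(0).standard_normal(t.size)

prx = moving_correlation(abp_slow, icp_slow, window=120)  # 60 s windows
print(prx.mean())  # strongly positive: compromised reactivity in this toy case
```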

Relevance: 30.00%

Abstract:

Plain-film radiography often underestimates the extent of injury in children with epiphyseal fractures. Salter-Harris type V fractures (crush fractures of the epiphyseal plate) in particular are often missed initially. MRI of the ankle was performed in 10 children aged 9-17 years (mean 14) with suspected epiphyseal injury, using a 1.0-T Magnetom Expert. The fractures were classified according to the Salter-Harris-Rang-Ogden classification and compared with the findings of plain radiography. In one case MRI excluded epiphyseal injury; in four cases the MRI findings changed the therapeutic management. The visualisation of the fracture in three orthogonal planes and the ability to detect cartilage and ligamentous injury make MRI superior to conventional radiography and CT. With respect to radiation exposure, MRI rather than CT should be used for the diagnosis of epiphyseal injuries in children.

Relevance: 30.00%

Abstract:

We present a procedure for the optical characterization of thin-film stacks from spectrophotometric data. The procedure overcomes the intrinsic limitations that arise in the numerical determination of many parameters from measured reflectance or transmittance spectra. The key point is to use all the information available from the manufacturing process in a single global optimization. The method is illustrated by a case study of sol-gel applications.
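The idea of recovering film parameters from a measured spectrum through a single global optimization can be sketched for the simplest case, a single non-absorbing film on glass. The optical model, parameter bounds and choice of SciPy optimizer are assumptions for illustration, not the paper's actual stack or code:

```python
import numpy as np
from scipy.optimize import differential_evolution

def reflectance(wl, n_film, d, n_sub=1.52, n_air=1.0):
    """Normal-incidence reflectance of a single non-absorbing film on glass
    (standard Airy summation of the two interface reflections)."""
    r01 = (n_air - n_film) / (n_air + n_film)
    r12 = (n_film - n_sub) / (n_film + n_sub)
    beta = 2 * np.pi * n_film * d / wl  # phase thickness
    r = (r01 + r12 * np.exp(-2j * beta)) / (1 + r01 * r12 * np.exp(-2j * beta))
    return np.abs(r) ** 2

wl = np.linspace(400e-9, 800e-9, 200)           # wavelengths [m]
R_meas = reflectance(wl, n_film=1.8, d=300e-9)  # synthetic "measured" spectrum

# Global optimization over (refractive index, thickness); the bounds play
# the role of prior knowledge from the manufacturing process.
def cost(p):
    return np.sum((reflectance(wl, p[0], p[1]) - R_meas) ** 2)

res = differential_evolution(cost, bounds=[(1.3, 2.3), (100e-9, 600e-9)], seed=0)
print(res.x)  # should lie close to the true parameters (1.8, 300e-9)
```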

Relevance: 30.00%

Abstract:

Despite the considerable environmental importance of mercury (Hg), given its high toxicity and its ability to contaminate large areas via atmospheric deposition, little is known about its behaviour in soils, especially tropical soils, compared with other heavy metals. This lack of information arises because analytical methods for the determination of Hg are more laborious and expensive than those for other heavy metals. The situation is even more precarious for the speciation of Hg in soils, since sequential extraction methods are also inefficient for this metal. The aim of this paper is to present thermal desorption associated with atomic absorption spectrometry (TDAAS) as an efficient tool for the quantitative determination of Hg in soils. The method consists of the release of Hg by heating, followed by its quantification by atomic absorption spectrometry. It was developed by constructing calibration curves in different soil samples based on increasing volumes of standard Hg2+ solutions. Performance parameters (accuracy, precision, and the limits of quantification and detection) were evaluated. No matrix interference was detected. Certified reference samples and comparison with a Direct Mercury Analyzer (DMA), another highly recognized technique, were used to validate the method, which proved to be accurate and precise.
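The calibration-curve step can be sketched as follows. The spiked masses, instrument responses and the common 3.3s/10s detection- and quantification-limit criteria are illustrative assumptions, not the paper's data:

```python
import numpy as np

# Hypothetical TDAAS calibration: instrument response vs. spiked Hg mass.
hg_ng = np.array([0.0, 5.0, 10.0, 20.0, 40.0, 80.0])    # spiked Hg mass [ng]
signal = np.array([0.02, 0.26, 0.51, 1.01, 2.03, 4.01]) # absorbance response

# Least-squares calibration line: signal = slope * mass + intercept
slope, intercept = np.polyfit(hg_ng, signal, 1)

# Residual standard deviation of the fit (n - 2 degrees of freedom)
resid = signal - (slope * hg_ng + intercept)
s = np.sqrt(np.sum(resid ** 2) / (len(hg_ng) - 2))

# Common IUPAC-style estimates: LOD = 3.3 s / slope, LOQ = 10 s / slope
lod = 3.3 * s / slope
loq = 10 * s / slope
print(round(slope, 3), round(lod, 2), round(loq, 2))
```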

Relevance: 30.00%

Abstract:

This work deals with the elaboration of flood hazard maps. These maps reflect the areas prone to floods based on the effects of Hurricane Mitch in the Municipality of Jucuarán, El Salvador. Stream channels in the coastal range of SE El Salvador flow into the Pacific Ocean and generate alluvial fans. Communities often inhabit these fans and can be affected by floods. The geomorphology of these stream basins is characterized by small drainage areas, steep slopes, well-developed regolith and extensive deforestation. These features play a key role in the generation of flash floods. The zone lacks comprehensive rainfall data and gauging stations. The most detailed topographic maps are at a scale of 1:25,000. Since this scale was not sufficiently detailed, we used aerial photographs enlarged to a scale of 1:8,000. The effects of Hurricane Mitch mapped on these photographs were regarded as the reference event. The flood maps serve a dual purpose: (1) community emergency plans, and (2) regional land-use planning carried out by local authorities. The geomorphological method is based on mapping the geomorphological evidence (alluvial fans, preferential stream channels, erosion and sedimentation, man-made terraces). Following the interpretation of the photographs, this information was validated in the field and complemented by eyewitness reports, such as the height of the water and the flow typology. In addition, community workshops were organized to obtain information about the evolution and impact of the phenomena. Superimposing this information enabled us to obtain a comprehensive geomorphological map. Another aim of the study was the calculation of the peak discharge using the Manning and paleohydraulic methods and estimates based on geomorphological criteria. The results were compared with those obtained using the rational method. Significant differences in the order of magnitude of the calculated discharges were noted. The rational method underestimated the discharges owing to the short and discontinuous rainfall records, which prevent the application of probabilistic equations. The Manning method yields a wide range of results because of its dependence on the roughness coefficient. The paleohydraulic method yielded higher values than the rational and Manning methods; it should be pointed out, however, that bigger boulders could have been moved had they existed. These discharge values are lower than those obtained from the geomorphological estimates, which are much closer to reality. The flood hazard maps were derived from the comprehensive geomorphological map. Three hazard categories (very high, high and moderate) were established using the flood energy, water height and flow velocity deduced from the geomorphological evidence and eyewitness reports.
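The Manning calculation, and its strong sensitivity to the roughness coefficient noted above, can be sketched as follows (the cross-section values are hypothetical, chosen only for illustration):

```python
import math

def manning_discharge(area_m2, wetted_perimeter_m, slope, n):
    """Peak discharge Q [m^3/s] from Manning's equation (SI units):
    Q = (1/n) * A * R^(2/3) * S^(1/2), with hydraulic radius R = A/P."""
    r = area_m2 / wetted_perimeter_m
    return (1.0 / n) * area_m2 * r ** (2.0 / 3.0) * math.sqrt(slope)

# Hypothetical cross-section reconstructed from field evidence
# (e.g. eyewitness water-height marks); values are illustrative only.
q_smooth = manning_discharge(area_m2=12.0, wetted_perimeter_m=10.0, slope=0.02, n=0.035)
q_rough = manning_discharge(area_m2=12.0, wetted_perimeter_m=10.0, slope=0.02, n=0.060)

# The dependence on the roughness coefficient n explains the wide range
# of results the study reports for the Manning method.
print(round(q_smooth, 1), round(q_rough, 1))
```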

Relevance: 30.00%

Abstract:

Increasing anthropogenic pressures urge enhanced knowledge and understanding of the current state of marine biodiversity. This baseline information is pivotal to explore present trends, detect future modifications and propose adequate management actions for marine ecosystems. Coralligenous outcrops are a highly diverse and structurally complex deep-water habitat faced with major threats in the Mediterranean Sea. Despite its ecological, aesthetic and economic value, coralligenous biodiversity patterns are still poorly understood. There is currently no single sampling method that has been demonstrated to be sufficiently representative to ensure adequate community assessment and monitoring in this habitat. Therefore, we propose a rapid non-destructive protocol for biodiversity assessment and monitoring of coralligenous outcrops, providing good estimates of its structure and species composition, based on photographic sampling and the determination of presence/absence of macrobenthic species. We used an extensive photographic survey, covering several spatial scales (100s of m to 100s of km) within the NW Mediterranean and including two different coralligenous assemblages: the Paramuricea clavata assemblage (PCA) and the Corallium rubrum assemblage (CRA). This approach allowed us to determine the minimal sampling area for each assemblage (5000 cm² for PCA and 2500 cm² for CRA). In addition, we conclude that 3 replicates provide an optimal sampling effort to maximize the species number and to capture the main biodiversity patterns of the studied assemblages in variability studies requiring replicates. We contend that the proposed sampling approach provides a valuable tool for management and conservation planning, monitoring and research programs focused on coralligenous outcrops, and is potentially also applicable to other benthic ecosystems.
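A species-accumulation view of the minimal-sampling-area question can be sketched on synthetic presence/absence photo quadrats. The occurrence probabilities and the simple plateau rule below are illustrative assumptions, not the survey's actual protocol:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic presence/absence data: 20 photo quadrats x 60 species,
# a few common species and many rare ones (illustrative values).
p_occur = np.concatenate([np.full(10, 0.8), np.full(50, 0.1)])
quadrats = rng.random((20, 60)) < p_occur

# Species-accumulation curve: cumulative number of species detected
# as more quadrats (i.e. more sampled area) are included.
cum_species = [np.any(quadrats[:k + 1], axis=0).sum() for k in range(20)]

# A simple plateau rule for the minimal sampling effort: the smallest
# number of quadrats after which one more quadrat adds < 2 new species.
gains = np.diff(cum_species)
minimal_k = int(np.argmax(gains < 2)) + 1
print(cum_species[-1], minimal_k)
```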

Relevance: 30.00%

Abstract:

A comprehensive field detection method is proposed, aimed at developing advanced capability for reliable monitoring, inspection and life estimation of bridge infrastructure. The goal is to use Motion-Sensing Radio Transponders (RFIDs) for fully adaptive bridge monitoring, minimizing the problems inherent in human inspections of bridges. We developed a novel integrated condition-based maintenance (CBM) framework that combines transformative research in RFID sensors and sensing architecture for in-situ scour monitoring with state-of-the-art, computationally efficient multiscale modeling for scour assessment.

Relevance: 30.00%

Abstract:

Background: Conventional magnetic resonance imaging (MRI) techniques are highly sensitive in detecting multiple sclerosis (MS) plaques, enabling a quantitative assessment of inflammatory activity and lesion load. In quantitative analyses of focal lesions, manual or semi-automated segmentations have been widely used to compute the total number of lesions and the total lesion volume. These techniques, however, are both challenging and time-consuming, and are also prone to intra-observer and inter-observer variability. Aim: To develop an automated approach to segment brain tissues and MS lesions from brain MRI images. The goal is to reduce user interaction and to provide an objective tool that eliminates inter- and intra-observer variability. Methods: Based on the recent methods developed by Souplet et al. and de Boer et al., we propose a novel pipeline that includes the following steps: bias correction, skull stripping, atlas registration, tissue classification, and lesion segmentation. After the initial pre-processing steps, an MRI scan is automatically segmented into 4 classes: white matter (WM), grey matter (GM), cerebrospinal fluid (CSF) and partial volume. An expectation-maximisation method that fits a multivariate Gaussian mixture model to the T1-w, T2-w and PD-w images is used for this purpose. Based on the obtained tissue masks, and using the estimated GM mean and variance, we apply an intensity threshold to the FLAIR image, which provides the lesion segmentation. With the aim of improving this initial result, spatial information from the neighbouring tissue labels is used to refine the final lesion segmentation. Results: The experimental evaluation was performed using real 1.5-T data sets and the corresponding ground-truth annotations provided by expert radiologists. The following values were obtained: a true positive (TP) fraction of 64%, a false positive (FP) fraction of 80%, and an average surface distance of 7.89 mm. The results of our approach were quantitatively compared with our implementations of the methods of Souplet et al. and de Boer et al., obtaining higher TP and lower FP values. Conclusion: Promising MS lesion segmentation results have been obtained in terms of TP. However, the high number of FP, still a well-known problem of all automated MS lesion segmentation approaches, must be reduced before such methods can be used in standard clinical practice. Our future work will focus on tackling this issue.
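The two central steps, an EM-fitted multivariate Gaussian mixture for tissue classification and a GM-based intensity threshold on FLAIR for lesion candidates, can be sketched on synthetic voxel data. The intensities, class means and the threshold factor are illustrative assumptions, not the pipeline's actual parameters:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic multispectral voxels (T1-w, T2-w, PD-w) for three tissue
# classes; the real pipeline runs bias correction, skull stripping and
# atlas registration before this step.
means = np.array([[120.0, 60.0, 70.0],    # WM
                  [90.0, 80.0, 90.0],     # GM
                  [40.0, 130.0, 120.0]])  # CSF
tissue = rng.integers(0, 3, size=3000)
voxels = means[tissue] + 5.0 * rng.standard_normal((3000, 3))

# Expectation-maximisation fit of a multivariate Gaussian mixture.
gmm = GaussianMixture(n_components=3, random_state=0).fit(voxels)

# Lesion candidates: FLAIR intensities above mean_GM + k * std_GM
# (k = 3 here; in practice mean/std come from the fitted GM component).
flair = np.concatenate([rng.normal(100, 10, 2900),   # normal tissue
                        rng.normal(170, 10, 100)])   # hyperintense lesions
gm_mean, gm_std = 100.0, 10.0
lesion_mask = flair > gm_mean + 3 * gm_std
print(lesion_mask.sum())  # roughly the 100 simulated lesion voxels
```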

Relevance: 30.00%

Abstract:

This paper highlights the role of non-functional information when reusing from a component library. We describe a method for selecting appropriate implementations of Ada packages taking non-functional constraints into account; these constraints model the context of reuse. Constraints take the form of queries using an interface description language called NoFun, which is also used to state non-functional information in Ada packages; query results are trees of implementations, following the import relationships between components. We distinguish two different situations when reusing components, depending on whether the library being searched is taken as closed or extendible. The resulting tree of implementations can be manipulated by the user to resolve ambiguities, to state default behaviours, and the like. As part of the proposal, we address the problem of computing from code the non-functional information that determines the selection process.

Relevance: 30.00%

Abstract:

A new method for decision making that uses the ordered weighted averaging (OWA) operator in the aggregation of information is presented. It builds on a concept known in the literature as the index of maximum and minimum level (IMAM), which is based on distance measures and other techniques useful for decision making. By using the OWA operator in the IMAM, we form a new aggregation operator that we call the ordered weighted averaging index of maximum and minimum level (OWAIMAM) operator. Its main advantage is that it provides a parameterized family of aggregation operators between the minimum and the maximum, along with a wide range of special cases. The decision maker may then make decisions according to their degree of optimism, while considering ideals in the decision process. A further extension of this approach is presented using hybrid averages and Choquet integrals. We also develop an application of the new approach to a multi-person decision-making problem regarding the selection of strategies.
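The OWA operator, and the flavour of combining it with an ideal-based index, can be sketched as follows. The similarity measure and weight vectors are illustrative assumptions; the paper's exact IMAM formulation may differ:

```python
import numpy as np

def owa(values, weights):
    """Ordered weighted averaging: the weights are applied to the values
    sorted in descending order, not to fixed argument positions."""
    v = np.sort(values)[::-1]
    return float(np.dot(v, weights))

# The OWA family is parameterized between the minimum and the maximum:
x = [0.3, 0.9, 0.5, 0.7]
print(owa(x, [1, 0, 0, 0]))   # 0.9 -> the maximum
print(owa(x, [0, 0, 0, 1]))   # 0.3 -> the minimum
print(owa(x, [0.25] * 4))     # 0.6 -> the arithmetic mean

# A hedged sketch of the OWAIMAM idea: compare an alternative against an
# ideal via per-criterion similarities (1 minus distance), then aggregate
# with OWA weights reflecting the decision maker's degree of optimism.
ideal = np.array([1.0, 0.8, 0.9])
alt = np.array([0.7, 0.8, 0.5])
similarities = 1.0 - np.abs(ideal - alt)      # distance-based similarity
optimistic = owa(similarities, [0.6, 0.3, 0.1])
print(round(optimistic, 2))   # 0.87: weights favour the best-matched criteria
```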

Relevance: 30.00%

Abstract:

The M-Coffee server is a web server that computes multiple sequence alignments (MSAs) by running several MSA methods and combining their output into one single model. This allows users to run all their methods of choice simultaneously without having to arbitrarily choose one of them. The MSA is delivered along with a local estimation of its consistency with the individual MSAs it was derived from. The computation of the consensus multiple alignment is carried out using a special mode of the T-Coffee package [Notredame, Higgins and Heringa (T-Coffee: a novel method for fast and accurate multiple sequence alignment. J. Mol. Biol. 2000; 302: 205-217); Wallace, O'Sullivan, Higgins and Notredame (M-Coffee: combining multiple sequence alignment methods with T-Coffee. Nucleic Acids Res. 2006; 34: 1692-1699)]. Given a set of sequences (DNA or protein) in FASTA format, M-Coffee delivers a multiple alignment in the most common formats. M-Coffee is a freeware open-source package distributed under a GPL license; it is available both as a standalone package and as a web service from www.tcoffee.org.

Relevance: 30.00%

Abstract:

The diagnosis of chronic inflammatory demyelinating polyneuropathy (CIDP) is based on a set of clinical and neurophysiological parameters. However, in clinical practice, CIDP remains difficult to diagnose in atypical cases. In the present study, 32 experts from 22 centers (the French CIDP study group) were asked to individually score four typical and seven atypical CIDP observations (TOs and AOs, respectively) reported by other physicians, according to the Delphi method. The diagnosis of CIDP was confirmed by the group in 96.9% of the TOs and 60.1% of the AOs (p < 0.0001). There was a positive correlation between the consensus on the CIDP diagnosis and the demyelinating features (r = 0.82, p < 0.004). The European CIDP classification was used in 28.3% of the TOs and 18.2% of the AOs (p < 0.002). The French CIDP study group diagnostic strategy was used in 90% of the TOs and 61% of the AOs (p < 0.0001). In 3% of the TOs and 21.6% of the AOs, the experts had difficulty reaching a final diagnosis due to a lack of information. This study shows that a set of criteria and a diagnostic strategy are not sufficient to reach a consensus on the diagnosis of atypical CIDP in clinical practice.

Relevance: 30.00%

Abstract:

Rapid assessment methods are valuable tools for collecting information about the quality and status of natural systems; however, they are not a substitute for detailed surveys of those systems. Users of this method should bear in mind that it may under-score or over-score the assessed site, especially when the site is not a typical fen (e.g. a sedge meadow could score lower than a fen but in fact be a relatively high-quality sedge meadow). The assessment can be used throughout most of the spring, summer and fall; however, the ideal "index period" runs from late May through early October, when native plant communities, as well as invasive species, are most apparent.

Relevance: 30.00%

Abstract:

We present the most comprehensive comparison to date of the predictive benefit of genetics in addition to currently used clinical variables, using genotype data for 33 single-nucleotide polymorphisms (SNPs) in 1,547 Caucasian men from the placebo arm of the REduction by DUtasteride of prostate Cancer Events (REDUCE®) trial. Moreover, we conducted a detailed comparison of three techniques for incorporating genetics into clinical risk prediction. The first was a standard logistic regression model with separate terms for the clinical covariates and for each of the genetic markers. This approach ignores a substantial amount of external information concerning effect sizes for these Genome-Wide Association Study (GWAS)-replicated SNPs. The second and third methods investigated two ways of incorporating meta-analysed external SNP effect estimates: one via a weighted PCa 'risk' score based solely on the meta-analysis estimates, and the other incorporating both the current and prior data via informative priors in a Bayesian logistic regression model. All methods demonstrated a slight improvement in predictive performance upon incorporation of genetics. The two methods that incorporated external information showed the greatest increase in receiver-operating-characteristic AUC, from 0.61 to 0.64. The value of our comparison is likely to lie in the observed similarities, rather than differences, between three approaches with very different resource requirements. The two methods that included external information performed best, but only marginally so, despite substantial differences in complexity.
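The weighted risk score built from external SNP effect estimates, and its evaluation by ROC AUC, can be sketched on synthetic data. The SNP count, effect sizes and simulated cohort below are illustrative assumptions, not the REDUCE data:

```python
import numpy as np

rng = np.random.default_rng(0)

def auc(scores, labels):
    """ROC AUC via the rank-sum (Mann-Whitney) identity
    (ties broken arbitrarily; adequate for this illustration)."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    pos = labels == 1
    n_pos, n_neg = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# Synthetic cohort: 5 SNPs (0/1/2 risk-allele counts) with hypothetical
# "meta-analysed" external log-odds-ratio effect sizes.
n = 2000
snps = rng.integers(0, 3, size=(n, 5))
external_logor = np.array([0.15, 0.10, 0.20, 0.05, 0.12])

# Weighted risk score built solely from the external estimates
# (the second approach described above).
risk_score = snps @ external_logor

# Simulate outcomes consistent with the score, then evaluate discrimination.
p = 1 / (1 + np.exp(-(risk_score - risk_score.mean())))
outcome = (rng.random(n) < p).astype(int)
print(round(auc(risk_score, outcome), 2))  # modestly above 0.5
```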