867 results for Graph-based method


Relevance:

90.00%

Abstract:

The presence of pathogenic microorganisms in food is one of the fundamental problems in public health, and the diseases they cause are among the most important sources of illness. The application of microbiological controls within quality assurance programmes is therefore a prerequisite for minimising the risk of infection for consumers. Classical microbiological methods generally require non-selective pre-enrichment, selective enrichment, isolation on selective media and subsequent confirmation using tests based on the morphology, biochemistry and serology of each target microorganism. These methods are therefore laborious, require a long process to obtain definitive results and, moreover, cannot always be carried out. To overcome these drawbacks, several alternative methodologies have been developed for the detection, identification and quantification of foodborne pathogenic microorganisms, most notably immunological and molecular methods. Within the latter category, the technique based on the polymerase chain reaction (PCR) has become the most popular diagnostic technique in microbiology and, more recently, the introduction of an improvement of it, real-time PCR, has produced a second revolution in molecular diagnostic methodology, as can be seen from the growing number of scientific publications and the continuous appearance of new commercial kits. Real-time PCR is a highly sensitive technique, able to detect down to a single molecule, that allows the accurate quantification of DNA sequences specific to foodborne pathogenic microorganisms. Other advantages that favour its potential implementation in food-analysis laboratories are its speed, simplicity and closed-tube format, which can prevent post-PCR contamination and favours automation and high throughput. In this work, sensitive and reliable molecular techniques (PCR and NASBA) were developed for the detection, identification and quantification of foodborne pathogenic bacteria (Listeria spp., Mycobacterium avium subsp. paratuberculosis and Salmonella spp.). Specifically, real-time PCR methods were designed and optimised for each of these agents: L. monocytogenes, L. innocua, Listeria spp. and M. avium subsp. paratuberculosis, and a previously developed method for Salmonella spp. was also optimised and evaluated in different centres. In addition, a NASBA-based method was designed and optimised for the specific detection of M. avium subsp. paratuberculosis, and the potential application of NASBA for the specific detection of viable forms of this microorganism was also evaluated. All the methods showed 100% specificity, with sensitivity adequate for their potential application to real food samples. Sample-preparation procedures were also developed and evaluated for meat products, fishery products, milk and water. In this way, fully specific and highly sensitive real-time PCR methods were developed for the quantitative determination of L. monocytogenes in meat products and in salmon and derived products such as smoked salmon, and of M. avium subsp. paratuberculosis in water and milk samples.
The latter method was also applied to assess the presence of this microorganism in the intestine of patients with Crohn's disease, using colonoscopy biopsies obtained from affected volunteers. In conclusion, this study presents selective and sensitive molecular assays for the detection of foodborne pathogens (Listeria spp., Mycobacterium avium subsp. paratuberculosis) and for the rapid and unambiguous identification of Salmonella spp. The relative accuracy of the assays was excellent when compared with the reference microbiological methods, and they can be used for the quantification of both genomic DNA and cell suspensions. Furthermore, their combination with pre-amplification treatments proved highly efficient for the analysis of the target bacteria. They can therefore constitute a useful strategy for the rapid and sensitive detection of pathogens in food and should be an additional tool in the range of diagnostic tools available for the study of foodborne pathogens.
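The abstract does not describe the quantification step itself, but absolute quantification by real-time PCR is typically performed against a standard curve relating the threshold cycle (Ct) to the logarithm of the starting copy number. The sketch below is a generic illustration of that step with hypothetical calibration values, not the assays developed in the thesis.

```python
import numpy as np

# Hypothetical standard-curve data: known starting copy numbers per reaction
# and the threshold cycles (Ct) measured for them in a real-time PCR run.
copies = np.array([1e2, 1e3, 1e4, 1e5, 1e6])
ct = np.array([33.1, 29.8, 26.4, 23.0, 19.7])

# Fit Ct = slope * log10(copies) + intercept.
slope, intercept = np.polyfit(np.log10(copies), ct, 1)

# Amplification efficiency implied by the slope (100% corresponds to about -3.32).
efficiency = 10 ** (-1.0 / slope) - 1.0

def quantify(ct_unknown):
    """Estimate the starting copy number of an unknown sample from its Ct."""
    return 10 ** ((ct_unknown - intercept) / slope)

print(f"slope = {slope:.2f}, efficiency = {efficiency:.1%}")
print(f"estimated copies for Ct = 27.5: {quantify(27.5):.0f}")
```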

Relevance:

90.00%

Abstract:

Flow in the world's oceans occurs at a wide range of spatial scales, from a fraction of a metre up to many thousands of kilometres. In particular, regions of intense flow are often highly localised, for example western boundary currents, equatorial jets, overflows and convective plumes. Conventional numerical ocean models generally use static meshes. The use of dynamically adaptive meshes has many potential advantages but needs to be guided by an error measure reflecting the underlying physics. A method of defining an error measure to guide an adaptive meshing algorithm for unstructured tetrahedral finite elements, utilizing an adjoint or goal-based method, is described here. This method is based upon a functional encompassing important features of the flow structure. The sensitivity of this functional with respect to the solution variables is used as the basis from which an error measure is derived. This error measure acts to predict those areas of the domain where resolution should be changed. A barotropic wind-driven gyre problem is used to demonstrate the capabilities of the method. The overall objective of this work is to develop robust error measures for use in an oceanographic context which will ensure areas of fine mesh resolution are used only where and when they are required. (c) 2006 Elsevier Ltd. All rights reserved.
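The abstract describes the idea only at a high level. As a rough illustration, a common goal-based (adjoint) error indicator weights the local residual of the forward solution by the adjoint sensitivity of the chosen functional, and cells whose indicator exceeds a tolerance are flagged for refinement. The sketch below is a generic one-dimensional illustration of that pattern, with made-up residual and adjoint fields; it is not the unstructured tetrahedral finite-element implementation described in the paper.

```python
import numpy as np

def goal_based_refinement_flags(residual, adjoint, tol):
    """Generic goal-based error indicator: approximate each cell's contribution
    to the error in the goal functional by |adjoint-weighted residual| and flag
    cells above the tolerance for refinement."""
    indicator = np.abs(adjoint * residual)
    return indicator, indicator > tol

# Toy example: a forward residual concentrated near a 'boundary current' and an
# adjoint field that is largest where the goal functional is most sensitive.
x = np.linspace(0.0, 1.0, 50)
residual = np.exp(-((x - 0.1) / 0.05) ** 2)   # large residual near x = 0.1
adjoint = np.exp(-((x - 0.3) / 0.2) ** 2)     # sensitivity of the functional

indicator, refine = goal_based_refinement_flags(residual, adjoint, tol=1e-2)
print(f"{refine.sum()} of {refine.size} cells flagged for refinement")
```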

Relevance:

90.00%

Abstract:

Genetic association analyses of family-based studies with ordered categorical phenotypes are often conducted using methods designed either for quantitative or for binary traits, which can lead to suboptimal analyses. Here we present an alternative likelihood-based method of analysis for single nucleotide polymorphism (SNP) genotypes and ordered categorical phenotypes in nuclear families of any size. Our approach, which extends our previous work for binary phenotypes, permits straightforward inclusion of covariate, gene-gene and gene-covariate interaction terms in the likelihood, incorporates a simple model for ascertainment and allows for family-specific effects in the hypothesis test. Additionally, our method produces interpretable parameter estimates and valid confidence intervals. We assess the proposed method using simulated data, and apply it to a polymorphism in the C-reactive protein (CRP) gene typed in families collected to investigate human systemic lupus erythematosus. By including sex interactions in the analysis, we show that the polymorphism is associated with anti-nuclear autoantibody (ANA) production in females, while there appears to be no effect in males.
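The abstract does not give the likelihood itself; a common building block for ordered categorical phenotypes is a cumulative-logit (proportional-odds) model with the genotype as predictor. The sketch below shows that generic form for unrelated individuals only, with a simulated three-category phenotype and an additively coded SNP; it omits the family structure, ascertainment model and interaction terms handled by the authors' method and is purely illustrative.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

def neg_log_lik(params, y, g, n_cat):
    """Cumulative-logit (proportional-odds) negative log-likelihood.
    params = [first cutpoint, log-gaps between cutpoints..., genotype effect]."""
    cuts = np.cumsum(np.r_[params[0], np.exp(params[1:n_cat - 1])])  # ordered cutpoints
    beta = params[-1]
    eta = beta * g
    # P(Y <= k) = expit(cut_k - eta); pad with 0 and 1 for the extreme categories.
    cdf = np.c_[np.zeros_like(eta), expit(cuts[None, :] - eta[:, None]), np.ones_like(eta)]
    probs = np.diff(cdf, axis=1)[np.arange(len(y)), y]
    return -np.sum(np.log(probs + 1e-12))

# Simulated data: SNP genotypes (allele counts 0/1/2) and an ordered phenotype.
rng = np.random.default_rng(0)
g = rng.integers(0, 3, size=300)
latent = 0.8 * g + rng.logistic(size=300)
y = np.digitize(latent, bins=[0.5, 1.5])          # three ordered categories 0, 1, 2

fit = minimize(neg_log_lik, x0=[0.0, 0.0, 0.0], args=(y, g, 3), method="Nelder-Mead")
print("estimated genotype effect:", fit.x[-1])
```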

Relevance:

90.00%

Abstract:

This paper proposes a three-shot improvement scheme for the hard-decision based method (HDM), an implementation solution for linear decorrelating detector (LDD) in asynchronous DS/CDMA systems. By taking advantage of the preceding (already reconstructed) bit and the matched filter output for the following two bits, the coupling between temporally adjacent bits (TABs), which always exists for asynchronous systems, is greatly suppressed and the performance of the original HDM is substantially improved. This new scheme requires no signaling overhead yet offers nearly the same performance as those more complicated methods. Also, it can easily accommodate the change in the number of active users in the channel, as no symbol/bit grouping is involved. Finally, the influence of synchronisation errors is investigated.

Relevance:

90.00%

Abstract:

In this paper, the statistical properties of tropical ice clouds (ice water content, visible extinction, effective radius, and total number concentration) derived from 3 yr of ground-based radar–lidar retrievals from the U.S. Department of Energy Atmospheric Radiation Measurement Climate Research Facility in Darwin, Australia, are compared with the same properties derived using the official CloudSat microphysical retrieval methods and from a simpler statistical method using radar reflectivity and air temperature. It is shown that the two official CloudSat microphysical products (2B-CWC-RO and 2B-CWC-RVOD) are statistically virtually identical. The comparison with the ground-based radar–lidar retrievals shows that all satellite methods produce ice water contents and extinctions in a much narrower range than the ground-based method and overestimate the mean vertical profiles of microphysical parameters below 10-km height by over a factor of 2. Better agreements are obtained above 10-km height. Ways to improve these estimates are suggested in this study. Effective radii retrievals from the standard CloudSat algorithms are characterized by a large positive bias of 8–12 μm. A sensitivity test shows that in response to such a bias the cloud longwave forcing is increased from 44.6 to 46.9 W m−2 (implying an error of about 5%), whereas the negative cloud shortwave forcing is increased from −81.6 to −82.8 W m−2. Further analysis reveals that these modest effects (although not insignificant) can be much larger for optically thick clouds. The statistical method using CloudSat reflectivities and air temperature was found to produce inaccurate mean vertical profiles and probability distribution functions of effective radius. This study also shows that the retrieval of the total number concentration needs to be improved in the official CloudSat microphysical methods prior to a quantitative use for the characterization of tropical ice clouds. Finally, the statistical relationship used to produce ice water content from extinction and air temperature obtained by the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) satellite is evaluated for tropical ice clouds. It is suggested that the CALIPSO ice water content retrieval is robust for tropical ice clouds, but that the temperature dependence of the statistical relationship used should be slightly refined to better reproduce the radar–lidar retrievals.
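For reference, the quoted figure of about 5% follows directly from the two longwave forcing values given above:

```latex
\frac{46.9 - 44.6}{44.6} \approx 0.052 \quad (\text{a change of about } 5\%)
```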

Relevance:

90.00%

Abstract:

Drought characterisation is an intrinsically spatio-temporal problem. A limitation of previous approaches to characterisation is that they discard much of the spatio-temporal information by reducing events to a lower-order subspace. To address this, an explicit 3-dimensional (longitude, latitude, time) structure-based method is described in which drought events are defined by a spatially and temporally coherent set of points displaying standardised precipitation below a given threshold. Geometric methods can then be used to measure similarity between individual drought structures. Groupings of these similarities provide an alternative to traditional methods for extracting recurrent space-time signals from geophysical data. The explicit consideration of structure encourages the construction of summary statistics which relate to the event geometry. Example measures considered are the event volume, centroid, and aspect ratio. The utility of a 3-dimensional approach is demonstrated by application to the analysis of European droughts (15°W to 35°E and 35°N to 70°N) for the period 1901–2006. Large-scale structure is found to be abundant, with 75 events identified that last for more than 3 months and span at least 0.5 × 10⁶ km². Near-complete dissimilarity is seen between the individual drought structures, and little or no regularity is found in the time evolution of even the most spatially similar drought events. The spatial distribution of the event centroids and the time evolution of the geographic cross-sectional areas strongly suggest that large-area, sustained droughts result from the combination of multiple small-area (∼10⁶ km²), short-duration (∼3 months) events. The small events are not found to occur independently in space. This leads to the hypothesis that local water feedbacks play an important role in the aggregation process.
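The core step, grouping grid cells with standardised precipitation below a threshold into spatio-temporally coherent events, can be illustrated with 3-D connected-component labelling, from which per-event volume and centroid follow directly. The sketch below is an illustrative reconstruction using scipy, not the authors' code; the SPI array, threshold and face-connectivity rule are assumptions.

```python
import numpy as np
from scipy import ndimage

def drought_events(spi, threshold=-1.0):
    """Label 3-D (time, lat, lon) drought structures where SPI < threshold and
    return simple geometric summaries (cell-count volume and centroid)."""
    mask = spi < threshold
    labels, n_events = ndimage.label(mask)   # default structure: face-connected (6-connectivity)
    index = range(1, n_events + 1)
    volumes = ndimage.sum(mask, labels, index=index)
    centroids = ndimage.center_of_mass(mask, labels, index=index)
    return labels, volumes, centroids

# Toy SPI field: 24 months on a 20 x 30 grid.
rng = np.random.default_rng(1)
spi = rng.normal(size=(24, 20, 30))
labels, volumes, centroids = drought_events(spi)
print(f"{len(volumes)} events; largest spans {int(volumes.max())} grid-cell-months")
```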

Relevance:

90.00%

Abstract:

Background: The validity of ensemble averaging of event-related potential (ERP) data has been questioned, due to its assumption that the ERP is identical across trials. Thus, there is a need for preliminary testing for cluster structure in the data. New method: We propose a complete pipeline for the cluster analysis of ERP data. To increase the signal-to-noise ratio (SNR) of the raw single trials, we used a denoising method based on Empirical Mode Decomposition (EMD). Next, we used a bootstrap-based method to determine the number of clusters, through a measure called the Stability Index (SI). We then used a clustering algorithm based on a Genetic Algorithm (GA) to define initial cluster centroids for subsequent k-means clustering. Finally, we visualised the clustering results through a scheme based on Principal Component Analysis (PCA). Results: After validating the pipeline on simulated data, we tested it on data from two experiments – a P300 speller paradigm on a single subject and a language processing study on 25 subjects. Results revealed evidence for the existence of 6 clusters in one experimental condition from the language processing study. Further, a two-way chi-square test revealed an influence of subject on cluster membership.
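The Stability Index itself is specific to the paper, but the general bootstrap recipe it follows, clustering resampled versions of the data and measuring how consistently items are grouped together, can be sketched as below. The subsample-and-compare score used here (adjusted Rand index between clusterings of overlapping subsamples) is a generic stand-in for the authors' SI, and all data and parameter values are illustrative.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

def stability_index(data, k, n_boot=20, frac=0.8, seed=0):
    """Bootstrap-style stability for a candidate number of clusters k: cluster
    two random subsamples, compare their labels on the shared items with the
    adjusted Rand index, and average over repetitions."""
    rng = np.random.default_rng(seed)
    n = len(data)
    scores = []
    for _ in range(n_boot):
        a = rng.choice(n, size=int(frac * n), replace=False)
        b = rng.choice(n, size=int(frac * n), replace=False)
        shared = np.intersect1d(a, b)
        la = KMeans(n_clusters=k, n_init=10).fit(data[a]).predict(data[shared])
        lb = KMeans(n_clusters=k, n_init=10).fit(data[b]).predict(data[shared])
        scores.append(adjusted_rand_score(la, lb))
    return float(np.mean(scores))

# Toy single-trial features (e.g. denoised ERP trials projected to a few dimensions).
rng = np.random.default_rng(2)
trials = np.vstack([rng.normal(c, 0.3, size=(40, 5)) for c in (-1.0, 0.0, 1.0)])
print({k: round(stability_index(trials, k), 2) for k in (2, 3, 4, 5)})
```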

Relevance:

90.00%

Abstract:

The use of kilometre-scale ensembles in operational forecasting provides new challenges for forecast interpretation and evaluation to account for uncertainty on the convective scale. A new neighbourhood-based method is presented for evaluating and characterising the local predictability variations from convective-scale ensembles. Spatial scales over which ensemble forecasts agree (agreement scales, S^A) are calculated at each grid point ij, providing a map of the spatial agreement between forecasts. By comparing the average agreement scale obtained from ensemble member pairs (S^A(mm)_ij) with that between members and radar observations (S^A(mo)_ij), this approach allows the location-dependent spatial spread-skill relationship of the ensemble to be assessed. The properties of the agreement scales are demonstrated using an idealised experiment. To demonstrate the methods in an operational context, the S^A(mm)_ij and S^A(mo)_ij are calculated for six convective cases run with the Met Office UK Ensemble Prediction System. The S^A(mm)_ij highlight predictability differences between cases, which can be linked to physical processes. Maps of S^A(mm)_ij are found to summarise the spatial predictability in a compact and physically meaningful manner that is useful for forecasting and for model interpretation. Comparison of S^A(mm)_ij and S^A(mo)_ij demonstrates the case-by-case and temporal variability of the spatial spread-skill relationship, which can again be linked to physical processes.
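The published agreement-scale criterion has a specific functional form not given in the abstract; the sketch below uses a simplified stand-in, the smallest square neighbourhood over which the neighbourhood means of two forecasts differ by less than a fixed fractional tolerance, purely to illustrate how a map of scales S^A_ij can be built from a pair of precipitation-like fields. The fields, tolerance and maximum scale are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def agreement_scale_map(f1, f2, max_scale=20, tol=0.5):
    """For each grid point, the smallest neighbourhood half-width at which the
    neighbourhood means of two fields agree to within a fractional tolerance.
    Simplified stand-in for the published agreement-scale criterion."""
    sa = np.full(f1.shape, max_scale, dtype=int)
    unset = np.ones(f1.shape, dtype=bool)
    for s in range(max_scale + 1):
        size = 2 * s + 1
        m1 = uniform_filter(f1, size=size, mode="nearest")
        m2 = uniform_filter(f2, size=size, mode="nearest")
        agree = np.abs(m1 - m2) / (0.5 * (m1 + m2) + 1e-9) < tol
        sa[agree & unset] = s          # record the first (smallest) agreeing scale
        unset &= ~agree
    return sa

# Toy pair of ensemble members: correlated precipitation-like fields.
rng = np.random.default_rng(3)
member_a = rng.gamma(2.0, 1.0, size=(80, 80))
member_b = member_a + rng.normal(0.0, 0.5, size=(80, 80))
print("mean member-member agreement scale:", agreement_scale_map(member_a, member_b).mean())
```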

Relevance:

90.00%

Abstract:

Inhibition of microtubule function is an attractive rational approach to anticancer therapy. Although taxanes are the most prominent among the microtubule-stabilizers, their clinical toxicity, poor pharmacokinetic properties, and resistance have stimulated the search for new antitumor agents having the same mechanism of action. Discodermolide is an example of a nontaxane natural product that has the same mechanism of action, demonstrating superior antitumor efficacy and therapeutic index. The extraordinary chemical and biological properties have qualified discodermolide as a lead structure for the design of novel anticancer agents with optimized therapeutic properties. In the present work, we have employed a specialized fragment-based method to develop robust quantitative structure–activity relationship models for a series of synthetic discodermolide analogs. The generated molecular recognition patterns were combined with three-dimensional molecular modeling studies as a fundamental step on the path to understanding the molecular basis of drug-receptor interactions within this important series of potent antitumoral agents.

Relevance:

90.00%

Abstract:

A myriad of methods are available for virtual screening of small organic compound databases. In this study we successfully applied a quantitative model of consensus measurements, using a combination of 3D similarity searches (ROCS and EON), Hologram Quantitative Structure-Activity Relationships (HQSAR) and docking (FRED, FlexX, Glide and AutoDock Vina), to retrieve cruzain inhibitors from collected databases. All methods were assessed individually and then combined in Ligand-Based Virtual Screening (LBVS) and Target-Based Virtual Screening (TBVS) consensus scoring, using Receiver Operating Characteristic (ROC) curves to evaluate their performance. Three consensus strategies were used: scaled-rank-by-number, rank-by-rank and rank-by-vote, with the scaled-rank-by-number strategy performing best, since its steep early ROC curve indicated a higher enrichment power in the early retrieval of active compounds from the database. The ligand-based methods provided access to a robust and predictive HQSAR model that showed superior discrimination between active and inactive compounds and also performed better than the ROCS and EON procedures. Overall, the integration of fast computational techniques based on ligand and target structures resulted in a more efficient retrieval of cruzain inhibitors with the desired pharmacological profiles, which may be useful to advance the discovery of new trypanocidal agents.
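As an illustration of the general idea behind consensus scoring, the sketch below combines scores from several hypothetical screening methods by normalising each method's ranks and averaging them (a scaled-rank-by-number style combination), then evaluates retrieval performance with a ROC AUC. The score arrays and activity labels are simulated assumptions, not data from the study.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def scaled_rank_consensus(score_matrix):
    """Scaled-rank-by-number style consensus: rank compounds within each method,
    scale the ranks to [0, 1] (1 = best), and average across methods."""
    n_compounds, n_methods = score_matrix.shape
    scaled = np.empty_like(score_matrix, dtype=float)
    for j in range(n_methods):
        order = np.argsort(np.argsort(-score_matrix[:, j]))   # 0 = best-ranked compound
        scaled[:, j] = 1.0 - order / (n_compounds - 1)
    return scaled.mean(axis=1)

# Simulated screen: 1000 compounds, 50 actives, three imperfect scoring methods.
rng = np.random.default_rng(4)
active = np.zeros(1000, dtype=bool)
active[:50] = True
scores = np.column_stack([rng.normal(active * 1.0, 1.0) for _ in range(3)])

consensus = scaled_rank_consensus(scores)
print("individual AUCs:", [round(roc_auc_score(active, scores[:, j]), 2) for j in range(3)])
print("consensus AUC:  ", round(roc_auc_score(active, consensus), 2))
```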

Relevance:

90.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

90.00%

Abstract:

The stretch zone width (SZW) data for 15-5PH steel CTOD specimens fractured at temperatures from -150 degrees C to +23 degrees C were measured based on focused images and 3D maps obtained by extended depth-of-field reconstruction from light microscopy (LM) image stacks. This LM-based method, with a larger lateral resolution, seems to be as effective for quantitative analysis of SZW as scanning electron microscopy (SEM) or confocal scanning laser microscopy (CSLM), allowing stretch zone boundaries to be clearly identified. Despite the poorer sharpness of the focused images, a robust linear correlation was established between fracture toughness (KC) and the SZW data for the tested 15-5PH steel specimens, measured at their center region. The method is an alternative for evaluating the boundaries of stretched zones at a lower cost of implementation and training, since topographic data from the elevation maps can be associated with the reconstructed image, which summarizes the original contrast and brightness information. Finally, the extended depth-of-field method is presented here as a valuable tool for failure analysis and as a cheaper alternative for investigating rough surfaces or fractures compared with scanning electron or confocal light microscopes. Microsc. Res. Tech. 75:1155-1158, 2012. (C) 2012 Wiley Periodicals, Inc.
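Extended depth-of-field reconstruction from an image stack is commonly done by picking, for each pixel, the slice that is locally sharpest (for example by a local variance-of-Laplacian focus measure) and recording that slice index as an elevation map. The sketch below shows that generic focus-stacking procedure with numpy/scipy on a made-up stack; it is not the specific reconstruction software used in the study.

```python
import numpy as np
from scipy import ndimage

def extended_depth_of_field(stack, window=9):
    """Focus stacking: for each pixel, keep the value from the slice with the
    highest local focus measure (variance of the Laplacian) and return the
    winning slice index as a height (elevation) map."""
    focus = np.empty_like(stack, dtype=float)
    for z, img in enumerate(stack):
        lap = ndimage.laplace(img.astype(float))
        mean = ndimage.uniform_filter(lap, window)
        focus[z] = ndimage.uniform_filter(lap ** 2, window) - mean ** 2
    height_map = np.argmax(focus, axis=0)                  # best-focused slice per pixel
    rows, cols = np.indices(height_map.shape)
    composite = stack[height_map, rows, cols]              # all-in-focus image
    return composite, height_map

# Toy stack: 10 slices of a 64 x 64 image.
stack = np.random.default_rng(5).random((10, 64, 64))
composite, height_map = extended_depth_of_field(stack)
print(composite.shape, int(height_map.min()), int(height_map.max()))
```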

Relevance:

90.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

90.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

90.00%

Abstract:

Concept drift is a problem of increasing importance in machine learning and data mining. Data sets under analysis are no longer only static databases, but also data streams in which concepts and data distributions may not be stable over time. However, most learning algorithms produced so far are based on the assumption that data come from a fixed distribution, so they are not suitable for handling concept drift. Moreover, some concept drift applications require a fast response, which means an algorithm must always be (re)trained with the latest available data. But the process of labeling data is usually expensive and/or time consuming when compared to unlabeled data acquisition, so only a small fraction of the incoming data may be effectively labeled. Semi-supervised learning methods may help in this scenario, as they use both labeled and unlabeled data in the training process. However, most of them are also based on the assumption that the data are static. Therefore, semi-supervised learning with concept drift is still an open challenge in machine learning. Recently, a particle competition and cooperation approach was used to realize graph-based semi-supervised learning from static data. In this paper, we extend that approach to handle data streams and concept drift. The result is a passive algorithm using a single classifier, which naturally adapts to concept changes without any explicit drift detection mechanism. Its built-in mechanisms provide a natural way of learning from new data, gradually forgetting older knowledge as older labeled data items become less influential in the classification of newer data items. Computer simulations are presented, showing the effectiveness of the proposed method.
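The particle competition and cooperation mechanism itself is specific to the cited approach. As a simpler illustration of graph-based semi-supervised learning on a stream, the sketch below propagates labels on a k-nearest-neighbour graph built over a sliding window of recent items, so items that leave the window gradually stop influencing new predictions. The window size, k, the stream and the use of scikit-learn's LabelSpreading are all illustrative assumptions, not the authors' algorithm.

```python
import numpy as np
from sklearn.semi_supervised import LabelSpreading

class WindowedGraphSSL:
    """Graph-based semi-supervised classifier over a sliding window of a stream:
    labels (-1 = unlabeled) are propagated on a kNN graph of the windowed items,
    so knowledge from items that left the window is gradually forgotten."""
    def __init__(self, window=300, k=7):
        self.window, self.k = window, k
        self.X, self.y = [], []

    def partial_fit(self, x, label=-1):
        self.X.append(x)
        self.y.append(label)
        self.X, self.y = self.X[-self.window:], self.y[-self.window:]

    def predict(self, x):
        model = LabelSpreading(kernel="knn", n_neighbors=self.k)
        model.fit(np.vstack(self.X + [x]), np.array(self.y + [-1]))
        return model.transduction_[-1]

# Toy stream with a drift: the two class means swap halfway through.
rng = np.random.default_rng(6)
clf = WindowedGraphSSL()
for t in range(600):
    cls = int(rng.integers(2))
    mean = (1.0 if cls == 1 else -1.0) * (1 if t < 300 else -1)   # concept drift at t = 300
    x = rng.normal(mean, 0.5, size=2)
    label = cls if t % 10 == 0 else -1                            # only 10% of items labeled
    clf.partial_fit(x, label)
print("prediction for a new item:", clf.predict(rng.normal(-1.0, 0.5, size=2)))
```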