928 results for Residual-based tests


Relevance: 30.00%

Publisher:

Abstract:

Abstract taken from the publication


Relevance: 30.00%

Publisher:

Abstract:

The presence of pathogenic microorganisms in food is one of the essential problems in public health, and the diseases they cause are among the most important causes of illness. The application of microbiological controls within quality assurance programmes is therefore a prerequisite for minimising the risk of infection to consumers. Classical microbiological methods generally require non-selective pre-enrichment, selective enrichment, isolation on selective media and subsequent confirmation using tests based on the morphology, biochemistry and serology of each target microorganism. These methods are therefore laborious, take a long time to deliver definitive results and cannot always be carried out. To overcome these drawbacks, various alternative methodologies have been developed for the detection, identification and quantification of foodborne pathogenic microorganisms, most notably immunological and molecular methods. Within the latter category, the technique based on the polymerase chain reaction (PCR) has become the most popular diagnostic technique in microbiology and, recently, the introduction of an improvement on it, real-time PCR, has produced a second revolution in molecular diagnostic methodology, as can be seen from the growing number of scientific publications and the continual appearance of new commercial kits. Real-time PCR is a highly sensitive technique, able to detect down to a single molecule, that allows exact quantification of DNA sequences specific to foodborne pathogenic microorganisms.
In addition, other advantages that favour its potential adoption in food-analysis laboratories are its speed, its simplicity and its closed-tube format, which can prevent post-PCR contamination and favours automation and high throughput. In this work, sensitive and reliable molecular techniques (PCR and NASBA) were developed for the detection, identification and quantification of pathogenic bacteria of food origin (Listeria spp., Mycobacterium avium subsp. paratuberculosis and Salmonella spp.). Specifically, methods based on real-time PCR were designed and optimised for each of these agents: L. monocytogenes, L. innocua, Listeria spp. and M. avium subsp. paratuberculosis; a previously developed method for Salmonella spp. was also optimised and evaluated in different centres. Furthermore, a method based on the NASBA technique was designed and optimised for the specific detection of M. avium subsp. paratuberculosis, and the potential application of NASBA for the specific detection of viable forms of this microorganism was also evaluated. All the methods showed 100 % specificity, with sensitivity adequate for potential application to real food samples. Sample-preparation procedures were also developed and evaluated for meat products, fishery products, milk and water. In this way, fully specific and highly sensitive real-time PCR methods were developed for the quantitative determination of L. monocytogenes in meat products and in salmon and derived products such as smoked salmon, and of M. avium subsp. paratuberculosis in water and milk samples. The latter method was also applied to assess the presence of this microorganism in the intestine of patients with Crohn's disease, using colonoscopy biopsies from affected volunteers.
In conclusion, this study presents selective and sensitive molecular assays for the detection of pathogens in food (Listeria spp., Mycobacterium avium subsp. paratuberculosis) and for the rapid and unambiguous identification of Salmonella spp. The relative accuracy of the assays was excellent when compared with the reference microbiological methods, and they can be used to quantify both genomic DNA and cell suspensions. Moreover, the combination with pre-amplification treatments proved highly efficient for the analysis of the target bacteria. They can therefore constitute a useful strategy for the rapid and sensitive detection of pathogens in food, and should be an additional tool in the range of diagnostic tools available for the study of foodborne pathogens.
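Quantification with real-time PCR is conventionally done through a standard curve relating the cycle threshold (Ct) of a dilution series to the log of the starting copy number. The abstract does not detail the thesis's calibration procedure, so the following Python sketch, with invented dilution data, only illustrates the general idea:

```python
def fit_standard_curve(log10_copies, ct_values):
    """Least-squares line Ct = slope * log10(copies) + intercept."""
    n = len(log10_copies)
    mx = sum(log10_copies) / n
    my = sum(ct_values) / n
    sxx = sum((x - mx) ** 2 for x in log10_copies)
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_copies, ct_values))
    slope = sxy / sxx
    intercept = my - slope * mx
    # Amplification efficiency implied by the slope
    # (a slope of about -3.32 corresponds to 100 % efficiency)
    efficiency = 10 ** (-1.0 / slope) - 1.0
    return slope, intercept, efficiency

def quantify(ct, slope, intercept):
    """Invert the standard curve: starting copies in an unknown sample."""
    return 10 ** ((ct - intercept) / slope)

# Hypothetical dilution series: 10^2..10^6 copies per reaction
dilutions = [2.0, 3.0, 4.0, 5.0, 6.0]
cts = [31.4, 28.1, 24.8, 21.5, 18.2]   # perfectly linear, for illustration
slope, intercept, eff = fit_standard_curve(dilutions, cts)
print(round(slope, 2), round(eff, 2))
print(quantify(25.0, slope, intercept))
```

The detection limit of "down to a single molecule" mentioned above corresponds to the lowest point of such a curve still producing a reliable Ct; the numbers here are purely illustrative.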

Relevance: 30.00%

Publisher:

Abstract:

Circuit testing is a phase of the production process whose importance keeps growing as new products are developed. Test and diagnosis techniques for digital circuits have been successfully developed and automated, whereas this is not yet the case for analogue circuits. Among all the methods proposed for diagnosing analogue circuits, the most widely used are fault dictionaries. This thesis describes several of them, analysing their advantages and drawbacks. In recent years, Artificial Intelligence techniques have become one of the most important research fields for fault diagnosis, and this thesis develops two such techniques to address some of the shortcomings of fault dictionaries. The first proposal builds a fuzzy system as an identification tool. The results obtained are quite good, since the fault is located in a high percentage of cases; the success rate, however, is not good enough when the deviation must also be determined. Since fault dictionaries can be seen as a simplified approximation to Case-Based Reasoning (CBR), the second proposal extends fault dictionaries towards a CBR system. The aim is not to give a general solution to the problem but to contribute a new methodology, which consists of improving the diagnosis of fault dictionaries by adding and adapting new cases so that they become a Case-Based Reasoning system. The structure of the case base is described, together with the retrieval, reuse, revision and retention tasks, with emphasis on the learning process.
Several circuits are used throughout the text to illustrate the test methods described, but the biquadratic filter in particular is used to evaluate the proposed methodologies, since it is one of the benchmarks proposed in the analogue-circuit context. The faults considered are parametric, permanent, independent and single, although the methodology can easily be extrapolated to the diagnosis of multiple and catastrophic faults. The method focuses on testing the passive components, although it could also be extended to faults in the active ones.
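As an illustration of the fault-dictionary approach that the thesis builds on, the sketch below matches a measured signature against pre-simulated fault signatures by nearest-neighbour distance. The component labels and measurement features (gain, cutoff frequency, Q factor) are invented for the example; a real dictionary would be generated by fault simulation of the circuit under test:

```python
import math

# Hypothetical fault dictionary for an analogue filter: each entry maps a
# fault label to the signature (gain, cutoff frequency in Hz, Q factor)
# simulated for that fault.
FAULT_DICTIONARY = {
    "nominal":  (1.00, 1000.0, 0.707),
    "R1 +20%":  (0.83, 1000.0, 0.707),
    "C2 -50%":  (1.00, 1410.0, 0.650),
    "R3 open":  (0.10,  250.0, 0.200),
}

def diagnose(measurement, dictionary=FAULT_DICTIONARY):
    """Nearest-neighbour lookup: return the fault whose stored signature is
    closest (relative Euclidean distance) to the measured signature."""
    def distance(sig):
        return math.sqrt(sum(((m - s) / s) ** 2
                             for m, s in zip(measurement, sig)))
    return min(dictionary, key=lambda fault: distance(dictionary[fault]))

print(diagnose((0.85, 990.0, 0.70)))   # closest to the "R1 +20%" entry
```

The CBR extension described above would grow this table at run time: a diagnosed measurement that no stored case explains well is adapted and retained as a new case.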

Relevance: 30.00%

Publisher:

Abstract:

This paper evaluates the routine of one pediatrics facility interested in incorporating a hearing screening protocol into their practice and suggests such a protocol using distortion product otoacoustic emission tests (DPOAE).

Relevance: 30.00%

Publisher:

Abstract:

A stochastic parameterization scheme for deep convection is described, suitable for use in both climate and NWP models. Theoretical arguments and the results of cloud-resolving models are discussed in order to motivate the form of the scheme. In the deterministic limit, it tends to a spectrum of entraining/detraining plumes and is similar to other current parameterizations. The stochastic variability describes the local fluctuations about a large-scale equilibrium state. Plumes are drawn at random from a probability distribution function (pdf) that defines the chance of finding a plume of given cloud-base mass flux within each model grid box. The normalization of the pdf is given by the ensemble-mean mass flux, and this is computed with a CAPE closure method. The characteristics of each plume produced are determined using an adaptation of the plume model from the Kain-Fritsch parameterization. Initial tests in the single-column version of the Unified Model verify that the scheme is effective in producing the desired distributions of convective variability without adversely affecting the mean state.
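The sampling step can be sketched as a compound-Poisson draw: the expected plume count follows from the ensemble-mean mass flux given by the CAPE closure, and individual cloud-base fluxes are drawn from an exponential pdf (the exponential form and the numbers below are assumptions of this sketch, not taken from the paper):

```python
import math
import random

def poisson(lam, rng):
    """Poisson draw via Knuth's algorithm (adequate for small means)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def draw_plume_ensemble(mean_total_flux, mean_plume_flux, rng=None):
    """Sample one realisation of the plume ensemble for one grid box.

    mean_total_flux -- ensemble-mean mass flux from the CAPE closure
    mean_plume_flux -- mean cloud-base mass flux of a single plume
    The plume count is Poisson with mean mean_total_flux/mean_plume_flux,
    and each plume's flux is exponentially distributed (assumed pdf).
    """
    rng = rng or random.Random(0)
    n = poisson(mean_total_flux / mean_plume_flux, rng)
    return [rng.expovariate(1.0 / mean_plume_flux) for _ in range(n)]

fluxes = draw_plume_ensemble(mean_total_flux=0.1, mean_plume_flux=0.01)
print(len(fluxes), round(sum(fluxes), 3))
```

By construction the ensemble mean of the total flux recovers the closure value, while individual realisations fluctuate about it; the relative fluctuations shrink as the grid box, and hence the expected plume count, grows, which is the behaviour a stochastic convection scheme needs.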

Relevance: 30.00%

Publisher:

Abstract:

Shallow groundwater beneath a former airfield site in southern England has been heavily contaminated with a wide range of chlorinated solvents. The feasibility of using bacterial biosensors to complement chemical analysis and enable cost-effective, focused sampling has been assessed as part of a site evaluation programme. Five different biosensors, three metabolic (Vibrio fischeri, Pseudomonas fluorescens 10568 and Escherichia coli HB101) and two catabolic (Pseudomonas putida TVA8 and E. coli DH5alpha), were employed to identify areas where the availability and toxicity of pollutants are of most immediate environmental concern. The biosensors used showed different sensitivities to each other and to the groundwater samples tested, and there was generally good agreement with the chemical analyses. The potential efficacy of remediation strategies was explored by coupling sample manipulation to the biosensor tests. Manipulation involved sparging and charcoal-treatment procedures to simulate remediative engineering solutions; sparging was sufficient at most locations. (C) 2004 Elsevier Ltd. All rights reserved.

Relevance: 30.00%

Publisher:

Abstract:

The Olsen method is an indicator of plant-available phosphorus (P). The effect of time and temperature on residual phosphate in soils was measured using the Olsen method in a pot experiment. Four soils were investigated: two from Pakistan and one each from England (calcareous) and Colombia (acidic). Two levels of residual phosphate were developed in each soil after addition of phosphate by incubation at either 10 °C or 45 °C. The amount of phosphate added was based on the P maximum of each soil, calculated using the Langmuir equation. Ryegrass was used as the test crop. The pooled data for the four soils incubated at 10 °C showed good correlation between Olsen P and dry matter yield or P uptake (r² = 0.85 and 0.77, respectively), whereas at 45 °C each soil had its own relationship and the pooled data showed no correlation of Olsen P with dry matter yield or P uptake. When the data at both temperatures were pooled, Olsen P was a good indicator of yield and uptake for the English soil. For the Pakistani soils, Olsen P after the 45 °C treatment was an underestimate relative to the 10 °C data, and for the Colombian soil it was an overestimate. The reasons for these differences need to be explored further before high-temperature incubation can be used to simulate long-term changes in the field.
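The reported correlations are coefficients of determination from regressing dry matter yield (or P uptake) on Olsen P. A minimal sketch of that calculation, with invented pot-trial numbers:

```python
def r_squared(x, y):
    """Coefficient of determination for a simple linear regression of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    syy = sum((yi - my) ** 2 for yi in y)
    return sxy * sxy / (sxx * syy)

# Hypothetical pot-trial data: Olsen P (mg/kg) vs dry matter yield (g/pot)
olsen_p = [5.0, 12.0, 20.0, 31.0, 45.0, 60.0]
yield_g = [1.1, 2.0, 2.9, 4.2, 5.1, 6.4]
print(round(r_squared(olsen_p, yield_g), 2))
```

Pooling the soils, as in the study, amounts to concatenating the (Olsen P, yield) pairs from all four soils before computing r²; a low pooled r² alongside high per-soil values is the signature of each soil "having its own relationship".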

Relevance: 30.00%

Publisher:

Abstract:

It has been generally accepted that the method of moments (MoM) variogram, which has been widely applied in soil science, requires about 100 sites at an appropriate interval apart to describe the variation adequately. This sample size is often larger than can be afforded for soil surveys of agricultural fields or contaminated sites. Furthermore, it might be a much larger sample size than is needed where the scale of variation is large. A possible alternative in such situations is the residual maximum likelihood (REML) variogram, because fewer data appear to be required. The REML method is parametric and is considered reliable where there is trend in the data, because it is based on generalized increments that filter trend out and only the covariance parameters are estimated. Previous research has suggested that fewer data are needed to compute a reliable variogram using a maximum likelihood approach such as REML; however, the results can vary according to the nature of the spatial variation. There remain issues to examine: how many fewer data can be used, how should the sampling sites be distributed over the site of interest, and how do different degrees of spatial variation affect the data requirements? The soil of four field sites of different size, physiography, parent material and soil type was sampled intensively, and MoM and REML variograms were calculated for clay content. The data were then sub-sampled to give different sample sizes and distributions of sites, and the variograms were computed again. The model parameters for the sets of variograms for each site were used for cross-validation. Predictions based on REML variograms were generally more accurate than those from MoM variograms with fewer than 100 sampling sites.
A sample size of around 50 sites at an appropriate distance apart, possibly determined from variograms of ancillary data, appears adequate to compute REML variograms for kriging soil properties for precision agriculture and contaminated sites. (C) 2007 Elsevier B.V. All rights reserved.
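For reference, the method-of-moments (Matheron) variogram that REML is compared against averages squared differences between site pairs at each lag. A sketch on an invented 1-D transect of clay content:

```python
def mom_variogram(coords, values, lags, tol):
    """Method-of-moments (Matheron) semivariance estimate.
    For each lag h, average 0.5 * (z_i - z_j)^2 over all site pairs whose
    separation distance falls within h +/- tol."""
    gamma = []
    for h in lags:
        num, count = 0.0, 0
        for i in range(len(coords)):
            for j in range(i + 1, len(coords)):
                d = abs(coords[i] - coords[j])
                if abs(d - h) <= tol:
                    num += 0.5 * (values[i] - values[j]) ** 2
                    count += 1
        gamma.append(num / count if count else float("nan"))
    return gamma

# Hypothetical 1-D transect of clay content (%) sampled every 20 m
x = [0, 20, 40, 60, 80, 100, 120, 140]
clay = [22.0, 23.5, 21.0, 25.0, 26.5, 24.0, 28.0, 27.5]
print(mom_variogram(x, clay, lags=[20, 40, 60], tol=1.0))
```

The sample-size problem discussed above is visible even in this toy: the longest lags are estimated from very few pairs, which is why the MoM estimator becomes unreliable when fewer than about 100 sites are available, and why a parametric likelihood method such as REML can do better with around 50.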

Relevance: 30.00%

Publisher:

Abstract:

An unbalanced nested sampling design was used to investigate the spatial scale of soil and herbicide interactions at the field scale. A hierarchical analysis of variance based on residual maximum likelihood (REML) was used to analyse the data and provide a first estimate of the variogram. Soil samples were taken at 108 locations at a range of separating distances in a 9 ha field to explore small- and medium-scale spatial variation. Soil organic matter content, pH, particle size distribution, microbial biomass and the degradation and sorption of the herbicide isoproturon were determined for each soil sample. A large proportion of the spatial variation in isoproturon degradation and sorption occurred at sampling intervals less than 60 m; however, the sampling design did not resolve the variation present at scales greater than this. A sampling interval of 20-25 m should ensure that the main spatial structures are identified for isoproturon degradation rate and sorption without too great a loss of information in this field.

Relevance: 30.00%

Publisher:

Abstract:

We introduce a procedure for association-based analysis of nuclear families that allows for dichotomous and more general measurements of phenotype and inclusion of covariate information. Standard generalized linear models are used to relate phenotype and its predictors. Our test procedure, based on the likelihood ratio, unifies the estimation of all parameters through the likelihood itself and yields maximum likelihood estimates of the genetic relative risk and interaction parameters. Our method has advantages in modelling the covariate and gene-covariate interaction terms over recently proposed conditional score tests that include covariate information via a two-stage modelling approach. We apply our method in a study of human systemic lupus erythematosus and the C-reactive protein that includes sex as a covariate.
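The paper's test is a likelihood-ratio comparison of nested generalized linear models fitted to family data. As a minimal stand-in (not the paper's family-based procedure), the sketch below runs a likelihood-ratio test that two hypothetical genotype groups share one risk probability, where the maximum likelihood estimates have closed forms:

```python
import math

def binom_loglik(k, n, p):
    # Binomial log-likelihood for k successes in n trials; the constant
    # binomial coefficient is omitted since it cancels in the ratio.
    return k * math.log(p) + (n - k) * math.log(1 - p)

def lr_test(k1, n1, k2, n2):
    """Likelihood-ratio test that two groups share one success probability.
    Returns the statistic 2*(logL_full - logL_null), asymptotically
    chi-square with 1 degree of freedom under the null."""
    p1, p2 = k1 / n1, k2 / n2          # MLEs under the full model
    p0 = (k1 + k2) / (n1 + n2)         # MLE under the null (shared p)
    full = binom_loglik(k1, n1, p1) + binom_loglik(k2, n2, p2)
    null = binom_loglik(k1, n1, p0) + binom_loglik(k2, n2, p0)
    return 2.0 * (full - null)

# Hypothetical counts: affected/total among risk-allele carriers vs others
stat = lr_test(30, 50, 15, 50)
print(round(stat, 2))   # compare against the chi-square(1) cutoff 3.84
```

The paper's version replaces the two binomial likelihoods with a generalized linear model including covariates and gene-covariate interactions; the structure of the test — twice the log-likelihood gap between nested fits — is the same.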

Relevance: 30.00%

Publisher:

Abstract:

As a continuing effort to establish the structure-activity relationships (SARs) within the series of the angiotensin II antagonists (sartans), a pharmacophoric model was built by using novel TOPP 3D descriptors. Statistical values were satisfactory (PC4: r² = 0.96, q² (5 random groups) = 0.84; SDEP = 0.26) and encouraged the synthesis and consequent biological evaluation of a series of new pyrrolidine derivatives. SAR together with a combined 3D quantitative SAR and high-throughput virtual screening showed that the newly synthesized 1-acyl-N-(biphenyl-4-ylmethyl)pyrrolidine-2-carboxamides may represent an interesting starting point for the design of new antihypertensive agents. In particular, biological tests performed on CHO-hAT(1) cells stably expressing the human AT(1) receptor showed that the length of the acyl chain is crucial for the receptor interaction and that the valeric chain is the optimal one.

Relevance: 30.00%

Publisher:

Abstract:

Heterogeneity in lifetime data may be modelled by multiplying an individual's hazard by an unobserved frailty. We test for the presence of frailty of this kind in univariate and bivariate data with Weibull-distributed lifetimes, using statistics based on the ordered Cox-Snell residuals from the null model of no frailty. The form of the statistics is suggested by outlier testing in the gamma distribution. We find through simulation that the sum of the k largest or k smallest order statistics, for suitably chosen k, provides a powerful test when the frailty distribution is assumed to be gamma or positive stable, respectively. We provide recommended values of k for sample sizes up to 100 and simple formulae for estimated critical values for tests at the 5% level.
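A sketch of the test statistic, assuming the Weibull null parameters have already been fitted (the critical values come from the paper's simulations, which are omitted here; the lifetimes and parameters below are invented):

```python
def cox_snell_residuals(times, shape, scale):
    """Cox-Snell residuals under a fitted Weibull(shape, scale) null:
    r_i = H(t_i) = (t_i / scale) ** shape, the cumulative hazard at t_i.
    Under a correct null model these behave like a unit-exponential sample."""
    return sorted((t / scale) ** shape for t in times)

def k_largest_statistic(residuals, k):
    """Sum of the k largest ordered residuals -- the statistic used when the
    suspected frailty distribution is gamma (use the k smallest instead
    when it is positive stable)."""
    return sum(sorted(residuals)[-k:])

# Hypothetical lifetimes, with Weibull null parameters assumed already fitted
times = [0.4, 0.9, 1.1, 1.6, 2.2, 3.0, 4.5, 7.0]
res = cox_snell_residuals(times, shape=1.2, scale=2.0)
print(round(k_largest_statistic(res, k=3), 3))
```

In use, the observed statistic would be compared against a simulated or tabulated 5% critical value for the given sample size and choice of k; an unusually large value signals more spread in the extreme residuals than the no-frailty Weibull model can explain.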