912 results for Appearance-based methods


Relevance:

90.00%

Publisher:

Abstract:

Blood culture remains the best approach to identifying the causative microorganisms when a bloodstream infection is suspected, and to ensuring that the antimicrobial treatment is adequate. Major improvements have been made in recent years to increase sensitivity and specificity and to reduce the time to identification of microorganisms recovered from blood cultures. Among other factors, the introduction of matrix-assisted laser desorption ionization time-of-flight mass spectrometry into clinical microbiology laboratories revolutionized the identification of microorganisms, whereas the introduction of nucleic-acid-based methods, such as DNA hybridization or rapid PCR-based tests, significantly reduced the time to results. Together with traditional antimicrobial susceptibility testing, new rapid methods for the detection of resistance mechanisms address major epidemiological concerns such as methicillin-resistant Staphylococcus aureus, extended-spectrum β-lactamases, and carbapenemases. This review presents and discusses recent developments in the microbial diagnosis of bloodstream infections based on blood cultures.

Relevance:

90.00%

Publisher:

Abstract:

BACKGROUND: HIV surveillance requires monitoring of new HIV diagnoses and differentiation of incident from older infections. In 2008, Switzerland implemented a system for monitoring incident HIV infections based on the results of a line immunoassay (Inno-Lia) mandatorily conducted for HIV confirmation and type differentiation (HIV-1, HIV-2) of all newly diagnosed patients. Based on this system, we assessed the proportion of incident HIV infections among newly diagnosed cases in Switzerland during 2008-2013. METHODS AND RESULTS: Inno-Lia antibody reaction patterns recorded in anonymous HIV notifications to the federal health authority were classified by 10 published algorithms into incident (up to 12 months) or older infections. Using these data, annual incident infection estimates were obtained in two ways: (i) based on the diagnostic performance of the algorithms, using the relationship 'incident = true incident + false incident'; and (ii) based on the window periods of the algorithms, using the relationship 'prevalence = incidence × duration'. From 2008 to 2013, 3,851 HIV notifications were received. Adult HIV-1 infections amounted to 3,809 cases, and 3,636 of them (95.5%) contained Inno-Lia data. The incident infection totals calculated were similar for the performance- and window-based methods, amounting on average to 1,755 (95% confidence interval [CI], 1,588-1,923) and 1,790 cases (95% CI, 1,679-1,900), respectively. More than half of these were among men who have sex with men. Both methods showed a continuous decline in annual incident infections over 2008-2013, totaling -59.5% and -50.2%, respectively. The decline in incident infections continued even in 2012, when a 15% increase in HIV notifications was observed; this increase was entirely due to older infections. The overall declines over 2008-2013 were of similar extent among the major transmission groups. CONCLUSIONS: Inno-Lia-based incident HIV-1 infection surveillance proved useful and reliable. It represents a free, additional public health benefit of this relatively costly test for HIV confirmation and type differentiation.
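The two estimation strategies can be made concrete with a short sketch. This is a minimal illustration, not the study's code: the function names, the sensitivity and false-incident rate, the 10-month window, and all counts below are assumptions for demonstration only.

```python
# Two ways of estimating annual incident infections from recency-classified
# notifications, mirroring the relationships quoted in the abstract.

def performance_based(n_classified_incident, sensitivity, false_incident_rate, n_total):
    """'incident = true incident + false incident':
    observed = sensitivity*true + false_rate*(total - true); solve for true."""
    return ((n_classified_incident - false_incident_rate * n_total)
            / (sensitivity - false_incident_rate))

def window_based(n_in_window, window_months):
    """Rearrange 'prevalence = incidence x duration': cases found inside the
    recency window equal incidence times window length, so scale to a year."""
    return n_in_window * 12.0 / window_months

# Illustrative numbers only:
print(performance_based(n_classified_incident=350, sensitivity=0.9,
                        false_incident_rate=0.05, n_total=600))
print(window_based(n_in_window=160, window_months=10))
```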

Relevance:

90.00%

Publisher:

Abstract:

Computational model-based simulation methods were developed for the modelling of bioaffinity assays. Bioaffinity-based methods are widely used to quantify biological substances in biological research and development and in routine clinical in vitro diagnostics. Bioaffinity assays are based on the high affinity and structural specificity of the binding between biomolecules. The simulation methods developed are based on a mechanistic assay model, which relies on chemical reaction kinetics and describes the formation of the bound component as a function of time from the initial binding interaction. The simulation methods focused on studying the behaviour and reliability of bioaffinity assays and the possibilities offered by modelling binding-reaction kinetics, such as predicting assay results even before the binding reaction has reached equilibrium. A rapid quantitative result from a clinical bioaffinity assay can be very significant; for example, even the smallest elevation of a heart-muscle marker reveals a cardiac injury. The simulation methods were used to identify critical error factors in rapid bioaffinity assays. A new kinetic calibration method was developed to calibrate a measurement system from kinetic measurement data using only one standard concentration. A node-based method was developed to model multi-component binding reactions, which have been a challenge for traditional numerical methods. The node-based method was also used to model protein adsorption as an example of nonspecific binding of biomolecules. These methods have been compared with experimental data and can be utilized in in vitro diagnostics, drug discovery and medical imaging.
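A minimal sketch of the kind of mechanistic kinetic model such simulations rest on, assuming a simple 1:1 binding reaction A + B ⇌ AB with mass-action kinetics; the rate constants and concentrations below are illustrative, not values from the thesis.

```python
import numpy as np

def simulate_binding(a0, b0, k_on, k_off, t_end, dt=0.01):
    """Integrate d[AB]/dt = k_on*[A][B] - k_off*[AB] with forward Euler."""
    steps = int(t_end / dt)
    ab = 0.0
    trace = np.empty(steps)
    for i in range(steps):
        a, b = a0 - ab, b0 - ab          # free concentrations by mass balance
        ab += dt * (k_on * a * b - k_off * ab)
        trace[i] = ab
    return trace

# The kinetic-calibration idea: the early kinetic signal already predicts the
# equilibrium value, so a result can be read before equilibrium is reached.
trace = simulate_binding(a0=1e-9, b0=5e-9, k_on=1e6, k_off=1e-4, t_end=600.0)
print(trace[100], trace[-1])   # early read-out vs. near-equilibrium value
```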

Relevance:

90.00%

Publisher:

Abstract:

In this paper, we review the advances in monocular model-based tracking over the ten-year period up to 2014. In 2005, Lepetit et al. [19] reviewed the status of monocular model-based rigid-body tracking. Since then, direct 3D tracking has become a popular research area, but monocular model-based tracking should still not be forgotten. We mainly focus on tracking that can be applied to augmented reality, but some other applications are covered as well. Given the wide subject area, this paper tries to give a broad view of the research that has been conducted, giving the reader an introduction to the different disciplines that are tightly related to model-based tracking. The work was conducted by searching well-known academic search databases in a systematic manner and by selecting certain publications for closer examination. We analyze the results by dividing the papers found into categories according to their way of implementation, and we discuss the issues that have not yet been solved. We also discuss emerging model-based methods, such as the fusion of different types of features and region-based pose estimation, which could show the way for future research in this subject.

Relevance:

90.00%

Publisher:

Abstract:

Tannins, typically divided into two major groups, the hydrolyzable tannins (HTs) and the proanthocyanidins (PAs), are plant polyphenolic secondary metabolites found throughout the plant kingdom. On the one hand, tannins may have harmful nutritional effects on herbivores, for example insects, and hence they serve as plants' defense against plant-eating animals. On the other hand, they may affect some herbivores, such as mammals, positively, for example through their antioxidant, antimicrobial, anti-inflammatory or anticarcinogenic activities. This thesis focuses on understanding the bioactivity of plant tannins, their anthelmintic properties, and the tools used for the qualitative and quantitative analysis of this endless source of structural diversity. The first part of the experimental work focused on the development of ultra-high performance liquid chromatography−tandem mass spectrometry (UHPLC-MS/MS) based methods for the rapid fingerprint analysis of bioactive polyphenols, especially tannins. In the second part of the experimental work, the in vitro activity of isolated and purified HTs and their hydrolysis product, gallic acid, was tested against egg hatching and larval motility of two larval developmental stages, L1 and L2, of a common ruminant gastrointestinal parasite, Haemonchus contortus. The results indicated clear relationships between HT structure and anthelmintic activity. The activity of the studied compounds depended on many structural features, including size, the functional groups present in the structure, and structural rigidity. To further understand tannin bioactivity at the molecular level, the interaction between bovine serum albumin (BSA) and seven HTs plus epigallocatechin gallate was examined. The objective was to define the effect of pH on the formation of tannin–protein complexes and to evaluate the stability of the formed complexes by gel electrophoresis and MALDI-TOF-MS. The results indicated that more basic pH values had a stabilizing effect on the tannin–protein complexes and that the tannins' oxidative activity was directly linked with their tendency to form covalently stabilized complexes with BSA at increased pH.

Relevance:

90.00%

Publisher:

Abstract:

Evapotranspiration (ET) is a complex process in the hydrological cycle that influences the quantity of runoff and thus the irrigation water requirements. Numerous methods have been developed to estimate potential evapotranspiration (PET). Unfortunately, most of the reliable PET methods are parameter rich models and therefore, not feasible for application in data scarce regions. On the other hand, accuracy and reliability of simple PET models vary widely according to regional climate conditions. The objective of the present study was to evaluate the performance of three temperature-based and three radiation-based simple ET methods in estimating historical ET and projecting future ET at Muda Irrigation Scheme at Kedah, Malaysia. The performance was measured by comparing those methods with the parameter intensive Penman-Monteith Method. It was found that radiation based methods gave better performance compared to temperature-based methods in estimation of ET in the study area. Future ET simulated from projected climate data obtained through statistical downscaling technique also showed that radiation-based methods can project closer ET values to that projected by Penman-Monteith Method. It is expected that the study will guide in selecting suitable methods for estimating and projecting ET in accordance to availability of meteorological data.
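As an illustration of the two families of simple PET methods, here is a hedged sketch of one temperature-based formula (Hargreaves-Samani) and one radiation-based formula (Priestley-Taylor). The abstract does not name the six methods actually evaluated, so these choices, and all input values, are assumptions.

```python
import math

def hargreaves(t_mean, t_max, t_min, ra):
    """Hargreaves-Samani PET (mm/day); ra is extraterrestrial radiation
    expressed in mm/day of evaporation equivalent."""
    return 0.0023 * ra * (t_mean + 17.8) * math.sqrt(t_max - t_min)

def priestley_taylor(rn, t_mean, g=0.0, alpha=1.26, gamma=0.066):
    """Priestley-Taylor PET (mm/day); rn and g in MJ m-2 day-1,
    gamma is the psychrometric constant (kPa/degC)."""
    # slope of the saturation vapour-pressure curve at t_mean
    delta = (4098 * 0.6108 * math.exp(17.27 * t_mean / (t_mean + 237.3))
             / (t_mean + 237.3) ** 2)
    lam = 2.45                       # latent heat of vaporization, MJ kg-1
    return alpha * delta / (delta + gamma) * (rn - g) / lam

# Illustrative humid-tropics values for a site like the Muda area:
print(hargreaves(t_mean=27.0, t_max=33.0, t_min=23.0, ra=15.0))
print(priestley_taylor(rn=12.0, t_mean=27.0))
```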

Relevance:

90.00%

Publisher:

Abstract:

Recently, interest in developing applications for autonomous underwater vehicles (AUVs) has grown considerably. AUVs are attractive because of their size and because they do not need a human operator to pilot them. Even so, it is impossible to compare, in terms of efficiency and flexibility, the ability of a human pilot with the limited operational capabilities offered by current AUVs. Using AUVs to cover large areas involves solving complex problems, especially if the robot is expected to react in real time to sudden changes in working conditions. For these reasons, the development of autonomous control systems to improve these capabilities has become a priority. This thesis addresses the problem of decision-making with AUVs. The work presented focuses on the study, design, and application of behaviours for AUVs using reinforcement learning (RL) techniques. The main contribution of this thesis is the application of several RL techniques to improve the autonomy of underwater robots, with the final aim of demonstrating the feasibility of these algorithms for learning autonomous underwater tasks in real time. In RL, the robot tries to maximize a scalar reward obtained as a consequence of its interaction with the environment. The goal is to find an optimal policy that maps every possible state to the action to execute in that state so as to maximize the total cumulative reward. This thesis therefore mainly investigates two families of RL algorithms: value-function (VF) methods and policy-gradient (PG) methods. The final experimental results show the underwater robot Ictineu in a real autonomous underwater cable-tracking task. To carry it out, an algorithm called the Actor-Critic (AC) method, resulting from the fusion of VF methods with PG techniques, was designed.
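A minimal tabular sketch of the actor-critic scheme that the thesis builds from VF and PG methods: the critic learns state values by temporal differences, and its TD error drives the actor's policy-gradient update. The toy environment, sizes, and learning rates are illustrative assumptions, not the Ictineu cable-tracking setup.

```python
import numpy as np

rng = np.random.default_rng(0)
n_states, n_actions = 10, 3
V = np.zeros(n_states)                   # critic: state-value estimates
prefs = np.zeros((n_states, n_actions))  # actor: softmax action preferences

def policy(s):
    p = np.exp(prefs[s] - prefs[s].max())
    return p / p.sum()

def step(s, a):                          # toy stand-in for the environment
    s2 = (s + a - 1) % n_states          # actions move the state -1/0/+1
    return s2, 1.0 if s2 == 0 else 0.0   # reward at the goal state

alpha_v, alpha_p, gamma = 0.1, 0.05, 0.95
s = 5
for _ in range(5000):
    a = rng.choice(n_actions, p=policy(s))
    s2, r = step(s, a)
    td_error = r + gamma * V[s2] - V[s]  # critic's TD error drives both updates
    V[s] += alpha_v * td_error
    grad = -policy(s); grad[a] += 1.0    # gradient of log softmax policy
    prefs[s] += alpha_p * td_error * grad
    s = s2
print(V.round(2))
```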

Relevance:

90.00%

Publisher:

Abstract:

The presence of pathogenic microorganisms in food is one of the key problems in public health, and the diseases they cause are one of the most important sources of illness. The application of microbiological controls within quality-assurance programmes is therefore a prerequisite for minimizing the risk of infection for consumers. Classical microbiological methods generally require non-selective pre-enrichment, selective enrichment, isolation on selective media, and subsequent confirmation using tests based on the morphology, biochemistry, and serology of each microorganism under study. These methods are therefore laborious, require a long process to obtain definitive results and, moreover, cannot always be performed. To overcome these drawbacks, several alternative methodologies have been developed for the detection, identification, and quantification of foodborne pathogenic microorganisms, most notably immunological and molecular methods. In the latter category, the technique based on the polymerase chain reaction (PCR) has become the most popular diagnostic technique in microbiology and, recently, the introduction of an improvement to it, real-time PCR, has produced a second revolution in molecular diagnostic methodology, as can be seen from the growing number of scientific publications and the continuous appearance of new commercial kits. Real-time PCR is a highly sensitive technique (detection down to a single molecule) that allows the exact quantification of DNA sequences specific to foodborne pathogenic microorganisms. In addition, other advantages that favour its potential adoption in food-analysis laboratories are its speed, simplicity, and closed-tube format, which can prevent post-PCR contamination and favours automation and high throughput. In this work, sensitive and reliable molecular techniques (PCR and NASBA) were developed for the detection, identification, and quantification of pathogenic bacteria of food origin (Listeria spp., Mycobacterium avium subsp. paratuberculosis and Salmonella spp.). Specifically, methods based on real-time PCR were designed and optimized for each of these agents: L. monocytogenes, L. innocua, Listeria spp. and M. avium subsp. paratuberculosis; a previously developed method for Salmonella spp. was also optimized and evaluated in different centres. In addition, a NASBA-based method was designed and optimized for the specific detection of M. avium subsp. paratuberculosis, and the potential application of the NASBA technique for the specific detection of viable forms of this microorganism was evaluated. All the methods showed 100% specificity, with sensitivity adequate for potential application to real food samples. Sample-preparation procedures were also developed and evaluated for meat products, fishery products, milk, and water. In this way, fully specific and highly sensitive real-time PCR methods were developed for the quantitative determination of L. monocytogenes in meat products and in salmon and derived products such as smoked salmon, and of M. avium subsp. paratuberculosis in water and milk samples.
Moreover, this last method was also applied to assess the presence of this microorganism in the intestine of patients with Crohn's disease, using biopsies obtained by colonoscopy from affected volunteers. In conclusion, this study presents selective and sensitive molecular assays for the detection of pathogens in food (Listeria spp., Mycobacterium avium subsp. paratuberculosis) and for the rapid and unambiguous identification of Salmonella spp. The relative accuracy of the assays was excellent compared with the reference microbiological methods, and they can be used for the quantification of both genomic DNA and cell suspensions. Furthermore, the combination with pre-amplification treatments proved highly efficient for the analysis of the bacteria under study. They can therefore constitute a useful strategy for the rapid and sensitive detection of pathogens in food and should be an additional tool in the range of diagnostic tools available for the study of foodborne pathogens.
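To make the quantification step concrete, here is a minimal sketch of how real-time PCR quantification is commonly done: fit a standard curve of Ct against log10 copy number, then invert it for unknown samples. The Ct values and helper names are illustrative, not data or code from this thesis.

```python
import numpy as np

log10_copies = np.array([2, 3, 4, 5, 6], dtype=float)   # dilution standards
ct_values    = np.array([31.1, 27.8, 24.4, 21.0, 17.7]) # measured Ct (invented)

slope, intercept = np.polyfit(log10_copies, ct_values, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0   # ~1.0 means 100% amplification

def quantify(ct):
    """Invert the standard curve: Ct = slope*log10(N0) + intercept."""
    return 10 ** ((ct - intercept) / slope)

print(f"efficiency: {efficiency:.2f}")
print(f"copies in unknown (Ct 25.2): {quantify(25.2):.0f}")
```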

Relevance:

90.00%

Publisher:

Abstract:

This paper reports the current state of work to simplify our previous model-based methods for visual tracking of vehicles, for use in a real-time system intended to provide continuous monitoring and classification of traffic from a fixed camera on a busy multi-lane motorway. The main constraints of the system design were: (i) all low-level processing to be carried out by low-cost auxiliary hardware; (ii) all 3-D reasoning to be carried out automatically off-line, at set-up time. The system developed uses three main stages: (i) pose and model hypothesis generation using 1-D templates, (ii) hypothesis tracking, and (iii) hypothesis verification using 2-D templates. Stages (i) and (iii) have radically different computing performance and computational costs, and need to be carefully balanced for efficiency. Together, they provide an effective way to locate, track and classify vehicles.
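A structural sketch of the three-stage design described above, with placeholder bodies: the actual template matching runs on dedicated low-cost hardware, and the class and function names here are assumptions, not the paper's code.

```python
from dataclasses import dataclass

@dataclass
class Hypothesis:
    pose: tuple       # (x, y, orientation) on the ground plane
    model: str        # vehicle class, e.g. "car", "van", "lorry"
    score: float

def hypothesise(frame):
    """Stage (i): cheap 1-D templates propose pose/model hypotheses."""
    return [Hypothesis(pose=(0.0, 0.0, 0.0), model="car", score=0.8)]

def track(hypotheses, previous):
    """Stage (ii): associate new hypotheses with tracks from earlier frames."""
    return hypotheses

def verify(frame, hypotheses, threshold=0.7):
    """Stage (iii): expensive 2-D templates confirm or reject each track."""
    return [h for h in hypotheses if h.score >= threshold]

tracks = []
for frame in range(3):                 # stand-in for frames from the camera
    tracks = verify(frame, track(hypothesise(frame), tracks))
print(tracks)
```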

Relevance:

90.00%

Publisher:

Abstract:

In an immersive virtual environment, observers fail to notice the expansion of a room around them and consequently make gross errors when comparing the size of objects. This result is difficult to explain if the visual system continuously generates a 3-D model of the scene based on known baseline information from interocular separation or proprioception as the observer walks. An alternative is that observers use view-based methods to guide their actions and to represent the spatial layout of the scene. In this case, they may have an expectation of the images they will receive but be insensitive to the rate at which images arrive as they walk. We describe the way in which the eye movement strategy of animals simplifies motion processing if their goal is to move towards a desired image and discuss dorsal and ventral stream processing of moving images in that context. Although many questions about view-based approaches to scene representation remain unanswered, the solutions are likely to be highly relevant to understanding biological 3-D vision.

Relevance:

90.00%

Publisher:

Abstract:

Objective: This paper presents a detailed study of fractal-based methods for texture characterization of mammographic mass lesions and architectural distortion. The purpose of this study is to explore the use of fractal and lacunarity analysis for the characterization and classification of both tumor lesions and normal breast parenchyma in mammography. Materials and methods: We conducted comparative evaluations of five popular fractal dimension estimation methods for characterizing the texture of mass lesions and architectural distortion, and we applied the concept of lacunarity to describe the spatial distribution of pixel intensities in mammographic images. These methods were tested on a set of 57 breast masses and 60 normal breast parenchyma regions (dataset 1), and on another set of 19 architectural distortions and 41 normal breast parenchyma regions (dataset 2). Support vector machines (SVM) were used as the pattern classification method for tumor classification. Results: Experimental results showed that the fractal dimension of regions of interest (ROIs) depicting mass lesions and architectural distortion was statistically significantly lower than that of normal breast parenchyma for all five methods. Receiver operating characteristic (ROC) analysis showed that the fractional Brownian motion (FBM) method generated the highest area under the ROC curve of the five methods for both datasets (Az = 0.839 for dataset 1 and 0.828 for dataset 2). Lacunarity analysis showed that ROIs depicting mass lesions and architectural distortion had higher lacunarity than ROIs depicting normal breast parenchyma. The combination of FBM fractal dimension and lacunarity yielded higher Az values (0.903 and 0.875, respectively) than any single feature alone for both datasets. The application of the SVM improved the performance of the fractal-based features in differentiating tumor lesions from normal breast parenchyma, generating higher Az values. Conclusion: The FBM texture model is the most appropriate model for characterizing mammographic images, as its self-affinity assumption is a better approximation of these textures. Lacunarity is an effective counterpart to the fractal dimension in texture feature extraction from mammographic images. The classification results obtained in this work suggest that the SVM is an effective method with great potential for classification in mammographic image analysis.
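For concreteness, here is a minimal numpy sketch (not the paper's code) of two of the texture measures discussed: a box-counting fractal dimension of a thresholded ROI and a gliding-box lacunarity. The FBM estimator that the results actually favoured is more involved; the random ROI is a stand-in for a mammographic region of interest.

```python
import numpy as np

def box_counting_dimension(binary, sizes=(2, 4, 8, 16)):
    """Fit log(count) vs log(1/size), counting boxes with any foreground."""
    counts = []
    for s in sizes:
        h, w = binary.shape
        view = binary[:h - h % s, :w - w % s]
        blocks = view.reshape(h // s, s, w // s, s).any(axis=(1, 3))
        counts.append(blocks.sum())
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

def lacunarity(image, box=8):
    """Gliding-box lacunarity: <M^2>/<M>^2 = var/mean^2 + 1 of box 'mass'."""
    h, w = image.shape
    masses = np.array([image[i:i + box, j:j + box].sum()
                       for i in range(h - box + 1)
                       for j in range(w - box + 1)], dtype=float)
    return masses.var() / masses.mean() ** 2 + 1.0

rng = np.random.default_rng(1)
roi = rng.random((64, 64))               # stand-in for a mammographic ROI
print(box_counting_dimension(roi > 0.5), lacunarity(roi))
```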

Relevance:

90.00%

Publisher:

Abstract:

Most active-contour methods are based either on maximizing the image contrast under the contour or on minimizing the sum of squared distances between the contour and image 'features'. The Marginalized Likelihood Ratio (MLR) contour model uses a contrast-based measure of goodness-of-fit for the contour and thus falls into the first class. The point of departure from previous models consists in marginalizing this contrast measure over unmodelled shape variations. The MLR model naturally leads to the EM Contour algorithm, in which pose optimization is carried out by iterated least squares, as in feature-based contour methods. The difference with respect to other feature-based algorithms is that the EM Contour algorithm minimizes squared distances from Bayes least-squares (marginalized) estimates of contour locations, rather than from the 'strongest features' in the neighborhood of the contour. Within the framework of the MLR model, alternatives to the EM algorithm can also be derived; one of these alternatives is the empirical-information method. Tracking experiments demonstrate the robustness of the pose estimates given by the MLR model, and support the theoretical expectation that the EM Contour algorithm is more robust than either feature-based methods or the empirical-information method. (c) 2005 Elsevier B.V. All rights reserved.
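A toy sketch of the EM Contour idea under strong simplifying assumptions (translation-only pose, a square contour, simulated candidate features along each normal): the pose update is driven by the posterior-mean offset along each normal rather than by the single strongest feature, which is the distinction the abstract draws.

```python
import numpy as np

rng = np.random.default_rng(2)
model = np.array([[0., 0.], [1., 0.], [1., 1.], [0., 1.]])    # contour points
normals = np.array([[0., -1.], [1., 0.], [0., 1.], [-1., 0.]])
true_shift = np.array([0.15, -0.1])      # unknown pose we want to recover

pose = np.zeros(2)                       # translation-only "pose"
sigma = 0.1                              # prior scale on normal offsets
for _ in range(20):
    residuals = []
    for p, n in zip(model, normals):
        # simulated candidate feature offsets measured along the normal
        cands = np.dot(true_shift - pose, n) + rng.normal(0, 0.02, 5)
        w = np.exp(-0.5 * (cands / sigma) ** 2)   # likelihood weights
        d = np.sum(w * cands) / np.sum(w)         # posterior-mean offset
        residuals.append(d * n)
    pose += np.mean(residuals, axis=0)            # least-squares style update
print(pose)        # should approach true_shift
```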

Relevance:

90.00%

Publisher:

Abstract:

This paper presents a software-based study of a hardware-based, non-sorting median calculation method on a set of integer numbers. The method divides the binary representation of each integer element in the set into bit slices in order to find the element located in the middle position. The method exhibits linear complexity, and our analysis shows that the best execution-time performance is obtained when 4-bit slices are used for 8-bit and 16-bit integers, for almost any data set size. The results suggest that a software implementation of the bit-slice method for median calculation outperforms sorting-based methods, with increasing improvement for larger data sets. For data set sizes of N > 5, our simulations show an improvement of at least 40%.
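A minimal software sketch of the bit-slice idea as a radix-select, assuming unsigned integers and the 4-bit slice width the analysis found fastest: starting from the most significant slice, narrow down the slice values until the element at the middle rank is pinned down, without ever sorting. This illustrates the technique, not the paper's implementation.

```python
def bitslice_median(values, bits=16, slice_bits=4):
    k = (len(values) - 1) // 2            # rank of the (lower) median
    mask = (1 << slice_bits) - 1          # one slice, e.g. 0xF for 4 bits
    prefix, shift = 0, bits
    while shift > 0:
        shift -= slice_bits
        # histogram the current slice among values matching the prefix so far
        hist = [0] * (1 << slice_bits)
        for v in values:
            if v >> (shift + slice_bits) == prefix:
                hist[(v >> shift) & mask] += 1
        for slice_val, cnt in enumerate(hist):  # walk slices to the rank
            if k < cnt:
                prefix = (prefix << slice_bits) | slice_val
                break
            k -= cnt
    return prefix

data = [7, 42, 3, 19, 25, 11, 30]
assert bitslice_median(data) == sorted(data)[len(data) // 2]
```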

Relevance:

90.00%

Publisher:

Abstract:

Classical regression methods take vectors as covariates and estimate the corresponding vectors of regression parameters. When addressing regression problems with covariates of more complex form, such as multi-dimensional arrays (i.e. tensors), traditional computational models can be severely compromised by ultrahigh dimensionality as well as complex structure. By exploiting the special structure of tensor covariates, the tensor regression model provides a promising way to reduce the model's dimensionality to a manageable level, thus leading to efficient estimation. Most existing tensor-based methods estimate each individual regression problem independently, based on a tensor decomposition that allows the simultaneous projection of an input tensor onto more than one direction along each mode. In practice, however, multi-dimensional data are collected under the same or very similar conditions, so that the data share some common latent components but can also have their own independent parameters for each regression task. It is therefore beneficial to analyse the regression parameters of all the regressions in a linked way. In this paper, we propose a tensor regression model based on Tucker decomposition, which simultaneously identifies both the common components of the parameters across all the regression tasks and the independent factors contributing to each particular regression task. Under this paradigm, the number of independent parameters along each mode is constrained by a sparsity-preserving regulariser. Linked multiway parameter analysis and sparsity modeling further reduce the total number of parameters, with lower memory cost than tensor-based counterparts. The effectiveness of the new method is demonstrated on real data sets.
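As a much-reduced illustration of the low-rank idea behind tensor regression, here is a sketch that fits a rank-1 coefficient matrix by alternating least squares over matrix-valued covariates. The paper's Tucker-based, linked multi-task, sparsity-regularised model is substantially richer; sizes and data below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)
n, p, q = 200, 8, 6                     # samples; two covariate modes
X = rng.normal(size=(n, p, q))          # matrix-valued covariates
B_true = np.outer(rng.normal(size=p), rng.normal(size=q))
y = np.einsum('npq,pq->n', X, B_true) + 0.01 * rng.normal(size=n)

u, v = rng.normal(size=p), rng.normal(size=q)   # B approximated as u v^T
for _ in range(50):
    # fix v, solve least squares for u: features are X contracted with v
    Xu = np.einsum('npq,q->np', X, v)
    u = np.linalg.lstsq(Xu, y, rcond=None)[0]
    # fix u, solve least squares for v
    Xv = np.einsum('npq,p->nq', X, u)
    v = np.linalg.lstsq(Xv, y, rcond=None)[0]

B_hat = np.outer(u, v)
print(np.linalg.norm(B_hat - B_true) / np.linalg.norm(B_true))  # small
```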

Relevance:

90.00%

Publisher:

Abstract:

The correlation between the microdilution (MD), Etest® (ET), and disk diffusion (DD) methods was determined for amphotericin B, itraconazole and fluconazole. The minimal inhibitory concentration (MIC) of these antifungal agents was established for a total of 70 Candida spp. isolates from colonization and infection. The species distribution was: Candida albicans (n = 27), C. tropicalis (n = 17), C. glabrata (n = 16), C. parapsilosis (n = 8), and C. lusitaniae (n = 2). Non-Candida albicans Candida species showed higher MICs for the three antifungal agents than the C. albicans isolates. The overall concordance (based on the MIC values agreeing within two dilutions) between the ET and MD methods was 83% for amphotericin B, 63% for itraconazole, and 64% for fluconazole. Considering the breakpoint, the agreement between the DD and MD methods was 71% for itraconazole and 67% for fluconazole. The DD zone diameters are highly reproducible and correlate well with the MD method, making agar-based methods a viable alternative to MD for susceptibility testing. However, data on agar-based tests for itraconazole and amphotericin B are still scarce; further research must therefore be carried out to extend the standardization to other antifungal agents. J. Clin. Lab. Anal. 23:324-330, 2009. (C) 2009 Wiley-Liss, Inc.
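As an illustration of how concordance figures like those above are typically computed, here is a minimal sketch of "essential agreement" (MICs agreeing within two two-fold dilutions); the paired MIC values are invented for demonstration, not data from the study.

```python
import math

def essential_agreement(mic_ref, mic_test, dilutions=2):
    """True if the MICs differ by at most `dilutions` two-fold steps."""
    return abs(math.log2(mic_test) - math.log2(mic_ref)) <= dilutions

pairs = [(0.25, 0.5), (1.0, 8.0), (2.0, 2.0), (0.5, 0.125)]  # (MD, ET) in ug/mL
agree = sum(essential_agreement(r, t) for r, t in pairs)
print(f"concordance: {100 * agree / len(pairs):.0f}%")       # 75% here
```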