922 results for automated


Relevance: 10.00%

Abstract:

The general strategy for anti-doping analysis of urine samples starts with screening for a wide range of compounds. This step should be fast, generic and able to detect any sample that may contain a prohibited substance, avoiding false negatives while reducing false positive results. The experiments presented in this work were based on ultra-high-pressure liquid chromatography coupled to hybrid quadrupole time-of-flight mass spectrometry. Thanks to the high sensitivity of the method, urine samples could be diluted 2-fold prior to injection. One hundred and three forbidden substances from various classes (such as stimulants, diuretics, narcotics and anti-estrogens) were analysed on a C18 reversed-phase column in two 9 min gradients (including two 3 min equilibration periods), one for positive and one for negative electrospray ionisation, and detected in MS full scan mode. Automatic identification of analytes was based on retention time and mass accuracy, with an automated tool for peak picking. The method was validated according to the International Standard for Laboratories described in the World Anti-Doping Code and was selective enough to comply with World Anti-Doping Agency recommendations. In addition, the matrix effect on MS response was measured for all investigated analytes spiked into urine samples. The limits of detection ranged from 1 to 500 ng/mL, allowing the identification of all tested compounds in urine. When a sample was reported positive during screening, a fast additional pre-confirmatory step was performed to reduce the number of confirmatory analyses.
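
To make the identification step concrete, the sketch below matches a detected peak against a small compound library by retention time and mass accuracy, in the spirit of the automated tool described above; the tolerance windows, compound entries and m/z values are illustrative assumptions, not the validated parameters of the method.

```python
# Minimal sketch of library matching by retention time and mass accuracy.
# Tolerances, library entries and m/z values are illustrative assumptions.

LIBRARY = [
    # (name, expected retention time in min, expected m/z of the ion)
    ("furosemide", 4.82, 329.0110),
    ("strychnine", 3.95, 335.1754),
]

RT_TOL_MIN = 0.2      # retention-time window, minutes (assumed)
MASS_TOL_PPM = 10.0   # mass-accuracy window, ppm (assumed)

def match_peak(rt, mz):
    """Return library compounds compatible with a detected peak."""
    hits = []
    for name, lib_rt, lib_mz in LIBRARY:
        ppm_error = abs(mz - lib_mz) / lib_mz * 1e6
        if abs(rt - lib_rt) <= RT_TOL_MIN and ppm_error <= MASS_TOL_PPM:
            hits.append((name, ppm_error))
    return hits

print(match_peak(4.85, 329.0125))  # -> [('furosemide', ~4.6 ppm)]
```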

Relevance: 10.00%

Abstract:

Purpose of the study: Basic life support (BLS) and automated external defibrillation (AED) are important skills to be acquired during pregraduate medical training. Three years ago, our medical school introduced a BLS-AED course (with certification) for all second-year medical students. Few reports on the quality and persistence over time of BLS-AED learning are available to date in the medical literature. A comprehensive evaluation of students' acquired skills was performed at the end of the 2008 academic year, 6 months after certification. Materials and methods: The students (N = 142) were evaluated during a 9-minute "objective structured clinical examination" (OSCE) station. Starting from a standardized scenario, they had to recognize a cardiac arrest situation and start a resuscitation process. Their performance was recorded on a PC using an Ambuman(TM) mannequin and the Ambu CPR software kit(TM) during a minimum of 8 cycles (30 compressions : 2 ventilations each). BLS parameters were systematically checked. No student-rater interactions were allowed during the whole evaluation. Results: Response of the victim was checked by 99% of the students (N = 140), and 96% (N = 136) called for an ambulance and/or an AED. Opening the airway and checking breathing were done by 96% (N = 137), and 92% (N = 132) gave 2 rescue breaths. The pulse was checked by 95% (N = 135), 100% (N = 142) began chest compressions, and 96% (N = 136) did so within 1 minute. The chest compression rate was 101 ± 18 per minute (mean ± SD), compression depth 43 ± 8 mm, and 97% (N = 138) respected a compression-ventilation ratio of 30:2. Conclusions: The quality of BLS skills acquisition is maintained over a 6-month period after BLS-AED certification. The main targets of the 2005 AHA guidelines were well respected. This analysis represents one of the largest evaluations of specific BLS teaching efficiency reported. Further follow-up is needed to assess the persistence of these skills over a longer period, notably at the end of the pregraduate medical curriculum.

Relevance: 10.00%

Abstract:

Background: Conventional magnetic resonance imaging (MRI) techniques are highly sensitive for detecting multiple sclerosis (MS) plaques, enabling a quantitative assessment of inflammatory activity and lesion load. In quantitative analyses of focal lesions, manual or semi-automated segmentations have been widely used to compute the total number of lesions and the total lesion volume. These techniques, however, are both challenging and time-consuming, and are also prone to intra-observer and inter-observer variability. Aim: To develop an automated approach to segment brain tissues and MS lesions from brain MRI images. The goal is to reduce user interaction and to provide an objective tool that eliminates inter- and intra-observer variability. Methods: Based on the recent methods developed by Souplet et al. and de Boer et al., we propose a novel pipeline which includes the following steps: bias correction, skull stripping, atlas registration, tissue classification, and lesion segmentation. After the initial pre-processing steps, an MRI scan is automatically segmented into 4 classes: white matter (WM), grey matter (GM), cerebrospinal fluid (CSF) and partial volume. An expectation maximisation method which fits a multivariate Gaussian mixture model to T1-w, T2-w and PD-w images is used for this purpose. Based on the obtained tissue masks, and using the estimated GM mean and variance, we apply an intensity threshold to the FLAIR image, which provides the lesion segmentation. With the aim of improving this initial result, spatial information coming from the neighbouring tissue labels is used to refine the final lesion segmentation. Results: The experimental evaluation was performed using real 1.5 T data sets and the corresponding ground truth annotations provided by expert radiologists. The following values were obtained: a true positive (TP) fraction of 64%, a false positive (FP) fraction of 80%, and an average surface distance of 7.89 mm. The results of our approach were quantitatively compared to our implementations of the works of Souplet et al. and de Boer et al., obtaining higher TP and lower FP values. Conclusion: Promising MS lesion segmentation results have been obtained in terms of TP. However, the high number of FP, which is still a well-known problem of all automated MS lesion segmentation approaches, has to be reduced before these methods can be used in standard clinical practice. Our future work will focus on tackling this issue.
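
As a rough, simplified illustration of the tissue-classification and thresholding steps described above (three tissue components instead of the four classes used in the pipeline, and not the authors' implementation), one can fit a Gaussian mixture to the multichannel intensities and then threshold the FLAIR image a few standard deviations above the estimated GM mean; the threshold factor k and the GM-selection heuristic below are assumptions.

```python
# Illustrative sketch of EM-based tissue classification followed by a
# FLAIR intensity threshold derived from grey-matter statistics.
# Not the authors' pipeline; simplified to three tissue components, and
# the threshold rule and GM heuristic are assumed.
import numpy as np
from sklearn.mixture import GaussianMixture

def segment_lesions(t1, t2, pd, flair, brain_mask, k=3.0):
    """t1, t2, pd, flair: 3-D intensity arrays; brain_mask: boolean array."""
    feats = np.stack([t1[brain_mask], t2[brain_mask], pd[brain_mask]], axis=1)
    gmm = GaussianMixture(n_components=3, covariance_type="full",
                          random_state=0).fit(feats)   # WM / GM / CSF
    labels = gmm.predict(feats)
    # Assume GM is the component with the middle T1 mean (a common heuristic).
    gm = np.argsort(gmm.means_[:, 0])[1]
    gm_flair = flair[brain_mask][labels == gm]
    thr = gm_flair.mean() + k * gm_flair.std()         # assumed rule
    lesions = np.zeros_like(flair, dtype=bool)
    lesions[brain_mask] = flair[brain_mask] > thr
    return lesions
```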

Relevance: 10.00%

Abstract:

Objective: Small nodal tumor infiltrates are identified by applying multilevel sectioning and immunohistochemistry (IHC) in addition to H&E (hematoxylin and eosin) staining of resected lymph nodes. However, the use of multilevel sectioning and IHC is very time-consuming and costly. The current standard analysis of lymph nodes in colon cancer patients is based on one H&E-stained slide per lymph node. A new molecular diagnostic system called "One Step Nucleic Acid Amplification" (OSNA) was designed for more accurate detection of lymph node metastases. The objective of the present investigation was to compare the performance of OSNA to current standard histology (H&E). We hypothesize that OSNA provides better staging than the routine use of one H&E slide per lymph node. Methods: From 22 colon cancer patients, 307 frozen lymph nodes were used to compare OSNA with H&E. The lymph nodes were cut into halves. One half of each lymph node was analyzed by OSNA. The semi-automated OSNA assay amplifies reverse-transcribed cytokeratin 19 (CK19) mRNA directly from the homogenate. The remaining tissue was dedicated to histology, with 5 levels of H&E and IHC (CK19) staining. Results: On routine evaluation of one H&E slide, 7 patients were nodal positive (macro-metastases). All of these patients were recognized as positive by OSNA analysis (sensitivity 100%). Of the remaining 15 patients, 2 had lymph node micro-metastases and 9 had isolated tumor cells. For the patients with micro-metastases, both H&E and OSNA were positive in 1 of the 2 patients. For patients with isolated tumor cells, H&E was positive in 1/9 cases whereas OSNA was positive in 3/9 patients (IHC as a reference). There was only one case to be described as IHC negative/OSNA positive. On the basis of single lymph nodes, the sensitivity of OSNA against the 5 levels of H&E and IHC was 94.5%. Conclusion: OSNA is a novel molecular tool for the detection of lymph node metastases in colon cancer patients which provides better staging compared to the current standard evaluation of one H&E-stained slide. Since OSNA allows analysis of the whole lymph node, sampling bias and tumor deposits left undetected in uninvestigated material will be overcome. OSNA improves staging in colon cancer patients and may replace the current standard of H&E staining in the future.

Relevance: 10.00%

Abstract:

BACKGROUND: The activity of the renin-angiotensin system is usually evaluated as plasma renin activity (PRA, ngAI/ml per h), but the reproducibility of this enzymatic assay is notoriously poor. We compared the inter- and intralaboratory reproducibility of PRA with that of a new automated chemiluminescent assay, which allows the direct quantification of immunoreactive renin [chemiluminescent immunoreactive renin (CLIR), microU/ml]. METHODS: Aliquots from six pool plasmas of patients with very low to very high PRA levels were measured in 12 centres with both the enzymatic and the direct assays. The same methods were applied to three control plasma preparations with known renin content. RESULTS: In pool plasmas, mean PRA values ranged from 0.14 +/- 0.08 to 18.9 +/- 4.1 ngAI/ml per h, whereas those of CLIR ranged from 4.2 +/- 1.7 to 436 +/- 47 microU/ml. In control plasmas, mean values of PRA and of CLIR were always within the expected range. Overall, there was a significant correlation between the two methods (r = 0.73, P < 0.01). Similar correlations were found when plasmas were subdivided into those with low, intermediate and high PRA. However, the coefficients of variation among laboratories were always higher for PRA than for CLIR, ranging from 59.4 to 17.1% for PRA and from 41.0 to 10.7% for CLIR (P < 0.01). The mean intralaboratory variability was also higher for PRA than for CLIR (8.5 versus 4.5%, P < 0.01). CONCLUSION: The measurement of renin with the chemiluminescent method is a reliable alternative to PRA, with the advantage of superior inter- and intralaboratory reproducibility.
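
The reproducibility statistics quoted above are plain coefficients of variation; a minimal sketch, with invented measurement values, is given below.

```python
# Sketch of the reproducibility statistics used above: between-laboratory
# (inter) and within-laboratory (intra) coefficients of variation (CV).
# All measurement values are invented for illustration.
import numpy as np

# One pool plasma measured once in each of 12 laboratories (hypothetical).
between_labs = np.array([9.1, 10.4, 8.7, 11.2, 10.0, 9.6,
                         12.1, 8.9, 10.7, 9.3, 11.5, 10.2])
inter_cv = between_labs.std(ddof=1) / between_labs.mean() * 100

# The same sample measured 10 times within one laboratory (hypothetical).
within_lab = np.array([10.1, 9.8, 10.3, 10.0, 9.9,
                       10.2, 10.4, 9.7, 10.0, 10.1])
intra_cv = within_lab.std(ddof=1) / within_lab.mean() * 100

print(f"inter-laboratory CV = {inter_cv:.1f}%, "
      f"intra-laboratory CV = {intra_cv:.1f}%")
```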

Relevance: 10.00%

Abstract:

This project is aimed at teachers who manage information about research activities across different technological tools, which require an Internet connection and several different websites. To centralise these data, TeachPro was created: an automated software application that manages and stores information about these management activities so that the user can access it quickly and in an organised way, all without an Internet connection. The data are managed through an interface from which the user can consult, add, modify or delete any information. All data are stored in a local database.
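
As an illustration of the kind of offline CRUD storage the project describes, here is a minimal sketch using an embedded database; the table and column names are invented, and nothing implies TeachPro is actually built this way.

```python
# Minimal sketch of an offline CRUD store of the kind TeachPro describes.
# Table and column names are invented; the actual project may differ.
import sqlite3

con = sqlite3.connect("teachpro.db")  # local file, no Internet needed
con.execute("""CREATE TABLE IF NOT EXISTS activity (
                 id INTEGER PRIMARY KEY,
                 title TEXT NOT NULL,
                 year INTEGER)""")

# Create
con.execute("INSERT INTO activity (title, year) VALUES (?, ?)",
            ("Conference talk", 2011))
# Read
rows = con.execute("SELECT id, title, year FROM activity").fetchall()
# Update
con.execute("UPDATE activity SET year = ? WHERE title = ?",
            (2012, "Conference talk"))
# Delete
con.execute("DELETE FROM activity WHERE id = ?", (rows[0][0],))
con.commit()
con.close()
```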

Relevance: 10.00%

Abstract:

This project consists of the development of a software application that allows the data required for a university lecturer's teaching activity to be managed in an automated and consistent way. The application manages teaching plans, courses, the teaching timetable, the examination calendar and final-year projects; for each of these, data can be added, searched, modified and deleted. It also offers other options, such as the teaching calendar and a webs option, whose purpose is to consult, directly, web pages of teaching interest. Finally, the teaching material option is used to create, modify and delete files of different formats (Word, Excel, PowerPoint, PDF) associated with the courses registered in the application. The application was implemented on the Windows operating system in the Java programming language. The data are stored in a MySQL database, managed with MySQL Workbench. JavaScript and jQuery were used to validate data entry. The interface was designed with Java Server Pages, HTML, CSS and the Struts framework.

Relevance: 10.00%

Abstract:

This project is a study that aims to build an application for sports centres, implementing access to the facilities through BIOMAX using proximity cards. The application will also allow the centre's members and facilities to be managed efficiently. In this way, an essential task, access control, is automated, improving both time and quality, in a simple way and with easy maintenance for the users who operate it. A KIMALDI device will be used to manage the access control.

Relevance: 10.00%

Abstract:

Normal and abnormal brains can be segmented by registering the target image with an atlas. Here, an atlas is defined as the combination of an intensity image (template) and its segmented image (the atlas labels). After registering the atlas template and the target image, the atlas labels are propagated to the target image. We define this process as atlas-based segmentation. In recent years, researchers have investigated registration algorithms to match atlases to query subjects and also strategies for atlas construction. In this paper we present a review of the automated approaches for atlas-based segmentation of magnetic resonance brain images. We aim to point out the strengths and weaknesses of atlas-based methods and suggest new research directions. We use two different criteria to present the methods. First, we refer to the algorithms according to their atlas-based strategy: label propagation, multi-atlas methods, and probabilistic techniques. Subsequently, we classify the methods according to their medical target: the brain and its internal structures, tissue segmentation in healthy subjects, tissue segmentation in fetuses, neonates and elderly subjects, and segmentation of damaged brains. A quantitative comparison of the results reported in the literature is also presented.
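
To make the label-propagation idea concrete, the sketch below shows the simplest multi-atlas fusion rule, per-voxel majority voting over atlas label maps that are assumed to be already registered to the target; it is a didactic illustration rather than a method from any specific reviewed paper.

```python
# Didactic sketch of multi-atlas segmentation by majority voting.
# Assumes each atlas's labels have already been registered (warped)
# into the target image space; the registration step is not shown.
import numpy as np

def majority_vote(propagated_labels):
    """propagated_labels: list of integer label volumes, same shape."""
    stack = np.stack(propagated_labels, axis=0)       # (n_atlases, ...)
    n_labels = stack.max() + 1
    # Count votes per label at every voxel, then take the winner.
    votes = np.zeros((n_labels,) + stack.shape[1:], dtype=np.int32)
    for lab in range(n_labels):
        votes[lab] = (stack == lab).sum(axis=0)
    return votes.argmax(axis=0)

atlases = [np.random.default_rng(i).integers(0, 4, (5, 5, 5))
           for i in range(7)]
print(majority_vote(atlases).shape)  # (5, 5, 5)
```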

Relevance: 10.00%

Abstract:

Red light running continues to be a serious safety concern for many communities in the United States. The Federal Highway Administration reported that in 2011, red light running accounted for 676 fatalities nationwide. Red light running crashes at signalized intersections are more serious, especially in high-speed corridors where speeds are above 35 mph. Many communities have invested in red light running countermeasures ranging from low-cost strategies (e.g. signal backplates, targeted enforcement, signal timing adjustments and improved signage) to high-cost strategies (e.g. automated enforcement and intersection geometric improvements). This research study investigated intersection confirmation lights as a low-cost strategy to reduce red light running violations. Two intersections in Altoona and Waterloo, Iowa were equipped with confirmation lights targeting the through and left-turning movements. Confirmation lights enable a single police officer to monitor a specific lane of traffic downstream of the intersection. A before-after analysis was conducted in which the change in red light running violations from before installation to 1 and 3 months after installation was evaluated. A test of proportions was used to determine whether the change in red light running violation rates was statistically significant at the 90 and 95 percent levels of confidence. The two treatment intersections were then compared to the changes in red light running violation rates at spillover intersections (directly adjacent to the treatment intersections) and control intersections. The results of the analysis indicated a 10 percent reduction in red light running violations in Altoona and a 299 percent increase in Waterloo at the treatment locations. Finally, the research team investigated the time into red for each observed red light running violation. The analysis indicated that many of the violations occurred less than one second into the red phase and that most of the violations occurred during or shortly after the all-red phase.
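
The test of proportions mentioned above is the standard two-proportion z-test; the sketch below applies it to invented before/after counts, since the study's raw counts are not given here.

```python
# Two-proportion z-test for a change in red-light-running violation rates,
# as used in the before-after analysis. Counts are invented for illustration.
from math import sqrt
from scipy.stats import norm

def two_proportion_ztest(x1, n1, x2, n2):
    """x = violations observed, n = exposure (e.g., signal cycles observed)."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                   # pooled proportion
    z = (p1 - p2) / sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return z, 2 * norm.sf(abs(z))               # two-sided p-value

z, p = two_proportion_ztest(x1=120, n1=4000, x2=105, n2=4100)
print(f"z = {z:.2f}, p = {p:.3f}")  # significant at 90%/95% if p < 0.10/0.05
```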

Relevance: 10.00%

Abstract:

The major objective of this research project was to use thermal analysis techniques in conjunction with x-ray analysis methods to identify and explain chemical reactions that promote aggregate related deterioration in portland cement concrete. Twenty-two different carbonate aggregate samples were subjected to a chemical testing scheme that included:

• bulk chemistry (major, minor and selected trace elements)
• bulk mineralogy (minor phases concentrated by acid extraction)
• solid-solution in the major carbonate phases
• crystallite size determinations for the major carbonate phases
• a salt treatment study to evaluate the impact of deicer salts

Test results from these different studies were then compared to information that had been obtained using thermogravimetric analysis techniques. Since many of the limestones and dolomites used in the study had extensive field service records, it was possible to correlate many of the variables with service life. The results of this study indicate that thermogravimetric analysis can play an important role in categorizing carbonate aggregates; in fact, with modern automated thermal analysis systems it should be possible to utilize such methods on a quality control basis. Strong correlations were found between several of the variables monitored in this study, and several of the variables exhibited significant correlations to concrete service life. When the full data set was utilized (n = 18), the significant correlations to service life (α = 5% level) can be summarized as follows:

• Correlation coefficient, r, = -0.73 for premature TG loss versus service life.
• Correlation coefficient, r, = 0.74 for relative crystallite size versus service life.
• Correlation coefficient, r, = 0.53 for ASTM C666 durability factor versus service life.
• Correlation coefficient, r, = -0.52 for acid-insoluble residue versus service life.

Separation of the carbonate aggregates into their mineralogical categories (i.e., calcites and dolomites) tended to increase the correlation coefficients for some specific variables (r sometimes approached 0.90); however, the reliability of such correlations was questionable because of the small number of samples in this study.
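
The reported correlations are ordinary Pearson coefficients tested for significance at n = 18; the sketch below shows that computation on invented data.

```python
# Sketch of the correlation analysis reported above: Pearson r between an
# aggregate property and concrete service life, with its p-value, at n = 18.
# The data points are invented for illustration.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
service_life = rng.uniform(10, 60, size=18)     # years (hypothetical)
premature_tg_loss = 5.0 - 0.06 * service_life + rng.normal(0, 0.5, 18)

r, p = pearsonr(premature_tg_loss, service_life)
print(f"r = {r:.2f}, p = {p:.4f}")  # significant at the 5% level if p < 0.05
```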

Relevance: 10.00%

Abstract:

In the main report concerning the role that magnesium may have in highway concrete aggregate, over 20,000 electron microprobe data were obtained, primarily from automated scans, or traverses, across dolomite aggregate grains and the adjacent cement paste. Representative traverses were shown in figures and averages of the data were presented in Table II. In this Appendix, detailed representative and selected analyses of carbonate aggregate only are presented. These analyses were not presented in the main report because they would be interesting to only a few specialists in dolomite rocks. In this Appendix, individual point analyses of mineral compositions in the paste have been omitted, along with dolomite compositions at grain boundaries and cracks. Clay minerals and quartz inclusions in the aggregate are also not included. In the analyses, the first three column headings from left to right show line number, x-axis, and y-axis (line number is an artifact of the computer print-out for each new traverse; consecutive line numbers indicate a continuous traverse with distances between each point of 1.5 to a few μm; x-axis and y-axis are coordinates on the electron microscope stage). The next columns present the weight percent oxide content of FeO, K2O, CaO, SiO2, Al2O3, MgO, SrO, BaO, MnO, Na2O, and CO2 (calculated by assuming that the number of moles of CO2 is equal to the sum of the moles of the oxides, chiefly CaO and MgO), TOTAL (the sum of all oxides), and total (the sum of all oxides excluding CO2). In many of the analyses, total is omitted.
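
The calculated CO2 column described above follows from simple stoichiometry (moles of CO2 set equal to the summed moles of the oxides, chiefly CaO and MgO); the sketch below reproduces that arithmetic and checks it against stoichiometric dolomite.

```python
# Sketch of the CO2 calculation described above: moles of CO2 are taken
# equal to the sum of moles of the oxides (chiefly CaO and MgO), then
# converted back to weight percent. Molar masses in g/mol.
M = {"CaO": 56.08, "MgO": 40.30, "FeO": 71.84, "MnO": 70.94, "CO2": 44.01}

def calculated_co2(wt_oxides):
    """wt_oxides: dict of oxide weight percents, e.g. {'CaO': 30.4, ...}."""
    moles = sum(wt / M[ox] for ox, wt in wt_oxides.items())
    return moles * M["CO2"]

# A stoichiometric dolomite, CaMg(CO3)2, as a rough check:
print(f"{calculated_co2({'CaO': 30.41, 'MgO': 21.86}):.1f} wt% CO2")  # ~47.7
```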

Relevance: 10.00%

Abstract:

The report compares and contrasts the automated PASCO method of pavement evaluation with the manual procedures used by the Iowa Department of Transportation (DOT) to evaluate pavement condition. The Iowa DOT's use of IJK and BPR roadmeters and manual crack and patch surveys is compared to PASCO's use of 35-mm photography, artificial lighting and hairline projection, tracking wheels and lasers to measure ride, cracking and patching, rut depths, and roughness. The Iowa DOT method provides a Present Serviceability Index (PSI) value and PASCO provides a Maintenance Control Index (MCI). Seven sections of Interstate highway, county roads and city streets, and one shoulder section were tested with different speeds of data collection, surface types and textures, and stop and start conditions. High correlations between the two methods were recorded in the measurement of roughness (0.93 for the tracking wheel and 0.84 for the laser method). Lower correlations for rut depth (0.61) and cracking (0.32) are attributed to PASCO's more comprehensive measurement techniques. A cost analysis of the data provided by both systems indicates that PASCO is capable of providing a comparable result with improved accuracy at a cost of $125-$150 or less per two-lane mile, depending on survey mileage. Improved data collection speed, accuracy, and reliability, together with a visible record of pavement condition, are thus available for comparable costs. The PASCO system's ability to provide the data required by the Highway Pavement Distress Identification Manual, the Pavement Condition Rating Guide, and the Strategic Highway Research Program Long Term Pavement Performance (LTPP) Studies is also outlined in the report.

Relevance: 10.00%

Abstract:

A discussion is presented of daytime sky imaging and of techniques that may be applied to the analysis of full-color sky images to infer cloud macrophysical properties. Two different types of sky-imaging systems developed by the authors are described, one of which has been developed into a commercially available instrument. Retrievals of fractional sky cover from automated processing methods are compared to human retrievals, both from direct observations and from visual analyses of sky images. Although some uncertainty exists in fractional sky cover retrievals from sky images, for the commercially available sky imager this uncertainty is no greater than that attached to human observations. Thus, the application of automatic digital image processing techniques to sky images is a useful method to complement, or even replace, traditional human observations of sky cover and, potentially, cloud type. Additionally, the possibility of inferring other cloud parameters, such as cloud brokenness and solar obstruction, further enhances the usefulness of sky imagers.
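
A common way to automate fractional sky cover retrieval from full-color images, and a plausible stand-in for the processing methods mentioned above, is a per-pixel red-to-blue ratio test (clear sky is strongly blue, cloud is more neutral); the threshold value below is an assumption, not a number from the paper.

```python
# Sketch of fractional-sky-cover retrieval from an RGB sky image using a
# red/blue ratio threshold. The 0.6 threshold is an assumed, typical value,
# not one taken from the paper.
import numpy as np

def fractional_sky_cover(rgb, sky_mask, ratio_threshold=0.6):
    """rgb: (H, W, 3) float array; sky_mask: boolean mask of sky pixels
    (excluding horizon obstructions and any solar-occulting shadow band)."""
    r = rgb[..., 0].astype(float)
    b = rgb[..., 2].astype(float) + 1e-6       # avoid division by zero
    cloudy = (r / b > ratio_threshold) & sky_mask
    return cloudy.sum() / sky_mask.sum()

img = np.random.default_rng(0).random((480, 640, 3))
mask = np.ones((480, 640), dtype=bool)
print(f"sky cover = {fractional_sky_cover(img, mask):.2f}")
```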

Relevance: 10.00%

Abstract:

The research reported in this series of articles aimed at (1) automating the search for questioned ink specimens in ink reference collections and (2) evaluating the strength of ink evidence in a transparent and balanced manner. These aims require that ink samples be analysed in an accurate and reproducible way and compared in an objective and automated way, the latter requirement being due to the large number of comparisons necessary in both scenarios. A research programme was designed to (a) develop a standard methodology for analysing ink samples in a reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in forensic contexts. This report focuses on the last of the three stages of the research programme. The calibration and acquisition process and the mathematical comparison algorithms were described in previous papers [C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part I: Development of a quality assurance process for forensic ink analysis by HPTLC, Forensic Sci. Int. 185 (2009) 29-37; C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part II: Development and testing of mathematical algorithms for the automatic comparison of ink samples analysed by HPTLC, Forensic Sci. Int. 185 (2009) 38-50]. In this paper, the benefits and challenges of the proposed concepts are tested in two forensic contexts: (1) ink identification and (2) ink evidential value assessment. The results show that different algorithms are better suited to different tasks. This research shows that it is possible to build digital ink libraries using the most commonly used ink analytical technique, i.e. high-performance thin layer chromatography, despite its reputation for lacking reproducibility. More importantly, it is possible to assign evidential value to ink evidence in a transparent way using a probabilistic model, and therefore to move away from the traditional subjective approach, which is entirely based on experts' opinion and is usually not very informative. While there is room for improvement, this report demonstrates the significant gains over the traditional subjective approach for the search of ink specimens in ink databases and the interpretation of their evidential value.
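
As an illustration of an objective, automated comparison of the kind developed in Part II of the series, the sketch below scores the similarity of two HPTLC intensity profiles with a Pearson correlation; this is a generic choice for illustration, not necessarily one of the authors' algorithms.

```python
# Sketch of an objective comparison between two ink samples represented as
# HPTLC densitometric intensity profiles. Pearson correlation is used as a
# generic similarity score; the authors' actual algorithms may differ.
import numpy as np

def profile_similarity(profile_a, profile_b):
    """Profiles: 1-D intensity arrays sampled at the same Rf positions."""
    a = (profile_a - profile_a.mean()) / profile_a.std()
    b = (profile_b - profile_b.mean()) / profile_b.std()
    return float(np.mean(a * b))               # Pearson r in [-1, 1]

rf = np.linspace(0, 1, 200)
ink1 = (np.exp(-((rf - 0.3) / 0.02) ** 2)
        + 0.8 * np.exp(-((rf - 0.7) / 0.03) ** 2))
ink2 = ink1 + np.random.default_rng(0).normal(0, 0.02, rf.size)  # same ink
print(f"similarity = {profile_similarity(ink1, ink2):.3f}")      # close to 1
```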