958 results for automated lexical analysis
Abstract:
Purpose: To evaluate the suitability of an improved version of an automatic segmentation method based on geodesic active regions (GAR) for segmenting cerebral vasculature with aneurysms from 3D X-ray reconstruction angiography (3DRA) and time-of-flight magnetic resonance angiography (TOF-MRA) images available in the clinical routine. Methods: Three aspects of the GAR method have been improved: execution time, robustness to variability in imaging protocols and robustness to variability in image spatial resolutions. The improved GAR was retrospectively evaluated on images from patients containing intracranial aneurysms in the area of the Circle of Willis and imaged with two modalities: 3DRA and TOF-MRA. Images were obtained from two clinical centers, each using different imaging equipment. Evaluation included qualitative and quantitative analyses of the segmentation results on 20 images from 10 patients. The gold standard was built from 660 cross-sections (33 per image) of vessels and aneurysms, manually measured by interventional neuroradiologists. GAR has also been compared to an interactive segmentation method: iso-intensity surface extraction (ISE). In addition, since patients had been imaged with the two modalities, we performed an inter-modality agreement analysis with respect to both the manual measurements and each of the two segmentation methods. Results: Both GAR and ISE differed from the gold standard within acceptable limits compared to the imaging resolution. GAR (ISE, respectively) had an average accuracy of 0.20 (0.24) mm for 3DRA and 0.27 (0.30) mm for TOF-MRA, and had a repeatability of 0.05 (0.20) mm. Compared to ISE, GAR had a lower qualitative error in the vessel region and a lower quantitative error in the aneurysm region. The repeatability of GAR was superior to manual measurements and ISE. The inter-modality agreement was similar between GAR and the manual measurements.
Conclusions: The improved GAR method outperformed ISE qualitatively as well as quantitatively and is suitable for segmenting 3DRA and TOF-MRA images from clinical routine.
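The accuracy and repeatability figures above boil down to mean absolute differences over paired cross-section measurements. A minimal sketch (not the authors' code; the diameters below are hypothetical) of how such per-image summary statistics can be computed:

```python
# Sketch only: accuracy as the mean absolute difference (mm) between automated
# and manually measured cross-section diameters; repeatability as the same
# statistic over two repeated segmentation runs.

def accuracy_mm(auto_diams, gold_diams):
    """Mean absolute deviation (mm) from the manual gold standard."""
    assert len(auto_diams) == len(gold_diams)
    return sum(abs(a - g) for a, g in zip(auto_diams, gold_diams)) / len(auto_diams)

def repeatability_mm(run1, run2):
    """Mean absolute difference (mm) between two repeated segmentations."""
    return accuracy_mm(run1, run2)

# Hypothetical diameters (mm) for three cross-sections:
gold = [2.1, 3.4, 2.8]
gar  = [2.3, 3.2, 3.0]
print(round(accuracy_mm(gar, gold), 2))  # 0.2
```

With 33 cross-sections per image, averaging these per-section deviations yields the per-modality accuracy values reported in the abstract.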
Abstract:
Objective: An inverse relationship between blood pressure (BP) and cognitive function has been found in adults, but limited data are available in adolescents and young adults. We prospectively examined the relation between blood pressure and cognitive function in adolescence. Methods: We examined the association between BP measured at the ages of 12-15 years in school surveys and cognitive endpoints measured in the Seychelles Child Development Study at ages 17 (n=407) and 19 (n=429) years, respectively. We evaluated multiple domains of cognition based on subtests of the Cambridge Neuropsychological Test Automated Battery (CANTAB), the Woodcock-Johnson Test of Scholastic Achievement (WJTA), the Finger Tapping test (FT) and the Kaufman Brief Intelligence Test (K-BIT). We used age-, sex- and height-specific z-scores of systolic blood pressure (SBP), diastolic blood pressure (DBP) and mean arterial pressure (MAP). Results: Six of the 21 cognitive endpoints tested were associated with BP. However, none of these associations held for both males and females, for different subtests within the same neurodevelopmental domain, or for both SBP and DBP. Most of these associations disappeared when analyses were adjusted for selected potential confounding factors such as socio-economic status, birth weight, gestational age, body mass index, alcohol consumption, blood glucose, and total n-3 and n-6 polyunsaturated fats. Conclusions: Our findings do not support a consistent association between BP and subsequent performance on tests assessing various cognitive domains in adolescents.
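A BP z-score standardizes a raw reading against the reference mean and standard deviation for the child's stratum (here age and sex; the study also stratified by height). A hedged illustration with made-up reference values, not the study's norms:

```python
# Illustrative sketch only: the reference table is hypothetical, not the
# normative data used in the Seychelles study.

REFERENCE = {  # (age, sex) -> (mean SBP in mmHg, SD) -- made-up values
    (13, "M"): (110.0, 10.0),
    (13, "F"): (108.0, 9.5),
}

def sbp_z_score(sbp, age, sex):
    """Standardize an SBP reading against its age/sex reference stratum."""
    mean, sd = REFERENCE[(age, sex)]
    return (sbp - mean) / sd

print(sbp_z_score(120.0, 13, "M"))  # 1.0
```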
Abstract:
Purpose: To examine the relationship of functional measurements with structural measures. Methods: 146 eyes of 83 test subjects underwent Heidelberg Retinal Tomography (HRT III) (disc area < 2.43, mphsd < 40), and perimetry testing with Octopus (SAP; Dynamic), Pulsar (PP; TOP) and Moorfields MDT (ESTA). Glaucoma was defined as progressive structural or functional loss (20 eyes). Perimetry test points were grouped into 6 sectors based on the estimated optic nerve head angle into which the associated nerve fiber bundle enters (Garway-Heath map). Perimetry summary measures (PSM) (MD SAP / MD PP / PTD MDT) were calculated from the average total deviation of each measured threshold from normal for each sector. We calculated the 95% significance level of the sectorial PSM from the respective normative data. We calculated the percentage agreement with group 1 (G1), healthy on HRT and within normal perimetric limits, and group 2 (G2), abnormal on HRT and outside normal perimetric limits. We also examined the relationship of PSM and rim area (RA) in those sectors classified as abnormal by the Moorfields Regression Analysis (MRA) of HRT. Results: The mean age was 65 years (range, 37-89). The global sensitivity versus specificity of each instrument in detecting glaucomatous eyes was: MDT 80% vs. 88%, SAP 80% vs. 80%, PP 70% vs. 89% and HRT 80% vs. 79%. The highest percentage agreements of HRT with PSM (G1, G2, sector, respectively) were MDT (89%, 57%, nasal superior), SAP (83%, 74%, temporal superior) and PP (74%, 63%, nasal superior). Globally, percentage agreement (G1, G2, respectively) was MDT (92%, 28%), SAP (87%, 40%) and PP (77%, 49%). Linear regression showed no significant global trend associating RA and PSM. Sectorally, however, the supero-nasal sector showed a statistically significant trend (p < 0.001) with each instrument, with associated r² coefficients of 0.38 (MDT), 0.56 (SAP) and 0.39 (PP).
Conclusions: There were no significant differences in global sensitivity or specificity between instruments. Structure-function relationships varied significantly between instruments and were consistently strongest supero-nasally. Further studies are required to investigate these relationships in detail.
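The percentage-agreement statistic used above is simply the fraction of eyes on which two binary classifications (e.g. HRT normal/abnormal versus perimetry within/outside normal limits) coincide. A minimal sketch with hypothetical labels:

```python
# Hedged sketch (the eye labels are invented): percentage agreement between
# two binary classifications of the same set of eyes.

def percent_agreement(labels_a, labels_b):
    """Fraction of positions where the two label sequences agree, in %."""
    matches = sum(a == b for a, b in zip(labels_a, labels_b))
    return 100.0 * matches / len(labels_a)

hrt       = [0, 0, 1, 1, 0, 1]  # 0 = normal, 1 = abnormal (hypothetical eyes)
perimetry = [0, 1, 1, 1, 0, 0]
print(round(percent_agreement(hrt, perimetry), 1))  # 66.7
```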
Abstract:
Acute brain slices are slices of brain tissue that are kept vital in vitro for further recordings and analyses. This tool is of major importance in neurobiology and allows the study of brain cells such as microglia, astrocytes, neurons and their inter/intracellular communications via ion channels or transporters. In combination with light/fluorescence microscopies, acute brain slices enable the ex vivo analysis of specific cells or groups of cells inside the slice, e.g. astrocytes. To bridge ex vivo knowledge of a cell with its ultrastructure, we developed a correlative microscopy approach for acute brain slices. The workflow begins with sampling of the tissue and precise trimming of a region of interest, which contains GFP-tagged astrocytes that can be visualised by fluorescence microscopy of ultrathin sections. The astrocytes and their surroundings are then analysed by high resolution scanning transmission electron microscopy (STEM). An important aspect of this workflow is the modification of a commercial cryo-ultramicrotome to observe the fluorescent GFP signal during the trimming process; this ensured that sections contained at least one GFP-positive astrocyte. After cryo-sectioning, a map of the GFP-expressing astrocytes is established and transferred to correlation software installed on a focused ion beam scanning electron microscope equipped with a STEM detector. Next, the areas displaying fluorescence are selected for high resolution STEM imaging. An overview area (e.g. a whole mesh of the grid) is imaged with an automated tiling and stitching process. In the final stitched image, the local organisation of the brain tissue can be surveyed or areas of interest can be magnified to observe fine details, e.g. vesicles or gold labels on specific proteins. The robustness of this workflow is contingent on the quality of sample preparation, based on Tokuyasu's protocol.
This method results in a reasonable compromise between preservation of morphology and maintenance of antigenicity. Finally, an important feature of this approach is that the fluorescence of the GFP signal is preserved throughout the entire preparation process until the last step before electron microscopy.
Abstract:
The aim of this study was to evaluate the forensic protocol recently developed by Qiagen for the QIAsymphony automated DNA extraction platform. Samples containing low amounts of DNA were specifically considered, since they represent the majority of samples processed in our laboratory. The analysis of simulated blood and saliva traces showed that the highest DNA yields were obtained with the maximal elution volume available for the forensic protocol, i.e. 200 µl. The resulting DNA extracts were too diluted for successful DNA profiling and required a concentration step. This additional step is time consuming and potentially increases sample-inversion and contamination risks. The 200 µl DNA extracts were concentrated to 25 µl, and the DNA recovery was estimated with real-time PCR as well as with the percentage of SGM Plus alleles detected. Results using our manual protocol, based on the QIAamp DNA mini kit, and the automated protocol were comparable. Further tests will be conducted to determine more precisely DNA recovery, contamination risk and PCR inhibitor removal once a definitive procedure allowing the concentration of DNA extracts from low-yield samples becomes available for the QIAsymphony.
Abstract:
TCRep 3D is an automated systematic approach for TCR-peptide-MHC class I structure prediction, based on homology and ab initio modeling. It has been considerably generalized from former studies to be applicable to large repertoires of TCR. First, the locations of the complementarity determining regions (CDR) of the target sequences are automatically identified by a sequence alignment strategy against a database of TCR Vα and Vβ chains. A structure-based alignment ensures automated identification of CDR3 loops. The CDR are then modeled in the environment of the complex, in an ab initio approach based on a simulated annealing protocol. During this step, dihedral restraints are applied to drive the CDR1 and CDR2 loops towards their canonical conformations, as described by Al-Lazikani et al. We developed a new automated algorithm that determines additional restraints to iteratively converge towards TCR conformations making frequent hydrogen bonds with the pMHC. We demonstrated that our approach outperforms popular scoring methods (Anolea, Dope and Modeller) in predicting relevant CDR conformations. Finally, this modeling approach has been successfully applied to experimentally determined sequences of TCR that recognize the NY-ESO-1 cancer testis antigen. This analysis revealed a mechanism of selection of TCR through the presence of a single conserved amino acid in all CDR3β sequences. The important structural modifications predicted in silico, and the associated dramatic loss of experimental binding affinity upon mutation of this amino acid, show the good correspondence between the predicted structures and their biological activities. To our knowledge, this is the first systematic approach developed for large-scale structural modeling of TCR repertoires.
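The simulated annealing step can be pictured with a generic skeleton (this is not TCRep 3D itself, and the toy energy below merely stands in for the CDR dihedral restraints): candidate dihedral angles are perturbed at a decreasing temperature and moves are accepted with the Metropolis criterion.

```python
# Generic simulated-annealing sketch. The "energy" here is a hypothetical
# penalty on deviation from canonical target dihedrals, not the TCRep 3D
# restraint function.
import math
import random

def anneal(energy, angles, steps=2000, t0=1.0, seed=0):
    rng = random.Random(seed)          # fixed seed for reproducibility
    e = energy(angles)
    for step in range(steps):
        t = t0 * (1.0 - step / steps) + 1e-6          # linear cooling schedule
        trial = [a + rng.gauss(0.0, 5.0) for a in angles]
        e_trial = energy(trial)
        # Metropolis criterion: always accept improvements, sometimes accept
        # uphill moves while the temperature is high.
        if e_trial < e or rng.random() < math.exp((e - e_trial) / t):
            angles, e = trial, e_trial
    return angles, e

canonical = [-60.0, 140.0, 60.0]       # hypothetical target dihedrals (degrees)
energy = lambda a: sum((x - c) ** 2 for x, c in zip(a, canonical))
best, e_best = anneal(energy, [0.0, 0.0, 0.0])
```

Running this drives the angles from the arbitrary start towards the canonical values, i.e. the final energy is far below the starting energy.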
Abstract:
The quadrennial need study was developed to assist in identifying county highway financial needs (construction, rehabilitation, maintenance, and administration) and in the distribution of the road use tax fund (RUTF) among the counties in the state. During the period since the need study was first conducted using HWYNEEDS software, between 1982 and 1998, there have been large fluctuations in the level of funds distributed to individual counties. A recent study performed by Jim Cable (HR-363, 1993) found that one of the major factors affecting the volatility in the level of fluctuations is the quality of the pavement condition data collected and the accuracy of these data. In 1998, Center for Transportation Research and Education researchers (Maze and Smadi) completed a project to study the feasibility of using automated pavement condition data, collected for the Iowa Pavement Management Program (IPMP) for the paved county roads, in the HWYNEEDS software (TR-418). The automated condition data are objective and also more current, since they are collected in a two-year cycle compared to the 10-year cycle currently used by HWYNEEDS. The study proved the use of the automated condition data in HWYNEEDS would be feasible and beneficial in reducing fluctuations when applied to a pilot study area. In another recommendation from TR-418, the researchers recommended a full analysis and investigation of HWYNEEDS methodology and parameters (for more information on the project, please review the TR-418 project report). The study reported in this document builds on the previous study on using the automated condition data in HWYNEEDS and covers the analysis and investigation of the HWYNEEDS computer program methodology and parameters.
The underlying hypothesis for this study is that, along with the IPMP automated condition data, some changes need to be made to HWYNEEDS parameters to accommodate the use of the new data, which will stabilize the process of allocating resources and reduce fluctuations from one quadrennial need study to another. Another objective of this research is to investigate gravel road needs and study the feasibility of developing a more objective approach to determining needs on the counties' gravel road networks. This study identifies new procedures by which the HWYNEEDS computer program is used to conduct the quadrennial needs study on paved roads. Also, a new procedure will be developed to determine gravel road needs outside of the HWYNEEDS program. Recommendations are made for the new procedures and for changes to the current quadrennial need study. Future research areas are also identified.
Abstract:
Amplified Fragment Length Polymorphisms (AFLPs) provide a cheap and efficient protocol for generating large sets of genetic markers. This technique has become increasingly used during the last decade in various fields of biology, including population genomics, phylogeography, and genome mapping. Here, we present RawGeno, an R library dedicated to the automated scoring of AFLPs (i.e., the coding of electropherogram signals into ready-to-use datasets). Our program includes a complete suite of tools for binning, editing, visualizing, and exporting results obtained from AFLP experiments. RawGeno can be used either through command lines and programmed analysis routines or through a user-friendly graphical user interface. We describe the whole RawGeno pipeline along with recommendations for (a) setting up the analysis of electropherograms in combination with PeakScanner, a program freely distributed by Applied Biosystems; (b) performing quality checks; (c) defining bins and proceeding to scoring; (d) filtering nonoptimal bins; and (e) exporting results in different formats.
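The core binning-and-scoring idea can be sketched as follows (this is an illustration of the general technique, not RawGeno's algorithm, and the fragment sizes and tolerance are hypothetical): peaks pooled from all samples are merged into size bins, then each sample is scored 1/0 for peak presence per bin.

```python
# Illustrative only: greedy single-linkage binning of fragment sizes,
# followed by presence/absence scoring per sample.

def make_bins(all_sizes, tol=1.0):
    """Merge sorted fragment sizes closer than `tol` into bins."""
    bins = []
    for s in sorted(all_sizes):
        if bins and s - bins[-1][-1] <= tol:
            bins[-1].append(s)          # extend the current bin
        else:
            bins.append([s])            # start a new bin
    return [(min(b), max(b)) for b in bins]

def score(sample_sizes, bins, tol=1.0):
    """1 if the sample has a peak inside (or within tol of) the bin, else 0."""
    return [int(any(lo - tol <= s <= hi + tol for s in sample_sizes))
            for lo, hi in bins]

samples = {"A": [100.1, 150.3], "B": [100.4, 200.0]}   # hypothetical peaks
bins = make_bins([s for ps in samples.values() for s in ps])
print(score(samples["A"], bins))  # [1, 1, 0]
```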
Abstract:
The reliable and objective assessment of chronic disease state has been and still is a very significant challenge in clinical medicine. An essential feature of human behavior related to the health status, the functional capacity, and the quality of life is the physical activity during daily life. A common way to assess physical activity is to measure the quantity of body movement. Since human activity is controlled by various factors both extrinsic and intrinsic to the body, quantitative parameters only provide a partial assessment and do not allow for a clear distinction between normal and abnormal activity. In this paper, we propose a methodology for the analysis of human activity pattern based on the definition of different physical activity time series with the appropriate analysis methods. The temporal pattern of postures, movements, and transitions between postures was quantified using fractal analysis and symbolic dynamics statistics. The derived nonlinear metrics were able to discriminate patterns of daily activity generated from healthy and chronic pain states.
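One common symbolic-dynamics statistic, offered here as a hedged illustration (the paper's exact metrics are not specified in this abstract), maps the posture/activity series to symbols and summarizes pattern complexity with the Shannon entropy of short symbol "words":

```python
# Sketch: Shannon entropy (bits) of overlapping symbol words as a simple
# symbolic-dynamics complexity measure. Symbol coding is hypothetical.
import math
from collections import Counter

def word_entropy(symbols, word_len=2):
    """Shannon entropy (bits) of overlapping words of length `word_len`."""
    words = [tuple(symbols[i:i + word_len])
             for i in range(len(symbols) - word_len + 1)]
    counts = Counter(words)
    n = len(words)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# 0 = lying, 1 = sitting, 2 = walking (hypothetical coding)
regular   = [0, 1, 2, 0, 1, 2, 0, 1, 2, 0, 1, 2]
irregular = [0, 2, 1, 1, 0, 2, 2, 1, 0, 1, 2, 0]
print(word_entropy(regular) < word_entropy(irregular))  # True
```

A strictly repetitive activity pattern yields lower word entropy than an irregular one, which is the kind of discrimination between activity patterns the abstract describes.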
Abstract:
The general strategy to perform anti-doping analyses of urine samples starts with the screening for a wide range of compounds. This step should be fast, generic and able to detect any sample that may contain a prohibited substance, while avoiding false negatives and reducing false positive results. The experiments presented in this work were based on ultra-high-pressure liquid chromatography coupled to hybrid quadrupole time-of-flight mass spectrometry. Thanks to the high sensitivity of the method, urine samples could be diluted 2-fold prior to injection. One hundred and three forbidden substances from various classes (such as stimulants, diuretics, narcotics, anti-estrogens) were analysed on a C(18) reversed-phase column in two gradients of 9 min (including two 3-min equilibration periods) for positive and negative electrospray ionisation and detected in the MS full scan mode. The automatic identification of analytes was based on retention time and mass accuracy, with an automated tool for peak picking. The method was validated according to the International Standard for Laboratories described in the World Anti-Doping Code and was selective enough to comply with the World Anti-Doping Agency recommendations. In addition, the matrix effect on MS response was measured for all investigated analytes spiked in urine samples. The limits of detection ranged from 1 to 500 ng/mL, allowing the identification of all tested compounds in urine. When a sample was reported positive during the screening, a fast additional pre-confirmatory step was performed to reduce the number of confirmatory analyses.
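The identification rule described above (retention time plus mass accuracy) can be sketched as a simple two-criterion match. Everything below is illustrative: the analyte names, reference values, and tolerances are hypothetical, not the validated method's parameters.

```python
# Hedged sketch: a detected peak matches a reference analyte when its
# retention time is within rt_tol minutes AND its mass error is within
# ppm_tol parts per million.

REFERENCE = [  # (name, retention time in min, exact m/z) -- hypothetical
    ("analyte_A", 2.45, 304.1543),
    ("analyte_B", 5.10, 195.0877),
]

def match_peak(rt, mz, rt_tol=0.2, ppm_tol=5.0):
    for name, ref_rt, ref_mz in REFERENCE:
        ppm = abs(mz - ref_mz) / ref_mz * 1e6   # mass error in ppm
        if abs(rt - ref_rt) <= rt_tol and ppm <= ppm_tol:
            return name
    return None

print(match_peak(2.50, 304.1550))  # analyte_A (0.05 min off, ~2.3 ppm)
print(match_peak(2.50, 304.1600))  # None (mass error ~18.7 ppm)
```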
Abstract:
OBJECTIVE: Imaging during a period of minimal myocardial motion is of paramount importance for coronary MR angiography (MRA). The objective of our study was to evaluate the utility of FREEZE, a custom-built automated tool for the identification of the period of minimal myocardial motion, in both a moving phantom at 1.5 T and 10 healthy adults (nine men, one woman; mean age, 24.9 years; age range, 21-32 years) at 3 T. CONCLUSION: Quantitative analysis of the moving phantom showed that dimension measurements approached those obtained in the static phantom when using FREEZE. In vivo, vessel sharpness, signal-to-noise ratio (SNR), and contrast-to-noise ratio (CNR) were significantly improved when coronary MRA was performed during the software-prescribed period of minimal myocardial motion (p < 0.05). Consistent with these objective findings, image quality assessments by consensus review also improved significantly when using the automated prescription of the period of minimal myocardial motion. The use of FREEZE improves image quality of coronary MRA. Simultaneously, operator dependence can be minimized while the ease of use is improved.
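The underlying idea of automated rest-period detection can be illustrated with a sliding-window minimum over a motion trace (this is a sketch of the general principle, not the FREEZE implementation; the displacement values are hypothetical):

```python
# Sketch: given per-frame myocardial displacement over one cardiac cycle,
# find the acquisition window of a given width with the least total motion.

def quietest_window(motion, width):
    """Return (start index, summed motion) of the lowest-motion window."""
    best_start, best_sum = 0, sum(motion[:width])
    current = best_sum
    for i in range(1, len(motion) - width + 1):
        # Slide the window: add the entering frame, drop the leaving one.
        current += motion[i + width - 1] - motion[i - 1]
        if current < best_sum:
            best_start, best_sum = i, current
    return best_start, best_sum

# Hypothetical displacement (mm) at 8 frames across one R-R interval:
trace = [1.2, 0.9, 0.3, 0.1, 0.1, 0.4, 1.0, 1.5]
start, total = quietest_window(trace, 3)
print(start)  # 2 -> frames 2-4 are the quietest period
```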
Abstract:
Commercially available instruments for road-side data collection take highly limited measurements, require extensive manual input, or are too expensive for widespread use. However, inexpensive computer vision techniques for digital video analysis can be applied to automate the monitoring of driver, vehicle, and pedestrian behaviors. These techniques can measure safety-related variables that cannot be easily measured using existing sensors. The use of these techniques will lead to an improved understanding of the decisions made by drivers at intersections. These automated techniques allow the collection of large amounts of safety-related data in a relatively short amount of time. There is a need to develop an easily deployable system to utilize these new techniques. This project implemented and tested a digital video analysis system for use at intersections. A prototype video recording system was developed for field deployment. A computer interface was implemented and served to simplify and automate the data analysis and the data review process. Driver behavior was measured at urban and rural non-signalized intersections. Recorded digital video was analyzed and used to test the system.
Abstract:
The Federal Highway Administration estimates that red light running causes more than 100,000 crashes and 1,000 fatalities annually and results in an estimated economic loss of over $14 billion per year in the United States. In Iowa alone, a statewide analysis of red light running crashes, using crash data from 2001 to 2006, indicates that an average of 1,682 red light running crashes occur at signalized intersections every year. As a result, red light running poses a significant safety issue for communities. Communities rarely have the resources to place additional law enforcement in the field to combat the problem, and they are increasingly using automated red light running camera-enforcement systems at signalized intersections. In Iowa, three communities (Davenport, Council Bluffs, and Clive) have used camera enforcement since 2004. As communities across the United States attempt to address red light running, a number of them have implemented red light running camera enforcement programs. This report examines the red light running programs in Iowa and summarizes results of analyses to evaluate the effectiveness of such cameras.
Abstract:
An assay for the simultaneous analysis of pharmaceutical compounds and their metabolites from micro-whole blood samples (i.e. 5 µL) was developed using an on-line dried blood spot (on-line DBS) device coupled with hydrophilic interaction/reversed-phase (HILIC/RP) LC/MS/MS. Filter paper is directly integrated into the LC device using a homemade stainless-steel desorption cell. Without any sample pretreatment, analytes are desorbed from the paper towards an automated system of valves linking a zwitterionic-HILIC column to an RP C18 column. In the same run, the polar fraction is separated by the zwitterionic-HILIC column while the non-polar fraction is eluted on the RP C18. Both fractions are detected by IT-MS operating in full scan mode for the survey scan and in product ion mode for the dependent scan using an ESI source. The procedure was evaluated by the simultaneous qualitative analysis of four probes and their relative phase I and II metabolites spiked in whole blood. In addition, the method was successfully applied to the in vivo monitoring of buprenorphine metabolism after intraperitoneal administration of 30 mg/kg to adult female Wistar rats.
Abstract:
Currently, there is no simple direct screening method for the misuse of blood transfusions in sports. In this study, we investigated whether the measurement of iron in EDTA-plasma can serve as biomarker for such purpose. Our results revealed an increase of the plasma iron level up to 25-fold 6 h after blood re-infusion. The variable remained elevated 10-fold one day after the procedure. A specificity of 100% and a sensitivity of 93% were obtained with a proposed threshold at 45 µg/dL of plasma iron. Therefore, our test could be used as a simple, cost effective biomarker for the screening for blood transfusion misuse in sports. Copyright © 2014 John Wiley & Sons, Ltd.
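The way a single-threshold screening test like this is characterized can be sketched directly (the iron values below are made up, not the study's data): sensitivity is the fraction of transfused samples at or above the cut-off, specificity the fraction of controls below it.

```python
# Sketch: sensitivity and specificity of a one-threshold screening rule.
# All values are hypothetical illustrations.

def sens_spec(positives, negatives, threshold):
    """Sensitivity over true positives, specificity over true negatives."""
    sens = sum(x >= threshold for x in positives) / len(positives)
    spec = sum(x < threshold for x in negatives) / len(negatives)
    return sens, spec

transfused = [180, 95, 60, 44, 120]   # plasma iron, µg/dL (hypothetical)
controls   = [20, 35, 28, 41, 15]
print(sens_spec(transfused, controls, threshold=45))  # (0.8, 1.0)
```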