998 results for computerized data
Abstract:
This paper presents an automated solution for the precise detection of fiducial screws in three-dimensional (3D) computerized tomography (CT) / digital volume tomography (DVT) data for image-guided ENT surgery. Unlike previously published solutions, we regard detection of the fiducial screws in the CT/DVT volume data as a pose estimation problem and have therefore developed a model-based solution. Starting from a user-supplied initialization, our solution detects the fiducial screws by iteratively matching a computer-aided design (CAD) model of the fiducial screw to features extracted from the CT/DVT data. We validated our solution on one conventional CT dataset and on five DVT volume datasets, detecting a total of 24 fiducial screws. Our experimental results indicate that the proposed solution achieves much higher reproducibility and precision than manual detection. Further comparison shows that it produces better results on the DVT datasets than on the conventional CT dataset.
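The core step of matching a rigid model to extracted image features can be illustrated with a minimal sketch. This is not the authors' algorithm: it assumes 2D points and known point correspondences (a real pipeline would iterate nearest-neighbor matching in 3D), and all point values are invented for illustration. Under those assumptions, the optimal rotation and translation have a closed form:

```python
import math

def best_rigid_transform(model, data):
    """Closed-form 2D rigid transform (rotation + translation) that maps
    paired model points onto data points in the least-squares sense."""
    n = len(model)
    mcx = sum(p[0] for p in model) / n
    mcy = sum(p[1] for p in model) / n
    dcx = sum(p[0] for p in data) / n
    dcy = sum(p[1] for p in data) / n
    # Accumulate cross-covariance terms that determine the optimal angle.
    s_cos = s_sin = 0.0
    for (mx, my), (dx, dy) in zip(model, data):
        ax, ay = mx - mcx, my - mcy
        bx, by = dx - dcx, dy - dcy
        s_cos += ax * bx + ay * by
        s_sin += ax * by - ay * bx
    theta = math.atan2(s_sin, s_cos)
    c, s = math.cos(theta), math.sin(theta)
    tx = dcx - (c * mcx - s * mcy)
    ty = dcy - (s * mcx + c * mcy)
    return theta, tx, ty

def apply(theta, tx, ty, pts):
    c, s = math.cos(theta), math.sin(theta)
    return [(c * x - s * y + tx, s * x + c * y + ty) for x, y in pts]

# Hypothetical screw template and "image features": the features are the
# template points rotated by 0.3 rad and shifted by (2, -1).
model = [(0.0, 0.0), (4.0, 0.0), (4.0, 1.5), (0.0, 1.5)]
features = apply(0.3, 2.0, -1.0, model)
theta, tx, ty = best_rigid_transform(model, features)
```

With exact correspondences the pose is recovered exactly; an iterative scheme like the paper's alternates this solve with re-estimating which feature corresponds to which model point.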
Abstract:
This study evaluates the clinical applicability of administering sodium nitroprusside with a closed-loop titration system compared with a manually adjusted system. The mean arterial pressure (MAP) was recorded every 10 and 30 sec during the first 150 min after open heart surgery in 20 patients (group 1: computer regulation) and in ten patients (group 2: manual regulation). The results (16,343 and 2,912 data points in groups 1 and 2, respectively) were then analyzed in four time frames and five pressure ranges as indicators of clinical efficacy. Sixty percent of the measured MAP values in both groups were within +/- 10% of the desired set-point during the first 10 min. Thereafter, until the end of observation, the MAP was maintained within +/- 10% of the desired set-point 90% of the time in group 1 vs. 60% of the time in group 2. One percent and 11% of data points were more than 20% from the set-point in groups 1 and 2, respectively (p < .05, chi-square test). The computer-assisted therapy provided better control of MAP, was safe to use, and helped to reduce nursing demands.
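The closed-loop idea can be sketched as a simple feedback controller. This toy is not the clinical algorithm: the set-point, gain, drug sensitivity, and first-order "patient" response below are invented round numbers, and a real titration system would include safety limits and validated pharmacodynamics.

```python
# Toy closed-loop titration sketch: a proportional correction is added to
# a hypothetical nitroprusside infusion rate every 10 s so that a
# first-order patient model settles at the MAP set-point.
SETPOINT = 90.0      # target MAP, mmHg (illustrative)
GAIN = 0.05          # rate correction, (mL/h) per mmHg of error
SENSITIVITY = 8.0    # mmHg of MAP drop per mL/h of drug (toy model)

def simulate(initial_map, steps=90):
    rate = 0.0       # infusion rate, mL/h
    mean_ap = initial_map
    history = []
    for _ in range(steps):            # one step = 10 s, 90 steps = 15 min
        error = mean_ap - SETPOINT    # positive when pressure is too high
        rate = max(0.0, rate + GAIN * error)
        # MAP relaxes toward the level set by the current drug rate.
        target = initial_map - SENSITIVITY * rate
        mean_ap += 0.3 * (target - mean_ap)
        history.append(mean_ap)
    return history

maps = simulate(initial_map=120.0)
# Fraction of samples after the first 5 min within +/- 10% of set-point.
within = sum(1 for m in maps[30:] if abs(m - SETPOINT) <= 0.1 * SETPOINT)
```

Because the correction accumulates, the loop behaves like integral control: after a brief transient, all later samples sit inside the +/- 10% band, mirroring the kind of set-point tracking the study measured.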
Abstract:
OBJECTIVE: To identify and describe unintended adverse consequences related to clinical workflow when implementing or using computerized provider order entry (CPOE) systems. METHODS: We analyzed qualitative data from field observations and formal interviews gathered over a three-year period at five hospitals in three organizations. Five multidisciplinary researchers worked together to identify themes related to the impacts of CPOE systems on clinical workflow. RESULTS: CPOE systems can affect clinical work by 1) introducing or exposing human/computer interaction problems, 2) altering the pace, sequencing, and dynamics of clinical activities, 3) providing only partial support for the work activities of all types of clinical personnel, 4) reducing clinical situation awareness, and 5) poorly reflecting organizational policy and procedure. CONCLUSIONS: As CPOE systems evolve, those involved must take care to mitigate the many unintended adverse effects these systems have on clinical workflow. Workflow issues resulting from CPOE can be mitigated by iteratively altering both clinical workflow and the CPOE system until a satisfactory fit is achieved.
Abstract:
Coalescent theory represents the most significant advance in theoretical population genetics in the past three decades. It holds that all genes or alleles in a given population are ultimately inherited from a single ancestor shared by all members of the population, known as the most recent common ancestor. Coalescent theory is now widely recognized as a cornerstone for rigorous statistical analyses of molecular data from populations [1]. Researchers have developed a large number of coalescent models and methods [2,3,4,5,6], which are applied not only in coalescent analysis itself but also in present-day population genetics and genome studies, and even in public health. This thesis aims to complete a computer-based statistical framework for coalescent analysis. The framework provides a large collection of coalescent models and statistical methods to assist students and researchers, with results presented in various formats: text, graphics, and printed pages. In particular, it also supports the creation of new coalescent models and statistical methods.
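The basic object such a framework simulates is the Kingman coalescent: while k lineages remain, the waiting time to the next common-ancestor event is exponential with mean N / (k choose 2) generations, so the expected time to the most recent common ancestor is 2N(1 - 1/n). A minimal sketch (sample size and population size are arbitrary illustrative values):

```python
import random

def coalescent_times(n, pop_size, rng):
    """Simulate the coalescence times of a sample of n gene copies in a
    haploid population of constant size pop_size. While k lineages
    remain, the time to the next coalescence is exponential with mean
    pop_size / (k * (k - 1) / 2) generations."""
    times = []
    t = 0.0
    for k in range(n, 1, -1):
        pairs = k * (k - 1) / 2
        t += rng.expovariate(pairs / pop_size)
        times.append(t)
    return times  # times[-1] is the time to the MRCA (TMRCA)

rng = random.Random(42)
n, pop_size, reps = 10, 1000, 2000
tmrcas = [coalescent_times(n, pop_size, rng)[-1] for _ in range(reps)]
mean_tmrca = sum(tmrcas) / reps
expected = 2 * pop_size * (1 - 1 / n)   # theory: E[TMRCA] = 2N(1 - 1/n)
```

Averaged over many replicates, the simulated TMRCA converges on the theoretical expectation, which is how such simulators are usually sanity-checked.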
Abstract:
This study evaluated the administration-time-dependent effects of a stimulant (Dexedrine, 5 mg), a sleep inducer (Halcion, 0.25 mg), and placebo (control) on human performance. The investigation was conducted on 12 diurnally active (0700-2300) male adults (23-38 yrs) using a double-blind, randomized, six-way-crossover, three-treatment, two-timepoint (0830 vs. 2030) design. Performance tests were conducted hourly during sleepless 13-hour studies using a computer-generated, -controlled, and -scored multi-task cognitive performance assessment battery (PAB) developed at the Walter Reed Army Institute of Research. Specific tests were Simple and Choice Reaction Time, Serial Addition/Subtraction, Spatial Orientation, Logical Reasoning, Time Estimation, Response Timing, and the Stanford Sleepiness Scale. The major index of performance was "Throughput", a combined measure of speed and accuracy.
Under the placebo condition, single and group cosinor analysis documented circadian rhythms in cognitive performance for the majority of tests, both for individuals and for the group. Performance was best around 1830-2030 and most variable around 0530-0700, when sleepiness was greatest (0300). Morning Dexedrine dosing marginally enhanced performance, by an average of 3% relative to the time-matched control level, and increased alertness by 10% over the AM control. Dexedrine PM failed to improve performance relative to the corresponding PM control baseline. Comparing AM and PM Dexedrine administrations, AM performance was 6% better, with subjects 25% more alert. Morning Halcion administration caused a 7% performance decrement and a 16% increase in sleepiness; evening administration caused a 13% decrement and a 10% increase in sleepiness, relative to time-matched control data. Performance was 9% worse and sleepiness 24% greater after evening versus morning Halcion administration. These results suggest that for evening Halcion dosing, the overnight sleep deprivation coinciding with the circadian nadir in performance combines with the drug's CNS depressant effects to produce performance degradation. For Dexedrine, morning administration produced only marginal performance enhancement, and evening administration was less effective, suggesting the 5-mg dose may be too low to counteract the partial sleep deprivation and nocturnal nadir in performance.
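The single cosinor method referred to above fits y = M + A·cos(2πt/24 − φ) to time-stamped data, giving a mesor M, amplitude A, and acrophase φ. A minimal sketch for one full cycle of evenly spaced hourly samples (where the cosine and sine regressors are orthogonal and the least-squares fit reduces to Fourier sums); the "throughput" values and 19:00 peak are synthetic, not the study's data:

```python
import math

def single_cosinor_24h(hours, values):
    """Least-squares fit of y = M + A*cos(2*pi*t/24 - phi) over one full
    cycle of evenly spaced hourly samples. Returns (mesor, amplitude,
    acrophase); the fitted curve peaks at t = acrophase / (2*pi/24)."""
    n = len(values)
    w = 2 * math.pi / 24
    mesor = sum(values) / n
    beta = 2 / n * sum(y * math.cos(w * t) for t, y in zip(hours, values))
    gamma = 2 / n * sum(y * math.sin(w * t) for t, y in zip(hours, values))
    amplitude = math.hypot(beta, gamma)
    acrophase = math.atan2(gamma, beta)
    return mesor, amplitude, acrophase

# Synthetic circadian "throughput" rhythm peaking at 19:00.
hours = list(range(24))
truth = [100 + 8 * math.cos(2 * math.pi / 24 * (t - 19)) for t in hours]
mesor, amplitude, acrophase = single_cosinor_24h(hours, truth)
peak_hour = (acrophase / (2 * math.pi / 24)) % 24
```

On noise-free data the fit recovers the mesor, amplitude, and peak time exactly; group cosinor analysis then summarizes these parameters across subjects.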
Abstract:
The purpose of this study was to evaluate the adequacy of computerized vital records in Texas for conducting etiologic studies of neural tube defects (NTDs), using the revised and expanded National Center for Health Statistics vital record forms introduced in Texas in 1989. Cases of NTDs (anencephaly and spina bifida) among Harris County (Houston) residents were identified from the computerized birth and death records for 1989-1991. The validity of the system was then measured against cases ascertained independently through medical records and death certificates. The computerized system performed poorly in its identification of NTDs, particularly for anencephaly, where the false positive rate was 80% with little or no improvement over the 3-year period. For both NTDs, the sensitivity and positive predictive value of the tapes were somewhat higher for Hispanic than for non-Hispanic mothers. Case-control studies were conducted using both the tape data set and the independently verified data set, with controls selected from the live birth tapes. Findings varied widely between the data sets. For example, the anencephaly odds ratio for Hispanic mothers (vs. non-Hispanic) was 1.91 (CI = 1.38-2.65) for the tape file but 3.18 (CI = 1.81-5.58) for verified records. The odds ratio for diabetes was elevated in the tape set (OR = 3.33, CI = 1.67-6.66) but not for verified cases (OR = 1.09, CI = 0.24-4.96), among whom few mothers were diabetic. It was concluded that computerized tapes should not be relied on alone for NTD studies. Using the verified cases, Hispanic ethnicity of the mother was associated with spina bifida, and Hispanic ethnicity, teen motherhood, and previous pregnancy terminations were associated with anencephaly. Mother's birthplace, education, parity, and diabetes were not significant for either NTD. Stratified analyses revealed several notable examples of statistical interaction. For anencephaly, strong interaction was observed between Hispanic origin and trimester of first prenatal care. The prevalence was 3.8 per 10,000 live births for anencephaly and 2.0 for spina bifida (5.8 per 10,000 births for the two categories combined).
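The odds ratios and confidence intervals reported above come from standard 2x2-table arithmetic: OR = ad/bc, with a Woolf-type interval computed on the log-odds scale. A sketch with hypothetical counts (the abstract reports only the resulting estimates, not the underlying tables):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table (exposed cases a, exposed controls b,
    unexposed cases c, unexposed controls d) with a Woolf 95% CI:
    CI = exp(ln(OR) +/- z * sqrt(1/a + 1/b + 1/c + 1/d))."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts chosen only so the point estimate lands near the
# tape-file anencephaly OR of 1.91 reported in the abstract.
or_, lo, hi = odds_ratio_ci(40, 220, 30, 315)
```

When a cell count is small (as with the few diabetic mothers among verified cases), the 1/cell terms dominate the standard error and the interval widens sharply, which is exactly why the verified-case diabetes CI spans 0.24-4.96.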
Abstract:
Since the beginning of time, human beings have needed to understand and analyze everything around them, relying on different tools to do so: cave paintings, the Library of Alexandria, vast collections of books, and today an enormous quantity of computerized information. All of it has been stored, as the technology of each era allowed, in the hope that it would prove useful through consultation and analysis. The same is still true today. Until a few years ago, information was analyzed manually or with relational databases. Now the time has come for a new technology, Big Data, with which very large amounts of data of all kinds can be analyzed in relatively short times. This book examines the characteristics and advantages of Big Data and presents a study of the Hadoop platform, a Java-based platform that can analyze large quantities of data in different formats and from different sources. Along the way, the reader is given the background needed to understand the concept, its historical development, its uses, and the forecasts for its evolution and growth over the coming years.
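The processing model Hadoop popularized is map/shuffle/reduce. A minimal sketch of the pattern, in plain Python rather than Hadoop's Java API, using the classic word-count example (the input lines are invented):

```python
from collections import defaultdict

def map_phase(line):
    """Mapper: emit a (word, 1) pair for every word in one input line."""
    return [(word.lower(), 1) for word in line.split()]

def shuffle(pairs):
    """Shuffle: group all emitted values by key, as the framework would
    before handing each key's values to a reducer."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reducer: sum the counts for each word."""
    return {word: sum(counts) for word, counts in groups.items()}

lines = ["big data needs big tools", "hadoop analyzes big data"]
pairs = [pair for line in lines for pair in map_phase(line)]
counts = reduce_phase(shuffle(pairs))
```

Hadoop's value is that the same three phases run in parallel across many machines over data too large for one node, with the framework handling distribution and fault tolerance.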
Abstract:
A computational system for the prediction of polymorphic loci directly and efficiently from human genomic sequence was developed and verified. A suite of programs, collectively called pompous (polymorphic marker prediction of ubiquitous simple sequences), detects tandem repeats ranging from dinucleotides up to 250-mers, scores them according to predicted level of polymorphism, and designs appropriate flanking primers for PCR amplification. This approach was validated on an approximately 750-kilobase region of human chromosome 3p21.3, involved in lung and breast carcinoma homozygous deletions. Target DNA from 36 paired B lymphoblastoid and lung cancer lines was amplified and allelotyped for 33 loci predicted by pompous to be variable in repeat size. We found that among those 36 predominantly Caucasian individuals, 22 of the 33 (67%) predicted loci were polymorphic, with an average heterozygosity of 0.42. Allele loss in this region was found in 27/36 (75%) of the tumor lines using these markers. pompous provides the genetic researcher with an additional tool for the rapid and efficient identification of polymorphic markers, and through a World Wide Web site, investigators can use pompous to identify polymorphic markers for their research. A catalog of 13,261 potential polymorphic markers and associated primer sets has been created from the analysis of 141,779,504 base pairs of human genomic sequence in GenBank. These data are available on our Web site (pompous.swmed.edu) and will be updated periodically as GenBank is expanded and algorithm accuracy is improved.
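The first stage of such a pipeline, tandem-repeat detection, can be sketched with a regular expression. This toy handles only dinucleotide units on an invented sequence (pompous itself covers units up to 250-mers and adds polymorphism scoring and primer design):

```python
import re

def find_dinucleotide_repeats(seq, min_copies=4):
    """Toy tandem-repeat scan: find runs where a dinucleotide unit is
    repeated at least min_copies times, returning (start, unit, copies)."""
    pattern = re.compile(r"([ACGT]{2})\1{%d,}" % (min_copies - 1))
    hits = []
    for m in pattern.finditer(seq):
        unit = m.group(1)
        copies = (m.end() - m.start()) // 2
        hits.append((m.start(), unit, copies))
    return hits

# Synthetic sequence with a (CA)6 and a (TA)4 tract embedded.
seq = "GGATTT" + "CA" * 6 + "GG" + "TA" * 4 + "TCCG"
hits = find_dinucleotide_repeats(seq)
```

Longer repeat tracts tend to be more mutable, which is why a predictor can rank loci by repeat length and unit when estimating the chance a locus is polymorphic.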
Abstract:
National Highway Traffic Safety Administration, Washington, D.C.
Abstract:
Study Design. Survey of intraobserver and interobserver measurement variability. Objective. To assess the use of reformatted computerized tomography (CT) images for manual measurement of coronal Cobb angles in idiopathic scoliosis. Summary of Background Data. Cobb angle measurements in idiopathic scoliosis are traditionally made from standing radiographs, whereas CT is often used for assessment of vertebral rotation. Correlating Cobb angles from standing radiographs with vertebral rotations from supine CT is problematic because the geometry of the spine changes significantly from standing to supine positions, and 2 different imaging methods are involved. Methods. We assessed the use of reformatted thoracolumbar CT images for Cobb angle measurement. Preoperative CT scans of 12 patients with idiopathic scoliosis were used to generate reformatted coronal images. Five observers measured coronal Cobb angles on 3 occasions from each of the images. Intraobserver and interobserver variability associated with Cobb measurement from reformatted CT scans was assessed and compared with previous studies of measurement variability using plain radiographs. Results. For major curves, 95% confidence intervals for intraobserver and interobserver variability were +/- 6.6 degrees and +/- 7.7 degrees, respectively. For minor curves, the intervals were +/- 7.5 degrees and +/- 8.2 degrees, respectively. Intraobserver and interobserver technical error of measurement was 2.4 degrees and 2.7 degrees, with reliability coefficients of 88% and 84%, respectively. There was no correlation between measurement variability and curve severity. Conclusions. Reformatted CT images may be used for manual measurement of coronal Cobb angles in idiopathic scoliosis with similar variability to manual measurement of plain radiographs.
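The technical error of measurement (TEM) quoted in the results is a standard statistic for paired repeated measurements: TEM = sqrt(sum of squared differences / 2n). A sketch with hypothetical repeated Cobb-angle readings (the study's raw measurements are not given in the abstract):

```python
import math

def tem(first_readings, second_readings):
    """Technical error of measurement for paired repeats:
    TEM = sqrt(sum(d_i^2) / (2n)), where d_i is the difference between
    the two readings of the same subject."""
    n = len(first_readings)
    sq = sum((a - b) ** 2 for a, b in zip(first_readings, second_readings))
    return math.sqrt(sq / (2 * n))

# Hypothetical Cobb angles (degrees) measured twice by one observer.
first  = [42.0, 55.0, 31.0, 47.0, 63.0, 28.0]
second = [44.0, 52.0, 33.0, 47.0, 60.0, 30.0]
intraobserver_tem = tem(first, second)
```

A TEM of about 1.6 degrees here plays the same role as the study's 2.4-degree intraobserver figure: roughly two-thirds of repeat measurements fall within one TEM of each other.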
Abstract:
This research describes a computerized model of human classification which has been constructed to represent the process by which assessments are made for psychodynamic psychotherapy. The model assigns membership grades (MGs) to clients so that the most suitable ones have high values in the therapy category. Categories consist of a hierarchy of components, one of which, ego strength, is analysed in detail to demonstrate the way it has captured the psychotherapist's knowledge. The bottom of the hierarchy represents the measurable factors being assessed during an interview. A questionnaire was created to gather the identified information and was completed by the psychotherapist after each assessment. The results were fed into the computerized model, demonstrating a high correlation between the model MGs and the suitability ratings of the psychotherapist (r = .825 for 24 clients). The model has successfully identified the relevant data involved in assessment and simulated the decision-making process of the expert. Its cognitive validity enables decisions to be explained, which means that it has potential for therapist training and also for enhancing the referral process, with benefits in cost effectiveness as well as in the reduction of trauma to clients. An adapted version measuring client improvement would give quantitative evidence for the benefit of therapy, thereby supporting auditing and accountability. © 1997 The British Psychological Society.
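The hierarchy of components described above can be sketched as a recursive weighted aggregation: each category's membership grade (MG) blends the scores of its components, which may themselves be sub-hierarchies. The weights, factor scores, and two-level structure below are illustrative assumptions, not the published model's actual knowledge base:

```python
def membership_grade(node):
    """Recursively combine a hierarchy of (weight, score-or-subtree)
    pairs into a single grade in [0, 1] via weighted averaging."""
    if isinstance(node, (int, float)):
        return float(node)   # leaf: a measurable factor's score
    total = sum(w for w, _ in node)
    return sum(w * membership_grade(child) for w, child in node) / total

# Hypothetical structure: ego strength as a weighted blend of interview
# factors, then suitability as a blend of ego strength and other components.
ego_strength = [(2, 0.8), (1, 0.6), (1, 0.7)]
suitability = [(3, ego_strength), (2, 0.9), (1, 0.5)]
mg = membership_grade(suitability)
```

Because every intermediate grade is explicit, such a model can explain a decision by reporting which component pulled the final MG up or down, which is the cognitive-validity property the study emphasizes.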
Abstract:
Melanoma is a type of skin cancer caused by the uncontrolled growth of atypical melanocytes. In recent decades, computer-aided diagnosis has been used to support medical professionals; however, there is still no globally accepted tool. In this context, and in line with the state of the art, we propose a system that receives a dermatoscopy image and indicates whether the lesion is benign or malignant. The tool is composed of the following modules: preprocessing, segmentation, feature extraction, and classification. Preprocessing removes hairs; segmentation isolates the lesion; feature extraction follows the ABCD dermoscopy rule; and classification is performed by a support vector machine. Experimental evidence indicates that the proposal achieves 90.63% accuracy, 95% sensitivity, and 83.33% specificity on a data set of 104 dermatoscopy images. These results are favorable compared with the performance of traditional diagnosis in dermatology.
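The three reported metrics come from a confusion matrix over the test images: accuracy is the overall fraction correct, sensitivity the fraction of malignant lesions caught, specificity the fraction of benign lesions correctly cleared. A sketch with hypothetical counts (the abstract reports only the percentages, not the underlying matrix):

```python
def metrics(tp, fn, tn, fp):
    """Accuracy, sensitivity, and specificity from confusion-matrix
    counts: tp/fn are malignant lesions classified right/wrong, tn/fp
    benign lesions classified right/wrong."""
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return accuracy, sensitivity, specificity

# Hypothetical counts chosen to reproduce sensitivity and specificity
# close to the abstract's figures.
acc, sens, spec = metrics(tp=57, fn=3, tn=35, fp=7)
```

For a screening task, sensitivity is usually weighted above specificity, since a missed melanoma (a false negative) is far more costly than an unnecessary referral.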