956 results for automated analysis


Relevance:

100.00%

Abstract:

Visual analysis of electroencephalography (EEG) background and reactivity during therapeutic hypothermia provides important outcome information but is time-consuming and not always consistent between reviewers. Automated EEG analysis may help quantify brain damage. Forty-six comatose patients undergoing therapeutic hypothermia after cardiac arrest were included in the study. EEG background was quantified with the burst-suppression ratio (BSR) and approximate entropy, both of which are used to monitor anesthesia. Reactivity was detected through changes in the power spectrum of the signal before and after stimulation. The automated results showed almost perfect agreement (discontinuity) to substantial agreement (background reactivity) with visual scoring by EEG-certified neurologists. The burst-suppression ratio was better suited than approximate entropy to distinguishing continuous EEG background from burst-suppression in this specific population. Automated EEG background and reactivity measures were significantly related to good and poor outcomes. We conclude that quantitative EEG measurements can provide promising information regarding the current state of the patient and the clinical outcome, but further work is needed before routine application in a clinical setting.
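The burst-suppression ratio used here is commonly defined as the fraction of an epoch spent in suppression, i.e., with the amplitude below a threshold for at least a minimum duration. A minimal NumPy sketch of that idea follows; the amplitude threshold and minimum suppression duration are illustrative assumptions, not the study's settings:

```python
import numpy as np

def burst_suppression_ratio(eeg, fs, amp_thresh_uv=10.0, min_suppr_s=0.5):
    """Fraction of the epoch spent in suppression: samples whose absolute
    amplitude stays below amp_thresh_uv for at least min_suppr_s seconds.
    Threshold and minimum duration are illustrative, not the study's values."""
    below = np.abs(eeg) < amp_thresh_uv
    min_len = int(min_suppr_s * fs)
    suppressed = np.zeros_like(below)
    start = None
    # mark runs of consecutive sub-threshold samples that are long enough
    for i, flag in enumerate(below):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start >= min_len:
                suppressed[start:i] = True
            start = None
    if start is not None and len(below) - start >= min_len:
        suppressed[start:] = True
    return suppressed.mean()

# toy example: 10 s of noise with a 3 s quiet (suppressed) stretch
fs = 250
sig = np.random.randn(10 * fs) * 30.0
sig[2 * fs:5 * fs] *= 0.1
print(f"BSR = {burst_suppression_ratio(sig, fs):.2f}")
```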

Relevance:

100.00%

Abstract:

A sensitive, selective, and reproducible in-tube solid-phase microextraction and liquid chromatographic (in-tube SPME/LC-UV) method for the determination of lidocaine and its metabolite monoethylglycinexylidide (MEGX) in human plasma has been developed, validated, and applied to a pharmacokinetic study in pregnant women with gestational diabetes mellitus (GDM) subjected to epidural anesthesia. Important factors in the optimization of in-tube SPME performance are discussed, including the draw/eject sample volume, draw/eject cycle number, draw/eject flow rate, sample pH, and the influence of plasma proteins. The limits of quantification of the in-tube SPME/LC method were 50 ng/mL for both lidocaine and its metabolite. The interday and intraday precision had coefficients of variation lower than 8%, and accuracy ranged from 95% to 117%. The response of the in-tube SPME/LC method for the analytes was linear over a dynamic range from 50 to 5000 ng/mL, with correlation coefficients higher than 0.9976. The developed in-tube SPME/LC method was successfully used to analyze lidocaine and its metabolite in plasma samples from pregnant women with GDM subjected to epidural anesthesia for the pharmacokinetic study.
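As an illustration of the validation figures reported (linearity over the 50-5000 ng/mL range, correlation coefficient, and precision expressed as a coefficient of variation), a minimal sketch using made-up calibration and replicate data rather than the study's measurements:

```python
import numpy as np

# hypothetical calibration standards (ng/mL) and detector responses
conc = np.array([50, 100, 500, 1000, 2500, 5000], dtype=float)
resp = np.array([0.012, 0.024, 0.121, 0.244, 0.608, 1.219])

slope, intercept = np.polyfit(conc, resp, 1)        # least-squares calibration line
r = np.corrcoef(conc, resp)[0, 1]                   # correlation coefficient
print(f"response = {slope:.2e} * conc + {intercept:.2e},  r = {r:.4f}")

# precision: coefficient of variation of replicate measurements of one QC level
replicates = np.array([498, 505, 511, 492, 503], dtype=float)  # ng/mL, hypothetical
cv = replicates.std(ddof=1) / replicates.mean() * 100
print(f"CV = {cv:.1f}%  (acceptance in the study: < 8%)")
```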

Relevance:

100.00%

Abstract:

The first phase of the research activity concerned the state of the art of cycling infrastructure, bicycle use, and evaluation methods. In this part, the candidate studied the "bicycle system" in countries with high bicycle use, in particular the Netherlands. An evaluation was carried out of the questionnaires from a survey on general mobility conducted within the European project BICY in 13 cities of the participating countries. The questionnaire was designed, tested, and implemented, and was later validated by a test in Bologna. The results were corrected with demographic information and compared with official data. The cycling infrastructure analysis was conducted on the basis of information from the OpenStreetMap database. The activity consisted of programming algorithms in Python that extract infrastructure data for a region from the database, then sort and filter the cycling infrastructure while calculating attributes such as the length of path arcs. The results obtained were compared with official data where available. The structure of the thesis is as follows: 1. Introduction: description of the state of cycling in several advanced countries, description of analysis methods and their importance for implementing appropriate cycling policies, and supply and demand of bicycle infrastructure. 2. Survey on mobility: details of the survey developed and the evaluation method; the results obtained are presented and compared with official data. 3. Analysis of cycling infrastructure based on information from the OpenStreetMap database: describes the methods and algorithms developed during the PhD; the results obtained by the algorithms are compared with official data. 4. Discussion: the above results are discussed and compared; in particular, cycling demand is compared with the length of the cycle network within a city. 5. Conclusions.
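A minimal sketch of the kind of extraction described, using only the standard library: parse an OpenStreetMap XML extract, keep ways tagged highway=cycleway, and sum arc lengths with the haversine formula. The file name, the single-tag filter, and the haversine approximation are simplifying assumptions, not the thesis's full algorithms:

```python
import math
import xml.etree.ElementTree as ET

def haversine(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def cycleway_length_km(osm_file):
    """Total length (km) of ways tagged highway=cycleway in an .osm extract."""
    root = ET.parse(osm_file).getroot()
    nodes = {n.get("id"): (float(n.get("lat")), float(n.get("lon")))
             for n in root.findall("node")}
    total = 0.0
    for way in root.findall("way"):
        tags = {t.get("k"): t.get("v") for t in way.findall("tag")}
        if tags.get("highway") != "cycleway":
            continue
        refs = [nd.get("ref") for nd in way.findall("nd")]
        for a, b in zip(refs, refs[1:]):
            if a in nodes and b in nodes:
                total += haversine(*nodes[a], *nodes[b])
    return total / 1000.0

# usage with a hypothetical extract: print(cycleway_length_km("bologna.osm"))
```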

Relevance:

100.00%

Abstract:

In this paper, we describe an algorithm that automatically detects and labels peaks I-VII of the normal, suprathreshold auditory brainstem response (ABR). The algorithm proceeds in three stages, with the option of a fourth: (1) all candidate peaks and troughs in the ABR waveform are identified using zero crossings of the first derivative, (2) peaks I-VII are identified from these candidate peaks based on their latency and morphology, (3) if required, peaks II and IV are identified as points of inflection using zero crossings of the second derivative, and (4) interpeak troughs are identified before peak latencies and amplitudes are measured. The performance of the algorithm was estimated on a set of 240 normal ABR waveforms recorded using a stimulus intensity of 90 dB nHL. When compared to an expert audiologist, the algorithm correctly identified the major ABR peaks (I, III and V) in 96-98% of the waveforms and the minor ABR peaks (II, IV, VI and VII) in 45-83% of waveforms. Whilst peak II was correctly identified in only 83% and peak IV in 77% of waveforms, it was shown that 5% of the peak II identifications and 31% of the peak IV identifications came as a direct result of allowing these peaks to be found as points of inflection. Copyright (C) 2005 S. Karger AG, Basel.
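A minimal NumPy sketch of stages (1) and (3) of the detection idea: candidate peaks and troughs from sign changes of the first derivative, and candidate inflection points from sign changes of the second derivative. The latency and morphology rules that actually label peaks I-VII are omitted:

```python
import numpy as np

def zero_crossings(x):
    """Indices where the sign of x changes (exact zeros treated as positive)."""
    s = np.sign(x)
    s[s == 0] = 1
    return np.where(np.diff(s) != 0)[0]

def candidate_points(waveform):
    """Candidate peaks/troughs and inflection points of an ABR waveform."""
    d1 = np.diff(waveform)                 # first derivative (stage 1)
    d2 = np.diff(waveform, n=2)            # second derivative (stage 3)
    extrema = zero_crossings(d1) + 1       # peaks and troughs
    inflections = zero_crossings(d2) + 1   # where peaks II/IV may hide as shoulders
    peaks = [i for i in extrema if waveform[i] > waveform[i - 1]]
    troughs = [i for i in extrema if waveform[i] <= waveform[i - 1]]
    return peaks, troughs, inflections

# toy waveform: two bumps, the second only a shoulder on a falling slope
t = np.linspace(0, 10, 500)
w = np.exp(-(t - 3) ** 2) + 0.4 * np.exp(-((t - 5) ** 2) / 0.2) - 0.05 * t
peaks, troughs, inflections = candidate_points(w)
print(len(peaks), "peaks,", len(troughs), "troughs,", len(inflections), "inflection points")
```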

Relevance:

100.00%

Abstract:

Background: Automated measurement of LV function could extend the clinical utility of echo to less expert readers. We sought to define normal ranges of global 2D strain (2DS) and strain rate (SR) in an international, multicenter study of healthy subjects, and to assess the determinants of variation. Methods: SR and 2DS were measured in 18 myocardial segments in both apical and short-axis views of 227 normal subjects (38% men, 48 ± 14 years) with no cardiac history, risk factors, or drug therapy. The association of age and resting hemodynamics with global strain indices was sought using multiple regression. Differences in variance were expressed as F values. Results: Baseline SBP was 127 ± 18 mmHg, pulse was 76 ± 13/min, and ejection fraction was 50 ± 20%. Although global longitudinal strain was influenced by end-systolic volume (F = 4.2, p
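As a sketch of the statistical approach (multiple regression with differences in variance expressed as F values), a partial F test in plain NumPy on synthetic data; the variables and effect sizes are placeholders, not the study's dataset:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 227                                         # cohort size from the abstract
age = rng.normal(48, 14, n)
esv = rng.normal(60, 15, n)                     # end-systolic volume, synthetic
gls = -20 + 0.03 * age + 0.05 * esv + rng.normal(0, 2, n)   # synthetic strain values

def rss(y, X):
    """Residual sum of squares of an ordinary least-squares fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.sum((y - X @ beta) ** 2)

X_full = np.column_stack([np.ones(n), age, esv])
X_restricted = np.column_stack([np.ones(n), age])          # drop end-systolic volume

rss_full, rss_restr = rss(gls, X_full), rss(gls, X_restricted)
df_num, df_den = 1, n - X_full.shape[1]
F = (rss_restr - rss_full) / df_num / (rss_full / df_den)  # partial F for the dropped term
print(f"partial F for end-systolic volume: {F:.1f}")
```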

Relevance:

70.00%

Abstract:

In this article, we show how the use of state-of-the-art methods in computer science based on machine perception and learning allows the unobtrusive capture and automated analysis of interpersonal behavior in real time (social sensing). Given the high ecological validity of the behavioral sensing, the ease of behavioral-cue extraction for large groups over long observation periods in the field, the possibility of investigating completely new research questions, and the ability to provide people with immediate feedback on behavior, social sensing will fundamentally impact psychology.

Relevance:

70.00%

Abstract:

The distributions of times to first cell division were determined for populations of Escherichia coli stationary-phase cells inoculated onto agar media. This was accomplished by automated analysis of digital images of individual cells growing on agar and calculation of the "box area ratio." Using approximately 300 cells per experiment, the mean time to first division and its standard deviation for cells grown in liquid medium at 37 °C, inoculated on agar, and incubated at 20 °C were determined as 3.0 h and 0.7 h, respectively. The distributions were observed to tail toward higher values, but no definitive model distribution was identified. Both preinoculation stress, by heating cultures at 50 °C, and postinoculation stress, by growth in the presence of higher concentrations of NaCl, increased the mean time to first division. Both stresses also resulted in an increase in the spread of the distributions that was proportional to the mean division time, the coefficient of variation being constant at approximately 0.2 in all cases. The "relative division time," which is the time to first division for an individual cell expressed in terms of the cell size doubling time, was used as a measure of the "work to be done" to prepare for cell division. Relative division times were greater for heat-stressed cells than for those growing under osmotic stress.
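As a numerical illustration of the reported statistics (mean and standard deviation of the time to first division, a coefficient of variation near 0.2, and the relative division time defined as the time to first division divided by the cell size doubling time), a short sketch with made-up per-cell lag times:

```python
import numpy as np

# ~300 synthetic, right-skewed times to first division (hours), roughly mean 3 h, CV ~0.2
lag_times_h = np.random.default_rng(1).gamma(shape=18, scale=1 / 6, size=300)
doubling_time_h = 1.5                   # hypothetical cell-size doubling time at 20 °C

mean, sd = lag_times_h.mean(), lag_times_h.std(ddof=1)
cv = sd / mean                          # reported as constant at ~0.2 in the study
relative_division_time = lag_times_h / doubling_time_h

print(f"mean = {mean:.1f} h, sd = {sd:.1f} h, CV = {cv:.2f}")
print(f"mean relative division time = {relative_division_time.mean():.1f} doublings")
```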

Relevance:

70.00%

Abstract:

Objective: In an attempt to clarify the clonality and genetic relationships involved in the tumorigenesis of uterine leiomyomas, we used a total of 43 multiple leiomyomas from 14 patients and analyzed the allelic status with 15 microsatellite markers and X chromosome inactivation analysis. Study design: We used a set of 15 microsatellite polymorphism markers, mapped to 3q, 7p, 11, and 15q, by automated analysis. X chromosome inactivation was evaluated by the methylation status of the X-linked androgen receptor gene. Results: Loss of heterozygosity analysis showed a different pattern in 7 of the 8 cases with allelic loss for at least 1 of the 15 microsatellite markers analyzed. Similar loss of heterozygosity findings at 7p22-15 were detected in 3 samples from the same patient. X chromosome inactivation analysis demonstrated the same inactivated allele in all tumors in 9 of the 12 informative patients; different inactivation patterns were observed in 3 cases. Conclusion: Our data support the concept that uterine leiomyomas are derived from a single cell but are generated independently in the uterus. The loss of heterozygosity findings at 7p22-15 are consistent with previous data suggesting the relevance of chromosomal aberrations at 7p in individual uterine leiomyomas. (C) 2005 Mosby, Inc. All rights reserved.

Relevance:

70.00%

Abstract:

Since Sharir and Pnueli, algorithms for context-sensitivity have been defined in terms of 'valid' paths in an interprocedural flow graph. The definition of valid paths requires atomic call and ret statements and encapsulated procedures. Thus, the resulting algorithms are not directly applicable when behavior similar to call and ret instructions may be realized using non-atomic statements, or when procedures do not have rigid boundaries, as with programs in low-level languages such as assembly or RTL. We present a framework for context-sensitive analysis that requires neither atomic call and ret instructions nor encapsulated procedures. The framework decouples the transfer-of-control semantics from the context-manipulation semantics of statements. A new definition of context-sensitivity, called stack contexts, is developed. A stack context, which is defined using trace semantics, is more general than Sharir and Pnueli's interprocedural-path-based calling context. An abstract-interpretation-based framework is developed to reason about stack contexts and to derive analogues of calling-context-based algorithms using stack contexts. The framework is suitable for deriving algorithms for analyzing binary programs, such as malware, that employ obfuscations with the deliberate intent of defeating automated analysis. The framework is used to create a context-sensitive version of Venable et al.'s algorithm for analyzing x86 binaries without requiring that a binary conform to a standard compilation model for maintaining procedures, calls, and returns. Experimental results show that a context-sensitive analysis using stack contexts performs just as well for programs where the use of Sharir and Pnueli's calling context produces correct approximations. However, if those programs are transformed to use call obfuscations, a context-sensitive analysis using stack contexts still provides the same correct results, without any additional overhead. © Springer Science+Business Media, LLC 2011.
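To make the motivating problem concrete, a toy sketch (not the paper's framework): the same control transfer written once with an atomic call/ret pair and once as a push of the return address followed by a jump, the obfuscation pattern that leaves a calling-context analysis keyed to atomic call instructions with nothing to match, while the run-time stack, the basis of a stack context, is identical in both runs:

```python
# Toy "assembly" interpreter. Program A uses an atomic call/ret pair; program B
# realizes the same control transfer with push + jmp, so there is no call
# instruction for a Sharir-Pnueli style calling-context to key on, yet the
# operand stack (the trace-based "stack context") evolves identically.

def run(program):
    pc, stack, out = 0, [], []
    while pc < len(program):
        op, *args = program[pc]
        if op == "call":            # atomic: push return address, then jump
            stack.append(pc + 1); pc = args[0]
        elif op == "push":          # non-atomic half of an obfuscated call
            stack.append(args[0]); pc += 1
        elif op == "jmp":           # the other half
            pc = args[0]
        elif op == "ret":           # pop whatever is on top and jump there
            pc = stack.pop()
        elif op == "print":
            out.append(args[0]); pc += 1
        elif op == "halt":
            break
    return out

proc   = [("print", "in proc"), ("ret",)]
prog_a = [("call", 3), ("print", "back"), ("halt",)] + proc               # proc at index 3
prog_b = [("push", 2), ("jmp", 4), ("print", "back"), ("halt",)] + proc   # proc at index 4

assert run(prog_a) == run(prog_b) == ["in proc", "back"]
```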

Relevance:

70.00%

Abstract:

The aim of this dissertation is the experimental characterization and quantitative description of the hybridization of complementary nucleic acid strands with surface-bound capture molecules for the development of integrated biosensors. In contrast to solution-based methods, microarray substrates allow many nucleic acid combinations to be investigated in parallel. As a biologically relevant evaluation system, the actin gene, which is universally expressed in eukaryotes, from different plant species was used. This test system makes it possible to characterize closely related plant species on the basis of small differences in the gene sequence (SNPs). Building on this well-studied model of a house-keeping gene, a comprehensive microarray system consisting of short and long oligonucleotides (with incorporated LNA molecules), cDNAs, and DNA and RNA targets was realized. With it, a test system optimized for online measurement with high signal intensities could be developed. Based on the results, the entire signal path from nucleic acid concentration to digital value was modeled. The insights into the kinetics and thermodynamics of hybridization gained from the development and the experiments are summarized in three publications that form the backbone of this dissertation. The first publication describes the improvement in reproducibility and specificity of microarray results achieved by online measurement of kinetics and thermodynamics compared with endpoint-based measurements on standard microarrays. For the evaluation of the huge amounts of data, two algorithms were developed: a reaction-kinetic modeling of the isotherms and a description of the melting transition based on Fermi-Dirac statistics. These algorithms are described in the second publication. By realizing identical sequences in the chemically different nucleic acids (DNA, RNA, and LNA), it is possible to investigate defined differences in the conformation of the ribose ring and in the C5 methyl group of the pyrimidines. The competitive interaction of these different nucleic acids of identical sequence and its effects on kinetics and thermodynamics is the subject of the third publication. Beyond the molecular biological and technological developments in the sensing of hybridization reactions of surface-bound nucleic acid molecules, the automated evaluation and modeling of the resulting data volumes, and the improved quantitative description of the kinetics and thermodynamics of these reactions that this enables, the results contribute to a better understanding of the physico-chemical structure of this most elementary biological molecule and of its still not fully understood specificity.
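The melting-transition description mentioned above is based on Fermi-Dirac statistics, under which the hybridized fraction as a function of temperature takes a sigmoidal form theta(T) = theta_max / (1 + exp((T - Tm)/w)). A minimal curve-fitting sketch with synthetic data; the algorithms in the cited publications are considerably more involved:

```python
import numpy as np
from scipy.optimize import curve_fit

def fermi_dirac(T, theta_max, Tm, w):
    """Hybridized fraction vs. temperature: a Fermi-Dirac shaped melting curve."""
    return theta_max / (1.0 + np.exp((T - Tm) / w))

# synthetic melting curve: Tm = 62 C, transition width 3 C, plus measurement noise
T = np.linspace(30, 90, 61)
signal = fermi_dirac(T, 1.0, 62.0, 3.0) + np.random.default_rng(2).normal(0, 0.02, T.size)

popt, _ = curve_fit(fermi_dirac, T, signal, p0=[1.0, 60.0, 2.0])
print("theta_max = %.2f, Tm = %.1f C, width = %.1f C" % tuple(popt))
```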

Relevance:

70.00%

Abstract:

BACKGROUND: In contrast to RIA, recently available ELISAs offer the potential for fully automated analysis of adiponectin. To date, studies reporting the diagnostic characteristics of these ELISAs and investigating the relationship between ELISA- and RIA-based methods are rare. METHODS: We therefore established and evaluated a fully automated platform (BEP 2000; Dade-Behring, Switzerland) for the determination of adiponectin levels in serum by two different ELISA methods (competitive human adiponectin ELISA and high-sensitivity human adiponectin sandwich ELISA; both Biovendor, Czech Republic). As a reference method, we also employed a human adiponectin RIA (Linco Research, USA). Samples from 150 patients routinely presenting to our cardiology unit were tested. RESULTS: ELISA measurements could be accomplished in less than 3 h, whereas the RIA required 24 h. The ELISAs were evaluated for precision, analytical sensitivity and specificity, linearity on dilution, and spiking recovery. In the investigated patients, type 2 diabetes, higher age, and male gender were significantly associated with lower serum adiponectin concentrations. Correlations between the ELISA methods and the RIA were strong (competitive ELISA, r = 0.82; sandwich ELISA, r = 0.92; both p < 0.001). However, Deming regression and Bland-Altman analysis indicated a lack of agreement among the three methods, preventing direct comparison of results. The equations of the regression lines are: competitive ELISA = 1.48 x RIA - 0.88; high-sensitivity sandwich ELISA = 0.77 x RIA + 1.01. CONCLUSIONS: Fully automated measurement of adiponectin by ELISA is feasible and substantially more rapid than RIA. The investigated ELISA test systems appear to exhibit analytical characteristics that allow clinical application, and there is a strong correlation between the ELISA methods and the RIA. These findings might promote a more widespread use of adiponectin measurements in clinical research.
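A minimal sketch of the Bland-Altman agreement analysis used alongside Deming regression, with synthetic paired measurements generated from the competitive-ELISA regression equation reported above rather than the study's data:

```python
import numpy as np

rng = np.random.default_rng(3)
ria = rng.uniform(2, 20, 150)                                # adiponectin by RIA (synthetic units)
elisa = 1.48 * ria - 0.88 + rng.normal(0, 1.5, ria.size)     # proportional bias as in the regression line

diff = elisa - ria
mean = (elisa + ria) / 2
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)                                # 95% limits of agreement

print(f"bias = {bias:.2f}, limits of agreement = [{bias - loa:.2f}, {bias + loa:.2f}]")
# a Bland-Altman plot shows diff against mean with these three horizontal reference lines
```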

Relevance:

70.00%

Abstract:

Automatic cost analysis of programs has traditionally concentrated on a reduced number of resources such as execution steps, time, or memory. However, the increasing relevance of analysis applications such as static debugging and/or certification of user-level properties (including for mobile code) makes it interesting to develop analyses for resource notions that are actually application-dependent. These may include, for example, bytes sent or received by an application, number of files left open, number of SMSs sent or received, number of accesses to a database, money spent, energy consumption, etc. We present a fully automated analysis for inferring upper bounds on the usage that a Java bytecode program makes of a set of application programmer-definable resources. In our context, a resource is defined by programmer-provided annotations which state the basic consumption that certain program elements make of that resource. From these definitions our analysis derives functions which return an upper bound on the usage that the whole program (and individual blocks) make of that resource for any given set of input data sizes. The analysis proposed is independent of the particular resource. We also present some experimental results from a prototype implementation of the approach covering a significant set of interesting resources.
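A toy sketch of the general idea rather than the paper's analysis: program elements carry programmer-provided annotations stating their basic consumption of one resource (here, bytes sent), and an upper bound on total usage is composed as a function of an input data size n. The node constructors and the example program are placeholders:

```python
# Bounds compose as: sequence -> sum, loop -> trip-count bound times body,
# branch -> max of the two sides. Every constructor returns a function of n.

def basic(cost):            return lambda n: cost
def seq(*parts):            return lambda n: sum(p(n) for p in parts)
def loop(trip_bound, body): return lambda n: trip_bound(n) * body(n)
def branch(then, els):      return lambda n: max(then(n), els(n))

# hypothetical program: a header is sent, then one record per input element,
# and on the error path a single diagnostic message
send_header = basic(64)            # annotation: 64 bytes per call
send_record = basic(128)           # annotation: 128 bytes per call
send_error  = basic(32)

program = seq(send_header,
              loop(lambda n: n, send_record),
              branch(send_error, basic(0)))

upper_bound = program              # bytes sent <= 64 + 128*n + 32
print(upper_bound(10))             # 1376
```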

Relevance:

70.00%

Abstract:

Automatic cost analysis of programs has traditionally been studied in terms of a number of concrete, predefined resources such as execution steps, time, or memory. However, the increasing relevance of analysis applications such as static debugging and/or certification of user-level properties (including for mobile code) makes it interesting to develop analyses for resource notions that are actually application-dependent. These may include, for example, bytes sent or received by an application, number of files left open, number of SMSs sent or received, number of accesses to a database, money spent, energy consumption, etc. We present a fully automated analysis for inferring upper bounds on the usage that a Java bytecode program makes of a set of application programmer-definable resources. In our context, a resource is defined by programmer-provided annotations which state the basic consumption that certain program elements make of that resource. From these definitions our analysis derives functions which return an upper bound on the usage that the whole program (and individual blocks) make of that resource for any given set of input data sizes. The analysis proposed is independent of the particular resource. We also present some experimental results from a prototype implementation of the approach covering an ample set of interesting resources.

Relevance:

70.00%

Abstract:

The recent ability to sequence whole genomes allows ready access to all genetic material. The approaches outlined here allow automated analysis of sequence data for the synthesis of optimal primers in an automated multiplex oligonucleotide synthesizer (AMOS). The efficiency is such that all ORFs of an organism can be amplified by PCR. The resulting amplicons can be used directly in the construction of DNA arrays or can be cloned for a large variety of functional analyses. These tools allow single-gene analysis to be replaced by highly efficient whole-genome analysis.
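As a small illustration of the kind of sequence analysis involved in choosing amplification primers, a sketch that scans the 5' end of an ORF for a 20-mer whose GC content and Wallace-rule melting temperature fall in a typical window; the thresholds and the example sequence are illustrative assumptions, not AMOS's actual criteria:

```python
def gc_fraction(seq):
    """Fraction of G and C bases in an uppercase DNA sequence."""
    return (seq.count("G") + seq.count("C")) / len(seq)

def wallace_tm(seq):
    """Wallace rule: Tm = 2*(A+T) + 4*(G+C), a rough estimate for short oligos."""
    at = seq.count("A") + seq.count("T")
    gc = seq.count("G") + seq.count("C")
    return 2 * at + 4 * gc

def pick_forward_primer(orf, length=20, tm_range=(55, 65), gc_range=(0.4, 0.6)):
    """First window near the ORF start meeting the Tm and GC criteria."""
    for start in range(0, min(60, len(orf) - length)):
        cand = orf[start:start + length]
        if gc_range[0] <= gc_fraction(cand) <= gc_range[1] \
                and tm_range[0] <= wallace_tm(cand) <= tm_range[1]:
            return start, cand
    return None

# hypothetical ORF fragment
orf = "ATGGCTAAAGTTCTGACCGGTGAAGATCTGCACCGTATCGCTGAAAAACTGGCT"
print(pick_forward_primer(orf))
```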