910 results for Automated quantification


Relevance: 20.00%

Publisher:

Abstract:

Current noninvasive techniques for the routine and frequent quantification of peripheral lymphedema in patients are total limb volume measurement (by water immersion or by circumferential measurements) and bioelectrical impedance analysis (BIA). However, both of these techniques require standardizing the measurement against a contralateral measurement from the unaffected limb; hence they are essentially restricted to unilateral lymphedema. This paper describes the results of a preliminary study investigating an alternative approach to the analysis of data from multiple-frequency BIA to produce an index of lymphedema without the need for normalization to another body segment. Twenty patients receiving surgical treatment for breast cancer were monitored prior to surgery and again after diagnosis with unilateral lymphedema. The data recorded were total limb volume, obtained by circumferential measurements, and BIA measurements of both limbs. From these measurements, total limb volumes and extracellular fluid volumes were calculated and expressed as ratios of the affected limb to the unaffected limb. An index of the ratio of extracellular fluid volume to intracellular fluid volume was also determined. This ECW/ICW index was calculated for both the affected and unaffected limbs at both measurement times. Results confirmed that the established techniques of total limb volume and extracellular fluid volume normalized to the unaffected contralateral limb were accurate in the detection of lymphedema (p < 10^-6). Comparison of the ECW/ICW index from the affected limb after diagnosis with that from the pre-surgery measurement revealed a significant (p < 10^-6) and considerable (75%) increase. The results of this pilot study suggest that, using multiple-frequency bioelectrical impedance analysis, an index of the ECW/ICW ratio can be obtained, and this index appears to have equal or better sensitivity than the other techniques in detecting lymphedema. More importantly, this index does not require normalization to another body segment and can be used to detect all types of peripheral edema, including both unilateral and bilateral lymphedema.
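A minimal numeric sketch of how such an index can be formed from multi-frequency BIA, assuming the common Cole-model simplification (current at zero frequency flows only through extracellular water, while at infinite frequency it flows through both compartments in parallel). The function names and all resistance values below are hypothetical illustrations, not data from the study.

```python
# Sketch of an ECW/ICW index from multi-frequency BIA, assuming the
# simplified Cole model: extracellular resistance Re = R0, and the
# intracellular path Ri satisfies 1/Rinf = 1/R0 + 1/Ri.
# All numbers below are illustrative, not from the study.

def intracellular_resistance(r0, rinf):
    """Ri from the parallel model: 1/Rinf = 1/R0 + 1/Ri."""
    if rinf >= r0:
        raise ValueError("Rinf must be smaller than R0")
    return (r0 * rinf) / (r0 - rinf)

def ecw_icw_index(r0, rinf):
    """Crude ECW/ICW volume index, assuming fluid volume is inversely
    proportional to compartment resistance (equal resistivities)."""
    ri = intracellular_resistance(r0, rinf)
    return ri / r0  # V_ecw / V_icw proportional to Ri / Re

# Illustrative limb resistances in ohms (hypothetical values):
baseline = ecw_icw_index(r0=300.0, rinf=200.0)
edema = ecw_icw_index(r0=220.0, rinf=170.0)  # ECW expansion lowers R0 more

increase = (edema - baseline) / baseline * 100
print(f"index before: {baseline:.2f}, after: {edema:.2f}, change: {increase:.0f}%")
```

Because the index is a ratio within a single limb, no contralateral normalization is needed, which is the key point of the abstract.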

Abstract:

Background: Tissue Doppler may be used to quantify regional left ventricular function but is limited by segmental variation of longitudinal velocity from base to apex and from the free wall to the septum. We sought to overcome this by developing a composite of longitudinal and radial velocities. Methods and Results: We examined 82 unselected patients undergoing a standard dobutamine echocardiogram. Longitudinal velocity was obtained in the basal and mid segments of each wall using tissue Doppler in the apical views. Radial velocities were derived in the same segments using an automated border detection system and the centerline method, with regional chords grouped according to segment location and temporally averaged. In 25 patients at low probability of coronary disease, the pattern of regional variation in longitudinal velocity (higher in the septum) was the opposite of that in radial velocity (higher in the free wall), and the combination was homogeneous. In 57 patients undergoing angiography, velocity in abnormal segments was lower than in normal segments using longitudinal (6.0 +/- 3.6 vs 9.0 +/- 2.2 cm/s, P = .01) and radial velocity (6.0 +/- 4.0 vs 8.0 +/- 3.9 cm/s, P = .02). However, the composite velocity permitted better separation of abnormal and normal segments (13.3 +/- 5.6 vs 17.5 +/- 4.2 cm/s, P = .001). There was no significant difference between the accuracy of this quantitative approach and expert visual wall motion analysis (81% vs 84%, P = .56). Conclusion: Regional variation of uni-dimensional myocardial velocities necessitates site-specific normal ranges, probably because of different fiber directions. Combined analysis of longitudinal and radial velocities allows the derivation of a composite velocity, which is homogeneous in all segments and may allow better separation of normal and abnormal myocardium.
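The abstract does not state how the two velocities are combined; the reported group means are consistent with a simple additive composite, sketched here purely as an assumption.

```python
# Hypothetical sketch of a composite regional velocity, assuming the
# composite is the sum of longitudinal and radial peak velocities for a
# segment (the abstract does not give the exact combination rule).

def composite_velocity(v_long_cm_s, v_radial_cm_s):
    return v_long_cm_s + v_radial_cm_s

# Group means from the abstract (cm/s):
normal = composite_velocity(9.0, 8.0)    # cf. reported composite 17.5 +/- 4.2
abnormal = composite_velocity(6.0, 6.0)  # cf. reported composite 13.3 +/- 5.6
print(normal, abnormal)
```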

Abstract:

We report here a validated method for the quantification of a new immunosuppressant drug, everolimus (SDZ RAD), using HPLC-tandem mass spectrometry. Whole blood samples (500 µl) were prepared by protein precipitation, followed by C-18 solid-phase extraction. Mass spectrometric detection was by selected reaction monitoring with an electrospray interface operating in positive ionization mode. The assay was linear from 0.5 to 100 µg/l (r² > 0.996, n = 9). The analytical recovery and inter-day imprecision, determined using whole blood quality control samples (n = 5) at 0.5, 1.2, 20.0, and 75.0 µg/l, were 100.3-105.4% and ≤ 7.6%, respectively. The assay had a mean relative recovery of 94.8 +/- 3.8%. Extracted samples were stable for up to 24 h. Fortified everolimus blood samples were stable at -80 °C for at least 8 months, and everolimus was found to be stable in blood when taken through at least three freeze-thaw cycles. The reported method provides accurate, precise and specific measurement of everolimus in blood over a wide analytical range and is currently supporting phase II and III clinical trials.
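The two routine validation statistics quoted above (calibration linearity as r², inter-day imprecision as CV%) can be sketched as follows; the concentrations, instrument responses and QC replicates below are invented illustrative numbers, not the study's data.

```python
# Sketch of assay-validation statistics: calibration linearity (r^2)
# and imprecision (coefficient of variation, %) from QC replicates.
# All values below are invented for illustration.
import statistics

def r_squared(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy * sxy / (sxx * syy)

def cv_percent(values):
    return statistics.stdev(values) / statistics.mean(values) * 100

conc = [0.5, 1.0, 5.0, 10.0, 25.0, 50.0, 100.0]     # calibration standards, ug/l
resp = [0.051, 0.098, 0.51, 1.02, 2.48, 5.05, 9.9]  # peak-area ratios
qc_20 = [19.8, 20.5, 21.1, 19.2, 20.7]              # replicate QC at 20 ug/l

print(f"r^2 = {r_squared(conc, resp):.4f}, CV = {cv_percent(qc_20):.1f}%")
```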

Abstract:

The refinement calculus is a well-established theory for deriving program code from specifications. Recent research has extended the theory to handle timing requirements, as well as functional ones, and we have developed an interactive programming tool based on these extensions. Through a number of case studies completed using the tool, this paper explains how the tool helps the programmer by supporting the many forms of variables needed in the theory. These include simple state variables as in the untimed calculus, trace variables that model the evolution of properties over time, auxiliary variables that exist only to support formal reasoning, subroutine parameters, and variables shared between parallel processes.

Abstract:

Aims: To quantify Listeria levels on the shell and flesh of artificially contaminated cooked prawns after peeling, and to determine the efficacy of Listeria innocua as a model for L. monocytogenes in this system. Methods and Results: L. monocytogenes and L. innocua strains were inoculated separately onto cooked black tiger prawns using two protocols (immersion, or swabbing with incubation). Prawns were peeled by two methods (gloved hand, or scalpel and forceps) and numbers of Listeria on shells, flesh and whole prawn controls were determined. Prawns were exposed to crystal violet dye to assess the penetration of liquids. Regardless of preparation method or bacterial strain, there were ca 1 log10 CFU more Listeria per shell than per peeled prawn. Dye was able to penetrate to the flesh in all cases. Conclusions: Shell-on prawns may be only slightly safer than shell-off prawns. Listeria innocua is an acceptable model for L. monocytogenes in this system. Significance and Impact of the Study: Reduced risk from L. monocytogenes on prawns can only be assured by adequate hygiene or heating.
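The "ca 1 log10 CFU more per shell" comparison is just a difference of log-transformed plate counts; a trivial sketch with hypothetical counts:

```python
# Sketch of a log10 CFU difference between two plate counts.
# The counts below are hypothetical, chosen to give a 1-log difference.
import math

def log10_difference(cfu_shell, cfu_flesh):
    return math.log10(cfu_shell) - math.log10(cfu_flesh)

diff = log10_difference(cfu_shell=5.0e5, cfu_flesh=5.0e4)
print(f"{diff:.1f} log10 CFU more on the shell")
```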

Abstract:

Many organisations need to extract useful information from huge amounts of movement data. One example is found in maritime transportation, where the automated identification of a diverse range of traffic routes is a key management issue for improving the maintenance of ports and ocean routes, and for accelerating ship traffic. This paper addresses, as a first stage, the research challenge of developing an approach for the automated identification of traffic routes based on clustering motion vectors rather than reconstructed trajectories. The immediate benefit of the proposed approach is to avoid the reconstruction of trajectories in terms of the geometric shape of the path, position in space, life span, and changes of speed, direction and other attributes over time. For clustering the moving objects, an adapted version of the Shared Nearest Neighbour algorithm is used. The motion vectors, each with a position and a direction, are analysed in order to identify clusters of vectors moving in the same direction. These clusters represent traffic routes, and the preliminary results are promising for the automated identification of traffic routes with different shapes and densities, as well as for handling noisy data.
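The clustering step can be illustrated with a toy Shared Nearest Neighbour (SNN) implementation on motion vectors. The authors' adapted algorithm and its parameters are not given in the abstract, so the neighbourhood size `k`, the `min_shared` link criterion and the `direction_weight` used to embed headings are all assumptions chosen for this small synthetic example.

```python
# Minimal SNN clustering sketch for motion vectors (x, y, angle_rad),
# embedded as (x, y, w*cos a, w*sin a) so heading influences neighbourhood.
# Parameters are assumptions for this toy data, not from the paper.
import math

def snn_clusters(vectors, k=3, min_shared=1, direction_weight=5.0):
    pts = [(x, y, direction_weight * math.cos(a), direction_weight * math.sin(a))
           for x, y, a in vectors]
    n = len(pts)

    def dist(i, j):
        return math.dist(pts[i], pts[j])

    # k nearest neighbours of each point (excluding itself)
    knn = [set(sorted(range(n), key=lambda j: dist(i, j))[1:k + 1])
           for i in range(n)]

    # link mutually-near points that share enough neighbours
    adj = [[] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if j in knn[i] and i in knn[j] and len(knn[i] & knn[j]) >= min_shared:
                adj[i].append(j)
                adj[j].append(i)

    # connected components = clusters (candidate traffic routes)
    labels, cluster = [-1] * n, 0
    for s in range(n):
        if labels[s] != -1:
            continue
        stack = [s]
        while stack:
            u = stack.pop()
            if labels[u] == -1:
                labels[u] = cluster
                stack.extend(adj[u])
        cluster += 1
    return labels

# Two synthetic "routes": eastbound near y=0, westbound near y=10
east = [(float(i), 0.0, 0.0) for i in range(5)]
west = [(float(i), 10.0, math.pi) for i in range(5)]
labels = snn_clusters(east + west)
print(labels)
```

Because similarity is defined through shared neighbours rather than raw distance, SNN-style methods can find clusters of differing density, which matches the abstract's claim about routes with different shapes and densities.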

Abstract:

Protein aggregation has become a widely accepted marker of many polyQ disorders, including Machado-Joseph disease (MJD), and is often used as a readout for disease progression and the development of therapeutic strategies. The lack of good platforms to rapidly quantify protein aggregates in a wide range of disease animal models prompted us to generate a novel image processing application that automatically identifies and quantifies the aggregates in a standardized and operator-independent manner. We propose here a novel image processing tool to quantify the protein aggregates in a Caenorhabditis elegans (C. elegans) model of MJD. Confocal microscopy images were obtained from animals of different genetic conditions. The image processing application was developed using MeVisLab as a platform to process, analyse and visualize the images obtained from those animals. All segmentation algorithms were based on intensity pixel levels. The quantification of the area or number of aggregates per total body area, as well as the number of aggregates per animal, were shown to be reliable and reproducible measures of protein aggregation in C. elegans. The results obtained were consistent with the levels of aggregation observed in the images. In conclusion, this novel image processing application allows the non-biased, reliable and high-throughput quantification of protein aggregates in a C. elegans model of MJD, which may contribute to a significant improvement in the prognosis of treatment effectiveness for this group of disorders.
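A minimal, operator-independent sketch of the kind of intensity-based segmentation the abstract describes: threshold the image, then count connected bright regions as "aggregates". The fixed threshold, 4-connectivity and the toy image are simplifying assumptions; the actual MeVisLab pipeline is not reproduced here.

```python
# Count bright connected regions ("aggregates") in a 2D intensity grid
# using a fixed intensity threshold and 4-connected flood fill.

def count_aggregates(image, threshold):
    rows, cols = len(image), len(image[0])
    mask = [[image[r][c] >= threshold for c in range(cols)] for r in range(rows)]
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                count += 1
                stack = [(r, c)]
                while stack:  # flood-fill one aggregate (4-connected)
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols and mask[y][x] and not seen[y][x]:
                        seen[y][x] = True
                        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return count

# Toy 5x6 intensity "image" with two bright blobs:
img = [
    [0, 0, 0, 0, 0, 0],
    [0, 9, 9, 0, 0, 0],
    [0, 9, 9, 0, 0, 0],
    [0, 0, 0, 0, 8, 8],
    [0, 0, 0, 0, 8, 0],
]
print(count_aggregates(img, threshold=5))  # two blobs -> 2
```

Aggregate area per total body area, as in the abstract, would follow by summing the masked pixels and dividing by the animal's segmented area.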

Abstract:

Background: Regulating mechanisms of branching morphogenesis of fetal rat lung explants have been an essential tool for molecular research. This work presents a new methodology to accurately quantify the epithelium, outer contour and peripheral airway buds of lung explants during cellular development from microscopic images. Methods: The outer contour was defined using an adaptive and multi-scale threshold algorithm whose level was automatically calculated based on an entropy maximization criterion. The inner lung epithelium was defined by a clustering procedure that groups small image regions according to the minimum description length principle and local statistical properties. Finally, the number of peripheral buds was counted as the skeleton branch ends from a skeletonized image of the inner lung epithelium. Results: The time for lung branching morphometric analysis was reduced by 98% in contrast to the manual method. The best results were obtained in the first two days of cellular development, with smaller standard deviations. Non-significant differences were found between the automatic and manual results on all culture days. Conclusions: The proposed method introduces a series of advantages related to its intuitive use and accuracy, making the technique suitable for images with different lighting characteristics and allowing a reliable comparison between different researchers.
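The entropy maximization criterion can be illustrated with a sketch in the spirit of Kapur's entropy thresholding over a grey-level histogram; the authors' exact adaptive, multi-scale variant is not specified in the abstract, and the toy histogram below is invented.

```python
# Entropy-maximization threshold sketch (Kapur-style): pick the grey
# level that maximizes the summed entropies of the two classes it creates.
import math

def entropy_threshold(histogram):
    total = sum(histogram)
    p = [h / total for h in histogram]
    best_t, best_h = 0, float("-inf")
    for t in range(1, len(p)):
        w0 = sum(p[:t])
        w1 = 1.0 - w0
        if w0 <= 0 or w1 <= 0:
            continue
        # entropy of background (below t) plus foreground (t and above)
        h0 = -sum(pi / w0 * math.log(pi / w0) for pi in p[:t] if pi > 0)
        h1 = -sum(pi / w1 * math.log(pi / w1) for pi in p[t:] if pi > 0)
        if h0 + h1 > best_h:
            best_t, best_h = t, h0 + h1
    return best_t

# Bimodal toy histogram over 8 grey levels: dark background, bright tissue
hist = [40, 35, 5, 1, 1, 6, 30, 25]
t = entropy_threshold(hist)
print("threshold level:", t)
```

On a bimodal histogram like this one, the maximum-entropy split falls in the valley between the two modes, which is what makes the criterion usable for automatic contour extraction.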


Abstract:

With the purpose of lowering costs and rendering the demanded information available to users with no access to the internet, service companies have adopted automated interaction technologies in their call centers, which may or may not meet the expectations of users. Based on different areas of knowledge (man-machine interaction, consumer behavior and use of IT), 13 propositions are raised and a study is carried out in three parts: a focus group, a field study with users, and interviews with experts. Eleven automated service characteristics that help explain user satisfaction are listed, a preferences model is proposed, and evidence for or against each of the 13 propositions is brought in. Using balanced scorecard concepts, a managerial assessment model is proposed for the use of automated call center technology. In future work, the propositions may become verifiable hypotheses through conclusive empirical research.

Abstract:

INTRODUCTION: The correct identification of the underlying cause of death and its precise assignment to a code from the International Classification of Diseases are important issues in achieving accurate and universally comparable mortality statistics. These factors, among others, led to the development of computer software programs to automatically identify the underlying cause of death. OBJECTIVE: This work was conceived to compare the underlying causes of death processed respectively by the Automated Classification of Medical Entities (ACME) and the "Sistema de Seleção de Causa Básica de Morte" (SCB) programs. MATERIAL AND METHOD: The comparative evaluation of the underlying causes of death processed by the ACME and SCB systems was performed using the input data file for the ACME system, which included deaths that occurred in the State of S. Paulo from June to December 1993, totalling 129,104 records of the corresponding death certificates. The differences between underlying causes selected by the ACME and SCB systems verified in the month of June, when considered as SCB errors, were used to correct and improve the SCB processing logic and its decision tables. RESULTS: The processing of the underlying causes of death by the ACME and SCB systems resulted in 3,278 differences, which were analysed and ascribed to unanswered dialogue boxes during processing; to deaths due to human immunodeficiency virus [HIV] disease, for which there was no specific provision in either of the systems; to coding and/or keying errors; and to actual problems. The detailed analysis of the latter disclosed that the majority of the underlying causes of death processed by the SCB system were correct, that different interpretations were given to the mortality coding rules by each system, that some particular problems could not be explained with the available documentation, and that a smaller proportion of problems were identified as SCB errors. CONCLUSION: These results, disclosing a very low and insignificant number of actual problems, warrant the use of this version of the SCB system for the Ninth Revision of the International Classification of Diseases and assure the continuity of the work being undertaken for the Tenth Revision version.
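The core of such a comparison can be sketched as a record-by-record tally of the codes each system selected; the ICD-9 codes below are invented examples, not data from the study.

```python
# Tally records where two automated systems selected different
# underlying-cause codes. The code lists are invented illustrations.

def compare_selections(acme_codes, scb_codes):
    assert len(acme_codes) == len(scb_codes)
    return [(i, a, s)
            for i, (a, s) in enumerate(zip(acme_codes, scb_codes))
            if a != s]

acme = ["410.9", "162.9", "042", "250.0", "486"]
scb  = ["410.9", "162.9", "042", "250.1", "486"]
diffs = compare_selections(acme, scb)
print(f"{len(diffs)} difference(s): {diffs}")
```

Each flagged record would then be reviewed manually, as in the study, to decide whether the disagreement reflects a processing error, a rule-interpretation difference, or a data problem.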