918 results for STANDARD AUTOMATED PERIMETRY


Relevance: 30.00%

Abstract:

Aim: To evaluate the effect of cataract surgery on frequency doubling technology (FDT) perimetry in patients with coexisting cataract and glaucoma. Methods: In this consecutive prospective cohort study, 27 patients with open-angle glaucoma scheduled for cataract extraction alone or combined with trabeculectomy were enrolled. All patients underwent FDT threshold C-20 visual fields within 3 months before and 3 months after surgery. Changes in mean deviation (MD) and pattern standard deviation (PSD) were evaluated. Changes in best-corrected logMAR visual acuity (VA), intraocular pressure (IOP), and number of glaucoma medications were also studied. Results: 22 patients completed the study. VA improved after surgery, from 0.47 (SD 0.19) to 0.12 (0.17) (p

Relevance: 30.00%

Abstract:

This paper presents an automated design framework for the development of individual part forming tools for a composite stiffener. The framework uses parametrically developed design geometries for both the part and its layup tool. The framework has been developed with a functioning user interface through which part/tool combinations are passed to a virtual environment for utility-based assessment of their features and assemblability characteristics. The work demonstrates clear benefits in process design methods, with conventional design timelines reduced from hours and days to minutes and seconds. The methods developed here were able to produce a digital mock-up of a component with its associated layup tool in less than 3 minutes. The virtual environment presenting the design to the designer for interactive assembly planning was generated in 20 seconds. Challenges still exist in determining the level of reality required to provide an effective learning environment in the virtual world. Full representation of physical phenomena such as gravity, part clashes and standard build functions requires further work to model real physical behaviour more accurately.

Relevance: 30.00%

Abstract:

This paper describes the development and evaluation of a sequential injection method to automate the determination of methyl parathion by square-wave adsorptive cathodic stripping voltammetry, exploiting the concept of monosegmented flow analysis to perform in-line sample conditioning and standard addition. Accumulation and stripping steps are made in the sample medium conditioned with 40 mmol L⁻¹ Britton-Robinson buffer (pH 10) in 0.25 mol L⁻¹ NaNO₃. The homogenized mixture is injected at a flow rate of 10 µL s⁻¹ toward the flow cell, which is adapted to the capillary of a hanging drop mercury electrode. After a suitable deposition time, the flow is stopped and the potential is scanned from -0.3 to -1.0 V versus Ag/AgCl at a frequency of 250 Hz and a pulse height of 25 mV. A linear dynamic range is observed for methyl parathion concentrations between 0.010 and 0.50 mg L⁻¹, with detection and quantification limits of 2 and 7 µg L⁻¹, respectively. The sampling throughput is 25 h⁻¹ if the in-line standard addition and sample conditioning protocols are followed, but this frequency can be increased up to 61 h⁻¹ if the sample is conditioned off-line and quantified using an external calibration curve. The method was applied to the determination of methyl parathion in spiked water samples, and the accuracy was evaluated either by comparison with high-performance liquid chromatography with UV detection or by recovery percentages. Although no evidence of statistically significant differences was observed between the expected and obtained concentrations, because of the susceptibility of the method to interference by other pesticides (e.g., parathion, dichlorvos) and natural organic matter (e.g., fulvic and humic acids), isolation of the analyte may be required when more complex sample matrices are encountered. (C) 2007 Elsevier B.V. All rights reserved.
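
To illustrate the standard-addition step in general terms: the instrument response is regressed against the concentration of added standard, and the magnitude of the x-intercept gives the estimated sample concentration. A minimal sketch in Python with invented peak currents and spike levels (none of these numbers come from the paper):

import numpy as np

# Illustrative standard-addition evaluation (values are made up, not from the paper).
# Peak currents measured for the sample plus increasing spikes of methyl parathion standard.
added_conc = np.array([0.0, 0.05, 0.10, 0.15])     # mg L^-1 of added standard
peak_current = np.array([0.42, 0.63, 0.84, 1.05])  # arbitrary current units

# Fit signal = slope * (C_sample + C_added); the x-intercept magnitude estimates C_sample.
slope, intercept = np.polyfit(added_conc, peak_current, 1)
c_sample = intercept / slope
print(f"Estimated methyl parathion concentration: {c_sample:.3f} mg L^-1")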

Relevance: 30.00%

Abstract:

Background: The Test-mate kit determines acetylcholinesterase (AChE, EC 3.1.1.7) and the hemoglobin content of a drop of blood, displaying enzyme activities normalized to 25 °C. Previous models produced inconsistent results at different temperatures. This report focuses on the current model, ChE 400, and two instruments of a previous OP model. Methods: AChE activities were determined by the Ellman assay, using the three kits and a 96-well microplate reader. Temperatures ranged from 10 to 37 °C. Fetal bovine serum was the source of AChE. Results: Normalized activities decreased below 20 °C in the ChE model and below 25 °C in the OP models. Activities of the same serum sample differed between the three Test-mate kits, ranging from 1.03 to 1.49 µmol/min/mL. Percent errors were greater than with the microplate reader at all temperatures. Conclusions: Neither we nor the manufacturer recommend the current Test-mate model for fieldwork. Nevertheless, there have been field measurements with Test-mate kits, and we recommend that an enzyme activity standard be run in parallel with their use. (C) 2002 Wiley-Liss, Inc.
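
For context, the Ellman assay converts the rate of absorbance increase at 412 nm into enzyme activity using the TNB extinction coefficient and the optical path length. A minimal sketch under those standard assumptions; the extinction coefficient, volumes and rate below are illustrative, not values from this report (microplate path lengths in particular depend on fill volume):

# Illustrative Ellman-assay activity calculation (numbers are examples, not from the report).
EXT_COEFF_TNB = 13_600   # M^-1 cm^-1, a commonly used value for TNB at 412 nm
PATH_LENGTH_CM = 1.0     # cuvette path length; microplate wells may differ

def ache_activity(delta_a_per_min, assay_volume_ml, sample_volume_ml,
                  ext_coeff=EXT_COEFF_TNB, path_cm=PATH_LENGTH_CM):
    """Return AChE activity in micromol of substrate hydrolysed per minute per mL of sample."""
    rate_molar_per_min = delta_a_per_min / (ext_coeff * path_cm)            # mol L^-1 min^-1 of TNB
    micromol_per_min = rate_molar_per_min * (assay_volume_ml / 1000) * 1e6  # in the whole assay
    return micromol_per_min / sample_volume_ml

# Example: 0.30 A/min in a 3.0 mL assay containing 0.02 mL of diluted serum (~3.3 umol/min/mL).
print(round(ache_activity(0.30, 3.0, 0.02), 2), "umol/min/mL")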

Relevance: 30.00%

Abstract:

Crotamine is one of four major components of the venom of the South American rattlesnake Crotalus durissus terrificus. Similar to its counterparts in the family of the myotoxins, it induces myonecrosis of skeletal muscle cells. This paper describes a new NMR structure determination of crotamine in aqueous solution at pH 5.8 and 20 °C, using standard homonuclear ¹H NMR spectroscopy at 900 MHz and the automated structure calculation software ATNOS/CANDID/DYANA. The automatic NOESY spectral analysis included the identification of the most likely combination of the six cysteines into three disulfide bonds, i.e. Cys4-Cys36, Cys11-Cys30 and Cys18-Cys37; thereby a generally applicable new computational protocol is introduced to determine unknown disulfide bond connectivities in globular proteins. A previous NMR structure determination was thus confirmed and the structure refined. Crotamine contains an alpha-helix comprising residues 1-7 and a two-stranded anti-parallel beta-sheet comprising residues 9-13 and 34-38 as the only regular secondary structures. These are connected with each other and with the remainder of the polypeptide chain by the three disulfide bonds, which also form part of a central hydrophobic core. A single conformation was observed, with Pro13 and Pro21 in the trans form and Pro20 in the cis form. The global fold and the cysteine-pairing pattern of crotamine are similar to the beta-defensin fold, although the two proteins have low sequence homology and display different biological activities. (c) 2005 Elsevier Ltd. All rights reserved.
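
The combinatorial core of such a connectivity assignment is small: six cysteines can be paired into three disulfide bonds in only 15 ways, so every pairing can be scored against distance estimates taken from the NOESY-based structure calculations. A sketch of that enumeration; the distance table is hypothetical and merely stands in for whatever constraint-derived measure the real protocol uses:

# Sketch of the combinatorial step: enumerate every way of pairing six cysteines into
# three disulfide bonds and score each pairing against (hypothetical) S(gamma)-S(gamma)
# distance estimates in angstroms.
CYSTEINES = [4, 11, 18, 30, 36, 37]          # residue numbers in crotamine

def pairings(residues):
    """Yield all complete pairings (perfect matchings) of an even-sized residue list."""
    if not residues:
        yield []
        return
    first, rest = residues[0], residues[1:]
    for i, partner in enumerate(rest):
        for sub in pairings(rest[:i] + rest[i + 1:]):
            yield [(first, partner)] + sub

# Hypothetical distances for illustration only; a real protocol would derive these from
# NOESY-based structure calculations (e.g. ATNOS/CANDID/DYANA bundles).
distance = {
    (4, 36): 2.1, (11, 30): 2.0, (18, 37): 2.2,   # short: compatible with S-S bonds
    (4, 11): 6.5, (4, 18): 7.0, (4, 30): 8.1, (4, 37): 5.9,
    (11, 18): 6.2, (11, 36): 7.3, (11, 37): 6.8,
    (18, 30): 5.5, (18, 36): 6.9, (30, 36): 7.7, (30, 37): 6.1, (36, 37): 4.8,
}

def score(pairing):
    """Sum of distances; the smallest total is the most likely connectivity."""
    return sum(distance[tuple(sorted(pair))] for pair in pairing)

best = min(pairings(CYSTEINES), key=score)
print("Most likely connectivity:", best)   # expected: [(4, 36), (11, 30), (18, 37)]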

Relevance: 30.00%

Abstract:

Insulin resistance is a common risk factor in chronic kidney disease patients, contributing to the high cardiovascular burden even in the absence of diabetes. Glucose-based peritoneal dialysis (PD) solutions are thought to intensify insulin resistance because of the continuous glucose absorption from the peritoneal cavity. The aim of our study was to analyse the effect of substituting icodextrin for glucose on insulin resistance in non-diabetic PD patients in a multicenter randomized clinical trial. This was a multicenter, open-label study with balanced randomization (1:1) and two parallel groups. The inclusion criteria were non-diabetic adult patients on automated peritoneal dialysis (APD) for at least 3 months prior to randomization. Patients assigned to the intervention group were treated with 2 L of icodextrin 7.5%, and the control group with glucose 2.5%, during the long dwell; at night in the cycler, both groups received only standard glucose-based PD solution. The primary end-point was the change in insulin resistance measured by the homeostatic model assessment (HOMA) index at 90 days. Sixty patients were included in the intervention (n = 33) or the control (n = 27) group. There was no difference between groups at baseline. After adjustment for pre-intervention HOMA index levels, the group treated with icodextrin had lower post-intervention levels at 90 days both in the intention-to-treat analysis [1.49 (95% CI: 1.23-1.74) versus 1.89 (95% CI: 1.62-2.17)] (F = 4.643, P = 0.03, partial η² = 0.078) and in the as-treated analysis [1.47 (95% CI: 1.01-1.84) versus 2.18 (95% CI: 1.81-2.55)] (F = 7.488, P = 0.01, partial η² = 0.195). Substituting icodextrin for glucose in the long dwell improved insulin resistance measured by the HOMA index in non-diabetic APD patients.
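
For reference, the HOMA index used as the primary end-point is computed directly from fasting glucose and insulin. A minimal sketch with the standard 22.5 normalization constant (for glucose in mmol/L); the example values are illustrative, not patient data:

# Minimal sketch of the HOMA insulin-resistance index; inputs below are illustrative.
def homa_ir(fasting_glucose_mmol_l: float, fasting_insulin_uU_ml: float) -> float:
    """HOMA-IR = fasting glucose (mmol/L) x fasting insulin (uU/mL) / 22.5."""
    return fasting_glucose_mmol_l * fasting_insulin_uU_ml / 22.5

# Example: glucose 5.0 mmol/L (90 mg/dL) and insulin 8 uU/mL give HOMA-IR of about 1.78.
print(round(homa_ir(5.0, 8.0), 2))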

Relevance: 30.00%

Abstract:

Purpose: Automated weaning modes are available in some mechanical ventilators, but no studies had compared them until now. We compared the performance of 3 automated modes under standard and challenging situations. Methods: We used a lung simulator to compare 3 automated modes, adaptive support ventilation (ASV), mandatory rate ventilation (MRV), and Smartcare, in 6 situations: weaning success; weaning failure; weaning success with extreme anxiety; weaning success with Cheyne-Stokes respiration; weaning success with irregular breathing; and weaning failure with ineffective efforts. Results: The 3 modes correctly recognized the situations of weaning success and failure, even when anxiety or irregular breathing was present, but incorrectly recognized weaning success with Cheyne-Stokes respiration. MRV incorrectly recognized weaning failure with ineffective efforts. Time to pressure support (PS) stabilization was shorter for ASV (1-2 minutes for all situations) and MRV (1-7 minutes) than for Smartcare (8-78 minutes). ASV had higher rates of PS oscillations per 5 minutes (4-15) compared with Smartcare (0-1) and MRV (0-12), except when extreme anxiety was present. Conclusions: Smartcare, ASV, and MRV were equally able to recognize weaning success and failure, despite the presence of anxiety or irregular breathing, but performed incorrectly in the presence of Cheyne-Stokes respiration. PS behavior over time differs among modes, with ASV showing larger and more frequent PS oscillations. Clinical studies are needed to confirm our results. (C) 2012 Elsevier Inc. All rights reserved.

Relevance: 30.00%

Abstract:

PURPOSE: To investigate the possible effect of aspherical or yellow-tinted intraocular lenses (IOLs) on contrast sensitivity and blue-on-yellow perimetry. METHODS: This prospective randomized bilateral double-masked clinical study included 52 patients with visually significant bilateral cataracts divided into two groups: 25 patients (50 eyes) received an aspherical intraocular lens in one eye and a spherical intraocular lens in the fellow eye; and 27 patients (54 eyes) received an ultraviolet and blue-light filter (yellow-tinted) IOL in one eye and an acrylic ultraviolet-filter IOL in the fellow eye. The primary outcome measures were contrast sensitivity and blue-on-yellow perimetry values (mean deviation [MD] and pattern standard deviation [PSD]) investigated two years after surgery. The results were compared intra-individually. RESULTS: There was a statistically significant between-group (aspherical versus spherical intraocular lens) difference in contrast sensitivity under photopic conditions at 12 cycles per degree and under mesopic conditions at all spatial frequencies. There were no significant between-group differences (yellow-tinted versus clear intraocular lens) under photopic or mesopic conditions. There was no statistically significant difference among the intraocular lenses in MD or PSD. CONCLUSION: Contrast sensitivity was better under mesopic conditions with the aspherical intraocular lens. Blue-on-yellow perimetry did not appear to be affected by aspherical or yellow-tinted intraocular lenses. Further studies with a larger sample should be carried out to confirm or refute these hypotheses.

Relevance: 30.00%

Abstract:

Myocardial perfusion quantification by means of contrast-enhanced cardiac magnetic resonance images relies on time-consuming frame-by-frame manual tracing of regions of interest. In this thesis, a novel automated technique for myocardial segmentation and non-rigid registration as a basis for perfusion quantification is presented. The proposed technique is based on three steps: reference frame selection, myocardial segmentation and non-rigid registration. In the first step, the reference frame in which both endo- and epicardial segmentation will be performed is chosen. Endocardial segmentation is achieved by means of a statistical region-based level-set technique followed by a curvature-based regularization motion. Epicardial segmentation is achieved by means of an edge-based level-set technique, again followed by a regularization motion. To take into account the changes in position, size and shape of the myocardium throughout the sequence due to out-of-plane respiratory motion, a non-rigid registration algorithm is required. The proposed non-rigid registration scheme consists of a novel multiscale extension of the normalized cross-correlation algorithm in combination with level-set methods. The myocardium is then divided into standard segments. Contrast enhancement curves are computed by measuring the mean pixel intensity of each segment over time, and perfusion indices are extracted from each curve. The overall approach has been tested on synthetic and real datasets. For validation purposes, the sequences were manually traced by an experienced interpreter, and contrast enhancement curves as well as perfusion indices were computed. Comparisons between automatically extracted and manually obtained contours and enhancement curves showed high inter-technique agreement. Comparisons of perfusion indices computed using both approaches against quantitative coronary angiography and visual interpretation demonstrated that the two techniques have similar diagnostic accuracy. In conclusion, the proposed technique allows fast, automated and accurate measurement of intra-myocardial contrast dynamics, and may thus address the strong clinical need for quantitative evaluation of myocardial perfusion.
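
The similarity measure underlying that registration scheme, normalized cross-correlation, can be sketched in a few lines. The patch size and test data below are illustrative only; the thesis combines this measure with a multiscale scheme and level-set methods that are not reproduced here:

import numpy as np

# Sketch of the normalized cross-correlation (NCC) similarity measure used by the
# registration step; arrays and window size are illustrative.
def ncc(patch_a: np.ndarray, patch_b: np.ndarray) -> float:
    """Normalized cross-correlation of two equally sized image patches (range [-1, 1])."""
    a = patch_a - patch_a.mean()
    b = patch_b - patch_b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

# Example: a patch compared with a brightness-scaled copy of itself still scores ~1,
# which is why NCC is attractive when contrast enhancement changes intensities over time.
rng = np.random.default_rng(0)
patch = rng.random((16, 16))
print(round(ncc(patch, 0.5 * patch + 0.2), 3))   # ~1.0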

Relevance: 30.00%

Abstract:

OBJECTIVES: To determine the accuracy of automated vessel-segmentation software for vessel-diameter measurements based on three-dimensional contrast-enhanced magnetic resonance angiography (3D-MRA). METHOD: In 10 patients with high-grade carotid stenosis, automated measurements of both carotid arteries were obtained with 3D-MRA by two independent investigators and compared with manual measurements obtained by digital subtraction angiography (DSA) and 2D maximum-intensity projection (2D-MIP) based on MRA and duplex ultrasonography (US). In 42 patients undergoing carotid endarterectomy (CEA), intraoperative measurements (IOP) were compared with postoperative 3D-MRA and US. RESULTS: Mean interoperator variability was 8% for measurements by DSA and 11% by 2D-MIP, but there was no interoperator variability with the automated 3D-MRA analysis. Good correlations were found between DSA (standard of reference), manual 2D-MIP (rP=0.6) and automated 3D-MRA (rP=0.8). Excellent correlations were found between IOP, 3D-MRA (rP=0.93) and US (rP=0.83). CONCLUSION: Automated 3D-MRA-based vessel segmentation and quantification result in accurate measurements of extracerebral-vessel dimensions.

Relevance: 30.00%

Abstract:

OBJECT: Preliminary experience with the C-Port Flex-A Anastomosis System (Cardica, Inc.) to enable rapid automated anastomosis has been reported in coronary artery bypass surgery. The goal of the current study was to define the feasibility and safety of this method for high-flow extracranial-intracranial (EC-IC) bypass surgery in a clinical series. METHODS: In a prospective study design, patients with symptomatic carotid artery (CA) occlusion were selected for C-Port-assisted high-flow EC-IC bypass surgery if they met the following criteria: 1) transient or moderate permanent symptoms of focal ischemia; 2) CA occlusion; 3) hemodynamic instability; and 4) provision of informed consent. Bypasses were done using a radial artery graft that was proximally anastomosed to the superficial temporal artery trunk, the cervical external, or common CA. All distal cerebral anastomoses were performed on M2 branches using the C-Port Flex-A system. RESULTS: Within 6 months, 10 patients were enrolled in the study. The distal automated anastomosis could be accomplished in all patients; the median temporary occlusion time was 16.6 ± 3.4 minutes. Intraoperative digital subtraction angiography (DSA) confirmed good bypass function in 9 patients, and in 1 the anastomosis was classified as fair. There was 1 major perioperative complication that consisted of the creation of a pseudoaneurysm due to a hardware problem. In all but 1 case the bypass was shown to be patent on DSA after 7 days; furthermore, in 1 patient a late occlusion developed due to vasospasm after a Sylvian hemorrhage. One-week follow-up DSA revealed transient asymptomatic extracranial spasm of the donor artery and the radial artery graft in 1 case. Two patients developed a limited zone of infarction on CT scanning during the follow-up course. CONCLUSIONS: In patients with symptomatic CA occlusion, C-Port Flex-A-assisted high-flow EC-IC bypass surgery is a technically feasible procedure. The system needs further modification to achieve a faster and safer anastomosis to enable a conclusive comparison with standard and laser-assisted methods for high-flow bypass surgery.

Relevance: 30.00%

Abstract:

BACKGROUND: Multiple breath washout (MBW) derived Scond is an established index of ventilation inhomogeneity. Time-consuming post hoc calculation of the expirogram's slope of the alveolar phase III (SIII) and the lack of available software have hampered widespread application of Scond. METHODS: Seventy-two school-aged children (45 with cystic fibrosis; CF) performed 3 nitrogen MBW tests. We tested a new automated algorithm for Scond analysis (Scond_auto), which comprised breath selection for SIII detection, calculation, and reporting of test quality. We compared Scond_auto to (i) standard Scond analysis (Scond_manual) with manual breath selection and (ii) pragmatic Scond analysis including all breaths (Scond_all). Primary outcomes were success rate, agreement between the different Scond protocols, and Scond fitting quality (linear regression R²). RESULTS: Average Scond_auto (0.06 for CF and 0.01 for controls) was not different from Scond_manual (0.06 for CF and 0.01 for controls) and showed comparable fitting quality (R² 0.53 for CF and 0.13 for controls vs. R² 0.54 for CF and 0.13 for controls). Scond_all was similar in CF and controls, but with inferior fitting quality compared to Scond_auto and Scond_manual. CONCLUSIONS: Automated Scond calculation is feasible and produces robust results comparable to the standard manual Scond calculation. The algorithm provides a valid, fast and objective tool for regular use, even in children. Pediatr Pulmonol. © 2014 Wiley Periodicals, Inc.
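
As a rough illustration of the regression behind Scond: in the usual analysis, Scond is taken as the slope of the normalized phase III slopes against lung turnover for breaths within roughly 1.5 to 6 turnovers. A minimal sketch with invented breath data (not the algorithm or data of this study):

import numpy as np

# Sketch of the Scond regression step: fit normalized phase III slopes (SnIII) against
# lung turnover (TO) for breaths inside the usual selection window. Data are invented.
turnover = np.array([0.8, 1.6, 2.1, 2.9, 3.6, 4.4, 5.2, 5.9, 6.5])
sniii    = np.array([0.04, 0.09, 0.12, 0.16, 0.21, 0.25, 0.29, 0.33, 0.35])  # L^-1

mask = (turnover >= 1.5) & (turnover <= 6.0)          # breath selection window
slope, intercept = np.polyfit(turnover[mask], sniii[mask], 1)
residuals = sniii[mask] - (slope * turnover[mask] + intercept)
r_squared = 1 - residuals.var() / sniii[mask].var()

print(f"Scond ~ {slope:.3f} L^-1, fit R^2 = {r_squared:.2f}")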

Relevance: 30.00%

Abstract:

A composite section, which reconstructs a continuous stratigraphic record from cores of multiple nearby holes, and its associated composite depth scale are important tools for analyzing sediment recovered from a drilling site. However, the standard technique for creating composite depth scales on drilling cruises does not correct for depth distortion within each core. Additionally, the splicing technique used to create composite sections often results in a 10-15% offset between composite depths and measured drill depths. We present a new automated compositing technique that better aligns stratigraphy across holes, corrects depth offsets, and could be performed aboard ship. By analyzing 618 cores from seven Ocean Drilling Program (ODP) sites, we estimate that ~80% of the depth offset in traditional composite depth scales results from core extension during drilling and extraction. Average rates of extension are 12.4 ± 1.5% for calcareous and siliceous cores from ODP Leg 138 and 8.1 ± 1.1% for calcareous and clay-rich cores from ODP Leg 154. Also, average extension decreases as a function of depth in the sediment column, suggesting that elastic rebound is not the dominant extension mechanism.
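
The core-to-core alignment idea behind a composite section can be sketched as a cross-correlation problem: find the depth shift that best lines up a physical-property log from one hole with the corresponding log from a neighbouring hole. The signal, sampling interval and offset below are synthetic, and the published technique is considerably more elaborate:

import numpy as np

# Sketch of aligning logs from two holes by cross-correlation; data are synthetic.
depth_step = 0.05                                   # m between measurements
rng = np.random.default_rng(1)
reference = np.sin(np.linspace(0, 12, 400)) + 0.1 * rng.standard_normal(400)
shifted = np.roll(reference, 30)                    # same stratigraphy offset by 30 samples

# Cross-correlate and take the lag with the highest correlation as the depth offset.
lags = np.arange(-60, 61)
scores = [np.corrcoef(reference, np.roll(shifted, lag))[0, 1] for lag in lags]
best_lag = lags[int(np.argmax(scores))]
print(f"Estimated offset: {best_lag * depth_step:.2f} m")   # -1.50 m for this example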

Relevance: 30.00%

Abstract:

Context: This paper addresses one of the major end-user development (EUD) challenges, namely, how to pack today's EUD support tools with composable elements. This would give end users better access to more components which they can use to build a solution tailored to their own needs. The success of later end-user software engineering (EUSE) activities largely depends on how many components each tool has and how adaptable the components are to multiple problem domains. Objective: A system for automatically adapting heterogeneous components to a common development environment would offer a sizeable saving of time and resources within the EUD support tool construction process. This paper presents an automated adaptation system for transforming EUD components to a standard format. Method: The system is based on the use of description logic. Based on a generic UML2 data model, the description logic is able to check whether an end-user component can be transformed to this modeling language through subsumption or as an instance of the UML2 model. In addition, it automatically finds a consistent, non-ambiguous and finite set of XSLT mappings that prepare the data so the component can be used as part of a tool conforming to the target UML2 component model. Results: The proposed system has been successfully applied to components from four prominent EUD tools. These components were automatically converted to a standard format. In order to validate the proposed system, rich internet applications (RIAs) used as an operational support system for operators at a large services company were developed using automatically adapted standard-format components. These RIAs would be impossible to develop using each EUD tool separately. Conclusion: The positive results of applying our system for automatically adapting components from current tool catalogues are indicative of the system's effectiveness. Use of this system could foster the growth of web EUD component catalogues, leveraging a vast ecosystem of user-centred SaaS to further current EUSE trends.
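
To make the XSLT-mapping step concrete, the sketch below applies a toy stylesheet with lxml in Python; the widget format and the UML2-like target element are invented for illustration and are not the schemas or mappings generated by the actual system:

from lxml import etree

# Hypothetical end-user component exported by an EUD tool (structure invented for illustration).
component = etree.XML("""
<widget name="OrderForm">
  <property name="title" type="string"/>
  <event name="onSubmit"/>
</widget>
""")

# Minimal XSLT mapping that rewrites the component into a UML2-like packagedElement.
stylesheet = etree.XML("""
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/widget">
    <packagedElement xmlns:xmi="http://www.omg.org/XMI"
                     xmi:type="uml:Component" name="{@name}">
      <xsl:for-each select="property">
        <ownedAttribute name="{@name}" type="{@type}"/>
      </xsl:for-each>
      <xsl:for-each select="event">
        <ownedOperation name="{@name}"/>
      </xsl:for-each>
    </packagedElement>
  </xsl:template>
</xsl:stylesheet>
""")

transform = etree.XSLT(stylesheet)
print(etree.tostring(transform(component), pretty_print=True).decode())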

Relevance: 30.00%

Abstract:

This study was carried out to detect differences in locomotion and feeding behavior between lame (group L; n = 41; gait score ≥ 2.5) and non-lame (group C; n = 12; gait score ≤ 2) multiparous Holstein cows in a cross-sectional study design. A model for automatic lameness detection was created using data from accelerometers attached to the hind limbs and noseband sensors attached to the head. Each cow's gait was videotaped and scored on a 5-point scale before and after a period of 3 consecutive days of behavioral data recording. The mean value from 3 independent, experienced observers was taken as the definitive gait score and considered the gold standard. For the statistical analysis, data from the noseband sensor and from one of the two accelerometers per cow (randomly selected), recorded on 2 of the 3 days (randomly selected), were used. For comparisons between group L and group C, the t-test, the Aspin-Welch test and the Wilcoxon test were used. The sensitivity and specificity of lameness detection were determined with logistic regression and ROC analysis. Compared with group C, group L had significantly lower eating and ruminating times, fewer eating chews, ruminating chews and ruminating boluses, longer lying time and lying bout duration, lower standing time, fewer standing and walking bouts, and fewer, slower and shorter strides with a lower walking speed. The model considering the number of standing bouts and walking speed was the best predictor of cows being lame, with a sensitivity of 90.2% and a specificity of 91.7%. The sensitivity and specificity of the lameness detection model were considered very high, even without the use of halter data. It was concluded that, under the conditions of the study farm, accelerometer data were suitable for accurately distinguishing between lame and non-lame dairy cows, even in cases of slight lameness with a gait score of 2.5.
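
The final detection model (number of standing bouts plus walking speed, evaluated with logistic regression and ROC analysis) can be illustrated with a short sketch; the simulated cows and effect sizes below are placeholders, not the study's measurements:

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, roc_curve

# Sketch of the two-predictor detection model described above; the data are simulated.
rng = np.random.default_rng(42)
n = 200
lame = rng.integers(0, 2, n)                            # 1 = lame, 0 = sound
standing_bouts = rng.normal(14 - 4 * lame, 2.5)         # lame cows: fewer standing bouts
walking_speed = rng.normal(1.3 - 0.35 * lame, 0.15)     # lame cows: slower (m/s)
X = np.column_stack([standing_bouts, walking_speed])

model = LogisticRegression().fit(X, lame)
probs = model.predict_proba(X)[:, 1]
fpr, tpr, thresholds = roc_curve(lame, probs)

# Pick the threshold closest to the top-left corner and report sensitivity/specificity.
best = np.argmax(tpr - fpr)
print(f"AUC = {roc_auc_score(lame, probs):.2f}, "
      f"sensitivity = {tpr[best]:.2f}, specificity = {1 - fpr[best]:.2f}")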