973 results for STANDARD AUTOMATED PERIMETRY


Relevance: 30.00%

Abstract:

BACKGROUND: Multiple breath washout (MBW) derived Scond is an established index of ventilation inhomogeneity. Time-consuming post hoc calculation of the expirogram's slope of alveolar phase III (SIII) and the lack of available software have hampered widespread application of Scond. METHODS: Seventy-two school-aged children (45 with cystic fibrosis; CF) performed three nitrogen MBW tests. We tested a new automated algorithm for Scond analysis (Scond_auto) comprising breath selection for SIII detection, calculation, and reporting of test quality. We compared Scond_auto to (i) standard Scond analysis (Scond_manual) with manual breath selection and (ii) pragmatic Scond analysis including all breaths (Scond_all). Primary outcomes were success rate, agreement between the different Scond protocols, and Scond fitting quality (linear regression R²). RESULTS: Average Scond_auto (0.06 for CF and 0.01 for controls) did not differ from Scond_manual (0.06 for CF and 0.01 for controls) and showed comparable fitting quality (R² 0.53 for CF and 0.13 for controls vs. R² 0.54 for CF and 0.13 for controls). Scond_all was similar in CF and controls but had inferior fitting quality compared with Scond_auto and Scond_manual. CONCLUSIONS: Automated Scond calculation is feasible and produces robust results comparable to the standard manual approach. The algorithm provides a valid, fast and objective tool for routine use, even in children. Pediatr Pulmonol. © 2014 Wiley Periodicals, Inc.
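
A minimal sketch of how an automated Scond calculation of this kind might be implemented, assuming each breath is supplied as expired N2 fraction versus expired volume and that cumulative lung turnover is available per breath; the breath-selection and quality-reporting rules described in the abstract are not reproduced, and the phase III window and turnover limits below are conventional choices, not the authors' exact settings.

    import numpy as np

    def sniii(volume_l, n2_frac, lo=0.65, hi=0.95):
        """Concentration-normalised phase III slope (SnIII, 1/L), fitted over
        an illustrative 65-95% window of the expired volume."""
        mask = (volume_l >= lo * volume_l[-1]) & (volume_l <= hi * volume_l[-1])
        slope, _ = np.polyfit(volume_l[mask], n2_frac[mask], 1)
        return slope / np.mean(n2_frac[mask])

    def scond(breaths, turnovers, to_min=1.5, to_max=6.0):
        """Scond as the regression slope of SnIII against lung turnover within
        the conventional turnover window 1.5-6; also returns R² as a fitting
        quality measure, as reported in the study."""
        values = np.array([sniii(v, c) for v, c in breaths])
        to = np.asarray(turnovers)
        sel = (to >= to_min) & (to <= to_max)
        slope, _ = np.polyfit(to[sel], values[sel], 1)
        r2 = np.corrcoef(to[sel], values[sel])[0, 1] ** 2
        return slope, r2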

Relevance: 30.00%

Abstract:

A composite section, which reconstructs a continuous stratigraphic record from cores of multiple nearby holes, and its associated composite depth scale are important tools for analyzing sediment recovered from a drilling site. However, the standard technique for creating composite depth scales on drilling cruises does not correct for depth distortion within each core. Additionally, the splicing technique used to create composite sections often results in a 10-15% offset between composite depths and measured drill depths. We present a new automated compositing technique that better aligns stratigraphy across holes, corrects depth offsets, and could be performed aboard ship. By analyzing 618 cores from seven Ocean Drilling Program (ODP) sites, we estimate that ~80% of the depth offset in traditional composite depth scales results from core extension during drilling and extraction. Average rates of extension are 12.4 ± 1.5% for calcareous and siliceous cores from ODP Leg 138 and 8.1 ± 1.1% for calcareous and clay-rich cores from ODP Leg 154. Also, average extension decreases as a function of depth in the sediment column, suggesting that elastic rebound is not the dominant extension mechanism.
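
A schematic sketch of two ingredients such an automated compositing routine needs: estimating the depth shift that best aligns a core's physical-property log (for example GRA bulk density) with the log from an adjacent hole, and removing an assumed uniform core-extension factor. The rigid cross-correlation shift shown here is a simplification; the technique described in the abstract also corrects depth distortion within each core.

    import numpy as np

    def depth_offset_m(ref_log, core_log, sample_spacing_m):
        """Depth shift (m) maximising the normalised cross-correlation between
        two evenly sampled property logs from adjacent holes."""
        a = (ref_log - ref_log.mean()) / ref_log.std()
        b = (core_log - core_log.mean()) / core_log.std()
        lag = np.correlate(a, b, mode="full").argmax() - (len(b) - 1)
        return lag * sample_spacing_m

    def extension_corrected_depth(depth_in_core_m, extension=0.124):
        """Map a within-core depth toward in-situ depth by removing a uniform
        extension factor (12.4% is the Leg 138 average reported above)."""
        return depth_in_core_m / (1.0 + extension)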

Relevance: 30.00%

Abstract:

Context: This paper addresses one of the major end-user development (EUD) challenges, namely, how to pack today's EUD support tools with composable elements. This would give end users better access to more components which they can use to build a solution tailored to their own needs. The success of later end-user software engineering (EUSE) activities largely depends on how many components each tool has and how adaptable components are to multiple problem domains. Objective: A system for automatically adapting heterogeneous components to a common development environment would offer a sizeable saving of time and resources within the EUD support tool construction process. This paper presents an automated adaptation system for transforming EUD components to a standard format. Method: The system is based on the use of description logic. Working from a generic UML2 data model, the description logic checks whether an end-user component can be transformed to this modeling language, either through subsumption or as an instance of the UML2 model. It also automatically finds a consistent, non-ambiguous and finite set of XSLT mappings that prepare the data so the component can be leveraged as part of a tool conforming to the target UML2 component model. Results: The proposed system has been successfully applied to components from four prominent EUD tools, which were automatically converted to a standard format. To validate the proposed system, rich internet applications (RIAs) used as an operational support system for operators at a large services company were developed using automatically adapted standard-format components. These RIAs would be impossible to develop using each EUD tool separately. Conclusion: The positive results of applying our system for automatically adapting components from current tool catalogues are indicative of the system's effectiveness. Use of this system could foster the growth of web EUD component catalogues, leveraging a vast ecosystem of user-centred SaaS to further current EUSE trends.
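
As a hedged illustration of the final step named in the abstract, applying an XSLT mapping to move a tool-specific component description into a target XML format, the sketch below uses lxml with an invented element vocabulary; neither the stylesheet nor the element names come from the paper, where the mappings are derived automatically by description-logic reasoning over a UML2 data model.

    from lxml import etree

    # Hypothetical XSLT mapping: renames a tool-specific <widget> element to a
    # generic <Component> element. The real system derives such mappings
    # automatically; this one is hand-written purely for illustration.
    XSLT_MAPPING = b"""<xsl:stylesheet version="1.0"
        xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
      <xsl:template match="widget">
        <Component name="{@id}">
          <xsl:apply-templates/>
        </Component>
      </xsl:template>
    </xsl:stylesheet>"""

    def adapt_component(component_xml: bytes) -> bytes:
        """Apply the mapping to a tool-specific component description."""
        transform = etree.XSLT(etree.XML(XSLT_MAPPING))
        return etree.tostring(transform(etree.XML(component_xml)), pretty_print=True)

    print(adapt_component(b'<widget id="searchBox"><label>Search</label></widget>').decode())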

Relevance: 30.00%

Abstract:

This study was carried out to detect differences in locomotion and feeding behavior between lame (group L; n = 41; gait score ≥ 2.5) and non-lame (group C; n = 12; gait score ≤ 2) multiparous Holstein cows in a cross-sectional study design. A model for automatic lameness detection was created using data from accelerometers attached to the hind limbs and noseband sensors attached to the head. Each cow's gait was videotaped and scored on a 5-point scale before and after a period of 3 consecutive days of behavioral data recording. The mean value from 3 independent experienced observers was taken as the definitive gait score and considered the gold standard. For statistical analysis, data from the noseband sensor and from one of the two accelerometers per cow (randomly selected), recorded on 2 of the 3 days (also randomly selected), were used. For comparisons between group L and group C, the t-test, the Aspin-Welch test and the Wilcoxon test were used. The sensitivity and specificity of lameness detection were determined with logistic regression and ROC analysis. Compared with group C, group L had significantly lower eating and ruminating time; fewer eating chews, ruminating chews and ruminating boluses; longer lying time and lying bout duration; lower standing time; fewer standing and walking bouts; fewer, slower and shorter strides; and a lower walking speed. The model considering the number of standing bouts and walking speed was the best predictor of cows being lame, with a sensitivity of 90.2% and a specificity of 91.7%. Sensitivity and specificity of the lameness detection model were considered very high, even without the use of halter data. It was concluded that, under the conditions of the study farm, accelerometer data were suitable for accurately distinguishing between lame and non-lame dairy cows, even in cases of slight lameness with a gait score of 2.5.
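
A minimal sketch of the kind of two-predictor model and ROC evaluation described above, assuming one row per cow with the number of standing bouts and walking speed as features and a binary lame label; the data handling, validation scheme and threshold rule are placeholders rather than the authors' procedure.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_curve

    def fit_lameness_model(X, y):
        """X: rows of [standing bouts per day, walking speed (m/s)];
        y: 1 = lame (gait score >= 2.5), 0 = non-lame (gait score <= 2)."""
        model = LogisticRegression().fit(X, y)
        prob = model.predict_proba(X)[:, 1]
        fpr, tpr, thr = roc_curve(y, prob)
        best = np.argmax(tpr - fpr)           # Youden's J as an illustrative cut-off rule
        sensitivity, specificity = tpr[best], 1 - fpr[best]
        return model, thr[best], sensitivity, specificity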

Relevance: 30.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06

Relevance: 30.00%

Abstract:

Manual curation has long been held to be the gold standard for functional annotation of DNA sequence. Our experience with the annotation of more than 20,000 full-length cDNA sequences revealed problems with this approach, including inaccurate and inconsistent assignment of gene names, as well as many good assignments that were difficult to reproduce using only computational methods. For the FANTOM2 annotation of more than 60,000 cDNA clones, we developed a number of methods and tools to circumvent some of these problems, including an automated annotation pipeline that provides high-quality preliminary annotation for each sequence by introducing an uninformative filter that eliminates uninformative annotations, controlled vocabularies to accurately reflect both the functional assignments and the evidence supporting them, and a highly refined, Web-based manual annotation tool that allows users to view a wide array of sequence analyses and to assign gene names and putative functions using a consistent nomenclature. The ultimate utility of our approach is reflected in the low rate of reassignment of automated assignments by manual curation. Based on these results, we propose a new standard for large-scale annotation, in which the initial automated annotations are manually investigated and then computational methods are iteratively modified and improved based on the results of manual curation.
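
One of the ideas mentioned, the uninformative filter, can be illustrated with a toy sketch that discards hit descriptions carrying no functional information before they are propagated as preliminary annotations; the patterns below are illustrative and are not FANTOM2's actual rule set.

    import re

    # Illustrative patterns for descriptions that add no functional information
    # and therefore should not be propagated as gene-name assignments.
    UNINFORMATIVE = re.compile(
        r"(hypothetical|unknown|unnamed|uncharacteri[sz]ed) protein"
        r"|^(cDNA|clone|EST)\b|similar to\b",
        re.IGNORECASE,
    )

    def informative_hits(descriptions):
        """Keep only the hit descriptions that pass the uninformative filter."""
        return [d for d in descriptions if not UNINFORMATIVE.search(d)]

    print(informative_hits([
        "hypothetical protein LOC12345",
        "ATP-binding cassette transporter ABCA1",
    ]))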

Relevance: 30.00%

Abstract:

The study utilized the advanced technology provided by automated perimeters to investigate the hypothesis that patients with retinitis pigmentosa behave atypically over the dynamic range, and to concurrently determine the influence of extraneous factors on the format of the normal perimetric sensitivity profile. The perimetric processing of some patients with retinitis pigmentosa was considered abnormal in the temporal and/or the spatial domain. The standard size III stimulus saturated the central regions and was thus ineffective in detecting early depressions in sensitivity in these areas. When stimulus size was scaled in inverse proportion to the square root of ganglion cell receptive field density (M-scaled), isosensitive profiles did not result, although cortical representation was theoretically equivalent across the visual field. It was conjectured that this was due to variations in ganglion cell characteristics with increasing peripheral angle, most notably spatial summation. It was concluded that the development of perimetric routines incorporating stimulus sizes adjusted in proportion to the coverage factor of retinal ganglion cells would enhance the diagnostic capacity of perimetry. Good general and local correspondence was found between perimetric sensitivity and the available retinal cell counts. Intraocular light scatter, arising both from simulations and from media opacities, depressed perimetric sensitivity. Attenuation was greater centrally for the smaller LED stimuli, whereas the reverse was true for the larger projected stimuli. Prior perimetric experience and pupil size also demonstrated eccentricity-dependent effects on sensitivity. Practice improved perimetric sensitivity for projected stimuli at eccentricities of 30° or greater, particularly in the superior region. An increase in pupil size enhanced sensitivity for LED stimuli at eccentricities greater than 10°. Conversely, microfluctuations in the accommodative response during perimetric examination and the correction of peripheral refractive error had no significant influence on perimetric sensitivity.
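
The M-scaling rule referred to above, stimulus size varied in inverse proportion to the square root of ganglion cell receptive field density, can be written as a small helper; the density model below is a placeholder, not the data used in the thesis.

    import math

    def m_scaled_diameter(ecc_deg, density_fn, ref_diameter_deg, ref_ecc_deg=0.0):
        """Scale stimulus diameter inversely with the square root of ganglion
        cell receptive field density, so that the stimulus covers roughly the
        same number of receptive fields at every eccentricity."""
        return ref_diameter_deg * math.sqrt(density_fn(ref_ecc_deg) / density_fn(ecc_deg))

    # Placeholder density model (fields/deg^2) falling off with eccentricity;
    # a real application would substitute published ganglion cell density data.
    density = lambda ecc: 2500.0 / (1.0 + ecc) ** 2

    print(round(m_scaled_diameter(30.0, density, ref_diameter_deg=0.43), 2))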

Relevance: 30.00%

Abstract:

INTAMAP is a Web Processing Service for the automatic spatial interpolation of measured point data. Requirements were (i) using open standards for spatial data such as those developed in the context of the Open Geospatial Consortium (OGC), (ii) using a suitable environment for statistical modelling and computation, and (iii) producing an integrated, open source solution. The system couples an open-source Web Processing Service (developed by 52°North), accepting data in the form of standardised XML documents (conforming to the OGC Observations and Measurements standard), with a computing back-end realised in the R statistical environment. The probability distribution of interpolation errors is encoded with UncertML, a markup language designed to encode uncertain data. Automatic interpolation needs to be useful for a wide range of applications, and the algorithms have been designed to cope with anisotropy, extreme values, and data with known error distributions. Besides a fully automatic mode, the system can be used with different levels of user control over the interpolation process.
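
A compact numpy sketch of the core operation such a service automates, ordinary kriging of scattered point measurements with an exponential variogram; INTAMAP itself performs this through an R back-end behind the WPS interface, and the variogram parameters below are placeholders rather than automatically fitted values.

    import numpy as np

    def exp_variogram(h, nugget=0.0, sill=1.0, rng=10.0):
        """Exponential variogram model gamma(h)."""
        return nugget + (sill - nugget) * (1.0 - np.exp(-h / rng))

    def ordinary_kriging(xy, z, xy0, **vgm):
        """Predict z at location xy0 from observations (xy, z); returns the
        estimate and the kriging variance (the per-point uncertainty that
        UncertML would encode)."""
        n = len(z)
        d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
        A = np.ones((n + 1, n + 1))
        A[:n, :n] = exp_variogram(d, **vgm)
        A[n, n] = 0.0
        b = np.ones(n + 1)
        b[:n] = exp_variogram(np.linalg.norm(xy - xy0, axis=1), **vgm)
        w = np.linalg.solve(A, b)
        return w[:n] @ z, b @ w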

Relevance: 30.00%

Abstract:

Purpose: To relate structural change to functional change in age-related macular degeneration (AMD) in a cross-sectional population using fundus imaging and visual field status. Methods: 10-degree standard and SWAP visual fields and other standard functional clinical measures were acquired in 44 eyes of 27 patients at various stages of AMD, as well as fundus photographs. Retro-mode SLO images were captured in a subset of 29 eyes of 19 of the patients. Drusen area, measured by automated drusen segmentation software (Smith et al. 2005), was correlated with visual field data. Visual field defect position was compared to the position of the imaged drusen and deposits using custom software. Results: The effect of AMD stage on drusen area within the central 6000 µm was significant (one-way ANOVA: F = 17.231, p < 0.001); however, the trend was not strong across all stages. There were significant linear relationships between visual field parameters and drusen area. The mean deviation (MD) declined by 3.00 dB and 3.92 dB for each log % drusen area for standard perimetry and SWAP, respectively. The visual field parameters of focal loss displayed the strongest correlations with drusen area. The number of pattern deviation (PD) defects increased by 9.30 and 9.68 defects per log % drusen area for standard perimetry and SWAP, respectively. Weaker correlations were found between drusen area and visual acuity, contrast sensitivity, colour vision and reading speed. 72.6% of standard PD defects and 65.2% of SWAP PD defects coincided with retinal signs of AMD on fundus photography. 67.5% of standard PD defects and 69.7% of SWAP PD defects coincided with deposits on retro-mode images. Conclusions: Perimetry exhibited a stronger relationship with drusen area than other measures of visual function. The structure-function relationship between visual field parameters and drusen area was linear. Overall, the indices of focal loss had a stronger correlation with drusen area in SWAP than in standard perimetry. Visual field defects had a high coincidence proportion with retinal manifestations of AMD. Smith R.T. et al. (2005) Arch Ophthalmol 123:200-206.
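
Since the reported structure-function relationship is a straight line of MD against log percentage drusen area, a minimal fitting sketch is shown below; the reported slopes (about -3.0 dB per log unit for standard perimetry and -3.9 dB for SWAP) serve only as a check on sign and magnitude, and the helper names are ours.

    import numpy as np

    def structure_function_fit(drusen_area_pct, mean_deviation_db):
        """Fit MD (dB) against log10 of percentage drusen area and return the
        slope (dB per log unit), intercept, and Pearson's r."""
        x = np.log10(np.asarray(drusen_area_pct, float))
        slope, intercept = np.polyfit(x, mean_deviation_db, 1)
        r = np.corrcoef(x, mean_deviation_db)[0, 1]
        return slope, intercept, r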

Relevance: 30.00%

Abstract:

Objective: The purpose of this study was to examine the effectiveness of a new analysis method for mfVEP objective perimetry in the early detection of glaucomatous visual field defects, compared to the gold standard technique. Methods and patients: Three groups were tested in this study: normal controls (38 eyes), glaucoma patients (36 eyes), and glaucoma suspect patients (38 eyes). All subjects underwent two tests in one session: a standard 24-2 visual field test with the Humphrey Field Analyzer and a single mfVEP test. Analysis of the mfVEP results was carried out using the new analysis protocol: the hemifield sector analysis protocol. Results: Analysis of the mfVEP showed that the signal-to-noise ratio (SNR) difference between superior and inferior hemifields differed significantly between the three groups (analysis of variance, P<0.001; 95% confidence intervals 2.82-2.89 for the normal group, 2.25-2.29 for the glaucoma suspect group, and 1.67-1.73 for the glaucoma group). The difference between superior and inferior hemifield sectors and hemi-rings was statistically significant in 11/11 pairs of sectors and hemi-rings in the glaucoma group (t-test, P<0.001) and in 5/11 pairs in the glaucoma suspect group (t-test, P<0.01), while only 1/11 pairs was statistically significant in the normal group (t-test, P<0.9). The sensitivity and specificity of the hemifield sector analysis protocol in detecting glaucoma were 97% and 86%, respectively, and 89% and 79% in glaucoma suspects. These results showed that the new analysis protocol was able to confirm existing visual field defects detected by standard perimetry, to differentiate between the three study groups with a clear distinction between normal patients and those with suspected glaucoma, and to detect early visual field changes not detected by standard perimetry. In addition, the distinction between normal and glaucoma patients was especially clear and significant using this analysis. Conclusion: The new hemifield sector analysis protocol used in mfVEP testing can be used to detect glaucomatous visual field defects in both glaucoma and glaucoma suspect patients. It provides information about focal visual field differences across the horizontal midline, which can be utilized to differentiate between glaucoma and normal subjects. The sensitivity and specificity of the mfVEP test were very promising and correlated with other anatomical changes in glaucomatous visual field loss. The intersector analysis protocol can detect early field changes not detected by the standard Humphrey Field Analyzer test. © 2013 Mousa et al, publisher and licensee Dove Medical Press Ltd.
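
A schematic sketch of the intersector comparison described above: an SNR per sector, superior-minus-inferior differences for the mirrored sector pairs, and a paired test across those pairs. The sector layout and the choice of a paired t-test here are generic stand-ins for the published protocol.

    import numpy as np
    from scipy import stats

    def sector_snr(signal_rms, noise_rms):
        """Signal-to-noise ratio for each mfVEP sector (element-wise)."""
        return np.asarray(signal_rms, float) / np.asarray(noise_rms, float)

    def hemifield_asymmetry(superior_snr, inferior_snr):
        """Superior-minus-inferior SNR difference for each mirrored sector pair
        (11 pairs in the protocol), with a paired t-test across the pairs."""
        diff = np.asarray(superior_snr) - np.asarray(inferior_snr)
        t, p = stats.ttest_rel(superior_snr, inferior_snr)
        return diff, t, p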

Relevance: 30.00%

Abstract:

Acute life-threatening events are mostly predictable in adults and children. Despite real-time monitoring, these events still occur at a rate of 4%. This paper describes an automated prediction system based on feature-space embedding and time-series forecasting of the SpO2 signal, a pulsatile signal synchronised with the heartbeat. We develop an age-independent index of abnormality that distinguishes patient-specific normal-to-abnormal physiology transitions. Two different methods were used to distinguish between normal and abnormal physiological trends based on SpO2 behaviour. The abnormality index derived by each method is compared against the current gold standard of clinical prediction of critical deterioration. Copyright © 2013 Inderscience Enterprises Ltd.
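
A minimal sketch of the first ingredient named above, a time-delay (feature-space) embedding of the SpO2 series, followed by a naive one-step forecast from the nearest embedded neighbour; the embedding dimension, delay and forecast rule are illustrative choices, not the authors'.

    import numpy as np

    def delay_embed(x, dim=3, tau=5):
        """Time-delay embedding: row t is [x[t], x[t-tau], ..., x[t-(dim-1)*tau]],
        where t indexes the most recent sample of each embedded state."""
        n = len(x) - (dim - 1) * tau
        return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)][::-1])

    def nn_forecast(x, dim=3, tau=5):
        """Forecast the next sample as the historical successor of the nearest
        neighbour of the current state in the embedded space."""
        x = np.asarray(x, float)
        emb = delay_embed(x, dim, tau)
        current, history = emb[-1], emb[:-1]
        nearest = np.argmin(np.linalg.norm(history - current, axis=1))
        return x[nearest + (dim - 1) * tau + 1]   # sample following the matched state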

Relevance: 30.00%

Abstract:

SMS (Short Message Service) is now a hugely popular and very powerful business communication technology for mobile phones. In order to respond correctly to a free-form factual question given a large collection of texts, one needs to understand the question at a level that allows determining some of the constraints the question imposes on a possible answer. These constraints may include a semantic classification of the sought-after answer and may even suggest using different strategies when looking for and verifying a candidate answer. In this paper we focus on various attempts to overcome the major contradiction between the technical limitations of the SMS standard and the huge amount of information found for a possible answer.
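
One concrete face of that contradiction, returning a verified answer within a single 160-character SMS, can be illustrated with a short helper that ranks candidate answers and trims the best one to the limit; the scoring function is a placeholder for whatever answer-verification score a QA pipeline produces.

    def best_sms_answer(candidates, score_fn, limit=160):
        """Pick the highest-scoring candidate answer and trim it to one SMS."""
        best = max(candidates, key=score_fn)
        return best if len(best) <= limit else best[: limit - 3].rstrip() + "..."

    # Trivial usage with length as a stand-in score.
    print(best_sms_answer(["Mount Everest, at 8,849 m, is the highest mountain."], score_fn=len))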

Relevance: 30.00%

Abstract:

Background: Vigabatrin (VGB) is an anti-epileptic medication which has been linked to peripheral constriction of the visual field. Documenting the natural history associated with continued VGB exposure is important when making decisions about the risks and benefits associated with the treatment. Due to its speed, the Swedish Interactive Threshold Algorithm (SITA) has become the algorithm of choice when carrying out Full Threshold automated static perimetry. SITA uses prior distributions of normal and glaucomatous visual field behaviour to estimate threshold sensitivity. As the abnormal model is based on glaucomatous behaviour, this algorithm has not been validated for VGB recipients. We aim to assess the clinical utility of the SITA algorithm for accurately mapping VGB-attributed field loss. Methods: The sample comprised one randomly selected eye of 16 patients diagnosed with epilepsy and exposed to VGB therapy. A clinical diagnosis of VGB-attributed visual field loss was documented in 44% of the group. The mean age was 39.3 ± 14.5 years and the mean deviation was -4.76 ± 4.34 dB. Each patient was examined with the Full Threshold, SITA Standard and SITA Fast algorithms. Results: SITA Standard was on average approximately twice as fast (7.6 minutes) and SITA Fast approximately three times as fast (4.7 minutes) as examinations completed using the Full Threshold algorithm (15.8 minutes). In the clinical environment, the visual field outcome with both SITA algorithms was equivalent to visual field examination using the Full Threshold algorithm in terms of visual inspection of the grey scale plots, defect area and defect severity. Conclusions: Our research shows that both SITA algorithms are able to accurately map visual field loss attributed to VGB. As patients diagnosed with epilepsy are often vulnerable to fatigue, the time saving offered by SITA Fast means that this algorithm has a significant advantage for use with VGB recipients.

Relevance: 30.00%

Abstract:

As the world's synchrotrons and X-FELs endeavour to meet the need to analyse ever-smaller protein crystals, there grows a requirement for a new technique to present nano-dimensional samples to the beam for X-ray diffraction experiments. The work presented here details developmental work to reconfigure the nano-tweezer technology developed by Optofluidics (PA, USA) for the trapping of nano-dimensional protein crystals for X-ray crystallography experiments. In its standard configuration the system is used to trap nanoparticles for optical microscopy. It uses silicon nitride laser waveguides that bridge a microfluidic channel. These waveguides contain 180 nm apertures, enabling the system to use biologically compatible 1.6 micron wavelength laser light to trap nano-dimensional biological samples. Using conventional laser tweezers, the wavelength required to trap such nano-dimensional samples would destroy them. In its optical configuration the system has trapped protein molecules as small as 10 nanometres.