17 results for equivalent web thickness method

in Aston University Research Archive


Relevance:

40.00%

Publisher:

Abstract:

Purpose: To analyse the relationship between measured intraocular pressure (IOP) and central corneal thickness (CCT), corneal hysteresis (CH) and corneal resistance factor (CRF) in ocular hypertension (OHT), primary open-angle glaucoma (POAG) and normal-tension glaucoma (NTG) eyes using multiple tonometry devices. Methods: Right eyes of patients diagnosed with OHT (n=47), NTG (n=17) and POAG (n=50) were assessed. IOP was measured in random order with four devices: Goldmann applanation tonometry (GAT); Pascal dynamic contour tonometer (DCT); Reichert Ocular Response Analyser (ORA); and Tono-Pen XL. CCT was then measured using a hand-held ultrasonic pachymeter. CH and CRF were derived from the air-pressure-to-corneal-reflectance relationship of the ORA data. Results: Compared with GAT, the Tono-Pen and the ORA Goldmann-equivalent (IOPg) and corneal-compensated (IOPcc) readings were higher (F=19.351, p<0.001), particularly in NTG (F=12.604, p<0.001). DCT was closest to Goldmann IOP and had the lowest variance. CCT differed significantly (F=8.305, p<0.001) among the three conditions, as did CH (F=6.854, p=0.002) and CRF (F=19.653, p<0.001). IOPcc measures were not affected by CCT, and DCT was generally not affected by corneal biomechanical factors. Conclusion: This study suggests that, as the true pressure of the eye cannot be determined non-invasively, measurements from any tonometer should be interpreted with care, particularly when alterations in the corneal tissue are suspected.
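As a minimal sketch of how the ORA indices named above are commonly derived: the instrument records two applanation pressures, P1 (inward) and P2 (outward), from the air-pressure/corneal-reflectance signal. CH is usually given as P1 - P2, and CRF as P1 - k*P2; the constant k ~ 0.7 is the value typically quoted in the ORA literature and is an assumption here, not taken from this abstract.

```python
# Hedged sketch of the usual ORA index definitions; k = 0.7 is assumed
# from the general literature, not from this study.

def corneal_hysteresis(p1_mmHg: float, p2_mmHg: float) -> float:
    """CH: difference between inward and outward applanation pressures."""
    return p1_mmHg - p2_mmHg

def corneal_resistance_factor(p1_mmHg: float, p2_mmHg: float, k: float = 0.7) -> float:
    """CRF: P1 - k*P2, weighting the outward pressure by an empirical constant."""
    return p1_mmHg - k * p2_mmHg

if __name__ == "__main__":
    p1, p2 = 24.0, 14.0  # illustrative applanation pressures (mmHg)
    print(f"CH  = {corneal_hysteresis(p1, p2):.1f} mmHg")
    print(f"CRF = {corneal_resistance_factor(p1, p2):.1f} mmHg")
```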

Relevance:

30.00%

Publisher:

Abstract:

Background: Medicines reconciliation (identifying and maintaining an accurate list of a patient's current medications) should be undertaken at all transitions of care and be available to all patients. Objective: A self-completion web survey was conducted among chief pharmacists (or equivalent) to evaluate medicines reconciliation levels in secondary care mental health organisations. Setting: The survey was sent to secondary care mental health organisations in England, Scotland, Northern Ireland and Wales. Method: The survey was launched via Bristol Online Surveys. Quantitative data were analysed using descriptive statistics, and qualitative data were collected through respondents' free-text answers to specific questions. Main outcome measures: To investigate how medicines reconciliation is delivered, to describe the role of pharmacy staff, and to identify areas of concern. Results: Forty-two surveys (52% response rate) were completed. Thirty-seven organisations (88.1%) have a formal policy for medicines reconciliation with defined steps. The pharmacy team (pharmacists and pharmacy technicians) are the main professionals involved in medicines reconciliation, with a high rate of doctors also involved. Training frequently includes an induction by pharmacy for doctors, whilst the pharmacy team are generally trained by another member of pharmacy. Mental health organisations estimate that nearly 80% of medicines reconciliation is carried out within 24 h of admission. A full medicines reconciliation is not carried out on patient transfer between mental health wards; instead, quicker and less exhaustive variations are implemented. 71.4% of organisations estimate that pharmacy staff conduct daily medicines reconciliation for acute admission wards (Monday to Friday), but only 38% report that pharmacy reconciles patients' medication for other teams that admit from primary care. Conclusion: Most mental health organisations appear to be complying with NICE guidance on medicines reconciliation for their acute admission wards. However, medicines reconciliation is conducted less frequently on other units that admit from primary care, and is rarely completed on transfer when the medication significantly differs from that on admission. Formal training and competency assessments on medicines reconciliation should be considered, as current training varies and adherence to best practice is questionable.

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE: To demonstrate the application of low-coherence reflectometry to the study of biometric changes during disaccommodation responses in human eyes after cessation of a near task, and to evaluate the effect of contact lenses on low-coherence reflectometry biometric measurements. METHODS: Ocular biometric parameters of crystalline lens thickness (LT) and anterior chamber depth (ACD) were measured with the LenStar device during and immediately after a 5 D accommodative task in 10 participants. In a separate trial, accommodation responses were recorded with a Shin-Nippon WAM-5500 optometer in a subset of two participants. Biometric data were interleaved to form a profile of post-task anterior segment changes. In a further experiment, the effect of soft contact lenses on LenStar measurements was evaluated in 15 participants. RESULTS: In the 10 adult participants, increased LT and reduced ACD were seen during the 5 D task. Post-task, during fixation of a 0 D target, a profile of the change in LT and ACD against time was observed. In the two participants with accommodation data (one a sufferer of nearwork-induced transient myopia and the other a non-sufferer), the post-task changes in refraction compared favorably with the interleaved LenStar biometry data. The insertion of soft contact lenses did not have a significant effect on LenStar measures of ACD or LT (mean change: -0.007 mm, p = 0.265 and +0.001 mm, p = 0.875, respectively). CONCLUSIONS: With the addition of a relatively simple stimulus modification, the LenStar instrument can be used to produce a profile of post-task changes in LT and ACD. The spatial and temporal resolution of the system is sufficient for the investigation of nearwork-induced transient myopia from a biometric viewpoint. LenStar measurements of ACD and LT remain valid after the fitting of soft contact lenses.
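A minimal sketch of the interleaving idea described above: if each post-task run is sampled at a fixed interval, repeated runs whose sampling is offset in time can be merged into a single, temporally denser profile of LT or ACD. The timings and thickness values below are illustrative, not data from the study.

```python
# Hedged sketch: merge repeated, time-offset measurement runs into one
# denser post-task profile. Values are invented for illustration.

from typing import List, Tuple

def interleave_runs(runs: List[List[Tuple[float, float]]]) -> List[Tuple[float, float]]:
    """Merge several (time_s, value_mm) runs into one time-ordered profile."""
    merged = [sample for run in runs for sample in run]
    return sorted(merged, key=lambda sample: sample[0])

# Two hypothetical post-task LT runs sampled every 2 s, offset by 1 s.
run_a = [(0.0, 3.95), (2.0, 3.92), (4.0, 3.90)]
run_b = [(1.0, 3.94), (3.0, 3.91), (5.0, 3.89)]
profile = interleave_runs([run_a, run_b])
print(profile)  # 1 s effective resolution from 2 s sampling
```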

Relevance:

30.00%

Publisher:

Abstract:

Purpose. The purpose of this study was to investigate the influence of corneal topography and thickness on intraocular pressure (IOP) and pulse amplitude (PA) as measured using the Ocular Blood Flow Analyzer (OBFA) pneumatonometer (Paradigm Medical Industries, Utah, USA). Methods. Forty-seven university students volunteered for this cross-sectional study: mean age 20.4 years, range 18 to 28 years; 23 male, 24 female. Only the measurements from the right eye of each participant were used. Central corneal thickness (CCT) and mean corneal radius were measured using Scheimpflug biometry and corneal topographic imaging, respectively. IOP and PA measurements were made with the OBFA pneumatonometer. Axial length was measured using A-scan ultrasound, owing to its known correlation with these corneal parameters. Stepwise multiple regression analysis was used to identify the components that contributed significant variance to the dependent variables of IOP and PA. Results. The mean IOP and PA measurements were 13.1 (SD 3.3) mmHg and 3.0 (SD 1.2) mmHg, respectively. IOP measurements made with the OBFA pneumatonometer correlated significantly with CCT (r = +0.374, p = 0.010), such that a 10 μm change in CCT was equivalent to a 0.30 mmHg change in measured IOP. PA measurements correlated significantly with axial length (part correlation = -0.651, p < 0.001) and mean corneal radius (part correlation = +0.459, p < 0.001), but not with corneal thickness. Conclusions. IOP measurements taken with the OBFA pneumatonometer are correlated with corneal thickness, but not axial length or corneal curvature. Conversely, PA measurements are unaffected by corneal thickness, but correlated with axial length and corneal radius. These parameters should be taken into consideration when interpreting IOP and PA measurements made with the OBFA pneumatonometer.
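A minimal sketch of the stepwise (forward-selection) multiple regression used above to find which ocular parameters explain variance in a measured quantity such as IOP. The variable names and synthetic data are illustrative assumptions; the study's actual data and software are not reproduced.

```python
# Hedged sketch: greedy forward selection by p-value using statsmodels,
# on synthetic data shaped like the study's predictors.

import numpy as np
import pandas as pd
import statsmodels.api as sm

def forward_select(y: pd.Series, X: pd.DataFrame, alpha: float = 0.05) -> list:
    """Repeatedly add the predictor with the smallest p-value below alpha."""
    selected: list = []
    while True:
        remaining = [c for c in X.columns if c not in selected]
        pvals = {}
        for c in remaining:
            model = sm.OLS(y, sm.add_constant(X[selected + [c]])).fit()
            pvals[c] = model.pvalues[c]
        if not pvals:
            break
        best = min(pvals, key=pvals.get)
        if pvals[best] >= alpha:
            break
        selected.append(best)
    return selected

rng = np.random.default_rng(0)
n = 47
X = pd.DataFrame({
    "cct_um": rng.normal(545, 35, n),         # central corneal thickness
    "axial_length_mm": rng.normal(24, 1, n),
    "corneal_radius_mm": rng.normal(7.8, 0.25, n),
})
iop = 0.03 * X["cct_um"] + rng.normal(0, 2, n)  # synthetic IOP-CCT link
print(forward_select(iop, X))  # typically ['cct_um'] for this synthetic data
```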

Relevance:

30.00%

Publisher:

Abstract:

Purpose. To evaluate the repeatability and reproducibility of subfoveal choroidal thickness (CT) calculations performed manually using optical coherence tomography (OCT). Methods. The CT was imaged in vivo at each of two visits in 11 healthy volunteers (mean age, 35.72 ± 13.19 years) using spectral-domain OCT. CT was manually measured after applying ImageJ processing filters on 15 radial subfoveal scans. Radial scans were spaced 12° apart, and each contained 2,500 A-scans. The coefficient of variability, coefficient of repeatability (CoR), coefficient of reproducibility, and intraclass correlation coefficient determined the repeatability and reproducibility of the calculation. Axial length (AL) and mean spherical equivalent refractive error were measured with the IOLMaster and an open-view autorefractor to study their potential relationship with CT. Results. The within-visit and between-visit coefficient of variability, CoR, coefficient of reproducibility, and intraclass correlation coefficient were 0.80, 2.97%, 2.44%, and 99%, respectively. The subfoveal CT correlated significantly with AL (R = -0.60, p = 0.05). Conclusions. The subfoveal CT could be measured manually in vivo using OCT, and the readings obtained from the healthy subjects evaluated were repeatable and reproducible. OCT could therefore be a useful instrument for in vivo assessment and monitoring of CT changes in retinal disease. These preliminary results suggest a negative correlation between subfoveal CT and AL, such that CT decreases with increasing AL but does not vary with refractive error.
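A minimal sketch of two of the repeatability statistics named above, computed from paired within-visit measurements. The Bland-Altman form of the coefficient of repeatability (1.96 x SD of the test-retest differences) is assumed here; the paper's exact formulation may differ, and the thickness values are invented.

```python
# Hedged sketch: common definitions of CoR and coefficient of variability
# for test-retest data; the study's exact formulas are not reproduced.

import numpy as np

def coefficient_of_repeatability(visit1: np.ndarray, visit2: np.ndarray) -> float:
    """Bland-Altman CoR: 1.96 x SD of the paired differences."""
    diffs = visit1 - visit2
    return 1.96 * np.std(diffs, ddof=1)

def coefficient_of_variability(visit1: np.ndarray, visit2: np.ndarray) -> float:
    """Within-subject SD expressed as a percentage of the grand mean."""
    pairs = np.stack([visit1, visit2], axis=1)
    within_sd = np.sqrt(np.mean(np.var(pairs, axis=1, ddof=1)))
    return 100.0 * within_sd / np.mean(pairs)

# Hypothetical subfoveal CT readings (um) for 5 subjects at two visits.
v1 = np.array([310.0, 285.0, 402.0, 355.0, 298.0])
v2 = np.array([314.0, 280.0, 398.0, 360.0, 295.0])
print(f"CoR = {coefficient_of_repeatability(v1, v2):.1f} um")
print(f"CoV = {coefficient_of_variability(v1, v2):.2f} %")
```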

Relevance:

30.00%

Publisher:

Abstract:

When constructing and using environmental models, it is typical that many of the inputs to the models will not be known perfectly. In some cases, it will be possible to make observations, or occasionally to use physics-based uncertainty propagation, to ascertain the uncertainty on these inputs. However, such observations are often unavailable or even impossible, and another approach to characterising the uncertainty on the inputs must be sought. Even when observations are available, if the analysis is being carried out within a Bayesian framework then prior distributions will have to be specified. One option for gathering, or at least estimating, this information is to employ expert elicitation. Expert elicitation is well studied within statistics and psychology and involves assessing the beliefs of a group of experts about an uncertain quantity (for example, an input or parameter within a model), typically in terms of obtaining a probability distribution. One of the challenges in expert elicitation is to minimise the biases that might enter into the judgements made by the individual experts, and then to come to a consensus decision within the group of experts. Effort is made in the elicitation exercise to prevent biases clouding the judgements through well-devised questioning schemes. It is also important that, when reaching a consensus, the experts are exposed to the knowledge of the others in the group. Within the FP7 UncertWeb project (http://www.uncertweb.org/), there is a requirement to build a Web-based tool for expert elicitation. In this paper, we discuss some of the issues of building a Web-based elicitation system, covering both the technological aspects and the statistical and scientific issues. In particular, we demonstrate two tools: a Web-based system for the elicitation of continuous random variables, and a system designed to elicit uncertainty about categorical random variables in the setting of landcover classification uncertainty. The first of these is a generic tool developed to elicit uncertainty about univariate continuous random variables. It is designed to be used within an application context and extends the existing SHELF method, adding a web interface and access to metadata. The tool is developed so that it can be readily integrated with environmental models exposed as web services. The second example was developed for the TREES-3 initiative, which monitors tropical landcover change through ground-truthing at confluence points. It allows experts to validate the accuracy of automated landcover classifications using site-specific imagery and local knowledge. Experts may provide uncertainty information at various levels: from a general rating of their confidence in a site validation to a numerical ranking of the possible landcover types within a segment. A key challenge in the web-based setting is the design of the user interface and the method of interaction between the problem owner and the problem experts. We show the workflow of the elicitation tool, and show how we can represent the final elicited distributions and confusion matrices using UncertML, ready for integration into uncertainty-enabled workflows. We also show how the metadata associated with the elicitation exercise is captured and can be referenced from the elicited result, providing crucial lineage information and thus traceability in the decision-making process.
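A minimal sketch of the core step behind a SHELF-style elicitation tool: an expert supplies a few quantile judgements for an uncertain model input, and a parametric distribution is fitted to them. Fitting a normal distribution by least squares on its quantile function is one simple choice made here for illustration; SHELF itself supports several distribution families, so this should not be read as the project's actual implementation.

```python
# Hedged sketch: fit a normal distribution to elicited quantiles,
# the kind of judgement a SHELF-style tool collects.

import numpy as np
from scipy import optimize, stats

def fit_normal_to_quantiles(probs, values):
    """Find (mu, sigma) whose quantile function best matches the judgements."""
    def loss(params):
        mu, log_sigma = params
        fitted = stats.norm.ppf(probs, loc=mu, scale=np.exp(log_sigma))
        return np.sum((fitted - values) ** 2)
    res = optimize.minimize(loss, x0=[np.median(values), 0.0])
    mu, log_sigma = res.x
    return mu, np.exp(log_sigma)

# Hypothetical expert judgement: "the median is 10, the quartiles 8 and 13".
probs = np.array([0.25, 0.50, 0.75])
values = np.array([8.0, 10.0, 13.0])
mu, sigma = fit_normal_to_quantiles(probs, values)
print(f"elicited prior: Normal(mu={mu:.2f}, sigma={sigma:.2f})")
```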

Relevance:

30.00%

Publisher:

Abstract:

An interoperable Web Processing Service (WPS) for the automatic interpolation of environmental data has been developed within the INTAMAP project. In order to assess the performance of the interpolation method implemented, a validation WPS has also been developed. This validation WPS can be used to perform leave-one-out and K-fold cross-validation: a full dataset is submitted, and a range of validation statistics and diagnostic plots (e.g. histograms, variogram of residuals, mean errors) is received in return. This paper presents the architecture of the validation WPS, and a case study briefly illustrates its use in practice. We conclude with a discussion of the current limitations of the system and proposals for further developments.
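A minimal sketch of the K-fold cross-validation the validation WPS performs: hold out each fold of the dataset, interpolate the held-out points from the remainder, and accumulate validation statistics such as the mean error and RMSE. The inverse-distance-weighting interpolator below is a stand-in assumption; INTAMAP's actual interpolation methods are geostatistical.

```python
# Hedged sketch: K-fold cross-validation of a spatial interpolator,
# with IDW standing in for the real interpolation method.

import numpy as np

def idw_predict(train_xy, train_z, query_xy, power=2.0):
    """Inverse-distance-weighted prediction at the query locations."""
    d = np.linalg.norm(train_xy[None, :, :] - query_xy[:, None, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-9) ** power
    return (w * train_z).sum(axis=1) / w.sum(axis=1)

def kfold_validate(xy, z, k=5, seed=0):
    """Return mean error and RMSE over k held-out folds."""
    idx = np.random.default_rng(seed).permutation(len(z))
    residuals = []
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)
        pred = idw_predict(xy[train], z[train], xy[fold])
        residuals.append(pred - z[fold])
    r = np.concatenate(residuals)
    return {"mean_error": r.mean(), "rmse": np.sqrt((r ** 2).mean())}

rng = np.random.default_rng(1)
xy = rng.uniform(0, 100, size=(60, 2))               # synthetic locations
z = np.sin(xy[:, 0] / 20) + rng.normal(0, 0.1, 60)   # synthetic observations
print(kfold_validate(xy, z))
```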

Relevance:

30.00%

Publisher:

Abstract:

The literature on submerged arc welding of copper and copper alloys, submerged arc welding with strip electrodes, and related areas has been reviewed in depth. Copper cladding of mild steel substrates by deposition from strip electrodes using the submerged arc welding process has been successful. A wide range of parameters and several fluxes have been investigated. The range of deposit compositions is 66.4% Cu to 95.7% Cu. The weld beads have been examined metallographically using optical and electron microscopy. Equating weld beads to a thermodynamic equivalent of iron has proven to be an accurate and simplified means of handling quantitative data for multicomponent welds. Empirical equations derived using theoretical considerations characterize the weld bead dimensions as functions of the welding parameters, and hence of composition. The melting rate for strip electrodes depends on the current-voltage product. Weld nugget size is increased by the improved thermal transfer efficiency that results from current-dependent stirring. The presence of Fe2O3 in a flux has been shown to diminish the electrode melting rate and drastically increase penetration, making flux choice the prime consideration in cladding operations. A theoretical model for welding with strip electrodes and the submerged arc process is presented.
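A minimal, hypothetical sketch of how empirical equations of the kind described above can be derived: weld-bead dimensions are regressed on welding parameters, here via a log-linear power law in current, voltage and travel speed. The functional form, coefficients and data are all invented for illustration; the thesis's actual equations are not reproduced.

```python
# Hedged sketch: fit a power-law bead-dimension relation by log-linear
# least squares. All numbers are invented for illustration.

import numpy as np

# Synthetic trials: current (A), voltage (V), travel speed (mm/s), penetration (mm)
I = np.array([400, 450, 500, 550, 600, 650], dtype=float)
V = np.array([28, 30, 30, 32, 32, 34], dtype=float)
S = np.array([5.0, 5.5, 6.0, 6.0, 6.5, 7.0])
P = 0.002 * I**1.2 * V**0.5 / S**0.7  # invented ground-truth relation

# Fit log P = log a + b*log I + c*log V + d*log S by least squares.
A = np.column_stack([np.ones_like(I), np.log(I), np.log(V), np.log(S)])
coef, *_ = np.linalg.lstsq(A, np.log(P), rcond=None)
log_a, b, c, d = coef
print(f"P ~= {np.exp(log_a):.4f} * I^{b:.2f} * V^{c:.2f} * S^{d:.2f}")
```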

Relevance:

30.00%

Publisher:

Abstract:

Hierarchical knowledge structures are frequently used within clinical decision support systems as part of the model for generating intelligent advice. The nodes in the hierarchy inevitably have varying influence on the decision-making processes, which needs to be reflected by parameters. If the model has been elicited from human experts, it is not feasible to ask them to estimate the parameters, because there will be so many in even moderately sized structures. This paper describes how the parameters can be obtained from data instead, using only a small number of cases. The original method [1] is applied to a particular web-based clinical decision support system called GRiST, which uses its hierarchical knowledge to quantify the risks associated with mental-health problems. The knowledge was elicited from multidisciplinary mental-health practitioners, but the tree has several thousand nodes, all requiring an estimation of their relative influence on the assessment process. The method described in the paper shows how these estimates can be obtained from about 200 cases instead. It greatly reduces the experts' elicitation tasks and has the potential to be generalised to similar knowledge-engineering domains where relative weightings of node siblings are part of the parameter space.
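A minimal sketch of the parameter-estimation idea described above: model a parent node's score as a weighted combination of its children's scores, and recover the relative weights of the siblings from assessed cases rather than eliciting them from experts. Non-negative least squares with normalisation is one simple way to do this; GRiST's actual method [1] may differ in detail, and the data below are synthetic.

```python
# Hedged sketch: recover relative sibling weights from ~200 cases by
# non-negative least squares. Synthetic data, not GRiST's real method.

import numpy as np
from scipy.optimize import nnls

def estimate_sibling_weights(child_scores: np.ndarray, parent_scores: np.ndarray) -> np.ndarray:
    """child_scores: (cases, siblings); parent_scores: (cases,)."""
    w, _ = nnls(child_scores, parent_scores)   # constrain weights >= 0
    return w / w.sum()                          # express as relative influence

rng = np.random.default_rng(42)
cases = rng.uniform(0, 1, size=(200, 3))        # ~200 cases, 3 sibling nodes
true_w = np.array([0.5, 0.3, 0.2])
parent = cases @ true_w + rng.normal(0, 0.02, 200)
print(estimate_sibling_weights(cases, parent))  # close to [0.5, 0.3, 0.2]
```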

Relevance:

30.00%

Publisher:

Abstract:

Introduction: In neovascular age-related macular degeneration (nAMD), optical coherence tomography (OCT) is an important tool for determining when intravitreal injections of ranibizumab should be administered. Current guidelines recommend that patients be reviewed every four weeks; OCT indications for further treatment include subretinal fluid and intraretinal fluid or cysts. Purpose: We reviewed the OCT scans of subjects who had successfully responded to ranibizumab to look for factors that might predict which patients will not require injection and could have extended appointments. Method: This was a prospective study in which we observed, for 6 consecutive months, the OCT images of 28 subjects who had received intravitreal ranibizumab for nAMD and were judged to be clinically inactive at recruitment to the study. Ratios between full retinal thickness (FRT = neurosensory retina + outer reflective band) and outer reflective band (ORB) thickness at the fovea were calculated for each subject at study entry and at each successive visit for 6 consecutive months. Results: Patients with lower FRT/ORB ratios were less likely to require an additional injection of ranibizumab, and no subject with a ratio of 1.75 or less needed further injections. Conclusion: This small pilot study suggests that, on macular OCT, the FRT/ORB ratio, and in particular values of 1.75 or less, may prove to be a useful, practical tool when deciding the follow-up period for subjects undergoing treatment with intravitreal ranibizumab for nAMD.
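A minimal sketch of the decision rule this pilot study suggests: compute the ratio of full retinal thickness (FRT = neurosensory retina + outer reflective band) to the outer reflective band (ORB) at the fovea, and use the reported 1.75 cut-off to flag eyes that may tolerate an extended review interval. The thickness values below are illustrative only, and any clinical use would of course need validation beyond this small study.

```python
# Hedged sketch of the FRT/ORB ratio rule; thicknesses are invented.

def frt_orb_ratio(neurosensory_um: float, orb_um: float) -> float:
    """FRT is the neurosensory retina plus the outer reflective band."""
    frt = neurosensory_um + orb_um
    return frt / orb_um

def may_extend_review(ratio: float, cutoff: float = 1.75) -> bool:
    """In the study, no eye with a ratio of 1.75 or less needed further injections."""
    return ratio <= cutoff

r = frt_orb_ratio(neurosensory_um=140.0, orb_um=210.0)
print(f"FRT/ORB = {r:.2f}, extend review: {may_extend_review(r)}")
```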

Relevance:

30.00%

Publisher:

Abstract:

Despite high expectations, the industrial take-up of Semantic Web technologies in developing services and applications has been slower than expected. One of the main reasons is that many legacy systems have been developed without considering the potential of the Web for integrating services and sharing resources. Without a systematic methodology and proper tool support, the migration from legacy systems to Semantic Web Service-based systems can be a tedious and expensive process that carries a significant risk of failure. There is an urgent need for strategies that allow the migration of legacy systems to Semantic Web Services platforms, and for tools to support such strategies. In this paper we propose a methodology, with tool support, for transitioning these applications to Semantic Web Services, allowing users to migrate their applications to Semantic Web Services platforms automatically or semi-automatically. The transition of the GATE system is used as a case study. © 2009 IOS Press and the authors. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

In current organizations, valuable enterprise knowledge is often buried under a rapidly expanding mass of unstructured information in the form of web pages, blogs, and other forms of human text communication. We present a novel unsupervised machine learning method called CORDER (COmmunity Relation Discovery by named Entity Recognition) to turn these unstructured data into structured information for knowledge management in such organizations. CORDER exploits named entity recognition and co-occurrence data to associate individuals in an organization with their expertise and associates. We discuss the problems associated with evaluating unsupervised learners and report our initial evaluation experiments: an expert evaluation, a quantitative benchmark, and an application of CORDER in a social networking tool called BuddyFinder.
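A minimal sketch of the co-occurrence step at the heart of a CORDER-style method: given documents in which named entities have already been recognised, count how often entity pairs appear in the same text window and rank each person's strongest associations. Real CORDER adds the NER stage and a more refined relation-strength measure; the raw-count scoring and toy data here are simplifying assumptions.

```python
# Hedged sketch: entity co-occurrence counting and association ranking,
# standing in for CORDER's fuller relation-discovery pipeline.

from collections import Counter
from itertools import combinations

def cooccurrence_scores(docs_entities):
    """docs_entities: list of entity lists, one per document (or window)."""
    pair_counts = Counter()
    for entities in docs_entities:
        for a, b in combinations(sorted(set(entities)), 2):
            pair_counts[(a, b)] += 1
    return pair_counts

def top_associates(pair_counts, entity, n=3):
    """Rank the entities most often co-occurring with the given one."""
    scores = Counter()
    for (a, b), count in pair_counts.items():
        if entity == a:
            scores[b] = count
        elif entity == b:
            scores[a] = count
    return scores.most_common(n)

docs = [
    ["Alice", "ontology alignment", "Bob"],
    ["Alice", "ontology alignment"],
    ["Bob", "BuddyFinder", "Carol"],
]
print(top_associates(cooccurrence_scores(docs), "Alice"))
```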

Relevance:

30.00%

Publisher:

Abstract:

Objective: Development and validation of a selective and sensitive LC-MS method for the determination of methotrexate polyglutamates in dried blood spots (DBS). Methods: DBS samples (spiked or patient samples) were prepared by applying blood to Guthrie cards, which were then dried at room temperature. The method used 6-mm disks punched from the DBS samples (equivalent to approximately 12 μl of whole blood). The sample treatment procedure was based on protein precipitation using perchloric acid, followed by solid-phase extraction using MAX cartridges. The extracted sample was chromatographed using a reversed-phase system comprising an Atlantis T3-C18 column (3 μm, 2.1 × 150 mm) preceded by an Atlantis guard column of matching chemistry. Analytes were subjected to LC-MS analysis using positive electrospray ionization. Key Results: The method was linear over the range 5-400 nmol/L. The limits of detection and quantification were 1.6 and 5 nmol/L for individual polyglutamates and 1.5 and 4.5 nmol/L for total polyglutamates, respectively. The method was applied successfully to finger-prick DBS samples from 47 paediatric patients, and the results were confirmed against concentrations measured in matched red blood cell (RBC) samples using a conventional HPLC-UV technique. Conclusions and Clinical Relevance: The methodology has potential for application in a range of clinical studies (e.g. pharmacokinetic evaluations or medication adherence assessment), since it is minimally invasive and easy to perform, potentially allowing parents to take blood samples at home. The feasibility of DBS sampling could be of major value for future clinical trials and clinical care in paediatric rheumatology. © 2014 Hawwa et al.
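A minimal sketch of the calibration arithmetic behind figures like those reported above: fit a straight line to spiked calibrators across the 5-400 nmol/L range, then estimate LOD and LOQ from the residual standard deviation and the slope. The common 3.3*sigma/S and 10*sigma/S convention is assumed here rather than taken from the paper, and the response values are invented.

```python
# Hedged sketch: linear calibration and LOD/LOQ estimation using the
# usual 3.3*sigma/S and 10*sigma/S convention. Data are illustrative.

import numpy as np

conc = np.array([5, 25, 50, 100, 200, 400], dtype=float)   # nmol/L
peak_area = np.array([0.9, 4.1, 8.3, 16.2, 33.0, 65.8])    # invented LC-MS response

slope, intercept = np.polyfit(conc, peak_area, 1)
residuals = peak_area - (slope * conc + intercept)
sigma = np.std(residuals, ddof=2)  # residual SD (two fitted parameters)

print(f"LOD ~= {3.3 * sigma / slope:.1f} nmol/L")
print(f"LOQ ~= {10.0 * sigma / slope:.1f} nmol/L")
```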

Relevance:

30.00%

Publisher:

Abstract:

The expansion of the Internet has made the task of searching a crucial one. Internet users, however, have to make a great effort to formulate search queries that return the required results. Many methods have been devised to assist in this task by helping users modify their query to obtain better results. In this paper we propose an interactive method for query expansion. It is based on the observation that documents often contain terms of high information content that summarise their subject matter. We present experimental results which demonstrate that our approach significantly shortens the time required to accomplish a given task by means of web searches.
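A minimal sketch of the idea the paper builds on: terms with high information content in the retrieved documents can be offered to the user as interactive query-expansion candidates. TF-IDF is used as the information-content score here as a stand-in assumption; the paper's exact weighting scheme is not reproduced.

```python
# Hedged sketch: rank terms from retrieved documents by TF-IDF and offer
# the top ones as expansion candidates, excluding the query terms.

import math
from collections import Counter

def expansion_candidates(docs, query_terms, top_n=5):
    """docs: list of token lists; returns the top_n highest-scoring terms."""
    n_docs = len(docs)
    doc_freq = Counter(term for doc in docs for term in set(doc))
    scores = Counter()
    for doc in docs:
        tf = Counter(doc)
        for term, count in tf.items():
            if term in query_terms:
                continue
            idf = math.log(n_docs / doc_freq[term])
            scores[term] += (count / len(doc)) * idf
    return [term for term, _ in scores.most_common(top_n)]

docs = [
    "web thickness equivalent method girder design".split(),
    "plate girder web thickness buckling design".split(),
    "holiday recipes for summer".split(),
]
print(expansion_candidates(docs, query_terms={"web", "thickness"}))
```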