7 results for Target Held Method
in Aston University Research Archive
Abstract:
PURPOSE: To demonstrate the application of low-coherence reflectometry to the study of biometric changes during disaccommodation responses in human eyes after cessation of a near task, and to evaluate the effect of contact lenses on low-coherence reflectometry biometric measurements. METHODS: Ocular biometric parameters of crystalline lens thickness (LT) and anterior chamber depth (ACD) were measured with the LenStar device during and immediately after a 5 D accommodative task in 10 participants. In a separate trial, accommodation responses were recorded with a Shin-Nippon WAM-5500 optometer in a subset of two participants. Biometric data were interleaved to form a profile of post-task anterior segment changes. In a further experiment, the effect of soft contact lenses on LenStar measurements was evaluated in 15 participants. RESULTS: In the 10 adult participants, increased LT and reduced ACD were seen during the 5 D task. Post-task, during fixation of a 0 D target, a profile of the change in LT and ACD against time was observed. In the two participants with accommodation data (one a sufferer of nearwork-induced transient myopia and the other a non-sufferer), the post-task changes in refraction compared favorably with the interleaved LenStar biometry data. The insertion of soft contact lenses did not have a significant effect on LenStar measures of ACD or LT (mean change: -0.007 mm, p = 0.265 and +0.001 mm, p = 0.875, respectively). CONCLUSIONS: With the addition of a relatively simple stimulus modification, the LenStar instrument can be used to produce a profile of post-task changes in LT and ACD. The spatial and temporal resolution of the system is sufficient for the investigation of nearwork-induced transient myopia from a biometric viewpoint. LenStar measurements of ACD and LT remain valid after the fitting of soft contact lenses.
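The interleaving step described in the methods is essentially a resampling trick: a single LenStar run samples too slowly to resolve the disaccommodation time course, so repeated runs with staggered measurement onsets can be merged and sorted by time since task cessation. A minimal sketch of that general idea in Python; the sample period, offsets and readings below are assumed placeholders, not values from the paper:

```python
import numpy as np

# Hypothetical timing: assumed interval between LenStar readings and assumed
# stagger of each repeated run's first post-task reading (seconds)
sample_period_s = 3.0
offsets_s = [0.0, 1.0, 2.0]

def interleave_runs(runs, offsets_s, sample_period_s):
    """Each run is a list of (LT_mm, ACD_mm) readings taken at a fixed rate.
    Returns readings from all runs merged and sorted by time since task end."""
    merged = []
    for run, offset in zip(runs, offsets_s):
        for i, (lt, acd) in enumerate(run):
            t = offset + i * sample_period_s
            merged.append((t, lt, acd))
    merged.sort(key=lambda row: row[0])
    return np.array(merged)  # columns: time_s, LT_mm, ACD_mm

# Illustrative (fabricated) readings from three staggered runs
runs = [
    [(3.70, 3.10), (3.66, 3.14), (3.63, 3.17)],
    [(3.69, 3.11), (3.65, 3.15), (3.62, 3.18)],
    [(3.68, 3.12), (3.64, 3.16), (3.61, 3.19)],
]
print(interleave_runs(runs, offsets_s, sample_period_s))
```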
Abstract:
OBJECTIVES: To assess whether blood pressure control in primary care could be improved with the use of patient-held targets and self monitoring in a practice setting, and to assess the impact of these on health behaviours, anxiety, prescribed antihypertensive drugs, patients' preferences, and costs. DESIGN: Randomised controlled trial. SETTING: Eight general practices in south Birmingham. PARTICIPANTS: 441 people receiving treatment in primary care for hypertension but not controlled below the target of <140/85 mm Hg. INTERVENTIONS: Patients in the intervention group received treatment targets along with facilities to measure their own blood pressure at their general practice; they were also asked to visit their general practitioner or practice nurse if their blood pressure was repeatedly above the target level. Patients in the control group received usual care (blood pressure monitoring by their practice). MAIN OUTCOME MEASURES: Primary outcome: change in systolic blood pressure at six months and one year in both intervention and control groups. Secondary outcomes: change in health behaviours, anxiety, prescribed antihypertensive drugs, patients' preferences of method of blood pressure monitoring, and costs. RESULTS: 400 (91%) patients attended follow up at one year. Systolic blood pressure in the intervention group had reduced significantly after six months (mean difference 4.3 mm Hg (95% confidence interval 0.8 mm Hg to 7.9 mm Hg)) but not after one year (mean difference 2.7 mm Hg (-1.2 mm Hg to 6.6 mm Hg)). No overall difference was found in diastolic blood pressure, anxiety, health behaviours, or number of prescribed drugs. Patients who self monitored lost more weight than controls (as evidenced by a drop in body mass index), rated self monitoring above monitoring by a doctor or nurse, and consulted less often. Overall, self monitoring did not cost significantly more than usual care (251 pounds sterling (437 dollars; 364 euros) (95% confidence interval 233 pounds sterling to 275 pounds sterling) versus 240 pounds sterling (217 pounds sterling to 263 pounds sterling)). CONCLUSIONS: Practice based self monitoring resulted in small but significant improvements in blood pressure at six months, which were not sustained after a year. Self monitoring was well received by patients, anxiety did not increase, and there was no appreciable additional cost. Practice based self monitoring is feasible and results in blood pressure control that is similar to that in usual care.
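The effect estimates reported above are between-group mean differences with 95% confidence intervals. A hedged arithmetic sketch of that calculation using a simple normal approximation (the trial's own analysis may have adjusted for baseline; the group sizes and summary statistics below are placeholders, not the trial's data):

```python
import math

def mean_diff_ci(mean1, sd1, n1, mean2, sd2, n2, z=1.96):
    """Difference in group means with an approximate 95% confidence interval."""
    diff = mean1 - mean2
    se = math.sqrt(sd1**2 / n1 + sd2**2 / n2)
    return diff, diff - z * se, diff + z * se

# Placeholder summary statistics (NOT the trial's actual data)
diff, lo, hi = mean_diff_ci(mean1=148.0, sd1=18.0, n1=200,
                            mean2=143.7, sd2=18.0, n2=200)
print(f"mean difference {diff:.1f} mm Hg (95% CI {lo:.1f} to {hi:.1f})")
```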
Abstract:
Purpose: To analyse the relationship between measured intraocular pressure (IOP) and central corneal thickness (CCT), corneal hysteresis (CH) and corneal resistance factor (CRF) in ocular hypertension (OHT), primary open-angle glaucoma (POAG) and normal tension glaucoma (NTG) eyes using multiple tonometry devices. Methods: Right eyes of patients diagnosed with OHT (n=47), NTG (n=17) and POAG (n=50) were assessed. IOP was measured in random order with four devices: Goldmann applanation tonometry (GAT); the Pascal® dynamic contour tonometer (DCT); the Reichert® ocular response analyser (ORA); and the Tono-Pen® XL. CCT was then measured using a hand-held ultrasonic pachymeter. CH and CRF were derived from the air pressure to corneal reflectance relationship of the ORA data. Results: Compared with GAT, the Tono-Pen and the ORA Goldmann-equivalent (IOPg) and corneal-compensated (IOPcc) measures gave higher IOP readings (F=19.351, p<0.001), particularly in NTG (F=12.604, p<0.001). DCT was closest to Goldmann IOP and had the lowest variance. CCT differed significantly (F=8.305, p<0.001) among the three conditions, as did CH (F=6.854, p=0.002) and CRF (F=19.653, p<0.001). IOPcc measures were not affected by CCT. The DCT was generally not affected by corneal biomechanical factors. Conclusion: This study suggests that, as the true pressure of the eye cannot be determined non-invasively, measurements from any tonometer should be interpreted with care, particularly when alterations in the corneal tissue are suspected.
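One simple way to probe whether a tonometer's readings depend on corneal thickness, in the spirit of the comparison above, is to regress measured IOP on CCT for each device and inspect the slope. This is only an illustrative sketch (the arrays below are placeholders, and the paper's own statistics were ANOVA-based rather than this regression):

```python
import numpy as np
from scipy.stats import linregress

# Placeholder data for one device: CCT in micrometres, IOP in mm Hg
cct_um = np.array([480, 510, 525, 540, 555, 570, 590, 610], dtype=float)
iop_mmhg = np.array([13.5, 14.8, 15.2, 16.0, 16.9, 17.3, 18.4, 19.1])

# A positive slope with a small p-value suggests readings rise with CCT;
# a flat slope (as reported for IOPcc) suggests CCT-independence.
fit = linregress(cct_um, iop_mmhg)
print(f"slope = {fit.slope:.3f} mm Hg per um, p = {fit.pvalue:.4f}")
```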
Abstract:
Biomass is projected to account for approximately half of the new energy production required to achieve the 2020 primary energy target in the UK. Combined heat and power (CHP) bioenergy systems are a highly efficient method of energy conversion and, at smaller scales, a significant proportion of the heat produced can also be effectively utilised for hot water, space heating or industrial heating purposes. However, there are many barriers to project development, and this has greatly inhibited deployment in the UK. Project viability is highly sensitive to changes in policy, regulation, the finance market and the low-cost incumbent: a high-carbon centralised energy system. Unidentified or unmitigated barriers occurring during the project lifecycle may not only have a negative impact on the project but could ultimately lead to project failure. The research develops a decision support system (DSS) for small-scale (500 kWe to 10 MWe) biomass combustion CHP project development and risk management in the early stages of a potential project's lifecycle. By supporting developers in the early stages of project development with financial, scheduling and risk management analysis, the research aims to reduce the barriers identified and streamline decision-making. A fuzzy methodology is also applied throughout the developed DSS to support developers in handling the uncertain or approximate information often held at the early stages of the project lifecycle. The DSS is applied to a case study of a recently failed (2011) small-scale biomass CHP project to demonstrate its applicability and benefits. The application highlights that the proposed development within the case study was not viable. Moreover, further analysis of the possible barriers with the DSS confirmed that modifications to the project could have improved its viability, such as a change of feedstock to a waste or residue, addressing the unnecessary land lease cost, or increasing heat utilisation on site. This analysis is further supported by a practitioner evaluation survey that confirms the research contribution and objectives are achieved.
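Fuzzy handling of early-stage uncertainty of the kind mentioned above is often done with triangular fuzzy numbers: each uncertain cost or revenue is carried as a (low, most likely, high) triple, combined with interval-style arithmetic, and defuzzified only when a crisp figure is needed. A minimal sketch of that general idea, not the thesis's actual DSS, with all figures as placeholders:

```python
from dataclasses import dataclass

@dataclass
class TFN:
    """Triangular fuzzy number: (low, most likely, high)."""
    a: float
    b: float
    c: float

    def __add__(self, other):
        return TFN(self.a + other.a, self.b + other.b, self.c + other.c)

    def __sub__(self, other):
        # Fuzzy subtraction widens uncertainty: low minus other's high, etc.
        return TFN(self.a - other.c, self.b - other.b, self.c - other.a)

    def centroid(self):
        # Simple defuzzification to a single crisp value
        return (self.a + self.b + self.c) / 3.0

# Placeholder annual figures (thousand GBP) for a hypothetical small-scale CHP project
revenue = TFN(900, 1100, 1300)   # heat and power sales, uncertain utilisation
feedstock = TFN(350, 420, 520)   # fuel cost, uncertain market price
land_lease = TFN(60, 80, 110)    # the kind of cost the case study flagged

margin = revenue - (feedstock + land_lease)
print("fuzzy annual margin:", margin, "crisp estimate:", round(margin.centroid(), 1))
```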
Abstract:
Recent research suggests that the ability of an extraneous formant to impair intelligibility depends on the variation of its frequency contour. This idea was explored using a method that ensures interference cannot occur through energetic masking. Three-formant (F1+F2+F3) analogues of natural sentences were synthesized using a monotonous periodic source. Target formants were presented monaurally, with the target ear assigned randomly on each trial. A competitor for F2 (F2C) was presented contralaterally; listeners must reject F2C to optimize recognition. In experiment 1, F2Cs with various frequency and amplitude contours were used. F2Cs with time-varying frequency contours were effective competitors; constant-frequency F2Cs had far less impact. To a lesser extent, amplitude contour also influenced competitor impact; this effect was additive. In experiment 2, F2Cs were created by inverting the F2 frequency contour about its geometric mean and varying its depth of variation over a range from constant to twice the original (0%-200%). The impact on intelligibility was least for constant F2Cs and increased up to ∼100% depth, but little thereafter. The effect of an extraneous formant depends primarily on its frequency contour; interference increases as the depth of variation is increased until the range exceeds that typical for F2 in natural speech.
Abstract:
Recent research suggests that the ability of an extraneous formant to impair intelligibility depends on the variation of its frequency contour. This idea was explored using a method that ensures interference occurs only through informational masking. Three-formant analogues of sentences were synthesized using a monotonous periodic source (F0 = 140 Hz). Target formants were presented monaurally; the target ear was assigned randomly on each trial. A competitor for F2 (F2C) was presented contralaterally; listeners must reject F2C to optimize recognition. In experiment 1, F2Cs with various frequency and amplitude contours were used. F2Cs with time-varying frequency contours were effective competitors; constant-frequency F2Cs had far less impact. Amplitude contour also influenced competitor impact; this effect was additive. In experiment 2, F2Cs were created by inverting the F2 frequency contour about its geometric mean and varying its depth of variation over a range from constant to twice the original (0–200%). The impact on intelligibility was least for constant F2Cs and increased up to ~100% depth, but little thereafter. The effect of an extraneous formant depends primarily on its frequency contour; interference increases as the depth of variation is increased until the range exceeds that typical for F2 in natural speech.
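The competitor construction described in experiment 2 of both studies above can be pictured as mirroring the F2 track about its geometric mean on a log-frequency axis and then scaling the size of the excursions. A sketch under that interpretation only; the contour values are placeholders and the exact synthesis details belong to the original studies:

```python
import numpy as np

def make_f2c(f2_hz, depth):
    """Invert an F2 contour about its geometric mean (in log frequency) and
    scale its depth of variation: depth=0.0 gives a constant contour at the
    geometric mean, 1.0 the mirror-image contour, 2.0 twice the original depth."""
    log_f2 = np.log(f2_hz)
    log_gm = log_f2.mean()                      # log of the geometric mean
    return np.exp(log_gm - depth * (log_f2 - log_gm))

# Placeholder F2 contour (Hz) sampled along one sentence
f2 = np.array([1100.0, 1350.0, 1600.0, 1450.0, 1200.0, 1000.0])
for depth in (0.0, 0.5, 1.0, 2.0):              # 0%-200% depth of variation
    print(depth, np.round(make_f2c(f2, depth), 1))
```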
Abstract:
This study suggests a novel application of Inverse Data Envelopment Analysis (InvDEA) in strategic decision making about mergers and acquisitions in banking. Conventional DEA assesses the efficiency of banks based on information about the quantities of inputs used to realize the observed level of outputs produced. The decision maker of a banking unit that intends to merge with or acquire another banking unit needs to decide on the input and/or output levels required if an efficiency target is set for the new banking unit. In this paper, a new InvDEA-based approach is developed to suggest the required level of the inputs and outputs for the merged bank to reach a predetermined efficiency target. This study illustrates the novelty of the proposed approach through the case of a bank considering merging with or acquiring one of its competitors to achieve synergy and realize a higher level of efficiency. A real data set of 42 banking units in Gulf Cooperation Council countries is used to show the practicality of the proposed approach.
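As background to the inverse formulation, the conventional efficiency score that an InvDEA approach starts from can be computed as a small linear programme per bank. A hedged sketch of the standard input-oriented CCR envelopment model using scipy, with placeholder inputs and outputs rather than the 42-bank data set, and not the paper's inverse model itself:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR (envelopment form) efficiency of DMU o.
    X: inputs (m x n), Y: outputs (s x n); columns are DMUs (banks).
    Solves: min theta  s.t.  X @ lam <= theta * X[:, o],  Y @ lam >= Y[:, o],  lam >= 0."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                 # decision variables: [theta, lam_1..lam_n]
    A_in = np.hstack([-X[:, [o]], X])           # X @ lam - theta * x_o <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])   # -Y @ lam <= -y_o
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    bounds = [(None, None)] + [(0, None)] * n   # theta free, lam >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

# Placeholder data: 2 inputs (staff, assets) and 1 output (loans) for 4 hypothetical banks
X = np.array([[20., 30., 40., 25.],
              [100., 150., 160., 120.]])
Y = np.array([[300., 400., 420., 380.]])
print([round(ccr_efficiency(X, Y, o), 3) for o in range(4)])
```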