Abstract:
Aim: To compare the diagnostic performance of accredited glaucoma optometrists (AGO), for both the diagnosis of glaucoma and the decision to treat, with that of routine hospital eye care, against a reference standard of expert opinion (a consultant ophthalmologist with a special interest in glaucoma). Methods: A directly comparative, masked performance study was undertaken in Grampian, Scotland. Of 165 people invited to participate, 100 (61%) were examined. People suspected of having glaucoma underwent, within one month, a full ophthalmic assessment in both a newly established community optometry-led glaucoma management scheme and a consultant-led hospital eye service. Results: Agreement between the AGO and the consultant ophthalmologist in diagnosing glaucoma was substantial (89%; κ = 0.703, SE = 0.083). Agreement over the need for treatment was also substantial (88%; κ = 0.716, SE = 0.076). Agreement between the trainee ophthalmologists and the consultant ophthalmologist in the diagnosis of glaucoma and in the treatment recommendation was moderate (83%, κ = 0.541, SE = 0.098; and 81%, κ = 0.553, SE = 0.090, respectively). The diagnostic accuracy of the optometrists in detecting glaucoma in this population was high for specificity (0.93 (95% confidence interval, 0.85 to 0.97)) but lower for sensitivity (0.76 (0.57 to 0.89)). Performance was similar when accuracy was assessed for treatment recommendation (sensitivity 0.73 (0.57 to 0.85); specificity 0.96 (0.88 to 0.99)). The differences in sensitivity and specificity between the AGO and junior ophthalmologists were not statistically significant. Conclusions: Community optometrists trained in glaucoma provided satisfactory decisions regarding the diagnosis and initiation of treatment for glaucoma. With such additional training, optometrists are at least as accurate as junior ophthalmologists, although some cases of glaucoma are missed.
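The agreement and accuracy statistics above (percentage agreement, Cohen's κ, sensitivity, specificity) all follow from a 2×2 table of decisions. A minimal sketch in Python; the counts below are hypothetical, chosen only to be broadly consistent with the reported figures, and are not the study's data:

```python
def cohens_kappa(table):
    """Cohen's kappa for a square agreement table (list of rows of counts)."""
    n = sum(sum(row) for row in table)
    k = len(table)
    p_observed = sum(table[i][i] for i in range(k)) / n
    # Chance agreement: sum of products of row and column marginal proportions.
    p_expected = sum(
        (sum(table[i]) / n) * (sum(row[i] for row in table) / n)
        for i in range(k)
    )
    return (p_observed - p_expected) / (1 - p_expected)

def sensitivity_specificity(tp, fp, fn, tn):
    """Accuracy of an index test against a reference standard."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts: rows = optometrist decision (glaucoma / not),
# columns = consultant decision (glaucoma / not).
table = [[22, 5], [7, 66]]
kappa = cohens_kappa(table)
sens, spec = sensitivity_specificity(tp=22, fp=5, fn=7, tn=66)
```

With these illustrative counts the observed agreement is 88% and κ ≈ 0.70, showing how a high percentage agreement can coexist with only moderate sensitivity.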
Abstract:
PURPOSE: To identify vision patient-reported outcome (PRO) instruments relevant to glaucoma and assess their content validity.
METHODS: MEDLINE, MEDLINE in Process, EMBASE and SCOPUS (to January 2009) were systematically searched. Observational studies or randomised controlled trials, published in English, reporting use of vision instruments in glaucoma studies involving adults were included. In addition, reference lists were scanned to identify additional studies describing development and/or validation to ascertain the final version of the instruments. Instruments' content was then mapped onto a theoretical framework, the World Health Organization International Classification of Functioning, Disability and Health. Two reviewers independently evaluated studies for inclusion and quality assessed instrument content.
RESULTS: Thirty-three instruments were identified. According to their content, instruments were categorised into thirteen vision status, two vision disability, one vision satisfaction, five glaucoma status, one glaucoma medication-related health status, five glaucoma medication side effects and six glaucoma medication satisfaction measures. The National Eye Institute Visual Function Questionnaire-25, the Impact of Vision Impairment and the Treatment Satisfaction Survey-Intraocular Pressure had the highest number of positive ratings in the content validity assessment.
CONCLUSION: This study provides a descriptive catalogue of vision-specific PRO instruments, to inform the choice of an appropriate measure of patient-reported outcomes in a glaucoma context.
Abstract:
PURPOSE: To evaluate the relative benefits and to identify any adverse effects of surgical interventions for limbal stem cell deficiency (LSCD).
DESIGN: Systematic literature review.
METHODS: We searched the following electronic databases from January 1, 1989 through September 30, 2006: MEDLINE, EMBASE, Science citation index, BIOSIS, and the Cochrane Library. In addition, reference lists were scanned to identify any additional reports. The quality of published reports was assessed using standard methods. The main outcome measure was improvement in vision of at least two Snellen lines of best-corrected visual acuity (BCVA). Data on adverse outcomes also were collected.
RESULTS: Twenty-six studies met the inclusion criteria. There were no randomized controlled studies. All 26 studies were either prospective or retrospective case series. For bilateral severe LSCD, keratolimbal allograft was the most common intervention with systemic immunosuppression. Other interventions included eccentric penetrating keratolimbal allografts and cultivated autologous oral mucosal epithelial grafts. An improvement in BCVA of two lines or more was reported in 31% to 67% of eyes. For unilateral severe LSCD, the most common surgical intervention was contralateral conjunctival limbal autograft, with 35% to 88% of eyes gaining an improvement in BCVA of two lines or more. The only study evaluating partial LSCD showed an improvement in BCVA of two lines or more in 39% of eyes.
CONCLUSIONS: Studies to date have not provided strong evidence to guide clinical practice on which surgery is most beneficial to treat various types of LSCD. Standardized data collection in a multicenter LSCD register is suggested.
Abstract:
OBJECTIVE: To assess the agreement of tonometers available for clinical practice with the Goldmann applanation tonometer (GAT), the most commonly accepted reference device.
DESIGN: A systematic review and meta-analysis of directly comparative studies assessing the agreement of 1 or more tonometers with the reference tonometer (GAT).
PARTICIPANTS: A total of 11 582 participants (15 525 eyes) were included.
METHODS: Summary 95% limits of agreement (LoA) were produced for each comparison.
MAIN OUTCOME MEASURES: Agreement, recordability, and reliability.
RESULTS: A total of 102 studies, including 130 paired comparisons, were included, representing 8 tonometers: dynamic contour tonometer, noncontact tonometer (NCT), ocular response analyzer, Ocuton S, handheld applanation tonometer (HAT), rebound tonometer, transpalpebral tonometer, and Tono-Pen. The agreement (95% limits) seemed to vary across tonometers: 0.2 mmHg (-3.8 to 4.3 mmHg) for the NCT to 2.7 mmHg (-4.1 to 9.6 mmHg) for the Ocuton S. The estimated proportion within 2 mmHg of the GAT ranged from 33% (Ocuton S) to 66% and 59% (NCT and HAT, respectively). Substantial inter- and intraobserver variability was observed for all tonometers.
CONCLUSIONS: The NCT and HAT seem to achieve a measurement closest to the GAT. However, there was substantial variability in measurements both within and between studies.
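The 95% limits of agreement summarised in this review are the Bland-Altman statistic: the mean of the paired differences ± 1.96 times their standard deviation. A minimal sketch, using hypothetical paired IOP readings rather than data from the review:

```python
from statistics import mean, stdev

def limits_of_agreement(test_vals, ref_vals):
    """Bias (mean difference) and 95% limits of agreement (bias ± 1.96 SD)."""
    diffs = [t - r for t, r in zip(test_vals, ref_vals)]
    bias = mean(diffs)
    sd = stdev(diffs)            # sample SD of the paired differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired readings in mmHg (illustrative only).
nct = [15.2, 18.1, 21.4, 14.8, 25.0, 17.3]
gat = [14.9, 17.5, 22.0, 14.0, 24.2, 17.8]
bias, (lower, upper) = limits_of_agreement(nct, gat)
```

Roughly 95% of differences between the two tonometers are expected to fall between `lower` and `upper`, which is why a wide interval (as for the Ocuton S above) signals poor interchangeability even when the mean bias is small.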
Abstract:
BACKGROUND: Open angle glaucoma (OAG) is the commonest cause of irreversible blindness worldwide. OBJECTIVES: To study the relative effects of medical and surgical treatment of OAG. SEARCH STRATEGY: We searched the Cochrane Central Register of Controlled Trials (The Cochrane Library Issue 1, 2005), MEDLINE (1966 to February 2005), EMBASE (1988 to February 2005), and reference lists of articles. We also contacted researchers in the field. SELECTION CRITERIA: Randomised controlled trials comparing medications to surgery in adults. DATA COLLECTION AND ANALYSIS: Two authors independently assessed trial quality and extracted data. We contacted trial investigators for missing information. MAIN RESULTS: Four trials involving 888 participants with previously untreated OAG were included. Surgery was Scheie's procedure in one trial and trabeculectomy in three trials. Primary medication was usually pilocarpine in three trials and a beta-blocker in one trial. In the most recent trial, in participants with mild OAG, progressive visual field (VF) loss, after adjustment for cataract surgery, was not significantly different for medications compared to trabeculectomy (odds ratio (OR) 0.74; 95% CI 0.54 to 1.01). Reduced vision, a higher risk of developing cataract (OR 2.69, 95% CI 1.64 to 4.42), and more patient discomfort were more likely with trabeculectomy than with medication. There is some evidence, from three trials, that for people with moderately advanced glaucoma, medication is associated with more progressive VF loss and 6 to 8 mmHg less intraocular pressure (IOP) lowering than surgery, whether by Scheie's procedure or trabeculectomy. There was a trend towards an increased risk of failed IOP control over time for initial pilocarpine treatment compared to trabeculectomy. In the longer term (two trials), the risk of failure was significantly greater with medication than with trabeculectomy (OR 3.90, 95% CI 1.60 to 9.53; HR 7.27, 95% CI 2.23 to 25.71).
Medicine and surgery have evolved since these trials were undertaken, and the evidence is additionally subject to potential detection and attrition bias. AUTHORS' CONCLUSIONS: Evidence from one trial suggests that, for mild OAG, VF deterioration up to five years is not significantly different whether treatment is initiated with medication or trabeculectomy. Reduced vision, cataract and eye discomfort are more likely with trabeculectomy. There is some evidence, for more severe OAG, that initial medication (pilocarpine, now rarely used as first-line medication) is associated with greater VF deterioration than surgery. In general, surgery lowers IOP more than medication. There was no evidence to determine the effectiveness of contemporary medications (prostaglandin analogues, alpha2-agonists and topical carbonic anhydrase inhibitors) compared to surgery in severe OAG, or in people of black African ethnic origin, who have a greater risk of more severe open angle glaucoma. More research is required.
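The odds ratios quoted in this review (e.g. OR 2.69, 95% CI 1.64 to 4.42 for cataract) are derived from 2×2 event tables with a log-scale (Woolf) confidence interval. A sketch with purely hypothetical counts:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR for the 2x2 table [[a, b], [c, d]] with a Woolf (log-scale) 95% CI.

    a, b = events / non-events in group 1; c, d = events / non-events in group 2.
    """
    odds_ratio = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    log_or = math.log(odds_ratio)
    return odds_ratio, (math.exp(log_or - z * se_log_or),
                        math.exp(log_or + z * se_log_or))

# Hypothetical counts: cataract yes/no in surgery vs medication arms.
or_est, (ci_lo, ci_hi) = odds_ratio_ci(a=40, b=60, c=20, d=80)
```

An interval excluding 1 (as in the cataract result above) indicates a statistically significant difference in odds between the arms.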
Abstract:
BACKGROUND: Glaucoma is the leading cause of irreversible blindness. Although primary open-angle glaucoma is more common, primary angle-closure glaucoma (PACG) is more likely to result in irreversible blindness. By 2020, 5·3 million people worldwide will be blind because of PACG. The current standard of care for PACG is a stepped approach combining laser iridotomy surgery (to open the drainage angle) and medical treatment (to reduce intraocular pressure). If these treatments fail, glaucoma surgery (eg, trabeculectomy) is indicated. It has been proposed that, because the lens of the eye plays a major role in the mechanisms leading to PACG, early clear lens extraction will improve glaucoma control by opening the drainage angle. This procedure might reduce the need for drugs and glaucoma surgery, maintain good visual acuity, and improve quality of life compared with standard care. EAGLE aims to evaluate whether early lens extraction improves patient-reported and clinical outcomes, and cost-effectiveness, compared with standard care.
METHODS/DESIGN: EAGLE is a multicentre pragmatic randomized trial. All people presenting to the recruitment centres in the UK and east Asia with newly diagnosed PACG and who are at least 50 years old are eligible. The primary outcomes are EQ-5D, intraocular pressure, and incremental cost per quality-adjusted life year (QALY) gained. Other outcomes are: vision- and glaucoma-specific patient-reported outcomes, visual acuity, visual field, angle closure, number of medications, additional surgery (e.g., trabeculectomy), costs to the health services and patients, and adverse events. A single main analysis will be done at the end of the trial, after three years of follow-up. The analysis will be based on all participants as randomized (intention to treat). 400 participants (200 in each group) will be recruited, to give 90% power at the 5% significance level to detect a difference in EQ-5D score between the two groups of 0·05 and a mean difference in intraocular pressure of 1·75 mm Hg. The study will have 80% power to detect a difference of 15% in the glaucoma surgery rate.
TRIAL REGISTRATION: ISRCTN44464607.
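The sample-size statement above (400 participants for 90% power at the 5% level to detect an EQ-5D difference of 0·05) can be reproduced approximately with the standard two-sample normal-approximation formula. The EQ-5D standard deviation used below is an assumed value for illustration, not a figure taken from the trial protocol:

```python
import math
from statistics import NormalDist

def n_per_group(delta, sd, alpha=0.05, power=0.90):
    """Participants per arm to detect a mean difference `delta` (two-sided test)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # critical value for alpha/2
    z_beta = NormalDist().inv_cdf(power)            # critical value for power
    return math.ceil(2 * ((z_alpha + z_beta) * sd / delta) ** 2)

# Assumed EQ-5D standard deviation of 0.15 (an illustrative value only):
n = n_per_group(delta=0.05, sd=0.15)
```

Under this assumed SD the formula gives roughly 190 per arm, in the same range as the 200 per group actually recruited once allowance is made for loss to follow-up.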
Abstract:
Soil food webs are characterised by complex direct and indirect effects among organisms. Consumption of microorganisms by soil animals is considered an important factor contributing to the stability of communities, although cascading effects within the food web can be difficult to detect. In a greenhouse experiment, large numbers of the fungal-feeding collembolan Folsomia quadrioculata were added to grassland soil food webs in monocultures of three plant species: Plantago lanceolata (forb), Lotus corniculatus (legume) and Holcus lanatus (grass). The abundance of microorganisms, determined from phospholipid fatty acid (PLFA) abundances, and the abundances of resident invertebrates (nematodes and collembolans) did not change following the addition of F. quadrioculata. Trophic positions of collembolans were determined by analysis of the natural abundance of the 15N stable isotope. The use of food resources by microorganisms and collembolans was determined by 13C analysis of microbial PLFAs and of whole collembolan samples. δ13C values of the resident collembolan Folsomia fimetaria were lower in the presence of F. quadrioculata than in control food webs, indicating the use of more 13C-depleted food resources by F. fimetaria. The δ15N values of F. fimetaria did not change upon addition of F. quadrioculata, so no change of trophic level was detected. The switch of F. fimetaria to a different food resource could be due to indirect interactions in the food web, as the two collembolan species occupied different trophic positions according to their different δ15N values. © 2008 Elsevier Ltd. All rights reserved.
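The δ13C and δ15N values discussed above are per-mil (‰) deviations of a sample's isotope ratio from an international standard. A minimal sketch of that calculation; the sample ratio below is hypothetical, and the VPDB reference ratio is the commonly cited value:

```python
R_VPDB = 0.0111802   # commonly cited 13C/12C ratio of the VPDB standard

def delta_per_mil(r_sample, r_standard):
    """Delta value in per mil: (R_sample / R_standard - 1) * 1000."""
    return (r_sample / r_standard - 1) * 1000.0

# A hypothetical sample depleted in 13C relative to VPDB:
d13c = delta_per_mil(0.0108700, R_VPDB)   # negative => 13C-depleted
```

More negative δ13C thus means a more 13C-depleted food resource, which is how the shift in F. fimetaria's diet was inferred.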
Abstract:
The application of slurry nutrients to land can be associated with unintended losses to the environment, depending on soil and weather conditions. Correct timing of slurry application, however, can increase plant nutrient uptake and reduce losses. A decision support system (DSS), which predicts optimum conditions for slurry spreading based on the Hybrid Soil Moisture Deficit (HSMD) model, was investigated for use as a policy tool. The DSS recommendations were compared with farmer perception of suitable conditions for slurry spreading for three soil drainage classes (well, moderately and poorly drained), to better understand on-farm slurry management practices and to identify potential conflict with farmer opinion. Six farmers participated in a survey over two and a half years, during which they completed a daily diary; their responses were compared with Soil Moisture Deficit (SMD) calculations and weather data recorded by on-farm meteorological stations. The perception of land drainage quality differed between farmers and was related to their local knowledge and experience. Allocation of grass fields to HSMD drainage classes using a visual assessment method was found to align with farmer perception of drainage at the national scale. Farmer opinion corresponded to the theoretical understanding that slurry should not be applied when the soil is wetter than field capacity, i.e. when drainage can occur. While weather and soil conditions (especially trafficability) were the principal reasons farmers gave for not spreading slurry, farm management practices (grazing and silage) and current Nitrates Directive policies (a closed winter period for spreading), combined with limited storage capacities, were obstacles to the utilisation of slurry nutrients.
Despite the slightly more restrictive advice of the DSS regarding the number of suitable spreading opportunities, the system has the potential to address an information deficit and thereby help farmers reduce nutrient losses and optimise plant nutrient uptake through improved slurry management. The DSS advice was in general agreement with the farmers' views; they should therefore not be resistant to adopting the tool for day-to-day management.
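The SMD logic underlying such a DSS is a daily water balance: the deficit shrinks with rainfall and grows with evapotranspiration, and spreading is advised only when the soil is at or drier than field capacity. A simplified sketch (the drainage-class handling and calibrated parameters of the full HSMD model are omitted; all values are illustrative):

```python
def update_smd(smd, rain_mm, et_mm):
    """One day's soil moisture deficit update (mm).

    SMD <= 0 means the soil is at or wetter than field capacity,
    so drainage (and nutrient loss) can occur.
    """
    return smd - rain_mm + et_mm

def ok_to_spread(smd, threshold_mm=0.0):
    """Advise spreading only when the soil is drier than the threshold."""
    return smd > threshold_mm

# Three illustrative days of (rain, evapotranspiration) in mm:
smd = 5.0   # start 5 mm drier than field capacity
for rain, et in [(10.0, 1.0), (0.0, 3.0), (2.0, 2.5)]:
    smd = update_smd(smd, rain, et)
# After the wet spell the soil ends up wetter than field capacity,
# so the rule advises against spreading.
```

Tracking only a single running deficit per field is what makes an SMD-based rule cheap enough to run daily from standard weather-station inputs.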
Abstract:
We present pollen records from three sites in south Westland, New Zealand, that document past vegetation and inferred climate change between approximately 30,000 and 15,000 cal. yr BP. Detailed radiocarbon dating of the enclosing sediments at one of those sites, Galway tarn, provides a more robust chronology for the structure and timing of climate-induced vegetation change than has previously been possible in this region. The Kawakawa/Oruanui tephra, a key isochronous marker, affords a precise stratigraphic link across all three pollen records, while other tie points are provided by key pollen-stratigraphic changes which appear to be synchronous across all three sites. Collectively, the records show three episodes in which grassland, interpreted as indicating mostly cold subalpine to alpine conditions, was prevalent in lowland south Westland, separated by phases dominated by subalpine shrubs and montane-lowland trees, indicating milder interstadial conditions. Dating, expressed as a Bayesian-estimated single 'best' age followed in parentheses by younger/older bounds of the 95% confidence modelled age range, indicates that a cold stadial episode, whose onset was marked by replacement of woodland by grassland, occurred between 28,730 (29,390-28,500) and 25,470 (26,090-25,270) cal. yr BP (years before AD 1950), prior to the deposition of the Kawakawa/Oruanui tephra. Milder interstadial conditions prevailed between 25,470 (26,090-25,270) and 24,400 (24,840-24,120) cal. yr BP and between 22,630 (22,930-22,340) and 21,980 (22,210-21,580) cal. yr BP, separated by a return to cold stadial conditions between 24,400 and 22,630 cal. yr BP. A final episode of grass-dominated vegetation, indicating cold stadial conditions, occurred from 21,980 (22,210-21,580) to 18,490 (18,670-17,950) cal. yr BP. The decline in grass pollen, indicating progressive climate amelioration, was well advanced by 17,370 (17,730-17,110) cal.
yr BP, indicating that the onset of the termination in south Westland occurred sometime between ca 18,490 and ca 17,370 cal. yr BP. A similar general pattern of stadials and interstadials is seen, to varying degrees of resolution but generally with lesser chronological control, in many other paleoclimate proxy records from the New Zealand region. This highly resolved chronology of vegetation changes from southwestern New Zealand contributes to the examination of past climate variations in the southwest Pacific region. The stadial and interstadial episodes defined by south Westland pollen records represent notable climate variability during the latter part of the Last Glaciation. Similar climatic patterns recorded farther afield, for example from Antarctica and the Southern Ocean, imply that climate variations during the latter part of the Last Glaciation and the transition to the Holocene interglacial were inter-regionally extensive in the Southern Hemisphere and thus important to understand in detail and to place into a global context. © 2013 Elsevier Ltd. All rights reserved.
Abstract:
Several observational studies have suggested the potential benefit of internal limiting membrane (ILM) peeling to treat idiopathic full-thickness macular hole (FTMH). However, no strong evidence is available on the potential benefit(s) of this surgical manoeuvre and uncertainty remains among vitreoretinal surgeons about the indication for peeling the ILM, whether to use it in all cases or in long-standing and/or larger holes.
Abstract:
Prediction of biotic responses to future climate change in tropical Africa tends to be based on two modelling approaches: bioclimatic species envelope models and dynamic vegetation models. Another complementary but underused approach is to examine biotic responses to similar climatic changes in the past as evidenced in fossil and historical records. This paper reviews these records and highlights the information that they provide in terms of understanding the local- and regional-scale responses of African vegetation to future climate change. A key point that emerges is that a move to warmer and wetter conditions in the past resulted in a large increase in biomass and a range distribution of woody plants up to 400–500 km north of its present location, the so-called greening of the Sahara. By contrast, a transition to warmer and drier conditions resulted in a reduction in woody vegetation in many regions and an increase in grass/savanna-dominated landscapes. The rapid rate of climate warming coming into the current interglacial resulted in a dramatic increase in community turnover, but there is little evidence for widespread extinctions. However, huge variation in biotic response in both space and time is apparent with, in some cases, totally different responses to the same climatic driver. This highlights the importance of local features such as soils, topography and also internal biotic factors in determining responses and resilience of the African biota to climate change, information that is difficult to obtain from modelling but is abundant in palaeoecological records.
Abstract:
To explore the presentation behaviours and pathways to detection of adults who first presented to UK hospital eye services with severe glaucoma.
Abstract:
The response of arsenate-tolerant and non-tolerant Holcus lanatus L. phenotypes, where tolerance is achieved through suppression of high-affinity phosphate/arsenate root uptake, was investigated under different growth regimes, to examine why a polymorphism in tolerance is found in populations growing on uncontaminated soil. Tolerant plants screened from an arsenic-uncontaminated population, when grown on soil from the population's origin, differed from non-tolerants in their biomass allocation under phosphate fertilization: non-tolerants put more resources into tiller production and down-regulated investment in root production under phosphate fertilization, while tolerants tillered less effectively and did not alter resource allocation to shoot biomass under phosphate fertilization. The two phenotypes also differed in their shoot mineral status, with higher concentrations of copper, cadmium, lead and manganese, whereas phosphorus status differed little, suggesting tight homeostasis. The polymorphism was also widely present (40%) in other wild grass species, suggesting an important ecological role for this gene, which can be screened for through plant root response to arsenate.