142 results for Corneal diseases
Abstract:
Purpose To examine the influence of short-term miniscleral contact lens wear on corneal shape, thickness and anterior surface aberrations. Methods Scheimpflug imaging was captured before, immediately following, and 3 hours after a short period (3 hours) of miniscleral contact lens wear for 10 young (mean 27 ± 5 years), healthy participants. Natural diurnal variations were accounted for by measuring baseline diurnal changes on a separate control day without contact lens wear. Results Small but significant anterior corneal flattening was observed immediately following lens removal (overall mean 0.02 ± 0.03 mm, p < 0.001), which returned to baseline levels three hours after lens removal. During the three-hour recovery period, significant corneal thinning (-13.4 ± 10.5 μm) and posterior surface flattening (0.03 ± 0.02 mm) were also observed (both p < 0.01). The magnitude of posterior corneal flattening during recovery correlated with the amount of corneal thinning (r = 0.69, p = 0.03). Central corneal clearance (maximum tear reservoir depth) was not associated with corneal swelling following lens removal (r = -0.24, p > 0.05). An increase in lower-order corneal astigmatism Z(2,2) was also observed following lens wear (mean -0.144 ± 0.075 μm, p = 0.02). Conclusions Flattening of the anterior corneal surface was observed immediately following lens wear, while ‘rebound’ thinning and flattening of the posterior surface were evident following the recovery period. Modern miniscleral contact lenses that vault the cornea may slightly influence corneal shape and power but do not induce clinically significant corneal oedema during short-term wear.
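The pre/post comparisons and the thinning–flattening association reported above reduce to paired tests and a Pearson correlation. A minimal sketch on synthetic values (not the study's data; the magnitudes below are only loosely modelled on the reported means) might look like this:

```python
# Illustrative sketch only: synthetic values, not the study's measurements.
import numpy as np
from scipy import stats

# Hypothetical anterior corneal curvature (mm) before and immediately after lens wear
baseline = np.array([7.80, 7.75, 7.82, 7.79, 7.85, 7.78, 7.81, 7.77, 7.83, 7.80])
post_wear = baseline + np.random.default_rng(0).normal(0.02, 0.03, baseline.size)

# Paired comparison of post-wear versus baseline curvature
t_stat, p_value = stats.ttest_rel(post_wear, baseline)
print(f"mean flattening: {np.mean(post_wear - baseline):.3f} mm, p = {p_value:.3f}")

# Association between corneal thinning (μm) and posterior flattening (mm) during recovery
thinning = np.random.default_rng(1).normal(-13.4, 10.5, baseline.size)
post_flattening = 0.002 * -thinning + np.random.default_rng(2).normal(0, 0.01, baseline.size)
r, p = stats.pearsonr(thinning, post_flattening)
print(f"r = {r:.2f}, p = {p:.3f}")
```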
Abstract:
Purpose The aim of the study was to determine the association, agreement, and detection capability of manual, semiautomated, and fully automated methods of corneal nerve fiber length (CNFL) quantification of the human corneal subbasal nerve plexus (SNP). Methods Thirty-three participants with diabetes and 17 healthy controls underwent laser scanning corneal confocal microscopy. Eight central images of the SNP were selected for each participant and analyzed using manual (CCMetrics), semiautomated (NeuronJ), and fully automated (ACCMetrics) software to quantify the CNFL. Results For the entire cohort, mean CNFL values quantified by CCMetrics, NeuronJ, and ACCMetrics were 17.4 ± 4.3 mm/mm2, 16.0 ± 3.9 mm/mm2, and 16.5 ± 3.6 mm/mm2, respectively (P < 0.01). CNFL values quantified using CCMetrics were significantly higher than those obtained by NeuronJ and ACCMetrics (P < 0.05). The 3 methods were highly correlated (correlation coefficients 0.87–0.98, P < 0.01). The intraclass correlation coefficients were 0.87 for ACCMetrics versus NeuronJ and 0.86 for ACCMetrics versus CCMetrics. Bland–Altman plots showed good agreement between the manual, semiautomated, and fully automated analyses of CNFL. A small underestimation of CNFL was observed with ACCMetrics as the amount of nerve tissue increased. All 3 methods were able to detect CNFL depletion in diabetic participants (P < 0.05) and in those with peripheral neuropathy as defined by the Toronto criteria, compared with healthy controls (P < 0.05). Conclusions Automated quantification of CNFL provides neuropathy detection ability comparable to that of manual and semiautomated methods. Because of its speed, objectivity, and consistency, fully automated analysis of CNFL might be advantageous in studies of diabetic neuropathy.
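The agreement statistics reported above (intraclass correlation and Bland–Altman limits of agreement) can be illustrated on synthetic per-participant CNFL values. The two-way random, single-measure ICC(2,1) formula and the 95% limits of agreement are standard; the data and method labels below are hypothetical, not the study's measurements.

```python
# Sketch with synthetic CNFL values (mm/mm^2); not the study's data.
import numpy as np

rng = np.random.default_rng(42)
n = 50
truth = rng.normal(16.5, 3.6, n)              # hypothetical "true" CNFL per participant
manual = truth + rng.normal(0.9, 1.0, n)      # CCMetrics-like readings
automated = truth + rng.normal(0.0, 1.2, n)   # ACCMetrics-like readings

# Bland–Altman agreement: mean bias and 95% limits of agreement
diff = automated - manual
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)
print(f"bias = {bias:.2f} mm/mm^2, limits of agreement = [{bias - loa:.2f}, {bias + loa:.2f}]")

# Two-way random, single-measure ICC(2,1) from the classic ANOVA decomposition
X = np.column_stack([manual, automated])      # rows = participants, cols = methods
n_subj, k = X.shape
grand = X.mean()
row_means, col_means = X.mean(axis=1), X.mean(axis=0)
msr = k * np.sum((row_means - grand) ** 2) / (n_subj - 1)
msc = n_subj * np.sum((col_means - grand) ** 2) / (k - 1)
mse = np.sum((X - row_means[:, None] - col_means[None, :] + grand) ** 2) / ((n_subj - 1) * (k - 1))
icc = (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n_subj)
print(f"ICC(2,1) = {icc:.2f}")
```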
Abstract:
Purpose To investigate longitudinal changes in subbasal nerve plexus (SNP) morphology and their relationship with conventional measures of neuropathy in individuals with diabetes. Methods A cohort of 147 individuals with type 1 diabetes and 60 age-balanced controls underwent detailed assessment of clinical and metabolic factors, neurologic deficits, quantitative sensory testing, nerve conduction studies and corneal confocal microscopy at baseline and four subsequent annual visits. The SNP parameters included corneal nerve fiber density (CNFD), branch density (CNBD) and fiber length (CNFL) and were quantified using a fully automated algorithm. Linear mixed models were fitted to examine the changes in corneal nerve parameters over time. Results At baseline, 27% of the participants had mild diabetic neuropathy. All SNP parameters were significantly lower in the neuropathy group compared to controls (P<0.05). Overall, 89% of participants examined at baseline also completed the final visit. There was no clinically significant change in health and metabolic parameters or neuropathy measures from baseline to the final visit. The linear mixed models revealed a significant linear decline of CNFD (annual change rate, -0.9 nerves/mm2, P=0.01) in the neuropathy group compared to controls, which was associated with age (β=-0.06, P=0.04) and duration of diabetes (β=-0.08, P=0.03). In the neuropathy group, absolute changes in CNBD and CNFL showed moderate correlations with peroneal conduction velocity and cold sensation threshold, respectively (rs = 0.38 and 0.40, P<0.05). Conclusion This study demonstrates dynamic small fiber damage at the SNP, thus providing justification for our ongoing efforts to establish corneal nerve morphology as an appropriate adjunct to conventional measures of diabetic peripheral neuropathy (DPN).
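A linear mixed model of annual CNFD change, of the kind described above, can be sketched with statsmodels on synthetic longitudinal data. The subject identifiers, group sizes, and effect sizes below are illustrative assumptions, not the study's data or code; the only borrowed number is the reported -0.9 nerves/mm2 annual decline used as the assumed slope.

```python
# Illustrative random-intercept model for annual CNFD decline; synthetic data only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
subjects, visits = 60, 5
rows = []
for s in range(subjects):
    group = "neuropathy" if s < 20 else "control"
    intercept = rng.normal(22 if group == "neuropathy" else 25, 3)
    slope = -0.9 if group == "neuropathy" else -0.1   # assumed annual change (nerves/mm^2)
    for year in range(visits):
        cnfd = intercept + slope * year + rng.normal(0, 1.5)
        rows.append({"subject": s, "group": group, "year": year, "CNFD": cnfd})
df = pd.DataFrame(rows)

# Random intercept per subject; fixed effects for time, group, and their interaction
model = smf.mixedlm("CNFD ~ year * group", data=df, groups=df["subject"])
result = model.fit()
print(result.summary())
```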
Abstract:
Improved glycemic control is the only treatment that has been shown to be effective for diabetic peripheral neuropathy in patients with type 1 diabetes (1). Continuous subcutaneous insulin infusion (CSII) is superior to multiple daily insulin injection (MDI) for reducing HbA1c and hypoglycemic events (2). Here, we have compared the benefits of CSII with those of MDI for neuropathy over 24 months....
Abstract:
Limbal microvascular endothelial cells (L-MVEC) contribute to formation of the corneal-limbal stem cell niche and to neovascularization of diseased and injured corneas. Nevertheless, despite these important roles in corneal health and disease, few attempts have been made to isolate L-MVEC with a view to studying their biology in vitro. We therefore explored the feasibility of generating primary cultures of L-MVEC from cadaveric human tissue. We commenced our study by evaluating growth conditions (MesenCult-XF system) that have previously been found to be associated with expression of the endothelial cell surface marker thrombomodulin/CD141 in crude cultures established from collagenase-digests of limbal stroma. The potential presence of L-MVEC in these cultures was examined by flow cytometry using a more specific marker for vascular endothelial cells, CD31/PECAM-1. These studies demonstrated that the presence of CD141 in crude cultures established using the MesenCult-XF system is unrelated to L-MVEC. Thus we subsequently explored the use of magnetic assisted cell sorting (MACS) for CD31 as a tool for generating cultures of L-MVEC, in conjunction with more traditional endothelial cell growth conditions. These conditions consisted of gelatin-coated tissue culture plastic and MCDB-131 medium supplemented with fetal bovine serum (10% v/v), D-glucose (10 mg/mL), epidermal growth factor (10 ng/mL), heparin (50 μg/mL), hydrocortisone (1 μg/mL) and basic fibroblast growth factor (10 ng/mL). Our studies revealed that use of endothelial growth conditions alone is insufficient to generate significant numbers of L-MVEC in primary cultures established from cadaveric corneal stroma. Nevertheless, through use of positive-MACS selection for CD31 we were able to routinely observe L-MVEC in cultures derived from collagenase-digests of limbal stroma. The presence of L-MVEC in these cultures was confirmed by immunostaining for von Willebrand factor (vWF) and by ingestion of acetylated low-density lipoprotein. Moreover, the vWF+ cells formed aligned cell-to-cell ‘trains’ when grown on Geltrex™. The purity of L-MVEC cultures was found to be unrelated to tissue donor age (32 to 80 years) or duration in eye bank corneal preservation medium prior to use (3 to 10 days in Optisol), as assessed by multiple regression. Optimal purity of L-MVEC cultures was achieved through use of two rounds of positive-MACS selection for CD31 (mean ± s.e.m., 65.0 ± 20.8%; p<0.05). We propose that human L-MVEC cultures generated through these techniques, in conjunction with other cell types, will provide a useful tool for exploring the mechanisms of blood vessel cell growth in vitro.
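The purity analysis mentioned above (multiple regression of culture purity against donor age and storage duration) could be set up roughly as in the following sketch. The data frame and its values are hypothetical and only mirror the reported null association; they are not the study's records.

```python
# Sketch only: hypothetical purity values regressed on donor age and storage time.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 20
df = pd.DataFrame({
    "donor_age": rng.integers(32, 81, n),        # years
    "storage_days": rng.integers(3, 11, n),      # days in Optisol before use
})
# Purity unrelated to either predictor by construction, mirroring the reported finding
df["purity"] = rng.normal(65.0, 20.8, n).clip(0, 100)

model = smf.ols("purity ~ donor_age + storage_days", data=df).fit()
print(model.summary())
```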
Abstract:
BACKGROUND Measuring disease and injury burden in populations requires a composite metric that captures both premature mortality and the prevalence and severity of ill-health. The 1990 Global Burden of Disease study proposed disability-adjusted life years (DALYs) to measure disease burden. No comprehensive update of disease burden worldwide incorporating a systematic reassessment of disease and injury-specific epidemiology has been done since the 1990 study. We aimed to calculate disease burden worldwide and for 21 regions for 1990, 2005, and 2010 with methods to enable meaningful comparisons over time. METHODS We calculated DALYs as the sum of years of life lost (YLLs) and years lived with disability (YLDs). DALYs were calculated for 291 causes, 20 age groups, both sexes, and 187 countries, and aggregated to regional and global estimates of disease burden for three points in time with strictly comparable definitions and methods. YLLs were calculated from age-sex-country-time-specific estimates of mortality by cause, with each death weighted by the standardised lost life expectancy at that age. YLDs were calculated as prevalence of 1160 disabling sequelae, by age, sex, and cause, and weighted by new disability weights for each health state. Neither YLLs nor YLDs were age-weighted or discounted. Uncertainty around cause-specific DALYs was calculated incorporating uncertainty in levels of all-cause mortality, cause-specific mortality, prevalence, and disability weights. FINDINGS Global DALYs remained stable from 1990 (2·503 billion) to 2010 (2·490 billion). Crude DALYs per 1000 decreased by 23% (472 per 1000 to 361 per 1000). An important shift has occurred in DALY composition with the contribution of deaths and disability among children (younger than 5 years of age) declining from 41% of global DALYs in 1990 to 25% in 2010. YLLs typically account for about half of disease burden in more developed regions (high-income Asia Pacific, western Europe, high-income North America, and Australasia), rising to over 80% of DALYs in sub-Saharan Africa. In 1990, 47% of DALYs worldwide were from communicable, maternal, neonatal, and nutritional disorders, 43% from non-communicable diseases, and 10% from injuries. By 2010, this had shifted to 35%, 54%, and 11%, respectively. Ischaemic heart disease was the leading cause of DALYs worldwide in 2010 (up from fourth rank in 1990, increasing by 29%), followed by lower respiratory infections (top rank in 1990; 44% decline in DALYs), stroke (fifth in 1990; 19% increase), diarrhoeal diseases (second in 1990; 51% decrease), and HIV/AIDS (33rd in 1990; 351% increase). Major depressive disorder increased from 15th to 11th rank (37% increase) and road injury from 12th to 10th rank (34% increase). Substantial heterogeneity exists in rankings of leading causes of disease burden among regions. INTERPRETATION Global disease burden has continued to shift away from communicable to non-communicable diseases and from premature death to years lived with disability. In sub-Saharan Africa, however, many communicable, maternal, neonatal, and nutritional disorders remain the dominant causes of disease burden. The rising burden from mental and behavioural disorders, musculoskeletal disorders, and diabetes will impose new challenges on health systems. Regional heterogeneity highlights the importance of understanding local burden of disease and setting goals and targets for the post-2015 agenda taking such patterns into account.
Because of improved definitions, methods, and data, these results for 1990 and 2010 supersede all previously published Global Burden of Disease results.
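A minimal worked example of the DALY identity described in this abstract (DALY = YLL + YLD, with no age-weighting or discounting, as in GBD 2010). All the input numbers below are hypothetical and chosen only to show the arithmetic, not to reproduce any published estimate.

```python
# Minimal sketch of the DALY calculation: DALY = YLL + YLD, where
#   YLL = deaths x standardised lost life expectancy at the age of death
#   YLD = prevalence x disability weight for the health state
# All numbers are hypothetical.
deaths = 1_000                  # deaths from one cause in one age-sex group
std_life_expectancy = 30.0      # standardised remaining life expectancy at that age (years)
prevalent_cases = 50_000        # prevalent cases of a disabling sequela
disability_weight = 0.05        # weight for that health state (0 = full health, 1 = death)

yll = deaths * std_life_expectancy
yld = prevalent_cases * disability_weight
daly = yll + yld
print(f"YLL = {yll:,.0f}, YLD = {yld:,.0f}, DALY = {daly:,.0f}")
```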
Abstract:
Previous attempts to determine the degree to which exposure to environmental factors contributes to noncommunicable diseases (NCDs) have been very conservative and have significantly underestimated the actual contribution of the environment for at least two reasons. Firstly, most previous reports have excluded the contribution of lifestyle behavioral risk factors, but these usually involve significant exposure to environmental chemicals that increase risk of disease. Secondly, early life exposure to chemical contaminants is now clearly associated with an elevated risk of several diseases later in life, but these connections are often difficult to discern. This is especially true for asthma and neurodevelopmental conditions, but there is also a major contribution to the development of obesity and chronic diseases. Most cancers are caused by environmental exposures in genetically susceptible individuals. In addition, new information shows significant associations between cardiovascular diseases and diabetes and exposure to environmental chemicals present in air, food, and water. These relationships likely reflect the combination of epigenetic effects and gene induction. Environmental factors contribute significantly more to NCDs than previous reports have suggested. Prevention needs to shift focus from individual responsibility to societal responsibility and an understanding that effective prevention of NCDs ultimately relies on improved environmental management to reduce exposure to modifiable risks.
Abstract:
Background: High levels of wealth inequality with improved health statistics in South Africa (SA) provide an important opportunity to investigate non-communicable diseases (NCDs) among the poor. Aims: This paper uses two distinct national data sets to contrast patterns of mortality in rich and poor areas and explore the associations between poverty, risk factors, health care and selected NCDs in South African adults. Methods: Causes of premature mortality in 1996 experienced in the poorest magisterial districts are compared with those in the richest, using average household wealth to classify districts. Logistic and multinomial regression are used to investigate the association of a household asset index and selected chronic conditions, related risk factors and healthcare indicators using data from the 1998 South African Demographic and Health Survey. Results: NCDs accounted for 39% and 33% of premature mortality in rich and poor districts, respectively. The household survey data showed that the risk factors hypertension and obesity increased with increasing wealth, while most of the lifestyle factors, such as light smoking, domestic exposure to "smoky" fuels and alcohol dependence, were associated with poverty. Treatment status for hypertension and asthma was worse for poor people than for rich people. Conclusions: The study suggests that NCDs and lifestyle-related risk factors are prevalent among the poor in SA and treatment for chronic diseases is lacking for poor people. The observed increase in hypertension and obesity with wealth suggests that unless comprehensive health promotion strategies are implemented, there will be an unmanageable chronic disease epidemic with future socioeconomic development in SA.
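The household-survey analysis described above rests on logistic regression of a chronic condition against an asset-based wealth index. A hedged sketch with synthetic data follows; the variable names, sample size, and effect size are assumptions made for illustration and only mirror the direction of the reported wealth-hypertension association.

```python
# Sketch: logistic regression of hypertension status on a household asset index.
# Synthetic data, illustrative only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 2_000
asset_index = rng.normal(0, 1, n)                # standardised household wealth score
# Assumed positive wealth-hypertension association, as reported in the paper
p = 1 / (1 + np.exp(-(-1.5 + 0.4 * asset_index)))
hypertension = rng.binomial(1, p)
df = pd.DataFrame({"hypertension": hypertension, "asset_index": asset_index})

result = smf.logit("hypertension ~ asset_index", data=df).fit(disp=False)
print(np.exp(result.params))                     # odds ratios per unit of the asset index
```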
Abstract:
AIM To assess the effects of eye rubbing on corneal thickness (CT) and intraocular pressure (IOP) measurements obtained 0–30 min after habitual eye rubbing in symptomatic patients. METHODS Measurements of IOP and CT were obtained at five locations (central, temporal, superior, nasal and inferior) before, and every 5 min for 30 min after, 30 s of eye rubbing, for 25 randomly selected eyes of 14 subjects with ocular allergy and 11 age-matched normal subjects. Differences in measurements were calculated in each group [baseline measurements minus measurements recorded at each time interval after eye rubbing (for IOP), and for each corneal location (for CT)] and comparisons were then made between groups (allergic versus control) for differences in any observed effects. RESULTS Within groups, baseline mean IOPs in the allergic group (14.2±3.0 mm Hg) and in the control group (13.1±1.9 mm Hg) were similar to values at all times after eye rubbing (P>0.05, for all). The maximum reduction in IOP was 0.8 mm Hg in the control subjects and the maximum increase was also 0.8 mm Hg in the allergic subjects. Between groups (allergic versus control), the changes in IOP remained under 1 mm Hg at all times during the 30 min after eye rubbing (P=0.2). For CT measurements between 0 and 30 min after eye rubbing, the mean central CT (CCT), inferior CT (ICT), superior CT (SCT), temporal CT (TCT) and nasal CT (NCT) did not vary significantly from baseline values in the control and allergic groups (P>0.05, for both). Between the two groups, changes in CT were similar at all locations (P>0.05) except for the TCT, which was minimally thinner, by about 4.4 µm (P=0.001), in the allergic subjects than in the control subjects 30 min following 30 s of eye rubbing. CONCLUSION IOP measured in allergic subjects after 30 s of habitual eye rubbing was comparable with that obtained in normal subjects at all times between 0 and 30 min. Although CT in the allergic subjects was similar to that of the control subjects at all times, it varied between +10 and -7.5 µm following eye rubbing, with the temporal cornea showing consistent reductions in thickness in the subjects with allergy. However, this reduction was minimal and was considered not to be clinically relevant.
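A sketch of the change-from-baseline comparison described in the Methods above: differences (baseline minus post-rubbing value) are computed within each group at every time point and then contrasted between groups. The IOP values are synthetic and the unpaired t-test is an assumed choice for the between-group contrast, since the abstract does not name the test used.

```python
# Synthetic IOP values (mm Hg); illustrative only, not the study's data or analysis code.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
baseline_allergic = rng.normal(14.2, 3.0, 14)
baseline_control = rng.normal(13.1, 1.9, 11)

for t in range(5, 35, 5):                                   # minutes after 30 s of rubbing
    post_allergic = baseline_allergic + rng.normal(0.0, 0.8, baseline_allergic.size)
    post_control = baseline_control + rng.normal(0.0, 0.8, baseline_control.size)
    diff_allergic = baseline_allergic - post_allergic       # change from baseline
    diff_control = baseline_control - post_control
    stat, p = stats.ttest_ind(diff_allergic, diff_control)
    print(f"{t:2d} min: between-group difference in IOP change = "
          f"{diff_allergic.mean() - diff_control.mean():+.2f} mm Hg (p = {p:.2f})")
```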
Abstract:
Background Internet-based surveillance systems provide a novel approach to monitoring infectious diseases. Surveillance systems built on internet data are economically, logistically and epidemiologically appealing and have shown significant promise. The potential for these systems has increased with increased internet availability and shifts in health-related information seeking behaviour. This approach to monitoring infectious diseases has, however, only been applied to single or small groups of select diseases. This study aims to systematically investigate the potential for developing surveillance and early warning systems using internet search data for a wide range of infectious diseases. Methods Official notifications for 64 infectious diseases in Australia were downloaded and correlated with frequencies for 164 internet search terms for the period 2009–13 using Spearman’s rank correlations. Time series cross correlations were performed to assess the potential for search terms to be used in construction of early warning systems. Results Notifications for 17 infectious diseases (26.6%) were found to be significantly correlated with a selected search term. The use of internet metrics as a means of surveillance has not previously been described for 12 (70.6%) of these diseases. The majority of diseases identified were vaccine-preventable, vector-borne or sexually transmissible; cross correlations, however, indicated that vector-borne and vaccine-preventable diseases are best suited for development of early warning systems. Conclusions The findings of this study suggest that internet-based surveillance systems have broader applicability to monitoring infectious diseases than has previously been recognised. Furthermore, internet-based surveillance systems have a potential role in forecasting emerging infectious disease events, especially for vaccine-preventable and vector-borne diseases.
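The core analysis above (Spearman's rank correlation between notification counts and search-term frequencies, plus lagged cross-correlation to gauge early-warning potential) can be sketched as follows. Both time series are synthetic, and the two-week lead built into the search series is an assumption for illustration; real inputs would be official notification data and search-volume exports.

```python
# Sketch: Spearman correlation and simple lagged cross-correlation between
# weekly disease notifications and search-term frequencies. Synthetic series only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
weeks = 260                                   # ~5 years of weekly data (2009-13)
notifications = 50 + 30 * np.sin(np.arange(weeks) * 2 * np.pi / 52) + rng.normal(0, 5, weeks)
search_volume = np.roll(notifications, -2) + rng.normal(0, 8, weeks)   # searches lead by ~2 weeks

rho, p = stats.spearmanr(notifications, search_volume)
print(f"Spearman rho = {rho:.2f}, p = {p:.3g}")

# Lagged cross-correlation: a lag of k weeks pairs searches at week t with
# notifications at week t + k, so a peak at k > 0 means searches lead notifications.
for lag in range(0, 5):
    r, _ = stats.spearmanr(notifications[lag:], search_volume[:weeks - lag])
    print(f"lag {lag} weeks: rho = {r:.2f}")
```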
Abstract:
Climate change and solar ultraviolet radiation may affect vaccine-preventable infectious diseases (VPID), the human immune response process and the immunization service delivery system. We systematically reviewed the scientific literature and identified 37 relevant publications. Our study shows that climate variability and ultraviolet radiation may potentially affect VPID and the immunization delivery system by modulating vector reproduction and vaccination effectiveness, possibly influencing human immune responses to vaccination, and disturbing immunization service delivery. Further research is needed to determine these effects on climate-sensitive VPID and on human immune responses to common vaccines. Such research will facilitate the development and delivery of optimal vaccination programs for target populations, to meet the goal of disease control and elimination.
Abstract:
The lack of adequate disease surveillance systems in Ebola-affected areas has both reduced the ability to respond locally and increased global risk. There is a need to improve disease surveillance in vulnerable regions, and digital surveillance could present a viable approach.
Abstract:
PURPOSE - To present the results of same-day topography-guided photorefractive keratectomy (TG-PRK) and corneal collagen cross-linking (CXL) after intrastromal corneal ring (ISCR) implantation in patients with keratoconus. METHODS - Thirty-three patients (41 eyes) aged between 19 and 45 years were included in this prospective study. All patients underwent femtosecond laser-enabled (Intralase FS; Abbott Medical Optics, Inc.) placement of intracorneal ring segments (Kerarings; Mediphacos, Brazil). Uncorrected distance visual acuity (UDVA), corrected distance visual acuity (CDVA), and keratometry readings remained stable for 6 months. Same-day PRK and CXL was subsequently performed in all patients. RESULTS - At 12 months after completion of the procedure, mean UDVA in logarithm of the minimal angle of resolution had improved significantly (from 0.74±0.54 to 0.10±0.16); CDVA did not improve significantly, but 85% of eyes maintained or gained multiple lines of CDVA; mean refraction spherical equivalent improved (from -3.03±1.98 to -0.04±0.99 D); all keratometry readings were significantly reduced from preoperative values, but coma did not vary significantly from preoperative values. Central corneal thickness and corneal thickness at the thinnest point were significantly (P<0.0001) reduced, from 519.76±29.33 μm and 501.87±31.50 μm preoperatively to 464.71±36.79 μm and 436.55±47.42 μm postoperatively, respectively. Safety and efficacy indices were 0.97 and 0.88, respectively. From 6 months to more than 1 year of follow-up, further significant improvement was observed only for UDVA (P<0.0001). CONCLUSIONS - Same-day combined TG-PRK and CXL after ISCR implantation is a safe and effective option for improving visual acuity and visual function, and it halts the progression of keratoconus. The improvements recorded at 6 months of follow-up were maintained or improved upon 1 year after the procedure.
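As a reading aid for the acuity figures above: logMAR is the base-10 logarithm of the minimum angle of resolution, so the reported mean change from 0.74 to 0.10 corresponds roughly to an improvement from about 20/110 to about 20/25 Snellen. A small worked conversion (the Snellen equivalents are approximate, derived only from the standard logMAR definition):

```python
# Convert logMAR acuity to an approximate Snellen 20/x equivalent.
pre_logmar, post_logmar = 0.74, 0.10

for label, logmar in [("preoperative", pre_logmar), ("postoperative", post_logmar)]:
    mar = 10 ** logmar                 # minimum angle of resolution in arcmin
    snellen_denominator = 20 * mar     # 20/x Snellen equivalent
    print(f"{label}: logMAR {logmar:.2f} ~ Snellen 20/{snellen_denominator:.0f}")
```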
Abstract:
Oral diseases, or stomatognathic diseases, denote the diseases of the mouth (“stoma”) and jaw (“gnath”). Dental caries and periodontal diseases have been traditionally considered as the most important global oral health burdens. It is important to note that in oral diagnostics, the greatest challenges are determining the clinical utility of potential biomarkers for screening (in asymptomatic people), predicting the early onset of disease (prognostic tests), and evaluating the disease activity and the efficacy of therapy through innovative diagnostic tests. An oral diagnostic test, in principle, should provide valuable information for differential diagnosis, localization of disease, and severity of infection. This information can then be incorporated by the physician when planning treatments and will provide means for assessing the effectiveness of therapy.