189 results for STRATIFIED TIDAL RECTIFICATION
Abstract:
Executive Summary Emergency health is a critical component of Australia's health system, and emergency departments (EDs) are increasingly congested from growing demand and blocked access to inpatient beds. The Emergency Health Services Queensland (EHSQ) study aims to identify the factors driving increased demand for emergency health and to evaluate strategies which may safely reduce future demand growth. This monograph addresses the perspectives of users of both ambulance services and EDs. The research reported here aimed to identify the perspectives of users of emergency health services, both ambulance services and public hospital EDs, and to identify the factors they took into consideration when exercising their choice of location for acute health care. A cross-sectional survey design was used, involving a survey of patients or their carers presenting to the EDs of a stratified sample of eight hospitals. A specific-purpose questionnaire was developed based on a novel theoretical model derived from analysis of the literature (Monograph 1). Two survey versions were developed: one for adult patients (self-complete) and one for children (to be completed by parents/guardians). The questionnaires measured perceptions of social support, health status, illness severity and self-efficacy; beliefs and attitudes towards ED and ambulance services; reasons for using these services; and actions taken prior to the service request. The survey was conducted at a stratified sample of eight hospitals representing major cities (four), inner regional (two) and outer regional and remote (two). Due to practical limitations, data were collected for ambulance and ED users within hospital EDs while patients were waiting for or under treatment. A sample size quota was determined for each ED based on its 2009/10 presentation volumes (a sketch of this allocation step follows the bullet points below). Data collection was conducted by four members of the research team and a group of eight interviewers between March and May 2011 (corresponding to the autumn season). Of the 1608 patients in all eight emergency departments, the interviewers were able to approach 1361 (85%) and seek their consent to participate in the study. In total, 911 valid surveys were available for analysis (response rate = 67%). These studies demonstrate that patients elected to attend hospital EDs in a considered fashion after weighing up alternatives, and there is no evidence of deliberate or ill-informed misuse.
• Patients attending EDs have high levels of social support and self-efficacy, which speak to the considered and purposeful nature of their exercise of choice.
• About one third of patients have new conditions, while two thirds have chronic illnesses.
• More than half the attendees (53.1%) had consulted a healthcare professional prior to making the decision.
• The decision to seek urgent care at an ED was mostly constructed around the patient's perception of the urgency and severity of their illness, reinforced by a strong perception that the hospital ED was the correct location for them (better specialised staff, better care for my condition, other options not as suitable).
• 33% of respondents held private hospital insurance but nevertheless attended a public hospital ED.
Similarly, patients exercised considered and rational judgements in their choice to seek help from the ambulance service.
• The decision to call for ambulance assistance was based on a strong perception of the severity of the illness (too severe to use other means of transport) and that other options were not considered appropriate.
• The decision also appeared influenced by a perception that the ambulance provided appropriate access to the ED considered most appropriate for the particular condition (too severe to go elsewhere, all facilities in one spot, better specialised and better care).
• In 43.8% of cases a health care professional advised use of the ambulance.
• Only a small number of people perceived that the ambulance should be freely available regardless of severity or appropriateness.
These findings confirm a growing understanding that the choice of professional emergency health care services is not made lightly, but rather by reasonable people exercising a judgement that is influenced by public awareness of the risks of acute illness and is most often informed by health professionals. It is also made on the basis of a rational weighing up of alternatives and a deliberate, considered choice to seek assistance from the service the patient perceived as most appropriate to their needs at that time. These findings add weight to dispensing with the public perception that ED and ambulance congestion results from inappropriate patient choices. The challenge for health services is to better understand patients' needs and to design and validate services that meet those needs. The failure of our health system to do so should not be grounds for blaming patients or claiming that their choices are inappropriate.
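As a rough illustration of the quota step mentioned in the methods, the sketch below allocates a target sample across EDs in proportion to annual presentation volumes. All figures and ED names are invented; Python is used here and in the examples that follow.

```python
# Hypothetical sketch: proportional quota allocation across EDs.
# Presentation volumes and the overall target are invented, not study data.
presentations = {"ED_A": 62000, "ED_B": 48000, "ED_C": 30000, "ED_D": 20000}
target_n = 900

total = sum(presentations.values())
quotas = {ed: round(target_n * vol / total) for ed, vol in presentations.items()}
print(quotas)  # each ED's quota is proportional to its annual volume
```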
Abstract:
Background Trials of new technologies to remotely monitor for signs and symptoms of worsening heart failure are continually emerging. The extent to which technological differences impact the effectiveness of non-invasive remote monitoring for heart failure management is unknown. Objective To examine the effect of the specific technology used for non-invasive remote monitoring of people with heart failure on all-cause mortality and heart failure-related hospitalisations. Methods A sub-analysis of a large systematic review and meta-analysis was conducted. Studies were stratified according to the specific type of technology used, and separate meta-analyses were performed. Four different types of non-invasive remote monitoring technology were identified: structured telephone calls, videophone, interactive voice response devices and telemonitoring. Results Only structured telephone calls and telemonitoring were effective in reducing the risk of all-cause mortality (RR 0.87; 95% CI 0.75-1.01; p=0.06 and RR 0.62; 95% CI 0.50-0.77; p<0.0001, respectively) and heart failure-related hospitalisations (RR 0.77; 95% CI 0.68-0.87; p<0.001 and RR 0.75; 95% CI 0.63-0.91; p=0.003, respectively). More research data are required for videophone and interactive voice response technologies. Conclusions This sub-analysis identified that only two of the four specific technologies used for non-invasive remote monitoring in heart failure improved outcomes. When results of studies involving these disparate technologies were combined in previous meta-analyses, significant improvements in outcomes were identified. As such, this study has highlighted implications for future meta-analyses of randomised controlled trials evaluating the effectiveness of remote monitoring in heart failure.
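For readers unfamiliar with the pooling step, the following is a minimal sketch of an inverse-variance (fixed-effect) meta-analysis of risk ratios within one technology stratum; the study counts are placeholders, not data from the review.

```python
# Sketch of fixed-effect (inverse-variance) pooling of risk ratios on the
# log scale, computed separately per technology stratum. Counts are invented.
import numpy as np

def pooled_rr(events_t, n_t, events_c, n_c):
    """Pooled risk ratio and 95% CI across the studies in one stratum."""
    events_t, n_t = np.asarray(events_t, float), np.asarray(n_t, float)
    events_c, n_c = np.asarray(events_c, float), np.asarray(n_c, float)
    log_rr = np.log((events_t / n_t) / (events_c / n_c))
    var = 1/events_t - 1/n_t + 1/events_c - 1/n_c  # variance of each log RR
    w = 1 / var                                    # inverse-variance weights
    pooled = np.sum(w * log_rr) / np.sum(w)
    se = np.sqrt(1 / np.sum(w))
    lo, hi = np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se)
    return np.exp(pooled), (lo, hi)

# e.g. three hypothetical telemonitoring trials (treated vs control arms)
rr, (lo, hi) = pooled_rr([30, 12, 45], [200, 90, 310], [42, 20, 60], [198, 88, 305])
print(f"pooled RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```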
Abstract:
Background The effects of extra-pleural pneumonectomy (EPP) on survival and quality of life in patients with malignant pleural mesothelioma have, to our knowledge, not been assessed in a randomised trial. We aimed to assess the clinical outcomes of patients who were randomly assigned to EPP or no EPP in the context of trimodal therapy in the Mesothelioma and Radical Surgery (MARS) feasibility study. Methods MARS was a multicentre randomised controlled trial in 12 UK hospitals. Patients aged 18 years or older who had pathologically confirmed mesothelioma and were deemed fit enough to undergo trimodal therapy were included. In a prerandomisation registration phase, all patients underwent induction platinum-based chemotherapy followed by clinical review. After further consent, patients were randomly assigned (1:1) to EPP followed by postoperative hemithorax irradiation or to no EPP. Randomisation was done centrally with computer-generated permuted blocks stratified by surgical centre. The main endpoints were feasibility of randomly assigning 50 patients in 1 year (results detailed in another report), the proportion of randomised patients who received treatment, the proportion of eligible (registered) patients who proceeded to randomisation, perioperative mortality, and quality of life. Patients and investigators were not masked to treatment allocation. This is the principal report of the MARS study; all patients have been recruited. Analyses were by intention to treat. This trial is registered, number ISRCTN95583524. Findings Between Oct 1, 2005, and Nov 3, 2008, 112 patients were registered and 50 were subsequently randomly assigned: 24 to EPP and 26 to no EPP. The main reasons for not proceeding to randomisation were disease progression (33 patients), inoperability (five patients), and patient choice (19 patients). EPP was completed satisfactorily in 16 of 24 patients assigned to EPP; in five patients EPP was not started and in three patients it was abandoned. Two patients in the EPP group died within 30 days and a further patient died without leaving hospital. One patient in the no EPP group died perioperatively after receiving EPP off trial in a non-MARS centre. The hazard ratio (HR) for overall survival between the EPP and no EPP groups was 1·90 (95% CI 0·92-3·93; exact p=0·082), and after adjustment for sex, histological subtype, stage, and age at randomisation the HR was 2·75 (1·21-6·26; p=0·016). Median survival was 14·4 months (5·3-18·7) for the EPP group and 19·5 months (13·4 to time not yet reached) for the no EPP group. Of the 49 randomly assigned patients who consented to quality of life assessment (EPP n=23; no EPP n=26), 12 patients in the EPP group and 19 in the no EPP group completed the quality of life questionnaires. Although median quality of life scores were lower in the EPP group than in the no EPP group, no significant differences between groups were reported in the quality of life analyses. Ten serious adverse events were reported in the EPP group and two in the no EPP group. Interpretation In view of the high morbidity associated with EPP in this trial and in other non-randomised studies, a larger study is not feasible. These data, although limited, suggest that radical surgery in the form of EPP within trimodal therapy offers no benefit and possibly harms patients. Funding Cancer Research UK (CRUK/04/003), the June Hancock Mesothelioma Research Fund, and Guy's and St Thomas' NHS Foundation Trust.
Abstract:
Sundarbans, a Ramsar and World Heritage site, is the largest single block of tidal halophytic mangrove forest in the world, covering parts of Bangladesh and India. Natural mangroves were once very common along the entire coast of Bangladesh; however, all other natural mangrove forests, including the Chakaria Sundarbans with 21,000 hectares of mangrove, have been cleared for shrimp cultivation. Against this backdrop, the Forest Department of Bangladesh has developed project design documents for a project called 'Collaborative REDD+ Improved Forest Management (IFM) Sundarbans Project' (CRISP) to save the only remaining natural mangrove forest of the country. This project, involving conservation of 412,000 ha of natural mangrove forests, is expected to generate, over a 30-year period, a total emissions reduction of about 6.4 million tons of CO2. However, successful implementation of this project involves a number of critical legal and institutional issues. It may involve complex legal questions such as forest ownership, forest use rights, rights of local people and carbon rights. It may also require institutional reforms. Ensuring good governance of the proposed project is vital, considering the failure of the Asian Development Bank (ADB) funded and Bangladesh Forest Department managed 'Sundarbans Biodiversity Conservation Project'. In light of this previous experience, this paper suggests that a comprehensive legal and institutional review and reform is needed for successful implementation of the proposed CRISP project. This paper argues that, without ensuring local people's rights and their participation, no project can be successful in the Sundarbans. Moreover, corruption among local and international officials may be a serious hurdle to successful implementation of the project.
Abstract:
Floods are among the most devastating events that affect primarily tropical, archipelagic countries such as the Philippines. With current climate change predictions pointing to rising sea levels, intensification of typhoon strength and a general increase in mean annual precipitation throughout the Philippines, it has become paramount to prepare for the future so that the increased flood risk does not translate into greater economic and human loss. Field work and data gathering were done within the framework of an internship at the former German Technical Cooperation (GTZ) in cooperation with the Local Government Unit of Ormoc City, Leyte, the Philippines, in order to develop a dynamic computer-based flood model for the basin of the Pagsangaan River. To this end, different geo-spatial analysis tools such as PCRaster and ArcGIS, hydrological analysis packages and basic engineering techniques were assessed and implemented. The aim was to develop a dynamic flood model and to use the development process to determine the required data, their availability and their impact on the results, as a case study for flood early warning systems in the Philippines. The hope is that such projects can help reduce flood risk by including the results of worst-case scenario analyses and current climate change predictions in city planning for municipal development, monitoring strategies and early warning systems. The project was developed using a 1D-2D coupled model in SOBEK (the Deltares hydrological modelling software package) and was also used as a case study to analyze and understand the influence of different factors such as land use, schematization, time step size and tidal variation on the flood characteristics. Several sources of relevant satellite data were compared, such as Digital Elevation Models (DEMs) from ASTER and SRTM data, as well as satellite rainfall data from the GIOVANNI server (NASA) and field gauge data. Different methods were used in the attempt to partially calibrate and validate the model, which was finally used to simulate and study two climate change scenarios based on scenario A1B predictions. It was observed that large areas currently considered not prone to floods will become low flood risk areas (0.1-1 m water depth). Furthermore, larger sections of the floodplains upstream of the Lilo-an's Bridge will become moderate flood risk areas (1-2 m water depth). The flood hazard maps created during the development of this project will be presented to the LGU, and the model will be used by GTZ's Local Disaster Risk Management Department to create a larger set of possible flood-prone areas related to rainfall intensity and to study possible improvements to the current early warning system and monitoring of the basin section belonging to Ormoc City; recommendations on further enhancement of the geo-hydro-meteorological data to improve the model's accuracy, mainly in areas of interest, will also be presented to the LGU.
Abstract:
Prophylactic surgery including hysterectomy and bilateral salpingo-oophorectomy (BSO) is recommended in BRCA-positive women, while in women from the general population, hysterectomy plus BSO may increase the risk of overall mortality. The effect of hysterectomy plus BSO on women previously diagnosed with breast cancer is unknown. We used data from a population-based data linkage study of all women diagnosed with primary breast cancer in Queensland, Australia between 1997 and 2008 (n=21,067). We fitted flexible parametric breast cancer-specific and overall survival models (also known as Royston-Parmar models) with 95% confidence intervals to assess the impact of risk-reducing surgery (removal of the uterus and one or both ovaries). We also stratified analyses by age (20-49 and 50-79 years). Overall, 1,426 women (7%) underwent risk-reducing surgery (13% of premenopausal women and 3% of postmenopausal women). No woman who had risk-reducing surgery developed a gynaecological cancer, compared with 171 women who did not have risk-reducing surgery. Overall, 3,165 (15%) women died, including 2,195 (10%) from breast cancer. Hysterectomy plus BSO was associated with significantly reduced risk of death overall (adjusted HR = 0.69; 95% CI 0.53-0.89; p=0.005). Risk reduction was greater among premenopausal women, whose risk of death halved (HR 0.45; 95% CI 0.25-0.79; p<0.006). This was largely driven by a reduction in breast cancer-specific mortality (HR 0.43; 95% CI 0.24-0.79; p<0.006). This population-based study found that risk-reducing surgery halved the mortality risk for premenopausal breast cancer patients. Replication of our results in independent cohorts, and subsequently randomised trials, is needed to confirm these findings.
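The hazard ratios above come from flexible parametric (Royston-Parmar) survival models, which are typically fitted with R's flexsurv package. The sketch below uses a plain Cox proportional hazards model from the Python lifelines library as a simpler stand-in for estimating an adjusted HR; the dataset and column names are entirely hypothetical.

```python
# Minimal sketch of an adjusted survival analysis, using a Cox model as a
# stand-in for the Royston-Parmar models described in the abstract.
# All data and column names below are invented for illustration.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "time_months": [34, 120, 88, 45, 130, 60, 72, 15, 95, 110],  # follow-up
    "died":        [1, 0, 1, 1, 1, 0, 0, 1, 1, 0],               # event flag
    "rr_surgery":  [0, 1, 0, 0, 1, 1, 0, 0, 1, 1],               # hysterectomy + BSO
    "age_at_dx":   [44, 47, 63, 58, 39, 51, 55, 61, 49, 42],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time_months", event_col="died")
cph.print_summary()  # exp(coef) column gives the adjusted hazard ratio
```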
Abstract:
BACKGROUND Mosquito-borne diseases are climate-sensitive, and there has been increasing concern over the impact of climate change on future disease risk. This paper projected the potential future risk of Barmah Forest virus (BFV) disease under climate change scenarios in Queensland, Australia. METHODS/PRINCIPAL FINDINGS We obtained data on notified BFV cases, climate (maximum and minimum temperature and rainfall), socio-economic and tidal conditions for the current period 2000-2008 for coastal regions in Queensland. Grid data on future climate projections for 2025, 2050 and 2100 were also obtained. Logistic regression models were built to forecast the potential risk of BFV disease distribution under existing climatic, socio-economic and tidal conditions. The model was then applied to estimate the potential geographic distribution of BFV outbreaks under climate change scenarios. The predictive model had good accuracy, sensitivity and specificity. Maps of the potential future risk of BFV disease indicated that the disease would vary significantly across coastal regions in Queensland by 2100, due to marked differences in future rainfall and temperature projections. CONCLUSIONS/SIGNIFICANCE The results of this study demonstrate that the future risk of BFV disease would vary across coastal regions in Queensland. These results may be helpful for public health decision-making and the development of effective risk management strategies for BFV disease control and prevention programs in Queensland.
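A minimal sketch of the modelling step described: a logistic regression linking outbreak occurrence to climatic, socio-economic and tidal covariates, then applied to projected climate grids. All values, covariates and projected shifts below are illustrative assumptions, not the study's data.

```python
# Hypothetical sketch: logistic regression of outbreak occurrence on
# climate/tidal covariates, applied to a projected climate scenario.
import numpy as np
from sklearn.linear_model import LogisticRegression

# columns: max temp (C), min temp (C), rainfall (mm), socio-economic index, high tide (m)
X = np.array([
    [31.2, 21.0, 210.0,  980, 2.1],
    [29.8, 19.5,  80.0, 1010, 1.8],
    [32.5, 22.4, 260.0,  950, 2.3],
    [28.9, 18.1,  55.0, 1030, 1.6],
    [30.7, 20.6, 180.0,  990, 2.0],
    [29.1, 18.8,  70.0, 1020, 1.7],
])
y = np.array([1, 0, 1, 0, 1, 0])  # BFV outbreak recorded in each grid cell

model = LogisticRegression(max_iter=1000).fit(X, y)

# apply to an invented 2050 projection: warmer, drier, slightly higher tides
X_2050 = X + np.array([1.5, 1.5, -10.0, 0.0, 0.1])
print(model.predict_proba(X_2050)[:, 1])  # projected outbreak probability
```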
Abstract:
Freestanding membranes created from Bombyx mori silk fibroin (BMSF) offer a potential vehicle for corneal cell transplantation, since they are transparent and support the growth of human corneal epithelial (HCE) cells. Fibroin derived from the wild silkworm Antheraea pernyi (APSF) might provide a superior material by virtue of containing putative cell-attachment sites that are absent from BMSF. We have therefore investigated the feasibility of producing transparent, freestanding membranes from APSF and have analysed the behaviour of HCE cells on this material. No significant differences in cell numbers or phenotype were observed in short-term HCE cell cultures established on either fibroin. Production of transparent freestanding APSF membranes, however, proved problematic, as cast solutions of APSF were more prone to becoming opaque, displayed significantly lower permeability and were more brittle than BMSF membranes. Cultures of HCE cells established on either membrane developed a normal stratified morphology, with cytokeratin pair 3/12 being immuno-localized to the superficial layers. We conclude that while it is feasible to produce transparent freestanding membranes from APSF, the technical difficulties associated with this biomaterial, along with the absence of enhanced cell growth, currently favour the continued development of BMSF as the preferred vehicle for corneal cell transplantation. Nevertheless, it remains possible that refinement of techniques for processing APSF might yet lead to improvements in the handling properties and performance of this material.
Abstract:
Background Dietary diversity is recognized as a key element of a high-quality diet. However, diets that offer a greater variety of energy-dense foods could increase food intake and body weight. The aim of this study was to explore the association of dietary diversity with obesity in Sri Lankan adults. Methods Six hundred adults aged >18 years were randomly selected using a multi-stage stratified sample. Dietary intake assessment was undertaken by a 24-hour dietary recall. Three dietary scores, Dietary Diversity Score (DDS), Dietary Diversity Score with Portions (DDSP) and Food Variety Score (FVS), were calculated. Body mass index (BMI) ≥25 kg/m2 was defined as obese, and Asian waist circumference cut-offs were used to diagnose abdominal obesity. Results Mean DDS for men and women was 6.23 and 6.50, respectively (p=0.06), while mean DDSP was 3.26 and 3.17 (p=0.24). FVS values were significantly different between men and women: 9.55 and 10.24 (p=0.002). Dietary diversity among Sri Lankan adults was significantly associated with gender, residency, ethnicity and education level, but not with diabetes status. As dietary scores increased, the percentage consumption increased in most food groups except starches. Obese and abdominally obese adults had the highest DDS compared with non-obese groups (p<0.05). With increased dietary diversity, BMI, waist circumference and energy consumption increased significantly in this population. Conclusion Our data suggest that dietary diversity is positively associated with several socio-demographic characteristics and with obesity among Sri Lankan adults. Although high dietary diversity is widely recommended, public health messages should emphasise improving dietary diversity using selected food items.
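A DDS is conventionally the count of distinct food groups consumed in the 24-hour recall. The sketch below illustrates that idea with an assumed nine-group list; the study's exact group definitions and the portion rules behind the DDSP are not given here.

```python
# Generic DDS sketch: count of distinct food groups consumed in the recall.
# The nine-group list is an assumption, not the study's instrument.
FOOD_GROUPS = {"starches", "vegetables", "fruits", "meat_fish", "dairy",
               "legumes_nuts", "eggs", "fats_oils", "sweets"}

def dietary_diversity_score(recall_items):
    """recall_items maps each reported food to its food group."""
    return len({g for g in recall_items.values() if g in FOOD_GROUPS})

recall = {"rice": "starches", "dhal": "legumes_nuts", "banana": "fruits",
          "fish curry": "meat_fish", "bread": "starches"}
print(dietary_diversity_score(recall))  # 4 distinct groups consumed
```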
Abstract:
OBJECTIVES: Four randomized phase II/III trials investigated the addition of cetuximab to platinum-based, first-line chemotherapy in patients with advanced non-small cell lung cancer (NSCLC). A meta-analysis was performed to examine the benefit/risk ratio for the addition of cetuximab to chemotherapy. MATERIALS AND METHODS: The meta-analysis included individual patient efficacy data from 2018 patients and individual patient safety data from 1970 patients comprising respectively the combined intention-to-treat and safety populations of the four trials. The effect of adding cetuximab to chemotherapy was measured by hazard ratios (HRs) obtained using a Cox proportional hazards model and odds ratios calculated by logistic regression. Survival rates at 1 year were calculated. All applied models were stratified by trial. Tests on heterogeneity of treatment effects across the trials and sensitivity analyses were performed for all endpoints. RESULTS: The meta-analysis demonstrated that the addition of cetuximab to chemotherapy significantly improved overall survival (HR 0.88, p=0.009, median 10.3 vs 9.4 months), progression-free survival (HR 0.90, p=0.045, median 4.7 vs 4.5 months) and response (odds ratio 1.46, p<0.001, overall response rate 32.2% vs 24.4%) compared with chemotherapy alone. The safety profile of chemotherapy plus cetuximab in the meta-analysis population was confirmed as manageable. Neither trials nor patient subgroups defined by key baseline characteristics showed significant heterogeneity for any endpoint. CONCLUSION: The addition of cetuximab to platinum-based, first-line chemotherapy for advanced NSCLC significantly improved outcome for all efficacy endpoints with an acceptable safety profile, indicating a favorable benefit/risk ratio.
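The Cox model described above is stratified by trial, meaning each trial keeps its own baseline hazard while a common treatment effect is estimated. A minimal sketch with the Python lifelines library and invented data:

```python
# Sketch of a Cox proportional hazards model stratified by trial.
# Data and column names are invented for illustration only.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "os_months": [10.1, 9.0, 12.4, 8.2, 11.0, 7.5, 9.9, 10.8, 6.3, 13.0, 8.8, 9.4],
    "died":      [1, 1, 0, 1, 1, 1, 0, 1, 1, 0, 1, 1],
    "cetuximab": [1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0],
    "trial":     ["A"] * 6 + ["B"] * 6,  # each trial gets its own baseline hazard
})

cph = CoxPHFitter()
cph.fit(df, duration_col="os_months", event_col="died", strata=["trial"])
print(cph.hazard_ratios_)  # HR for the cetuximab indicator
```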
Abstract:
Purpose The LUX-Lung 3 study investigated the efficacy of chemotherapy compared with afatinib, a selective, orally bioavailable ErbB family blocker that irreversibly blocks signaling from epidermal growth factor receptor (EGFR/ErbB1), human epidermal growth factor receptor 2 (HER2/ErbB2), and ErbB4, and has wide-spectrum preclinical activity against EGFR mutations. A phase II study of afatinib in EGFR mutation-positive lung adenocarcinoma demonstrated high response rates and progression-free survival (PFS). Patients and Methods In this phase III study, eligible patients with stage IIIB/IV lung adenocarcinoma were screened for EGFR mutations. Mutation-positive patients were stratified by mutation type (exon 19 deletion, L858R, or other) and race (Asian or non-Asian) before two-to-one random assignment to 40 mg afatinib per day or up to six cycles of cisplatin plus pemetrexed chemotherapy at standard doses every 21 days. The primary end point was PFS by independent review. Secondary end points included tumor response, overall survival, adverse events, and patient-reported outcomes (PROs). Results A total of 1,269 patients were screened, and 345 were randomly assigned to treatment. Median PFS was 11.1 months for afatinib and 6.9 months for chemotherapy (hazard ratio [HR], 0.58; 95% CI, 0.43 to 0.78; P = .001). Median PFS among those with exon 19 deletions and L858R EGFR mutations (n = 308) was 13.6 months for afatinib and 6.9 months for chemotherapy (HR, 0.47; 95% CI, 0.34 to 0.65; P = .001). The most common treatment-related adverse events were diarrhea, rash/acne, and stomatitis for afatinib and nausea, fatigue, and decreased appetite for chemotherapy. PROs favored afatinib, with better control of cough, dyspnea, and pain. Conclusion Afatinib is associated with prolongation of PFS when compared with standard doublet chemotherapy in patients with advanced lung adenocarcinoma and EGFR mutations.
Abstract:
Background Indigenous children in high-income countries have a heavy burden of bronchiectasis unrelated to cystic fibrosis. We aimed to establish whether long-term azithromycin reduced pulmonary exacerbations in Indigenous children with non-cystic-fibrosis bronchiectasis or chronic suppurative lung disease. Methods Between Nov 12, 2008, and Dec 23, 2010, we enrolled Indigenous Australian, Maori, and Pacific Island children aged 1-8 years with either bronchiectasis or chronic suppurative lung disease into a multicentre, double-blind, randomised, parallel-group, placebo-controlled trial. Eligible children had had at least one pulmonary exacerbation in the previous 12 months. Children were randomised (1:1 ratio, by computer-generated sequence with permuted block design, stratified by study site and exacerbation frequency [1-2 vs ≥3 episodes in the preceding 12 months]) to receive either azithromycin (30 mg/kg) or placebo once a week for up to 24 months. Allocation concealment was achieved by double-sealed, opaque envelopes; participants, caregivers, and study personnel were masked to assignment until after data analysis. The primary outcome was exacerbation (respiratory episodes treated with antibiotics) rate. Analysis of the primary endpoint was by intention to treat. At enrolment and at their final clinic visits, children had deep nasal swabs collected, which we analysed for antibiotic-resistant bacteria. This study is registered with the Australian New Zealand Clinical Trials Registry; ACTRN12610000383066. Findings 45 children were assigned to azithromycin and 44 to placebo. The study was stopped early for feasibility reasons on Dec 31, 2011, thus children received the intervention for 12-24 months. The mean treatment duration was 20·7 months (SD 5·7), with a total of 902 child-months in the azithromycin group and 875 child-months in the placebo group. Compared with the placebo group, children receiving azithromycin had significantly lower exacerbation rates (incidence rate ratio 0·50; 95% CI 0·35-0·71; p<0·0001). However, children in the azithromycin group developed significantly higher carriage of azithromycin-resistant bacteria (19 of 41, 46%) than those receiving placebo (four of 37, 11%; p=0·002). The most common adverse events were non-pulmonary infections (71 of 112 events in the azithromycin group vs 132 of 209 events in the placebo group) and bronchiectasis-related events (episodes or investigations; 22 of 112 events in the azithromycin group vs 48 of 209 events in the placebo group); however, study drugs were well tolerated with no serious adverse events being attributed to the intervention. Interpretation Once-weekly azithromycin for up to 24 months decreased pulmonary exacerbations in Indigenous children with non-cystic-fibrosis bronchiectasis or chronic suppurative lung disease. However, this strategy was also accompanied by increased carriage of azithromycin-resistant bacteria, the clinical consequences of which are uncertain, and will need careful monitoring and further study.
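The randomisation scheme described (a computer-generated sequence of permuted blocks within strata) can be sketched as follows. The block size of 4 and the stratum labels are assumptions for illustration; real trials often vary or conceal the block size.

```python
# Illustrative permuted-block randomisation within strata (1:1 allocation).
# Block size and stratum labels are assumptions, not trial specifics.
import random

def permuted_block_sequence(n, block_size=4):
    """Return a 1:1 assignment sequence of length n for one stratum."""
    seq = []
    while len(seq) < n:
        block = (["azithromycin"] * (block_size // 2)
                 + ["placebo"] * (block_size // 2))
        random.shuffle(block)  # permute treatment order within each block
        seq.extend(block)
    return seq[:n]

# one independent sequence per stratum: study site x exacerbation frequency
strata = [(site, freq) for site in ("site_1", "site_2", "site_3")
          for freq in ("1-2", ">=3")]
schedules = {s: permuted_block_sequence(20) for s in strata}
print(schedules[("site_1", "1-2")])
```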
Abstract:
Purpose To evaluate the association between retinal nerve fibre layer (RNFL) thickness and diabetic peripheral neuropathy in people with type 2 diabetes, and specifically those at higher risk of foot ulceration. Methods RNFL thickness was measured globally and in four quadrants (temporal, superior, nasal and inferior) at 3.45 mm diameter around the optic nerve head using optical coherence tomography (OCT). Severity of neuropathy was assessed using the Neuropathy Disability Score (NDS). Eighty-two participants with type 2 diabetes were stratified according to NDS scores (0-10) as having no, mild, moderate or severe neuropathy. A control group was additionally included (n=17). Individuals with NDS ≥6 (moderate and severe neuropathy) have been shown to be at higher risk of foot ulceration. A linear regression model was used to determine the association between RNFL and severity of neuropathy. Age, disease duration and diabetic retinopathy levels were fitted in the models. An independent t-test was employed for comparison between controls and the group without neuropathy, as well as for comparison between groups at higher and lower risk of foot ulceration. Analysis of variance was used to compare across all NDS groups. Results RNFL thickness was significantly associated with NDS in the inferior quadrant (b = -1.46, p=0.03). RNFL thicknesses globally and in the superior, temporal and nasal quadrants did not show significant associations with NDS (all p>0.51). These findings were independent of the effect of age, disease duration and retinopathy. RNFL was thinner for the group with NDS ≥6 in all quadrants, but significantly so only inferiorly (p<0.005). RNFL for control participants was not significantly different from the group with diabetes and no neuropathy (superior p=0.07; global and all other quadrants p>0.23). Mean RNFL thickness was not significantly different between the four NDS groups globally and in all quadrants (p=0.08 for inferior, p>0.14 for all other comparisons). Conclusions Retinal nerve fibre layer thinning is associated with neuropathy in people with type 2 diabetes. This relationship is strongest in the inferior retina and in individuals at higher risk of foot ulceration.
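The association above comes from a linear model of RNFL thickness on NDS adjusted for age, disease duration and retinopathy level. A minimal sketch with statsmodels and a hypothetical dataframe:

```python
# Sketch of the adjusted linear regression described in the abstract.
# The dataframe values are hypothetical, not study data.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "rnfl_inferior": [128, 121, 115, 109, 118, 102, 96, 124],  # microns
    "nds":           [0, 2, 4, 7, 3, 8, 10, 1],                # 0-10 score
    "age":           [54, 61, 58, 66, 59, 70, 72, 50],
    "duration_yrs":  [5, 9, 12, 18, 8, 20, 24, 4],
    "retinopathy":   [0, 1, 1, 2, 0, 2, 3, 0],                 # severity level
})

model = smf.ols("rnfl_inferior ~ nds + age + duration_yrs + retinopathy", df).fit()
print(model.params["nds"])  # slope of RNFL per unit NDS (cf. b = -1.46)
```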
Abstract:
Purpose Over the past decade, corneal nerve morphology and corneal sensation threshold have been explored as potential surrogate markers for the evaluation of diabetic neuropathy. We present the baseline findings of a Longitudinal Assessment of Neuropathy in Diabetes using novel ophthalmic Markers (LANDMark). Methods The LANDMark Study is a 5-year, two-site, natural history (observational) study of individuals with Type 1 diabetes stratified into those with (T1W) and without (T1WO) neuropathy according to the Toronto criteria, and control subjects. All study participants undergo detailed annual assessment of neuropathy, including corneal nerve parameters measured using corneal confocal microscopy and corneal sensitivity measured using non-contact corneal esthesiometry. Results 396 eligible individuals (208 in Brisbane and 188 in Manchester) were assessed: 76 T1W, 166 T1WO and 154 controls. Corneal sensation threshold (mbars) was significantly higher in T1W (1.0 ± 1.1) than in T1WO (0.7 ± 0.7) and controls (0.6 ± 0.4) (p=0.002); post-hoc analysis (PHA) revealed no difference between T1WO and controls (Tukey HSD, p=0.502). Corneal nerve fibre length (CNFL, mm/mm2) was lower in T1W (13.8 ± 6.4) than in T1WO (19.1 ± 5.8) and controls (23.2 ± 6.3) (p<0.001); PHA revealed CNFL to be lower in T1W than T1WO, and lower in both of these groups than in controls (p<0.001). Corneal nerve branch density (CNBD, branches/mm2) was significantly lower in T1W (40 ± 32) than in T1WO (62 ± 37) and controls (83 ± 46) (p<0.001); PHA showed CNBD was lower in T1W than T1WO, and lower in both groups than in controls (p<0.001). Alcohol and cigarette consumption did not differ between groups, although age, BMI, blood pressure, waist circumference, HbA1c, albumin-creatinine ratio and cholesterol were slightly greater in T1W than T1WO (p<0.05). Some site differences were observed. Conclusions The LANDMark baseline findings confirm that corneal sensitivity and corneal nerve morphometry can detect differences in neuropathy status between individuals with Type 1 diabetes and healthy controls. Corneal nerve morphology is significantly abnormal even in diabetic patients 'without neuropathy' compared with control participants. Results of the longitudinal study will assess the capability of these tests for monitoring change in these parameters over time as potential surrogate markers for neuropathy.
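The group comparisons with Tukey HSD post-hoc tests reported above can be sketched as follows, using data simulated from the summary statistics in the abstract rather than the study data itself:

```python
# Sketch of a one-way ANOVA followed by Tukey HSD post-hoc tests for CNFL.
# Group values are simulated from the abstract's means/SDs, not real data.
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(0)
cnfl_t1w  = rng.normal(13.8, 6.4, 76)   # Type 1 with neuropathy
cnfl_t1wo = rng.normal(19.1, 5.8, 166)  # Type 1 without neuropathy
cnfl_ctrl = rng.normal(23.2, 6.3, 154)  # healthy controls

print(stats.f_oneway(cnfl_t1w, cnfl_t1wo, cnfl_ctrl))  # overall ANOVA

values = np.concatenate([cnfl_t1w, cnfl_t1wo, cnfl_ctrl])
groups = ["T1W"] * 76 + ["T1WO"] * 166 + ["controls"] * 154
print(pairwise_tukeyhsd(values, groups))  # pairwise post-hoc comparisons
```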
Abstract:
It is increasingly apparent that sea-level data (e.g. microfossil transfer functions, dated coral microatolls and direct observations from satellite and tidal gauges) vary temporally and spatially at regional to local scales, thus limiting our ability to model future sea-level rise for many regions. Understanding sea-level response at 'far-field' locations at regional scales is fundamental for formulating more relevant sea-level rise susceptibility models within these regions under future global change projections. Fossil corals and reefs in particular are valuable tools for reconstructing past sea levels and possible environmental phase shifts beyond the temporal constraints of instrumental records. This study used abundant surface geochronological data based on in situ subfossil corals and precise elevation surveys to determine previous sea level in Moreton Bay, eastern Australia, a far-field site. A total of 64 U-Th dates show that relative sea level was at least 1.1 m above modern lowest astronomical tide (LAT) from at least ~6600 cal. yr BP. Furthermore, a rapid synchronous demise in coral reef growth occurred in Moreton Bay ~5800 cal. yr BP, coinciding with reported reef hiatus periods in other areas around the Indo-Pacific region. Evaluating past reef growth patterns and phases allows for a better interpretation of anthropogenic forcing versus natural environmental/climatic cycles that affect reef formation and demise at all scales, and may allow better prediction of reef response to future global change.