108 results for RES-000-23-1240
Abstract:
Background Moderate differences in efficacy between adjuvant chemotherapy regimens for breast cancer are plausible, and could affect treatment choices. We sought any such differences.
Methods We undertook individual-patient-data meta-analyses of the randomised trials comparing: any taxane-plus-anthracycline-based regimen versus the same, or more, non-taxane chemotherapy (n=44 000); one anthracycline-based regimen versus another (n=7000) or versus cyclophosphamide, methotrexate, and fluorouracil (CMF; n=18 000); and polychemotherapy versus no chemotherapy (n=32 000). The scheduled dosages of these three drugs and of the anthracyclines doxorubicin (A) and epirubicin (E) were used to define standard CMF, standard 4AC, and CAF and CEF. Log-rank breast cancer mortality rate ratios (RRs) are reported.
Findings In trials adding four separate cycles of a taxane to a fixed anthracycline-based control regimen, extending treatment duration, breast cancer mortality was reduced (RR 0·86, SE 0·04, two-sided significance [2p]=0·0005). In trials with four such extra cycles of a taxane counterbalanced in controls by extra cycles of other cytotoxic drugs, roughly doubling non-taxane dosage, there was no significant difference (RR 0·94, SE 0·06, 2p=0·33). Trials with CMF-treated controls showed that standard 4AC and standard CMF were equivalent (RR 0·98, SE 0·05, 2p=0·67), but that anthracycline-based regimens with substantially higher cumulative dosage than standard 4AC (eg, CAF or CEF) were superior to standard CMF (RR 0·78, SE 0·06, 2p=0·0004). Trials versus no chemotherapy also suggested greater mortality reductions with CAF (RR 0·64, SE 0·09, 2p<0·0001) than with standard 4AC (RR 0·78, SE 0·09, 2p=0·01) or
standard CMF (RR 0·76, SE 0·05, 2p<0·0001). In all meta-analyses involving taxane-based or anthracycline-based regimens, proportional risk reductions were little affected by age, nodal status, tumour diameter or differentiation (moderate or poor; few were well differentiated), oestrogen receptor status, or tamoxifen use. Hence, largely independently of age (up to at least 70 years) or the tumour characteristics currently available to us for the patients selected to be in these trials, some taxane-plus-anthracycline-based or higher-cumulative-dosage anthracycline-based regimens (not requiring stem cells) reduced breast cancer mortality by, on average, about one-third. 10-year overall mortality differences paralleled breast cancer mortality differences, despite taxane, anthracycline, and other toxicities.
Interpretation 10-year gains from a one-third breast cancer mortality reduction depend on absolute risks without chemotherapy (which, for oestrogen-receptor-positive disease, are the risks remaining with appropriate endocrine therapy). Low absolute risk implies low absolute benefit, but information was lacking about tumour gene expression markers or quantitative immunohistochemistry that might help to predict risk, chemosensitivity, or both.
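The log-rank rate ratios reported above are conventionally obtained from the one-step estimator log RR ≈ (O−E)/V, where O−E is the log-rank observed-minus-expected death count in one group and V its variance. The sketch below illustrates that estimator on invented toy data; the function name and inputs are assumptions for the example, not the trial data.

```python
import math

def logrank_rr(times1, events1, times2, events2):
    """One-step log-rank estimate of the mortality rate ratio, group 1 vs
    group 2: log RR ~ (O-E)/V, with SE(log RR) = 1/sqrt(V). `times` are
    follow-up times; `events` flag death (1) versus censoring (0)."""
    event_times = sorted({t for t, e in zip(times1 + times2, events1 + events2) if e})
    o_minus_e = 0.0
    v = 0.0
    for t in event_times:
        n1 = sum(1 for x in times1 if x >= t)   # at risk in group 1
        n2 = sum(1 for x in times2 if x >= t)   # at risk in group 2
        d1 = sum(1 for x, e in zip(times1, events1) if x == t and e)
        d2 = sum(1 for x, e in zip(times2, events2) if x == t and e)
        n, d = n1 + n2, d1 + d2
        o_minus_e += d1 - d * n1 / n            # observed minus expected deaths
        if n > 1:
            v += d * n1 * n2 * (n - d) / (n * n * (n - 1))  # hypergeometric variance
    log_rr = o_minus_e / v
    return math.exp(log_rr), 1.0 / math.sqrt(v)  # (RR, SE of log RR)
```

With identical survival in the two groups, O−E is zero and the estimated RR is 1; a negative (O−E)/V gives an RR below 1, i.e. a mortality reduction in group 1.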
Abstract:
BACKGROUND: The relationship between work-related stress and alcohol intake is uncertain. In order to add to the thus far inconsistent evidence from relatively small studies, we conducted individual-participant meta-analyses of the association between work-related stress (operationalised as self-reported job strain) and alcohol intake. METHODOLOGY AND PRINCIPAL FINDINGS: We analysed cross-sectional data from 12 European studies (n = 142 140) and longitudinal data from four studies (n = 48 646). Job strain and alcohol intake were self-reported. Job strain was analysed as a binary variable (strain vs. no strain). Alcohol intake was harmonised into the following categories: none, moderate (women: 1-14, men: 1-21 drinks/week), intermediate (women: 15-20, men: 22-27 drinks/week) and heavy (women: >20, men: >27 drinks/week). Cross-sectional associations were modelled using logistic regression and the results pooled in random effects meta-analyses. Longitudinal associations were examined using mixed effects logistic and modified Poisson regression. Compared to moderate drinkers, non-drinkers (random effects odds ratio (OR): 1.10, 95% CI: 1.05, 1.14) and heavy drinkers (OR: 1.12, 95% CI: 1.00, 1.26) had higher odds of job strain. Intermediate drinkers, on the other hand, had lower odds of job strain (OR: 0.92, 95% CI: 0.86, 0.99). We found no clear evidence for longitudinal associations between job strain and alcohol intake. CONCLUSIONS: Our findings suggest that compared to moderate drinkers, non-drinkers and heavy drinkers are more likely and intermediate drinkers less likely to report work-related stress.
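The random-effects pooling named in the abstract is commonly done with the DerSimonian-Laird estimator on the log-odds-ratio scale, recovering standard errors from the 95% CIs. The sketch below is a generic illustration of that method; the function name and the toy inputs are assumptions, not the studies' data.

```python
import math

def pool_random_effects(odds_ratios, ci_lowers, ci_uppers):
    """DerSimonian-Laird random-effects pooling of study odds ratios.
    SEs are recovered from 95% CIs on the log scale; returns the pooled
    OR and its 95% CI."""
    y = [math.log(o) for o in odds_ratios]                    # log ORs
    se = [(math.log(u) - math.log(l)) / (2 * 1.96)            # log-scale SEs
          for l, u in zip(ci_lowers, ci_uppers)]
    w = [1.0 / s ** 2 for s in se]                            # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, y))   # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c) if c > 0 else 0.0  # between-study variance
    wr = [1.0 / (s ** 2 + tau2) for s in se]                  # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(wr, y)) / sum(wr)
    se_pooled = 1.0 / math.sqrt(sum(wr))
    return math.exp(pooled), (math.exp(pooled - 1.96 * se_pooled),
                              math.exp(pooled + 1.96 * se_pooled))
```

When the study estimates are homogeneous, Q is small, the between-study variance is zero, and the result coincides with the fixed-effect pooled OR.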
Abstract:
End of award report for the funded research seminar series of the same name.
Abstract:
Objectives: To investigate seasonal variation in month of diagnosis in children with type 1 diabetes registered in EURODIAB centres during 1989-2008.
Methods: 23 population-based registers recorded date of diagnosis in new cases of clinically diagnosed type 1 diabetes in children aged under 15 years. Completeness of ascertainment was assessed through capture-recapture methodology and was high in most centres. A general test for seasonal variation (11 df) and Edwards' test for sinusoidal (sine wave) variation (2 df) were employed. Time series methods were also used to investigate whether meteorological data were predictive of monthly counts after taking account of seasonality and long-term trends.
Results: Significant seasonal variation was apparent in all but two small centres, with an excess of cases apparent in the winter quarter. A significant sinusoidal pattern was also evident in all but two small centres, with peaks in December (14 centres), January (5 centres) or February (2 centres). Relative amplitude varied from ±11% to ±39% (median ±18%). There was no relationship across the centres between relative amplitude and incidence level. However, there was evidence of significant deviation from the sinusoidal pattern in the majority of centres. Pooling results over centres, there was significant seasonal variation in each age-group at diagnosis, but with significantly less variation in those aged under 5 years. Boys showed marginally greater seasonal variation than girls. There were no differences in seasonal pattern between four sub-periods of the 20-year period. In most centres, monthly counts of cases were not associated with deviations from normal monthly average temperature or sunshine hours; short-term meteorological variations do not explain numbers of cases diagnosed.
Conclusions: Seasonality with a winter excess is apparent in all age-groups and both sexes, but girls and the under-5s show less marked variation. The seasonal pattern changed little over the 20-year period.
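Edwards-type tests for sinusoidal seasonality amount to fitting a single harmonic to the monthly counts. A common simple-harmonic form (which may differ in detail from the exact variant used in the study) is sketched below; it returns the 2 df chi-square statistic, the relative amplitude (the ±% figure quoted in the results), and the peak month index.

```python
import math

def edwards_test(monthly_counts):
    """Simple-harmonic (2 df) test for sinusoidal seasonal variation in
    monthly counts. Returns (chi2, relative_amplitude, peak_month_index),
    with month 0 taken as the first entry of the list."""
    k = len(monthly_counts)          # normally 12 months
    n = sum(monthly_counts)
    C = sum(c * math.cos(2 * math.pi * j / k) for j, c in enumerate(monthly_counts))
    S = sum(c * math.sin(2 * math.pi * j / k) for j, c in enumerate(monthly_counts))
    R = math.hypot(C, S)
    chi2 = 2.0 * R * R / n           # ~ chi-square, 2 df, under no seasonality
    amplitude = 2.0 * R / n          # relative amplitude of the fitted sine wave
    peak = (math.atan2(S, C) * k / (2 * math.pi)) % k
    return chi2, amplitude, peak
```

Uniform counts give a statistic near zero; counts following 100 + 20·cos(month angle) give a relative amplitude of 0.2, matching the ±20% style of reporting used above.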
Abstract:
1. Freshwater unionoids are one of the most threatened animal groups worldwide and the freshwater pearl mussel Margaritifera margaritifera is currently listed as critically endangered in Europe. The ‘EC Habitats & Species Directive’ requires that EU member states monitor the distribution and abundance of this species and report regularly on its conservation status.
2. The pearl mussel meta-population in Northern Ireland was surveyed to assess temporal population trends in Special Areas of Conservation (SACs) and mussel reproduction throughout its range.
3. Mussels occurred in six rivers and numbers within three SAC designated sites remained stable between 2004-07 and 2011. The discovery of >8,000 previously unknown individuals in the Owenreagh River contributed to an overall increase (+56.8%) in the total known population. All populations actively reproduced during 2010 with approximately half of all individuals gravid. Moreover, suitable salmonid hosts occurred at all sites with 10.7% of salmon and 22.8% of trout carrying encysted glochidia. Populations were composed entirely of aged individuals with little evidence of recent recruitment.
4. We infer that the break in the life cycle must occur during the juvenile stage when glochidia metamorphose and settle into the interstitial spaces within the substrate. Water quality parameters, most notably levels of suspended solids, exceeded the recommended maximum thresholds in all rivers.
5. We posit that the deposition of silt may be the main cause of juvenile mortality contributing to a lack of recruitment. Consequently, all populations were judged to be in ‘unfavourable’ conservation status. Catchment-level management plans are urgently needed to reduce siltation with the aim of improving recruitment. Our results have implications for the success of ex-situ conservation programmes; specifically, the size at which captive bred juveniles are released into the wild. Further research is required to assess the vulnerabilities of early life stages of M. margaritifera to siltation.
Abstract:
Cystic fibrosis (CF) is characterized by defective mucociliary clearance and chronic airway infection by a complex microbiota. Infection, persistent inflammation and periodic episodes of acute pulmonary exacerbation contribute to an irreversible decline in CF lung function. While the factors leading to acute exacerbations are poorly understood, antibiotic treatment can temporarily resolve pulmonary symptoms and partially restore lung function. Previous studies indicated that exacerbations may be associated with changes in microbial densities and the acquisition of new microbial species. Given the complexity of the CF microbiota, we applied massively parallel pyrosequencing to identify changes in airway microbial community structure in 23 adult CF patients during acute pulmonary exacerbation, after antibiotic treatment and during periods of stable disease. Over 350,000 sequences were generated, representing nearly 170 distinct microbial taxa. Approximately 60% of sequences obtained were from the recognized CF pathogens Pseudomonas and Burkholderia, which were detected in largely non-overlapping patient subsets. In contrast, other taxa including Prevotella, Streptococcus, Rothia and Veillonella were abundant in nearly all patient samples. Although antibiotic treatment was associated with a small decrease in species richness, there was minimal change in overall microbial community structure. Furthermore, microbial community composition was highly similar in patients during an exacerbation and when clinically stable, suggesting that exacerbations may represent intrapulmonary spread of infection rather than a change in microbial community composition. Mouthwash samples, obtained from a subset of patients, showed a nearly identical distribution of taxa as expectorated sputum, indicating that aspiration may contribute to colonization of the lower airways. Finally, we observed a strong correlation between low species richness and poor lung function. 
Taken together, these results indicate that the adult CF lung microbiome is largely stable through periods of exacerbation and antibiotic treatment and that short-term compositional changes in the airway microbiota do not account for CF pulmonary exacerbations.
Abstract:
Increasing energy consumption has exerted great pressure on natural resources; this has led to a move towards sustainable energy resources to improve security of supply and to reduce greenhouse gas emissions. However, the rush to a cure may have been made in haste. Biofuels, in particular, have a bad press both in terms of competition with food for good agricultural land, and in terms of the energy balance over the whole life cycle of the biofuel system. The emphasis is now very much on sustainable biofuel production; biofuels from wastes and lignocellulosic material are now seen as good sustainable biofuels that effect significantly better greenhouse gas balances than first-generation biofuels. Ireland has a significant resource of organic waste that could be a potential source of energy through anaerobic digestion. Ireland has 8% of the cattle population of the EU with less than 1% of the human population; as a result, 91% of agricultural land in Ireland is under grass. Residues such as slurries and slaughter waste, together with energy crops such as grass, have an excellent potential to produce biogas that may be upgraded to biomethane. This biomethane may be used as a natural gas substitute; bio-compressed natural gas may then be an avenue for a biofuel strategy. It is estimated that a maximum potential of 33% of natural gas may be substituted by 2020, with a practical obtainable level estimated at 7.5%. Together with biodiesel from residues, the practical obtainable level of this strategy may effect greater than a 5% substitution by energy of transport. The residues considered in this strategy to produce biofuel (excluding grass) have the potential to save 93,000 ha of agricultural land (23% of Irish arable land) when compared to a rapeseed biodiesel strategy.
Abstract:
Unfavorable work characteristics, such as low job control and too high or too low job demands, have been suggested to increase the likelihood of physical inactivity during leisure time, but this has not been verified in large-scale studies. The authors combined individual-level data from 14 European cohort studies (baseline years from 1985–1988 to 2006–2008) to examine the association between unfavorable work characteristics and leisure-time physical inactivity in a total of 170,162 employees (50% women; mean age, 43.5 years). Of these employees, 56,735 were reexamined after 2–9 years. In cross-sectional analyses, the odds for physical inactivity were 26% higher (odds ratio 1.26, 95% confidence interval: 1.15, 1.38) for employees with high-strain jobs (low control/high demands) and 21% higher (odds ratio 1.21, 95% confidence interval: 1.11, 1.31) for those with passive jobs (low control/low demands) compared with employees in low-strain jobs (high control/low demands). In prospective analyses restricted to physically active participants, the odds of becoming physically inactive during follow-up were 21% and 20% higher for those with high-strain (odds ratio 1.21, 95% confidence interval: 1.11, 1.32) and passive (odds ratio 1.20, 95% confidence interval: 1.11, 1.30) jobs at baseline. These data suggest that unfavorable work characteristics may have a spillover effect on leisure-time physical activity.
Abstract:
Objectives: To assess whether open angle glaucoma (OAG) screening meets the UK National Screening Committee criteria, to compare screening strategies with case finding, to estimate test parameters, to model estimates of cost and cost-effectiveness, and to identify areas for future research. Data sources: Major electronic databases were searched up to December 2005. Review methods: Screening strategies were developed by wide consultation. Markov submodels were developed to represent screening strategies. Parameter estimates were determined by systematic reviews of epidemiology, economic evaluations of screening, and effectiveness (test accuracy, screening and treatment). Tailored highly sensitive electronic searches were undertaken. Results: Most potential screening tests reviewed had an estimated specificity of 85% or higher. No test was clearly most accurate, with only a few, heterogeneous studies for each test. No randomised controlled trials (RCTs) of screening were identified. Based on two treatment RCTs, early treatment reduces the risk of progression. Extrapolating from this, and assuming accelerated progression with advancing disease severity, without treatment the mean time to blindness in at least one eye was approximately 23 years, compared to 35 years with treatment. Prevalence would have to be about 3-4% in 40-year-olds with a screening interval of 10 years to approach cost-effectiveness. It is predicted that screening might be cost-effective in a 50-year-old cohort at a prevalence of 4% with a 10-year screening interval. General population screening at any age, thus, appears not to be cost-effective. Selective screening of groups with higher prevalence (family history, black ethnicity) might be worthwhile, although this would only cover 6% of the population. Extension to include other at-risk cohorts (e.g. myopia and diabetes) would include 37% of the general population, but the prevalence is then too low for screening to be considered cost-effective. 
Screening using a test with initial automated classification followed by assessment by a specialised optometrist, for test positives, was more cost-effective than initial specialised optometric assessment. The cost-effectiveness of the screening programme was highly sensitive to the perspective on costs (NHS or societal). In the base-case model, the NHS costs of visual impairment were estimated as £669. If annual societal costs were £8800, then screening might be considered cost-effective for a 40-year-old cohort with 1% OAG prevalence assuming a willingness to pay of £30,000 per quality-adjusted life-year. Of lesser importance were changes to estimates of attendance for sight tests, incidence of OAG, rate of progression and utility values for each stage of OAG severity. Cost-effectiveness was not particularly sensitive to the accuracy of screening tests within the ranges observed. However, a highly specific test is required to reduce large numbers of false-positive referrals. The findings that population screening is unlikely to be cost-effective are based on an economic model whose parameter estimates have considerable uncertainty, in particular, if rate of progression and/or costs of visual impairment are higher than estimated then screening could be cost-effective. Conclusions: While population screening is not cost-effective, the targeted screening of high-risk groups may be. Procedures for identifying those at risk, for quality assuring the programme, as well as adequate service provision for those screened positive would all be needed. Glaucoma detection can be improved by increasing attendance for eye examination, and improving the performance of current testing by either refining practice or adding in a technology-based first assessment, the latter being the more cost-effective option. This has implications for any future organisational changes in community eye-care services. 
Further research should aim to develop and provide quality data to populate the economic model, by conducting a feasibility study of interventions to improve detection, by obtaining further data on costs of blindness, risk of progression and health outcomes, and by conducting an RCT of interventions to improve the uptake of glaucoma testing.
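The Markov submodels described above can be illustrated with a minimal three-state annual-cycle cohort model (OAG, visual impairment, dead). Every transition probability, cost, and utility below is an invented placeholder for the sketch, not an estimate from the report.

```python
import numpy as np

def run_cohort(p_progress, p_die=0.01, cycles=50, cost_blind=8800.0,
               u_oag=0.9, u_blind=0.5, discount=0.035):
    """Return (discounted cost, discounted QALYs) per person entering with OAG.
    All parameter defaults are illustrative placeholders."""
    T = np.array([
        [1 - p_progress - p_die, p_progress, p_die],  # from OAG
        [0.0,                    1 - p_die,  p_die],  # from visual impairment
        [0.0,                    0.0,        1.0],    # dead is absorbing
    ])
    dist = np.array([1.0, 0.0, 0.0])                  # whole cohort starts in OAG
    cost = qalys = 0.0
    for t in range(cycles):
        d = (1 + discount) ** -t                      # discount factor for cycle t
        cost += d * dist[1] * cost_blind              # annual cost while visually impaired
        qalys += d * (dist[0] * u_oag + dist[1] * u_blind)
        dist = dist @ T                               # advance the cohort one year
    return cost, qalys

# Hypothetical treatment slows progression at a one-off cost of 500.
c0, q0 = run_cohort(p_progress=0.04)    # no treatment
c1, q1 = run_cohort(p_progress=0.025)   # treated
icer = (c1 + 500.0 - c0) / (q1 - q0)    # incremental cost per QALY gained
```

Comparing the two strategies yields incremental discounted costs and QALYs, from which an ICER follows; sensitivity analyses of the kind described above vary the placeholder inputs and rerun the model.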
Abstract:
Purpose: To assess the quality of referrals from community optometrists in the northeast of Scotland to the hospital glaucoma service before and after the implementation of the new General Ophthalmic Services (GOS) contract in Scotland. Methods: Retrospective study encompassing two 6-month periods, one before the implementation of the new GOS (Scotland) contract in April 2006 (from June to November 2005), and the other after (from June to November 2006). The community optometrist referral forms and hospital glaucoma service notes were reviewed. Comparisons were performed using the t-test and χ²-test. Results: In all, 183 referrals were made during the first 6-month period from June to November 2005, and 120 referrals were made during the second 6-month period from June to November 2006. After the introduction of the new GOS contract, there was a statistically significant increase in true-positive referrals (from 18.0 to 31.7%; P=0.006), decrease in false-positive referrals (from 36.6 to 31.7%; P=0.006), and increase in the number of referrals with information on applanation tonometry (from 11.8 to 50.0%; P=0.000), dilated fundal examination (from 2.2 to 24.2%; P=0.000), and repeat visual fields (from 14.8 to 28.3%; P=0.004) when compared to the first 6-month period. However, only 41.7% of referrals fulfilled the new GOS contract requirements, with information on applanation tonometry being the most commonly missing. Conclusions: After the implementation of the new GOS (Scotland) contract in April 2006, there has been an improvement in the quality of the glaucoma referrals from community optometrists in the northeast of Scotland, with a corresponding reduction in false-positive referrals. Despite the relatively positive effect so far, there is still scope for further improvement.
Abstract:
There is extensive debate concerning the cognitive and behavioral adaptation of Neanderthals, especially in the period when the earliest anatomically modern humans dispersed into Western Europe, around 35,000–40,000 B.P. The site of the Grotte du Renne (at Arcy-sur-Cure) is of great importance because it provides the most persuasive evidence for behavioral complexity among Neanderthals. A range of ornaments and tools usually associated with modern human industries, such as the Aurignacian, were excavated from three of the Châtelperronian levels at the site, along with Neanderthal fossil remains (mainly teeth). This extremely rare occurrence has been taken to suggest that Neanderthals were the creators of these items. Whether Neanderthals independently achieved this level of behavioral complexity and whether this was culturally transmitted or mimicked via incoming modern humans has been contentious. At the heart of this discussion lies an assumption regarding the integrity of the excavated remains. One means of testing this is by radiocarbon dating; however, until recently, our ability to generate both accurate and precise results for this period has been compromised. A series of 31 accelerator mass spectrometry ultrafiltered dates on bones, antlers, artifacts, and teeth from six key archaeological levels shows an unexpected degree of variation. This suggests that some mixing of material may have occurred, which implies a more complex depositional history at the site and makes it difficult to be confident about the association of artifacts with human remains in the Châtelperronian levels.