267 results for Subsequent Risk
in University of Queensland eSpace - Australia
Role of dietary factors in the development of basal cell cancer and squamous cell cancer of the skin
Abstract:
The role of dietary factors in the development of skin cancer has been investigated for many years; however, the results of epidemiologic studies have not been systematically reviewed. This article reviews human studies of basal cell cancer (BCC) and squamous cell cancer (SCC) and includes all studies identified in the published scientific literature investigating dietary exposure to fats, retinol, carotenoids, vitamin E, vitamin C and selenium. A total of 26 studies were critically reviewed according to study design and quality of the epidemiologic evidence. Overall, the evidence suggests a positive relationship between fat intake and BCC and SCC, an inconsistent association for retinol, and little relation between beta-carotene and BCC or SCC development. There is insufficient evidence on which to make a judgment about an association of other carotenoids with skin cancer. The evidence for associations between vitamin E, vitamin C, and selenium and both BCC and SCC is weak. Many of the existing studies contain limitations, however, and further well-designed and implemented studies are required to clarify the role of diet in skin cancer. Additionally, the role of other dietary factors, such as flavonoids and other polyphenols, which have been implicated in skin cancer development in animal models, needs to be investigated.
Abstract:
Objectives: Resternotomy is a common part of cardiac surgical practice. Associated with resternotomy are the risks of cardiac injury and catastrophic hemorrhage, and the subsequent elevated morbidity and mortality in the operating room or during the postoperative period. The technique of direct vision resternotomy is safe and is associated with few, if any, serious cardiac injuries. The technique, the reduced need for groin cannulation and the overall low operative mortality and morbidity are the focus of this retrospective analysis. Methods: The records of 495 patients undergoing 546 resternotomies over a 21-year period to January 2000 were reviewed. All consecutive reoperations by the one surgeon were included; patients were over the age of 20 at first resternotomy: M:F 343:203, mean age 57 years (range 20 to 85, median age 60). The mean NYHA grade was 2.3 [67 patients in class I, 273 in class II, 159 in class III, 43 in class IV, and 4 in class V], with elective reoperation in 94.6%. Cardiac injury was graded into five groups, and the incidence of and reasons for groin cannulation were estimated. The morbidity and mortality as a result of the reoperation and resternotomy were assessed. Results: The hospital/30-day mortality was 2.9% (95% CI: 1.6%-4.4%; 16 deaths) over the 21 years. First (481), second (53), and third (12) resternotomies produced 307 uncomplicated technical reopenings, 203 slower but uncomplicated procedures, 9 minor superficial cardiac lacerations, and no moderate or severe cardiac injuries. Direct vision resternotomy is crystallized into the principle that only adhesions that are visualized from below are divided and only sternal bone that is freed of adhesions is sawn. Groin exposure was never performed prophylactically for resternotomy. Fourteen patients (2.6%) had such cannulation for aortic dissection/aneurysm (9 patients), excessive sternal adherence of cardiac structures (3 patients), presurgery cardiac arrest (1 patient), and high aortic cannulation desired but not possible (1 patient). The average postoperative blood loss was 594 mL (95% CI: 558-631) in the first 12 hours. The need to return to the operating room for control of excessive bleeding was 2% (11 patients). Blood transfusion was given in 65% of the resternotomy procedures over the 21 years (mean 854 mL; 95% CI: 765-945 mL) and in 41% over the last 5 years. Conclusions: The technique of direct vision resternotomy has been associated with no moderate or major cardiac injury or catastrophic hemorrhage at reoperation. Few patients have required groin cannulation. In the postoperative period, there were acceptable blood loss and transfusion rates, reduced morbidity, and moderately low mortality for this potentially high-risk group.
Abstract:
Although most prospective cohort studies do not support an association between coffee consumption and pancreatic cancer, the findings for alcohol are inconsistent. Recently, a large prospective cohort study of women reported statistically significant elevations in risk of pancreatic cancer for both coffee and alcoholic beverage consumption. We obtained data on coffee, alcohol, and other dietary factors using semiquantitative food frequency questionnaires administered at baseline (1986 in the Health Professionals Follow-Up Study and 1980 in the Nurses’ Health Study) and in subsequent follow-up questionnaires. Data on other risk factors for pancreatic cancer, including cigarette smoking, were also available. Individuals with a history of cancer at study initiation were excluded from all of the analyses. During the 1,907,222 person-years of follow-up, 288 incident cases of pancreatic cancer were diagnosed. The data were analyzed separately for each cohort, and results were pooled to compute overall relative risks (RR). Neither coffee nor alcohol intakes were associated with an increased risk of pancreatic cancer in either cohort or after pooling the results (pooled RR, 0.62; 95% confidence interval, 0.27–1.43, for >3 cups of coffee/day versus none; and pooled RR, 1.00; 95% confidence interval, 0.57–1.76, for >=30 grams of alcohol/day versus none). The associations did not change with analyses examining different latency periods for coffee and alcohol. Similarly, no statistically significant associations were observed for intakes of tea, decaffeinated coffee, total caffeine, or alcoholic beverages. Data from these two large cohorts do not support any overall association between coffee intake or alcohol intake and risk of pancreatic cancer.
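The pooled relative risks above combine cohort-specific estimates from the two studies; a common way to do this (an assumption here, since the abstract does not state the pooling method) is fixed-effect inverse-variance weighting on the log scale. A minimal Python sketch with purely hypothetical cohort values:

```python
import math

# Hypothetical cohort-specific relative risks and 95% CIs (illustrative only,
# not the actual Health Professionals Follow-Up Study / Nurses' Health Study values).
cohorts = [
    {"rr": 0.55, "lo": 0.20, "hi": 1.50},
    {"rr": 0.70, "lo": 0.25, "hi": 1.95},
]

def pooled_rr(cohorts, z=1.96):
    """Fixed-effect inverse-variance pooling on the log-RR scale."""
    num, den = 0.0, 0.0
    for c in cohorts:
        log_rr = math.log(c["rr"])
        se = (math.log(c["hi"]) - math.log(c["lo"])) / (2 * z)  # SE recovered from CI width
        w = 1.0 / se ** 2
        num += w * log_rr
        den += w
    log_pooled = num / den
    se_pooled = math.sqrt(1.0 / den)
    return (math.exp(log_pooled),
            math.exp(log_pooled - z * se_pooled),
            math.exp(log_pooled + z * se_pooled))

print(pooled_rr(cohorts))  # pooled RR with its 95% CI
```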
Abstract:
The risk of cardiac events in patients undergoing major noncardiac surgery is dependent on their clinical characteristics and the results of stress testing. The purpose of this study was to develop a composite approach to defining levels of risk and to examine whether different approaches to prophylaxis influenced this prediction of outcome. One hundred forty-five consecutive patients (aged 68 +/- 9 years, 79 men) with >1 clinical risk variable were studied with standard dobutamine-atropine stress echo before major noncardiac surgery. Risk levels were stratified according to the presence of ischemia (new or worsening wall motion abnormality), ischemic threshold (heart rate at development of ischemia), and number of clinical risk variables. Patients were followed for perioperative events (during hospital admission) and death or infarction over the subsequent 16 +/- 10 months. Ten perioperative events occurred in the 105 patients who proceeded to surgery (10%, 95% confidence interval [CI] 5% to 17%); surgery was cancelled in 40 patients because of cardiac or other risk. No ischemia was identified in 56 patients, 1 of whom (1.8%) had a perioperative infarction. Of the 49 patients with ischemia, 22 (45%) had 1 or 2 clinical risk factors; 2 (9%, 95% CI 1% to 29%) had events. Another 15 patients had a high ischemic threshold and 3 or 4 risk factors; 3 (20%, 95% CI 4% to 48%) had events. Twelve patients had a low ischemic threshold and 3 or 4 risk factors; 4 (33%, 95% CI 10% to 65%) had events. Preoperative myocardial revascularization was performed in only 3 patients, none of whom had events. Perioperative and long-term events occurred despite the use of beta blockers; 7 of 41 beta blocker-treated patients had a perioperative event (17%, 95% CI 7% to 32%); these treated patients were at higher anticipated risk than untreated patients (20 +/- 24% vs 10 +/- 19%, p = 0.02). The total event rate over late follow-up was 13%, and was predicted by dobutamine-atropine stress echo results and heart rate response. (C) 2002 by Excerpta Medica, Inc.
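The event-rate intervals quoted above (for example, 10 events in 105 patients, 95% CI 5% to 17%) are consistent with exact binomial confidence limits for a proportion. A minimal sketch, assuming statsmodels is available:

```python
from statsmodels.stats.proportion import proportion_confint

events, n = 10, 105  # perioperative events among patients who proceeded to surgery
rate = events / n
# Clopper-Pearson (exact) binomial interval.
lo, hi = proportion_confint(events, n, alpha=0.05, method="beta")
print(f"{rate:.1%} (95% CI {lo:.1%} to {hi:.1%})")  # ~9.5% (about 4.7% to 16.8%)
```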
Abstract:
Objective: To determine the feasibility, safety and effectiveness of a structured clinical pathway for stratification and management of patients presenting with chest pain and classified as having intermediate risk of adverse cardiac outcomes in the subsequent six months. Design: Prospective clinical audit. Participants and setting: 630 consecutive patients who presented to the emergency department of a metropolitan tertiary care hospital between January 2000 and June 2001 with chest pain and intermediate-risk features. Intervention: Use of the Accelerated Chest Pain Assessment Protocol (ACPAP), as advocated by the Management of unstable angina guidelines - 2000 from the National Heart Foundation and the Cardiac Society of Australia and New Zealand. Main outcome measure: Adverse cardiac events during six-month follow-up. Results: 409 patients (65%) were reclassified as low risk and discharged at a mean of 14 hours after assessment in the chest pain unit. None had missed myocardial infarctions, while three (1%) had cardiac events at six months (all elective revascularisation procedures, with no readmissions with acute coronary syndromes). Another 110 patients (17%) were reclassified as high risk, and 21 (19%) of these had cardiac events (mainly revascularisations) by six months. Patients who were unable to exercise or had non-diagnostic exercise stress test results (equivocal risk) had an intermediate cardiac event rate (8%). Conclusions: This study validates use of ACPAP. The protocol eliminated missed myocardial infarction; allowed early, safe discharge of low-risk patients; and led to early identification and management of high-risk patients.
Abstract:
Uses research in a major UK company on the introduction of an electronic document management system to explore perceptions of, and attitudes to, risk. Phenomenological methods were used, with the subsequent dialogue transcripts evaluated in Winmax dialogue software using an adapted theoretical framework based upon an analysis of the literature. The paper identifies a number of factors and builds a framework that should support a greater understanding of risk assessment and project management by the academic community and practitioners.
Abstract:
Fundamental principles of precaution are legal maxims that ask for preventive actions, perhaps as contingent interim measures while relevant information about causality and harm remains unavailable, to minimize the societal impact of potentially severe or irreversible outcomes. Such principles do not explain how to make choices or how to identify what is protective when incomplete and inconsistent scientific evidence of causation characterizes the potential hazards. Rather, they entrust lower jurisdictions, such as agencies or authorities, to make current decisions while recognizing that future information can contradict the scientific basis that supported the initial decision. After reviewing and synthesizing national and international legal aspects of precautionary principles, this paper addresses the key question: How can society manage potentially severe, irreversible or serious environmental outcomes when variability, uncertainty, and limited causal knowledge characterize its decision-making? A decision-analytic solution is outlined that focuses on risky decisions and accounts for prior states of information and scientific beliefs that can be updated as subsequent information becomes available. As a practical and established approach to causal reasoning and decision-making under risk, inherent to precautionary decision-making, these (Bayesian) methods help decision-makers and stakeholders because they formally account for probabilistic outcomes and new information, and are consistent and replicable. Rational choice of an action from among various alternatives, defined as a choice that makes preferred consequences more likely, requires accounting for costs, benefits and the change in risks associated with each candidate action. Decisions under any form of the precautionary principle reviewed must account for the contingent nature of scientific information, creating a link to the decision-analytic principle of expected value of information (VOI), to show the relevance of new information relative to the initial (and smaller) set of data on which the decision was based. We exemplify this seemingly simple situation using risk management of BSE. As an integral aspect of causal analysis under risk, the methods developed in this paper permit the addition of non-linear, hormetic dose-response models to the current set of regulatory defaults, such as the linear, non-threshold models. This increase in the number of defaults is an important improvement because most of the variants of the precautionary principle require cost-benefit balancing. Specifically, increasing the set of causal defaults accounts for beneficial effects at very low doses. We also show and conclude that quantitative risk assessment dominates qualitative risk assessment, supporting the extension of the set of default causal models.
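The expected value of information (VOI) principle referenced above can be made concrete with a toy two-state, two-action decision; the probabilities and losses below are hypothetical and purely illustrative, not drawn from the BSE example:

```python
# Toy expected-value-of-perfect-information (EVPI) calculation.
# States: hazard present / absent; actions: intervene / do nothing.
p_hazard = 0.2                       # prior probability the hazard is real (hypothetical)
losses = {
    "intervene":  {"hazard": 10, "no_hazard": 10},   # cost of precautionary action
    "do_nothing": {"hazard": 100, "no_hazard": 0},   # cost of unmitigated harm
}

def expected_loss(action, p):
    return p * losses[action]["hazard"] + (1 - p) * losses[action]["no_hazard"]

# Best (minimum expected loss) action under current, prior information.
prior_loss = min(expected_loss(a, p_hazard) for a in losses)

# With perfect information we would pick the best action in each state separately.
loss_if_hazard = min(losses[a]["hazard"] for a in losses)
loss_if_no_hazard = min(losses[a]["no_hazard"] for a in losses)
perfect_info_loss = p_hazard * loss_if_hazard + (1 - p_hazard) * loss_if_no_hazard

evpi = prior_loss - perfect_info_loss
print(f"expected loss now: {prior_loss}, with perfect info: {perfect_info_loss}, EVPI: {evpi}")
```

A positive EVPI indicates that gathering further evidence is worth something before committing to an irreversible choice, which is the link the authors draw between precaution and decision analysis.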
Abstract:
Aim: To determine if Campylobacter jejuni grown at 37 and 42 degrees C have different abilities to survive on beef and chicken, and in water. Methods and Results: Beef, chicken and water were separately inoculated with four Camp. jejuni strains (two poultry and two beef) grown at 37 or 42 degrees C. The matrices were stored at ~4 degrees C and Camp. jejuni numbers were monitored over time by plate counts. On beef there was a greater decrease in numbers for two strains (P < 0.05; ~0.7 and 1.3 log CFU cm^-2) grown at 37 degrees C as compared with 42 degrees C. By contrast, on chicken there was a greater decrease in numbers for two strains (P < 0.05; ~1.3 and 1 log CFU g^-1) grown at 42 degrees C as compared with 37 degrees C. In water there was a greater decrease in numbers for all strains (P < 0.05; ~3-5.3 log CFU ml^-1) grown at 42 degrees C as compared with 37 degrees C. Conclusions: Growth temperature influences the survival of Camp. jejuni on food and in water. Significance and Impact of the Study: Campylobacter jejuni survival studies need to consider growth temperature to avoid erroneous results. Campylobacter jejuni grown at 37 degrees C, the body temperature of humans and cattle, may represent a greater public health risk in water than those grown at 42 degrees C, the body temperature of poultry.
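The declines reported above are log10 reductions in plate counts; a minimal sketch of that arithmetic, using hypothetical counts:

```python
import math

# Hypothetical plate counts (CFU per cm^2) at inoculation and after chilled storage.
cfu_initial = 5.0e6
cfu_final = 1.0e6

log_reduction = math.log10(cfu_initial) - math.log10(cfu_final)
print(f"decrease of ~{log_reduction:.1f} log CFU cm^-2")  # ~0.7 log reduction
```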
Abstract:
There is some evidence that dietary factors may modify the risk of squamous cell carcinoma (SCC) of the skin, but the association between food intake and SCC has not been evaluated prospectively. We examined the association between food intake and SCC incidence among 1,056 randomly selected adults living in an Australian sub-tropical community. Measurement-error-corrected estimates of intake in 15 food groups were derived from a validated food frequency questionnaire in 1992. Associations with SCC risk were assessed using Poisson and negative binomial regression applied to persons affected and tumour counts, respectively, based on incident, histologically confirmed tumours occurring between 1992 and 2002. After multivariable adjustment, none of the food groups was significantly associated with SCC risk. Stratified analysis in participants with a past history of skin cancer showed a decreased risk of SCC tumours for high intakes of green leafy vegetables (RR = 0.45, 95% CI: 0.22-0.91; p for trend = 0.02) and an increased risk for high intake of unmodified dairy products (RR = 2.53, 95% CI: 1.15-5.54; p for trend = 0.03). Food intake was not associated with SCC risk in persons who had no past history of skin cancer. These findings suggest that consumption of green leafy vegetables may help prevent development of subsequent SCCs of the skin among people with previous skin cancer and that consumption of unmodified dairy products, such as whole milk, cheese and yoghurt, may increase SCC risk in susceptible persons.
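A minimal sketch of the two count-regression models named above (Poisson for persons affected, negative binomial allowing over-dispersion in tumour counts), assuming statsmodels is available; the variables and data are placeholders, not the study's actual food groups or adjustment set:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical data: incident SCC tumour counts and a food-group intake tertile.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "tumours": rng.poisson(0.3, size=500),          # incident SCC tumour count
    "green_veg_tertile": rng.integers(1, 4, 500),   # intake tertile (1-3), placeholder exposure
    "age": rng.normal(55, 10, 500),                 # placeholder covariate
})
X = sm.add_constant(df[["green_veg_tertile", "age"]])

# Poisson regression of the count outcome.
poisson_fit = sm.GLM(df["tumours"], X, family=sm.families.Poisson()).fit()

# Negative binomial regression accommodates over-dispersed tumour counts.
negbin_fit = sm.GLM(df["tumours"], X, family=sm.families.NegativeBinomial()).fit()

# Rate ratio per tertile of intake, with its 95% CI, from the Poisson model.
print(np.exp(poisson_fit.params["green_veg_tertile"]),
      np.exp(poisson_fit.conf_int().loc["green_veg_tertile"]).values)
```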
Abstract:
In recent years, the phrase 'genomic medicine' has increasingly been used to describe a new development in medicine that holds great promise for human health. This new approach to health care uses the knowledge of an individual's genetic make-up to identify those that are at a higher risk of developing certain diseases and to intervene at an earlier stage to prevent these diseases. Identifying genes that are involved in disease aetiology will provide researchers with tools to develop better treatments and cures. A major role within this field is attributed to 'predictive genomic medicine', which proposes screening healthy individuals to identify those who carry alleles that increase their susceptibility to common diseases, such as cancers and heart disease. Physicians could then intervene even before the disease manifests and advise individuals with a higher genetic risk to change their behaviour - for instance, to exercise or to eat a healthier diet - or offer drugs or other medical treatment to reduce their chances of developing these diseases. These promises have fallen on fertile ground among politicians, health-care providers and the general public, particularly in light of the increasing costs of health care in developed societies. Various countries have established databases on the DNA and health information of whole populations as a first step towards genomic medicine. Biomedical research has also identified a large number of genes that could be used to predict someone's risk of developing a certain disorder. But it would be premature to assume that genomic medicine will soon become reality, as many problems remain to be solved. Our knowledge about most disease genes and their roles is far from sufficient to make reliable predictions about a patient’s risk of actually developing a disease. In addition, genomic medicine will create new political, social, ethical and economic challenges that will have to be addressed in the near future.
Abstract:
Parkinson’s disease (PD) is a progressive, degenerative neurological disease. The progressive disability associated with PD results in substantial burdens for those with the condition, their families and society in terms of increased health resource use, earnings loss of affected individuals and family caregivers, poorer quality of life, caregiver burden, disrupted family relationships, decreased social and leisure activities, and deteriorating emotional well-being. Currently, no cure is available and the efficacy of available treatments, such as medication and surgical interventions, decreases with longer duration of the disease. Whilst the cause of PD is unknown, genetic and environmental factors are believed to contribute to its aetiology. Descriptive and analytical epidemiological studies have been conducted in a number of countries in an effort to elucidate the cause, or causes, of PD. Rural residency, farming, well water consumption, pesticide exposure, metals and solvents have been implicated as potential risk factors for PD in some previous epidemiological studies. However, there is substantial disagreement between the results of existing studies. Therefore, the role of environmental exposures in the aetiology of PD remains unclear. The main component of this thesis consists of a case-control study that assessed the contribution of environmental exposures to the risk of developing PD. An existing, previously unanalysed, dataset from a local case-control study was analysed to inform the design of the new case-control study. The analysis results suggested that regular exposure to pesticides and head injury were important risk factors for PD. However, due to the substantial limitations of this existing study, further confirmation of these results was desirable with a more robustly designed epidemiological study. A new exposure measurement instrument (a structured interviewer-delivered questionnaire) was developed for the new case-control study to obtain data on demographic, lifestyle, environmental and medical factors. Prior to its use in the case-control study, the questionnaire was assessed for test-retest repeatability in a series of 32 PD cases and 29 healthy sex-, age- and residential suburb-matched electoral roll controls. High repeatability was demonstrated for lifestyle exposures, such as smoking and coffee/tea consumption (kappas 0.70-1.00). The majority of environmental exposures, including use of pesticides, solvents and exposure to metal dusts and fumes, also showed high repeatability (kappas >0.78). A consecutive series of 163 PD case participants was recruited from a neurology clinic in Brisbane. One hundred and fifty-one (151) control participants were randomly selected from the Australian Commonwealth Electoral Roll and individually matched to the PD cases on age (± 2 years), sex and current residential suburb. Participants ranged in age from 40-89 years (mean age 67 years). Exposure data were collected in face-to-face interviews. Odds ratios and 95% confidence intervals were calculated using conditional logistic regression for matched sets in SAS version 9.1. Consistent with previous studies, ever having been a regular smoker or coffee drinker was inversely associated with PD, with dose-response relationships evident for pack-years smoked and number of cups of coffee drunk per day. Passive smoking from ever having lived with a smoker or worked in a smoky workplace was also inversely related to PD.
Ever having been a regular tea drinker was associated with decreased odds of PD. Hobby gardening was inversely associated with PD. However, use of fungicides in the home garden or occupationally was associated with increased odds of PD. Exposure to welding fumes, cleaning solvents, or thinners occupationally was associated with increased odds of PD. Ever having resided in a rural or remote area was inversely associated with PD. Ever having resided on a farm was associated with only moderately increased odds of PD. Whilst the current study’s results suggest that environmental exposures on their own are only modest contributors to overall PD risk, the possibility that interaction with genetic factors may additively or synergistically increase risk should be considered. The results of this research support the theory that PD has a multifactorial aetiology and that environmental exposures are among a number of factors that contribute to PD risk. There was also evidence of interaction between some factors (e.g. smoking and welding) in moderating PD risk.
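A minimal sketch of conditional logistic regression for individually matched case-control sets, the analysis used above (in SAS); the sketch assumes statsmodels' ConditionalLogit is available, and the exposures and data are placeholders, not the study's variables:

```python
import numpy as np
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

# Hypothetical matched-pair data: one PD case and one control per matched set.
rng = np.random.default_rng(1)
n_pairs = 150
df = pd.DataFrame({
    "matched_set": np.repeat(np.arange(n_pairs), 2),
    "case": np.tile([1, 0], n_pairs),                  # 1 = PD case, 0 = matched control
    "ever_smoker": rng.integers(0, 2, 2 * n_pairs),    # placeholder binary exposure
    "pesticide_use": rng.integers(0, 2, 2 * n_pairs),  # placeholder binary exposure
})

# Conditioning on matched_set removes the matching factors (age, sex, suburb) from the model.
model = ConditionalLogit(df["case"], df[["ever_smoker", "pesticide_use"]],
                         groups=df["matched_set"])
fit = model.fit()

# Odds ratios and 95% CIs for the matched analysis.
print(np.exp(fit.params))
print(np.exp(fit.conf_int()))
```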
Abstract:
A participative ergonomics approach to reducing injuries associated with manual tasks is widely promoted; however, only limited evidence from uncontrolled trials has been available to support the efficacy of such an approach. This paper reports on a randomized and controlled trial of PErforM, a participative ergonomics intervention designed to reduce the risks of injury associated with manual tasks. One hundred and seventeen small to medium-sized food, construction, and health workplaces were audited by government inspectors using a manual tasks risk assessment tool (ManTRA). Forty-eight volunteer workplaces were then randomly assigned to Experimental and Control groups, with the Experimental group receiving the PErforM program. Inspectors audited the workplaces again 9 months following the intervention. The results showed a significant decrease in estimates of manual task risk and suggested better legal compliance in the Experimental group.