968 results for Subsequent Risk
Abstract:
Spontaneous bacterial peritonitis (SBP) is considered to result from bacterial translocation from the gastrointestinal lumen to the mesenteric lymph nodes and subsequently into the circulation. Variants of the NOD2 gene contribute to bacterial translocation and were associated with SBP in a recent study.
Abstract:
Background Chronic localized pain syndromes, especially chronic low back pain (CLBP), are common reasons for consultation in general practice. In some cases chronic localized pain syndromes can appear in combination with chronic widespread pain (CWP). Numerous studies have shown a strong association between CWP and several physical and psychological factors. These studies are population-based and cross-sectional and do not allow chronology to be assessed. There are very few prospective studies exploring predictors for the onset of CWP, and their main focus is identifying risk factors for CWP incidence. Until now there have been no studies focusing on protective factors that keep patients from developing CWP. Our aim is to perform a cross-sectional study on the epidemiology of CLBP and CWP in general practice and to look for distinctive features regarding resources such as resilience, self-efficacy and coping strategies. A subsequent cohort study is designed to identify risk and protective factors for pain generalization (development of CWP) in primary care patients with CLBP. Methods/Design Fifty-nine general practitioners consecutively recruit, during a 5-month period, all patients consulting their family doctor because of chronic low back pain (pain that has lasted for 3 months or longer). Patients are asked to fill out a questionnaire on pain anamnesis, pain perception, co-morbidities, therapy course, medication, sociodemographic data and psychosomatic symptoms. We assess resilience, coping resources, stress management and self-efficacy as potential protective factors against pain generalization. Furthermore, we assess risk factors for pain generalization such as anxiety, depression, trauma and critical life events. During a twelve-month follow-up period, a cohort of CLBP patients without CWP will be screened at regular intervals (every 3 months) for pain generalization (outcome: incident CWP). Discussion This cohort study will be the largest study to prospectively analyze predictors of the transition from CLBP to CWP in a primary care setting. In contrast to the typically researched risk factors, which increase the probability of pain generalization, this study also focuses intensively on protective factors, which decrease the probability of pain generalization.
Abstract:
As a consequence of flood impacts, communities inhabiting mountain areas are increasingly affected by considerable damage to infrastructure and property. The design of effective flood risk mitigation strategies and their subsequent implementation is crucial for sustainable development in mountain areas. The assessment of the dynamic evolution of flood risk is the pillar of any subsequent planning process targeted at reducing the expected adverse consequences of the hazard impact. Given these premises, firstly, a comprehensive method to derive flood hazard process scenarios for well-defined areas at risk is presented. Secondly, conceptualisations of static and dynamic flood risk assessment are provided. These are based on formal schemes for computing the risk mitigation performance of the devised mitigation strategies within the framework of economic cost-benefit analysis. In this context, techniques suitable for quantifying the expected losses induced by the identified flood impacts are provided.
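For readers unfamiliar with the cost-benefit framework referred to above, the following minimal Python sketch illustrates how an expected annual damage figure and a benefit-cost ratio for a mitigation measure might be computed. All probabilities, losses and costs are hypothetical and are not taken from the study.

```python
# Illustrative sketch only: a minimal expected-annual-damage (EAD) and
# benefit-cost calculation of the kind used in flood risk cost-benefit
# analysis. Scenario probabilities and loss figures are hypothetical.

scenarios = [                      # (annual exceedance probability, loss in EUR)
    (1 / 30,  2_000_000),          # frequent, moderate event
    (1 / 100, 8_000_000),          # rare, severe event
    (1 / 300, 20_000_000),         # very rare, extreme event
]

def expected_annual_damage(scen):
    """Sum of probability-weighted losses over the hazard scenarios."""
    return sum(p * loss for p, loss in scen)

ead_before = expected_annual_damage(scenarios)

# Assume a mitigation measure that halves the loss of each scenario.
mitigated = [(p, 0.5 * loss) for p, loss in scenarios]
ead_after = expected_annual_damage(mitigated)

annual_benefit = ead_before - ead_after        # avoided expected losses per year
annualised_cost = 60_000                       # hypothetical annualised cost of the measure
benefit_cost_ratio = annual_benefit / annualised_cost

print(f"EAD before: {ead_before:,.0f} EUR/yr, after: {ead_after:,.0f} EUR/yr")
print(f"Benefit-cost ratio: {benefit_cost_ratio:.2f}")
```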
Abstract:
BACKGROUND: We sought to determine whether a high-risk group could be defined among patients with operable breast cancer in whom a search of occult central nervous system (CNS) metastases was justified. PATIENTS AND METHODS: We evaluated data from 9524 women with early breast cancer (42% node-negative) who were randomized in International Breast Cancer Study Group clinical trials between 1978 and 1999, and treated without anthracyclines, taxanes, or trastuzumab. We identified patients whose site of first event was CNS and those who had a CNS event at any time. RESULTS: Median follow-up was 13 years. The 10-year incidence (10-yr) of CNS relapse was 5.2% (1.3% as first recurrence). Factors predictive of CNS as first recurrence included: node-positive disease (10-yr = 2.2% for > 3 N+), estrogen receptor-negative (2.3%), tumor size > 2 cm (1.7%), tumor grade 3 (2.0%), < 35 years old (2.2%), HER2-positive (2.7%), and estrogen receptor-negative and node-positive (2.6%). The risk of subsequent CNS recurrence was elevated in patients experiencing lung metastases (10-yr = 16.4%). CONCLUSION: Based on this large cohort we were able to define risk factors for CNS metastases, but could not define a group at sufficient risk to justify routine screening for occult CNS metastases.
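As a side note on the 10-year cumulative incidence figures quoted above, a minimal sketch of how such an estimate can be obtained from time-to-event data is shown below. It uses the lifelines library on synthetic data and, for simplicity, treats non-CNS events as censoring (a Kaplan-Meier approximation rather than a proper competing-risks estimate).

```python
# Minimal sketch: estimating a 10-year cumulative incidence of an event
# (here, CNS relapse) from time-to-event data with the lifelines library.
# The data below are synthetic; a real analysis of "CNS as first event"
# should use a competing-risks estimator rather than plain Kaplan-Meier.
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(0)
n = 5000
follow_up_years = rng.exponential(scale=13.0, size=n)   # time to event or censoring
cns_event = rng.random(n) < 0.05                         # ~5% experience the event

kmf = KaplanMeierFitter()
kmf.fit(durations=follow_up_years, event_observed=cns_event)

surv_10y = kmf.survival_function_at_times(10.0).iloc[0]
print(f"Estimated 10-year cumulative incidence: {1 - surv_10y:.1%}")
```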
Abstract:
BACKGROUND: Many epidemiological studies indicate a positive correlation between cataract surgery and the subsequent progression of age-related macular degeneration (AMD). Such a correlation would have far-reaching consequences. However, in epidemiological studies it is difficult to determine the significance of a single risk factor, such as cataract surgery. PATIENTS AND METHODS: We performed a retrospective case-control study of patients with new-onset exudative age-related macular degeneration to determine whether cataract surgery was a predisposing factor. A total of 1496 eyes were included in the study: 984 cases with new onset of exudative AMD and 512 control eyes with early signs of age-related maculopathy. Lens status (phakic or pseudophakic) was determined for each eye. RESULTS: There was no significant difference in lens status between the study and control groups (227/984 [23.1%] vs. 112/512 [21.8%] pseudophakic; p = 0.6487; OR = 1.071; 95% CI = 0.8284-1.384). In cases with bilateral pseudophakia (n = 64), there was no statistically significant difference in the interval between cataract surgery in either eye and the onset of exudative AMD in the study eye (225.9 +/- 170.4 vs. 209.9 +/- 158.2 weeks, p = 0.27). CONCLUSIONS: Our results provide evidence that cataract surgery is not a major risk factor for the development of exudative AMD.
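The odds ratio and confidence interval reported above can be reproduced directly from the 2x2 counts given in the abstract; a minimal Python sketch of that calculation, using the standard Woolf (logit) method for the confidence interval, follows.

```python
# Reproducing the reported odds ratio and 95% CI from the 2x2 table in the
# abstract: 227/984 pseudophakic eyes among cases, 112/512 among controls.
import math

a, b = 227, 984 - 227      # cases: pseudophakic, phakic
c, d = 112, 512 - 112      # controls: pseudophakic, phakic

odds_ratio = (a / b) / (c / d)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # Woolf (logit) standard error
z = 1.96                                               # ~95% normal quantile
ci_low = math.exp(math.log(odds_ratio) - z * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + z * se_log_or)

print(f"OR = {odds_ratio:.3f}, 95% CI {ci_low:.4f}-{ci_high:.3f}")
# -> OR ≈ 1.071, 95% CI ≈ 0.828-1.384, in line with the reported values
```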
Abstract:
Mass screening for osteoporosis using DXA measurements at the spine and hip is presently not recommended by health authorities. Instead, risk factor questionnaires and peripheral bone measurements may facilitate the selection of women eligible for axial bone densitometry. The aim of this study was to validate a case-finding strategy for postmenopausal women who would benefit most from subsequent DXA measurement, using phalangeal radiographic absorptiometry (RA) alone or in combination with risk factors in a general practice setting. The sensitivity and specificity of this strategy in detecting osteoporosis (T-score ≤ -2.5 SD at the spine and/or the hip) were compared with those of the current reimbursement criteria for DXA measurements in Switzerland. Four hundred and twenty-three postmenopausal women with one or more risk factors for osteoporosis were recruited by 90 primary care physicians, who also performed the phalangeal RA measurements. All women underwent subsequent DXA measurement of the spine and the hip at the Osteoporosis Policlinic of the University Hospital of Berne. They were allocated to one of two groups depending on whether or not they met the Swiss reimbursement conditions for DXA measurement. Logistic regression models were used to predict the likelihood of osteoporosis versus "no osteoporosis" and to derive ROC curves for the various strategies. Differences in the areas under the ROC curves (AUC) were tested for significance. In women not meeting the reimbursement criteria, RA achieved a significantly larger AUC (0.81; 95% CI 0.72-0.89) than the risk factors age, height and weight (0.71; 95% CI 0.62-0.80). Furthermore, in this study RA provided better sensitivity and specificity in identifying women with underlying osteoporosis than the currently accepted criteria for reimbursement of DXA measurement. In the Swiss environment, RA is a valid case-finding tool for patients with risk factors for osteoporosis, especially for those who do not qualify for DXA reimbursement.
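As a rough illustration of the ROC-curve comparison described above, the sketch below fits logistic models to synthetic data and computes the area under the ROC curve with scikit-learn. The variable names, effect sizes and data are invented for illustration only and do not reproduce the study's results.

```python
# Minimal sketch of the kind of analysis described: fit logistic models
# predicting osteoporosis from simple risk factors, with and without a
# peripheral bone measurement, then summarise discrimination with the AUC.
# All data here are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 423
age = rng.normal(65, 8, n)
weight = rng.normal(68, 12, n)
ra_tscore = rng.normal(-1.5, 1.2, n)               # phalangeal RA result (synthetic)
# Synthetic "true" osteoporosis status, loosely related to the predictors.
logit = -0.5 + 0.04 * (age - 65) - 0.03 * (weight - 68) - 1.2 * ra_tscore
osteoporosis = rng.random(n) < 1 / (1 + np.exp(-logit))

X_risk_only = np.column_stack([age, weight])
X_with_ra = np.column_stack([age, weight, ra_tscore])

for label, X in [("risk factors only", X_risk_only), ("risk factors + RA", X_with_ra)]:
    model = LogisticRegression(max_iter=1000).fit(X, osteoporosis)
    auc = roc_auc_score(osteoporosis, model.predict_proba(X)[:, 1])
    print(f"AUC ({label}): {auc:.2f}")
```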
Abstract:
Animal and early clinical studies of gene therapy for tissue ischaemia suggested that this approach might provide benefit to patients with coronary artery disease not amenable to traditional revascularization. This enthusiasm was then tempered by the subsequent disappointing results of randomized clinical trials, which led researchers to develop strategies using progenitor cells as an alternative means of improving collateral function. However, the recent publication of several randomized clinical trials reporting either negative or weakly positive results with this approach has led to questions regarding its effectiveness. Several factors need to be considered in explaining the discordance between the positive studies of such treatments in animals and the disappointing results seen in randomized patient trials. Aside from the practical issues of arteriogenic therapies, such as effective delivery, vascular remodelling is an extraordinarily complex process, and the administration of a single agent or cell in the hope that it would lead to lasting physiological effects may be far too simplistic an approach. In addition, evidence now suggests that many of the traditional cardiovascular risk factors, such as age and hypercholesterolemia, may impair the host response not only to ischaemia but, critically, also to treatment. This review discusses the evidence and mechanisms for these observations and highlights future directions that might be taken in an effort to provide more effective therapies.
Abstract:
BACKGROUND: This study analyzed the impact of the weight reduction method and of preoperative and intraoperative variables on the outcome of reconstructive body contouring surgery following massive weight reduction. METHODS: All patients presenting with a maximal BMI ≥ 35 kg/m² before weight reduction who underwent body contouring surgery of the trunk following massive weight loss (excess body mass index loss (EBMIL) ≥ 30%) between January 2002 and June 2007 were retrospectively analyzed. Incomplete records or follow-up led to exclusion. Statistical analysis focused on the weight reduction method and on pre-, intra-, and postoperative risk factors. The outcome was compared to current literature results. RESULTS: A total of 104 patients were included (87 female and 17 male; mean age 47.9 years). Massive weight reduction was achieved through bariatric surgery in 62 patients (59.6%) and dietetically in 42 patients (40.4%). In this cohort, dietetically achieved EBMIL (94.20%) was higher than surgically induced EBMIL (80.80%; p < 0.01). Bariatric surgery did not present increased risks of complications for the secondary body contouring procedures. The observed complications (26.9%) were analyzed for risk factors. Total tissue resection weight was a significant risk factor (p < 0.05). Preoperative BMI had an impact on infections (p < 0.05). No impact on the postoperative outcome was detected for EBMIL, maximal BMI, smoking, hemoglobin, blood loss, body contouring technique or operation time. Corrective procedures were performed in 11 patients (10.6%). The results were compared to recent data. CONCLUSION: Bariatric surgery does not increase the risk of complications in subsequent body contouring procedures when compared to massive dietetic weight reduction.
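Since the outcomes above are reported in terms of EBMIL, a brief illustration may help. The sketch below computes percentage excess BMI loss using the definition commonly used in the bariatric literature, with excess BMI taken relative to 25 kg/m²; both this definition and the example values are assumptions for illustration, not taken from the study.

```python
# Illustrative sketch (not from the study): percentage excess BMI loss
# (%EBMIL) as it is commonly defined in the bariatric literature,
#   %EBMIL = (initial BMI - current BMI) / (initial BMI - 25) * 100,
# with 25 kg/m^2 taken as the upper limit of normal BMI.

def ebmil(initial_bmi: float, current_bmi: float) -> float:
    """Percentage of excess BMI lost between two time points."""
    excess_bmi = initial_bmi - 25.0
    if excess_bmi <= 0:
        raise ValueError("Initial BMI must exceed 25 kg/m^2 to define excess BMI.")
    return (initial_bmi - current_bmi) / excess_bmi * 100.0

# Hypothetical patient: BMI 42 before weight loss, BMI 27 at presentation.
print(f"%EBMIL = {ebmil(42.0, 27.0):.1f}%")   # -> 88.2%, above the 30% inclusion threshold
```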
Abstract:
This study looks at passenger air bag (PAB) performance in a fixed vehicle environment using Partial Low Risk Deployment (PLRD) as a strategy. The development follows test methods against actual baseline vehicle data and Federal Motor Vehicle Safety Standard 208 (FMVSS 208). FMVSS 208 states that PAB compliance in vehicle crash testing can be met using one of three deployment methods. The primary method suppresses PAB deployment, with the use of a seat weight sensor or occupant classification sensor (OCS), for three-year-old and six-year-old occupants, including the presence of a child seat. A second method, PLRD, allows deployment for occupants of all sizes, suppressing only for the presence of a child seat. A third method, Low Risk Deployment (LRD), allows PAB deployment in all conditions, for all statures, including any and all child seats. This study outlines a PLRD development solution for achieving FMVSS 208 performance. The results of this study should provide an option for system implementation, including opportunities for system efficiency and other considerations. The objective is to achieve performance levels similar to, or incrementally better than, the baseline vehicle's New Car Assessment Program (NCAP) star rating and, in addition, to define systemic flexibility whereby restraint features can be added or removed while improving occupant performance consistency relative to the baseline. A certified vehicle's air bag system will typically remain in production until the vehicle platform is redesigned. The strategy to support the PLRD hypothesis is, first, to match the baseline out-of-position occupant performance (OOP) for the three- and six-year-old requirements; second, to improve the 35 mph belted 5th percentile female NCAP star rating over the baseline vehicle; and third, to establish an equivalent FMVSS 208 certification for the 25 mph unbelted 50th percentile male. The FMVSS 208 high-speed requirement defines the federal minimum crash performance required for meeting frontal vehicle crash-test compliance. The intent of the NCAP 5-star rating is to provide the consumer with information about crash protection beyond what is required by federal law. In this study, vehicles from two segments were tested to compare and contrast against their baseline vehicles' performance. Case Study 1 (CS1) used a crossover vehicle platform and Case Study 2 (CS2) used a small vehicle segment platform as their baselines. In each case study, the restraint systems came from different restraint supplier manufacturers, and each case contained that supplier's approach to PLRD. CS1 incorporated a downsized twin-shaped bag, a carryover inflator, standard vents, and a strategically positioned bag diffuser to help disperse the flow of gas and improve OOP performance. The twin-shaped bag, with two segregated sections (lobes), enabled high-speed baseline performance correlation on the HYGE sled. CS2 used an asymmetric (square-shaped) PAB with standard-size vents, including a passive vent, to obtain OOP performance similar to the baseline. The asymmetric bag shape also helped enable high-speed baseline performance improvements in HYGE sled testing in CS2. The anticipated CS1 baseline vehicle pulse index (VPI) target was in the range of 65-67; however, actual dynamic vehicle (barrier) testing produced the highest crash pulse of the previously tested vehicles, with a VPI of 71. The result from the 35 mph NCAP barrier test was a solid 4-star (4.7-star) rating.
In CS2, the baseline vehicle HYGE sled development VPI range was 61-62. The actual NCAP test produced a chest deflection result of 26 mm versus the anticipated baseline target of 12 mm. This condition was initially attributed to the vehicle's significant VPI increase to 67, but a subsequent root-cause investigation confirmed a data integrity issue caused by the instrumentation. In an effort to establish a true vehicle test data point, a second NCAP test was performed but faced similar instrumentation issues. In that test the chest deflection hit the target of 12.1 mm; however, a femur load spike, similar to the baseline, now skewed the results. Given the noted level of improvement in chest deflection, the NCAP performance was assessed as directionally capable of a 5-star rating: the actual rating was 3 stars because of the instrumentation issues, but data extrapolation raised the rating to 5 stars. In both cases, no structural changes were made to the surrogate vehicle, and the results in each case matched their respective baseline vehicle platforms. These results demonstrate that PLRD is viable for further development and production implementation.
Abstract:
BACKGROUND: In industrialized countries vaccination coverage remains suboptimal, partly because of perception of an increased risk of asthma. Epidemiologic studies of the association between childhood vaccinations and asthma have provided conflicting results, possibly for methodologic reasons such as unreliable vaccination data, biased reporting, and reverse causation. A recent review stressed the need for additional, adequately controlled large-scale studies. OBJECTIVE: Our goal was to determine if routine childhood vaccination against pertussis was associated with subsequent development of childhood wheezing disorders and asthma in a large population-based cohort study. METHODS: In 6811 children from the general population born between 1993 and 1997 in Leicestershire, United Kingdom, respiratory symptom data from repeated questionnaire surveys up to 2003 were linked to independently collected vaccination data from the National Health Service database. We compared incident wheeze and asthma between children of different vaccination status (complete, partial, and no vaccination against pertussis) by computing hazard ratios. Analyses were based on 6048 children, 23 201 person-years of follow-up, and 2426 cases of new-onset wheeze. RESULTS: There was no evidence for an increased risk of wheeze or asthma in children vaccinated against pertussis compared with nonvaccinated children. Adjusted hazard ratios comparing fully and partially vaccinated with nonvaccinated children were close to one for both incident wheeze and asthma. CONCLUSION: This study provides no evidence of an association between vaccination against pertussis in infancy and an increased risk of later wheeze or asthma and does not support claims that vaccination against pertussis might significantly increase the risk of childhood asthma.
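To illustrate the hazard-ratio comparison described above, the following minimal sketch fits a Cox proportional hazards model with the lifelines library on synthetic data. The column names, group coding and effect sizes are invented and are not taken from the study.

```python
# Minimal sketch: comparing incident wheeze between vaccination groups with
# a Cox proportional hazards model (lifelines). Data are synthetic.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 6000
df = pd.DataFrame({
    # 0 = unvaccinated, 1 = partially, 2 = fully vaccinated against pertussis
    "vaccination_status": rng.integers(0, 3, n),
    "follow_up_years": rng.uniform(0.5, 10.0, n),
    "incident_wheeze": (rng.random(n) < 0.4).astype(int),
})
# Indicator coding with unvaccinated children as the reference group.
df["partial"] = (df["vaccination_status"] == 1).astype(int)
df["full"] = (df["vaccination_status"] == 2).astype(int)

cph = CoxPHFitter()
cph.fit(df[["follow_up_years", "incident_wheeze", "partial", "full"]],
        duration_col="follow_up_years", event_col="incident_wheeze")
cph.print_summary()   # exp(coef) columns are the hazard ratios vs. unvaccinated
```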
Abstract:
Immunoglobulin E (IgE) mediates the immune response to parasites, but can also cause allergies. In humans, maternal IgE is not transferred to cord blood, and high levels of cord blood IgE are associated with subsequent allergy. In horses, both maternal IgG and IgE are transferred via colostrum; the IgE levels in the mare's serum, the colostrum and the foal's serum are correlated, but the consequences of IgE transfer to foals are not known. By about 6 weeks of age the levels of IgE in foal serum have dropped to a nadir. At 6 months of age the level of IgE has risen only very slightly and is no longer correlated with the levels seen at birth; IgE(+) B-cells could be detected in lymphoid follicles of some foals at this age. Surprisingly, the levels of total IgE detected in a foal's serum at 6 months of age are significantly correlated with the levels in its serum at 1, 2 and even 3 years of age, suggesting that by 6 months of age the foals are synthesizing IgE and that a pattern of relatively higher or lower total serum IgE has been established. The neonatal intestinal mucosa contained connective tissue mast cells, which stained for bound IgE in foals up to 9 weeks of age, but not mucosal mast cells; thereafter, the intestinal mast cells were IgE-negative until 6 months of age. IgE antibodies to Culicoides nubeculosus salivary antigens were detected in Swiss-born foals from imported Icelandic mares allergic to Culicoides spp., yet the foals showed no signs of skin sensitization, and such second-generation foals are known not to have an increased risk of developing allergy to Culicoides. Overall, this evidence suggests a minimal effector role for maternal IgE and that maternal IgE has waned prior to the onset of IgE synthesis in foals, and it does not support maternal priming of IgE responses in foals. Furthermore, the total level of IgE in any given foal is seen to be relatively high or low from soon after the onset of IgE synthesis, and is most likely determined by genetic factors.
Abstract:
BACKGROUND Mortality risk for people with chronic kidney disease is substantially greater than that for the general population, increasing to a 7-fold greater risk for those on dialysis therapy. Higher body mass index, generally due to higher energy intake, appears protective for people on dialysis therapy, but the relationship between energy intake and survival in those with reduced kidney function is unknown. STUDY DESIGN Prospective cohort study with a median follow-up of 14.5 (IQR, 11.2-15.2) years. SETTING & PARTICIPANTS Blue Mountains Area, west of Sydney, Australia. Participants in the general community enrolled in the Blue Mountains Eye Study (n=2,664) who underwent a detailed interview, food frequency questionnaire, and physical examination including body weight, height, blood pressure, and laboratory tests. PREDICTORS Relative energy intake, food components (carbohydrates, total sugars, fat, protein, and water), and estimated glomerular filtration rate (eGFR). Relative energy intake was dichotomized at 100%, and eGFR at 60 mL/min/1.73 m². OUTCOMES All-cause and cardiovascular mortality. MEASUREMENTS All-cause and cardiovascular mortality using unadjusted and adjusted Cox proportional regression models. RESULTS 949 people died during follow-up, 318 from cardiovascular events. In people with eGFR < 60 mL/min/1.73 m² (n=852), there was an increased risk of all-cause mortality (HR, 1.48; P=0.03), but no increased risk of cardiovascular mortality (HR, 1.59; P=0.1), among those with higher relative energy intake compared with those with lower relative energy intake. Increasing intake of carbohydrates (HR per 100 g/d, 1.50; P=0.04) and total sugars (HR per 100 g/d, 1.62; P=0.03) was significantly associated with increased risk of cardiovascular mortality. LIMITATIONS Under-reporting of energy intake, baseline laboratory and food intake values only, white population. CONCLUSIONS Increasing relative energy intake was associated with increased all-cause mortality in patients with eGFR < 60 mL/min/1.73 m². This effect may be mediated by the impact of increasing total sugars intake on subsequent cardiovascular events.
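A small note on interpreting the per-100 g/d hazard ratios quoted above: the sketch below shows how a hazard ratio expressed per 100 g/d maps to the underlying per-gram Cox coefficient and to other increments. The hazard ratio is taken from the abstract; the conversion itself is the standard exponential rescaling of a Cox coefficient.

```python
# Converting a hazard ratio reported "per 100 g/d" into the underlying
# per-gram Cox regression coefficient, and rescaling it to other increments.
# HR_per_100g = exp(100 * beta)  =>  beta = ln(HR_per_100g) / 100
import math

hr_per_100g_sugars = 1.62                    # total sugars, from the abstract
beta_per_gram = math.log(hr_per_100g_sugars) / 100.0

for grams in (10, 50, 100):
    hr = math.exp(beta_per_gram * grams)
    print(f"HR per {grams:>3} g/d of total sugars: {hr:.3f}")
# e.g. per 50 g/d: exp(0.5 * ln(1.62)) ≈ 1.27
```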
Abstract:
BACKGROUND Conventional factors do not fully explain the distribution of cardiovascular outcomes. Biomarkers are known to participate in well-established pathways associated with cardiovascular disease, and may therefore provide further information over and above conventional risk factors. This study sought to determine whether individual and/or combined assessment of 9 biomarkers improved discrimination, calibration and reclassification of cardiovascular mortality. METHODS 3267 patients (2283 men), aged 18-95 years, at intermediate-to-high risk of cardiovascular disease were followed in this prospective cohort study. Conventional risk factors and biomarkers were included based on forward and backward stepwise Cox proportional hazards selection models. RESULTS During 10 years of follow-up, 546 fatal cardiovascular events occurred. Four biomarkers (interleukin-6, neutrophils, von Willebrand factor, and 25-hydroxyvitamin D) were retained during the stepwise selection procedures for subsequent analyses. Simultaneous inclusion of these biomarkers significantly improved discrimination as measured by the C-index (0.78, P = 0.0001) and the integrated discrimination improvement (0.0219, P<0.0001). Collectively, these biomarkers improved net reclassification for cardiovascular death by 10.6% (P<0.0001) when added to the conventional risk model. CONCLUSIONS In terms of adverse cardiovascular prognosis, a biomarker panel consisting of interleukin-6, neutrophils, von Willebrand factor, and 25-hydroxyvitamin D offered significant incremental value beyond that conveyed by simple conventional risk factors.
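As an illustration of the C-index comparison reported above, the following sketch computes Harrell's concordance index for two risk scores on synthetic survival data using lifelines. The scores, noise levels and data are invented for illustration and do not reproduce the study's models.

```python
# Minimal sketch: comparing the discrimination (Harrell's C-index) of a
# conventional risk score with and without an added biomarker panel.
# Data and scores are synthetic.
import numpy as np
from lifelines.utils import concordance_index

rng = np.random.default_rng(7)
n = 3000
true_risk = rng.normal(size=n)
follow_up_years = rng.exponential(scale=np.exp(-0.5 * true_risk))   # higher risk -> earlier event
died = (rng.random(n) < 0.25).astype(int)

conventional_score = true_risk + rng.normal(scale=1.5, size=n)      # noisier risk estimate
with_biomarkers = true_risk + rng.normal(scale=0.8, size=n)         # less noisy estimate

# concordance_index expects higher predicted values to mean *longer* survival,
# so negate the risk scores before passing them in.
c_conventional = concordance_index(follow_up_years, -conventional_score, died)
c_biomarkers = concordance_index(follow_up_years, -with_biomarkers, died)
print(f"C-index, conventional model:   {c_conventional:.2f}")
print(f"C-index, with biomarker panel: {c_biomarkers:.2f}")
```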
Abstract:
Perinatal brain damage is associated not only with hypoxic-ischemic insults but also with intrauterine inflammation. A combination of antenatal inflammation and asphyxia increases the risk of cerebral palsy >70 times. The aim of the present study was to determine the effect of intracisternal (i.c.) administration of endotoxin [lipopolysaccharides (LPS)] on subsequent hypoxic-ischemic brain damage in neonatal rats. Seven-day-old Wistar rats were subjected to i.c. application of NaCl or LPS (5 microg/pup). One hour later, the left common carotid artery was exposed through a midline neck incision and ligated with 6-0 surgical silk. After another hour of recovery, the pups were subjected to a hypoxic gas mixture (8% oxygen/92% nitrogen) for 60 min. The animals were randomized to four experimental groups: 1) sham control group, left common carotid artery exposed but not ligated (n = 5); 2) LPS group, subjected to i.c. application of LPS (n = 7); 3) hypoxic-ischemic study group, i.c. injection of NaCl and exposure to hypoxia after ligation of the left carotid artery (n = 17); or 4) hypoxic-ischemic/LPS study group, i.c. injection of LPS and exposure to hypoxia after ligation of the left carotid artery (n = 19). Seven days later, neonatal brains were assessed for neuronal cell damage. In a second set of experiments, rat pups received an i.c. injection of LPS (5 microg/pup) and were evaluated for tumor necrosis factor-alpha expression by immunohistochemistry. Neuronal cell damage could not be observed in the sham control or in the LPS group. In the hypoxic-ischemic/LPS group, neuronal injury in the cerebral cortex was significantly higher than in animals that were subjected to hypoxia/ischemia after i.c. application of NaCl. Injecting LPS intracisternally caused a marked expression of tumor necrosis factor-alpha in the leptomeninges. Applying LPS intracisternally sensitizes the immature rat brain to a subsequent hypoxic-ischemic insult.
Abstract:
BACKGROUND Treatment of patients with paediatric acute lymphoblastic leukaemia has evolved such that the risk of late effects in survivors treated in accordance with contemporary protocols could be different from that noted in those treated decades ago. We aimed to estimate the risk of late effects in children with standard-risk acute lymphoblastic leukaemia treated with contemporary protocols. METHODS We used data from similarly treated members of the Childhood Cancer Survivor Study cohort. The Childhood Cancer Survivor Study is a multicentre, North American study of 5-year survivors of childhood cancer diagnosed between 1970 and 1986. We included cohort members if they were aged 1·0-9·9 years at the time of diagnosis of acute lymphoblastic leukaemia and had received treatment consistent with contemporary standard-risk protocols for acute lymphoblastic leukaemia. We calculated mortality rates and standardised mortality ratios, stratified by sex and survival time, after diagnosis of acute lymphoblastic leukaemia. We calculated standardised incidence ratios and absolute excess risk for subsequent neoplasms with age-specific, sex-specific, and calendar-year-specific rates from the Surveillance, Epidemiology and End Results Program. Outcomes were compared with a sibling cohort and the general US population. FINDINGS We included 556 (13%) of 4329 cohort members treated for acute lymphoblastic leukaemia. Median follow-up of the survivors from 5 years after diagnosis was 18·4 years (range 0·0-33·0). 28 (5%) of 556 participants had died (standardised mortality ratio 3·5, 95% CI 2·3-5·0). 16 (57%) deaths were due to causes other than recurrence of acute lymphoblastic leukaemia. Six (1%) survivors developed a subsequent malignant neoplasm (standardised incidence ratio 2·6, 95% CI 1·0-5·7). 107 participants (95% CI 81-193) in each group would need to be followed up for 1 year to observe one extra chronic health disorder in the survivor group compared with the sibling group. 415 participants (376-939) in each group would need to be followed up for 1 year to observe one extra severe, life-threatening, or fatal disorder in the group of survivors. Survivors did not differ from siblings in their educational attainment, rate of marriage, or independent living. INTERPRETATION The prevalence of adverse long-term outcomes in children treated for standard-risk acute lymphoblastic leukaemia according to contemporary protocols is low, but regular care from a knowledgeable primary-care practitioner is warranted. FUNDING National Cancer Institute, American Lebanese-Syrian Associated Charities, Swiss Cancer Research.
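For readers unfamiliar with the standardised mortality ratio and absolute excess risk measures used above, the short sketch below shows how they are computed. The observed-deaths figure comes from the abstract; the expected-deaths and person-years figures are hypothetical, with the expected count chosen only so that the ratio is consistent with the reported SMR of 3·5.

```python
# Minimal sketch: standardised mortality ratio (SMR) and absolute excess
# risk (AER) from observed deaths, expected deaths, and person-years.
# The expected-deaths and person-years values are hypothetical.

observed_deaths = 28          # reported in the abstract
expected_deaths = 8.0         # hypothetical, from age/sex/calendar-year population rates
person_years = 10_000.0       # hypothetical person-years of follow-up

smr = observed_deaths / expected_deaths
aer_per_1000 = (observed_deaths - expected_deaths) / person_years * 1000.0

print(f"SMR: {smr:.1f}")      # -> 3.5 with these inputs
print(f"Absolute excess risk: {aer_per_1000:.1f} extra deaths per 1000 person-years")
```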