123 results for Yanitsky, Oleg N.: Russian greens in a risk society
Abstract:
We conducted a study on 91 women with thyroid cancer and 306 controls in hospital for acute nonneoplastic, non-hormone-related disorders in order to investigate the role of reproductive and hormonal factors in the etiology of epithelial thyroid cancer in the Canton of Vaud, Switzerland. Non-significant increases in cancer risk with an increasing number of full-term pregnancies (odds ratio, OR, after allowance for age and previous benign thyroid disease = 1.6, for > or = 3 vs. 0 full-term pregnancies, 95% confidence interval, CI: 0.7-3.6) and spontaneous abortions (OR = 2.0 for > or = 2 vs. 0 spontaneous abortions, 95% CI: 0.7-5.2) were seen. A significantly elevated OR (2.8, 95% CI: 1.1-7.2) was found in those women whose first pregnancy ended with an abortion. Whereas most other reproductive, menstrual and hormonal factors examined did not seem to affect the risk of thyroid cancer significantly, a clue emerged of an association between thyroid cancer and artificial menopause (OR = 6.3, for women who underwent artificial menopause vs. premenopausal women, 95% CI: 1.7-23.2). Although not necessarily causal, the relationship between the risk of epithelial thyroid cancer and the occurrence of spontaneous abortions and artificial menopause deserves attention in future studies, in the light of the high incidence of thyroid cancer in young and middle-aged women.
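As a side note for readers, odds ratios and 95% confidence intervals in case-control abstracts such as this one are typically derived from 2x2 exposure tables (or from logistic regression when adjusted for covariates, as here). The minimal sketch below shows the crude Woolf (logit) calculation on hypothetical counts; these are not the study's data and do not reproduce the age- and benign-disease-adjusted estimates.

```python
import math

def odds_ratio_ci(exposed_cases, unexposed_cases, exposed_controls, unexposed_controls, z=1.96):
    """Crude odds ratio with a 95% CI by the Woolf (logit) method."""
    or_ = (exposed_cases * unexposed_controls) / (unexposed_cases * exposed_controls)
    se_log_or = math.sqrt(1 / exposed_cases + 1 / unexposed_cases
                          + 1 / exposed_controls + 1 / unexposed_controls)
    lower = math.exp(math.log(or_) - z * se_log_or)
    upper = math.exp(math.log(or_) + z * se_log_or)
    return or_, lower, upper

# Hypothetical counts for illustration only (not the study's data):
# 20/91 cases and 30/306 controls "exposed" to a given reproductive factor.
print(odds_ratio_ci(20, 71, 30, 276))
```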
Abstract:
Objective: The incidence of late-onset cytomegalovirus disease (i.e. disease appearing after discontinuation of antiviral prophylaxis) in solid-organ transplant recipients remains excessively high. This review focuses on the strategies that could potentially reduce the incidence of late-onset cytomegalovirus disease. Methods: We reviewed the literature and present our own clinical experience in the field. Results: The incidence of late-onset cytomegalovirus disease in recent trials can be as high as 36% in high-risk patients (donor positive/recipient negative for cytomegalovirus). Extending antiviral prophylaxis to six months has recently been shown in a prospective randomized controlled trial to be effective in reducing late-onset cytomegalovirus disease. Monitoring of cytomegalovirus viral load by PCR after the discontinuation of prophylaxis seems to be of moderate usefulness in low/intermediate-risk patients. The use of low-dose valganciclovir could reduce drug toxicity and costs while maintaining similar efficacy, but further studies are needed. A potentially interesting approach to predicting the individual risk of developing cytomegalovirus disease is the assessment of the specific cell-mediated immune response. If cell-mediated immunity assays become widely available in transplant centers, they may be used to tailor the cytomegalovirus preventive strategy on an individual basis. Finally, recent prospective trials have evaluated novel cytomegalovirus vaccines that merit further evaluation in the transplant setting, although no cytomegalovirus vaccine has yet been approved for routine clinical use. Conclusions: Several studies have recently evaluated novel strategies to reduce the incidence of late-onset cytomegalovirus disease. This improvement in preventive strategies is therefore expected to further reduce the negative effects of cytomegalovirus disease after transplantation.
Abstract:
Background: Cytomegalovirus (CMV) disease remains an important cause of morbidity after kidney transplantation and has been associated with acute rejection, graft loss and other indirect effects. A 3-month course of valganciclovir (VGC) prophylaxis reduces the incidence of CMV disease. However, little is known about the indirect effects of late-onset CMV disease after VGC prophylaxis. Objective: To evaluate the impact and indirect consequences of late-onset CMV disease after VGC prophylaxis in kidney transplant recipients. Methods: Retrospective analysis of 61 consecutive adult kidney transplant recipients with positive CMV serology (donor or recipient) who received VGC prophylaxis for 3 months and completed a follow-up of at least 2 years post-transplantation. Patients who developed CMV disease within 1 year after transplantation were compared to CMV disease-free patients for renal function (plasma creatinine values) at 1, 6, 12 and 24 months and for the incidence of graft loss, acute rejection, diabetes, cancer and opportunistic infections. Results: 8/61 (13%) patients developed CMV disease at a median of 131 days after transplantation (range: 98-220). The CMV incidence in D+/R- high-risk patients was 6/18 (33%), while it was 2/43 (5%) in intermediate-risk patients (p < 0.01). All 8 patients were treated with oral valganciclovir (median 39 days; range: 19-119) with complete resolution of CMV disease. There was no difference in creatinine values between the two groups at any time during follow-up. There was no graft loss, and the incidence of acute rejection, cancer and opportunistic infections did not differ between the two groups. The incidence of post-transplant diabetes was higher (38% vs 15%) in patients with CMV disease, but this difference was not significant (p = 0.4). Conclusions: An incidence of 13% of late-onset CMV disease was observed despite 3 months of VGC prophylaxis. However, no indirect consequences were found. Moreover, treatment of CMV disease with oral VGC was effective and safe. Larger trials are needed to determine whether late-onset CMV disease is associated with indirect consequences, as described for early-onset CMV disease.
Abstract:
Previous studies have demonstrated that poultry house workers are exposed to very high levels of organic dust and consequently have an increased prevalence of adverse respiratory symptoms. However, the influence of the age of broilers on bioaerosol concentrations has not been investigated. To evaluate the evolution of bioaerosol concentrations during the fattening period, bioaerosol parameters (inhalable dust, endotoxin and bacteria) were measured in 12 poultry confinement buildings in Switzerland at three different stages of the birds' growth; samples of air taken from within the breathing zones of individual poultry house employees as they caught the chickens ready to be transported for slaughter were also analysed. Quantitative polymerase chain reaction (Q-PCR) was used to assess the quantity of total airborne bacteria and total airborne Staphylococcus species. Bioaerosol levels increased significantly during the fattening period of the chickens. During the task of catching mature birds, the mean inhalable dust concentration for a worker was 26 +/- 1.9 mg m(-3) and the endotoxin concentration was 6198 +/- 2.3 EU m(-3) of air, more than 6-fold higher than the Swiss occupational recommended value (1000 EU m(-3)). The mean exposure of bird catchers to total bacteria and Staphylococcus species measured by Q-PCR was also very high, reaching 53 (+/-2.6) x 10(7) cells m(-3) of air and 62 (+/-1.9) x 10(6) cells m(-3) of air, respectively. It was concluded that, in the absence of protective breathing apparatus, chicken catchers in Switzerland risk exposure beyond recommended limits for all measured bioaerosol parameters. Moreover, the use of Q-PCR to estimate total and specific numbers of airborne bacteria is a promising tool for evaluating any modifications intended to improve the safety of current working practices.
Abstract:
Background: The aim of this study was to compare percutaneous drainage (PD) of the gallbladder to emergency cholecystectomy (EC) in a well-defined patient group with sepsis related to acute calculous/acalculous cholecystitis (ACC/AAC). Methods: Between 2001 and 2007, all consecutive patients of our ICU treated by either PD or EC were retrospectively analyzed. Cases were collected from a prospective database. Percutaneous drainage was performed by a transhepatic route and EC by an open or laparoscopic approach. Patients' general condition and organ dysfunction were assessed by two validated scoring systems (SAPS II and SOFA, respectively). Morbidity, mortality, and long-term outcome were systematically reviewed and analyzed in both groups. Results: Forty-two patients [median age = 65.5 years (range = 32-94)] were included; 45% underwent EC (ten laparoscopic, nine open) and 55% PD (n = 23). Both patient groups had similar preoperative characteristics. Percutaneous drainage and EC were successful in 91% and 100% of patients, respectively. Organ dysfunction was similarly improved by the third postoperative/postdrainage day. Despite undergoing PD, two patients required EC due to gangrenous cholecystitis. The conversion rate after laparoscopy was 20%. Overall morbidity was 8.7% after PD and 47% after EC (P = 0.011). Major morbidity was 0% after PD and 21% after EC (P = 0.034). The mortality rate was not different (13% after PD and 16% after EC, P = 1.0) and the deaths were all related to the patients' preexisting disease. Hospital and ICU stays were not different. Recurrent symptoms (17%) occurred only after ACC in the PD group. Conclusions: In high-risk patients, PD and EC are both efficient in resolving acute cholecystitis sepsis. However, EC is associated with a higher procedure-related morbidity and the laparoscopic approach is not always possible. Percutaneous drainage represents a valuable intervention, but secondary cholecystectomy is mandatory in cases of acute calculous cholecystitis.
Abstract:
Valganciclovir and ganciclovir are widely used for the prevention of cytomegalovirus (CMV) infection in solid organ transplant recipients, with a major impact on patients' morbidity and mortality. Oral valganciclovir, the ester prodrug of ganciclovir, has been developed to enhance the oral bioavailability of ganciclovir. It crosses the gastrointestinal barrier through peptide transporters and is then hydrolysed into ganciclovir. This review aims to describe the current knowledge of the pharmacokinetic and pharmacodynamic characteristics of this agent, and to address the issue of therapeutic drug monitoring. Based on currently available literature, ganciclovir pharmacokinetics in adult solid organ transplant recipients receiving oral valganciclovir are characterized by bioavailability of 66 +/- 10% (mean +/- SD), a maximum plasma concentration of 3.1 +/- 0.8 mg/L after a dose of 450 mg and of 6.6 +/- 1.9 mg/L after a dose of 900 mg, a time to reach the maximum plasma concentration of 3.0 +/- 1.0 hours, area under the plasma concentration-time curve values of 29.1 +/- 5.3 mg.h/L and 51.9 +/- 18.3 mg.h/L (after 450 mg and 900 mg, respectively), apparent clearance of 12.4 +/- 3.8 L/h, an elimination half-life of 5.3 +/- 1.5 hours and an apparent terminal volume of distribution of 101 +/- 36 L. The apparent clearance is highly correlated with renal function, hence the dosage needs to be adjusted in proportion to the glomerular filtration rate. Unexplained interpatient variability is limited (18% in apparent clearance and 28% in the apparent central volume of distribution). There is no indication of erratic or limited absorption in given subgroups of patients; however, this may be of concern in patients with severe malabsorption. The in vitro pharmacodynamics of ganciclovir reveal a mean concentration producing 50% inhibition (IC(50)) among CMV clinical strains of 0.7 mg/L (range 0.2-1.9 mg/L). Systemic exposure of ganciclovir appears to be moderately correlated with clinical antiviral activity and haematotoxicity during CMV prophylaxis in high-risk transplant recipients. Low ganciclovir plasma concentrations have been associated with treatment failure and high concentrations with haematotoxicity and neurotoxicity, but no formal therapeutic or toxic ranges have been validated. The pharmacokinetic parameters of ganciclovir after valganciclovir administration (bioavailability, apparent clearance and volume of distribution) are fairly predictable in adult transplant patients, with little interpatient variability beyond the effect of renal function and bodyweight. Thus ganciclovir exposure can probably be controlled with sufficient accuracy by thorough valganciclovir dosage adjustment according to patient characteristics. In addition, the therapeutic margin of ganciclovir is loosely defined. The usefulness of systematic therapeutic drug monitoring in adult transplant patients therefore appears questionable; however, studies are still needed to extend knowledge to particular subgroups of patients or dosage regimens.
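The quoted pharmacokinetic parameters can be related to one another with standard one-compartment identities. The sketch below is only an illustrative consistency check plus a simple GFR-proportional dose scaling reflecting the statement that clearance tracks renal function; it is not a dosing recommendation, and the reference GFR of 100 mL/min is an assumption.

```python
import math

def elimination_half_life(volume_l, clearance_l_per_h):
    """One-compartment identity: t1/2 = ln(2) * V / CL."""
    return math.log(2) * volume_l / clearance_l_per_h

def gfr_proportional_dose(standard_dose_mg, patient_gfr_ml_min, reference_gfr_ml_min=100.0):
    """Illustrative proportional scaling of the dose to renal function.
    The reference GFR is an assumed normalisation point, not a label value."""
    return standard_dose_mg * patient_gfr_ml_min / reference_gfr_ml_min

# With the mean values quoted above (V ~101 L, CL ~12.4 L/h) the identity gives
# ~5.6 h, close to the reported half-life of 5.3 +/- 1.5 h.
print(elimination_half_life(101, 12.4))
print(gfr_proportional_dose(900, 40))  # e.g. ~360 mg for a GFR of 40 mL/min (illustration only)
```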
Abstract:
Colorectal cancer mortality has been declining over the last two decades in Europe, particularly in women, the trends being, however, different across countries and age groups. We updated to 2007 colorectal cancer mortality trends in Europe using data from the World Health Organization (WHO). Rates were analyzed for the overall population and separately in young, middle-age and elderly populations. In the European Union (EU), between 1997 and 2007 mortality from colorectal cancer declined by around 2% per year, from 19.7 to 17.4/100,000 men (world standardized rates) and from 12.5 to 10.5/100,000 women. Persisting favorable trends were observed in countries of western and northern Europe, while there were more recent declines in several countries of eastern Europe, including the Czech Republic, Hungary and Slovakia particularly in women (but not Romania and the Russian Federation). In 2007, a substantial excess in colorectal cancer mortality was still observed in Slovakia, Hungary, Croatia, the Czech Republic and Slovenia in men (rates over 25/100,000), and in Hungary, Norway, Denmark and Slovakia in women (rates over 14/100,000). Colorectal mortality trends were more favorable in the young (30-49 years) from most European countries, with a decline of ∼2% per year since the early 1990s in both men and women from the EU. The recent decreases in colorectal mortality rates in several European countries are likely due to improvements in (early) diagnosis and treatment, with a consequent higher survival from the disease. Interventions to further reduce colorectal cancer burden are, however, still warranted, particularly in eastern European countries.
Abstract:
BACKGROUND: We assessed the impact of a multicomponent worksite health promotion program with a short intervention for reducing cardiovascular risk factors (CVRF), adjusting for the regression towards the mean (RTM) that affects such nonexperimental studies without a control group. METHODS: A cohort of 4,198 workers (aged 42 +/- 10 years, range 16-76 years, 27% women) was analyzed at a 3.7-year interval and stratified by each CVRF risk category (low/medium/high blood pressure [BP], total cholesterol [TC], body mass index [BMI], and smoking) with RTM and secular trend adjustments. The intervention consisted of a 15-min CVRF screening and individualized counseling by health professionals for medium- and high-risk individuals, with eventual physician referral. RESULTS: Participants in high-risk groups improved diastolic BP (-3.4 mm Hg [95% CI: -5.1, -1.7]) in 190 hypertensive patients, TC (-0.58 mmol/l [-0.71, -0.44]) in 693 hypercholesterolemic patients, and smoking (-3.1 cig/day [-3.9, -2.3]) in 808 smokers, while systolic BP changes reflected RTM. Low-risk individuals without counseling deteriorated in TC and BMI. Body weight increased uniformly in all risk groups (+0.35 kg/year). CONCLUSIONS: In real-world conditions, participants in the short intervention program who were in high-risk groups for diastolic BP, TC, and smoking improved their CVRF, whereas the low-risk TC and BMI groups deteriorated. Future programs may include specific advice to low-risk groups to help maintain a favorable CVRF profile.
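The regression-toward-the-mean (RTM) adjustment mentioned above can be illustrated with the classical approximation in which the expected RTM shift for a subject selected on a baseline value is (1 - r) times the distance from the population mean, r being the test-retest correlation. The sketch below uses that assumption and made-up numbers; it is not the study's actual adjustment procedure.

```python
def expected_rtm_change(baseline, population_mean, test_retest_corr):
    """Expected change attributable to regression toward the mean for a subject
    selected on a baseline value: (1 - r) * (population mean - baseline)."""
    return (1 - test_retest_corr) * (population_mean - baseline)

def rtm_adjusted_change(observed_change, baseline, population_mean, test_retest_corr):
    """Observed change minus the expected RTM component; any secular trend
    would still need to be subtracted separately, as the study did."""
    return observed_change - expected_rtm_change(baseline, population_mean, test_retest_corr)

# Hypothetical example: baseline systolic BP 160 mmHg, population mean 130 mmHg,
# test-retest correlation 0.7, observed change -12 mmHg.
print(rtm_adjusted_change(-12.0, 160.0, 130.0, 0.7))  # -> -3.0 mmHg net of RTM
```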
Abstract:
Starting from the early descriptions of Kraepelin and Bleuler, the construct of schizotypy was developed from observations of aberrations in nonpsychotic family members of schizophrenia patients. In contemporary diagnostic manuals, the positive symptoms of schizotypal personality disorder were included in the ultra high-risk (UHR) criteria 20 years ago, and nowadays are broadly employed in clinical early detection of psychosis. The schizotypy construct, now dissociated from strict familial risk, also informed research on the liability to develop any psychotic disorder, and in particular schizophrenia-spectrum disorders, even outside clinical settings. Against the historical background of schizotypy it is surprising that evidence from longitudinal studies linking schizotypy, UHR, and conversion to psychosis has only recently emerged; and it still remains unclear how schizotypy may be positioned in high-risk research. Following a comprehensive literature search, we review 18 prospective studies on 15 samples examining the evidence for a link between trait schizotypy and conversion to psychosis in 4 different types of samples: general population, clinical risk samples according to UHR and/or basic symptom criteria, genetic (familial) risk, and clinical samples at-risk for a nonpsychotic schizophrenia-spectrum diagnosis. These prospective studies underline the value of schizotypy in high-risk research, but also point to the lack of evidence needed to better define the position of the construct of schizotypy within a developmental psychopathology perspective of emerging psychosis and schizophrenia-spectrum disorders.
Abstract:
Objective: Tachycardia is associated with hypertension and is a predictor of cardiovascular events. The predictive effect of tachycardia might reflect its connection with hypertension. In this analysis of 15,245 VALUE study patients we explore whether tachycardia predicts cardiovascular endpoints in high-risk hypertension and whether the in-trial blood pressure lowering modified the tachycardia-related risk. Methods: Heart rate was obtained from ECG readings at baseline and annually throughout the trial. Results: In the Cox regression analysis, the primary endpoint hazard ratio for a 10 beats per minute increment of baseline heart rate was 1.16 (1.12-1.2) p < 0.0001, 1.17 (1.13-1.22) p < 0.0001 and 1.22 (1.18-1.27) p < 0.0001, unadjusted, adjusted for baseline blood pressure, and adjusted for blood pressure plus risk factors, respectively. Primary endpoints increased strikingly in the highest quintile of baseline heart rate (>= 79 beats per minute). Primary endpoints in the highest heart rate quintile were 30% higher in the first, 55% in the second, 55% in the third, 52% in the fourth and 46% in the fifth year of the study. The in-trial heart rate was also a potent predictor. The primary endpoint hazard ratios of the highest heart rate quintile versus the pooled lower 4 quintiles were (1.34-1.66) p < 0.0001 unadjusted, 1.52 (1.36-1.69) p < 0.0001 adjusted for baseline blood pressure and risk factors, and 1.52 (1.36-1.69) p < 0.0001 further adjusted for in-trial pressure. The increase of primary events in the upper quintile of in-trial heart rate was 68% in the group with good and 63% in the group with inadequate blood pressure control (both p < 0.0001 by log rank test). Conclusions: (1) Tachycardia is a short-term marker and a long-term predictor of adverse events in high-risk hypertension. (2) Tachycardia contributes to the residual cardiovascular risk regardless of the degree of BP control. We hypothesize that heart rate lowering with appropriate drugs may further decrease the cardiovascular risk in patients with high-risk hypertension and tachycardia.
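The per-10-beats-per-minute hazard ratios above extrapolate multiplicatively to other increments under the log-linearity assumption of the Cox model; the short sketch below makes that arithmetic explicit, using the unadjusted estimate as an example.

```python
def hazard_ratio_for_increment(hr_per_10_bpm, delta_bpm):
    """Cox-model extrapolation: the HR for an increment of delta_bpm equals
    the per-10-bpm HR raised to the power delta_bpm / 10."""
    return hr_per_10_bpm ** (delta_bpm / 10)

# Using the unadjusted estimate of 1.16 per 10 bpm quoted above:
print(hazard_ratio_for_increment(1.16, 20))  # ~1.35 for a 20 bpm higher baseline heart rate
```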
Abstract:
Renin-angiotensin system (RAS) blockers are generally considered contraindicated when an atheromatous renal artery stenosis (ARAS) is diagnosed. The main reason is the fear of inducing renal ischemia and, hence, accelerating renal fibrosis and the progression towards end-stage renal disease, even though RAS blockers have been shown to be highly effective in controlling blood pressure. Part of the solution came with the development of revascularization. There is now growing evidence showing no superiority of angioplasty over medical treatment with respect to cardiovascular events and mortality, renal function and blood pressure control. Hence, RAS blockers have resurfaced on the basis of their proven beneficial effects on blood pressure control and cardiovascular prevention in high-risk atherosclerotic patients. RAS blockers thus belong today to the standard treatment of hypertensive patients with ARAS. However, they were not systematically prescribed in trials focusing on ARAS. The ongoing CORAL trial will give us further information on the place of this class of antihypertensive drugs in patients with ARAS.
Abstract:
BACKGROUND: Intravenously administered antimicrobial agents have been the standard choice for the empirical management of fever in patients with cancer and granulocytopenia. If orally administered empirical therapy is as effective as intravenous therapy, it would offer advantages such as improved quality of life and lower cost. METHODS: In a prospective, open-label, multicenter trial, we randomly assigned febrile patients with cancer who had granulocytopenia that was expected to resolve within 10 days to receive empirical therapy with either oral ciprofloxacin (750 mg twice daily) plus amoxicillin-clavulanate (625 mg three times daily) or standard daily doses of intravenous ceftriaxone plus amikacin. All patients were hospitalized until their fever resolved. The primary objective of the study was to determine whether there was equivalence between the regimens, defined as an absolute difference in the rates of success of 10 percent or less. RESULTS: Equivalence was demonstrated at the second interim analysis, and the trial was terminated after the enrollment of 353 patients. In the analysis of the 312 patients who were treated according to the protocol and who could be evaluated, treatment was successful in 86 percent of the patients in the oral-therapy group (95 percent confidence interval, 80 to 91 percent) and 84 percent of those in the intravenous-therapy group (95 percent confidence interval, 78 to 90 percent; P=0.02). The results were similar in the intention-to-treat analysis (80 percent and 77 percent, respectively; P=0.03), as were the duration of fever, the time to a change in the regimen, the reasons for such a change, the duration of therapy, and survival. The types of adverse events differed slightly between the groups but were similar in frequency. CONCLUSIONS: In low-risk patients with cancer who have fever and granulocytopenia, oral therapy with ciprofloxacin plus amoxicillin-clavulanate is as effective as intravenous therapy.
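The trial's equivalence criterion (an absolute difference in success rates of 10 percentage points or less) can be illustrated with a textbook Wald interval for the difference of two proportions. The counts below are hypothetical reconstructions from the quoted percentages, and the trial's own interim-analysis procedure was more elaborate than this sketch.

```python
import math

def success_rate_difference_ci(successes_a, n_a, successes_b, n_b, z=1.96):
    """Difference in success proportions (A - B) with a Wald 95% CI.
    Equivalence at a 10-point margin holds if the whole CI lies within (-0.10, 0.10)."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    diff = p_a - p_b
    se = math.sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
    return diff, diff - z * se, diff + z * se

# Hypothetical counts approximating ~86% vs ~84% success in two arms of ~156 patients each:
diff, lower, upper = success_rate_difference_ci(134, 156, 131, 156)
print(diff, lower, upper, -0.10 < lower and upper < 0.10)
```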
Abstract:
BACKGROUND: Angiographic studies suggest that acute vasospasm within 48 h of aneurysmal subarachnoid hemorrhage (SAH) predicts symptomatic vasospasm. However, the value of transcranial Doppler within 48 h of SAH is unknown. METHODS: We analyzed 199 patients who had at least 1 middle cerebral artery (MCA) transcranial Doppler examination within 48 h of SAH onset. Abnormal MCA mean blood flow velocity (mBFV) was defined as >90 cm/s. Delayed cerebral ischemia (DCI) was defined as clinical deterioration or radiological evidence of infarction due to vasospasm. RESULTS: Seventy-six patients (38%) had an elevation of MCA mBFV >90 cm/s within 48 h of SAH onset. The predictors of elevated mBFV included younger age (OR = 0.97 per year of age, p = 0.002), admission angiographic vasospasm (OR = 5.4, p = 0.009) and elevated white blood cell count (OR = 1.1 per 1,000 white blood cells, p = 0.003). Patients with elevated mBFV were more likely to experience a 10 cm/s fall in velocity at the first follow-up than those with normal baseline velocities (24 vs. 10%, p < 0.01), suggestive of resolving spasm. DCI developed in 19% of the patients. An elevated admission mBFV >90 cm/s during the first 48 h (adjusted OR = 2.7, p = 0.007) and a poor clinical grade (Hunt-Hess score 4 or 5, OR = 3.2, p = 0.002) were associated with a significant increase in the risk of DCI. CONCLUSION: Early elevations of mBFV correlate with acute angiographic vasospasm and are associated with a significantly increased risk of DCI. Transcranial Doppler ultrasound may be a useful early tool for identifying patients at higher risk of developing DCI after SAH.
Abstract:
Objective: This study assessed the efficacy and safety of canakinumab, a fully human anti-interleukin 1 beta monoclonal antibody, for prophylaxis against acute gouty arthritis flares in patients initiating urate-lowering treatment. Methods: In this double-blind, double-dummy, dose-ranging study, 432 patients with gouty arthritis initiating allopurinol treatment were randomised 1:1:1:1:1:1:2 to receive: a single dose of canakinumab, 25, 50, 100, 200, or 300 mg subcutaneously; 4 x 4-weekly doses of canakinumab (50 + 50 + 25 + 25 mg subcutaneously); or daily colchicine 0.5 mg orally for 16 weeks. Patients recorded details of flares in diaries. The study aimed to determine the canakinumab dose having equivalent efficacy to colchicine 0.5 mg at 16 weeks. Results: A dose-response for canakinumab was not apparent with any of the four predefined dose-response models. The estimated canakinumab dose with equivalent efficacy to colchicine was below the range of doses tested. At 16 weeks, there was a 62% to 72% reduction in the mean number of flares per patient for canakinumab doses >= 50 mg versus colchicine based on a negative binomial model (rate ratio: 0.28-0.38, p <= 0.0083), and the percentage of patients experiencing >= 1 flare was significantly lower for all canakinumab doses (15% to 27%) versus colchicine (44%, p < 0.05). There was a 64% to 72% reduction in the risk of experiencing >= 1 flare for canakinumab doses >= 50 mg versus colchicine at 16 weeks (hazard ratio (HR): 0.28-0.36, p <= 0.05). The incidence of adverse events was similar across treatment groups. Conclusions: Single canakinumab doses >= 50 mg or four 4-weekly doses provided superior prophylaxis against flares compared with daily colchicine 0.5 mg.
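For readers mapping the quoted rate ratios onto percentage reductions: a rate ratio RR corresponds to a 100 x (1 - RR) percent reduction in the flare rate. The sketch below shows only this arithmetic; the trial's actual estimates came from a negative binomial regression model, which this does not reproduce.

```python
def percent_reduction_from_rate_ratio(rate_ratio):
    """Percentage reduction in the event rate implied by a rate ratio."""
    return 100.0 * (1.0 - rate_ratio)

# The quoted rate ratios of 0.28-0.38 correspond to the stated 62-72% reduction in flares:
for rr in (0.28, 0.38):
    print(f"rate ratio {rr:.2f} -> {percent_reduction_from_rate_ratio(rr):.0f}% reduction")
```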
Abstract:
The comparison of radiotherapy techniques regarding secondary cancer risk has yielded contradictory results, possibly stemming from the many different approaches used to estimate risk. The purpose of this study was to make a comprehensive evaluation of the different available risk models applied to detailed whole-body dose distributions computed by Monte Carlo for various breast radiotherapy techniques, including conventional open tangents, 3D conformal wedged tangents and hybrid intensity modulated radiation therapy (IMRT). First, organ-specific linear risk models developed by the International Commission on Radiological Protection (ICRP) and the Biological Effects of Ionizing Radiation (BEIR) VII committee were applied to mean doses for remote organs only and for all solid organs. Then, different general non-linear risk models were applied to the whole-body dose distribution. Finally, organ-specific non-linear risk models for the lung and breast were used to assess the secondary cancer risk for these two specific organs. A total of 32 different calculated absolute risks resulted in a broad range of values (between 0.1% and 48.5%), underlining the large uncertainties in absolute risk calculation. The ratio of risk between two techniques has often been proposed as a more robust assessment than the absolute risk. We found that the ratio of risk between two techniques could also vary substantially depending on the approach to risk estimation. In some cases the ratio of risk between two techniques spanned values both smaller and larger than one, which translates into inconsistent conclusions as to whether one technique carries a higher risk than another. We found, however, that the hybrid IMRT technique resulted in a systematic reduction of risk compared with the other techniques investigated, even though the magnitude of this reduction varied substantially with the different approaches investigated. Based on the epidemiological data available, a reasonable approach to risk estimation would be to use organ-specific non-linear risk models applied to the dose distributions of organs within or near the treatment fields (lungs and contralateral breast in the case of breast radiotherapy), as the majority of radiation-induced secondary cancers are found in the beam-bordering regions.
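As a simplified illustration of the organ-specific linear approach described above (risk coefficient times mean organ dose, summed over organs), the sketch below uses placeholder doses and coefficients; they are not values from the study, nor from the ICRP or BEIR VII reports.

```python
def linear_secondary_cancer_risk(mean_organ_doses_gy, risk_coefficients_pct_per_gy):
    """Linear, organ-specific risk estimate: sum over organs of mean organ dose (Gy)
    times a nominal lifetime risk coefficient (% per Gy). Non-linear models would
    replace this product with an organ-specific dose-response function."""
    return sum(mean_organ_doses_gy[organ] * risk_coefficients_pct_per_gy[organ]
               for organ in mean_organ_doses_gy)

# Placeholder mean organ doses (Gy) and risk coefficients (% per Gy), for illustration only:
doses = {"lung": 1.2, "contralateral_breast": 0.8, "thyroid": 0.05}
coefficients = {"lung": 0.5, "contralateral_breast": 0.3, "thyroid": 0.1}
print(linear_secondary_cancer_risk(doses, coefficients))  # total absolute risk in %
```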