Abstract:
Natural fluctuations in soil microbial communities are poorly documented because of the inherent difficulty of analysing the relative abundances of multiple populations simultaneously over a long time period. Yet it is important to understand the magnitude of community composition variability as a function of natural influences (e.g., temperature, plant growth, or rainfall), because this forms the reference or baseline against which external disturbances (e.g., anthropogenic emissions) can be judged. In addition, defining baseline fluctuations in complex microbial communities may help to understand at which point such systems become unbalanced and cannot return to their original composition. In this paper, we examined the seasonal fluctuations in the bacterial community of an agricultural soil used for regular plant crop production, using terminal restriction fragment length polymorphism (T-RFLP) profiling of the amplified 16S ribosomal ribonucleic acid (rRNA) gene diversity. Cluster and statistical analysis of the T-RFLP data showed that the soil bacterial communities fluctuated very little across the seasons (similarity indices between 0.835 and 0.997), with insignificant variations in 16S rRNA gene richness and diversity indices. Despite the overall insignificant fluctuations, between 8 and 30% of all terminal restriction fragments changed their relative intensity significantly between consecutive sampling times. To determine the magnitude of community variations induced by external factors, soil samples were subjected to inoculation with a pure bacterial culture, addition of the herbicide mecoprop, or addition of nutrients. All treatments resulted in statistically measurable changes in the T-RFLP profiles of the communities. Addition of nutrients, or of bacteria plus mecoprop, resulted in community compositions that did not return to the original profile within 14 days. We propose that below 70% similarity in T-RFLP profiles, bacterial communities risk drifting apart into inherently different states.
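As an illustration of how such profile similarities can be computed (the abstract does not specify which index was used; a Bray-Curtis-style similarity on relative fragment intensities is assumed here), a minimal sketch:

```python
# Minimal sketch: pairwise similarity between two T-RFLP profiles, each given as
# relative intensities per terminal restriction fragment (TRF). The specific index
# used in the study is not stated; Bray-Curtis similarity is assumed for illustration.

def bray_curtis_similarity(profile_a, profile_b):
    """profile_a, profile_b: dicts mapping TRF length (bp) -> relative intensity."""
    fragments = set(profile_a) | set(profile_b)
    shared = sum(min(profile_a.get(f, 0.0), profile_b.get(f, 0.0)) for f in fragments)
    total = sum(profile_a.values()) + sum(profile_b.values())
    return 2.0 * shared / total if total else 0.0

# Hypothetical spring and summer profiles (relative intensities sum to 1).
spring = {68: 0.30, 142: 0.25, 207: 0.25, 310: 0.20}
summer = {68: 0.28, 142: 0.27, 207: 0.20, 310: 0.22, 455: 0.03}
print(round(bray_curtis_similarity(spring, summer), 3))  # ~0.93, i.e. a small seasonal shift
```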
Abstract:
The blood pressure (BP) lowering effect of the orally active angiotensin converting enzyme inhibitor captopril (SQ14225) was studied in 59 hypertensive patients maintained on a constant sodium intake. Within 2 hours of the first dose of captopril, BP fell from 171/107 to a maximum low of 142/92 mm Hg (p < 0.001), and after 4 to 8 days of treatment BP averaged 145/94 mm Hg (p < 0.001). The magnitude of the BP drop induced by captopril was significantly correlated with baseline plasma renin activity (PRA), both during the acute phase (r = -0.38, p < 0.01) and after the 4- to 8-day interval (r = -0.33, p < 0.01). Because of considerable scatter in individual data, renin profiling was not precisely predictive of the immediate or delayed BP response of individual patients. However, the BP levels achieved after the initial dose of captopril were closely correlated with BP measured after 4 to 8 days of therapy, and appeared to have greater predictive value than control PRA for the long-term efficacy of chronic captopril therapy, despite marked BP changes occurring in some patients during the intermediate period. Because of these intermediate BP changes, the addition of a diuretic to enhance the antihypertensive effectiveness of angiotensin blockade should be deferred for several days after initiation of captopril therapy.
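The kind of correlation reported here (BP drop versus baseline PRA) can be illustrated as follows; the data below are hypothetical and serve only to show the calculation, not the study's actual measurements:

```python
# Sketch of a Pearson correlation between baseline plasma renin activity (PRA)
# and the captopril-induced BP change. Hypothetical data; the paper reported
# r = -0.38 (acute) and r = -0.33 (after 4-8 days).
from scipy.stats import pearsonr

pra = [0.5, 1.2, 2.0, 3.5, 5.0, 7.5, 10.0, 14.0]   # ng/mL/h, hypothetical
bp_drop = [-8, -15, -12, -22, -25, -30, -28, -38]  # mm Hg change, hypothetical

r, p = pearsonr(pra, bp_drop)
print(f"r = {r:.2f}, p = {p:.3f}")  # negative r: higher PRA, larger BP fall
```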
Abstract:
Oseltamivir is the ester-type prodrug of the neuraminidase inhibitor oseltamivir carboxylate. It has been shown to be an effective treatment for both seasonal influenza and the recent pandemic 2009 A/H1N1 influenza, reducing both the duration and severity of the illness. It is also effective when used preventively. This review aims to describe the current knowledge of the pharmacokinetic and pharmacodynamic characteristics of this agent, and to address the issue of possible therapeutic drug monitoring. According to the currently available literature, the pharmacokinetics of oseltamivir carboxylate after oral administration of oseltamivir are characterized by mean ± SD bioavailability of 79 ± 12%, apparent clearance of 25.3 ± 7.0 L/h, an elimination half-life of 7.4 ± 2.5 hours and an apparent terminal volume of distribution of 267 ± 122 L. A maximum plasma concentration of 342 ± 83 μg/L, a time to reach the maximum plasma concentration of 4.2 ± 1.1 hours, a trough plasma concentration of 168 ± 32 μg/L and an area under the plasma concentration-time curve from 0 to 24 hours of 6110 ± 1330 μg · h/L for a 75 mg twice-daily regimen were derived from literature data. The apparent clearance is highly correlated with renal function, hence the dosage needs to be adjusted in proportion to the glomerular filtration rate. Interpatient variability is moderate (28% in apparent clearance and 46% in the apparent central volume of distribution); there is no indication of significant erratic or limited absorption in given patient subgroups. The in vitro pharmacodynamics of oseltamivir carboxylate reveal wide variation in the concentration producing 50% inhibition of influenza A and B strains (range 0.17-44 μg/L). A formal correlation between systemic exposure to oseltamivir carboxylate and clinical antiviral activity or tolerance in influenza patients has not yet been demonstrated; thus no formal therapeutic or toxic range can be proposed. The pharmacokinetic parameters of oseltamivir carboxylate after oseltamivir administration (bioavailability, apparent clearance and the volume of distribution) are fairly predictable in healthy subjects, with little interpatient variability outside the effect of renal function in all patients and bodyweight in children. Thus oseltamivir carboxylate exposure can probably be controlled with sufficient accuracy by thorough dosage adjustment according to patient characteristics. However, there is a lack of clinical study data on naturally infected patients. In addition, the therapeutic margin of oseltamivir carboxylate is poorly defined. The usefulness of systematic therapeutic drug monitoring in patients therefore appears to be questionable; however, studies are still needed to extend the knowledge to particular subgroups of patients or dosage regimens.
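The reported parameters can be linked through standard steady-state relations; the sketch below uses the values from the abstract, while the proportional renal adjustment rule is a simplification of the statement that dosage should scale with glomerular filtration rate, not a dosing recommendation:

```python
# Sketch linking the reported oseltamivir carboxylate parameters through standard
# steady-state relations. Parameter values come from the abstract; the proportional
# GFR adjustment and the reference GFR of 100 mL/min are illustrative assumptions.

cl_f = 25.3          # apparent clearance (CL/F), L/h
daily_dose_mg = 150  # 75 mg twice daily

# Average steady-state AUC over 24 h = daily dose / apparent clearance
auc_24 = daily_dose_mg * 1000 / cl_f   # ug*h/L
print(f"Predicted AUC0-24: {auc_24:.0f} ug*h/L")  # ~5930, vs 6110 +/- 1330 reported

def adjusted_daily_dose(gfr_ml_min, reference_gfr=100.0, reference_dose=150.0):
    """Dose scaled in proportion to renal function to keep exposure roughly constant."""
    return reference_dose * gfr_ml_min / reference_gfr

print(adjusted_daily_dose(30))  # 45 mg/day for a GFR of 30 mL/min (illustrative only)
```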
Abstract:
Introduction.- Knee injuries are frequent in a young and active population. Most patients resume their professional activity, but few studies have examined the factors that predict return to work. The aim of this study was to identify such predictors from a large panel of bio-psychosocial variables. We postulated that return to work 3 months and 2 years after discharge is mostly predicted by psychosocial variables. Patients and methods.- Prospective study of patients hospitalized for a knee injury. Variables measured: the abbreviated injury score (AIS) for the severity of the injuries, a visual analog scale for the intensity of pain, INTERMED for bio-psychosocial complexity, the SF-36 for quality of life, the HADS for anxiety/depression symptoms, and the IKDC score for knee function. Univariate logistic regressions, adjusted for age and gender, were performed to predict return to work. Results.- One hundred and twenty-six patients hospitalized during the 8 months after their accident were included in this prospective study. A total of 73 (58%) and 75 (59%) questionnaires were available after 3 months and 2 years, respectively. The SF-36 pain score was the only predictor of return to work at both 3 months (odds ratio 1.06 [1.02-1.10], P = 0.01, per one-point increase) and 2 years (odds ratio 1.06 [1.02-1.10], P = 0.01). At three months, additional predictors were the SF-36 physical sub-scale, the IKDC score, the presence of a work contract, and the presence of litigation. Bio-psychosocial complexity and the presence of depressive symptoms predicted return to work at two years. Discussion.- Our working hypothesis was partially confirmed: some psychosocial factors (i.e., depressive symptoms, work contract, litigation, INTERMED) predict return to work, but physical health and knee function, as perceived by the patient, are also correlated with it. Pain is the sole factor isolated at both time points (i.e., 3 months and 2 years) and consequently appears to be a key element in predicting return to work. Some of these factors are amenable to a rehabilitation program, but only if an interdisciplinary approach is adopted.
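The type of analysis described (a univariate logistic regression of return to work on one predictor, adjusted for age and gender, with the odds ratio expressed per one-point increase) can be sketched as follows; the data frame and coefficients are entirely synthetic:

```python
# Sketch of a logistic regression of return to work (rtw) on the SF-36 pain score,
# adjusted for age and gender. The odds ratio per one-point increase is exp(coef).
# All data below are synthetic, for illustration only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 126
df = pd.DataFrame({
    "sf36_pain": rng.uniform(20, 90, n),   # higher SF-36 pain score = less pain
    "age": rng.integers(18, 60, n),
    "male": rng.integers(0, 2, n),
})
# Simulate a return-to-work probability that rises with the SF-36 pain score.
logit_p = -3.0 + 0.06 * df["sf36_pain"] - 0.01 * df["age"]
df["rtw"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = smf.logit("rtw ~ sf36_pain + age + male", data=df).fit(disp=False)
or_per_point = np.exp(model.params["sf36_pain"])
print(f"OR per one-point increase in SF-36 pain: {or_per_point:.2f}")
```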
Abstract:
The Mg/Ca and Sr/Ca ratios of living ostracods belonging to 15 different species, sampled monthly over a one-year cycle at five sites (2, 5, 13, 33, and 70 m water depth) in western Lake Geneva (Switzerland), are compared to the oxygen and carbon isotope compositions measured on the same samples, as well as to the temperature and chemical composition of the water (δ18OH2O, δ13CDIC, Mg/CaH2O, and Sr/CaH2O) at the time of ostracod calcification. The results indicate that trace element incorporation varied at the species level, mainly because of the ecological and biological differences between species (life cycle, (micro-)habitat preference, biomineralisation processes) and their control on trace element incorporation. In littoral zones, the Mg/Ca and Sr/Ca of ostracod valves increase as the temperature and the Mg/Ca and Sr/Ca of the water increase during spring and summer, hence reflecting mainly seasonal variations. However, given that in Lake Geneva the Mg/Ca and Sr/Ca of the water also vary with temperature, it is not possible to distinguish the effects of temperature from those of changes in the chemical composition of the water on the trace element content of ostracod valves. The results support the view that both water temperature and the water Mg/Ca and Sr/Ca ratios control the final trace element content of Cyprididae valves. In contrast, the trace element content of infaunal species living in the deeper zones of the basin is influenced by variations in the chemical composition of the pore water. The trace element content measured in these specimens cannot, therefore, be used to reconstruct the composition of the lake bottom water. In addition, the incorporation of Mg and Sr into the shell differs from one family, sub-family, or even species to another. This suggests that the distinctive Mg and Sr partition coefficients of the analysed taxa result from different valve calcification strategies that may be phylogenetically determined.
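The partition coefficient referred to at the end is conventionally defined as the ratio of the trace element/Ca ratio in the valve to that in the host water; the short sketch below illustrates the calculation with hypothetical molar ratios, not Lake Geneva data:

```python
# Sketch of the partition (distribution) coefficient used in ostracod trace element
# work: D_Me = (Me/Ca)_valve / (Me/Ca)_water. The ratios below are hypothetical.

def partition_coefficient(me_ca_valve, me_ca_water):
    """Both arguments are molar Me/Ca ratios (e.g., mmol/mol)."""
    return me_ca_valve / me_ca_water

d_mg = partition_coefficient(me_ca_valve=8.5, me_ca_water=950.0)  # hypothetical
d_sr = partition_coefficient(me_ca_valve=2.8, me_ca_water=9.0)    # hypothetical
print(f"D_Mg = {d_mg:.4f}, D_Sr = {d_sr:.2f}")
```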
Abstract:
There is controversy over the use of the Ross procedure with regard to the sub-coronary and root replacement techniques and their long-term durability. A systematic review of the literature may provide insight into the outcomes of these two surgical subvariants. A systematic review of reports published between 1967 and February 2013 on sub-coronary and root replacement Ross procedures was undertaken. Twenty-four articles were included and divided into (i) the sub-coronary technique and (ii) the root replacement technique. The 10-year survival rate for a mixed-patient population was 87.3% (95% confidence interval [CI], 79.7-93.4) for the sub-coronary procedure and 89.1% (95% CI, 85.3-92.1) for the root replacement technique. For adults, it was 94 vs 95.3% (CI, 88.9-98.1), and in the paediatric series it was 90 vs 92.7% (CI, 86.9-96.0), respectively. Freedom from reoperation at 10 years in the mixed population was 83.3% (95% CI, 69.9-93.4) versus 93.3% (95% CI, 89.4-95.9) for the sub-coronary and root replacement techniques, respectively. In adults, it was 98 vs 91.2% (95% CI, 82.4-95.8), and in the paediatric series 93.3 vs 92.0% (95% CI, 86.1-96.5) for the sub-coronary versus the root replacement technique, respectively. The Ross procedure arguably has satisfactory results over 5 and 10 years for both adults and children. The results do not support an advantage of the sub-coronary technique over the root replacement technique. Root replacement was of benefit to patients undergoing reoperations on the neoaorta and for long-term survival in mixed series.
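As a reminder of how confidence intervals of the kind quoted above arise, the sketch below computes a 95% Wilson interval for a 10-year survival proportion in a single hypothetical series; the abstract does not describe the actual meta-analytic pooling method, so this is purely illustrative:

```python
# Sketch: 95% Wilson score interval for a survival proportion. Hypothetical single
# series only; the review's actual pooling method is not given in the abstract.
import math

def wilson_ci(survivors, n, z=1.96):
    p = survivors / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

# Hypothetical series: 131 of 150 patients alive at 10 years (~87.3%).
lo, hi = wilson_ci(131, 150)
print(f"10-year survival: {131/150:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```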
Abstract:
Bioactive small molecules, such as drugs or metabolites, bind to proteins or other macro-molecular targets to modulate their activity, which in turn results in the observed phenotypic effects. For this reason, mapping the targets of bioactive small molecules is a key step toward unraveling the molecular mechanisms underlying their bioactivity and predicting potential side effects or cross-reactivity. Recently, large datasets of protein-small molecule interactions have become available, providing a unique source of information for the development of knowledge-based approaches to computationally identify new targets for uncharacterized molecules or secondary targets for known molecules. Here, we introduce SwissTargetPrediction, a web server to accurately predict the targets of bioactive molecules based on a combination of 2D and 3D similarity measures with known ligands. Predictions can be carried out in five different organisms, and mapping predictions by homology within and between different species is enabled for close paralogs and orthologs. SwissTargetPrediction is accessible free of charge and without login requirement at http://www.swisstargetprediction.ch.
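A generic 2D ligand-similarity calculation of the kind such knowledge-based approaches rely on is sketched below; SwissTargetPrediction's actual descriptors and scoring are not described in the abstract, so Morgan fingerprints with Tanimoto similarity (via RDKit) are used here purely as an illustration:

```python
# Sketch of a generic 2D chemical similarity measure between two small molecules.
# This is NOT SwissTargetPrediction's implementation; it only illustrates the idea
# of comparing a query molecule with known ligands via fingerprint similarity.
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

def tanimoto_2d(smiles_a, smiles_b, radius=2, n_bits=2048):
    fps = [
        AllChem.GetMorganFingerprintAsBitVect(Chem.MolFromSmiles(s), radius, nBits=n_bits)
        for s in (smiles_a, smiles_b)
    ]
    return DataStructs.TanimotoSimilarity(fps[0], fps[1])

# Aspirin vs. salicylic acid: two closely related known bioactive molecules.
print(round(tanimoto_2d("CC(=O)OC1=CC=CC=C1C(=O)O", "OC(=O)C1=CC=CC=C1O"), 2))
```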
Abstract:
Background/objectives: Bioelectrical impedance analysis (BIA) is used in population and clinical studies as a technique for estimating body composition. Because of significant under-representation in the existing literature, we sought to develop and validate predictive equation(s) for BIA for studies in populations of African origin. Subjects/methods: Among five cohorts of the Modeling the Epidemiologic Transition Study, height, weight, waist circumference and body composition (the latter by isotope dilution) were measured in 362 adults, aged 25-45, with mean body mass indexes ranging from 24 to 32. BIA measures of resistance and reactance were obtained using tetrapolar placement of electrodes and the same model of analyzer across sites (BIA 101Q, RJL Systems). Multiple linear regression analysis was used to develop equations for predicting fat-free mass (FFM), as measured by isotope dilution; covariates included sex, age, waist, reactance and height²/resistance, along with dummy variables for each site. The developed equations were then tested in a validation sample; FFM values predicted by previously published equations were tested in the total sample. Results: A site-combined equation and site-specific equations were developed. The mean differences between FFM (reference) and FFM predicted by the study-derived equations were between 0.4 and 0.6 kg (that is, a 1% difference between the actual and predicted FFM), and the measured and predicted values were highly correlated. The site-combined equation performed slightly better than the site-specific equations and the previously published equations. Conclusions: Relatively small differences exist between BIA equations for estimating FFM, whether study-derived or published, although the site-combined equation performed slightly better than the others. The study-derived equations provide an important tool for research in these understudied populations.
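The structure of such a prediction equation (ordinary least squares regression of isotope-dilution FFM on sex, age, waist circumference, reactance and height²/resistance) can be sketched as follows; all data and coefficients below are synthetic, since the study's actual equation is not given in the abstract:

```python
# Sketch of fitting a BIA prediction equation for fat-free mass (FFM).
# Synthetic data only; covariates follow the abstract's description.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 362
df = pd.DataFrame({
    "male": rng.integers(0, 2, n),
    "age": rng.integers(25, 46, n),
    "waist_cm": rng.normal(90, 12, n),
    "reactance": rng.normal(55, 10, n),     # ohms
    "height_cm": rng.normal(168, 9, n),
    "resistance": rng.normal(520, 70, n),   # ohms
})
df["ht2_over_r"] = df["height_cm"] ** 2 / df["resistance"]
# Synthetic "reference" FFM loosely anchored on the BIA index term.
df["ffm_kg"] = 8 + 0.65 * df["ht2_over_r"] + 5.5 * df["male"] + rng.normal(0, 2.5, n)

model = smf.ols("ffm_kg ~ male + age + waist_cm + reactance + ht2_over_r", data=df).fit()
print(model.params.round(2))  # the fitted prediction equation's coefficients
```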
Abstract:
BACKGROUND: Several markers of atherosclerosis and of inflammation have individually been shown to predict coronary heart disease (CHD). However, the utility of markers of atherosclerosis and of inflammation for predicting CHD over and above traditional risk factors has not been well established, especially in the elderly. METHODS: We studied 2202 men and women, aged 70-79, without baseline cardiovascular disease over a 6-year follow-up to assess the risk of incident CHD associated with baseline noninvasive measures of atherosclerosis (ankle-arm index [AAI], aortic pulse wave velocity [aPWV]) and inflammatory markers (interleukin-6 [IL-6], C-reactive protein [CRP], tumor necrosis factor-α [TNF-α]). CHD events were studied as either nonfatal myocardial infarction or coronary death ("hard" events), and as "hard" events plus hospitalization for angina or the need for coronary revascularization procedures (total CHD events). RESULTS: During the 6-year follow-up, 283 participants had CHD events (including 136 "hard" events). IL-6, TNF-α and AAI independently predicted CHD events above the Framingham Risk Score (FRS), with hazard ratios [HR] for the highest compared with the lowest quartile of 1.95 for IL-6 (95%CI: 1.38-2.75, p for trend <0.001) and 1.45 for TNF-α (95%CI: 1.04-2.02, p for trend 0.03), and of 1.66 (95%CI: 1.19-2.31) for AAI ≤0.9 compared with AAI 1.01-1.30. CRP and aPWV were not independently associated with CHD events. Results were similar for "hard" CHD events. The addition of IL-6 and AAI to traditional cardiovascular risk factors yielded the greatest improvement in the prediction of CHD; the C-index for "hard"/total CHD events increased from 0.62/0.62 for traditional risk factors to 0.64/0.64 with the addition of IL-6, 0.65/0.63 with AAI, and 0.66/0.64 with IL-6 combined with AAI. Being in the highest quartile of IL-6 combined with an AAI ≤0.90 or >1.40 yielded HRs of 2.51 (1.50-4.19) and 4.55 (1.65-12.50) above the FRS, respectively. With the use of CHD risk categories, risk prediction at 5 years was more accurate in models that included IL-6, AAI or both, with 8.0%, 8.3% and 12.1% of participants correctly reclassified, respectively. CONCLUSIONS: Among older adults, markers of atherosclerosis and of inflammation, particularly IL-6 and AAI, are independently associated with CHD. However, these markers only modestly improve cardiovascular risk prediction beyond traditional risk factors. Acknowledgments: This study was supported by Contracts NO1-AG-6-2101, NO1-AG-6-2103, and NO1-AG-6-2106 of the National Institute on Aging. This research was supported in part by the Intramural Research Program of the NIH, National Institute on Aging.
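The C-index used above to quantify discrimination is the fraction of comparable pairs in which the person with the higher risk score has the earlier event; a simplified pairwise implementation on toy follow-up data is sketched below (illustration only, not the study's code):

```python
# Sketch of Harrell's C-index for right-censored follow-up data.
# Simplified pairwise version with toy data, for illustration only.

def harrell_c_index(times, events, risk_scores):
    concordant = comparable = 0.0
    n = len(times)
    for i in range(n):
        for j in range(n):
            # A pair is comparable if subject i had an event before subject j's time.
            if events[i] == 1 and times[i] < times[j]:
                comparable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1
                elif risk_scores[i] == risk_scores[j]:
                    concordant += 0.5
    return concordant / comparable if comparable else float("nan")

# Toy data: follow-up times in years, event=1 is a CHD event, 0 is censored.
times = [1.2, 6.0, 2.5, 6.0, 4.1, 3.3]
events = [1, 0, 1, 0, 0, 1]
risk = [0.30, 0.05, 0.22, 0.10, 0.15, 0.12]   # e.g., model-based risk scores
print(round(harrell_c_index(times, events, risk), 2))
```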
Abstract:
Osteoporotic hip fractures increase dramatically with age and are responsible for considerable morbidity and mortality. Several treatments to prevent the occurrence of hip fracture have been validated in large randomized trials, and the current challenge is to improve the identification of individuals at high risk of fracture who would benefit from therapeutic or preventive intervention. We have performed an exhaustive literature review on hip fracture predictors, focusing primarily on clinical risk factors, dual X-ray absorptiometry (DXA), quantitative ultrasound, and bone markers. This review is based on original articles and meta-analyses. We have selected studies that aim both to predict the risk of hip fracture and to discriminate individuals with or without fracture. We have included only postmenopausal women in our review. For studies involving both men and women, only results concerning women have been considered. Regarding clinical factors, only prospective studies have been taken into account. Predictive factors have been used as stand-alone tools to predict hip fracture, sequentially through successive selection processes, or combined into risk scores. There is still much debate as to whether the combination of these various parameters, as risk scores or as sequential or concurrent combinations, could help to better predict hip fracture, and there are conflicting results on whether such combinations provide an improvement over each method alone. Sequential combination of bone mineral density and ultrasound parameters might be cost-effective compared with DXA alone, because fewer bone mineral density measurements are needed; however, the use of multiple techniques may also increase costs. One problem that precludes comparison of most published studies is that they report either relative risk, absolute risk, or sensitivity and specificity. The absolute risk of fracture for individuals, given their risk factors and bone assessment results, would be a more appropriate basis for decision-making than relative risk. Currently, a group appointed by the World Health Organization and led by Professor John Kanis is working on such a model. It will therefore become possible to further assess the best choice of threshold to optimize the number of women needed to screen for each country and each treatment.
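One way to see why absolute risk is more useful for decision-making than sensitivity and specificity alone is that the same test characteristics translate into very different post-test absolute risks depending on the baseline fracture risk; the sketch below illustrates this with hypothetical numbers:

```python
# Sketch: converting a screening test's sensitivity/specificity and a baseline
# (pre-test) fracture risk into a post-test absolute risk via Bayes' rule.
# All numbers are hypothetical and chosen only to illustrate the calculation.

def post_test_risk(pre_test_risk, sensitivity, specificity):
    """Absolute fracture risk given a 'positive' screening result."""
    true_pos = pre_test_risk * sensitivity
    false_pos = (1 - pre_test_risk) * (1 - specificity)
    return true_pos / (true_pos + false_pos)

sens, spec = 0.70, 0.80                 # hypothetical screening performance
for baseline in (0.02, 0.10, 0.25):     # hypothetical 10-year hip fracture risks
    print(f"baseline {baseline:.0%} -> post-test {post_test_risk(baseline, sens, spec):.1%}")
```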
Abstract:
BACKGROUND: The influence of anti-T-cell therapy in the immunogenicity of the influenza vaccine in kidney transplant recipients remains unclear. METHODS: During the 2010 to 2011 influenza season, we evaluated the immune response to the inactivated trivalent influenza vaccine in kidney transplant recipients having received Thymoglobulin or basiliximab as induction therapy. A hemagglutination inhibition assay was used to assess the immunogenicity of the vaccine. The primary outcome was geometric mean titers of hemagglutination inhibition after influenza vaccination. RESULTS: Sixty patients (Thymoglobulin n=22 and basiliximab n=38) were included. Patients in the Thymoglobulin group were older (P=0.16), showed higher creatinine levels (P=0.16) and had more frequently received a previous transplant (P=0.02). There were no significant differences in geometric mean titers for any of the three viral strains between groups (P=0.69 for H1N1, P=0.56 for H3N2, and P=0.7 for B strain). Seroconversion to at least one viral strain was seen in 15 (68%) of 22 patients in the Thymoglobulin group and 28 (73%) of 38 in the basiliximab group (P=0.77). In patients vaccinated during the first year after receiving anti-T-cell therapy (n=25), there was a trend toward lower vaccine responses in the Thymoglobulin group. Patients who received Thymoglobulin showed lower CD4 cell counts and lower levels of IgM, at an average of 16.2 months after transplantation. A multivariate analysis showed that only the absence of mycophenolate was associated with a better vaccine response (odds ratio=9.47; 95% confidence interval, 1.03-86.9; P=0.047). CONCLUSION: No significant differences were seen in immunogenicity of the influenza vaccine in kidney transplant recipients having received either Thymoglobulin or basiliximab.
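The two immunogenicity measures used above (geometric mean titre and seroconversion rate) can be computed as sketched below; the titres are hypothetical, and a fourfold or greater titre rise is used as the seroconversion criterion, a common convention that the abstract does not explicitly state:

```python
# Sketch: geometric mean titre (GMT) of post-vaccination HI titres and the
# seroconversion rate. Hypothetical titres; >=4-fold rise assumed as the criterion.
import math

pre = [10, 20, 10, 40, 10, 20, 10, 80]     # hypothetical pre-vaccination HI titres
post = [40, 160, 20, 80, 80, 40, 10, 320]  # hypothetical post-vaccination HI titres

gmt_post = math.exp(sum(math.log(t) for t in post) / len(post))
seroconverted = sum(1 for a, b in zip(pre, post) if b / a >= 4)
print(f"GMT (post): {gmt_post:.0f}; seroconversion: {seroconverted}/{len(post)} "
      f"({100 * seroconverted / len(post):.0f}%)")
```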
Abstract:
BACKGROUND: Guidelines for the prevention of coronary heart disease (CHD) recommend the use of Framingham-based risk scores that were developed in white middle-aged populations. It remains unclear whether and how CHD risk prediction might be improved among older adults. We aimed to compare the prognostic performance of the Framingham risk score (FRS), directly and after recalibration, with refit functions derived from the present cohort, and to assess the utility of adding other routinely available risk parameters to the FRS. METHODS: Among 2193 black and white older adults (mean age, 73.5 years) without pre-existing cardiovascular disease from the Health ABC cohort, we examined adjudicated CHD events, defined as incident myocardial infarction, CHD death, and hospitalization for angina or coronary revascularization. RESULTS: During 8-year follow-up, 351 participants experienced CHD events. The FRS poorly discriminated between persons who did and did not experience CHD events (C-index: 0.577 in women; 0.583 in men) and underestimated absolute risk by 51% in women and 8% in men. Recalibration of the FRS improved absolute risk prediction, particularly for women. For both genders, refitting these functions substantially improved absolute risk prediction, with discrimination similar to that of the FRS. Results did not differ between whites and blacks. The addition of lifestyle variables, waist circumference and creatinine did not improve risk prediction beyond the risk factors of the FRS. CONCLUSIONS: The FRS underestimates CHD risk in older adults, particularly in women, although traditional risk factors remain the best predictors of CHD. Re-estimated risk functions using these factors improve the accuracy of absolute risk estimation.
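Recalibration of a Cox-type risk equation is commonly done by keeping the original coefficients while replacing the baseline survival and mean risk-factor level with cohort-specific values; the abstract does not give the exact procedure or numbers used, so the sketch below is illustrative only:

```python
# Sketch of recalibrating a Framingham-style risk equation of the form
#   risk = 1 - S0 ** exp(LP - mean_LP)
# by substituting the new cohort's baseline survival S0 and mean linear
# predictor while keeping the original coefficients. Values are hypothetical.
import math

def predicted_risk(lp, s0, mean_lp):
    """Absolute event risk over the follow-up horizon for linear predictor lp."""
    return 1 - s0 ** math.exp(lp - mean_lp)

lp = 2.9  # hypothetical linear predictor (sum of beta_i * x_i for one person)
# Original calibration constants (hypothetical values standing in for Framingham's).
risk_original = predicted_risk(lp, s0=0.95, mean_lp=2.5)
# Recalibrated with the new cohort's own baseline survival and mean predictor.
risk_recalibrated = predicted_risk(lp, s0=0.88, mean_lp=2.7)
print(f"original: {risk_original:.1%}, recalibrated: {risk_recalibrated:.1%}")
```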
Abstract:
Objective. To measure support for seasonal influenza vaccination requirements among US healthcare personnel (HCP) and its associations with attitudes regarding influenza and influenza vaccination and self-reported coverage by existing vaccination requirements. Design. Between June 1 and June 30, 2010, we surveyed a sample of US HCP ([Formula: see text]) recruited using an existing probability-based online research panel of participants representing the US general population as a sampling frame. Setting. General community. Participants. Eligible HCP who (1) reported having worked as medical doctors, health technologists, healthcare support staff, or other health practitioners or who (2) reported having worked in hospitals, ambulatory care facilities, long-term care facilities, or other health-related settings. Methods. We analyzed support for seasonal influenza vaccination requirements for HCP using proportion estimation and multivariable probit models. Results. A total of 57.4% (95% confidence interval, 53.3%-61.5%) of US HCP agreed that HCP should be required to be vaccinated for seasonal influenza. Support for mandatory vaccination was statistically significantly higher among HCP who were subject to employer-based influenza vaccination requirements, who considered influenza to be a serious disease, and who agreed that influenza vaccine was safe and effective. Conclusions. A majority of HCP support influenza vaccination requirements. Moreover, providing HCP with information about the safety of influenza vaccination and communicating that immunization of HCP is a patient safety issue may be important for generating staff support for influenza vaccination requirements.
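The multivariable probit analysis described above, with support for a vaccination requirement modelled as a function of attitude variables, can be sketched as follows; the data are synthetic and the covariates simplified, for illustration only:

```python
# Sketch of a multivariable probit model for support of an influenza vaccination
# requirement. Synthetic data; covariates loosely mirror those named in the abstract.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 1000
df = pd.DataFrame({
    "subject_to_requirement": rng.integers(0, 2, n),
    "flu_is_serious": rng.integers(0, 2, n),
    "vaccine_safe_effective": rng.integers(0, 2, n),
})
latent = (-0.4 + 0.6 * df["subject_to_requirement"] + 0.5 * df["flu_is_serious"]
          + 0.7 * df["vaccine_safe_effective"] + rng.normal(0, 1, n))
df["supports_requirement"] = (latent > 0).astype(int)

model = smf.probit(
    "supports_requirement ~ subject_to_requirement + flu_is_serious + vaccine_safe_effective",
    data=df,
).fit(disp=False)
print(model.params.round(2))
```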
Abstract:
OBJECTIVE: To study delayed failure after subthalamic nucleus (STN) deep brain stimulation in Parkinson's disease (PD) patients. METHODS: Out of 56 consecutive bilaterally STN-implanted PD patients, we selected those who, after initial clinical improvement (1 month after surgery), lost the benefit (delayed failure, DF). RESULTS: Five patients sub-acutely developed severe gait disorders (DF). In 4 of the 5 DF patients, a micro-lesion effect, defined as improvement without stimulation, was observed; the immediate post-operative MRI demonstrated electrodes located above or behind the STN. CONCLUSIONS: Patients presenting a micro-lesion effect should be carefully monitored, as this phenomenon can mask electrode misplacement and subsequent evolution to DF.