217 results for serum level
Abstract:
The peroxisome proliferator-activated receptor (PPAR) family comprises three distinct isotypes: PPARalpha, PPARbeta/delta and PPARgamma. PPARs are nuclear hormone receptors that mediate the effects of fatty acids and their derivatives at the transcriptional level. Until recently, the characterisation of the important role of PPARalpha in fatty acid oxidation and of PPARgamma in lipid storage contrasted with the sparse information concerning PPARbeta/delta. However, evidence is now emerging for a role of PPARbeta/delta in tissue repair and energy homeostasis. Experiments with tissue-specific overexpression of PPARbeta/delta or treatment of mice with selective PPARbeta/delta agonists demonstrated that activation of PPARbeta/delta in vivo increases lipid catabolism in skeletal muscle, heart and adipose tissue and improves the serum lipid profile and insulin sensitivity in several animal models. PPARbeta/delta activation also prevents the development of obesity and improves cholesterol homeostasis in obesity-prone mouse models. These new insights into PPARbeta/delta functions suggest that targeting PPARbeta/delta may be helpful for treating disorders associated with the metabolic syndrome. Although these perspectives are promising, several independent and contradictory reports raise concerns about the safety of PPARbeta/delta ligands with respect to tumourigenic activity in the gut. Thus, it appears that further exploration of PPARbeta/delta functions is necessary to better define its potential as a therapeutic target.
Abstract:
Elevated serum ferritin levels may reflect a systemic inflammatory state as well as increased iron storage, both of which may contribute to an unfavorable outcome of chronic hepatitis C (CHC). We therefore performed a comprehensive analysis of the role of serum ferritin and its genetic determinants in the pathogenesis and treatment of CHC. To this end, serum ferritin levels at baseline of therapy with pegylated interferon-alpha and ribavirin or before biopsy were correlated with clinical and histological features of chronic hepatitis C virus (HCV) infection, including necroinflammatory activity (N = 970), fibrosis (N = 980), steatosis (N = 886), and response to treatment (N = 876). The association between high serum ferritin levels (> median) and the endpoints was assessed by logistic regression. Moreover, a candidate gene study as well as a genome-wide association study of serum ferritin were performed. We found that serum ferritin ≥ the sex-specific median was one of the strongest pretreatment predictors of treatment failure (univariate P < 0.0001, odds ratio [OR] = 0.45, 95% confidence interval [CI] = 0.34-0.60). This association remained highly significant in a multivariate analysis (P = 0.0002, OR = 0.35, 95% CI = 0.20-0.61), with an OR comparable to that of the interleukin (IL)28B genotype. When patients with the unfavorable IL28B genotypes were stratified according to high versus low ferritin levels, sustained virological response (SVR) rates differed by > 30% in both HCV genotype 1- and genotype 3-infected patients (P < 0.001). Serum ferritin levels were also independently associated with severe liver fibrosis (P < 0.0001, OR = 2.67, 95% CI = 1.68-4.25) and steatosis (P = 0.002, OR = 2.29, 95% CI = 1.35-3.91), but not with necroinflammatory activity (P = 0.3). Genetic variations had only a limited impact on serum ferritin levels. Conclusion: In patients with CHC, elevated serum ferritin levels are independently associated with advanced liver fibrosis, hepatic steatosis, and poor response to interferon-alpha-based therapy.
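The regression framework described above lends itself to a brief illustration. Below is a minimal sketch of a logistic regression relating a dichotomised ferritin indicator (above versus below the sex-specific median) to a binary treatment-failure endpoint; the data, variable names and effect size are invented for illustration, and this is not the study's analysis code.

```python
# Illustrative sketch only: logistic regression of a binary endpoint on a
# dichotomised predictor, reporting an odds ratio with a 95% CI. The data
# and the assumed effect size are hypothetical.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 900
high_ferritin = rng.integers(0, 2, n)      # 1 = ferritin above sex-specific median
log_odds = -0.2 - 0.8 * high_ferritin      # assumed effect, for illustration only
failure = rng.binomial(1, 1 / (1 + np.exp(-log_odds)))

X = sm.add_constant(high_ferritin)
fit = sm.Logit(failure, X).fit(disp=False)
or_est, ci_lo, ci_hi = np.exp([fit.params[1], *fit.conf_int()[1]])
print(f"OR = {or_est:.2f} (95% CI {ci_lo:.2f}-{ci_hi:.2f})")
```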
Abstract:
BACKGROUND: Cardiovascular diseases (CVD) cause 1.8 million premature (<75 years) deaths annually in Europe. The majority of these deaths are preventable, with the most efficient and cost-effective approach being at the population level. The aim of this position paper is to assist authorities in selecting the most adequate management strategies to prevent CVD. DESIGN AND METHODS: Experts reviewed and summarized the published evidence on the major modifiable CVD risk factors: food, physical inactivity, smoking, and alcohol. Population-based preventive strategies focus on fiscal measures (e.g. taxation), national and regional policies (e.g. smoke-free legislation), and environmental changes (e.g. availability of alcohol). RESULTS: Food is a complex area, but several strategies can be effective in increasing the intake of fruit and vegetables and lowering the intake of salt, saturated fat, trans-fats, and free sugars. Tobacco and alcohol can be regulated mainly by fiscal measures and national policies, but local availability also plays a role. Changes in national policies and the built environment will integrate physical activity into daily life. CONCLUSION: Societal changes and commercial influences have led to the present unhealthy environment, in which the default lifestyle options increase CVD risk. A challenge for both central and local authorities is, therefore, to ensure healthier defaults. This position paper summarizes the evidence and recommends a number of structural strategies at international, national, and regional levels that in combination can substantially reduce CVD.
Abstract:
We did a subject-level meta-analysis of the changes (Δ) in blood pressure (BP) observed 3 and 6 months after renal denervation (RDN) at 10 European centers. Recruited patients (n=109; 46.8% women; mean age 58.2 years) had essential hypertension confirmed by ambulatory BP. From baseline to 6 months, treatment score declined slightly from 4.7 to 4.4 drugs per day. Systolic/diastolic BP fell by 17.6/7.1 mm Hg for office BP, and by 5.9/3.5, 6.2/3.4, and 4.4/2.5 mm Hg for 24-h, daytime and nighttime BP (P≤0.03 for all). In 47 patients with 3- and 6-month ambulatory measurements, systolic BP did not change between these two time points (P≥0.08). Normalization was a systolic BP of <140 mm Hg on office measurement or <130 mm Hg on 24-h monitoring, and improvement a fall in systolic BP of ≥10 mm Hg, irrespective of measurement technique. For office BP, at 6 months, normalization, improvement or no decrease occurred in 22.9, 59.6 and 22.9% of patients, respectively; for 24-h BP, these proportions were 14.7, 31.2 and 34.9%, respectively. Higher baseline BP predicted a greater BP fall at follow-up; higher baseline serum creatinine was associated with a lower probability of improvement of 24-h BP (odds ratio [OR] for a 20-μmol l(-1) increase, 0.60; P=0.05) and a higher probability of experiencing no BP decrease (OR, 1.66; P=0.01). In conclusion, BP responses to RDN include regression to the mean and remain to be consolidated in randomized trials based on ambulatory BP monitoring. For now, RDN should remain the last resort in patients in whom all other ways to control BP have failed, and it must be used cautiously in patients with renal impairment.
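The response definitions used above are explicit enough to state as code. The sketch below simply encodes them; the function names are hypothetical and have no connection to the authors' software.

```python
# Response definitions as stated in the abstract (names hypothetical):
# normalization = office systolic BP < 140 mm Hg or 24-h systolic BP < 130 mm Hg;
# improvement   = a fall in systolic BP of >= 10 mm Hg, by either technique.
def normalized(office_sbp=None, abpm24_sbp=None):
    """True if follow-up systolic BP meets the normalization threshold."""
    return (office_sbp is not None and office_sbp < 140) or \
           (abpm24_sbp is not None and abpm24_sbp < 130)

def improved(sbp_baseline, sbp_followup):
    """True if systolic BP fell by at least 10 mm Hg from baseline."""
    return (sbp_baseline - sbp_followup) >= 10

print(normalized(office_sbp=138))                     # True
print(improved(sbp_baseline=165, sbp_followup=152))   # True (fall of 13 mm Hg)
```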
Abstract:
Meta-analysis of genome-wide association studies (GWASs) has led to the discoveries of many common variants associated with complex human diseases. There is a growing recognition that identifying "causal" rare variants also requires large-scale meta-analysis. The fact that association tests with rare variants are performed at the gene level rather than at the variant level poses unprecedented challenges in the meta-analysis. First, different studies may adopt different gene-level tests, so the results are not compatible. Second, gene-level tests require multivariate statistics (i.e., components of the test statistic and their covariance matrix), which are difficult to obtain. To overcome these challenges, we propose to perform gene-level tests for rare variants by combining the results of single-variant analysis (i.e., p values of association tests and effect estimates) from participating studies. This simple strategy is possible because of an insight that multivariate statistics can be recovered from single-variant statistics, together with the correlation matrix of the single-variant test statistics, which can be estimated from one of the participating studies or from a publicly available database. We show both theoretically and numerically that the proposed meta-analysis approach provides accurate control of the type I error and is as powerful as joint analysis of individual participant data. This approach accommodates any disease phenotype and any study design and produces all commonly used gene-level tests. An application to the GWAS summary results of the Genetic Investigation of ANthropometric Traits (GIANT) consortium reveals rare and low-frequency variants associated with human height. The relevant software is freely available.
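The central insight, that multivariate statistics can be recovered from single-variant statistics plus their correlation matrix, can be made concrete with a short sketch. The weighted burden statistic below is one common gene-level test of this form; it is an assumption for illustration, not necessarily the test implemented in the authors' software.

```python
# Sketch: a gene-level burden test recovered from summary statistics alone.
# Given per-variant z-scores z and their correlation matrix R (estimable from
# one participating study or a public reference panel), the statistic
#     T = w'z / sqrt(w' R w)
# is standard normal under the null, so no individual-level data are needed.
import numpy as np
from scipy.stats import norm

def burden_test(z, R, w=None):
    """Two-sided p value of a weighted burden test from summary statistics."""
    if w is None:
        w = np.ones_like(z)                # equal weights, for illustration
    t = w @ z / np.sqrt(w @ R @ w)         # standardised combined statistic
    return 2 * norm.sf(abs(t))

# Hypothetical gene with three rare variants:
z = np.array([1.8, 2.1, 0.9])              # single-variant z-scores
R = np.array([[1.0, 0.1, 0.05],            # LD-induced correlations
              [0.1, 1.0, 0.2],
              [0.05, 0.2, 1.0]])
print(f"gene-level p = {burden_test(z, R):.4f}")
```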
Abstract:
OBJECTIVE: To discuss, on the basis of the experience of two clinical cases and an extensive literature review, the significance of extremely low levels of anti-Müllerian hormone (AMH), also known as Müllerian-inhibiting substance, in infertile women. DESIGN: Case report. SETTING: University-based infertility clinic at a medical center in Switzerland. PATIENT(S): Two women, 29 and 41 years of age and with a 2- and 4-year history of secondary infertility, respectively. INTERVENTION(S): Clinical, radiological, and biological investigation of infertility, including repeated measurements of serum AMH with serial ELISAs. MAIN OUTCOME MEASURE(S): Levels of AMH and development of ongoing pregnancy. RESULT(S): Both women had a spontaneous ongoing pregnancy despite undetectable AMH levels. CONCLUSION(S): Although it is helpful for the day-to-day management of infertile patients, the predictive value of AMH for the occurrence of a spontaneous ongoing pregnancy has limits.
Abstract:
Mountain ranges are biodiversity hotspots worldwide and provide refuge to many organisms under contemporary climate change. Gathering field information on mountain biodiversity over time is of primary importance to understand the response of biotic communities to climate changes. For plants, several long-term observation sites and networks of mountain biodiversity are emerging worldwide to gather field data and monitor altitudinal range shifts and community composition changes under contemporary climate change. Most of these monitoring sites, however, focus on alpine ecosystems and mountain summits, such as the global observation research initiative in alpine environments (GLORIA). Here we describe the Alps Vegetation Database, a comprehensive community-level archive (GIVD ID EU-00-014) which aims at compiling all available geo-referenced vegetation plots from lowland forests to alpine grasslands across the greatest mountain range in Europe: the Alps. This research initiative was funded between 2008 and 2011 by the Danish Council for Independent Research and was part of a larger project to compare cross-scale plant community structure between the Alps and the Scandes. The Alps Vegetation Database currently harbours 35,731 geo-referenced vegetation plots and 5,023 valid taxa across Mediterranean, temperate and alpine environments. The data are used primarily by the main contributors of the Alps Vegetation Database in an ecoinformatics approach to test hypotheses related to plant macroecology and biogeography, but external proposals for joint collaborations are welcome.
Abstract:
BACKGROUND: Maintaining therapeutic concentrations of drugs with a narrow therapeutic window is a complex task. Several computer systems have been designed to help doctors determine optimum drug dosage. Significant improvements in health care could be achieved if computer advice improved health outcomes and could be implemented in routine practice in a cost-effective fashion. This is an updated version of an earlier Cochrane systematic review, by Walton et al, published in 2001. OBJECTIVES: To assess whether computerised advice on drug dosage has beneficial effects on the process or outcome of health care. SEARCH STRATEGY: We searched the Cochrane Effective Practice and Organisation of Care Group specialized register (June 1996 to December 2006), MEDLINE (1966 to December 2006), EMBASE (1980 to December 2006), hand searched the journal Therapeutic Drug Monitoring (1979 to March 2007) and the Journal of the American Medical Informatics Association (1996 to March 2007), as well as reference lists from primary articles. SELECTION CRITERIA: Randomized controlled trials, controlled trials, controlled before-and-after studies and interrupted time series analyses of computerized advice on drug dosage were included. The participants were health professionals responsible for patient care. The outcomes were: any objectively measured change in the behaviour of the health care provider (such as changes in the dose of drug used); any change in the health of patients resulting from computerized advice (such as adverse reactions to drugs). DATA COLLECTION AND ANALYSIS: Two reviewers independently extracted data and assessed study quality. MAIN RESULTS: Twenty-six comparisons (23 articles) were included (as compared to fifteen comparisons in the original review), covering a wide range of drugs in inpatient and outpatient settings. Interventions usually targeted doctors, although some studies attempted to influence prescriptions by pharmacists and nurses. Although all studies used reliable outcome measures, their quality was generally low. Computerized advice for drug dosage gave significant benefits by: (1) increasing the initial dose (standardised mean difference 1.12, 95% CI 0.33 to 1.92); (2) increasing serum concentrations (standardised mean difference 1.12, 95% CI 0.43 to 1.82); (3) reducing the time to therapeutic stabilisation (standardised mean difference -0.55, 95% CI -1.03 to -0.08); (4) reducing the risk of toxic drug levels (rate ratio 0.45, 95% CI 0.30 to 0.70); and (5) reducing the length of hospital stay (standardised mean difference -0.35, 95% CI -0.52 to -0.17). AUTHORS' CONCLUSIONS: This review suggests that computerized advice for drug dosage has some benefits: it increased the initial dose of drug, increased serum drug concentrations and led to more rapid therapeutic control. It also reduced the risk of toxic drug levels and the length of time spent in the hospital. However, it had no effect on adverse reactions. In addition, there was no evidence to suggest that some decision support technical features (such as its integration into a computer physician order entry system) or aspects of the organization of care (such as the setting) could optimise the effect of computerised advice.
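For readers unfamiliar with the main effect measure reported in this review, the sketch below shows how a standardised mean difference is computed from two group summaries; the numbers are hypothetical and are not taken from the included trials.

```python
# Sketch: standardised mean difference (Cohen's d), the effect measure used
# above, computed from per-group means, SDs and sample sizes. Hypothetical data.
import math

def standardised_mean_difference(m1, sd1, n1, m2, sd2, n2):
    """(mean1 - mean2) divided by the pooled standard deviation."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# e.g. serum drug concentration, computer-advice arm vs. control arm:
d = standardised_mean_difference(m1=12.4, sd1=3.0, n1=40, m2=9.1, sd2=2.8, n2=42)
print(f"SMD = {d:.2f}")   # positive values favour the computer-advice arm here
```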
Abstract:
In cooperative multiagent systems, agents interact to solve tasks. Global dynamics of multiagent teams result from local agent interactions, and are complex and difficult to predict. Evolutionary computation has proven a promising approach to the design of such teams. The majority of current studies use teams composed of agents with identical control rules ("genetically homogeneous teams") and select behavior at the team level ("team-level selection"). Here we extend current approaches to include four combinations of genetic team composition and level of selection. We compare the performance of genetically homogeneous teams evolved with individual-level selection, genetically homogeneous teams evolved with team-level selection, genetically heterogeneous teams evolved with individual-level selection, and genetically heterogeneous teams evolved with team-level selection. We use a simulated foraging task to show that the optimal combination depends on the amount of cooperation required by the task. Accordingly, we distinguish between three types of cooperative tasks and suggest guidelines for the optimal choice of genetic team composition and level of selection.
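The two levels of selection contrasted above can be sketched in a few lines. The toy fitness functions and team structure below are assumptions for illustration and do not reproduce the paper's foraging experiments.

```python
# Toy sketch: team-level selection keeps the best teams as intact units,
# whereas individual-level selection pools all agents, keeps the best ones
# regardless of team membership, and regroups them into new teams.
import random

def team_level_selection(teams, team_fitness, k):
    """Keep the k best teams as intact units."""
    return sorted(teams, key=team_fitness, reverse=True)[:k]

def individual_level_selection(teams, agent_fitness, k, team_size):
    """Pool all agents, keep the k * team_size best, regroup into new teams."""
    pool = sorted((a for t in teams for a in t), key=agent_fitness, reverse=True)
    survivors = pool[:k * team_size]
    random.shuffle(survivors)
    return [survivors[i:i + team_size] for i in range(0, len(survivors), team_size)]

# Scalar "genomes" stand in for an agent's foraging skill:
teams = [[random.random() for _ in range(3)] for _ in range(10)]
best_teams = team_level_selection(teams, team_fitness=sum, k=5)
new_teams = individual_level_selection(teams, agent_fitness=lambda a: a, k=5, team_size=3)
```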
Abstract:
BACKGROUND: Tenofovir (TDF) use has been associated with proximal renal tubulopathy, reduced calculated glomerular filtration rates (cGFR) and losses in bone mineral density. Bone resorption could result in a compensatory osteoblast activation indicated by an increase in serum alkaline phosphatase (sAP). A few small studies have reported a positive correlation between renal phosphate losses, increased bone turnover and sAP. METHODS: We analysed sAP dynamics in patients initiating (n = 657), reinitiating (n = 361) and discontinuing (n = 73) combined antiretroviral therapy with and without TDF and assessed correlations with clinical and epidemiological parameters. RESULTS: TDF use was associated with a significant increase of sAP from a median of 74 U/l (interquartile range 60-98) to a plateau of 99 U/l (82-123) after 6 months (P < 0.0001), with a prompt return to baseline upon TDF discontinuation. No change occurred in TDF-sparing regimens. Univariable and multivariable linear regression analyses revealed a positive correlation between sAP and TDF use (P < or = 0.003), but no correlation with baseline cGFR, TDF-related cGFR reduction, changes in serum alanine aminotransferase (sALT) or active hepatitis C. CONCLUSIONS: We document a highly significant association between TDF use and increased sAP in a large observational cohort. The lack of correlation between TDF use and sALT suggests that the increase in sAP is attributable to the bone isoenzyme and indicates stimulated bone turnover. Together with published data on TDF-related renal phosphate losses, this finding raises concerns that TDF use could result in osteomalacia with a loss in bone mineral density, at least in a subset of patients. This potentially severe long-term toxicity should be addressed in future studies.