Abstract:
BACKGROUND Current international treatment guidelines recommending therapeutic exercise for people with symptomatic hip osteoarthritis (OA) are based on limited evidence. OBJECTIVES To determine whether land-based therapeutic exercise is beneficial for people with hip OA in terms of reduced joint pain and improved physical function and quality of life. SEARCH METHODS We searched five databases from inception up to February 2013. SELECTION CRITERIA All randomised controlled trials (RCTs) recruiting people with hip OA and comparing some form of land-based therapeutic exercise (as opposed to exercises conducted in water) with a non-exercise group. DATA COLLECTION AND ANALYSIS Four review authors independently selected studies for inclusion. We resolved disagreements through consensus. Two review authors independently extracted data, assessed risk of bias and the quality of the body of evidence for each outcome using the GRADE approach. We conducted analyses on continuous outcomes (pain, physical function and quality of life) and dichotomous outcomes (proportion of study withdrawals). MAIN RESULTS We considered that seven of the 10 included RCTs had a low risk of bias. However, the results may be vulnerable to performance and detection bias, as none of the RCTs were able to blind participants to treatment allocation and, while most RCTs reported blinded outcome assessment, pain, physical function and quality of life were participant self-reported. One of the 10 RCTs was reported only as a conference abstract and did not provide sufficient data for the evaluation of bias risk. High-quality evidence from nine trials (549 participants) indicated that exercise reduced pain (standardised mean difference (SMD) -0.38, 95% confidence interval (CI) -0.55 to -0.20) and improved physical function (SMD -0.38, 95% CI -0.54 to -0.05) immediately after treatment. Pain and physical function were estimated to be 29 points on a 0- to 100-point scale (0 was no pain or loss of physical function) in the control group; exercise reduced pain by an equivalent of 8 points (95% CI 4 to 11 points; number needed to treat for an additional beneficial outcome (NNTB) 6) and improved physical function by an equivalent of 7 points (95% CI 1 to 12 points; NNTB 6). Only three small studies (183 participants) evaluated quality of life, providing overall low-quality evidence; no benefit of exercise was demonstrated (SMD 0.07, 95% CI -0.23 to 0.36). Quality of life in the control group was estimated to be 50 points on a norm-based scale with a general-population mean (standard deviation (SD)) of 50 (10); exercise improved quality of life by the equivalent of 0 points. Moderate-quality evidence from seven trials (715 participants) indicated an increased likelihood of withdrawal from the exercise allocation (event rate 6%) compared with the control group (event rate 3%), but this difference was not significant (risk difference 1%; 95% CI -1% to 4%). Of the five studies reporting adverse events, each reported only one or two events, all related to increased pain attributed to the exercise programme. The reduction in pain was sustained for at least three to six months after ceasing monitored treatment (five RCTs, 391 participants; SMD -0.38, 95% CI -0.58 to -0.18). Pain in the control group was estimated to be 29 points on a 0- to 100-point scale (0 was no pain); the improvement translated to a sustained reduction in pain intensity of 8 points (95% CI 4 to 12 points) compared with the control group (0 to 100 scale).
The improvement in physical function was also sustained (five RCTs, 367 participants; SMD -0.37, 95% CI -0.57 to -0.16). Physical function in the control group was estimated to be 24 points on a 0- to 100-point scale (0 was no loss of physical function); the improvement translated to a mean of 7 points (95% CI 4 to 13) compared with the control group. Only five of the 10 RCTs exclusively recruited people with symptomatic hip OA (419 participants). Their pain and physical function outcomes did not differ significantly from those of the five studies that recruited participants with hip or knee OA (130 participants). AUTHORS' CONCLUSIONS Pooling the results of these 10 RCTs demonstrated that land-based therapeutic exercise programmes can reduce pain and improve physical function among people with symptomatic hip OA.
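A note on the arithmetic behind the scale conversions in this abstract: Cochrane reviews back-translate a standardised mean difference onto the original outcome scale by multiplying it by a control-group standard deviation. The worked equation below reconstructs the pain figure reported above; the SD of roughly 21 points is our inference from the reported numbers, not a value stated in the abstract.

```latex
% Back-translating an SMD onto the original 0-100 pain scale.
% SD_control ~ 21 points is implied by the reported figures, not stated.
\[
  \mathrm{MD} \;=\; \mathrm{SMD} \times \mathrm{SD}_{\mathrm{control}}
  \;\approx\; -0.38 \times 21 \;\approx\; -8 \text{ points}
\]
% i.e. an 8-point reduction relative to the control-group mean of 29 points.
```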
Abstract:
BACKGROUND Copper and its main transport protein, ceruloplasmin, have been suggested to promote the development of atherosclerosis. Most of the data come from experimental and animal model studies. Copper and mortality have not been simultaneously evaluated in patients undergoing coronary angiography. METHODS AND RESULTS We examined whether serum copper and ceruloplasmin concentrations are associated with angiographic coronary artery disease (CAD) and with mortality from all causes and from cardiovascular causes in 3253 participants of the Ludwigshafen Risk and Cardiovascular Health Study. Age- and sex-adjusted hazard ratios (HRs) for death from any cause were 2.23 (95% CI, 1.85-2.68) for copper and 2.63 (95% CI, 2.17-3.20) for ceruloplasmin when we compared the highest with the lowest quartiles. The corresponding HRs for death from cardiovascular causes were 2.58 (95% CI, 2.05-3.25) and 3.02 (95% CI, 2.36-3.86), respectively. Further adjustment for various risk factors and clinical variables considerably attenuated these associations, which nevertheless remained statistically significant, and the results were consistent across subgroups. CONCLUSIONS Elevated concentrations of both copper and ceruloplasmin are independently associated with increased risk of mortality from all causes and from cardiovascular causes.
Abstract:
BACKGROUND Single nucleotide polymorphisms (SNPs) in immune genes have been associated with susceptibility to invasive mold infection (IMI) among hematopoietic stem cell transplant (HSCT) but not solid organ transplant (SOT) recipients. METHODS Twenty-four SNPs from systematically selected genes were genotyped in 1101 SOT recipients (715 kidney, 190 liver, 102 lung, 79 heart, 15 other) from the Swiss Transplant Cohort Study. Associations between SNPs and the endpoints were assessed by log-rank test and Cox regression models. Cytokine production upon Aspergillus stimulation was measured by ELISA in peripheral blood mononuclear cells (PBMCs) from healthy volunteers and correlated with the relevant genotypes. RESULTS Mold colonization (N=45) and proven/probable IMI (N=26) were associated with polymorphisms in interleukin-1 beta (IL1B, rs16944; log-rank test, recessive mode: colonization P=0.001 and IMI P=0.00005), interleukin-1 receptor antagonist (IL1RN, rs419598; P=0.01 and P=0.02) and β-defensin-1 (DEFB1, rs1800972; P=0.001 and P=0.0002, respectively). The associations with IL1B and DEFB1 remained significant in a multivariate regression model (IL1B rs16944, P=0.002; DEFB1 rs1800972, P=0.01). Presence of two copies of the rare allele of rs16944 or rs419598 was associated with reduced Aspergillus-induced IL-1β and TNFα secretion by PBMCs. CONCLUSIONS Functional polymorphisms in IL1B and DEFB1 influence susceptibility to mold infection in SOT recipients. This observation may contribute to individual risk stratification.
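As a concrete illustration of the analysis named above (a log-rank test under recessive genotype coding), the sketch below uses Python with the lifelines library. All column names and data are hypothetical; this is our reconstruction of the kind of comparison described, not the study's actual code.

```python
# Hypothetical sketch: log-rank comparison under recessive genotype coding
# (0/1 copies of the rare allele vs. 2 copies), as described in the abstract.
# Column names and data are invented for illustration.
import pandas as pd
from lifelines.statistics import logrank_test

# Toy cohort: follow-up time in days, IMI event indicator, allele count.
df = pd.DataFrame({
    "time_days":          [120, 400, 365, 90, 365, 365, 30, 365],
    "imi_event":          [1,   0,   0,   1,  0,   0,   1,  0],
    "rare_allele_copies": [2,   0,   1,   2,  0,   1,   2,  0],
})

# Recessive mode: homozygous rare-allele carriers vs. everyone else.
recessive = df["rare_allele_copies"] == 2

result = logrank_test(
    df.loc[recessive, "time_days"],
    df.loc[~recessive, "time_days"],
    event_observed_A=df.loc[recessive, "imi_event"],
    event_observed_B=df.loc[~recessive, "imi_event"],
)
print(f"log-rank p-value: {result.p_value:.4g}")
```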
Abstract:
To improve the water solubility of dinuclear thiolato-bridged arene ruthenium complexes, a new series was synthesized by conjugating octaarginine, octalysine, and cyclo[Lys-Arg-Gly-Asp-D-Phe] using chloroacetyl thioether (ClAc) ligation. The resulting conjugates were cytotoxic against A2780 human ovarian cancer cells (IC50 = 2–8 μM) and against the cisplatin-resistant line A2780cisR (IC50 = 7–15 μM). To the best of our knowledge, these metal complexes are the most cytotoxic ruthenium bioconjugates reported so far.
Abstract:
AIMS Tumour buds in colorectal cancer represent an aggressive subgroup of non-proliferating and non-apoptotic tumour cells. We hypothesize that the survival of tumour buds depends on anoikis resistance. The role of tyrosine kinase receptor B (TrkB), a promoter of epithelial-mesenchymal transition (EMT) and anoikis resistance, in facilitating budding was investigated. METHODS AND RESULTS TrkB immunohistochemistry was performed on a multiple-punch tissue microarray of 211 colorectal cancer resections. Membranous/cytoplasmic and nuclear expression was evaluated in the main tumour and in buds. Tumour budding was assessed on corresponding whole tissue slides, and the relationship to Ki-67 and caspase-3 was investigated. Analysis of Kirsten Ras (KRAS), proto-oncogene B-RAF (BRAF) and the cytosine-phosphate-guanosine island methylator phenotype (CIMP) was performed. Membranous/cytoplasmic and nuclear TrkB were strongly, inversely correlated (P < 0.0001; r = -0.41). Membranous/cytoplasmic TrkB was overexpressed in buds compared with the main tumour body (P < 0.0001) and was associated with larger tumours (P = 0.0236), high-grade budding (P = 0.0011) and KRAS mutation (P = 0.0008). Nuclear TrkB was absent in buds (P < 0.0001) and in high-grade budding cancers (P = 0.0073). Among patients with membranous/cytoplasmic TrkB-positive buds, high tumour membranous/cytoplasmic TrkB expression was a significant, independent adverse prognostic factor [P = 0.033; 1.79, 95% confidence interval (CI) 1.05-3.05]. Inverse correlations between membranous/cytoplasmic TrkB and Ki-67 (r = -0.41; P < 0.0001) and caspase-3 (r = -0.19; P < 0.05) were observed. CONCLUSIONS Membranous/cytoplasmic TrkB may promote an EMT-like phenotype with high-grade budding and maintain the viability of the buds themselves.
Abstract:
BACKGROUND After heart transplantation (HTx), the interindividual pharmacokinetic variability of immunosuppressive drugs represents a major therapeutic challenge because of the narrow therapeutic window between over-immunosuppression causing toxicity and under-immunosuppression leading to graft rejection. Although genetic polymorphisms have been shown to influence the pharmacokinetics of immunosuppressants, data in the context of HTx are scarce. We therefore assessed the role of genetic variation in CYP3A4, CYP3A5, POR, NR1I2, and ABCB1, which act jointly in immunosuppressive drug pathways, in tacrolimus (TAC) and ciclosporin (CSA) dose requirements in HTx recipients. METHODS Associations between 7 functional genetic variants and blood dose-adjusted trough (C0) concentrations of TAC and CSA at 1, 3, 6, and 12 months after HTx were evaluated in cohorts of 52 and 45 patients, respectively. RESULTS Compared with CYP3A5 nonexpressors (*3/*3 genotype), CYP3A5 expressors (*1/*3 or *1/*1 genotype) required around 2.2- to 2.6-fold higher daily TAC doses to reach the targeted C0 concentration at all studied time points (P ≤ 0.003). Additionally, carriers of the POR*28 variant showed higher dose-adjusted TAC C0 concentrations at all time points, with significant differences at 3 (P = 0.025) and 6 months (P = 0.047) after HTx. No significant associations were observed between the genetic variants and the CSA dose requirement. CONCLUSIONS The CYP3A5*3 variant has a major influence on the required TAC dose in HTx recipients, whereas POR*28 may additionally contribute to the observed variability. These results support the importance of genetic markers in TAC dose optimization after HTx.
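Since the outcome in this study is the dose-adjusted trough concentration, a brief note on the convention may help. The formula below is the usual definition; we assume it applies here, as the abstract does not spell out the exact normalisation used.

```latex
% Usual definition of a dose-adjusted trough concentration (assumed here;
% the abstract does not state the exact normalisation used in this study):
\[
  C_{0,\mathrm{adj}} \;=\; \frac{C_0 \;(\mathrm{ng/mL})}{D \;(\mathrm{mg/day})}
\]
% Expressors needing 2.2- to 2.6-fold higher daily doses to reach the same
% target C0 therefore show correspondingly lower dose-adjusted values.
```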
Abstract:
Lake Victoria is Africa’s single most important source of inland fishery production. After it was initially fished down in the first half of the 20th century, Lake Victoria became home to a series of introduced food fishes, culminating in the eventual demographic dominance of the Nile perch, Lates niloticus. Simultaneously with the changes in fish stocks, Lake Victoria experienced dramatic changes in its ecology. The lake fishery during most of the 20th century was a multispecies fishery resting on a diverse lake ecosystem, in which native food fishes were targeted. The lake ended the century with a much more productive fishery, but one in which three species — two of them introduced — made up the majority of the catch. Although many fish stocks in Lake Victoria had declined before the expansion of the Nile perch population, a dramatic increase in the population size of Nile perch in the 1980s roughly coincided with the drastic decline or disappearance of many indigenous species. Now, two decades after the rise of Nile perch in Lake Victoria, this species has shown signs of being overfished, and some of the native species that were in retreat — or even thought extinct — are now reemerging. Data on the resurgence of the indigenous species suggest that heavy fishing of Nile perch may enhance biodiversity; this has spawned renewed interest in management options that promote both fishery sustainability and biodiversity conservation.
Abstract:
The ATLS program by the American College of Surgeons is probably the most important globally active training organization dedicated to improving trauma management. Detection of acute haemorrhagic shock is one of the key issues in clinical practice and thus also in medical teaching. In this issue of the journal, William Schulz and Ian McConachrie critically review the ATLS shock classification (Table 1), which has been criticized after several attempts at validation have failed [1]. The main problem is that distinct ranges of heart rate are related to ranges of uncompensated blood loss and that the heart rate decrease observed in severe haemorrhagic shock is ignored [2].

Table 1. Estimated blood loss based on patient's initial presentation (ATLS Student Course Manual, 9th Edition, American College of Surgeons 2012).

                             Class I           Class II         Class III              Class IV
Blood loss (ml)              Up to 750         750–1500         1500–2000              >2000
Blood loss (% blood volume)  Up to 15%         15–30%           30–40%                 >40%
Pulse rate (BPM)             <100              100–120          120–140                >140
Systolic blood pressure      Normal            Normal           Decreased              Decreased
Pulse pressure               Normal or ↑       Decreased        Decreased              Decreased
Respiratory rate             14–20             20–30            30–40                  >35
Urine output (ml/h)          >30               20–30            5–15                   Negligible
CNS/mental status            Slightly anxious  Mildly anxious   Anxious, confused      Confused, lethargic
Initial fluid replacement    Crystalloid       Crystalloid      Crystalloid and blood  Crystalloid and blood

In a retrospective evaluation of the Trauma Audit and Research Network (TARN) database, blood loss was estimated from the injuries in nearly 165,000 adult trauma patients, and each patient was allocated to one of the four ATLS shock classes [3]. Although heart rate increased and systolic blood pressure decreased from class I to class IV, respiratory rate and GCS were similar. The median heart rate in class IV patients was substantially lower than the value of 140 min⁻¹ postulated by ATLS. Moreover, deterioration of the different parameters does not necessarily proceed in parallel, as the ATLS shock classification suggests [4] and [5]. In all these studies, injury severity score (ISS) and mortality increased with increasing shock class [3] and with increasing heart rate and decreasing blood pressure [4] and [5]. This supports the general concept that the higher the heart rate and the lower the blood pressure, the sicker the patient. A prospective study attempted to validate a shock classification derived from the ATLS shock classes [6]. The authors used a combination of heart rate, blood pressure, clinically estimated blood loss and response to fluid resuscitation to classify trauma patients (Table 2) [6]. In their initial assessment of 715 predominantly blunt trauma patients, 78% were classified as normal (Class 0), 14% as Class I, 6% as Class II and only 1% each as Class III and Class IV. This corresponds to the results of the previous retrospective studies [4] and [5]. The main endpoint used in the prospective study was therefore the presence or absence of significant haemorrhage, defined as chest tube drainage >500 ml, evidence of >500 ml of blood loss in the peritoneum, retroperitoneum or pelvic cavity on CT scan, or requirement of any blood transfusion or >2000 ml of crystalloid. Because of the low prevalence of class II or higher grades, statistical evaluation was limited to a comparison between Class 0 and Class I–IV combined.
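Before turning to the prospective data in more detail, the sketch below makes the criticism concrete. It is our illustration in Python, not an ATLS tool: a literal, vital-signs-only application of the Table 1 ranges, scored per parameter. As the TARN data show, real patients frequently land in different classes for different parameters, which is exactly the problem with applying the table literally.

```python
# Illustrative only: a literal mapping of the Table 1 vital-sign ranges to
# ATLS shock classes, scored per parameter. Real patients often land in
# different classes for different parameters (the core criticism above).
def atls_class(pulse_bpm: float, sbp_normal: bool, resp_rate: float) -> dict:
    """Return the Table 1 class suggested by each parameter separately."""
    if pulse_bpm < 100:
        pulse_class = "I"
    elif pulse_bpm <= 120:
        pulse_class = "II"
    elif pulse_bpm <= 140:
        pulse_class = "III"
    else:
        pulse_class = "IV"

    if resp_rate <= 20:
        resp_class = "I"
    elif resp_rate <= 30:
        resp_class = "II"
    else:
        resp_class = "III/IV"  # the ranges overlap at the top of the table

    sbp_class = "I/II" if sbp_normal else "III/IV"
    return {"pulse": pulse_class, "respiration": resp_class, "sbp": sbp_class}

# Example: tachycardic but normotensive with normal respiration --
# the parameters already disagree about the class.
print(atls_class(pulse_bpm=125, sbp_normal=True, resp_rate=18))
# {'pulse': 'III', 'respiration': 'I', 'sbp': 'I/II'}
```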
As in the retrospective studies, Lawton did not find a statistically significant difference in heart rate or blood pressure among the five groups either, although there was a tendency towards a higher heart rate in Class II patients. Apparently, classification during the primary survey did not rely on vital signs but considered the rather soft criterion of “clinical estimation of blood loss” and the requirement of fluid substitution. This suggests that the allocation of an individual patient to a shock class was probably more an intuitive decision than an objective calculation. Nevertheless, the classification was a significant predictor of ISS [6].

Table 2. Shock grade categories in the prospective validation study (Lawton, 2014) [6].

Normal (no haemorrhage): vitals normal; response to fluid bolus (1000 ml) not applicable; no estimated blood loss.
Class I (mild): vitals normal; responds to fluid bolus, no further fluid required; estimated blood loss up to 750 ml.
Class II (moderate): HR >100 with SBP >90 mmHg; responds to fluid bolus, no further fluid required; estimated blood loss 750–1500 ml.
Class III (severe): SBP <90 mmHg; requires repeated fluid boluses; estimated blood loss 1500–2000 ml.
Class IV (moribund): SBP <90 mmHg or imminent arrest; declining SBP despite fluid boluses; estimated blood loss >2000 ml.

What does this mean for clinical practice and medical teaching? All these studies illustrate how difficult it is to validate a useful and widely accepted general physiologic concept of the response of the organism to fluid loss: a decrease in cardiac output, an increase in heart rate and a decrease in pulse pressure occur first; hypotension and bradycardia occur only later. An increasing heart rate, increasing diastolic blood pressure or decreasing systolic blood pressure should make any clinician consider hypovolaemia first, because it is treatable and deterioration of the patient is preventable. This is true for the patient on the ward, the sedated patient in the intensive care unit and the anaesthetized patient in the OR. We will therefore continue to teach this typical pattern, but will also continue to mention the exceptions and pitfalls at a second stage. The shock classification of ATLS is primarily used to illustrate the typical pattern of acute haemorrhagic shock (tachycardia and hypotension), as opposed to the Cushing reflex (bradycardia and hypertension) in severe head injury and intracranial hypertension, or to the neurogenic shock of acute tetraplegia or high paraplegia (relative bradycardia and hypotension). Schulz and McConachrie nicely summarize the various confounders and exceptions to the general pattern and explain why, in clinical reality, patients often do not present with the “typical” pictures of our textbooks [1]. ATLS also refers to the pitfalls in the signs of acute haemorrhage: advanced age, athletes, pregnancy, medications and pacemakers, and explicitly states that individual subjects may not follow the general pattern. Obviously, the ATLS shock classification, which is the basis for a number of questions in the written test of the ATLS student course and which has been used for decades, probably needs modification and cannot be applied literally in clinical practice. The European Trauma Course, another important trauma training program, uses the same parameters to estimate blood loss, together with clinical examination and laboratory findings (e.g. base deficit and lactate), but does not use a shock classification tied to absolute values. In conclusion, the typical physiologic response to haemorrhage as illustrated by the ATLS shock classes remains an important issue in clinical practice and in teaching.
The estimation of the severity of haemorrhage in the initial assessment of trauma patients is not (and never was) based solely on vital signs; it also includes the pattern of injuries, the requirement for fluid substitution and potential confounders. Vital signs are not obsolete, especially in the course of treatment, but they must be interpreted in view of the clinical context. Conflict of interest: None declared. The author is a member of the Swiss national ATLS core faculty.