33 results for Comparative Methodologies and Theories
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Participatory approaches to conservation have been applied worldwide by governments and non-governmental organisations. However, results from a comparative analysis of the impacts of global change on management issues in 13 protected areas in Africa, Latin America, Asia, and Europe show that in many cases the involvement of local people has remained limited, and economic gains for local livelihoods have been marginal or non-existent. Viewed from a ‘new institutionalist’ perspective and focusing on power relations and ideologies, the results of this study, carried out within the framework of the Swiss National Centre of Competence in Research (NCCR) North-South, show that in the African cases local people do not feel part of the process and therefore become disengaged. In Asia, and even more so in Latin America, local indigenous peoples and their leaders support protected areas as a means to gain political rights over areas threatened by immigration. The European (Swiss) case is the only one where political rights and economic incentives create a context in which participation is of direct interest to local people. Meanwhile, recent debates on new global conservation developments in the context of climate change policy indicate a growing tendency to treat conservation as a commodity. We argue that this can have problematic effects on efforts to devolve power to the local level in the context of conservation.
Abstract:
Little sequence information exists on the matrix-protein (MA) encoding region of small ruminant lentiviruses (SRLV). Fifty-two novel sequences were established and permitted a first phylogenetic analysis of this region of the SRLV genome. The variability of the MA encoding region is higher than that of the gag region encoding the capsid protein and surprisingly close to that reported for the env gene. In contrast to primate lentiviruses, the deduced amino acid sequences of the N- and C-terminal domains of MA are variable. This made it possible to pinpoint a basic domain in the N-terminal domain that is conserved in all lentiviruses and likely to play an important functional role. Additionally, a seven-amino-acid insertion was detected in all MVV strains, which may be used to differentiate CAEV and MVV isolates. A molecular epidemiology analysis based on these sequences indicates that the Italian lentivirus strains are closely related to each other and to the CAEV-CO strain, a prototypic strain isolated three decades ago in the US. This suggests a common origin of the SRLV circulating in the monitored flocks, possibly related to the introduction of infected goats into a negative population. Finally, this study shows that the MA region is suitable for phylogenetic studies and may be applied to monitor SRLV eradication programs.
Abstract:
We report a high-quality draft sequence of the genome of the horse (Equus caballus). The genome is relatively repetitive but has little segmental duplication. Chromosomes appear to have undergone few historical rearrangements: 53% of equine chromosomes show conserved synteny to a single human chromosome. Equine chromosome 11 is shown to have an evolutionary new centromere devoid of centromeric satellite DNA, suggesting that centromeric function may arise before satellite repeat accumulation. Linkage disequilibrium, reflecting the influence of the early domestication of large herds of female horses, is intermediate in length between that of dog and human, and there is long-range haplotype sharing among breeds.
Abstract:
Background: Despite immense efforts toward the development of new antidepressant drugs, increases in serotonergic and catecholaminergic neurotransmission have remained the two major pharmacodynamic principles of current drug treatments for depression. Consequently, psychopathological or biological markers that predict response to drugs that selectively increase serotonin and/or catecholamine neurotransmission hold the potential to optimize the prescriber’s selection among currently available treatment options. The aim of this study was to elucidate the differential symptomatology and neurophysiology in response to reductions in serotonergic versus catecholaminergic neurotransmission in subjects at high risk of depression recurrence. Methods: Using identical neuroimaging procedures with [18F]fluorodeoxyglucose positron emission tomography after tryptophan depletion (TD) and catecholamine depletion (CD), subjects with remitted depression were compared to healthy controls in a double-blind, randomized, crossover design. Results: While TD induced significantly more depressed mood, sadness and hopelessness than CD, CD induced more inactivity, concentration difficulties, lassitude and somatic anxiety than TD. CD specifically increased glucose metabolism in the bilateral ventral striatum and decreased glucose metabolism in the bilateral orbitofrontal cortex, whereas TD specifically increased metabolism in the right prefrontal cortex and the posterior cingulate cortex (PCC). While we found direct associations between changes in brain metabolism and induced depressive symptoms following CD, the relationship between neural activity and symptoms was less clear after TD. Conclusions: This study showed that serotonin and catecholamines play common and differential roles in the pathophysiology of depression.
Abstract:
OBJECTIVES The purpose of this study was to compare the 2-year safety and effectiveness of new- versus early-generation drug-eluting stents (DES) according to the severity of coronary artery disease (CAD) as assessed by the SYNTAX (Synergy between Percutaneous Coronary Intervention with Taxus and Cardiac Surgery) score. BACKGROUND New-generation DES are considered the standard-of-care in patients with CAD undergoing percutaneous coronary intervention. However, there are few data investigating the effects of new- over early-generation DES according to the anatomic complexity of CAD. METHODS Patient-level data from 4 contemporary, all-comers trials were pooled. The primary device-oriented clinical endpoint was the composite of cardiac death, myocardial infarction, or ischemia-driven target-lesion revascularization (TLR). The principal effectiveness and safety endpoints were TLR and definite stent thrombosis (ST), respectively. Adjusted hazard ratios (HRs) with 95% confidence intervals (CIs) were calculated at 2 years for overall comparisons, as well as stratified for patients with lower (SYNTAX score ≤11) and higher complexity (SYNTAX score >11). RESULTS A total of 6,081 patients were included in the study. New-generation DES (n = 4,554) compared with early-generation DES (n = 1,527) reduced the primary endpoint (HR: 0.75 [95% CI: 0.63 to 0.89]; p = 0.001) without interaction (p = 0.219) between patients with lower (HR: 0.86 [95% CI: 0.64 to 1.16]; p = 0.322) versus higher CAD complexity (HR: 0.68 [95% CI: 0.54 to 0.85]; p = 0.001). In patients with SYNTAX score >11, new-generation DES significantly reduced TLR (HR: 0.36 [95% CI: 0.26 to 0.51]; p < 0.001) and definite ST (HR: 0.28 [95% CI: 0.15 to 0.55]; p < 0.001) to a greater extent than in the low-complexity group (TLR pint = 0.059; ST pint = 0.013). 
New-generation DES decreased the risk of cardiac mortality in patients with SYNTAX score >11 (HR: 0.45 [95% CI: 0.27 to 0.76]; p = 0.003) but not in patients with SYNTAX score ≤11 (pint = 0.042). CONCLUSIONS New-generation DES improve clinical outcomes compared with early-generation DES, with greater safety and effectiveness in patients with SYNTAX score >11.
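The interaction p-values quoted above can be obtained from the two stratum-specific hazard ratios and their 95% confidence intervals with the standard Altman-Bland test for interaction. A minimal sketch of that calculation (illustrative only; the study's own stratified models may have been computed differently):

```python
import math

def interaction_p(hr1, lo1, hi1, hr2, lo2, hi2):
    """Two-sided p-value for the difference between two hazard ratios,
    each given with its 95% CI (Altman-Bland test for interaction)."""
    # standard errors of the log-HRs, recovered from the CI widths
    se1 = (math.log(hi1) - math.log(lo1)) / (2 * 1.959964)
    se2 = (math.log(hi2) - math.log(lo2)) / (2 * 1.959964)
    d = math.log(hr1) - math.log(hr2)   # difference of log-HRs
    se = math.hypot(se1, se2)           # SE of the difference
    z = abs(d) / se
    # two-sided p from the standard normal CDF
    return 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))

# Primary endpoint: SYNTAX <=11 (HR 0.86, 0.64-1.16) vs >11 (HR 0.68, 0.54-0.85)
p = interaction_p(0.86, 0.64, 1.16, 0.68, 0.54, 0.85)
print(round(p, 3))  # ~0.218, close to the reported p for interaction of 0.219
```

The test reconstructs each standard error from the confidence-interval width on the log scale, which is why it needs only the numbers printed in the abstract.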
Abstract:
The spectacular diversity in sexually selected traits in the animal kingdom has inspired the hypothesis that sexual selection can promote species divergence. In recent years, several studies have attempted to test this idea by correlating species richness with estimates of sexual selection across phylogenies. These studies have yielded mixed results and it remains unclear whether the comparative evidence can be taken as generally supportive. Here, we conduct a meta-analysis of the comparative evidence and find a small but significant positive overall correlation between sexual selection and speciation rate. However, we also find that effect size estimates are influenced by methodological choices. Analyses that included deeper phylogenetic nodes yielded weaker correlations, and different proxies for sexual selection showed different relationships with species richness. We discuss the biological and methodological implications of these findings. We argue that progress requires more representative sampling and justification of chosen proxies for sexual selection and speciation rate, as well as more mechanistic approaches.
Abstract:
The purpose of this retrospective study was to compare patterns of vertebral fractures and luxations in 42 cats and 47 dogs, and to evaluate the impact of species-related differences on clinical outcome. Data regarding aetiology, neurological status, radiographic appearance and follow-up were compared between the groups. The thoracolumbar (Th3-L3) area was the most commonly affected location in both cats (49%) and dogs (58%). No lesions were observed in the cervical vertebral segments in cats, and none of the cats showed any signs of a Schiff-Sherrington syndrome. Vertebral luxations were significantly more frequent in dogs (20%) than in cats (6%), whereas combined fracture-luxations occurred significantly more often in cats (65%) than in dogs (37%). Caudal vertebral segment displacement was mostly dorsal in cats and ventral in dogs, with a significant difference in direction between cats and large dogs. The clinical outcome did not differ significantly between the two populations, and was poor in most cases (cats: 61%; dogs: 56%). The degree of dislocation and axis deviation were both significantly associated with a worse outcome in dogs, but not in cats. Although several differences in vertebral fractures and luxation patterns exist between cats and dogs, these generally do not seem to affect outcome.
Abstract:
Objectives: To update the 2006 systematic review of the comparative benefits and harms of erythropoiesis-stimulating agent (ESA) strategies and non-ESA strategies to manage anemia in patients undergoing chemotherapy and/or radiation for malignancy (excluding myelodysplastic syndrome and acute leukemia), including the impact of alternative thresholds for initiating treatment and optimal duration of therapy. Data sources: Literature searches were updated in electronic databases (n=3), conference proceedings (n=3), and Food and Drug Administration transcripts. Multiple sources (n=13) were searched for potential gray literature. A primary source for current survival evidence was a recently published individual patient data meta-analysis. In that meta-analysis, patient data were obtained from investigators for studies enrolling more than 50 patients per arm. Because those data constitute the most currently available data for this update, as well as the source for on-study (active treatment) mortality data, we limited inclusion in the current report to studies enrolling more than 50 patients per arm to avoid potential differential endpoint ascertainment in smaller studies. Review methods: Title and abstract screening was performed by one or two (to resolve uncertainty) reviewers; potentially included publications were reviewed in full text. Two or three (to resolve disagreements) reviewers assessed trial quality. Results were independently verified and pooled for outcomes of interest. The balance of benefits and harms was examined in a decision model. Results: We evaluated evidence from 5 trials directly comparing darbepoetin with epoetin, 41 trials comparing epoetin with control, and 8 trials comparing darbepoetin with control; 5 trials evaluated early versus late (delay until Hb ≤9 to 11 g/dL) treatment. 
Trials varied according to duration, tumor types, cancer therapy, trial quality, iron supplementation, baseline hemoglobin, ESA dosing frequency (and therefore amount per dose), and dose escalation. ESAs decreased the risk of transfusion (pooled relative risk [RR], 0.58; 95% confidence interval [CI], 0.53 to 0.64; I2 = 51%; 38 trials) without evidence of a meaningful difference between epoetin and darbepoetin. Thromboembolic event rates were higher in ESA-treated patients (pooled RR, 1.51; 95% CI, 1.30 to 1.74; I2 = 0%; 37 trials), again without a difference between epoetin and darbepoetin. In 14 trials reporting the Functional Assessment of Cancer Therapy (FACT)-Fatigue subscale, the most common patient-reported outcome, scores decreased by 0.6 in control arms (95% CI, −6.4 to 5.2; I2 = 0%) and increased by 2.1 in ESA arms (95% CI, −3.9 to 8.1; I2 = 0%). There were fewer thromboembolic and on-study mortality adverse events when ESA treatment was delayed until baseline Hb was less than 10 g/dL, in keeping with current treatment practice, but the difference in effect from early treatment was not significant, and the evidence was limited and insufficient for conclusions. No evidence informed the optimal duration of therapy. Mortality was increased during the on-study period (pooled hazard ratio [HR], 1.17; 95% CI, 1.04 to 1.31; I2 = 0%; 37 trials). There was one additional death for every 59 treated patients when the control-arm on-study mortality was 10 percent and one additional death for every 588 treated patients when the control-arm on-study mortality was 1 percent. A cohort decision model yielded a consistent result—greater loss of life-years when control-arm on-study mortality was higher. There was no discernible increase in mortality with ESA use over the longest available follow-up (pooled HR, 1.04; 95% CI, 0.99 to 1.10; I2 = 38%; 44 trials), but many trials did not include an overall survival endpoint and potential time-dependent confounding was not considered.
Conclusions: Results of this update were consistent with the 2006 review. ESAs reduced the need for transfusions and increased the risk of thromboembolism. FACT-Fatigue scores were better with ESA use but the magnitude was less than the minimal clinically important difference. An increase in mortality accompanied the use of ESAs. An important unanswered question is whether dosing practices and overall ESA exposure might influence harms.
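The "one additional death for every 59 (or 588) treated patients" figures follow from simple number-needed-to-harm arithmetic if the pooled on-study hazard ratio of 1.17 is applied as an approximate relative risk at the stated control-arm mortality. A back-of-the-envelope sketch (not the report's full decision model):

```python
def number_needed_to_harm(control_risk, relative_risk):
    """Patients treated per one extra adverse event, where the absolute
    risk increase is control_risk * (relative_risk - 1)."""
    absolute_risk_increase = control_risk * (relative_risk - 1)
    return 1 / absolute_risk_increase

# Pooled on-study mortality HR of 1.17, treated as an approximate RR
print(round(number_needed_to_harm(0.10, 1.17)))  # 59 at 10% control-arm mortality
print(round(number_needed_to_harm(0.01, 1.17)))  # 588 at 1% control-arm mortality
```

This also makes the review's qualitative point concrete: the absolute harm of a fixed relative effect scales with the baseline risk, so the NNH grows as control-arm mortality falls.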
Abstract:
Studies about the influence of patient characteristics on mechanical failure of cups in total hip replacement have applied different methodologies and revealed inconclusive results. The fixation mode has rarely been investigated. Therefore, we conducted a detailed analysis of the influence of patient characteristics and fixation mode on cup failure risks.
Abstract:
The aim of this study was to assess the prevalence of incomplete distal renal tubular acidosis (idRTA) in men with recurrent calcium nephrolithiasis and its potential impact on bone mineral density. We conducted a retrospective analysis of 150 consecutive male idiopathic recurrent calcium stone formers (RCSFs) who had originally been referred to the tertiary care stone center of the University Hospital of Berne for further metabolic evaluation. All RCSFs had been maintained on a free-choice diet while collecting two 24-h urine samples, and delivered second morning urine samples after 12 h of fasting. Among 12 RCSFs with a fasting urine pH >5.8, a modified 3-day ammonium chloride loading test identified idRTA in 10 patients (urine pH >5.32, idRTA group). We matched each idRTA subject with 5 control subjects from the 150 RCSFs without any acidification defect (50 patients in total, non-RTA group), primarily by BMI and then by age, for comparative biochemistry and dual-energy X-ray absorptiometry (DEXA) analyses. The prevalence of primary idRTA among RCSFs was 6.7% (10/150). Patients with idRTA had significantly higher 2-h fasting and 24-h urine pH (2-h urine pH: 6.6 ± 0.4 vs. 5.2 ± 0.1, p = 0.001; 24-h urine pH: 6.1 ± 0.2 vs. 5.3 ± 0.3, p = 0.001) and 24-h urinary calcium excretion (7.70 ± 1.75 vs. 5.69 ± 1.73 mmol/d, p = 0.02), but significantly lower 24-h urinary urea excretion (323 ± 53 vs. 399 ± 114 mmol/d, p = 0.01), urinary citrate levels (2.32 ± 0.82 vs. 3.01 ± 0.72 mmol/d, p = 0.04) and renal phosphate threshold normalized for the glomerular filtration rate (TmPO4/GFR: 0.66 ± 0.17 vs. 0.82 ± 0.21, p = 0.03) compared to non-RTA patients. No significant difference in bone mineral density (BMD) was found between idRTA and non-RTA patients for the lumbar spine (LS BMD (g/cm²): 1.046 ± 0.245 vs. 1.005 ± 0.119, p = 0.42) or femoral neck (FN BMD (g/cm²): 0.830 ± 0.135 vs. 0.852 ± 0.127).
Thus, idRTA occurs in 1 in 15 male RCSFs and should be sought in all recurrent calcium nephrolithiasis patients. Bone mineral density, however, does not appear to be significantly affected by idRTA.
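The abstract does not state how TmPO4/GFR was derived; it is conventionally estimated from fasting serum and urine phosphate and creatinine via the Walton-Bijvoet nomogram, commonly approximated as below. A sketch under that assumption (all concentrations in mmol/L; illustrative values, not the study's data):

```python
def tmp_gfr(serum_phos, serum_creat, urine_phos, urine_creat):
    """Renal phosphate threshold normalized for GFR, via the common
    Walton-Bijvoet approximation. All concentrations in mmol/L."""
    # TRP: fractional tubular reabsorption of phosphate
    trp = 1 - (urine_phos * serum_creat) / (urine_creat * serum_phos)
    if trp <= 0.86:
        return trp * serum_phos
    # correction applied when reabsorption is nearly complete
    return 0.3 * trp / (1 - 0.8 * trp) * serum_phos

# hypothetical fasting sample giving TRP = 0.80
print(round(tmp_gfr(1.0, 0.1, 20.0, 10.0), 2))  # 0.8
```

Lower values, as reported here for the idRTA group, indicate renal phosphate wasting.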
Abstract:
Purpose The accuracy, efficiency, and efficacy of four commonly recommended medication safety assessment methodologies were systematically reviewed. Methods Medical literature databases were systematically searched for any comparative study conducted between January 2000 and October 2009 in which at least two of the four methodologies—incident report review, direct observation, chart review, and trigger tool—were compared with one another. Any study that compared two or more methodologies for quantitative accuracy (adequacy of the assessment of medication errors and adverse drug events), efficiency (effort and cost), and efficacy, and that provided numerical data, was included in the analysis. Results Twenty-eight studies were included in this review. Of these, 22 compared two of the methodologies, and 6 compared three. Direct observation identified the greatest number of reports of drug-related problems (DRPs), while incident report review identified the fewest. However, incident report review generally showed a higher specificity than the other methods and most effectively captured severe DRPs. In contrast, the sensitivity of incident report review was lower than that of trigger tool. While trigger tool was the least labor-intensive of the four methodologies, incident report review appeared to be the least expensive, but only when linked with concomitant automated reporting systems and targeted follow-up. Conclusion All four medication safety assessment techniques—incident report review, chart review, direct observation, and trigger tool—have different strengths and weaknesses. Overlap between the methods in identifying DRPs is minimal. While trigger tool appeared to be the most effective and labor-efficient method, incident report review best identified high-severity DRPs.
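The sensitivity and specificity comparisons above reduce to standard 2x2 diagnostic-test arithmetic once each method's detected DRPs are scored against a reference set. A minimal sketch with hypothetical counts (not data from the review):

```python
def sensitivity_specificity(tp, fp, fn, tn):
    """Sensitivity: share of true DRPs a method detects.
    Specificity: share of non-DRP events it correctly ignores."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# hypothetical counts for a method like incident report review:
# it misses many true DRPs but rarely flags a non-event
sens, spec = sensitivity_specificity(tp=12, fp=3, fn=48, tn=237)
print(round(sens, 2), round(spec, 2))  # low sensitivity, high specificity
```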
Abstract:
Transferrin (TF)-mediated provision of iron is essential for a productive infection by many bacterial pathogens, and iron depletion of TF is a first-line defence against bacterial infections. Therefore, the TF gene can be considered a candidate gene for disease resistance. We obtained the complete DNA sequence of the porcine TF gene, which spans 40 kb and contains 17 exons. We identified polymorphisms in a panel of 10 different pig breeds. Comparative intra- and interbreed sequence analysis revealed 62 polymorphisms in the TF gene, including one microsatellite. Ten polymorphisms were located in the coding sequence of the TF gene. Four SNPs (c.902A>T, c.980G>A, c.1417A>G, c.1810A>C) were predicted to cause amino acid exchanges (p.Lys301Ile, p.Arg327Lys, p.Lys473Glu, p.Asn604His). We performed association analyses using six selected TF markers and 116 pigs experimentally infected with Actinobacillus pleuropneumoniae serotype 7. The analysis showed breed-specific TF allele frequencies. In German Landrace, we found evidence for a possible association between the severity of A. pleuropneumoniae infection and TF genotypes.
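The mapping from a coding-DNA position such as c.902A>T to a protein position such as p.Lys301Ile follows directly from HGVS numbering, where codon n covers cDNA positions 3n−2 to 3n (with the ATG at c.1-3). A small sketch that reproduces the four codon numbers in the abstract:

```python
def codon_number(cdna_position):
    """Codon containing a given coding-DNA (c.) position; codon n
    spans cDNA positions 3n-2 .. 3n (HGVS numbering, ATG = c.1-3)."""
    return (cdna_position - 1) // 3 + 1

# the four coding SNPs reported for the porcine TF gene
for pos in (902, 980, 1417, 1810):
    print(pos, codon_number(pos))  # codons 301, 327, 473, 604
```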