35 results for Sharing the Cost of a Public Good: an Incentive-Constrained Axiomatic Approach
Abstract:
CONTEXT Subclinical hypothyroidism has been associated with increased risk of coronary heart disease (CHD), particularly with thyrotropin levels of 10.0 mIU/L or greater. The measurement of thyroid antibodies helps predict the progression to overt hypothyroidism, but it is unclear whether thyroid autoimmunity independently affects CHD risk. OBJECTIVE The objective of the study was to compare the CHD risk of subclinical hypothyroidism with and without thyroid peroxidase antibodies (TPOAbs). DATA SOURCES AND STUDY SELECTION A MEDLINE and EMBASE search from 1950 to 2011 was conducted for prospective cohorts reporting baseline thyroid function, antibodies, and CHD outcomes. DATA EXTRACTION Individual data were obtained for 38 274 participants from six cohorts for CHD mortality, followed up for 460 333 person-years, and for 33 394 participants from four cohorts for CHD events. DATA SYNTHESIS Among 38 274 adults (median age 55 y, 63% women), 1691 (4.4%) had subclinical hypothyroidism, of whom 775 (45.8%) had positive TPOAbs. During follow-up, 1436 participants died of CHD and 3285 had CHD events. Compared with euthyroid individuals, age- and gender-adjusted risks of CHD mortality in subclinical hypothyroidism were similar among individuals with and without TPOAbs [hazard ratio (HR) 1.15, 95% confidence interval (CI) 0.87-1.53 vs HR 1.26, CI 1.01-1.58, P for interaction = .62], as were risks of CHD events (HR 1.16, CI 0.87-1.56 vs HR 1.26, CI 1.02-1.56, P for interaction = .65). Risks of CHD mortality and events increased with higher thyrotropin, but within each stratum, risks did not differ by TPOAb status. CONCLUSIONS CHD risk associated with subclinical hypothyroidism did not differ by TPOAb status, suggesting that biomarkers of thyroid autoimmunity do not add independent prognostic information for CHD outcomes.
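The comparison above rests on age- and gender-adjusted Cox proportional hazards models pooled across cohorts. The sketch below is only an illustration, not the study's analysis: it fits such a model to simulated individual-participant data, every variable name and number is fabricated, and the `lifelines` package is assumed to be available.

```python
# Illustrative age- and sex-adjusted Cox model on simulated data.
# NOT the study's analysis; all inputs are fabricated.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
age = rng.uniform(40, 80, n)                 # baseline age in years
female = rng.integers(0, 2, n)               # 1 = female
subclin_hypo = rng.integers(0, 2, n)         # subclinical hypothyroidism flag

# Exponential event times whose hazard rises with age and exposure.
rate = 0.01 * np.exp(0.03 * (age - 60) + 0.2 * subclin_hypo)
time = rng.exponential(1 / rate)
event = (time < 15).astype(int)              # administrative censoring at 15 y
time = np.minimum(time, 15)

df = pd.DataFrame({"time": time, "chd_death": event, "age": age,
                   "female": female, "subclin_hypo": subclin_hypo})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="chd_death")
# Hazard ratios with 95% confidence intervals, as reported in such analyses.
print(cph.summary[["exp(coef)", "exp(coef) lower 95%",
                   "exp(coef) upper 95%"]])
```

A real individual-participant meta-analysis would additionally account for cohort membership; that layer is omitted here for brevity.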
Abstract:
BACKGROUND Microvascular dysfunction and microthrombi formation are believed to contribute to the development of early brain injury (EBI) after aneurysmal subarachnoid hemorrhage (SAH). OBJECTIVE This study aimed to determine (i) the extent of microthrombus formation and neuronal apoptosis in the brain parenchyma using a blood shunt SAH model in rabbits; and (ii) the correlation of structural changes in microvessels with EBI characteristics. METHODS Acute SAH was induced using a rabbit shunt cisterna magna model. The extent of microthrombosis was detected 24 h post-SAH (n = 8) by fibrinogen immunostaining, compared to controls (n = 4). We assessed apoptosis by terminal deoxynucleotidyl transferase nick end labeling (TUNEL) in the cortex and hippocampus. RESULTS Our results showed significantly more TUNEL-positive cells (SAH: 115 ± 13; controls: 58 ± 10; P = 0.016) and fibrinogen-positive microthromboemboli (SAH: 9 ± 2; controls: 2 ± 1; P = 0.03) in the hippocampus after aneurysmal SAH. CONCLUSIONS We found clear evidence of early microclot formation in a rabbit model of acute SAH. The extent of microthrombosis did not correlate with early apoptosis or cerebral perfusion pressure (CPP) depletion after SAH; however, the total number of TUNEL-positive cells in the cortex and the hippocampus correlated significantly with the mean CPP reduction during the phase of maximum depletion after SAH induction. Both microthrombosis and neuronal apoptosis may contribute to EBI and subsequent delayed cerebral ischemia (DCI).
Abstract:
There is a growing demand for better understanding of the link between research, policy and practice in development. This article provides findings from a study that aimed to gain insights into how researchers engage with their non-academic partners. It draws on experiences from the National Centre of Competence in Research North-South programme, a development research network of Swiss, African, Asian and Latin American institutions. Conceptually, this study is concerned with research effectiveness as a means to identify knowledge useful for society. Research can be improved and adapted when monitoring the effects of interactions between researchers and non-academic partners. Therefore, a monitoring and learning approach was chosen. This study reveals researchers' strategies in engaging with non-academic partners and points to framing conditions considered decisive for successful interactions. It concludes that researchers need to systematically analyse the socio-political context in which they intervene. By providing insights from the ground and reflecting on them in the light of the latest theoretical concepts, this article contributes to the emerging literature founded on practice-based experience.
Abstract:
BACKGROUND The cost-effectiveness of routine viral load (VL) monitoring of HIV-infected patients on antiretroviral therapy (ART) depends on various factors that differ between settings and across time. Low-cost point-of-care (POC) tests for VL are in development and may make routine VL monitoring affordable in resource-limited settings. We developed a software tool to study the cost-effectiveness of switching to second-line ART with different monitoring strategies, and focused on POC-VL monitoring. METHODS We used a mathematical model to simulate cohorts of patients from start of ART until death. We modeled 13 strategies (no 2nd-line, clinical, CD4 (with or without targeted VL), POC-VL, and laboratory-based VL monitoring, with different frequencies). We included a scenario with identical failure rates across strategies, and one in which routine VL monitoring reduces the risk of failure. We compared lifetime costs and averted disability-adjusted life-years (DALYs). We calculated incremental cost-effectiveness ratios (ICERs). We developed an Excel tool to update the results of the model for varying unit costs and cohort characteristics, and conducted several sensitivity analyses varying the input costs. RESULTS Introducing 2nd-line ART had an ICER of US$1651-1766/DALY averted. Compared with clinical monitoring, the ICER of CD4 monitoring was US$1896-US$5488/DALY averted and that of VL monitoring US$951-US$5813/DALY averted. We found no difference between POC- and laboratory-based VL monitoring, except for the highest measurement frequency (every 6 months), where laboratory-based testing was more effective. Targeted VL monitoring was on the cost-effectiveness frontier only if the difference between 1st- and 2nd-line costs remained large, and if we assumed that routine VL monitoring does not prevent failure. CONCLUSION Compared with the less expensive strategies, the cost-effectiveness of routine VL monitoring essentially depends on the cost of 2nd-line ART. Our Excel tool is useful for determining optimal monitoring strategies for specific settings, with specific sex- and age-distributions and unit costs.
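Because the results above hinge on ICERs along a cost-effectiveness frontier, a minimal sketch of the computation may help. The strategy names echo the abstract, but every cost and DALY figure below is invented for illustration, and extended dominance is not handled.

```python
# Sketch: ICERs along a cost-effectiveness frontier (illustrative values only).

def frontier_icers(strategies):
    """Drop strictly dominated strategies, then compute each ICER as
    (incremental cost) / (incremental DALYs averted) versus the previous
    strategy on the frontier. Extended dominance is not handled."""
    ordered = sorted(strategies, key=lambda s: (s["dalys_averted"], s["cost"]))
    frontier = []
    for s in ordered:
        # A previously kept strategy is dominated if it costs at least as
        # much as s while being no more effective.
        while frontier and frontier[-1]["cost"] >= s["cost"]:
            frontier.pop()
        frontier.append(s)
    results = []
    for prev, cur in zip(frontier, frontier[1:]):
        delta_e = cur["dalys_averted"] - prev["dalys_averted"]
        if delta_e > 0:
            results.append((cur["name"], (cur["cost"] - prev["cost"]) / delta_e))
    return results

strategies = [  # hypothetical lifetime cost (US$) and DALYs averted per patient
    {"name": "no 2nd-line",         "cost": 4000, "dalys_averted": 0.0},
    {"name": "clinical monitoring", "cost": 5200, "dalys_averted": 0.7},
    {"name": "CD4 monitoring",      "cost": 6400, "dalys_averted": 1.3},
    {"name": "POC-VL monitoring",   "cost": 8600, "dalys_averted": 2.1},
]

for name, icer in frontier_icers(strategies):
    print(f"{name}: US${icer:,.0f}/DALY averted")
```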
Abstract:
Background Simple Sequence Repeats (SSRs) are widely used in population genetic studies, but their classical development is costly and time-consuming. The ever-increasing DNA datasets generated by high-throughput techniques offer an inexpensive alternative for SSR discovery. Expressed Sequence Tags (ESTs) have been widely used as an SSR source for plants of economic relevance, but their application to non-model species is still modest. Methods Here, we explored the use of publicly available ESTs (GenBank at the National Center for Biotechnology Information-NCBI) for SSR development in non-model plants, focusing on genera listed by the International Union for the Conservation of Nature (IUCN). We also searched two model genera with fully annotated genomes, Arabidopsis and Oryza, for EST-SSRs, and used them as controls for genome distribution analyses. Overall, we downloaded 16 031 555 sequences for 258 plant genera, which were mined for SSRs and their primers with the help of QDD1. Genome distribution analyses in Oryza and Arabidopsis were done by blasting the SSR-containing sequences against the Oryza sativa and Arabidopsis thaliana reference genomes with the Basic Local Alignment Search Tool (BLAST) on the NCBI website. Finally, we performed an empirical test to determine the performance of our EST-SSRs in a few individuals from four species of two eudicot genera, Trifolium and Centaurea. Results We explored a total of 14 498 726 EST sequences from the dbEST database (NCBI) in 257 plant genera from the IUCN Red List. We identified a very large number (17 102) of ready-to-test EST-SSRs in most plant genera (193) at no cost. Overall, dinucleotide and trinucleotide repeats were the prevalent types, but the abundance of the various types of repeat differed between taxonomic groups. Control genomes revealed that trinucleotide repeats were mostly located in coding regions, while dinucleotide repeats were largely associated with untranslated regions. Our results from the empirical test revealed considerable amplification success and transferability between congenerics. Conclusions The present work represents the first large-scale study developing SSRs from publicly accessible EST databases in threatened plants. Here we provide a very large number of ready-to-test EST-SSRs (17 102) for 193 genera. The cross-species transferability suggests that the number of possible target species is large. Since trinucleotide repeats are abundant and mainly linked to exons, they might be useful in evolutionary and conservation studies. Altogether, our study strongly supports the use of EST databases as an extremely affordable and fast alternative for SSR development in threatened plants.
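The mining step itself is conceptually simple: scan each EST for short motifs repeated in tandem. Below is a minimal sketch of perfect di- and trinucleotide SSR detection with a regular expression, run on a made-up sequence; the actual pipeline used QDD1, which additionally handles redundancy filtering and primer design.

```python
# Sketch: detect perfect di-/trinucleotide SSRs in a (made-up) EST sequence.
import re

# A 2-3 bp motif repeated at least 5 times in total.
SSR_PATTERN = re.compile(r"([ACGT]{2,3})\1{4,}")

def find_ssrs(seq):
    """Yield (motif, repeat_count, start_position) for each perfect SSR."""
    for m in SSR_PATTERN.finditer(seq.upper()):
        motif = m.group(1)
        yield motif, len(m.group(0)) // len(motif), m.start()

est = "GGCAT" + "ATC" * 6 + "GGT" + "CA" * 6 + "TTG"  # fabricated example
for motif, count, start in find_ssrs(est):
    print(f"({motif}){count} at position {start}")    # (ATC)6 ..., (CA)6 ...
```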
Abstract:
Screening people without symptoms of disease is an attractive idea. Screening allows early detection of disease or elevated risk of disease, and has the potential for improved treatment and reduction of mortality. The list of future screening opportunities is set to grow because of the refinement of screening techniques, the increasing frequency of degenerative and chronic diseases, and the steadily growing body of evidence on genetic predispositions for various diseases. But how should we decide on the diseases for which screening should be done and on recommendations for how it should be implemented? We use the examples of prostate cancer and genetic screening to show the importance of considering screening as an ongoing population-based intervention with beneficial and harmful effects, and not simply the use of a test. Assessing whether screening should be recommended and implemented for any named disease is therefore a multi-dimensional task in health technology assessment. There are several countries that already use established processes and criteria to assess the appropriateness of screening. We argue that the Swiss healthcare system needs a nationwide screening commission mandated to conduct appropriate evidence-based evaluation of the impact of proposed screening interventions, to issue evidence-based recommendations, and to monitor the performance of screening programmes introduced. Without explicit processes there is a danger that beneficial screening programmes could be neglected and that ineffective, and potentially harmful, screening procedures could be introduced.
Abstract:
The concept of warning behaviors offers an additional perspective in threat assessment. Warning behaviors are acts which constitute evidence of increasing or accelerating risk. They are acute, dynamic, and particularly toxic changes in patterns of behavior which may aid in structuring a professional's judgment that an individual of concern now poses a threat - whether the actual target has been identified or not. They require an operational response. A typology of eight warning behaviors for assessing the threat of intended violence is proposed: pathway, fixation, identification, novel aggression, energy burst, leakage, directly communicated threat, and last resort warning behaviors. Previous research on risk factors associated with such warning behaviors is reviewed, and examples of each warning behavior from various intended violence cases are presented, including public figure assassination, adolescent and adult mass murder, corporate celebrity stalking, and both domestic and foreign acts of terrorism. Practical applications and future research into warning behaviors are suggested.
Abstract:
Background The release of quality data from acute care hospitals to the general public is based on the aim to inform the public, to provide transparency and to foster quality-based competition among providers. Due to the expected mechanisms of action, and possibly adverse consequences, of public quality comparison, it is a controversial topic. The perspective of physicians and nurses is of particular importance in this context. They are mainly responsible for the collection of quality-control data, and are directly confronted with the results of public comparison. The research focus of this qualitative study was to discover the views and opinions of Swiss physicians and nurses regarding these issues. It was investigated how the two professional groups appraised the opportunities as well as the risks of the release of quality data in Switzerland. Methods A qualitative approach was chosen to answer the research question. For data collection, four focus groups were conducted with physicians and nurses who were employed in Swiss acute care hospitals. Qualitative content analysis was applied to the data. Results The results revealed that both occupational groups had a very critical and negative attitude regarding the recent developments. The perceived risks dominated their view. In summary, their main concerns were: the reduction of complexity, the one-sided focus on measurable quality variables, risk selection, the threat of data manipulation and the abuse of published information by the media. An additional concern was that the impression is given that the complex construct of quality can be reduced to a few key figures, conveying a false message which then influences society and politics. This critical attitude is associated with the value system and professional self-concept of both physicians and nurses, which stand in contrast to the underlying principles of a market-based economy and the economic orientation of the health care business. Conclusions The critical and negative attitude of Swiss physicians and nurses must by all means be heeded and investigated regarding its impact on work motivation and identification with the profession. At the same time, the two professional groups are obligated to reflect upon their critical attitude and take a proactive role in the development of appropriate quality indicators for the publication of quality data in Switzerland.
Abstract:
Following an abortion in a beef herd in the summer of 2009, three outbreaks of infectious bovine rhinotracheitis (IBR) were diagnosed in the cantons of Jura and Neuchâtel. An epidemiological outbreak investigation was conducted with the aims of identifying the source of introduction of bovine herpesvirus 1 (BoHV-1) into the affected herds and of preventing further spread of the disease. The attack rates in the three outbreak farms were 0.89, 0.28 and 0, respectively. BoHV-1 could be isolated from nasal swabs of two animals originating from one of the affected farms. Comparative restriction enzyme analysis revealed slight differences between the isolates of the two animals, but a high similarity to previous BoHV-1 isolates from the canton of Jura, as well as to a French BoHV-1 isolate. This IBR outbreak has shown the importance of reporting and analyzing abortions. The current disease outbreaks recall the main risk factors for the spread of IBR in Switzerland: purchase and movement of bovines and semen of often unknown IBR status.
Abstract:
A small subset of familial pancreatic endocrine tumors (PETs) arises in patients with von Hippel-Lindau syndrome, and these tumors may have an adverse outcome compared to other familial PETs. Sporadic PETs rarely harbor somatic VHL mutations, but the chromosomal location of the VHL gene is frequently deleted in sporadic PETs. A subset of sporadic PETs shows active hypoxia signals at the mRNA and protein levels. To identify the frequency of functionally relevant VHL inactivation in sporadic PETs and to examine a possible prognostic significance, we correlated epigenetic and genetic VHL alterations with hypoxia signals. VHL mutations were absent in all 37 PETs examined. In 2 out of 35 informative PETs (6%) methylation of the VHL promoter region was detected, and VHL deletion by fluorescence in situ hybridization was found in 14 out of 79 PETs (18%). Hypoxia inducible factor 1alpha (HIF1-alpha), carbonic anhydrase 9 (CA-9), and glucose transporter 1 (GLUT-1) protein was expressed in 19, 27, and 30% of the 152 PETs examined, respectively. Protein expression of the HIF1-alpha downstream target CA-9 correlated significantly with the expression of CA-9 RNA (P<0.001), VHL RNA (P<0.05), and VHL deletion (P<0.001), as well as with HIF1-alpha (P<0.005) and GLUT-1 immunohistochemistry (P<0.001). PETs with VHL alterations and signs of hypoxia signalling were characterized by a significantly shortened disease-free survival. We conclude that VHL gene impairment by promoter methylation and VHL deletion in nearly 25% of PETs leads to activation of the HIF pathway. Our data suggest that VHL inactivation and the consecutive hypoxia signals may be a mechanism for the development of sporadic PETs with an adverse outcome.
Abstract:
Predicting the response of species to environmental changes is a great and on-going challenge for ecologists, and this requires a more in-depth understanding of the importance of biotic interactions and of population structure in the landscape. Using a reciprocal transplantation experiment, we tested the response of five species to an elevational gradient. This was combined with a neighbour-removal treatment to test the importance of local adaptation and biotic interactions. The trait studied was performance, measured as survival and biomass. Species response varied along the elevational gradient, but with no consistent pattern. Performance of species was influenced by environmental conditions occurring locally at each site, as well as by positive or negative effects of the surrounding vegetation. Indeed, we observed a shift from competition for biomass to facilitation for survival in response to the increase in environmental stress occurring across the different sites. Unlike previous studies pointing out an increase of stress along the elevational gradient, our results supported a stress gradient related to water availability, which was not strictly parallel to the elevational gradient. For three of our species, we observed greater biomass production for the population coming from the site where the species was dominant (central population) compared to populations sampled at the limit of the distribution (marginal populations). Nevertheless, we did not observe any pattern of local adaptation that could indicate adaptation of populations to a particular habitat. Altogether, our results highlighted the great ability of plant species to cope with environmental changes, with no local adaptation and great variability in response to local conditions. Our study confirms the importance of taking into account biotic interactions and population structure at the local scale when predicting community responses to global environmental changes.
Abstract:
Temperature plays a critical role in determining the biology of ectotherms. Many animals have evolved mechanisms that allow them to compensate biological rates, i.e. adjust biological rates to overcome thermodynamic effects. For low-energy organisms, such as bivalves, the costs of thermal compensation may be greater than the benefits, and thus prohibitive. To examine this, two experiments were designed to explore thermal compensation in Unio tumidus. Experiment 1 examined seasonal changes in the behaviour of U. tumidus throughout a year. Temperature had a clear effect on burrowing rate, with no evidence of compensation. Valve closure duration and frequency were also strongly affected by seasonal temperature change, but there was slight evidence of partial compensation. Experiment 2 examined oxygen consumption during burrowing, immediately following valve opening and at rest in summer (24 °C), autumn (14 °C), winter (4 °C), and spring (14 °C) acclimatized U. tumidus. Again, there was little evidence of burrowing rate compensation, but some evidence of partial compensation of valve closure duration and frequency. None of the oxygen consumption rates showed any evidence of thermal compensation. Thus, in general, there was only very limited evidence of thermal compensation of behaviour and no evidence of thermal compensation of oxygen consumption rates. Based upon this evidence, we argue that there is no evolutionary pressure for these bivalves to compensate these biological rates. Any pressure may be to maintain or even lower oxygen consumption, as their only defence against predation is to close their valves and wait. An increase in oxygen consumption would be detrimental in this regard, so the cost of thermal compensation may outweigh the benefits.
Abstract:
INTRODUCTION Spinal disc herniation, lumbar spinal stenosis and spondylolisthesis are known to be leading causes of lumbar back pain. The costs of low back pain management and related operations are continuously increasing in the healthcare sector. There are many studies regarding complications after spine surgery, but little is known about the factors predicting the length of stay in hospital. The purpose of this study was to identify these factors in lumbar spine surgery in order to adapt the postoperative treatment. MATERIAL AND METHODS The current study was carried out as a post hoc analysis on the basis of the German spine registry. Included were patients who underwent lumbar spine surgery via posterior surgical access with posterior fusion and/or rigid stabilization; procedures with dynamic stabilization were excluded. Patient characteristics were tested for association with length of stay (LOS) using bivariate and multivariate analyses. RESULTS A total of 356 patients met the inclusion criteria. The average age of all patients was 64.6 years and the mean LOS was 11.9 ± 6.0 days, with a range of 2-44 days. Independent factors influencing LOS were increased age at the time of surgery, higher body mass index, male gender, blood transfusion of 1-2 erythrocyte concentrates and the presence of surgical complications. CONCLUSION Identification of predictive factors for prolonged LOS may allow for estimation of patient hospitalization time and for optimization of postoperative care. In individual cases this may result in a reduction of the LOS.
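As a rough illustration of the multivariate step described above, the sketch below fits an ordinary least-squares model of LOS on simulated patient characteristics. All data and effect sizes are fabricated, the registry analysis itself is not reproduced, and the `statsmodels` package is assumed to be available.

```python
# Sketch: multivariable model of length of stay (LOS) on fabricated data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 356                                   # same cohort size as the study
age = rng.uniform(30, 90, n)
bmi = rng.normal(27, 4, n)
male = rng.integers(0, 2, n)
transfusion = rng.integers(0, 2, n)       # 1 = received 1-2 units
complication = rng.binomial(1, 0.15, n)   # surgical complication flag

# Simulated LOS rising with age, BMI, transfusion and complications.
los = (5 + 0.05 * age + 0.1 * bmi + 1.5 * transfusion
       + 4.0 * complication + rng.normal(0, 2, n))

X = sm.add_constant(np.column_stack([age, bmi, male, transfusion, complication]))
fit = sm.OLS(los, X).fit()
print(fit.summary(xname=["const", "age", "bmi", "male",
                         "transfusion", "complication"]))
```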