26 results for "optimising of the price hedging"
Abstract:
Most recently, the discussion about the optimal treatment for different subsets of patients suffering from coronary artery disease has re-emerged, mainly because of the uncertainty among doctors and patients regarding the phenomenon of unpredictable early and late stent thrombosis. Surgical revascularization using multiple arterial bypass grafts has repeatedly proven its superiority over percutaneous intervention techniques, especially in patients suffering from left main stem disease and three-vessel coronary disease. Several prospective randomized multicenter studies comparing early and mid-term results after PCI and CABG have been very restrictive with respect to patient enrollment, with less than 5% of all patients treated during the same period being enrolled. Coronary artery bypass grafting allows the most complete revascularization in one session, because all target coronary vessels larger than 1 mm can be bypassed in their distal segments. Once the patient has been referred for surgery, surgeons have to consider the most complete arterial revascularization in order to decrease the long-term need for re-revascularization; for instance, the patency rate of the left internal thoracic artery grafted to the distal part of the left anterior descending artery may be as high as 90-95% after 10 to 15 years. Early mortality following isolated CABG operations has been as low as 0.6 to 1% in the most recent period (reports from the University Hospital Berne and the University Hospital of Zurich). Besides these excellent results, the CABG option seems to become less expensive than PCI over time, since the need for additional PCI is rather high after initial PCI, and the price of stent devices is still very high, particularly in Switzerland.
Patients, insurers and health care experts should be better and more honestly informed about the risks and costs of PCI and CABG procedures, as well as about the much higher rate of subsequent interventions following PCI. A team approach for all patients to whom both options could be offered seems mandatory to avoid giving patients unbalanced information. Looking at the recent developments in transcatheter valve treatments, the revival of joint cardiology-cardiac surgery conferences seems to be a good option to optimize cooperation between the two medical specialties: cardiology and cardiac surgery.
Abstract:
OBJECTIVES In 2003 the International Breast Cancer Study Group (IBCSG) initiated the TEXT and SOFT randomized phase III trials to answer two questions concerning adjuvant treatment for premenopausal women with endocrine-responsive early breast cancer: (1) What is the role of aromatase inhibitors (AI) for women treated with ovarian function suppression (OFS)? (2) What is the role of OFS for women who remain premenopausal and are treated with tamoxifen? METHODS TEXT randomized patients to receive exemestane or tamoxifen with OFS. SOFT randomized patients to receive exemestane with OFS, tamoxifen with OFS, or tamoxifen alone. Treatment was for 5 years from randomization. RESULTS TEXT and SOFT successfully met their enrollment goals in 2011. The 5738 enrolled women had lower-risk disease and lower observed disease-free survival (DFS) event rates than anticipated. Consequently, 7 and 13 additional years of follow-up for TEXT and SOFT, respectively, were required to reach the targeted DFS events (median follow-up about 10.5 and 15 years). To provide timely answers, protocol amendments in 2011 specified analyses based on chronological time and median follow-up. To assess the AI question, exemestane + OFS versus tamoxifen + OFS, a combined analysis of TEXT and SOFT became the primary analysis (n = 4717). The OFS question became the primary analysis from SOFT, assessing the unique comparison of tamoxifen + OFS versus tamoxifen alone (n = 2045). The first reports are anticipated in mid- and late-2014. CONCLUSIONS We present the original designs of TEXT and SOFT and adaptations to ensure timely answers to two questions concerning optimal adjuvant endocrine treatment for premenopausal women with endocrine-responsive breast cancer. Trial Registration TEXT: ClinicalTrials.gov NCT00066703; SOFT: ClinicalTrials.gov NCT00066690.
Abstract:
BACKGROUND The use of transcatheter mitral valve repair (TMVR) has gained widespread acceptance in Europe, but data on immediate success, safety, and long-term echocardiographic follow-up in real-world patients are still limited. OBJECTIVES The aim of this multinational registry is to present a real-world overview of TMVR use in Europe. METHODS The Transcatheter Valve Treatment Sentinel Pilot Registry is a prospective, independent, consecutive collection of individual patient data. RESULTS A total of 628 patients (mean age 74.2 ± 9.7 years, 63.1% men) underwent TMVR between January 2011 and December 2012 in 25 centers in 8 European countries. The prevalent pathogenesis was functional mitral regurgitation (FMR) (n = 452 [72.0%]). The majority of patients (85.5%) were highly symptomatic (New York Heart Association functional class III or higher), with a high logistic EuroSCORE (European System for Cardiac Operative Risk Evaluation) (20.4 ± 16.7%). Acute procedural success was high (95.4%) and similar in FMR and degenerative mitral regurgitation (p = 0.662). One clip was implanted in 61.4% of patients. In-hospital mortality was low (2.9%), without significant differences between groups. The estimated 1-year mortality was 15.3%, which was similar for FMR and degenerative mitral regurgitation. The estimated 1-year rate of rehospitalization because of heart failure was 22.8%, significantly higher in the FMR group (25.8% vs. 12.0%, p[log-rank] = 0.009). Paired echocardiographic data from the 1-year follow-up, available for 368 consecutive patients in 15 centers, showed a persistent reduction in the degree of mitral regurgitation at 1 year (6.0% of patients with severe mitral regurgitation). CONCLUSIONS This independent, contemporary registry shows that TMVR is associated with high immediate success, low complication rates, and sustained 1-year reduction of the severity of mitral regurgitation and improvement of clinical symptoms.
Abstract:
AIMS/HYPOTHESIS Plasminogen activator inhibitor-1 (PAI-1) has been regarded as the main antifibrinolytic protein in diabetes, but recent work indicates that complement C3 (C3), an inflammatory protein, directly compromises fibrinolysis in type 1 diabetes. The aim of the current project was to investigate associations between C3 and fibrinolysis in a large cohort of individuals with type 2 diabetes. METHODS Plasma levels of C3, C-reactive protein (CRP), PAI-1 and fibrinogen were analysed by ELISA in 837 patients enrolled in the Edinburgh Type 2 Diabetes Study. Fibrin clot lysis was analysed using a validated turbidimetric assay. RESULTS Clot lysis time correlated with C3 and PAI-1 plasma levels (r = 0.24, p < 0.001 and r = 0.22, p < 0.001, respectively). In a multivariable regression model involving age, sex, BMI, C3, PAI-1, CRP and fibrinogen, and using log-transformed data as appropriate, C3 was associated with clot lysis time (regression coefficient 0.227 [95% CI 0.161, 0.292], p < 0.001), as was PAI-1 (regression coefficient 0.033 [95% CI 0.020, 0.064], p < 0.05) but not fibrinogen (regression coefficient 0.003 [95% CI -0.046, 0.051], p = 0.92) or CRP (regression coefficient 0.024 [95% CI -0.008, 0.056], p = 0.14). No correlation was demonstrated between plasma levels of C3 and PAI-1 (r = -0.03, p = 0.44), consistent with previous observations that the two proteins affect different pathways in the fibrinolytic system. CONCLUSIONS/INTERPRETATION Similarly to PAI-1, C3 plasma levels are independently associated with fibrin clot lysis in individuals with type 2 diabetes. Therefore, future studies should analyse C3 plasma levels as a surrogate marker of fibrinolysis potential in this population.
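The multivariable model described in this abstract can be sketched in miniature as follows. This is a purely illustrative, self-contained ordinary-least-squares fit on synthetic data (none of the numbers come from the Edinburgh study; the two retained predictors, log-transformed C3 and PAI-1, and the coefficient magnitudes are merely patterned on the reported results):

```python
import math
import random

def ols(X, y):
    """Ordinary least squares via the normal equations (X'X)b = X'y,
    solved with Gaussian elimination. X is a list of rows, each with a
    leading 1.0 for the intercept."""
    k = len(X[0])
    A = [[sum(row[i] * row[j] for row in X) for j in range(k)] for i in range(k)]
    b = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(k)]
    for col in range(k):                      # forward elimination with pivoting
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * k                          # back substitution
    for r in range(k - 1, -1, -1):
        coef[r] = (b[r] - sum(A[r][c] * coef[c] for c in range(r + 1, k))) / A[r][r]
    return coef

# Synthetic cohort: clot lysis time driven by log-transformed C3 and PAI-1.
rng = random.Random(0)
rows, lysis = [], []
for _ in range(500):
    c3, pai1 = rng.uniform(0.5, 2.0), rng.uniform(1.0, 40.0)
    rows.append([1.0, math.log(c3), math.log(pai1)])
    lysis.append(9.0 + 0.23 * math.log(c3) + 0.03 * math.log(pai1)
                 + rng.gauss(0.0, 0.01))
intercept, b_c3, b_pai1 = ols(rows, lysis)
print(round(b_c3, 2), round(b_pai1, 2))  # should recover roughly 0.23 and 0.03
```

In practice a study like this would use a statistics package rather than hand-rolled normal equations; the sketch only shows the shape of the model, i.e. a linear predictor on log-transformed covariates.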
Abstract:
This study compares the performance of four commonly used approaches to measure consumers' willingness to pay with real purchase data (REAL): the open-ended (OE) question format; choice-based conjoint (CBC) analysis; Becker, DeGroot, and Marschak's (BDM) incentive-compatible mechanism; and incentive-aligned choice-based conjoint (ICBC) analysis. With this five-in-one approach, the authors test the relative strengths of the four measurement methods, using REAL as the benchmark, on the basis of statistical criteria and decision-relevant metrics. The results indicate that the BDM and ICBC approaches can pass statistical and decision-oriented tests. The authors find that respondents are more price sensitive in incentive-aligned settings than in non-incentive-aligned settings and the REAL setting. Furthermore, they find a larger number of "none" choices under ICBC than under hypothetical conjoint analysis. This study uncovers an intriguing possibility: Even when the OE format and CBC analysis generate hypothetical bias, they may still lead to the right demand curves and right pricing decisions.
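The BDM mechanism named in this abstract follows a simple protocol: the subject states a willingness to pay, a sale price is then drawn at random, and the subject buys (at the drawn price, not the stated one) if and only if the draw does not exceed the bid. Because the stated bid never determines the price paid, truthful bidding is a dominant strategy. A minimal sketch (function name and price range are illustrative, not from the study):

```python
import random

def bdm_purchase(stated_wtp, price_draw=None, price_range=(0.0, 10.0), rng=None):
    """Becker-DeGroot-Marschak mechanism: buy at the randomly drawn price
    iff the draw is at or below the subject's stated willingness to pay."""
    rng = rng or random.Random()
    price = price_draw if price_draw is not None else rng.uniform(*price_range)
    if price <= stated_wtp:
        return {"buys": True, "price_paid": price}
    return {"buys": False, "price_paid": 0.0}

# A subject who values the good at 6.0 buys whenever the draw is <= 6.0,
# and pays the drawn price, so overstating or understating cannot help.
print(bdm_purchase(6.0, price_draw=4.5))  # buys, pays 4.5
print(bdm_purchase(6.0, price_draw=7.2))  # no purchase
```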
Abstract:
In land systems, equitably managing trade-offs between planetary boundaries and human development needs represents a grand challenge in sustainability-oriented initiatives. Informing such initiatives requires knowledge about the nexus between land use, poverty, and environment. This paper presents results from Lao PDR, where we combined nationwide spatial data on land use types and the environmental state of landscapes with village-level poverty indicators. Our analysis reveals two general but contrasting trends. First, landscapes with paddy or permanent agriculture allow a greater number of people to live in less poverty but come at the price of a decrease in natural vegetation cover. Second, people practising extensive swidden agriculture and living in intact environments are often better off than people in degraded paddy or permanent agriculture. As poverty rates within different landscape types vary more than between landscape types, we cannot stipulate a land use–poverty–environment nexus. However, the distinct spatial patterns or configurations of these rates point to other important factors at play. Drawing on ethnicity as a proximate factor for endogenous development potentials and accessibility as a proximate factor for external influences, we further explore these linkages. Ethnicity is strongly related to poverty in all land use types, almost independently of accessibility, implying that social distance outweighs geographic or physical distance. In turn, accessibility, almost a precondition for poverty alleviation, is mainly beneficial to ethnic majority groups and people living in paddy or permanent agriculture. These groups are able to translate improved accessibility into poverty alleviation. Our results show that the concurrence of external influences with local, highly contextual development potentials is key to shaping outcomes of the land use–poverty–environment nexus.
By addressing such leverage points, these findings help guide more effective development interventions. At the same time, they point to the need in land change science to better integrate the understanding of place-based land indicators with process-based drivers of land use change.
Weather and War – Economic and social vulnerability in Switzerland at the end of the First World War
Abstract:
Neutral Switzerland, not embedded in the fighting forces, was nevertheless involved in the Great War, mainly in economic terms. Since Switzerland is a landlocked country, agriculture in particular became an important topic of the war economy with regard to food security. Until 1916 the national food supply was limited but could be maintained through barter trade. In 1916 a crisis at both the supply and the production level occurred, leading to a decline in food availability and to immense price rises that caused social turmoil. This paper aims to outline the factors of vulnerability with respect to food in Switzerland during the First World War, and it will further show the different coping strategies that were undertaken during that time. The paper takes into consideration the work of Mario Aeby and Christian Pfister (University of Bern), which pointed to weather anomalies during the years 1916 and 1917 that aggravated the already tense food situation. Arguing for an overlap of supply and production crises, the paper focuses on agricultural and economic history, including environmental impacts. Furthermore, the paper addresses the question of what makes a food system resilient to such unforeseen impacts.
Abstract:
The most recent comprehensive assessment carried out by the Intergovernmental Panel on Climate Change has concluded that "Human influence on the climate system is clear," a headline statement that was approved by all governments in consensus. This influence will have long-lasting consequences for ecosystems, and the resulting impacts will continue to be felt millennia from now. Although the terrestrial impacts of climate change are readily apparent now and have received widespread public attention, the effects of climate change on the oceans have been relatively invisible. However, the world ocean provides a number of crucial services that are of global significance, all of which come at an increasing price because of human activities. This needs to be taken into account when considering adaptation to and mitigation of anthropogenic climate change.
Abstract:
Bargaining is the building block of many economic interactions, ranging from bilateral to multilateral encounters and from situations in which the actors are individuals to negotiations between firms or countries. In all these settings, economists have long been intrigued by the fact that some projects, trades or agreements are not realized even though they are mutually beneficial. On the one hand, this has been explained by incomplete information. A firm may not be willing to offer a wage that is acceptable to a qualified worker, because it knows that there are also unqualified workers and cannot distinguish between the two types. This phenomenon is known as adverse selection. On the other hand, it has been argued that even with complete information, the presence of externalities may impede efficient outcomes. To see this, consider the example of climate change. If a subset of countries agrees to curb emissions, non-participant regions benefit from the signatories' efforts without incurring costs. These free-riding opportunities give rise to incentives to strategically improve one's bargaining power that work against the formation of a global agreement. This thesis is concerned with extending our understanding of both factors, adverse selection and externalities. The findings are based on empirical evidence from original laboratory experiments as well as game-theoretic modeling. On a very general note, it is demonstrated that the institutions through which agents interact matter to a large extent. Insights are provided about which institutions we should expect to perform better than others, at least in terms of aggregate welfare. Chapters 1 and 2 focus on the problem of adverse selection. Effective operation of markets and other institutions often depends on good information transmission properties.
In terms of the example introduced above, a firm is only willing to offer high wages if it receives enough positive signals about the worker's quality during the application and wage bargaining process. In Chapter 1, it will be shown that repeated interaction coupled with time costs facilitates information transmission. By making the wage bargaining process costly for the worker, the firm is able to obtain more accurate information about the worker's type. The cost could be pure time cost from delaying agreement or the cost of effort arising from a multi-step interviewing process. In Chapter 2, I abstract from time cost and show that communication can play a similar role. The simple fact that a worker claims to be of high quality may be informative. In Chapter 3, the focus is on a different source of inefficiency. Agents strive for bargaining power and thus may be motivated by incentives that are at odds with the socially efficient outcome. I have already mentioned the example of climate change. Other examples are coalitions within committees that are formed to secure voting power to block outcomes, or groups that commit to different technological standards although a single standard would be optimal (e.g. the format war between HD DVD and Blu-ray). It will be shown that such inefficiencies are directly linked to the presence of externalities and a certain degree of irreversibility in actions. I now discuss the three articles in more detail. In Chapter 1, Olivier Bochet and I study a simple bilateral bargaining institution that eliminates trade failures arising from incomplete information. In this setting, a buyer makes offers to a seller in order to acquire a good. Whenever an offer is rejected by the seller, the buyer may submit a further offer. Bargaining is costly, because both parties suffer a (small) time cost after any rejection. The difficulties arise because the good can be of low or high quality and the quality of the good is only known to the seller.
Indeed, without the possibility to make repeated offers, it is too risky for the buyer to offer prices that allow for trade of high-quality goods. When allowing for repeated offers, however, at equilibrium both types of goods trade with probability one. We provide an experimental test of these predictions. Buyers gather information about sellers using specific price offers, and rates of trade are high, in line with the model's qualitative predictions. We also observe a persistent over-delay before trade occurs, and this reduces efficiency substantially. Possible channels for over-delay are identified in the form of two behavioral assumptions missing from the standard model, loss aversion (buyers) and haggling (sellers), which reconcile the data with the theoretical predictions. Chapter 2 also studies adverse selection, but interaction between buyers and sellers now takes place within a market rather than in isolated pairs. Remarkably, in a market it suffices to let agents communicate in a very simple manner to mitigate trade failures. The key insight is that better-informed agents (sellers) are willing to truthfully reveal their private information, because by doing so they are able to reduce search frictions and attract more buyers. Behavior observed in the experimental sessions closely follows the theoretical predictions. As a consequence, costless and non-binding communication (cheap talk) significantly raises rates of trade and welfare. Previous experiments have documented that cheap talk alleviates inefficiencies due to asymmetric information. These findings are explained by pro-social preferences and lie aversion. I use appropriate control treatments to show that such considerations play only a minor role in our market. Instead, the experiment highlights the ability to organize markets as a new channel through which communication can facilitate trade in the presence of private information.
In Chapter 3, I theoretically explore coalition formation via multilateral bargaining under complete information. The environment studied is extremely rich in the sense that the model allows for all kinds of externalities. This is achieved by using so-called partition functions, which pin down a coalitional worth for each possible coalition in each possible coalition structure. It is found that although binding agreements can be written, efficiency is not guaranteed, because the negotiation process is inherently non-cooperative. The prospects of cooperation are shown to crucially depend on i) the degree to which players can renegotiate and gradually build up agreements and ii) the absence of a certain type of externalities that can loosely be described as incentives to free ride. Moreover, the willingness to concede bargaining power is identified as a novel reason for gradualism. Another key contribution of the study is that it identifies a strong connection between the Core, one of the most important concepts in cooperative game theory, and the set of environments for which efficiency is attained even without renegotiation.
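The Core mentioned at the end of this abstract has a simple operational definition that can be sketched in code. The sketch below uses a characteristic function (each coalition's worth is fixed, i.e. no externalities) rather than the partition functions of the thesis, and the three-player game and its numbers are entirely hypothetical: an allocation is in the Core if it splits the grand coalition's worth exactly and no coalition could do better by breaking away.

```python
from itertools import chain, combinations

def coalitions(players):
    """All non-empty subsets of the player set, as sorted tuples."""
    return chain.from_iterable(combinations(players, r)
                               for r in range(1, len(players) + 1))

def in_core(v, allocation):
    """Core membership check for a characteristic-function game:
    the allocation must be efficient (sum exactly to v(N)) and
    unblocked (v(S) <= allocated payoff of S for every coalition S)."""
    players = tuple(sorted(allocation))
    if abs(sum(allocation.values()) - v[players]) > 1e-9:
        return False                     # not efficient
    return all(v[S] <= sum(allocation[i] for i in S) + 1e-9
               for S in coalitions(players))

# Hypothetical 3-player game: players 1 and 2 are highly productive together.
v = {(1,): 0, (2,): 0, (3,): 0,
     (1, 2): 8, (1, 3): 4, (2, 3): 4, (1, 2, 3): 9}
print(in_core(v, {1: 3, 2: 3, 3: 3}))  # False: {1,2} gets 6 < v({1,2}) = 8
print(in_core(v, {1: 4, 2: 4, 3: 1}))  # True: no coalition can block
```

The equal split fails precisely because the coalition {1, 2} could break away and secure 8 on its own; this blocking logic is what connects the Core to the efficiency results described in the abstract.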