41 results for Kaski, Antti: The security complex: a theoretical analysis and the Baltic case
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Bargaining is the building block of many economic interactions, ranging from bilateral to multilateral encounters and from situations in which the actors are individuals to negotiations between firms or countries. In all these settings, economists have long been intrigued by the fact that some projects, trades or agreements are not realized even though they are mutually beneficial. On the one hand, this has been explained by incomplete information. A firm may not be willing to offer a wage that is acceptable to a qualified worker, because it knows that there are also unqualified workers and cannot distinguish between the two types. This phenomenon is known as adverse selection. On the other hand, it has been argued that even with complete information, the presence of externalities may impede efficient outcomes. To see this, consider the example of climate change. If a subset of countries agrees to curb emissions, non-participant regions benefit from the signatories' efforts without incurring costs. These free-riding opportunities give rise to incentives to strategically improve one's bargaining power that work against the formation of a global agreement. This thesis is concerned with extending our understanding of both factors, adverse selection and externalities. The findings are based on empirical evidence from original laboratory experiments as well as game-theoretic modeling. At a very general level, it is demonstrated that the institutions through which agents interact matter to a large extent. Insights are provided about which institutions we should expect to perform better than others, at least in terms of aggregate welfare. Chapters 1 and 2 focus on the problem of adverse selection. Effective operation of markets and other institutions often depends on good information transmission properties.
In terms of the example introduced above, a firm is only willing to offer high wages if it receives enough positive signals about the worker's quality during the application and wage bargaining process. In Chapter 1, it will be shown that repeated interaction coupled with time costs facilitates information transmission. By making the wage bargaining process costly for the worker, the firm is able to obtain more accurate information about the worker's type. The cost could be a pure time cost from delaying agreement or a cost of effort arising from a multi-step interviewing process. In Chapter 2, I abstract from time costs and show that communication can play a similar role. The simple fact that a worker claims to be of high quality may be informative. In Chapter 3, the focus is on a different source of inefficiency. Agents strive for bargaining power and thus may be motivated by incentives that are at odds with the socially efficient outcome. I have already mentioned the example of climate change. Other examples are coalitions within committees that are formed to secure voting power to block outcomes, or groups that commit to different technological standards although a single standard would be optimal (e.g. the format war between HD DVD and Blu-ray). It will be shown that such inefficiencies are directly linked to the presence of externalities and a certain degree of irreversibility in actions. I now discuss the three articles in more detail. In Chapter 1, Olivier Bochet and I study a simple bilateral bargaining institution that eliminates trade failures arising from incomplete information. In this setting, a buyer makes offers to a seller in order to acquire a good. Whenever an offer is rejected by the seller, the buyer may submit a further offer. Bargaining is costly because both parties suffer a (small) time cost after any rejection. The difficulties arise because the good can be of low or high quality, and the quality of the good is known only to the seller.
Indeed, without the possibility to make repeated offers, it is too risky for the buyer to offer prices that allow for trade of high-quality goods. When repeated offers are allowed, however, both types of goods trade with probability one in equilibrium. We provide an experimental test of these predictions. Buyers gather information about sellers using specific price offers, and rates of trade are high, in line with the model's qualitative predictions. We also observe a persistent over-delay before trade occurs, which reduces efficiency substantially. Possible channels for over-delay are identified in the form of two behavioral assumptions missing from the standard model, loss aversion (buyers) and haggling (sellers), which reconcile the data with the theoretical predictions. Chapter 2 also studies adverse selection, but interaction between buyers and sellers now takes place within a market rather than in isolated pairs. Remarkably, in a market it suffices to let agents communicate in a very simple manner to mitigate trade failures. The key insight is that better-informed agents (sellers) are willing to truthfully reveal their private information, because by doing so they are able to reduce search frictions and attract more buyers. Behavior observed in the experimental sessions closely follows the theoretical predictions. As a consequence, costless and non-binding communication (cheap talk) significantly raises rates of trade and welfare. Previous experiments have documented that cheap talk alleviates inefficiencies due to asymmetric information. These findings are explained by pro-social preferences and lie aversion. I use appropriate control treatments to show that such considerations play only a minor role in our market. Instead, the experiment highlights the ability to organize markets as a new channel through which communication can facilitate trade in the presence of private information.
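The single-offer breakdown that motivates Chapters 1 and 2 can be made concrete with a stylized Akerlof-style computation (a minimal sketch with hypothetical numbers, not the thesis model):

```python
# Stylized lemons computation (hypothetical numbers, not the thesis model).
# A buyer makes one take-it-or-leave-it offer; a seller accepts iff the
# price covers her cost, which depends on the good's quality.

def buyer_payoff(price, p_high, v_high, v_low, c_high, c_low):
    """Buyer's expected payoff from a single offer `price`."""
    if price >= c_high:   # both quality types accept: buyer pays for the average
        return p_high * v_high + (1 - p_high) * v_low - price
    if price >= c_low:    # only low-quality sellers accept
        return (1 - p_high) * (v_low - price)
    return 0.0            # nobody accepts

# High quality is worth trading (v_high = 10 > c_high = 8), yet ...
high_offer = buyer_payoff(8, p_high=0.5, v_high=10, v_low=4, c_high=8, c_low=1)
low_offer = buyer_payoff(1, p_high=0.5, v_high=10, v_low=4, c_high=8, c_low=1)
# ... pooling makes the high offer a loss (-1) while a lowball offer earns 1.5,
# so high-quality goods never trade in one shot.
```

Repeated offers with per-round time costs change this calculus: rejecting a low offer is cheap for a high-cost seller but wasteful for a low-cost one, so waiting itself transmits information about the seller's type.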
In Chapter 3, I theoretically explore coalition formation via multilateral bargaining under complete information. The environment studied is extremely rich in the sense that the model allows for all kinds of externalities. This is achieved by using so-called partition functions, which pin down a coalitional worth for each possible coalition in each possible coalition structure. It is found that although binding agreements can be written, efficiency is not guaranteed, because the negotiation process is inherently non-cooperative. The prospects of cooperation are shown to crucially depend on i) the degree to which players can renegotiate and gradually build up agreements and ii) the absence of a certain type of externalities that can loosely be described as incentives to free ride. Moreover, the willingness to concede bargaining power is identified as a novel reason for gradualism. Another key contribution of the study is that it identifies a strong connection between the Core, one of the most important concepts in cooperative game theory, and the set of environments for which efficiency is attained even without renegotiation.
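As a minimal illustration of a partition function (hypothetical worths, not the thesis model), the same coalition can be assigned different worths in different coalition structures, which is exactly how externalities and free-riding incentives are encoded:

```python
# A partition function maps *embedded* coalitions -- a coalition together
# with the structure it sits in -- to worths. Numbers are hypothetical,
# chosen so that staying out of an agreement pays (free riding).
F = frozenset
grand = F({F({1, 2, 3})})
split1 = F({F({1}), F({2, 3})})

pf = {
    (F({1, 2, 3}), grand): 9,
    (F({1}), split1): 4,      # player 1 benefits from {2,3}'s effort at no cost
    (F({2, 3}), split1): 3,
}

def structure_total(structure):
    """Aggregate worth generated under a coalition structure."""
    return sum(pf[(coalition, structure)] for coalition in structure)

# The grand coalition is efficient (9 > 4 + 3), but an equal split gives
# player 1 only 3, while free riding under `split1` gives 4.
```

This is the sense in which positive externalities on outsiders can work against the efficient agreement even when binding contracts are available.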
Abstract:
This article seeks to contribute to the illumination of the so-called 'paradox of voting', using the German Bundestag elections of 1998 as an empirical case. Downs' model of voter participation is extended to include elements of the theory of subjective expected utility (SEU). This allows a theoretical and empirical exploration of the crucial mechanisms behind individual voters' decisions to participate, or abstain from voting, in the German general election of 1998. It is argued that the vanishingly small probability that an individual citizen's vote decides the election outcome does not necessarily reduce the probability of electoral participation. The empirical analysis is largely based on data from the ALLBUS 1998. It confirms the predictions derived from SEU theory. The voters' expected benefits and their subjective expectation of being able to influence government policy by voting are the crucial mechanisms explaining participation. By contrast, the explanatory contribution of perceived information and opportunity costs is low.
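The participation calculus behind this argument can be sketched as follows (a hypothetical illustration in the spirit of Downs; the article's SEU specification is richer):

```python
def votes(p_subjective, benefit, cost):
    """Stylized SEU rule (illustrative, not the article's exact model):
    vote iff the subjectively expected benefit of influencing policy
    exceeds the cost of participation."""
    return p_subjective * benefit - cost > 0

# With the *objective* pivot probability, turnout should collapse ...
objective = votes(1e-8, benefit=500, cost=0.5)
# ... but SEU theory substitutes the voter's *subjective* expectation of
# influencing government policy, which can be far larger.
subjective = votes(0.05, benefit=500, cost=0.5)
```

The point of the extension is that the decision hinges on the subjective belief term, not on the objectively negligible pivot probability.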
Abstract:
We present a derivation and, based on it, an extension of a model originally proposed by V.G. Niziev to describe continuous wave laser cutting of metals. Starting from a local energy balance and by incorporating heat removal through heat conduction to the bulk material, we find a differential equation for the cutting profile. This equation is solved numerically and yields, besides the cutting profiles, the maximum cutting speed, the absorptivity profiles, and other relevant quantities. Our main goal is to demonstrate the model's capability to explain some of the experimentally observed differences between laser cutting at wavelengths of around 1 and 10 μm. To compare our numerical results to experimental observations, we perform simulations for exactly the same material and laser beam parameters as those used in a recent comparative experimental study. Generally, we find good agreement between theoretical and experimental results and show that the main differences between laser cutting with 1- and 10-μm beams arise from the different absorptivity profiles and absorbed intensities. In particular, the latter suggests that the energy transfer, and thus the laser cutting process, is more efficient in the case of laser cutting with 1-μm beams.
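The full model resolves the cutting-front profile numerically; the gist of the underlying energy balance can be conveyed by a much cruder global version (a back-of-the-envelope sketch with hypothetical steel-like numbers, not the paper's equations):

```python
def max_cutting_speed(absorptivity, power_W, thickness_m, kerf_m,
                      rho, c_p, dT_melt, L_m):
    """Global power balance for melt cutting, ignoring conduction losses
    (so this *over*estimates the speed): absorbed laser power equals the
    rate at which kerf material is heated to the melt and fused,
        A * P = d * b * v * rho * (c_p * dT + L_m).
    """
    return absorptivity * power_W / (
        thickness_m * kerf_m * rho * (c_p * dT_melt + L_m))

# Hypothetical 2 mm steel-like sheet cut with a 4 kW beam:
v = max_cutting_speed(absorptivity=0.4, power_W=4000, thickness_m=2e-3,
                      kerf_m=0.3e-3, rho=7800, c_p=600, dT_melt=1500,
                      L_m=2.7e5)   # roughly 0.3 m/s
```

In this balance the wavelength enters only through the absorptivity and the absorbed intensity at the cutting front, which is precisely where the paper locates the 1 μm vs 10 μm difference.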
Abstract:
In recent decades, affine algebraic varieties and Stein manifolds with big (infinite-dimensional) automorphism groups have been intensively studied. Several notions expressing that the automorphism group is big have been proposed. All of them imply that the manifold in question is an Oka–Forstnerič manifold. This important notion has also recently emerged from the intensive studies around the homotopy principle in Complex Analysis. This homotopy principle, which goes back to the 1930s, has had an enormous impact on the development of the area of Several Complex Variables, and the number of its applications is constantly growing. In this overview chapter we present three classes of properties: (1) density property, (2) flexibility, and (3) Oka–Forstnerič. For each class we give the relevant definitions and its most significant features, and explain the known implications between all these properties. Many difficult mathematical problems could be solved by applying the developed theory; we indicate some of the most spectacular ones.
Abstract:
Many biological processes depend on the sequential assembly of protein complexes. However, studying the kinetics of such processes by direct methods is often not feasible. As an important class of such protein complexes, pore-forming toxins start their journey as soluble monomeric proteins, and oligomerize into transmembrane complexes to eventually form pores in the target cell membrane. Here, we monitored pore formation kinetics for the well-characterized bacterial pore-forming toxin aerolysin in single cells in real time to determine the lag times leading to the formation of the first functional pores per cell. Probabilistic modeling of these lag times revealed that one slow and seven equally fast rate-limiting reactions best explain the overall pore formation kinetics. The model predicted that monomer activation is the rate-limiting step for the entire pore formation process. We hypothesized that this could be through release of a propeptide and indeed found that peptide removal abolished these steps. This study illustrates how stochasticity in the kinetics of a complex process can be exploited to identify rate-limiting mechanisms underlying multistep biomolecular assembly pathways.
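The best-fitting kinetic scheme described above -- one slow step followed by seven equally fast ones -- implies a hypoexponential lag-time distribution, which is straightforward to sample (the rates below are hypothetical placeholders, not the fitted values):

```python
import random

def lag_time(rng, k_slow=0.05, k_fast=1.0, n_fast=7):
    """One draw of a pore-formation lag time: the sum of one slow and
    n_fast fast exponential (memoryless) waiting times, as in the
    abstract's best-fitting model. Rates are hypothetical."""
    t = rng.expovariate(k_slow)
    for _ in range(n_fast):
        t += rng.expovariate(k_fast)
    return t

rng = random.Random(0)
mean_lag = sum(lag_time(rng) for _ in range(20000)) / 20000
# Theoretical mean: 1/k_slow + n_fast/k_fast = 20 + 7 = 27
```

Because 1/k_slow dominates the mean, the distribution is nearly exponential with a shoulder at short times; that shape is what allows a fit to single out one slow step (standing in here for monomer activation) among many fast ones.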
Abstract:
By switching the level of analysis and aggregating data from the micro-level of individual cases to the macro-level, quantitative data can be analysed within a more case-based approach. This paper presents such an approach in two steps: In a first step, it discusses the combination of Social Network Analysis (SNA) and Qualitative Comparative Analysis (QCA) in a sequential mixed-methods research design. In such a design, quantitative social network data on individual cases and their relations at the micro-level are used to describe the structure of the network that these cases constitute at the macro-level. Different network structures can then be compared by QCA. This strategy allows adding an element of potential causal explanation to SNA, while SNA-indicators allow for a systematic description of the cases to be compared by QCA. Because mixing methods can be a promising, but also a risky endeavour, the methodological part also discusses the possibility that underlying assumptions of both methods could clash. In a second step, the research design presented beforehand is applied to an empirical study of policy network structures in Swiss politics. Through a comparison of 11 policy networks, causal paths that lead to a conflictual or consensual policy network structure are identified and discussed. The analysis reveals that different theoretical factors matter and that multiple conjunctural causation is at work. Based on both the methodological discussion and the empirical application, it appears that a combination of SNA and QCA can represent a helpful methodological design for social science research and a possibility of using quantitative data with a more case-based approach.
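The aggregation step at the heart of this design can be sketched in a few lines (hypothetical toy cases; real QCA would use calibrated set memberships and Boolean minimization):

```python
def density(n_nodes, edges):
    """Macro-level SNA indicator: realized ties / possible undirected ties."""
    return len(edges) / (n_nodes * (n_nodes - 1) / 2)

# Micro-level relational data per policy network (hypothetical cases).
cases = {
    "network_A": (5, {(1, 2), (1, 3), (2, 4)}),
    "network_B": (5, {(1, 2)}),
}

# Dichotomize the indicator into a crisp-set QCA condition ("conflictual"),
# turning each whole network into one macro-level case for comparison.
conflictual = {name: int(density(n, edges) >= 0.25)
               for name, (n, edges) in cases.items()}
```

The SNA step produces systematic case descriptions at the macro level; the QCA step then compares those cases to identify causal paths, which is the division of labour the paper proposes.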
Abstract:
Immediate breast reconstruction (IBR) has become an established procedure for women requiring mastectomy. Traditionally, the nipple-areola complex (NAC) is resected during this procedure. The NAC, in turn, is a principal factor determining the aesthetic outcome after breast reconstruction, and due to its particular texture and shape, a natural-looking NAC can hardly be reconstructed with other tissues. The aim of this study was to assess the oncological safety as well as the morbidity and aesthetic outcome of replantation of the NAC some days after IBR. A retrospective analysis of 85 patients undergoing 88 mastectomies with IBR between 1998 and 2007 was conducted. The NAC (n=29) or the nipple alone (n=23) was replanted 7 days (median, range 2-10 days) after IBR in 49 patients, provided the subareolar tissue was histologically negative for tumour infiltration. The local recurrence rate was assessed after 49 months (median, range 6-120 months). Aesthetic outcome was evaluated by clinical assessment during routine follow-up at least 12 months after the last intervention. Malignant involvement of the subareolar tissue was found in eight cases (9.1%). Patients qualifying for NAC replantation were in stage 0 in 29%, stage I in 15%, stage IIa in 31%, stage IIb in 17% and stage III in 8%. Total or partial necrosis occurred in 69% and 26% if the entire NAC or only the nipple was replanted, respectively (P<0.01). Depigmentation was seen in 52%, and corrective surgery was performed in 11 out of 52 NAC or nipple replantations. Local recurrence and isolated regional lymph node metastasis were each observed in a single case. Another 5.8% of the patients showed distant metastases. We conclude that replantation of the NAC in IBR is oncologically safe, provided the subareolar tissue is free of tumour. However, the long-term aesthetic outcome of NAC replantation is not satisfactory, which argues for replanting the nipple alone.
Abstract:
Background Parasitic wasps constitute one of the largest groups of venomous animals. Although some physiological effects of their venoms are well documented, relatively little is known at the molecular level about the protein composition of these secretions. To identify the majority of the venom proteins of the endoparasitoid wasp Chelonus inanitus (Hymenoptera: Braconidae), we randomly sequenced 2111 expressed sequence tags (ESTs) from a venom gland cDNA library. In parallel, proteins from pure venom were separated by gel electrophoresis and individually submitted to nano-LC-MS/MS analysis, allowing comparison of peptide and EST sequences. Results About 60% of the sequenced ESTs encoded proteins whose presence in the venom was attested by mass spectrometry. Most of the remaining ESTs corresponded to gene products likely involved in the transcriptional and translational machinery of venom gland cells. In addition, a small number of transcripts were found to encode proteins that share sequence similarity with well-known venom constituents of social hymenopteran species, such as hyaluronidase-like proteins and an Allergen-5 protein. An overall number of 29 venom proteins could be identified through the combination of EST sequencing and proteomic analyses. The most highly redundant set of ESTs encoded a protein that shared sequence similarity with a venom protein of unknown function, potentially specific to the Chelonus lineage. Venom components specific to C. inanitus included a C-type lectin domain containing protein, a chemosensory protein-like protein, a protein related to yellow-e3 and ten new proteins which shared no significant sequence similarity with known sequences. In addition, several venom proteins potentially able to interact with chitin were also identified, including a chitinase, an imaginal disc growth factor-like protein and two putative mucin-like peritrophins.
Conclusions The combined approaches made it possible to discriminate between cellular proteins and true venom proteins. The venom of C. inanitus appears to be a mixture of conserved venom components and potentially lineage-specific proteins. These new molecular data enrich our knowledge of parasitoid venoms and, more generally, might contribute to a better understanding of the evolution and functional diversity of venom proteins within Hymenoptera.