Abstract:
Bargaining is the building block of many economic interactions, ranging from bilateral to multilateral encounters and from situations in which the actors are individuals to negotiations between firms or countries. In all these settings, economists have long been intrigued by the fact that some projects, trades or agreements are not realized even though they are mutually beneficial. On the one hand, this has been explained by incomplete information. A firm may not be willing to offer a wage that is acceptable to a qualified worker, because it knows that there are also unqualified workers and cannot distinguish between the two types. This phenomenon is known as adverse selection. On the other hand, it has been argued that even with complete information, the presence of externalities may impede efficient outcomes. To see this, consider the example of climate change. If a subset of countries agrees to curb emissions, non-participant regions benefit from the signatories’ efforts without incurring costs. These free-riding opportunities give rise to incentives to strategically improve one's bargaining power that work against the formation of a global agreement. This thesis is concerned with extending our understanding of both factors, adverse selection and externalities. The findings are based on empirical evidence from original laboratory experiments as well as game-theoretic modeling. On a very general note, it is demonstrated that the institutions through which agents interact matter to a large extent. Insights are provided about which institutions we should expect to perform better than others, at least in terms of aggregate welfare. Chapters 1 and 2 focus on the problem of adverse selection. Effective operation of markets and other institutions often depends on good information transmission properties.
In terms of the example introduced above, a firm is only willing to offer high wages if it receives enough positive signals about the worker’s quality during the application and wage bargaining process. In Chapter 1, it will be shown that repeated interaction coupled with time costs facilitates information transmission. By making the wage bargaining process costly for the worker, the firm is able to obtain more accurate information about the worker’s type. The cost could be pure time cost from delaying agreement or cost of effort arising from a multi-step interviewing process. In Chapter 2, I abstract from time cost and show that communication can play a similar role. The simple fact that a worker claims to be of high quality may be informative. In Chapter 3, the focus is on a different source of inefficiency. Agents strive for bargaining power and thus may be motivated by incentives that are at odds with the socially efficient outcome. I have already mentioned the example of climate change. Other examples are coalitions within committees that are formed to secure voting power to block outcomes, or groups that commit to different technological standards although a single standard would be optimal (e.g. the format war between HD DVD and Blu-ray). It will be shown that such inefficiencies are directly linked to the presence of externalities and a certain degree of irreversibility in actions. I now discuss the three articles in more detail. In Chapter 1, Olivier Bochet and I study a simple bilateral bargaining institution that eliminates trade failures arising from incomplete information. In this setting, a buyer makes offers to a seller in order to acquire a good. Whenever an offer is rejected by the seller, the buyer may submit a further offer. Bargaining is costly, because both parties suffer a (small) time cost after any rejection. The difficulties arise because the good can be of low or high quality and the quality of the good is only known to the seller.
Indeed, without the possibility to make repeated offers, it is too risky for the buyer to offer prices that allow for trade of high quality goods. When allowing for repeated offers, however, at equilibrium both types of goods trade with probability one. We provide an experimental test of these predictions. Buyers gather information about sellers using specific price offers, and rates of trade are high, in line with the model’s qualitative predictions. We also observe a persistent over-delay before trade occurs, and this reduces efficiency substantially. Possible channels for over-delay are identified in the form of two behavioral assumptions missing from the standard model, loss aversion (buyers) and haggling (sellers), which reconcile the data with the theoretical predictions. Chapter 2 also studies adverse selection, but interaction between buyers and sellers now takes place within a market rather than isolated pairs. Remarkably, in a market it suffices to let agents communicate in a very simple manner to mitigate trade failures. The key insight is that better informed agents (sellers) are willing to truthfully reveal their private information, because by doing so they are able to reduce search frictions and attract more buyers. Behavior observed in the experimental sessions closely follows the theoretical predictions. As a consequence, costless and non-binding communication (cheap talk) significantly raises rates of trade and welfare. Previous experiments have documented that cheap talk alleviates inefficiencies due to asymmetric information. These findings are explained by pro-social preferences and lie aversion. I use appropriate control treatments to show that such considerations play only a minor role in our market. Instead, the experiment highlights the ability to organize markets as a new channel through which communication can facilitate trade in the presence of private information.
In Chapter 3, I theoretically explore coalition formation via multilateral bargaining under complete information. The environment studied is extremely rich in the sense that the model allows for all kinds of externalities. This is achieved by using so-called partition functions, which pin down a coalitional worth for each possible coalition in each possible coalition structure. It is found that although binding agreements can be written, efficiency is not guaranteed, because the negotiation process is inherently non-cooperative. The prospects of cooperation are shown to crucially depend on i) the degree to which players can renegotiate and gradually build up agreements and ii) the absence of a certain type of externalities that can loosely be described as incentives to free ride. Moreover, the willingness to concede bargaining power is identified as a novel reason for gradualism. Another key contribution of the study is that it identifies a strong connection between the Core, one of the most important concepts in cooperative game theory, and the set of environments for which efficiency is attained even without renegotiation.
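As an illustration of the partition-function representation used in Chapter 3, the sketch below encodes a hypothetical three-player game in which a coalition's worth depends on how the remaining players are organized. All numbers and the free-riding pattern are invented for illustration; they are not taken from the thesis.

```python
def C(*players):
    """A coalition as a hashable frozenset, usable as part of a dict key."""
    return frozenset(players)

# A hypothetical 3-player partition function: maps
# (coalition, coalition_structure) -> worth. The same coalition can earn
# different amounts depending on how the *other* players are organized,
# which is exactly how partition functions encode externalities.
partition_function = {
    # grand coalition
    (C(1, 2, 3), (C(1, 2, 3),)): 12.0,
    # two-player coalition vs. a singleton
    (C(1, 2), (C(1, 2), C(3))): 6.0,
    (C(3),    (C(1, 2), C(3))): 5.0,   # the outsider benefits
    # everyone alone
    (C(1), (C(1), C(2), C(3))): 3.0,
    (C(2), (C(1), C(2), C(3))): 3.0,
    (C(3), (C(1), C(2), C(3))): 3.0,
}

def worth(coalition, structure):
    """Worth of a coalition embedded in a given coalition structure."""
    return partition_function[(coalition, tuple(structure))]

# Positive externality on player 3: when players 1 and 2 merge, 3's worth
# rises from 3.0 to 5.0 without 3 doing anything -- a free-riding incentive
# that can work against forming the efficient grand coalition.
gain_from_free_riding = (worth(C(3), [C(1, 2), C(3)])
                         - worth(C(3), [C(1), C(2), C(3)]))
```

Note that in this example the grand coalition is efficient (worth 12.0 exceeds 6.0 + 5.0), yet player 3 still prefers to stay out, which is the free-riding tension the chapter analyzes.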
Abstract:
Cell adhesion is a fundamentally important process which has been implicated in morphogenesis, metastasis and wound healing. Fibronectin (Fn), a large glycoprotein present in body fluids, the extracellular matrix, and on the cell surface, mediates adhesion of fibroblastic cells. To study the interaction of Fn with Chinese hamster ovary (CHO) cell membranes, latex beads coated with ³H-Fn (Fn-beads) were used as surface probes. Binding of Fn-beads was independent of temperature, divalent cations, and metabolic activity. Identification of fibronectin receptors has been problematical. To study Fn binding components, Fn-beads were pre-incubated with purified glycosaminoglycans (GAGs) and glycolipids. Among the GAGs tested, heparin and heparan sulfate blocked bead binding. Only the sialylated glycolipids GT1 and GD1 were inhibitory; however, neuraminidase treatment of cells had no effect. It was further shown that Fn-bead binding could be blocked by pre-treating cells with papain. Furthermore, papain digestion releases cellular material which blocks Fn-bead-cell binding. Beads coated with a fragment of Fn which binds to cells but not heparin (F105) were also blocked by soluble papain digests. It was observed that the ability of F105-beads to bind to CHO cells was dependent on surface charge, as F105 on uncharged beads did not bind to cells, whereas F105 on positive or negative beads displayed cell-binding activity. The active component in the papain digests was apparently macromolecular (i.e. non-dialysable) and heat stable (i.e. 100°C for 15 min). This suggested that the inhibitory factor is more likely a glycopeptide, rather than a GAG or glycolipid.
The findings of this research can be summarized as follows: (1) the expression of cell binding of Fn and Fn fragments can be modulated by the chemical nature of the surface used for adsorption; (2) factors can be released by proteolytic digestion which block Fn and Fn-fragment bead binding; and (3) since bead binding can be done under conditions which reflect initial Fn-cell interaction, it seems likely that the component(s) identified in this way may play a direct role in the recognition phases of cell adhesion to Fn.
Abstract:
Filamin is a high-molecular-weight (2 × 250,000) actin-crosslinking protein found in a wide variety of cells and tissues. The most striking feature of filamin is its ability to crosslink F-actin filaments and cause ATP-independent gelation and contraction of F-actin solutions. The gelation of actin filaments by filamin involves binding to actin and crosslinking of the filaments through filamin self-association. In order to understand the role of filamin-actin interactions in the regulation of cytoskeletal assembly, two approaches were used. First, the structural relationship between self-association and actin binding was examined using proteolytic fragments of filamin. Treatment of filamin with papain generated two major fragments, of 90 kDa and 180 kDa. Upon incubation of the papain digest with F-actin and centrifugation at 100,000 × g, only the 180 kDa fragment co-sedimented with F-actin. The binding of the 180 kDa fragment, P180, was similar to that of native filamin in its sensitivity to ionic strength. Analytical gel filtration studies indicated that, unlike native filamin, P180 was monomeric and did not self-associate. Thermolysin treatment of P180 produced a 170 kDa fragment, PT170, which no longer bound to or co-sedimented with F-actin. These results suggested that filamin contains a discrete actin-binding domain. In order to locate the actin-binding domain, affinity-purified antibodies to the papain- and thermolysin-sensitive regions of filamin were used in conjunction with filamin fragments generated by digestion with S. aureus V8 protease and elastase. The results indicated that the papain and thermolysin cleavage sites were close together and, most likely, within 10 kDa of one another. Taken together, these data suggest that filamin contains a discrete, internal actin-binding domain. The second approach was to use the non-crosslinking fragment P180 to develop a quantitative assay of filamin-actin binding.
The binding of ¹⁴C-carboxyalkylated P180 was examined using the co-sedimentation assay. ¹⁴C-P180 binding to actin was equivalent to that of unlabelled P180 and exhibited comparable sensitivity of binding to changes in ionic strength. The process reached equilibrium within 5 min of incubation. The specificity of binding was shown by the lack of binding of ¹⁴C-PT170. The binding of ¹⁴C-P180 was found to be a reversible and saturable process, with a Kd of 2 × 10⁻⁷ M. … (Author's abstract exceeds stipulated maximum length. Discontinued here with permission of author.)
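Under a simple one-site (Langmuir) binding model, the reversible, saturable binding reported above is fully characterized by the dissociation constant. The sketch below computes the fractional occupancy implied by the reported Kd of 2 × 10⁻⁷ M; the one-site model is an assumption made here for illustration, not a claim from the study.

```python
def fractional_occupancy(ligand_conc_m, kd_m=2e-7):
    """One-site (Langmuir) binding: fraction of actin sites occupied
    at equilibrium, given free ligand (P180) concentration in mol/L."""
    return ligand_conc_m / (kd_m + ligand_conc_m)

# At a free P180 concentration equal to Kd, exactly half the sites are bound.
half_occupied = fractional_occupancy(2e-7)

# At 100x Kd, occupancy approaches 1 -- the saturable behavior described above.
near_saturation = fractional_occupancy(2e-5)
```

This is why a saturable, reversible co-sedimentation curve is sufficient to read off a Kd: the half-maximal binding point directly identifies it.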
Abstract:
In eukaryotic cells, the ESCRT (endosomal sorting complexes required for transport) machinery is required for cellular processes such as endosomal sorting, retroviral budding and cytokinesis. The ALG-2 interacting protein Alix is a modular adaptor protein that is critically involved in these ESCRT-associated cellular processes and consists of an N-terminal Bro1 domain, a middle V domain and a C-terminal Pro-rich domain (PRD). In these cellular processes, Alix interacts with the ESCRT-III component CHMP4 at the Bro1 domain, with HIV-1 p6 Gag or EIAV p9 Gag at the V domain, and with the ESCRT-I component TSG101 at the Pro-rich domain. Here we demonstrate that the N-terminal Bro1 domain forms an intramolecular interaction with the C-terminal PRD within Alix. This Bro1-PRD intramolecular interaction forms a closed conformation of Alix that autoinhibits Alix interaction with all of these partner proteins. Moreover, the binding of Ca2+-activated ALG-2 to the PRD of Alix relieves the autoinhibitory intramolecular interaction, resulting in an open conformation of Alix which is able to interact with all of these partner proteins. The partner proteins bound to Alix in turn maintain Alix in the open conformation after ALG-2 dissociation from Alix. Consistent with the effect of Ca2+-activated ALG-2 on opening/activating Alix in these ESCRT-associated functions, ALG-2 overexpression accelerates EGF-induced degradation of EGFR in an Alix-dependent manner. These findings reveal an intrinsic autoinhibitory mechanism of Alix and a two-step process that activates/opens Alix and then keeps Alix active/open. This study resolves long-standing questions about the regulation of Alix in ESCRT-associated functions and the role of the ALG-2-Alix interaction, and may serve as a structural basis for further studies of Alix regulation.
Abstract:
Macromolecular interactions, such as protein-protein and protein-DNA interactions, play important roles in executing biological functions in cells. However, the complexity of such interactions often makes it very challenging to elucidate their structural details. In this thesis, two different research strategies were applied to two different macromolecular systems: X-ray crystallography on three tandem FF domains of the transcription regulator CA150, and electron microscopy on the STAT1-importin α5 complex. The results from these studies provide novel insights into the function-structure relationships of transcription-coupled RNA splicing mediated by CA150 and the nuclear import process of the JAK-STAT signaling pathway. The first project focused on the protein-protein interaction module known as the FF domain, which often occurs in tandem repeats. The crystallographic structure of the first three FF domains of human CA150 was determined to 2.7 Å resolution. This is the only crystal structure of an FF domain and the only structure of tandem FF domains to date. It revealed a striking connectivity between one FF domain and the next. A peptide binding assay with a potential binding ligand of FF domains was performed using fluorescence polarization. Furthermore, for the first time, FF domains were found to potentially interact with DNA. DNA binding assays were also performed, and the results supported this newly proposed functionality of the FF domain. The second project aimed at understanding the molecular mechanism of the nuclear import of the transcription factor STAT1. The first structural model of the pSTAT1-importin α5 complex in solution was built from images obtained by negative-staining electron microscopy. Two STAT1 molecules were observed to interact with one molecule of importin α5 in an asymmetric manner. This seems to imply that STAT1 interacts with importin α5 via a novel mechanism that differs from canonical importin α-cargo interactions.
Further in vitro binding assays were performed to obtain more details on the pSTAT1-importin α5 interaction.
Abstract:
Complex diseases such as cancer result from multiple genetic changes and environmental exposures. Due to the rapid development of genotyping and sequencing technologies, we are now able to more accurately assess the causal effects of many genetic and environmental factors. Genome-wide association studies have been able to localize many causal genetic variants predisposing to certain diseases. However, these studies only explain a small portion of the heritability of diseases. More advanced statistical models are urgently needed to identify and characterize additional genetic and environmental factors and their interactions, which will enable us to better understand the causes of complex diseases. In the past decade, thanks to increasing computational capabilities and novel statistical developments, Bayesian methods have been widely applied in genetics/genomics research and have demonstrated superiority over standard approaches in certain research areas. Gene-environment and gene-gene interaction studies are among the areas where Bayesian methods can fully exploit their advantages. This dissertation focuses on developing new Bayesian statistical methods for data analysis with complex gene-environment and gene-gene interactions, as well as extending some existing methods for gene-environment interactions to other related areas. It includes three sections: (1) deriving the Bayesian variable selection framework for hierarchical gene-environment and gene-gene interactions; (2) developing the Bayesian Natural and Orthogonal Interaction (NOIA) models for gene-environment interactions; and (3) extending the applications of two Bayesian statistical methods, originally developed for gene-environment interaction studies, to other related types of studies such as adaptive borrowing of historical data.
We propose a Bayesian hierarchical mixture model framework that allows us to investigate genetic and environmental effects, gene-gene interactions (epistasis) and gene-environment interactions in the same model. It is well known that, in many practical situations, there exists a natural hierarchical structure between the main effects and interactions in the linear model. Here we propose a model that incorporates this hierarchical structure into the Bayesian mixture model, such that irrelevant interaction effects can be removed more efficiently, resulting in more robust, parsimonious and powerful models. We evaluate both the 'strong hierarchical' and 'weak hierarchical' models, which specify that both or one of the main effects of the interacting factors, respectively, must be present for the interactions to be included in the model. Extensive simulation results show that the proposed strong and weak hierarchical mixture models control the proportion of false positive discoveries and yield a powerful approach to identifying the predisposing main effects and interactions in studies with complex gene-environment and gene-gene interactions. We also compare these two models with the 'independent' model that does not impose this hierarchical constraint and observe their superior performance in most of the considered situations. The proposed models are applied to real data analyses of gene-environment interactions in lung cancer and cutaneous melanoma case-control studies. Bayesian statistical models have the advantage of being able to incorporate useful prior information in the modeling process. Moreover, the Bayesian mixture model outperforms the multivariate logistic model in terms of parameter estimation and variable selection in most cases.
Our proposed models impose hierarchical constraints that further improve the Bayesian mixture model by reducing the proportion of false positive findings among the identified interactions and by successfully identifying the reported associations. This is practically appealing for investigating causal factors among a moderate number of candidate genetic and environmental factors along with a relatively large number of interactions. The natural and orthogonal interaction (NOIA) models of genetic effects were previously developed to provide an analysis framework in which the estimates of effects for a quantitative trait are statistically orthogonal regardless of the existence of Hardy-Weinberg Equilibrium (HWE) within loci. Ma et al. (2012) recently developed a NOIA model for gene-environment interaction studies and showed the advantages of using the model for detecting the true main effects and interactions, compared with the usual functional model. In this project, we propose a novel Bayesian statistical model that combines the Bayesian hierarchical mixture model with the NOIA statistical model and the usual functional model. The proposed Bayesian NOIA model demonstrates more power at detecting the non-null effects with higher marginal posterior probabilities. We also review two Bayesian statistical models (the Bayesian empirical shrinkage-type estimator and Bayesian model averaging), which were developed for gene-environment interaction studies. Inspired by these Bayesian models, we develop two novel statistical methods that can handle related problems such as borrowing data from historical studies. The proposed methods are analogous to the gene-environment interaction methods in their ability to balance statistical efficiency and bias in a unified model.
Through extensive simulation studies, we compare the operating characteristics of the proposed models with those of existing models, including the hierarchical meta-analysis model. The results show that the proposed approaches adaptively borrow the historical data in a data-driven way. These novel models may have a broad range of statistical applications in both genetic/genomic and clinical studies.
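The strong and weak hierarchical constraints described above amount to a simple inclusion rule for interaction terms during variable selection. The sketch below is an illustrative reconstruction of that constraint logic, not the authors' implementation.

```python
def interaction_allowed(main_a_included, main_b_included, hierarchy="strong"):
    """Hierarchical constraint on including an A x B interaction term.

    'strong'      : both main effects must already be in the model.
    'weak'        : at least one main effect must be in the model.
    'independent' : no constraint (the baseline comparison model).
    """
    if hierarchy == "strong":
        return main_a_included and main_b_included
    if hierarchy == "weak":
        return main_a_included or main_b_included
    return True  # 'independent'

# Example model state: gene G1 is selected, environmental factor E1 is not.
# The strong model excludes the G1 x E1 interaction; the weak model admits it.
strong_ok = interaction_allowed(True, False, "strong")
weak_ok = interaction_allowed(True, False, "weak")
```

Pruning interactions whose parent main effects are absent is what reduces false positive interaction findings in the strong and weak models relative to the independent model.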
Abstract:
Executive-legislative relations in the Philippines have been described in two contrasting stories, namely the "strong president" story and the "strong congress" story. This paper tries to consolidate the existing arguments and proposes a new perspective focusing on the "compromise exchange" between the president and the congress across different policy areas. It considers that the policy outcome is not brought about by the unilateral power of the president or the congress, but is formed as the product of such an exchange. The interaction of powers and their complementary functions are addressed. Furthermore, aside from constitutional powers, weak party discipline is identified as a key factor in making the exchange possible.
Abstract:
In India, as the production of passenger cars increased, many local small and medium enterprises (SMEs) entered the parts and components manufacturing sector. The sources of knowledge for large enterprises and SMEs are different. Naturally, spillover effects among large enterprises and between large enterprises and SMEs are different. This paper focuses on knowledge spillover among large enterprises and from large enterprises to SMEs. Subcontractors can absorb relation-specific skills through repeated interaction with their parent companies. The results of a field survey emphasize that relation-specific skills are a determinant factor of spillover effects from assemblers and large auto component manufacturers to SMEs. Econometric analysis shows that spillover effects among medium and large automobile units, and from medium and large automobile units to small units, extended beyond the boundary of the cluster.
Abstract:
Usability is the capability of a software product to be understood, learned, used and attractive to the user, when used under specified conditions. Many studies demonstrate the benefits of usability, yet to this day software products continue to exhibit consistently low levels of this quality attribute. Furthermore, poor usability in software systems contributes largely to software failing in actual use. One of the main disciplines involved in usability is that of Human-Computer Interaction (HCI). Over the past two decades the HCI community has proposed specific features that should be present in applications to improve their usability, yet incorporating them into software continues to be far from trivial for software developers. These difficulties are due to multiple factors, including the high level of abstraction at which these HCI recommendations are made and how far removed they are from actual software implementation. In order to bridge this gap, the Software Engineering community has long proposed software design solutions to help developers include usability features in software; however, the problem remains an open research question. This doctoral thesis addresses the problem of helping software developers include specific usability features in their applications by providing them with structured and tangible guidance in the form of a process, which we have termed the Usability-Oriented Software Development Process. This process is supported by a set of Software Usability Guidelines that help developers incorporate a set of eleven usability features with high impact on software design. After developing the Usability-Oriented Software Development Process and the Software Usability Guidelines, we validated them across multiple academic projects and showed that they help software developers include such usability features in their software applications.
In doing so, their use significantly reduced development time and improved the quality of the resulting designs of these projects. Furthermore, in this work we propose a software tool to automate the application of the proposed process. In sum, this work contributes to the integration of the Software Engineering and HCI disciplines by providing a framework that helps software developers create usable applications in an efficient way.
Abstract:
Speech technologies can provide important benefits for the development of more usable and safe in-vehicle human-machine interactive systems (HMIs). However, mainly due to robustness issues, the use of spoken interaction can entail important distractions for the driver. In this challenging scenario, while speech technologies are evolving, further research is necessary to explore how they can be complemented both with other modalities (multimodality) and with information from the increasing number of available sensors (context-awareness). The perceived quality of speech technologies can be significantly increased by implementing such policies, which simply try to make the best use of all the available resources, and the in-vehicle scenario is an excellent test-bed for this kind of initiative. In this contribution we propose an event-based HMI design framework which combines context modelling and multimodal interaction using a W3C XML language known as SCXML. SCXML provides a general process control mechanism that is being considered by the W3C to improve both voice interaction (VoiceXML) and multimodal interaction (MMI). In our approach we try to anticipate and extend these initiatives, presenting a flexible SCXML-based approach for the design of a wide range of multimodal, context-aware in-vehicle HMI interfaces. The proposed framework for HMI design and specification has been implemented in an automotive OSGi service platform, and it is being used and tested in the Spanish research project MARTA for the development of several in-vehicle interactive applications.
Abstract:
There is increasing pressure on developers to produce usable systems, which requires the use of appropriate methods to support user-centred design during development. There is currently no consistent advice on which methods are appropriate in which circumstances, so the selection of methods relies on individual experience and expertise. Considerable effort is required to collate information from various sources and to understand the applicability of each method in a particular situation. Usability Planner is a tool aimed at supporting the selection of the most appropriate methods depending on project and organizational constraints. Many of the rules employed are derived from ISO standards, complemented with rules from the authors’ experience.
Abstract:
The solubility parameters of two commercial SBS rubbers with different structures (linear and radial) and slightly different styrene contents have been determined by the inverse gas chromatography technique. The Flory–Huggins interaction parameters of several polymer–solvent mixtures have also been calculated. The influence of polymer composition, solvent molecular weight and temperature on these parameters is discussed; in addition, these parameters are compared with previous ones obtained by intrinsic viscosity measurements. From the Flory–Huggins interaction parameters, the infinite-dilution activity coefficients of the solvents have been calculated and fitted to the well-known NRTL model. These NRTL binary interaction parameters are of great importance in modelling the separation steps in the process of obtaining the rubber.
Abstract:
In this paper, label-free biosensing for antibody screening by periodic lattices of high-aspect-ratio SU-8 nano-pillars (BICELLs) is presented. As a demonstration, the determination of anti-gestrinone antibodies from whole rabbit serum is carried out, and for the first time, the dissociation constant (KD = 6 nM) of the antigen-antibody recognition process is calculated using this sensing system. After gestrinone antigen immobilization on the BICELLs, the immunorecognition was performed. The cells were interrogated vertically using micron-spot-size Fourier transform visible and IR spectrometry (FT-VIS-IR), and the dip wavenumber shift was monitored. The biosensing assay exhibited good reproducibility and sensitivity (LOD = 0.75 ng/mL).
Abstract:
Learning analytics is the analysis of static and dynamic data extracted from virtual learning environments, in order to understand and optimize the learning process. Generally, this dynamic data is generated by the interactions which take place in the virtual learning environment. At the present time, many implementations for grouping of data have been proposed, but there is no consensus yet on which interactions and groups must be measured and analyzed. There is also no agreement on what is the influence of these interactions, if any, on learning outcomes, academic performance or student success. This study presents three different extant interaction typologies in e-learning and analyzes the relation of their components with students’ academic performance. The three different classifications are based on the agents involved in the learning process, the frequency of use and the participation mode, respectively. The main findings from the research are: a) that agent-based classifications offer a better explanation of student academic performance; b) that at least one component in each typology predicts academic performance; and c) that student-teacher and student-student, evaluating students, and active interactions, respectively, have a significant impact on academic performance, while the other interaction types are not significantly related to academic performance.
Abstract:
There is an increasing interest in the intersection of human-computer interaction and public policy. This day-long workshop will examine successes and challenges related to public policy and human computer interaction, in order to provide a forum to create a baseline of examples and to start the process of writing a white paper on the topic.