953 results for efficiency and sustainability analysis
Abstract:
Bargaining is the building block of many economic interactions, ranging from bilateral to multilateral encounters and from situations in which the actors are individuals to negotiations between firms or countries. In all these settings, economists have long been intrigued by the fact that some projects, trades or agreements are not realized even though they are mutually beneficial. On the one hand, this has been explained by incomplete information. A firm may not be willing to offer a wage that is acceptable to a qualified worker, because it knows that there are also unqualified workers and cannot distinguish between the two types. This phenomenon is known as adverse selection. On the other hand, it has been argued that even with complete information, the presence of externalities may impede efficient outcomes. To see this, consider the example of climate change. If a subset of countries agrees to curb emissions, non-participant regions benefit from the signatories' efforts without incurring costs. These free-riding opportunities give rise to incentives to strategically improve one's bargaining power that work against the formation of a global agreement. This thesis is concerned with extending our understanding of both factors, adverse selection and externalities. The findings are based on empirical evidence from original laboratory experiments as well as game-theoretic modeling. On a very general note, it is demonstrated that the institutions through which agents interact matter to a large extent. Insights are provided about which institutions we should expect to perform better than others, at least in terms of aggregate welfare.

Chapters 1 and 2 focus on the problem of adverse selection. Effective operation of markets and other institutions often depends on good information transmission properties. In terms of the example introduced above, a firm is only willing to offer high wages if it receives enough positive signals about the worker's quality during the application and wage bargaining process. In Chapter 1, it is shown that repeated interaction coupled with time costs facilitates information transmission. By making the wage bargaining process costly for the worker, the firm is able to obtain more accurate information about the worker's type. The cost could be a pure time cost from delaying agreement or an effort cost arising from a multi-step interviewing process. In Chapter 2, I abstract from time costs and show that communication can play a similar role: the simple fact that a worker claims to be of high quality may be informative. In Chapter 3, the focus is on a different source of inefficiency. Agents strive for bargaining power and thus may be motivated by incentives that are at odds with the socially efficient outcome. I have already mentioned the example of climate change. Other examples are coalitions within committees that are formed to secure voting power to block outcomes, or groups that commit to different technological standards although a single standard would be optimal (e.g. the format war between HD DVD and Blu-ray). It will be shown that such inefficiencies are directly linked to the presence of externalities and a certain degree of irreversibility in actions. I now discuss the three articles in more detail.

In Chapter 1, Olivier Bochet and I study a simple bilateral bargaining institution that eliminates trade failures arising from incomplete information. In this setting, a buyer makes offers to a seller in order to acquire a good. Whenever an offer is rejected by the seller, the buyer may submit a further offer. Bargaining is costly because both parties suffer a (small) time cost after any rejection. The difficulties arise because the good can be of low or high quality, and its quality is known only to the seller. Indeed, without the possibility of making repeated offers, it is too risky for the buyer to offer prices that allow for trade of high-quality goods. When repeated offers are allowed, however, both types of goods trade with probability one in equilibrium. We provide an experimental test of these predictions. Buyers gather information about sellers through specific price offers, and rates of trade are high, in line with the model's qualitative predictions. We also observe a persistent over-delay before trade occurs, which reduces efficiency substantially. Possible channels for over-delay are identified in the form of two behavioral assumptions missing from the standard model, loss aversion (buyers) and haggling (sellers), which reconcile the data with the theoretical predictions.

Chapter 2 also studies adverse selection, but interaction between buyers and sellers now takes place within a market rather than in isolated pairs. Remarkably, in a market it suffices to let agents communicate in a very simple manner to mitigate trade failures. The key insight is that better-informed agents (sellers) are willing to truthfully reveal their private information, because by doing so they are able to reduce search frictions and attract more buyers. Behavior observed in the experimental sessions closely follows the theoretical predictions. As a consequence, costless and non-binding communication (cheap talk) significantly raises rates of trade and welfare. Previous experiments have documented that cheap talk alleviates inefficiencies due to asymmetric information, with the findings explained by pro-social preferences and lie aversion. I use appropriate control treatments to show that such considerations play only a minor role in our market. Instead, the experiment highlights the ability to organize markets as a new channel through which communication can facilitate trade in the presence of private information.

In Chapter 3, I theoretically explore coalition formation via multilateral bargaining under complete information. The environment studied is extremely rich in the sense that the model allows for all kinds of externalities. This is achieved by using so-called partition functions, which pin down a coalitional worth for each possible coalition in each possible coalition structure. It is found that although binding agreements can be written, efficiency is not guaranteed, because the negotiation process is inherently non-cooperative. The prospects of cooperation are shown to depend crucially on (i) the degree to which players can renegotiate and gradually build up agreements and (ii) the absence of a certain type of externalities that can loosely be described as incentives to free ride. Moreover, the willingness to concede bargaining power is identified as a novel reason for gradualism. Another key contribution of the study is that it identifies a strong connection between the Core, one of the most important concepts in cooperative game theory, and the set of environments for which efficiency is attained even without renegotiation.
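For reference, the partition function form mentioned above can be stated in standard notation (the thesis's exact notation is not given in the abstract):

```latex
% Partition function form (standard notation; not quoted from the thesis).
% N is the player set and \Pi(N) the set of partitions of N.
% v(S, \pi) is the worth of coalition S under coalition structure \pi.
v : \{(S, \pi) : \pi \in \Pi(N),\; S \in \pi\} \to \mathbb{R}
% Externalities are present whenever v(S, \pi) \neq v(S, \pi') for two
% coalition structures \pi, \pi' that both contain S.
```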
Abstract:
This study examines the relationship between the stock market reaction to horizontal merger announcements and the technical efficiency levels of the participating firms. The analysis is based on data for eighty mergers between firms in the U.S. manufacturing industry during the 1990s. We employ Data Envelopment Analysis (DEA) to measure technical efficiency, which captures a firm's competence to produce the maximum output from given productive resources. Abnormal returns around the merger announcements capture the investors' re-evaluation of the future performance of the participating firms. To avoid problems of non-normality and heteroskedasticity in the regression analysis, a bootstrap method is employed for estimation and inference. We find a significant relationship between technical efficiency and market response. The market apparently welcomes a merger as an arrangement to improve resource utilization.
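The abstract does not specify which DEA model is used; a minimal sketch of the standard input-oriented, constant-returns (CCR) envelopment program, written here in Python with hypothetical data, is:

```python
# Input-oriented CCR DEA sketch (illustrative; not the paper's code).
# Rows of X and Y are DMUs (firms); X holds inputs, Y holds outputs.
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Technical efficiency of DMU `o`: min theta such that a convex cone of
    the observed DMUs produces DMU o's outputs with theta * its inputs."""
    n, m = X.shape                                   # n DMUs, m inputs
    s = Y.shape[1]                                   # s outputs
    # Decision variables: [theta, lambda_1, ..., lambda_n]
    c = np.r_[1.0, np.zeros(n)]                      # minimise theta
    # Input constraints: sum_j lambda_j x_ij - theta x_io <= 0
    A_in = np.hstack([-X[o].reshape(-1, 1), X.T])
    # Output constraints: -sum_j lambda_j y_rj <= -y_ro
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[o]],
                  bounds=[(None, None)] + [(0, None)] * n,
                  method="highs")
    return res.fun                                   # theta in (0, 1]

X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 4.0]])   # hypothetical inputs
Y = np.array([[1.0], [1.0], [1.0]])                  # single output
print([round(ccr_efficiency(X, Y, o), 3) for o in range(3)])
```

Bootstrapping, as in the paper, would then resample and recompute such programs to obtain the distribution of the estimates.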
Abstract:
To identify more mutations that can affect the early development of Myxococcus xanthus, the synthetic transposon TnT41 was designed and constructed. By virtue of its special features, it can greatly facilitate the processes of mutation screening/selection, mapping, cloning and DNA sequencing. In addition, it allows for the systematic discovery of genes in regulatory hierarchies using their target promoters. In this study, the minimal regulatory region of the early developmentally regulated gene 4521 was used as a reporter in the TnT41 mutagenesis. Both positive (P) and negative (N) mutations were isolated based on their effects on 4521 expression. Four of these mutations, i.e. N1, N2, P52 and P54, were analyzed in detail. Mutations N1 and N2 are insertion mutations in a gene designated sasB. The sasB gene is also identified in this study by genetic and molecular analysis of five UV-generated 4521 suppressor mutations. The sasB gene encodes a protein without meaningful homology in the databases and negatively regulates 4521 expression, possibly through the SasS-SasR two-component system. A wild-type sasB gene is required for normal M. xanthus fruiting body formation and sporulation. Cloning and sequencing analysis of the P52 mutation led to the identification of an operon that encodes the M. xanthus high-affinity branched-chain amino acid transporter system. This liv operon consists of five genes designated livK, livH, livM, livC, and livF. The Liv proteins are highly similar to their counterparts from other bacteria in amino acid sequence, functional motifs and predicted secondary structure. This system is required for development, since liv null mutations cause abnormal fruiting body formation and a 100-fold decrease in sporulation efficiency. Mutation P54 is a TnT41 insertion in the sscM gene of the ssc chemotaxis system, which has been independently identified by Dr. Shi's lab. The sscM gene encodes an MCP (methyl-accepting chemotaxis protein) homologue. The SscM protein is predicted to contain two transmembrane domains, a signaling domain and at least one putative methylation site. Null mutations of this gene abolish the aggregation of starving cells at a very early stage, though the sporulation level of the mutant can reach 10% of that of wild-type cells.
Abstract:
Urban areas benefit from significant improvements in accessibility when a new high speed rail (HSR) project is built. These improvements, which are due mainly to a rise in efficiency, produce locational advantages and increase the attractiveness of these cities, thereby possibly enhancing their competitiveness and economic growth. However, there may be equity issues at stake, as the main accessibility benefits are primarily concentrated in urban areas with a HSR station, whereas other locations obtain only limited benefits. HSR extensions may contribute to an increase in spatial imbalance and lead to more polarized patterns of spatial development. Procedures for assessing the spatial impacts of HSR must therefore follow a twofold approach which addresses issues of both efficiency and equity. This analysis can be made by jointly assessing both the magnitude and distribution of the accessibility improvements deriving from a HSR project. This paper describes an assessment methodology for HSR projects which follows this twofold approach. The procedure uses spatial impact analysis techniques and is based on the computation of accessibility indicators, supported by a Geographical Information System (GIS). Efficiency impacts are assessed in terms of the improvements in accessibility resulting from the HSR project, with a focus on major urban areas; and spatial equity implications are derived from changes in the distribution of accessibility values among these urban agglomerations.
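The abstract does not spell out the indicators used; one common formalisation of the twofold efficiency-equity assessment, stated here as an illustrative assumption rather than the paper's exact formulas, is:

```latex
% Illustrative accessibility indicators (assumed, not quoted from the paper).
% Potential accessibility of city i: W_j is the mass (e.g. population or GDP)
% of destination j, t_{ij} the rail travel time, \alpha a decay parameter.
A_i = \sum_{j \neq i} \frac{W_j}{t_{ij}^{\alpha}}
% Efficiency impact: change in the population-weighted mean accessibility.
\Delta \bar{A} = \sum_i p_i \left( A_i^{\mathrm{HSR}} - A_i^{0} \right),
\qquad p_i = P_i \Big/ \sum_k P_k
% Equity impact: change in the dispersion of the A_i values,
% e.g. the coefficient of variation CV = \sigma(A) / \bar{A}.
```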
Abstract:
"Slow Fashion" attempts to offset the demand for fast fashion and mass production (Fletcher, 2007). Consumers' response to sustainability-based practices is a limited discourse and studies for slow fashion concept are scarce. This study thus aims to enlighten the subject of how slow fashion concept could improve local economies and how Spanish consumers respond to such initiatives. This paper is based on an exploratory qualitative research for which focus group interviews including three group discussions with Spanish consumers were held. The data was examined by constant comparison analysis to present consumer insights. Moreover, a case study was conducted with a Spanish apparel brand. Saint Brissant was chosen since it manufactures in Spain to (i) ensure its products? high quality and (ii) to empower Spanish economy. This paper provides empirical insights. Even though local manufacturing was perceived to have a higher quality, Spanish consumers? behavioural intentions of using local brands were not high.Self-interest, mainly price and design, was recorded as the most influential purchase criteria. Furthermore, Saint Brissant case demonstrated that local manufacturing could boost local economies by creating workforce. However, governmental subsidies should be rearranged and consumers? perceptions should be improved to support local manufacturers in Spain.
Abstract:
This paper presents a project for providing students of Structural Engineering with the flexibility to learn outside classroom schedules. The goal is a framework for adaptive e-learning based on a repository of open educational courseware covering a set of basic Structural Engineering concepts and fundamentals. These are paramount for students to expand their technical knowledge and skills in the structural analysis and design of tall buildings, arch-type structures and bridges. Concepts related to structural behaviour, such as linearity, compatibility, stiffness and influence lines, have traditionally been elusive for students. The objective is to provide the student with a teaching-learning process through which to acquire the necessary intuitive knowledge, cognitive skills and the basis for further technological modules and professional development in this area. As a side effect, the system is expected to help students improve their preparation for exams on the subject. In this project, a web-based open-source system for studying influence lines on continuous beams is presented. It encompasses a collection of interactive, user-friendly applications accessible via the Web, written in JavaScript under the jQuery and Dygraph libraries, taking advantage of their efficiency and graphic capabilities. It is available in both Spanish and English. The student can set the geometric, topological, boundary and mechanical layout of a continuous beam. As the loading and support conditions are changed, the changes in the beam response appear on the screen, so that the effects of the several issues involved in structural analysis become apparent. This open interaction allows the student to simulate and virtually infer the structural response. Different levels of complexity can be handled, with ongoing help at hand for each. Students can freely boost their experiential learning on this subject at their own pace, in order to further share, process, generalize and apply the relevant essential concepts of Structural Engineering analysis. In addition, this collection is being added to the "Virtual Lab of Continuum Mechanics" of the UPM, launched in 2013 (http://serviciosgate.upm.es/laboratoriosvirtuales/laboratorios/medios-continuos-en-construcci%C3%B3n).
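As a toy illustration of the kind of quantity these applications display (the project's own JavaScript code is not reproduced in the abstract), the influence line for the left support reaction of a single-span simply supported beam follows directly from static equilibrium; the Python sketch below uses an arbitrary span:

```python
# Influence line for the left support reaction R_A of a simply supported
# beam of span L: a unit load at position x gives R_A(x) = 1 - x / L.
# (Toy single-span case; the web applications handle continuous beams.)
L = 10.0  # span in metres (arbitrary choice for illustration)
for x in (0.0, 2.5, 5.0, 7.5, 10.0):
    print(f"unit load at x = {x:4.1f} m -> R_A = {1 - x / L:.2f}")
```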
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-05
Abstract:
The advent of molecular markers as a tool to aid selection has provided plant breeders with the opportunity to rapidly deliver superior genetic solutions to problems in agricultural production systems. However, a major constraint to the implementation of marker-assisted selection (MAS) in pragmatic breeding programs in the past has been the perceived high relative cost of MAS compared to conventional phenotypic selection. In this paper, computer simulation was used to design a genetically effective and economically efficient marker-assisted breeding strategy aimed at a specific outcome. Under investigation was a strategy involving the integration of both restricted backcrossing and doubled haploid (DH) technology. The point at which molecular markers are applied in a selection strategy can be critical to the effectiveness and cost efficiency of that strategy. The application of molecular markers was considered at three phases in the strategy: allele enrichment in the BC1F1 population, gene selection at the haploid stage, and selection for recurrent parent background of DHs prior to field testing. Overall, incorporating MAS at all three stages was the most effective, in terms of delivering a high frequency of desired outcomes and combining the selected favourable rust resistance, end-use quality and grain yield alleles. However, when costs were included in the model, the combination of MAS at the BC1F1 and haploid stages was identified as the optimal strategy. A detailed economic analysis showed that incorporating marker selection at these two stages not only increased genetic gain over the phenotypic alternative but actually reduced the overall cost by 40%.
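The paper's simulation model is not reproduced in the abstract; the following back-of-envelope Python sketch, using idealised 1:1 segregation at unlinked target loci and entirely hypothetical unit costs, illustrates why concentrating marker assays at the BC1F1 and haploid stages can cut the expected cost per desired DH line:

```python
# Back-of-envelope cost model (illustrative, not the paper's simulation).
# k unlinked target loci; a BC1F1 plant is heterozygous at each locus w.p.
# 1/2, and a haploid from a fully heterozygous BC1F1 carries each
# favourable allele w.p. 1/2. All unit costs below are assumed.
k = 3
cost_marker_assay = 5.0   # per plant per locus (assumed)
cost_doubling = 50.0      # chromosome doubling per haploid (assumed)
cost_phenotype = 20.0     # field/quality testing per DH line (assumed)

p_bc1f1_all_het = 0.5 ** k   # unselected BC1F1 retains all k donor alleles
p_dh_all = 0.5 ** k          # DH from such a plant fixes all k alleles

# Strategy A: no MAS -- double and field test every candidate line.
p_a = p_bc1f1_all_het * p_dh_all
cost_a = (cost_doubling + cost_phenotype) / p_a

# Strategy B: MAS at BC1F1 and haploid stages -- only haploids genotyped
# as carrying all k alleles are doubled and field tested.
screens_bc1f1 = 1 / p_bc1f1_all_het    # BC1F1 plants assayed per keeper
screens_haploid = 1 / p_dh_all         # haploids assayed per keeper
cost_b = ((screens_bc1f1 + screens_haploid) * k * cost_marker_assay
          + cost_doubling + cost_phenotype)

print(f"cost per desired DH line: no MAS {cost_a:.0f}, MAS {cost_b:.0f}")
```

With these placeholder numbers the marker-assisted route is roughly an order of magnitude cheaper per desired line; the paper's 40% figure comes from its own, far more detailed, model.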
Abstract:
Purpose – The data used in this study cover the period 1980-2000. Almost midway through this period (in 1992), the Kenyan government liberalized the sugar industry and the role of the market increased, while the government's role in controlling prices, imports and other aspects of the sector declined. This exposed local sugar manufacturers to external competition from other sugar producers, especially from the COMESA region. This study aims to determine whether there were any changes in the efficiency of production between the two periods (pre- and post-liberalization). Design/methodology/approach – The study utilized two approaches to efficiency estimation: data envelopment analysis (DEA) and the stochastic frontier. DEA uses mathematical programming techniques and does not impose any functional form on the data; however, it attributes all deviation from the frontier to inefficiency. The stochastic frontier utilizes econometric techniques. Findings – The test for structural differences between the two periods does not show any statistically significant differences. However, both methodologies show a decline in efficiency levels from 1992, with the lowest level experienced in 1998. From then on, efficiency levels began to increase. Originality/value – To the best of the authors' knowledge, this is the first paper to use both methodologies in the sugar industry in Kenya. It is shown that in industries where the noise (error) term is minimal (such as manufacturing), DEA and the stochastic frontier give similar results.
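For reference, the canonical stochastic production frontier behind such estimates (assumed here; the abstract does not state the paper's exact specification) is:

```latex
% Canonical stochastic frontier (standard form, not quoted from the paper).
\ln y_{it} = \mathbf{x}_{it}'\boldsymbol{\beta} + v_{it} - u_{it},
\qquad v_{it} \sim N(0, \sigma_v^2), \quad u_{it} \ge 0
% v is symmetric noise, u is one-sided inefficiency; the technical
% efficiency of firm i at time t is TE_{it} = \exp(-u_{it}).
% DEA, by contrast, sets v = 0 and attributes the whole gap to u.
```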
Abstract:
This paper explores the use of the optimisation procedures in SAS/OR software with application to the measurement of efficiency and productivity of decision-making units (DMUs) using data envelopment analysis (DEA) techniques. DEA, originally introduced by Charnes et al. [Eur. J. Oper. Res. 2 (1978) 429], is a linear programming method for assessing the efficiency and productivity of DMUs. Over the last two decades, DEA has gained considerable attention as a managerial tool for measuring the performance of organisations, and it has been widely used for assessing the efficiency of public and private sectors such as banks, airlines, hospitals, universities and manufacturers. As a result, new applications with more variables and more complicated models are being introduced. Following the successive development of DEA, a non-parametric productivity measure, the Malmquist index, was introduced by Färe et al. [J. Prod. Anal. 3 (1992) 85]. Employing the Malmquist index, productivity growth can be decomposed into technical change and efficiency change. SAS, in turn, is powerful software capable of solving various optimisation problems, such as linear programming with all types of constraints. To facilitate the use of DEA and the Malmquist index by SAS users, a SAS/MALM code was implemented in the SAS programming language. The SAS macro developed in this paper selects the chosen variables from a SAS data file and constructs sets of linear-programming models based on the selected DEA model. An example is given to illustrate how one could use the code to measure the efficiency and productivity of organisations.
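In standard notation, the output-based Malmquist index of Färe et al. between periods t and t+1, and its decomposition into efficiency change and technical change, are:

```latex
% Output-based Malmquist index (Fare et al., 1992), standard notation.
% D^t denotes the period-t output distance function.
M = \left[
  \frac{D^{t}(x^{t+1},y^{t+1})}{D^{t}(x^{t},y^{t})} \cdot
  \frac{D^{t+1}(x^{t+1},y^{t+1})}{D^{t+1}(x^{t},y^{t})}
\right]^{1/2}
% Decomposition: efficiency change times technical change.
M = \underbrace{\frac{D^{t+1}(x^{t+1},y^{t+1})}{D^{t}(x^{t},y^{t})}}_{\text{efficiency change}}
    \cdot
    \underbrace{\left[
      \frac{D^{t}(x^{t+1},y^{t+1})}{D^{t+1}(x^{t+1},y^{t+1})} \cdot
      \frac{D^{t}(x^{t},y^{t})}{D^{t+1}(x^{t},y^{t})}
    \right]^{1/2}}_{\text{technical change}}
```

Each distance function is evaluated by one DEA-type linear program, which is what the SAS macro's sets of linear-programming models solve.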
Abstract:
We present an implementation of the domain-theoretic Picard method for solving initial value problems (IVPs) introduced by Edalat and Pattinson [1]. Compared to Edalat and Pattinson's implementation, our algorithm uses a more efficient arithmetic based on an arbitrary precision floating-point library. Despite the additional overestimations due to floating-point rounding, we obtain a similar bound on the convergence rate of the produced approximations. Moreover, our convergence analysis is detailed enough to allow a static optimisation in the growth of the precision used in successive Picard iterations. Such optimisation greatly improves the efficiency of the solving process. Although a similar optimisation could be performed dynamically without our analysis, a static one gives us a significant advantage: we are able to predict the time it will take the solver to obtain an approximation of a certain (arbitrarily high) quality.
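As a plain floating-point illustration of Picard iteration, y_{n+1}(t) = y0 + ∫_0^t f(s, y_n(s)) ds, without the validated interval enclosures and precision growth that the paper's solver uses, consider:

```python
# Float-based Picard iteration for y' = f(t, y), y(0) = y0 (illustrative;
# the paper's solver instead computes rigorous enclosures with an
# arbitrary-precision floating-point library).
import numpy as np

def picard(f, y0, t_end, n_grid=200, n_iter=30):
    t = np.linspace(0.0, t_end, n_grid)
    y = np.full_like(t, y0)          # y_0(t) = y0, the constant initial guess
    for _ in range(n_iter):
        g = f(t, y)                  # integrand f(s, y_n(s)) on the grid
        # cumulative trapezoidal integral of g from 0 to each grid point
        integral = np.concatenate(
            ([0.0], np.cumsum((g[1:] + g[:-1]) / 2 * np.diff(t))))
        y = y0 + integral            # y_{n+1}(t) = y0 + int_0^t f(s, y_n) ds
    return t, y

# y' = y, y(0) = 1 on [0, 1]: the iterates converge to exp(t).
t, y = picard(lambda t, y: y, 1.0, 1.0)
print(y[-1])  # approximately e = 2.71828...
```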
Abstract:
This study employs stochastic frontier analysis to analyse Malaysian commercial banks during 1996-2002, and particularly focuses on determining the impact of Islamic banking on performance. We derive both net and gross efficiency estimates, thereby demonstrating that differences in operating characteristics explain much of the difference in costs between Malaysian banks. We also decompose productivity change into efficiency, technical, and scale change using a generalised Malmquist productivity index. On average, Malaysian banks experience moderate scale economies and annual productivity change of 2.68 percent, with the latter driven primarily by technical change, which has declined over time. Our gross efficiency estimates suggest that Islamic banking is associated with higher input requirements. However, our productivity estimates indicate that full-fledged Islamic banks have overcome some of these cost disadvantages with rapid technical change, although this is not the case for conventional banks operating Islamic windows. Merged banks are found to have higher input usage and lower productivity change, suggesting that bank mergers have not contributed positively to bank performance. Finally, our results suggest that while the East Asian financial crisis had a short-term cost-reducing effect in 1998, the crisis triggered a more lasting negative impact by increasing the volume of non-performing loans.
Abstract:
Orthodox contingency theory links effective organisational performance to compatible relationships between the environment and organisation strategy and structure, and assumes that organisations have the capacity to adapt as the environment changes. Recent contributions to the literature on organisation theory claim that the key to effective performance is effective adaptation, which in turn requires the simultaneous reconciliation of efficiency and innovation afforded by a unique environment-organisation configuration. The literature on organisation theory recognises the continuing confusion caused by the fragmented and often conflicting results from cross-sectional studies. Although the case is made for longitudinal studies which comprehensively describe the evolving relationship between the environment and the organisation, there is little to suggest how such studies should be executed in practice. Typically the choice is between the approaches of the historicised case study and the statistical analysis of large populations, which examine the relationship between environment and organisation strategy and/or structure but ignore the product-process relationship. This study combines the historicised case study with the multi-variable and ordinal-scale approach of statistical analysis to construct an analytical framework which tracks and exposes the environment-organisation-performance relationship over time. The framework examines changes in the environment, strategy and structure, and uniquely includes an assessment of the organisation's product-process relationship and its contribution to organisational efficiency and innovation. The analytical framework is applied to examine the evolving environment-organisation relationship of two organisations in the same industry over the same twenty-five-year period, providing a sector perspective on organisational adaptation. The findings demonstrate the significance of the environment-organisation configuration for the scope and frequency of adaptation, and suggest that the level of sector homogeneity may be linked to the level of product-process standardisation.