919 results for Transaction-cost theory


Relevance:

30.00%

Publisher:

Abstract:

Whilst much is known about adopters of new technology, little research has addressed the role of their attitudes in adoption decisions, particularly for technologies with evident economic potential that have not been taken up by farmers. This paper presents recent research that uses a new approach to examine the role that adopters' attitudes play in identifying the drivers of, and barriers to, adoption. The study was concerned with technologies for livestock farming systems in SW England, specifically oestrus detection, nitrogen supply management and inclusion of white clover. Adoption behaviour is analysed using the social-psychology theory of reasoned action to identify factors that affect the adoption of technologies, which are then confirmed using principal components analysis. The results presented here relate to the specific adoption behaviour regarding the Milk Development Council's recommended observation times for heat detection. The main drivers of adoption of this technology are cost effectiveness and improved detection and conception rates, whilst the perceived threat to the personal knowledge and skill of a farmer in 'knowing' their cows is a barrier. This research shows clearly that promotion of a technology, and transfer of knowledge for a farming system, need to take account of the beliefs and attitudes of potential adopters. (C) 2006 Elsevier Ltd. All rights reserved.
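
As a rough illustration of the confirmation step described above, the hypothetical Python sketch below runs a principal components analysis over Likert-scale attitude items; the survey file, column names and the two-component choice are assumptions for illustration, not the study's actual data or factor structure.

```python
# Illustrative sketch only: checking a factor structure in farmers'
# attitude statements with principal components analysis.
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

responses = pd.read_csv("attitude_survey.csv")   # hypothetical file: Likert items, one row per farmer
items = [c for c in responses.columns if c.startswith("q")]

X = StandardScaler().fit_transform(responses[items])
pca = PCA(n_components=2).fit(X)

# Loadings of each attitude item on the two retained components
loadings = pd.DataFrame(pca.components_.T, index=items,
                        columns=["driver_factor", "barrier_factor"])  # illustrative labels
print(loadings.round(2))
print("explained variance:", pca.explained_variance_ratio_.round(2))
```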

Relevance:

30.00%

Publisher:

Abstract:

The constructivist model of 'soft' value management (VM) is contrasted with the VM discourse appropriated by cost consultants who operate from within UK quantity surveying (QS) practices. The enactment of VM by cost consultants is shaped by the institutional context within which they operate and is not necessarily representative of VM practice per se. Opportunities to perform VM during the formative stages of design are further constrained by the positivistic rhetoric that such practitioners use to conceptualize and promote their services. The complex interplay between VM theory and practice is highlighted and analysed from a non-deterministic perspective. Codified models of 'best practice' are seen to be socially constructed and legitimized through human interaction in the context of interorganizational networks. Published methodologies are seen to inform practice in only a loose and indirect manner, with extensive scope for localized improvisation. New insights into the relationship between VM theory and practice are derived from the dramaturgical metaphor. The social reality of VM is seen to be constituted through scripts and performances, both of which are continuously contested across organizational arenas. It is concluded that VM defies universal definition and is conceptualized and enacted differently across different localized contexts.

Relevance:

30.00%

Publisher:

Abstract:

Firms form consortia in order to win contracts. Once a project has been awarded to a consortium, each member then concentrates on his or her own contract with the client. Therefore, consortia are marketing devices, which present the impression of teamworking, but the production process is just as fragmented as under conventional procurement methods. In this way, the consortium forms a barrier between the client and the actual construction production process. Firms form consortia, not as a simple development of normal ways of working, but because the circumstances for specific projects make it a necessary vehicle. These circumstances include projects that are too large or too complex to undertake alone, or projects that require ongoing services which cannot be provided by the individual firms in-house. It is not a preferred way of working, because participants carry extra risk in the form of liability for the actions of their partners in the consortium. The behaviour of members of consortia is determined by their relative power, based on several factors, including financial commitment and ease of replacement. The level of supply chain visibility to the public sector client and to the industry is reduced by the existence of a consortium because the consortium forms an additional obstacle between the client and the firms undertaking the actual construction work. Supply chain visibility matters to the client, who otherwise loses control over the process of construction or service provision, while remaining accountable for cost overruns. To overcome this separation there is a convincing argument in favour of adopting the approach put forward in the Project Partnering Contract 2000 (PPC2000) Agreement. Members of consortia do not necessarily go on to work in the same consortia again because members need to respond flexibly to opportunities as and when they arise. Decision-making processes within consortia tend to be on an ad hoc basis. Construction risk is taken by the contractor and the construction supply chain, but the reputational risk is carried by all the firms associated with a consortium. There is a wide variation in the manner that consortia are formed, determined by the individual circumstances of each project: its requirements, size and complexity, and the attitude of individual project leaders. However, there are a number of close working relationships based on generic models of consortia-like arrangements for the purpose of building production, such as the Housing Corporation Guidance Notes and the PPC2000.

Relevance:

30.00%

Publisher:

Abstract:

Formal and analytical models that contractors can use to assess and price project risk at the tender stage have proliferated in recent years. However, they are rarely used in practice. Introducing more models would, therefore, not necessarily help. A better understanding is needed of how contractors arrive at a bid price in practice, and how, and in what circumstances, risk apportionment actually influences pricing levels. More than 60 risk models for contractors proposed in the journal literature were examined and classified. Exploratory interviews with five UK contractors, and documentary analyses of how contractors price work generally and risk specifically, were then carried out to compare the propositions from the literature with what contractors actually do. No comprehensive literature on the real bidding processes used in practice was found, and there is no evidence that pricing is systematic. Hence, systematic risk and pricing models for contractors may have no justifiable basis. Contractors process their bids through certain tendering gateways. They acknowledge the risk that they should price. However, the final settlement depends on a set of complex, micro-economic factors. Hence, risk accountability may be smaller than its true cost to the contractor. Risk apportionment occurs at three stages of the whole bid-pricing process. However, analytical approaches tend not to incorporate this, although they could.

Relevance:

30.00%

Publisher:

Abstract:

Real estate development appraisal is a quantification of future expectations. The appraisal model relies upon the valuer/developer having an understanding of the future in terms of the future marketability of the completed development and the future cost of development. In some cases the developer has some degree of control over the possible variation in the variables, as with the cost of construction through the choice of specification. However, other variables, such as the sale price of the final product, are totally dependent upon the vagaries of the market at the completion date. To try to address the risk of a different outcome to the one expected (modelled) the developer will often carry out a sensitivity analysis on the development. However, traditional sensitivity analysis has generally only looked at the best and worst scenarios and has focused on the anticipated or expected outcomes. This does not take into account uncertainty and the range of outcomes that can happen. A fuller analysis should include examination of the uncertainties in each of the components of the appraisal and account for the appropriate distributions of the variables. Similarly, as many of the variables in the model are not independent, the variables need to be correlated. This requires a standardised approach and we suggest that the use of a generic forecasting software package, in this case Crystal Ball, allows the analyst to work with an existing development appraisal model set up in Excel (or other spreadsheet) and to work with a predetermined set of probability distributions. Without a full knowledge of risk, developers are unable to determine the anticipated level of return that should be sought to compensate for the risk. This model allows the user a better understanding of the possible outcomes for the development. Ultimately the final decision will be made relative to current expectations and current business constraints, but by assessing the upside and downside risks more appropriately, the decision maker should be better placed to make a more informed and “better” decision.
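
The workflow described above couples an Excel appraisal model with Crystal Ball; as a minimal stand-in for the same idea, the hypothetical Python sketch below runs a Monte Carlo development appraisal with correlated sale price and build cost. All figures, distributions and the correlation are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # number of Monte Carlo trials

# Illustrative appraisal inputs (all values invented)
land_cost = 2.0e6          # fixed, known today
finance_rate = 0.07        # annual cost of finance on land and build
area = 3_000.0             # saleable area in m2

# Uncertain inputs: sale price per m2 and build cost per m2.
# Means, standard deviations and correlation are assumptions.
means = np.array([4_000.0, 2_200.0])
sds = np.array([400.0, 150.0])
corr = np.array([[1.0, 0.6],
                 [0.6, 1.0]])   # rising markets tend to lift both variables

cov = np.outer(sds, sds) * corr
draws = rng.multivariate_normal(means, cov, size=n)
sale_per_m2, build_per_m2 = draws[:, 0], draws[:, 1]

gdv = sale_per_m2 * area           # gross development value at completion
build = build_per_m2 * area
finance = (land_cost + 0.5 * build) * finance_rate * 1.5  # 18-month scheme

profit = gdv - build - land_cost - finance

# Summarise the distribution of outcomes rather than a single point estimate
print(f"mean profit: {profit.mean():,.0f}")
print(f"5th / 95th percentile: {np.percentile(profit, 5):,.0f} / "
      f"{np.percentile(profit, 95):,.0f}")
print(f"probability of loss: {(profit < 0).mean():.1%}")
```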

Relevance:

30.00%

Publisher:

Abstract:

This paper relates the key findings of the optimal economic enforcement literature to practical issues of enforcing forest and wildlife management access restrictions in developing countries. Our experiences, particularly from Tanzania and eastern India, provide detail of the key pragmatic issues facing those responsible for protecting natural resources. We identify large gaps in the theoretical literature that limit its ability to inform practical management, including issues of limited funding and cost recovery, multiple tiers of enforcement and the incentives facing enforcement officers, and conflict between protected area managers and rural people's needs.

Relevance:

30.00%

Publisher:

Abstract:

First, we survey recent research on the application of optimal tax theory to housing. This work suggests that the under-taxation of housing for owner occupation distorts investment, so that owner occupiers are encouraged to over-invest in housing. Simulations of the US economy suggest that this is the case there. However, the theoretical work excludes consideration of land, and the simulations exclude taxes other than income taxes. These exclusions are important for the US and UK economies. In the US, the property tax is relatively high. We argue that excluding the property tax is wrong: when the property tax is taken into account, owner-occupied housing is not under-taxed in the US. In the UK, property taxes are relatively low, but the cost of land has been increasing in real terms for forty years as a result of a policy of constraining land for development. The price of land for housing is now higher in the UK than elsewhere. Effectively, an implicit tax is paid by first-time buyers, which has reduced housing investment. When land is taken into account, over-investment in housing is not encouraged in the UK either.

Relevance:

30.00%

Publisher:

Abstract:

Purpose – Commercial real estate is a highly specific asset: heterogeneous, indivisible and with less information transparency than most other commonly held investment assets. These attributes encourage the use of intermediaries during asset acquisition and disposal. However, there are few attempts to explain the use of different brokerage models (with differing costs) in different markets. This study aims to address this gap.

Design/methodology/approach – The study analyses 9,338 real estate transactions in London and New York City from 2001 to 2011. Data are provided by Real Capital Analytics and cover over $450 billion of investments in this period. Brokerage trends in the two cities are compared and probit regressions are used to test whether the decision to transact with broker representation varies with investor or asset characteristics.

Findings – Results indicate greater use of brokerage in London, especially by purchasers. This persists when data are disaggregated by sector, time or investor type, pointing to the role of local market culture and institutions in shaping brokerage models and transaction costs. Within each city, the nature of the investors involved seems to be a more significant influence on broker use than the characteristics of the assets being traded.

Originality/value – Brokerage costs are the single largest non-tax charge to an investor when trading commercial real estate, yet there is little research in this area. This study examines the role of brokers and provides empirical evidence on factors that influence the use and mode of brokerage in two major investment destinations.
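
A minimal sketch of the kind of probit regression described above, assuming a hypothetical transactions file and illustrative covariate names; this is not the authors' actual RCA dataset or specification.

```python
# Hypothetical probit of the "transacted with a broker" decision on deal
# characteristics; file and variable names are illustrative assumptions.
import pandas as pd
import statsmodels.formula.api as smf

deals = pd.read_csv("transactions.csv")  # one row per transaction

# broker_used: 1 if the buyer was represented by a broker, 0 otherwise
model = smf.probit(
    "broker_used ~ log_price + cross_border + institutional_buyer"
    " + C(sector) + C(year)",
    data=deals,
)
result = model.fit()
print(result.summary())
print(result.get_margeff().summary())  # average marginal effects
```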

Relevance:

30.00%

Publisher:

Abstract:

This paper analyses the impact of trading costs on the profitability of momentum strategies in the United Kingdom and concludes that losers are more expensive to trade than winners. The observed asymmetry in the costs of trading winners and losers relates crucially to the high cost of selling loser stocks with small size and low trading volume. Since transaction costs severely erode net momentum profits, the paper defines a new low-cost relative-strength strategy by shortlisting, from all winner and loser stocks, those with the lowest total transaction costs. While the study seriously questions the profitability of standard momentum strategies, it concludes that there is still room for momentum-based return enhancement, should asset managers decide to adopt low-cost relative-strength strategies.
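
A hedged sketch of the shortlisting idea: rank stocks on formation-period return, then keep only the winners and losers that are cheapest to trade. Column names, the decile width and the cost measure are assumptions, not the paper's data or exact procedure.

```python
# Illustrative low-cost relative-strength screen over a cross-section of stocks.
import pandas as pd

def low_cost_momentum(panel: pd.DataFrame, decile: float = 0.1, keep: float = 0.5):
    """panel has one row per stock with columns:
    'ret_6m'   past six-month return (formation period)
    'cost_bps' estimated round-trip transaction cost in basis points
    Returns (winners, losers) shortlisted on lowest trading cost."""
    n = len(panel)
    ranked = panel.sort_values("ret_6m")
    losers = ranked.head(int(n * decile))    # worst past performers
    winners = ranked.tail(int(n * decile))   # best past performers
    # Shortlist the cheapest-to-trade names within each extreme decile
    winners = winners.nsmallest(int(len(winners) * keep), "cost_bps")
    losers = losers.nsmallest(int(len(losers) * keep), "cost_bps")
    return winners, losers
```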

Relevance:

30.00%

Publisher:

Abstract:

Last fall, a network of the European Cooperation in Science and Technology (COST), called “Basic Concepts for Convection Parameterization in Weather Forecast and Climate Models” (COST Action ES0905; see http://w3.cost.esf.org/index.php?id=205&action_number=ES0905), organized a 10-day training course on atmospheric convection and its parameterization. The aim of the workshop, held on the island of Brac, Croatia, was to help young scientists develop an in-depth understanding of the core theory underpinning convection parameterizations. The speakers also sought to impart an appreciation of the various approximations, compromises, and ansätze necessary to translate theory into operational practice for numerical models.

Relevance:

30.00%

Publisher:

Abstract:

This paper examines the determinants of cross-platform arbitrage profits. We develop a structural model that enables us to decompose the likelihood of an arbitrage opportunity into three distinct factors: the fixed cost to trade the opportunity, the extent to which one of the platforms delays a price update, and the impact of order flow on the quoted prices (inventory and asymmetric information effects). We then investigate the predictions of the theoretical model for the European bond market by estimating a probit model. Our main finding is that the empirical results strongly corroborate the predictions of the structural model. The occurrence of a cross-market arbitrage opportunity has a certain degree of predictability: the most favourable ex ante scenario combines low spreads on both platforms, a time of day close to the end of trading hours, and a high volume of trade.
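
As an illustration of the fixed-cost component in the decomposition above, the sketch below flags a cross-platform arbitrage opportunity when the best bid on one venue exceeds the best ask on the other by more than a fixed trading cost; the quotes and cost figure are invented for the example.

```python
# Minimal arbitrage check between two trading platforms.
from dataclasses import dataclass

@dataclass
class Quote:
    bid: float  # best bid price
    ask: float  # best ask price

def arbitrage_profit(a: Quote, b: Quote, fixed_cost: float) -> float:
    """Profit per unit from buying on the cheaper venue and selling on the
    other, net of the fixed cost; non-positive means no opportunity."""
    gross = max(a.bid - b.ask, b.bid - a.ask)
    return gross - fixed_cost

# Example: platform A quotes 99.98/100.02, platform B quotes 100.05/100.08
print(arbitrage_profit(Quote(99.98, 100.02), Quote(100.05, 100.08), 0.01))
# -> 0.02: B's bid (100.05) exceeds A's ask (100.02) by more than the fixed cost
```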

Relevance:

30.00%

Publisher:

Abstract:

This doctoral thesis is devoted to the study of financial instability and dynamics in monetary theory. It is shown that bank runs are eliminated at no cost in the standard model of banking theory when the population is not small. An extension is proposed in which aggregate uncertainty is more severe and the cost of financial stability is relevant. Finally, the optimality of transitions in the distribution of money is established for economies in which trading opportunities are scarce and heterogeneous; in particular, the optimality of inflation depends on the dynamic incentives provided by such transitions. Chapter 1 establishes the costless-stability result for large economies by studying the effects of population size in the Peck & Shell analysis of bank runs. In Chapter 2, the optimality of dynamics is studied in the Kiyotaki & Wright monetary model when society is able to implement an inflationary policy. Although it adopts a mechanism-design approach, this chapter parallels the analysis of Sargent & Wallace (1981) by highlighting the effects of dynamic incentives on the interaction between monetary and fiscal policy. Chapter 3 returns to the theme of financial stability by quantifying the costs involved in the optimal design of a run-proof banking sector and by proposing an alternative information structure that allows for insolvent banks. The first analysis shows that the optimal stability scheme exhibits high long-term interest rates, and the second shows that imperfect monitoring can lead to bank runs with insolvency.

Relevance:

30.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

30.00%

Publisher:

Abstract:

When the identification of crosscutting concerns is performed from the beginning of development, during the activities involved in requirements engineering, there are many gains in terms of quality, cost and efficiency throughout the software development lifecycle. This early identification supports the evolution of requirements, detects possible flaws in the requirements specification, improves traceability among requirements, provides better software modularity and prevents possible rework. However, despite these advantages, the identification of crosscutting concerns during requirements engineering faces several difficulties, such as the lack of systematization and of tools that support it. Furthermore, it is difficult to justify why some concerns are identified as crosscutting and others are not, since this identification is most often made without any methodology that systematizes and grounds it. In this context, this paper proposes an approach based on Grounded Theory, called GT4CCI, for systematizing and grounding the process of identifying crosscutting concerns in the requirements document during the initial stages of the software development process. Grounded Theory is a renowned methodology for the qualitative analysis of data. Through the use of GT4CCI it is possible to better understand, track and document concerns, adding gains in terms of quality, reliability and modularity throughout the entire software lifecycle.

Relevance:

30.00%

Publisher:

Abstract:

A body of research has developed within the context of nonlinear signal and image processing that deals with the automatic, statistical design of digital window-based filters. Based on pairs of ideal and observed signals, a filter is designed in an effort to minimize the error between the ideal and filtered signals. The goodness of an optimal filter depends on the relation between the ideal and observed signals, but the goodness of a designed filter also depends on the amount of sample data from which it is designed. In order to lessen the design cost, a filter is often chosen from a given class of filters, thereby constraining the optimization and increasing the error of the optimal filter. To a great extent, the problem of filter design concerns striking the correct balance between the degree of constraint and the design cost. From a different perspective and in a different context, the problem of constraint versus sample size has been a major focus of study within the theory of pattern recognition. This paper discusses the design problem for nonlinear signal processing, shows how the issue naturally transitions into pattern recognition, and then provides a review of salient related pattern-recognition theory. In particular, it discusses classification rules, constrained classification, the Vapnik-Chervonenkis theory, and implications of that theory for morphological classifiers and neural networks. The paper closes by discussing some design approaches developed for nonlinear signal processing, and how the nature of these naturally leads to a decomposition of the error of a designed filter into a sum of the following components: the Bayes error of the unconstrained optimal filter, the cost of constraint, the cost of reducing complexity by compressing the original signal distribution, the design cost, and the contribution of prior knowledge to a decrease in the error. The main purpose of the paper is to present fundamental principles of pattern recognition theory within the framework of active research in nonlinear signal processing.
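
A hedged rendering of that closing decomposition as an explicit sum; the symbols below are chosen for illustration and are not the authors' notation.

```latex
% Error of a designed filter \psi_n, written as the sum described in the abstract
\[
  \Delta(\psi_{n}) =
      \varepsilon_{\mathrm{Bayes}}       % error of the unconstrained optimal filter
    + \Delta_{\mathrm{constraint}}       % cost of constraining the filter class
    + \Delta_{\mathrm{compression}}      % cost of compressing the signal distribution
    + \Delta_{\mathrm{design}}(n)        % design cost from a finite sample of size n
    - \Delta_{\mathrm{prior}}            % decrease in error contributed by prior knowledge
\]
```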