35 results for Average Case Complexity
Abstract:
Purpose – The purpose of this paper is to focus on the Fédération Internationale des Ingénieurs-Conseils (FIDIC) White Book standard form of building contract. It tracks the changes to this contract over its four editions, and seeks to identify their underlying causes. Design/methodology/approach – The changes made to the White Book are quantified using a specific type of quantitative content analysis. The amended clauses are then examined to understand the nature of the changes made. Findings – The length of the contract increased by 34 per cent between 1990 and 2006. A large proportion of the overall increase can be attributed to the clauses dealing with “conflict of interest/corruption” and “dispute resolution”. In both instances, the FIDIC drafting committees have responded to international developments to discourage corruption, and to encourage the use of alternative dispute resolution. Between 1998 and 2006, the average length of the sentences increased slightly, raising the question of whether long sentences are easily understood by users of contracts. Research limitations/implications – Quantification of text appears to be particularly useful for the analysis of documents which are regularly updated, because changes can be clearly identified and the length of sentences can be determined, leading to conclusions about the readability of the text. However, caution is needed because changes of great relevance can be made to contract clauses without actually affecting their length. Practical implications – The paper will be instructive for contract drafters and informative for users of FIDIC's White Book. Originality/value – Quantifying text has rarely been used for standard-form contracts in the field of construction.
Abstract:
The present paper investigates pesticide application types adopted by smallholder potato producers in the Department of Boyacá, Colombia. In this region, adverse environmental, health and economic effects due to pesticide misuse and overuse have been observed. First, pesticide application types were identified based on input-effectiveness. Second, their determinants of adoption were investigated. Finally, suggestions were given to develop intervention options for a transition towards more sustainable pesticide use. Three application types were identified for fungicides and insecticides. The types differed in terms of input (intensity of pesticide application), effect (damage control), frequency of application, average quantity applied per application, chemical class, and productivity. The determinants of the different pesticide application types were then investigated with a multinomial logistic regression approach, applying the integrative agent-centred (IAC) framework. The area of the plot, attendance at training sessions, and educational and income levels were among the most relevant determinants. The analysis suggested that better pesticide use could be fostered to reduce pesticide-related risks in the region. Intervention options that may help in targeting this issue were outlined. They aim not only at educating farmers but also at changing their social and institutional context, by involving other agents of the agricultural system (e.g. pesticide producers), facilitating new institutional settings (e.g. cooperatives) and targeting social dynamics (e.g. conformity to social norms).
Abstract:
Complexity is integral to planning today. Everyone and everything seem to be interconnected, causality appears ambiguous, unintended consequences are ubiquitous, and information overload is a constant challenge. The nature of complexity, the consequences of it for society, and the ways in which one might confront it, understand it and deal with it in order to allow for the possibility of planning, are issues increasingly demanding analytical attention. One theoretical framework that can potentially assist planners in this regard is Luhmann's theory of autopoiesis. This article uses insights from Luhmann's ideas to understand the nature of complexity and its reduction, thereby redefining issues in planning, and explores the ways in which management of these issues might be observed in actual planning practice via a reinterpreted case study of the People's Planning Campaign in Kerala, India. Overall, this reinterpretation leads to a different understanding of the scope of planning and planning practice, telling a story about complexity and systemic response. It allows the reinterpretation of otherwise familiar phenomena, both highlighting the empirical relevance of the theory and providing new and original insight into particular dynamics of the case study. This not only provides a greater understanding of the dynamics of complexity, but also produces advice to help planners implement structures and processes that can cope with complexity in practice.
Abstract:
Forest soils account for a large part of the stable carbon pool held in terrestrial ecosystems. Future levels of atmospheric CO2 are likely to increase C input into the soils through increased above- and below-ground production of forests. This increased input will result in greater sequestration of C only if the additional C enters stable pools. In this review, we compare current observations from four large-scale Free-Air CO2 Enrichment (FACE) experiments on forest ecosystems (EuroFACE, Aspen-FACE, Duke FACE and ORNL-FACE) and consider their predictive power for long-term C sequestration. At all sites, FACE increased fine root biomass, and in most cases higher fine root turnover resulted in higher C input into soil via root necromass. However, at all sites, soil CO2 efflux also increased in excess of the increased root necromass inputs. A mass balance calculation suggests that a large part of the stimulation of soil CO2 efflux may be due to increased root respiration. Given the duration of these experiments compared with the life cycle of a forest and the complexity of the processes involved, it is not yet possible to predict whether elevated CO2 will result in increased C storage in forest soil.
Abstract:
At a time when cities are competing with one another to attract or retain jobs within a globalizing economy, city governments are providing an array of financial incentives to stimulate job growth and retain existing jobs, particularly in high-cost locations. This paper provides the first systematic and comprehensive analysis of datasets on economic development incentives in New York City over the last fifteen years. The evidence on job retention and creation is mixed. Although many companies do not meet their agreed-upon job targets in absolute terms, the evidence suggests that companies receiving subsidies outperform their respective industries in terms of employment growth; that is, they grow more, or decline less. We emphasize that this finding is difficult to interpret, since firms receiving incentives may not be representative of the industry as a whole. In other words, their above-average performance may simply reflect the fact that the Economic Development Corporation (EDC) selects economically promising companies within manufacturing (or other industries) when granting incentives. At the same time, it is also possible that receiving incentives helps these companies to become stronger.
Abstract:
This paper investigates the potential benefits and limitations of equal- and value-weighted diversification, using the UK institutional property market as an example. To achieve this it uses the largest sample (392) of actual property returns currently available, over the period 1981 to 1996. To evaluate these issues, two approaches are adopted: first, an analysis of the correlations within the sectors and regions; and second, simulations of property portfolios of increasing size, constructed both naively and with value-weighting. Using these methods it is shown that the extent of possible risk reduction is limited because of the high positive correlations between assets in any portfolio, even when naively diversified. It is also shown that portfolios exhibit high levels of variability around the average risk, suggesting that previous work seriously understates the number of properties needed to achieve a satisfactory level of diversification. The results have implications for the development and maintenance of a property portfolio because they indicate that the achievable level of risk reduction depends upon the availability of assets, the weighting system used and the investor's risk tolerance.
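The limit described above (risk reduction capped by high positive correlations, however naive the diversification) can be illustrated with a toy calculation. The sketch below is not the paper's simulation: it assumes a hypothetical universe of identical, equicorrelated assets (the volatility and correlation values are invented), for which the variance of an equally weighted portfolio has a simple closed form.

```python
import numpy as np

# Hypothetical parameters (not from the paper): every asset has 10%
# volatility and every pair of assets has correlation 0.5.
sigma, rho = 0.10, 0.5

def naive_portfolio_risk(n):
    """Standard deviation of an equally weighted portfolio of n
    identical, equicorrelated assets:
    var = sigma^2 / n + (1 - 1/n) * rho * sigma^2."""
    return np.sqrt(sigma**2 / n + (1 - 1/n) * rho * sigma**2)

# Diversification limit: as n grows, portfolio variance tends to the
# average covariance rho * sigma^2, not to zero.
floor = np.sqrt(rho) * sigma

for n in (1, 10, 50, 392):   # 392 = sample size used in the paper
    print(n, round(naive_portfolio_risk(n), 4))
print("floor", round(floor, 4))
```

As n grows the idiosyncratic term vanishes, so risk converges not to zero but to the average covariance, which is why adding properties beyond a point buys little further diversification when correlations are high.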
Abstract:
More than 30 epiphytic lichens, collected in Agadir (Morocco) and along a 150-km transect from the Atlantic Ocean eastward, were analyzed for their metal content and lead isotopic composition. This dataset was used to evaluate atmospheric metal contamination and the impact of the city on the surrounding area. The concentrations of Cu, Pb, and Zn (average ± 1 SD) were 20.9 ± 15.2 μg g−1, 13.8 ± 9.0 μg g−1, and 56.6 ± 26.6 μg g−1, respectively, with the highest values observed in lichens collected within the urban area. The 206Pb/207Pb and 208Pb/207Pb ratios in the lichens varied from 1.146 to 1.186 and from 2.423 to 2.460, respectively. Alkyllead-gasoline sold in Morocco by the major petrol companies gave isotopic ratios of 206Pb/207Pb = 1.076–1.081 and 208Pb/207Pb = 2.348–2.360. These new, homogeneous values for gasoline-derived lead improve and update the scarce isotopic database of potential lead sources in Morocco, and may be of great value to future environmental surveys on the presence of lead in natural reservoirs, where it persists over time (e.g., soils and sediments). The value of normalizing metal concentrations in lichens to the concentrations of a lithogenic element is demonstrated by the consistency of the results thus obtained with the lead isotopic ratios. Leaded gasoline contributed less than 50% of the total amount of lead accumulated in lichens, even in areas subject to high vehicular traffic. This strongly suggests that the recent ban on leaded gasoline in Morocco will not trigger a drastic improvement in air quality, at least in Agadir.
Abstract:
It is thought that speciation in phytophagous insects is often due to colonization of novel host plants, because radiations of plant and insect lineages are typically asynchronous. Recent phylogenetic comparisons have supported this model of diversification for both insect herbivores and specialized pollinators. An exceptional case where contemporaneous plant-insect diversification might be expected is the obligate mutualism between fig trees (Ficus species, Moraceae) and their pollinating wasps (Agaonidae, Hymenoptera). The ubiquity and ecological significance of this mutualism in tropical and subtropical ecosystems has long intrigued biologists, but the systematic challenge posed by >750 interacting species pairs has hindered progress toward understanding its evolutionary history. In particular, taxon sampling and analytical tools have been insufficient for large-scale co-phylogenetic analyses. Here, we sampled nearly 200 interacting pairs of fig and wasp species from across the globe. Two supermatrices were assembled: on average, wasps had sequences from 77% of six genes (5.6 kb), figs had sequences from 60% of five genes (5.5 kb), and overall 850 new DNA sequences were generated for this study. We also developed a new analytical tool, Jane 2, for event-based phylogenetic reconciliation analysis of very large data sets. Separate Bayesian phylogenetic analyses for figs and fig wasps under relaxed molecular clock assumptions indicate Cretaceous diversification of crown groups and contemporaneous divergence for nearly half of all fig and pollinator lineages. Event-based co-phylogenetic analyses further support the co-diversification hypothesis. Biogeographic analyses indicate that the present-day distribution of fig and pollinator lineages is consistent with a Eurasian origin and subsequent dispersal, rather than with Gondwanan vicariance. Overall, our findings indicate that the fig-pollinator mutualism represents an extreme case among plant-insect interactions of coordinated dispersal and long-term co-diversification.
Abstract:
This article examines selected methodological insights that complexity theory might provide for planning. In particular, it focuses on the concept of fractals and, through this concept, how ways of organising policy domains across scales might have particular causal impacts. The aim of this article is therefore twofold: (a) to position complexity theory within social science through a ‘generalised discourse’, thereby orienting it to particular ontological and epistemological biases and (b) to reintroduce a comparatively new concept – fractals – from complexity theory in a way that is consistent with the ontological and epistemological biases argued for, and to expand on the contribution that this might make to planning. Complexity theory is theoretically positioned as a neo-systems theory, with reasons elaborated. Fractal systems from complexity theory are systems that exhibit self-similarity across scales. This concept (as previously introduced by the author in ‘Fractal spaces in planning and governance’) is further developed in this article (a) to illustrate the ontological and epistemological claims for complexity theory, and (b) to draw attention to ways of organising policy systems across scales so as to emphasise certain characteristics of the systems – certain distinctions. These distinctions, when repeated across scales, reinforce associated processes/values/end goals, resulting in particular policy outcomes. Finally, empirical insights from two case studies in two different policy domains are presented and compared to illustrate the workings of fractals in planning practice.
Abstract:
The Minneapolis Domestic Violence Experiment (MDVE) is a randomized social experiment with imperfect compliance that has been extremely influential in shaping how police officers respond to misdemeanor domestic violence. This paper re-examines data from the MDVE, using the recent literature on partial identification to bound the recidivism associated with a policy of arresting misdemeanor domestic violence suspects rather than not arresting them. Using partially identified bounds on the average treatment effect, I find that arresting rather than not arresting suspects can potentially reduce recidivism by more than two-and-a-half times the corresponding intent-to-treat estimate and more than two times the corresponding local average treatment effect, even when making minimal assumptions on counterfactuals.
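The bounding idea mentioned above can be illustrated with the simplest construction from the partial-identification literature: worst-case ("no-assumption") bounds on an average treatment effect for a bounded outcome. This is only a sketch of the general technique, not the paper's estimator (which additionally exploits the experiment's random assignment to narrow the bounds); the recidivism data below are invented for illustration.

```python
import numpy as np

def no_assumption_bounds(y, d):
    """Worst-case bounds on the average treatment effect
    E[Y(1) - Y(0)] for a binary outcome y (1 = recidivated), given
    observed treatment d (1 = arrested). The only assumption is that
    the unobserved counterfactual outcomes lie in [0, 1]."""
    y = np.asarray(y, dtype=float)
    d = np.asarray(d, dtype=int)
    p = d.mean()                  # share of arrested suspects
    y1 = y[d == 1].mean()         # observed recidivism among arrested
    y0 = y[d == 0].mean()         # observed recidivism among not arrested
    ey1 = (p * y1, p * y1 + (1 - p))          # bounds on E[Y(1)]
    ey0 = ((1 - p) * y0, (1 - p) * y0 + p)    # bounds on E[Y(0)]
    return ey1[0] - ey0[1], ey1[1] - ey0[0]

# Invented data: 4 arrested suspects, 4 not arrested.
y = [0, 1, 0, 0, 1, 1, 0, 1]
d = [1, 1, 1, 1, 0, 0, 0, 0]
lo, hi = no_assumption_bounds(y, d)
print(lo, hi)   # the bounds always have width 1 without further assumptions
```

Narrowing these width-one bounds is exactly where the identifying power of random assignment and the additional assumptions discussed in the paper come in.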
Abstract:
This article reviews the use of complexity theory in planning theory using the theory of metaphors for theory transfer and theory construction. The introduction to the article presents the author's positioning of planning theory. The first section thereafter provides a general background of the trajectory of development of complexity theory and discusses the rationale of using the theory of metaphors for evaluating the use of complexity theory in planning. The second section introduces the workings of metaphors in general and theory-constructing metaphors in particular, drawing out an understanding of how to proceed with an evaluative approach towards an analysis of the use of complexity theory in planning. The third section presents two case studies – reviews of two articles – to illustrate how the framework might be employed. It then discusses the implications of the evaluation for the question ‘can complexity theory contribute to planning?’ The concluding section discusses the employment of the ‘theory of metaphors’ for evaluating theory transfer and draws out normative suggestions for engaging in theory transfer using the metaphorical route.
Abstract:
The nonlinearity of high-power amplifiers (HPAs) has a crucial effect on the performance of multiple-input-multiple-output (MIMO) systems. In this paper, we investigate the performance of MIMO orthogonal space-time block coding (OSTBC) systems in the presence of nonlinear HPAs. Specifically, we propose a constellation-based compensation method for HPA nonlinearity for the case where the HPA parameters are known at the transmitter and receiver, where the constellation and decision regions of the distorted transmitted signal are derived in advance. Furthermore, for the scenario where the HPA parameters are unknown, a sequential Monte Carlo (SMC)-based compensation method for the HPA nonlinearity is proposed, which first estimates the channel-gain matrix by means of the SMC method and then uses the SMC-based algorithm to detect the desired signal. The performance of the MIMO-OSTBC system under study is evaluated in terms of average symbol error probability (SEP), total degradation (TD) and system capacity, in uncorrelated Nakagami-m fading channels. Numerical and simulation results show how several system parameters affect performance, such as the parameters of the HPA model, the output back-off (OBO) of the nonlinear HPA, the numbers of transmit and receive antennas, the modulation order of the quadrature amplitude modulation (QAM), and the number of SMC samples. In particular, it is shown that the constellation-based compensation method can efficiently mitigate the effect of HPA nonlinearity with low complexity and that the SMC-based detection scheme efficiently compensates for HPA nonlinearity when the HPA parameters are unknown.
Abstract:
The aim of this study is to assess the characteristics of the hot and cold IPO markets on the Stock Exchange of Mauritius (SEM). The results show that the hot issues exhibit, on average, a greater degree of underpricing than the cold issues, although the hot issue phenomenon is not a significant driving force in explaining this short-run underpricing. The results are consistent with the predictions of the changing risk composition hypothesis in suggesting that firms going public during hot markets are on average relatively more risky. The findings also support the time adverse selection hypothesis in that the firms’ quality dispersion is statistically different between hot and cold markets. Finally, the study concludes that firms which go public during hot markets do not underperform those going public in cold markets over the longer term.
Abstract:
The local speeds of object contours vary systematically with the cosine of the angle between the normal component of the local velocity and the global object motion direction. An array of Gabor elements whose speed changes with local spatial orientation in accordance with this pattern can appear to move as a single surface. The apparent direction of motion of plaids and Gabor arrays has variously been proposed to result from feature tracking, vector addition and vector averaging, in addition to the geometrically correct global velocity indicated by the intersection of constraints (IOC) solution. Here, a new combination rule, the harmonic vector average (HVA), is introduced, together with a new algorithm for computing the IOC solution. The vector sum can be discounted as an integration strategy because its magnitude grows with the number of elements. The vector average over local vectors that vary in direction always underestimates the true global speed. The HVA, however, provides the correct global speed and direction for an unbiased sample of local velocities with respect to the global motion direction, as is the case for a simple closed contour. The HVA over biased samples provides an aggregate velocity estimate that can still be combined through an IOC computation to give an accurate estimate of the global velocity, which is not true of the vector average. Psychophysical results for type II Gabor arrays show that perceived direction and speed fall close to the IOC solution for arrays with a wide range of orientations, but the IOC prediction fails as the mean orientation shifts away from the global motion direction and the orientation range narrows. In this case, perceived velocity generally defaults to the HVA.
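The HVA's key property (recovering the true global velocity from an unbiased sample of local normal velocities, where the plain vector average underestimates the speed) is easy to check numerically. The sketch below assumes one standard construction of the harmonic vector average, namely inverting each local vector's length while keeping its direction, taking the ordinary vector average, then inverting the length back; the global velocity and orientation sample are made up for illustration.

```python
import numpy as np

def harmonic_vector_average(vectors):
    """HVA: invert each vector's length (v -> v / |v|^2), take the
    ordinary vector average, then invert the length back."""
    v = np.asarray(vectors, dtype=float)
    inv = v / (v**2).sum(axis=1, keepdims=True)   # v / |v|^2
    m = inv.mean(axis=0)
    return m / (m**2).sum()                       # invert length back

# Made-up global object velocity: 2 units/s rightward.
vg = np.array([2.0, 0.0])

# Local normal velocities for an unbiased (symmetric) set of contour
# orientations: u = (vg . n) n, so local speed = |vg| * cos(angle).
angles = np.deg2rad([-60, -30, 0, 30, 60])
normals = np.stack([np.cos(angles), np.sin(angles)], axis=1)
local_vels = (normals @ vg)[:, None] * normals

hva = harmonic_vector_average(local_vels)   # recovers vg
va = local_vels.mean(axis=0)                # underestimates the speed
print(hva, np.linalg.norm(va))
```

The recovery works because each inverted local vector has the same component 1/|vg| along the global direction, while the orthogonal components cancel for a symmetric sample, so the averaged inverse maps back exactly onto the global velocity.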
Abstract:
This study examines the long-run performance of initial public offerings on the Stock Exchange of Mauritius (SEM). The results show that the 3-year equally weighted cumulative adjusted returns average −16.5%. The magnitude of this underperformance is consistent with most reported studies in different developed and emerging markets. Based on multivariate regression models, firms with small issues and higher ex ante financial strength seem on average to experience greater long-run underperformance, supporting the divergence of opinion and overreaction hypotheses. On the other hand, Mauritian firms do not on average time their offerings to lower their cost of capital and, as such, there seems to be limited support for the windows of opportunity hypothesis.