15 results for critical path methods

in CentAUR: Central Archive at the University of Reading, UK


Relevance: 30.00%

Abstract:

Critical loads are the basis for policies controlling emissions of acidic substances in Europe. The implementation of these policies involves large expenditures, and it is reasonable for policymakers to ask what degree of certainty can be attached to the underlying critical load and exceedance estimates. This paper is a literature review of studies which attempt to estimate the uncertainty attached to critical loads. Critical load models and uncertainty analysis are briefly outlined. Most studies have used Monte Carlo analysis of some form to investigate the propagation of uncertainties in the definition of the input parameters through to uncertainties in critical loads. Though the input parameters are often poorly known, the critical load uncertainties are typically surprisingly small because of a "compensation of errors" mechanism. These results depend on the quality of the uncertainty estimates of the input parameters, and a "pedigree" classification for these is proposed. Sensitivity analysis shows that some input parameters are more important in influencing critical load uncertainty than others, but there have not been enough studies to form a general picture. Methods used for dealing with spatial variation are briefly discussed. Application of alternative models to the same site or modifications of existing models can lead to widely differing critical loads, indicating that research into the underlying science needs to continue.
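
Where the review describes Monte Carlo propagation of input uncertainty through a critical load model, a minimal sketch may help fix ideas. It assumes a simplified mass-balance form of the critical load of acidity with hypothetical parameter names, units and distributions; none of the values below come from the reviewed studies.

    import numpy as np

    rng = np.random.default_rng(42)
    n = 5000  # Monte Carlo sample size

    # Hypothetical input distributions (illustrative spreads only).
    weathering = rng.lognormal(np.log(50), 0.5, n)   # base cation weathering
    bc_deposition = rng.normal(30, 10, n)            # base cation deposition
    bc_uptake = rng.normal(20, 8, n)                 # base cation uptake
    anc_leaching = rng.normal(-10, 5, n)             # critical ANC leaching

    # Simplified critical load of acidity (mass-balance form):
    cl = weathering + bc_deposition - bc_uptake - anc_leaching

    # Independent input errors partially cancel in the sum, so the output's
    # coefficient of variation is smaller than most inputs' -- the
    # "compensation of errors" mechanism noted in the review.
    print(f"mean = {cl.mean():.1f}, CV = {cl.std() / cl.mean():.0%}")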

Relevance: 30.00%

Abstract:

This paper reports an uncertainty analysis of critical loads for acid deposition for a site in southern England, using the Steady State Mass Balance Model. The uncertainty bounds, distribution type and correlation structure for each of the 18 input parameters were considered explicitly, and overall uncertainty estimated by Monte Carlo methods. Estimates of deposition uncertainty were made from measured data and an atmospheric dispersion model, and hence the uncertainty in exceedance could also be calculated. The uncertainties of the calculated critical loads were generally much lower than those of the input parameters due to a "compensation of errors" mechanism - coefficients of variation ranged from 13% for CLmaxN to 37% for CL(A). With 1990 deposition, the probability that the critical load was exceeded was > 0.99; to reduce this probability to 0.50, a 63% reduction in deposition would be required; to reduce it to 0.05, an 82% reduction. With 1997 deposition, which was lower than that in 1990, exceedance probabilities declined and uncertainties in exceedance narrowed as deposition uncertainty had less effect. The parameters contributing most to the uncertainty in critical loads were weathering rates, base cation uptake rates, and the choice of critical chemical value, indicating possible research priorities. However, the different critical load parameters were to some extent sensitive to different input parameters. The application of such probabilistic results to environmental regulation is discussed.
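
A hedged sketch of the exceedance step described above: once Monte Carlo draws exist for both the critical load and deposition, the exceedance probability is the fraction of draws in which deposition exceeds the critical load, and the required deposition reduction can be found by scanning reduction fractions. All numbers are placeholders, not the paper's.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 5000

    critical_load = rng.normal(100, 37, n)  # hypothetical, CV ~37% as for CL(A)
    deposition = rng.normal(250, 40, n)     # hypothetical deposition and uncertainty

    print(f"P(exceedance) = {np.mean(deposition > critical_load):.2f}")

    # Smallest deposition reduction achieving a target exceedance probability.
    for target in (0.50, 0.05):
        for frac in np.linspace(0, 1, 101):
            if np.mean(deposition * (1 - frac) > critical_load) <= target:
                print(f"reduction for P <= {target}: {frac:.0%}")
                break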

Relevance: 30.00%

Abstract:

Critical loads are the basis for policies controlling emissions of acidic substances in Europe and elsewhere. They are assessed by several elaborate and ingenious models, each of which requires many parameters and has to be applied on a spatially distributed basis. Often the values of the input parameters are poorly known, calling into question the validity of the calculated critical loads. This paper attempts to quantify the uncertainty in the critical loads due to this "parameter uncertainty", using examples from the UK. Models used for calculating critical loads for deposition of acidity and nitrogen in forest and heathland ecosystems were tested at four contrasting sites. Uncertainty was assessed by Monte Carlo methods. Each input parameter or variable was assigned a value, range and distribution in as objective a fashion as possible. Each model was run 5000 times at each site using parameters sampled from these input distributions. Output distributions of various critical load parameters were calculated. The results were surprising. Confidence limits of the calculated critical loads were typically considerably narrower than those of most of the input parameters. This may be due to a "compensation of errors" mechanism. The range of possible critical load values at a given site is however rather wide, and the tails of the distributions are typically long. The deposition reductions required for a high level of confidence that the critical load is not exceeded are thus likely to be large. The implication for pollutant regulation is that requiring a high probability of non-exceedance is likely to carry high costs. The relative contribution of the input variables to critical load uncertainty varied from site to site: any input variable could be important, and thus it was not possible to identify variables as likely targets for research into narrowing uncertainties. Sites where a number of good measurements of input parameters were available had lower uncertainties, so use of in situ measurement could be a valuable way of reducing critical load uncertainty at particularly valuable or disputed sites. From a restricted number of samples, uncertainties in heathland critical loads appear comparable to those of coniferous forest, and nutrient nitrogen critical loads to those of acidity. It was important to include correlations between input variables in the Monte Carlo analysis, but the choice of statistical distribution type was of lesser importance. Overall, the analysis provided objective support for the continued use of critical loads in policy development.
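
The abstract notes that including correlations between input variables mattered. One standard way to do this in a Monte Carlo analysis is to sample correlated inputs from a multivariate normal built from a correlation matrix; the sketch below uses invented means, spreads and a 0.6 correlation purely for illustration.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 5000

    means = np.array([50.0, 20.0])      # e.g. weathering, uptake (hypothetical)
    stds = np.array([15.0, 6.0])
    corr = np.array([[1.0, 0.6],
                     [0.6, 1.0]])       # assumed correlation between the inputs
    cov = np.outer(stds, stds) * corr   # covariance from spreads and correlation

    weathering, uptake = rng.multivariate_normal(means, cov, n).T

    # For inputs entering the model with opposite signs, positive correlation
    # between them narrows the output spread relative to independent sampling.
    cl = weathering - uptake
    print(f"output SD with correlated inputs: {cl.std():.1f}")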

Relevance: 30.00%

Abstract:

The effects of the anomalously warm European summer of 2003 highlighted the importance of understanding the relationship between elevated atmospheric temperature and human mortality. This review extends the brief evidence on this relationship provided in the IPCC's Assessment Reports. A comprehensive and critical review of the literature is presented, which highlights avenues for further research and the respective merits and limitations of the methods used to analyse the relationships. In contrast to previous reviews that concentrate on the epidemiological evidence, this review acknowledges the inter-disciplinary nature of the topic and examines the evidence presented in epidemiological, environmental health, and climatological journals. Present temperature–mortality relationships are reviewed, followed by a discussion of how these are likely to change under climate change scenarios. The importance of uncertainty, and methods to include it in future work, are also considered.

Relevance: 30.00%

Abstract:

Answering many of the critical questions in conservation, development and environmental management requires integrating the social and natural sciences. However, understanding the array of available quantitative methods and their associated terminology presents a major barrier to successful collaboration. We provide an overview of quantitative socio-economic methods that distils their complexity into a simple taxonomy. We outline how each has been used in conjunction with ecological models to address questions relating to the management of socio-ecological systems. We review the application of social and ecological quantitative concepts to agro-ecology and classify the approaches used to integrate the two disciplines. Our review included all published integrated models from 2003 to 2008 in 27 journals that publish agricultural modelling research. Although our focus is on agro-ecology, many of the results are broadly applicable to other fields involving an interaction between human activities and ecology. We found 36 papers that integrated social and ecological concepts in a quantitative model. Four different approaches to integration were used, depending on the scale at which human welfare was quantified. Most models viewed humans as pure profit maximizers, both when calculating welfare and predicting behaviour. Synthesis and applications. We reached two main conclusions based on our taxonomy and review. The first is that quantitative methods that extend predictions of behaviour and measurements of welfare beyond a simple market value basis are underutilized by integrated models. The second is that the accuracy of prediction for integrated models remains largely unquantified. Addressing both problems requires researchers to reach a common understanding of modelling goals and data requirements during the early stages of a project.

Relevance: 30.00%

Abstract:

When assessing hypotheses, the possibility and consequences of false-positive conclusions should be considered along with the avoidance of false-negative ones. A recent assessment of the system of rice intensification (SRI) by McDonald et al. [McDonald, A.J., Hobbs, P.R., Riha, S.J., 2006. Does the system of rice intensification outperform conventional best management? A synopsis of the empirical record. Field Crops Res. 96, 31-36] provides a good example where this was not done, as it was preoccupied only with avoiding false positives. It concluded, based on a desk study using secondary data assembled selectively from diverse sources and with a 95% level of confidence, that 'best management practices' (BMPs) on average produce 11% higher rice yields than SRI methods, and that, therefore, SRI has little to offer beyond what is already known to scientists.
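
The authors' point about neglected false negatives can be made concrete with a small power calculation: fixing the false-positive rate at 5% determines the critical value, after which the false-negative rate depends on effect size and sample size. This generic one-sided z-test illustration uses invented numbers, not the McDonald et al. data.

    from scipy import stats

    alpha = 0.05    # tolerated false-positive (Type I) rate
    effect = 0.11   # hypothetical true yield difference (fractional)
    sd = 0.25       # hypothetical between-trial standard deviation
    n = 20          # hypothetical number of trials

    z_crit = stats.norm.ppf(1 - alpha)  # critical value set by alpha alone
    power = 1 - stats.norm.cdf(z_crit - effect / (sd / n ** 0.5))
    # The complement of power is the false-negative (Type II) rate,
    # which alpha-only reasoning ignores entirely.
    print(f"Type II error rate: {1 - power:.0%}")  # ~37% with these numbers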

Relevance: 30.00%

Abstract:

This paper formally restructures a path-based neural branch prediction algorithm (FPP) into blocks of size two to obtain a lower-cost hardware solution while maintaining input-output behaviour similar to the original algorithm. The blocked solution, here referred to as the B2P algorithm, is obtained using graph theory and retiming methods. Verification approaches were exercised to show that the prediction performances of the FPP and B2P algorithms differ by less than one misprediction per thousand instructions, using a known framework for branch prediction evaluation. For a chosen FPGA device, circuits generated from the B2P algorithm showed average area savings of over 25% against circuits for the FPP algorithm, with similar timing performance, thus making the proposed blocked predictor superior from a practical viewpoint.
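
For readers unfamiliar with the underlying technique, the following is a toy software sketch of a path-based perceptron branch predictor in the general style the paper builds on: a prediction sums a bias weight plus one weight selected by each branch address along the recent path, and weights are trained on the resolved outcome. It illustrates the idea only; the paper's FPP and B2P are hardware formulations with different indexing and pipelining.

    HISTORY = 8         # path/history length (illustrative)
    TABLE_SIZE = 256    # rows in the weight table
    THRESHOLD = 16      # training threshold

    # weights[row][j]: weight for the j-th most recent branch on the path
    weights = [[0] * (HISTORY + 1) for _ in range(TABLE_SIZE)]
    path = [0] * HISTORY      # addresses of recent branches
    history = [1] * HISTORY   # outcomes as +1 (taken) / -1 (not taken)

    def predict(pc):
        # Bias weight plus one weight per branch along the path.
        y = weights[pc % TABLE_SIZE][0]
        for j in range(HISTORY):
            y += weights[path[j] % TABLE_SIZE][j + 1] * history[j]
        return y  # predict taken if y >= 0

    def train(pc, taken, y):
        outcome = 1 if taken else -1
        # Update on a misprediction or a low-confidence correct prediction.
        if (y >= 0) != taken or abs(y) <= THRESHOLD:
            weights[pc % TABLE_SIZE][0] += outcome
            for j in range(HISTORY):
                weights[path[j] % TABLE_SIZE][j + 1] += outcome * history[j]
        path.insert(0, pc); path.pop()
        history.insert(0, outcome); history.pop()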

Relevance: 30.00%

Abstract:

This paper develops cycle-level FPGA circuits of an organization for a fast path-based neural branch predictor. Our results suggest that practical sizes of prediction tables are limited to around 32 KB to 64 KB in current FPGA technology, due mainly to the FPGA logic resources required to maintain the tables. However, the predictor scales well in terms of prediction speed. Table sizes alone should not be used as the only metric for hardware budget when comparing neural-based predictors to predictors of totally different organizations. This paper also gives early evidence for shifting attention to misprediction recovery latency, rather than prediction latency, as the most critical factor impacting prediction accuracy for this class of branch predictors.
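
A back-of-envelope check on the hardware-budget point: the weight-table storage of a perceptron-style predictor grows as rows × (history length + 1) × weight width. The parameter values below are hypothetical, chosen only to show the arithmetic.

    rows = 512           # table entries (hypothetical)
    history_len = 32     # weights per row, excluding the bias (hypothetical)
    bits_per_weight = 8  # signed weight width in bits (hypothetical)

    total_bits = rows * (history_len + 1) * bits_per_weight
    print(f"table size: {total_bits / 8 / 1024:.1f} KB")  # 16.5 KB here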

Relevance: 30.00%

Abstract:

Dual Carrier Modulation (DCM) was chosen as the higher data rate modulation scheme for MB-OFDM (Multiband Orthogonal Frequency Division Multiplexing) in the UWB (Ultra-Wide Band) radio platform ECMA-368. ECMA-368 has been chosen as the physical implementation for high data rate Wireless USB (W-USB) and Bluetooth 3.0. In this paper, different demapping methods for the DCM demapper are presented: Soft Bit, Maximum Likelihood (ML) Soft Bit and Log Likelihood Ratio (LLR). Frequency diversity and Channel State Information (CSI) are further techniques used to enhance these demapping methods. The system performance of these DCM demapping methods, simulated in realistic multipath environments, is presented and compared.
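
As a generic illustration of LLR demapping (not ECMA-368's actual DCM constellation, whose 16-point mapping spread across two carriers is more involved), per-bit log-likelihood ratios can be computed from Euclidean distances to constellation points under Gaussian noise, here in the common max-log form:

    import numpy as np

    def llr_demap(y, points, labels, noise_var):
        # Max-log LLR per bit for a received complex symbol y.
        # points: complex constellation; labels: matching bit tuples.
        # Positive LLR favours bit = 0. Illustrative sketch only.
        d2 = np.abs(y - points) ** 2 / noise_var
        llrs = []
        for i in range(len(labels[0])):
            m0 = min(d for d, b in zip(d2, labels) if b[i] == 0)
            m1 = min(d for d, b in zip(d2, labels) if b[i] == 1)
            llrs.append(m1 - m0)
        return llrs

    # Example: Gray-mapped QPSK.
    points = np.array([1 + 1j, -1 + 1j, -1 - 1j, 1 - 1j]) / np.sqrt(2)
    labels = [(0, 0), (0, 1), (1, 1), (1, 0)]
    print(llr_demap(0.9 + 0.8j, points, labels, noise_var=0.1))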

Relevance: 30.00%

Abstract:

This article summarises recent revisions to the investment development path (IDP) as postulated by Narula and Dunning (2010). The IDP provides a framework to understand the dynamic interaction between foreign direct investment (FDI) and economic development. The revisions take into account some recent changes in the global economic environment. This paper argues that studies based on the IDP should adopt a broader perspective, encompassing the idiosyncratic economic structure of countries as well as the heterogeneous nature of FDI. It is critical to understand the complex forces and interactions that determine the turning points in a country’s IDP, and to more explicitly acknowledge the role of historical, social and political circumstances in hindering or promoting FDI. We discuss some of the implications for Eastern European countries and provide some guidelines for future research.

Relevance: 30.00%

Abstract:

The paper reviews the leading diagramming methods employed in system dynamics to communicate the contents of models. The main ideas and historical development of the field are first outlined. Two diagramming methods—causal loop diagrams (CLDs) and stock/flow diagrams (SFDs)—are then described and their advantages and limitations discussed. A set of broad research directions is then outlined. These concern: the abilities of different diagrams to communicate different ideas, the role that diagrams have in group model building, and the question of whether diagrams can be an adequate substitute for simulation modelling. The paper closes by suggesting that although diagrams alone are insufficient, they have many benefits. However, since these benefits have emerged only as ‘craft wisdom’, a more rigorous programme of research into the diagrams' respective attributes is called for.
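
Since the paper asks whether diagrams can substitute for simulation, a minimal sketch of what simulating an SFD adds may be useful: one stock with an inflow and an outflow, integrated by Euler's method. This is a generic textbook example, not a model from the paper.

    # One-stock system dynamics model: population with birth and death flows.
    population = 1000.0                  # the stock
    birth_rate, death_rate = 0.03, 0.02  # fractional rates per year (illustrative)
    dt, years = 0.25, 20.0               # Euler time step and horizon

    t = 0.0
    while t < years:
        births = birth_rate * population   # reinforcing (positive) loop
        deaths = death_rate * population   # balancing (negative) loop
        population += (births - deaths) * dt
        t += dt

    print(f"population after {years:.0f} years: {population:.0f}")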

Relevance: 30.00%

Abstract:

Simultaneous scintillometer measurements at multiple wavelengths (pairing visible or infrared with millimetre or radio waves) have the potential to provide estimates of path-averaged surface fluxes of sensible and latent heat. Traditionally, the equations to deduce fluxes from measurements of the refractive index structure parameter at the two wavelengths have been formulated in terms of absolute humidity. Here, it is shown that formulation in terms of specific humidity has several advantages. Specific humidity satisfies the requirement for a conserved variable in similarity theory and inherently accounts for density effects misapportioned through the use of absolute humidity. The validity and interpretation of both formulations are assessed and the analogy with open-path infrared gas analyser density corrections is discussed. Original derivations using absolute humidity to represent the influence of water vapour are shown to misrepresent the latent heat flux. The errors in the flux, which depend on the Bowen ratio (larger for drier conditions), may be of the order of 10%. The sensible heat flux is shown to remain unchanged. It is also verified that use of a single scintillometer at optical wavelengths is essentially unaffected by these new formulations. Where it may not be possible to reprocess two-wavelength results, a density correction to the latent heat flux is proposed for scintillometry, which can be applied retrospectively to reduce the error.
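
The core of the two-wavelength method is a small linear inversion: each measured refractive-index structure parameter is a weighted combination of the temperature and humidity structure parameters, with wavelength-dependent coefficients. The sketch below shows the inversion with the temperature-humidity cross term neglected; the coefficients are placeholders, since the real ones depend on wavelength, pressure, temperature and humidity.

    import numpy as np

    # Measured refractive-index structure parameters (hypothetical values).
    cn2_optical = 5e-14   # visible/infrared scintillometer
    cn2_radio = 2e-13     # millimetre-wave scintillometer

    # Cn2 ~= a_T * CT2 + a_q * Cq2 per wavelength (cross term neglected).
    # Placeholder coefficients: optical Cn2 is temperature-dominated,
    # radio Cn2 is far more humidity-sensitive.
    A = np.array([[7.8e-7, 1.0e-9],
                  [6.0e-7, 5.0e-8]])
    b = np.array([cn2_optical, cn2_radio])

    ct2, cq2 = np.linalg.solve(A, b)  # temperature, humidity structure parameters
    print(f"CT2 = {ct2:.2e}, Cq2 = {cq2:.2e}")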

Relevance: 30.00%

Abstract:

From Milsom's equations, which describe the geometry of ray-path hops reflected from the ionospheric F-layer, algorithms for the simplified estimation of mirror-reflection height are developed. These allow for hop length and the effects of variations in underlying ionisation (via the ratio of the F2- and E-layer critical frequencies) and F2-layer peak height (via the M(3000)F2-factor). Separate algorithms are presented which are applicable to a range of signal frequencies about the FOT and to propagation at the MUF. The accuracies and complexities of the algorithms are compared with those inherent in the use of a procedure based on an equation developed by Shimazaki.
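
As background to the mirror-reflection model itself (not the paper's algorithms, which additionally parameterise the foF2/foE ratio and the M(3000)F2 factor), the basic spherical-Earth geometry linking hop length, mirror height and elevation angle can be sketched as follows:

    import math

    R = 6371.0  # mean Earth radius, km

    def elevation_angle(hop_km, mirror_height_km):
        # Elevation angle (degrees) for one hop reflected at a mirror
        # height, from standard spherical-Earth triangle geometry.
        half_arc = hop_km / (2 * R)  # half the hop's central angle, radians
        num = math.cos(half_arc) - R / (R + mirror_height_km)
        return math.degrees(math.atan2(num, math.sin(half_arc)))

    # Example: a 2000 km hop mirrored at 300 km gives roughly 11.8 degrees.
    print(f"{elevation_angle(2000, 300):.1f} degrees")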

Relevance: 30.00%

Abstract:

Dominant paradigms of causal explanation for why and how Western liberal-democracies go to war in the post-Cold War era remain versions of the 'liberal peace' or 'democratic peace' thesis. Yet such explanations have been shown to rest upon deeply problematic epistemological and methodological assumptions. Of equal importance, however, is the failure of these dominant paradigms to account for the 'neoliberal revolution' that has gripped Western liberal-democracies since the 1970s. The transition from liberalism to neoliberalism remains neglected in analyses of the contemporary Western security constellation. Arguing that neoliberalism can be understood simultaneously through the Marxian concept of ideology and the Foucauldian concept of governmentality – that is, as a complementary set of 'ways of seeing' and 'ways of being' – the thesis goes on to analyse British security in policy and practice, considering it as an instantiation of a wider neoliberal way of war. In so doing, the thesis draws upon, but also challenges and develops, established critical discourse analytic methods, incorporating within its purview not only the textual data that is usually considered by discourse analysts, but also material practices of security. This analysis finds that contemporary British security policy is predicated on a neoliberal social ontology, morphology and morality – an ideology or 'way of seeing' – focused on the notion of a globalised 'network-market', and is aimed at rendering circulations through this network-market amenable to neoliberal techniques of government. It is further argued that security practices shaped by this ideology imperfectly and unevenly achieve the realisation of neoliberal 'ways of being' – especially modes of governing self and other or the 'conduct of conduct' – and the re-articulation of subjectivities in line with neoliberal principles of individualism, risk, responsibility and flexibility. The policy and practice of contemporary British 'security' is thus recontextualised as a component of a broader 'neoliberal way of war'.

Relevance: 30.00%

Abstract:

Elucidating the biological and biochemical roles of proteins, and subsequently determining their interacting partners, can be difficult and time consuming using in vitro and/or in vivo methods, and consequently the majority of newly sequenced proteins will have unknown structures and functions. However, in silico methods for predicting protein–ligand binding sites and protein biochemical functions offer an alternative practical solution. The characterisation of protein–ligand binding sites is essential for investigating new functional roles, which can impact the major biological research spheres of health, food, and energy security. In this review we discuss the role in silico methods play in 3D modelling of protein–ligand binding sites, along with their role in predicting biochemical functionality. In addition, we describe in detail some of the key alternative in silico prediction approaches that are available, as well as discussing the Critical Assessment of Techniques for Protein Structure Prediction (CASP) and the Continuous Automated Model EvaluatiOn (CAMEO) projects, and their impact on developments in the field. Furthermore, we discuss the importance of protein function prediction methods for tackling 21st century problems.