47 results for Risk Analysis, Security Models, Counter Measures, Threat Networks
in CentAUR: Central Archive University of Reading - UK
Abstract:
Locality to other nodes on a peer-to-peer overlay network can be established by means of a set of landmarks shared among the participating nodes. Each node independently collects a set of latency measures to the landmark nodes, which are used as a multi-dimensional feature vector. Each peer node uses the feature vector to generate a unique scalar index that is correlated with its topological locality. A popular dimensionality reduction technique is the space-filling Hilbert curve, as it possesses good locality-preserving properties. However, there is little comparison between the Hilbert curve and other techniques for dimensionality reduction. This work carries out a quantitative analysis of their properties. Linear and non-linear techniques for scaling the landmark vectors to a single dimension are investigated. The Hilbert curve, Sammon's mapping and Principal Component Analysis have been used to generate a 1D space with locality-preserving properties. This work provides empirical evidence to support the use of the Hilbert curve in the context of locality preservation when generating peer identifiers by means of landmark vector analysis. A comparative analysis is carried out with an artificial 2D network model and with a realistic network topology model with the typical power-law distribution of node connectivity in the Internet. Nearest-neighbour analysis confirms the Hilbert curve to be very effective in both artificial and realistic network topologies. Nevertheless, the results in the realistic network model show that there is scope for improvement, and better techniques for preserving locality information are required.
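As a concrete illustration of the locality-preserving mapping discussed above, the sketch below converts a 2D point into a scalar Hilbert-curve index using the classic bit-manipulation algorithm. This is an assumed, generic implementation (not the paper's code), and the landmark vectors are reduced to two dimensions purely for readability:

```python
def hilbert_index(order, x, y):
    """Map a point (x, y) on a 2^order x 2^order grid to its
    distance along the Hilbert curve (the scalar locality index)."""
    n = 1 << order
    d = 0
    s = n // 2
    while s > 0:
        rx = 1 if (x & s) > 0 else 0
        ry = 1 if (y & s) > 0 else 0
        d += s * s * ((3 * rx) ^ ry)
        # Rotate/reflect the quadrant so each sub-curve is oriented consistently.
        if ry == 0:
            if rx == 1:
                x = s - 1 - x
                y = s - 1 - y
            x, y = y, x
        s //= 2
    return d

# Peers whose (illustrative) 2D landmark coordinates are close on the grid
# tend to receive nearby scalar identifiers.
peer_id = hilbert_index(4, 3, 5)
```

The locality-preserving property the abstract relies on is visible here: consecutive indices along the curve always correspond to grid-adjacent points, so small index distances imply small topological distances far more often than with a row-major scan.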
Abstract:
We quantify the risks of climate-induced changes in key ecosystem processes during the 21st century by forcing a dynamic global vegetation model with multiple scenarios from 16 climate models and mapping the proportions of model runs showing forest/nonforest shifts or exceedance of natural variability in wildfire frequency and freshwater supply. Our analysis does not assign probabilities to scenarios or weights to models. Instead, we consider distribution of outcomes within three sets of model runs grouped by the amount of global warming they simulate: <2°C (including simulations in which atmospheric composition is held constant, i.e., in which the only climate change is due to greenhouse gases already emitted), 2–3°C, and >3°C. High risk of forest loss is shown for Eurasia, eastern China, Canada, Central America, and Amazonia, with forest extensions into the Arctic and semiarid savannas; more frequent wildfire in Amazonia, the far north, and many semiarid regions; more runoff north of 50°N and in tropical Africa and northwestern South America; and less runoff in West Africa, Central America, southern Europe, and the eastern U.S. Substantially larger areas are affected for global warming >3°C than for <2°C; some features appear only at higher warming levels. A land carbon sink of ≈1 Pg of C per yr is simulated for the late 20th century, but for >3°C this sink converts to a carbon source during the 21st century (implying a positive climate feedback) in 44% of cases. The risks continue increasing over the following 200 years, even with atmospheric composition held constant.
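The risk mapping described above amounts to counting, per grid cell, the fraction of ensemble members within a warming band that show a given change. A minimal sketch of that calculation, using synthetic data and an assumed array layout (not the authors' model output):

```python
import numpy as np

def risk_fraction(change, threshold):
    """Fraction of model runs (axis 0) in which the simulated change
    exceeds the threshold, computed per grid cell."""
    return (change > threshold).mean(axis=0)

# Synthetic ensemble: 10 runs over a 3 x 3 grid of cells.
rng = np.random.default_rng(0)
ensemble = rng.normal(loc=0.0, scale=1.0, size=(10, 3, 3))

# "Exceedance of natural variability" reduced to a fixed threshold here.
risk = risk_fraction(ensemble, threshold=2.0)
```

Grouping the runs by simulated global warming (<2°C, 2-3°C, >3°C) before applying this reduction reproduces the paper's framing: distributions of outcomes per warming band, with no probabilities assigned to scenarios and no weights assigned to models.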
Abstract:
Three main changes to current risk analysis processes are proposed to improve their transparency, openness, and accountability. First, the addition of a formal framing stage would allow interested parties, experts and officials to work together as needed to gain an initial shared understanding of the issue, the objectives of regulatory action, and alternative risk management measures. Second, the scope of the risk assessment is expanded to include the assessment of health and environmental benefits as well as risks, and the explicit consideration of the economic and social impacts of risk management action and their distribution. Moreover, approaches were developed for deriving improved information from genomic, proteomic and metabolomic profiling methods and for probabilistic modelling of health impacts for risk assessment purposes. Third, in an added evaluation stage, interested parties, experts, and officials may compare and weigh the risks, costs, and benefits and their distribution. As part of a set of recommendations on risk communication, we propose that reports on each stage should be made public.
Abstract:
The method of entropy has been useful in evaluating inconsistency in human judgments. This paper applies an entropy-based decision support system, called e-FDSS, to the solution of multicriterion risk and decision analysis in projects of construction small and medium enterprises (SMEs). It is optimized and solved by fuzzy logic, entropy, and genetic algorithms. A case study demonstrated the use of entropy in e-FDSS for analyzing multiple risk criteria in the predevelopment stage of SME projects. Survey data studying the degree of impact of selected project risk criteria on different projects were input into the system in order to evaluate the pre-identified project risks in an impartial environment. The results showed that, when the uncertainty embedded in the evaluation process is not taken into account, all decision vectors are indeed full of bias; the deviations of decisions are then quantified, providing a more objective decision and risk assessment profile to project stakeholders in order to search for and screen the most profitable projects.
Abstract:
Constrained principal component analysis (CPCA) with a finite impulse response (FIR) basis set was used to reveal functionally connected networks and their temporal progression over a multistage verbal working memory trial in which memory load was varied. Four components were extracted, and all showed statistically significant sensitivity to the memory load manipulation. Additionally, two of the four components sustained this peak activity, both for approximately 3 s (Components 1 and 4). The functional networks that showed sustained activity were characterized by increased activations in the dorsal anterior cingulate cortex, right dorsolateral prefrontal cortex, and left supramarginal gyrus, and decreased activations in the primary auditory cortex and "default network" regions. The functional networks that did not show sustained activity were instead dominated by increased activation in occipital cortex, dorsal anterior cingulate cortex, sensori-motor cortical regions, and superior parietal cortex. The response shapes suggest that although all four components appear to be invoked at encoding, the two sustained-peak components are likely to be additionally involved in the delay period. Our investigation provides a unique view of the contributions made by a network of brain regions over the course of a multiple-stage working memory trial.
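A hedged sketch of the constrained PCA idea used above (a generic implementation, not the authors' code): the data are first regressed onto a design matrix — here a random stand-in for the FIR basis set — and the singular value decomposition is then applied to the predicted portion, so the extracted components span only design-related (task-timed) variance:

```python
import numpy as np

def cpca(Y, G, n_components):
    """Constrained PCA: regress data Y (time x variables) on design
    matrix G (time x predictors), then decompose the predicted scores
    G @ B so components reflect only design-related variance."""
    B = np.linalg.lstsq(G, Y, rcond=None)[0]    # least-squares weights
    GB = G @ B                                   # variance constrained to G
    U, s, Vt = np.linalg.svd(GB, full_matrices=False)
    return U[:, :n_components] * s[:n_components], Vt[:n_components]

# Synthetic stand-ins: 100 time points, 5 FIR-like predictors, 20 variables.
rng = np.random.default_rng(1)
G = rng.normal(size=(100, 5))
Y = G @ rng.normal(size=(5, 20)) + 0.1 * rng.normal(size=(100, 20))
scores, loadings = cpca(Y, G, n_components=4)
```

With an actual FIR basis (one indicator column per post-stimulus scan and condition), the component scores can be re-expressed as estimated hemodynamic response shapes per memory-load condition, which is how the sustained versus transient peaks described above are read off.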
Abstract:
A review of current risk pricing practices in the financial, insurance and construction sectors is conducted through a comprehensive literature review. The purpose was to inform a study on risk and price in the tendering processes of contractors: specifically, how contractors take account of risk when they are calculating their bids for construction work. The reference to mainstream literature was in view of construction management research as a field of application rather than a fundamental academic discipline. Analytical models are used for risk pricing in the financial sector. Certain mathematical laws and principles of insurance are used to price risk in the insurance sector. Construction contractors and practitioners are described as traditionally pricing allowances for project risk using mechanisms such as intuition and experience. Project risk analysis models have proliferated in recent years. However, they are rarely used because of problems practitioners face when confronted with them. A discussion of practices across the three sectors shows that the construction industry does not approach risk according to the sophisticated mechanisms of the two other sectors. This is not a poor situation in itself. However, knowledge transfer from finance and insurance can help construction practitioners. But also, formal risk models for contractors should be informed by the commercial exigencies and unique characteristics of the construction sector.
Abstract:
A wide variety of exposure models are currently employed for health risk assessments. Individual models have been developed to meet the chemical exposure assessment needs of Government, industry and academia. These existing exposure models can be broadly categorised according to the following types of exposure source: environmental, dietary, consumer product, occupational, and aggregate and cumulative. Aggregate exposure models consider multiple exposure pathways, while cumulative models consider multiple chemicals. In this paper each of these basic types of exposure model is briefly described, along with any inherent strengths or weaknesses, with the UK as a case study. Examples are given of specific exposure models that are currently used, or that have the potential for future use, and key differences in the modelling approaches adopted are discussed. The use of exposure models is currently fragmentary in nature. Specific organisations with exposure assessment responsibilities tend to use a limited range of models. The modelling techniques adopted in current exposure models have evolved along distinct lines for the various types of source. In fact, different organisations may be using different models for very similar exposure assessment situations. This lack of consistency between exposure modelling practices can make understanding the exposure assessment process more complex, can lead to inconsistency between organisations in how critical modelling issues are addressed (e.g. variability and uncertainty), and has the potential to communicate mixed messages to the general public. Further work should be conducted to integrate the various approaches and models, where possible and where regulatory remits allow, to achieve a coherent and consistent exposure modelling process.
We recommend the development of an overall framework for exposure and risk assessment with common approaches and methodology, a screening tool for exposure assessment, collection of better input data, probabilistic modelling, validation of model input and output, and a closer working relationship between scientists, policy makers and staff from different Government departments. A much increased effort is required in the UK to address these issues. The result will be a more robust, transparent, valid and more comparable exposure and risk assessment process.
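The aggregate/cumulative distinction drawn above can be sketched in a few lines. This is a toy illustration with made-up pathway names, doses and potency factors, not any of the regulatory models under discussion:

```python
def aggregate_exposure(pathway_doses):
    """Aggregate exposure: total dose of one chemical summed over
    multiple exposure pathways (e.g. mg/kg bw/day)."""
    return sum(pathway_doses.values())

def cumulative_exposure(chemical_doses, potency_factors):
    """Cumulative exposure: doses of several chemicals sharing a common
    mechanism, combined via relative potency factors."""
    return sum(dose * potency_factors[chem]
               for chem, dose in chemical_doses.items())

# Hypothetical figures, for illustration only.
pathways = {"dietary": 0.010, "consumer_product": 0.004, "environmental": 0.001}
total = aggregate_exposure(pathways)  # ~0.015 mg/kg bw/day
```

A probabilistic version, as recommended above, would replace each point dose with a distribution and propagate it by Monte Carlo sampling, making variability and uncertainty explicit rather than hidden in conservative point estimates.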
Abstract:
Technology involving genetic modification of crops has the potential to make a contribution to rural poverty reduction in many developing countries. Thus far, pesticide-producing Bacillus thuringensis (Bt) varieties of cotton have been the main GM crops under cultivation in developing nations. Several studies have evaluated the farm-level performance of Bt varieties in comparison to conventional ones by estimating production technology, and have mostly found Bt technology to be very successful in raising output and/or reducing pesticide input. However, the production risk properties of this technology have not been studied, although they are likely to be important to risk-averse smallholders. This study investigates the output risk aspects of Bt technology by estimating two 'flexible risk' production function models allowing technology to independently affect the mean and higher moments of output. The first is the popular Just-Pope model and the second is a more general 'damage control' flexible risk model. The models are applied to cross-sectional data on South African smallholders, some of whom used Bt varieties. The results show no evidence that a 'risk-reduction' claim can be made for Bt technology. Indeed, there is some evidence to support the notion that the technology increases output risk, implying that simple (expected) profit computations used in past evaluations may overstate true benefits.
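A minimal sketch of the Just-Pope estimation idea mentioned above (a generic two-step feasible-GLS procedure on synthetic data; the variable names and figures are my own, not the study's): the mean of output is modelled first, and the log of the squared residuals is then regressed on the inputs, so an input can raise or lower output variance independently of its mean effect:

```python
import numpy as np

def just_pope(X, y):
    """Two-step Just-Pope estimation: y = X b + sqrt(exp(X a)) * e.
    Returns mean-function coefficients b and risk-function coefficients a;
    a positive risk coefficient marks the input as variance-increasing."""
    Xc = np.column_stack([np.ones(len(y)), X])        # add intercept
    b = np.linalg.lstsq(Xc, y, rcond=None)[0]         # step 1: mean function
    resid = y - Xc @ b
    log_sq = np.log(resid**2 + 1e-12)                 # guard against log(0)
    a = np.linalg.lstsq(Xc, log_sq, rcond=None)[0]    # step 2: risk function
    return b, a

# Synthetic smallholder data where the input raises both mean output and risk
# (mimicking the abstract's finding that Bt may be variance-increasing).
rng = np.random.default_rng(42)
x = rng.uniform(0.0, 1.0, size=5000)
y = 1.0 + 2.0 * x + np.exp(0.5 * (0.2 + 1.5 * x)) * rng.normal(size=5000)
b, a = just_pope(x[:, None], y)
```

The sign of the estimated risk coefficient is the quantity of interest: a technology dummy with a positive coefficient in the variance equation is exactly the "increases output risk" finding reported above.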
Abstract:
Data from six studies with male broilers fed diets covering a wide range of energy and protein were used in the current two analyses. In the first analysis, five models, specifically re-parameterized for analysing energy balance data, were evaluated for their ability to determine metabolizable energy intake at maintenance and efficiency of utilization of metabolizable energy intake for producing gain. In addition to the straight line, two types of functional form were used. They were forms describing (i) diminishing returns behaviour (monomolecular and rectangular hyperbola) and (ii) sigmoidal behaviour with a fixed point of inflection (Gompertz and logistic). These models determined metabolizable energy requirement for maintenance to be in the range 437-573 kJ/kg of body weight/day depending on the model. The values determined for average net energy requirement for body weight gain varied from 7.9 to 11.2 kJ/g of body weight. These values show good agreement with previous studies. In the second analysis, three types of function were assessed as candidates for describing the relationship between body weight and cumulative metabolizable energy intake. The functions used were: (a) monomolecular (diminishing returns behaviour), (b) Gompertz (smooth sigmoidal behaviour with a fixed point of inflection) and (c) Lopez, France and Richards (diminishing returns and sigmoidal behaviour with a variable point of inflection). The results of this analysis demonstrated that equations capable of mimicking the law of diminishing returns describe accurately the relationship between body weight and cumulative metabolizable energy intake in broilers.
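The two functional-form families contrasted above can be written down directly. The parameter values below are illustrative, not the fitted ones from the studies: the monomolecular curve shows diminishing returns throughout, while the Gompertz curve is sigmoidal with a fixed point of inflection (at Wf/e):

```python
import numpy as np

def monomolecular(E, W0, Wf, k):
    """Diminishing-returns growth: weight rises from W0 towards the
    asymptote Wf with no inflection (E = cumulative ME intake)."""
    return Wf - (Wf - W0) * np.exp(-k * E)

def gompertz(E, W0, Wf, k):
    """Sigmoidal growth with a fixed inflection point at weight Wf/e."""
    return Wf * np.exp(np.log(W0 / Wf) * np.exp(-k * E))

# Illustrative parameters: 0.04 kg chick growing towards 3.0 kg.
E = np.linspace(0.0, 100.0, 401)             # cumulative ME intake (MJ)
w_mono = monomolecular(E, 0.04, 3.0, 0.05)   # concave everywhere
w_gomp = gompertz(E, 0.04, 3.0, 0.05)        # accelerates, then saturates
```

The distinction matters for the second analysis above: if the observed weight-versus-intake relationship is concave from the outset, only the diminishing-returns forms (or a variable-inflection form such as Lopez, France and Richards) can track it accurately.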