58 results for high-stakes assessment


Relevance:

30.00%

Publisher:

Abstract:

A full assessment of para-virtualization is important, because without knowledge of the various overheads, users cannot judge whether using virtualization is a good idea. In this paper we are interested in assessing the overheads of running various benchmarks on bare metal as well as on para-virtualization, and in examining the additional overheads of turning on monitoring and logging. The knowledge gained from assessing various benchmarks on these different systems will help a range of users understand the use of virtualization systems. In this paper we assess the overheads of using Xen, VMware, KVM and Citrix (see Table 1). These virtualization systems are used extensively by cloud users. We use various Netlib benchmarks, which have been developed by the University of Tennessee at Knoxville (UTK) and Oak Ridge National Laboratory (ORNL). To assess these virtualization systems, we run the benchmarks on bare metal, then on the para-virtualization, and finally with monitoring and logging turned on. The latter is important because users are interested in the Service Level Agreements (SLAs) used by cloud providers, and logging is a means of assessing the services bought and used from commercial providers. We assess the virtualization systems on three different systems: the Thamesblue supercomputer, the Hactar cluster and an IBM JS20 blade server (see Table 2), all of which are servers available at the University of Reading. A functional virtualization system is multi-layered and is driven by the privileged components. Virtualization systems can host multiple guest operating systems, each of which runs in its own domain, and the system schedules virtual CPUs and memory within each virtual machine (VM) to make the best use of the available resources. The guest operating system schedules each application accordingly. Virtualization can be deployed as full virtualization or para-virtualization. Full virtualization provides a total abstraction of the underlying physical system and creates a new virtual system in which the guest operating systems can run; no modifications are needed in the guest OS or application, i.e. the guest OS or application is not aware of the virtualized environment and runs normally. Para-virtualization requires modification of the guest operating systems that run on the virtual machines, i.e. these guest operating systems are aware that they are running on a virtual machine, and provide near-native performance. Both para-virtualization and full virtualization can be deployed across various virtualized systems. Para-virtualization is an OS-assisted virtualization, in which some modifications are made in the guest operating system to enable better performance. In this kind of virtualization, the guest operating system is aware that it is running on virtualized hardware and not on bare hardware. In para-virtualization, the device drivers in the guest operating system coordinate with the device drivers of the host operating system, reducing the performance overheads. The use of para-virtualization [0] is intended to avoid the bottleneck associated with slow hardware interrupts that exists when full virtualization is employed.
It has been shown [0] that para-virtualization does not impose significant performance overhead in high performance computing, and this in turn has implications for the use of cloud computing for hosting HPC applications. The "apparent" improvement in virtualization has led us to formulate the hypothesis that certain classes of HPC applications should be able to execute in a cloud environment with minimal performance degradation. To support this hypothesis, it is first necessary to define exactly what is meant by a "class" of application, and secondly to observe application performance both within a virtual machine and when executing on bare hardware. A further potential complication is associated with the need for cloud service providers to support Service Level Agreements (SLAs), so that system utilisation can be audited.
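The core measurement described above reduces to a relative-slowdown calculation over repeated benchmark timings. A minimal sketch (the timings and configuration names below are illustrative placeholders, not results from the paper):

```python
# Compute para-virtualization overhead as the relative slowdown versus bare
# metal, for the same benchmark run under different configurations.
# Timings below are invented placeholders, not measured values.

def overhead_pct(t_bare_metal: float, t_virtualized: float) -> float:
    """Relative overhead (%) of running under virtualization."""
    return 100.0 * (t_virtualized - t_bare_metal) / t_bare_metal

# Hypothetical wall-clock timings (seconds) for one Netlib benchmark.
timings = {
    "bare-metal": 102.4,
    "Xen": 109.8,
    "Xen + monitoring/logging": 114.1,
}

base = timings["bare-metal"]
for config, t in timings.items():
    if config != "bare-metal":
        print(f"{config}: {overhead_pct(base, t):.1f}% overhead")
```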

Relevance:

30.00%

Publisher:

Abstract:

There is a need for better links between hydrology and ecology, specifically between landscapes and riverscapes, to understand how the processes and factors controlling the transport and storage of environmental pollution have affected or will affect freshwater biota. Here we show how the INCA modelling framework, specifically INCA-Sed (the Integrated Catchments model for Sediments), can be used to link sediment delivery from the landscape to sediment changes in-stream. INCA-Sed is a dynamic, process-based, daily time step model. The first complete description of the equations used in the INCA-Sed software (version 1.9.11) is presented, followed by an application of INCA-Sed to the River Lugg (1077 km²) in Wales. Excess suspended sediment can negatively affect salmonid health, and the Lugg has a large and potentially threatened population of both Atlantic salmon (Salmo salar) and brown trout (Salmo trutta). With the exception of the extreme sediment transport processes, the model satisfactorily simulated both the hydrology and the sediment dynamics in the catchment. Model results indicate that diffuse soil loss is the most important sediment generation process in the catchment. In the River Lugg, the mean annual Guideline Standard for suspended sediment concentration of 25 mg l−1, proposed by UKTAG, is only slightly exceeded during the simulation period (1995–2000), indicating only a minimal effect on the Atlantic salmon population. However, the daily time step simulation of INCA-Sed also allows investigation of the critical spawning period. It shows that sediment may have a significant negative effect on the fish population in years with high sediment runoff. It is proposed that fine settled particles probably do not affect the salmonid egg incubation process, though suspended particles may damage the gills of fish and make the area unfavourable for spawning if conditions do not improve.
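INCA-Sed's actual equations are those documented in the paper; as a loose illustration of what a process-based, daily time step sediment model does, the toy sketch below steps a landscape soil-loss input and an in-stream sediment store forward one day at a time. All rate constants and linear forms here are invented assumptions, not the published model.

```python
# Toy daily time-step illustration of landscape-to-stream sediment accounting.
# Diffuse soil loss is assumed proportional to flow, and a fixed fraction of
# the in-stream store settles out each day; both assumptions are illustrative.

def simulate_sediment(flows, k_erosion=0.4, k_settle=0.1):
    """Return daily in-stream suspended-sediment loads for daily flows (m3/s)."""
    store = 0.0            # in-stream sediment store (arbitrary mass units)
    loads = []
    for q in flows:
        store += k_erosion * q       # diffuse soil loss delivered from the landscape
        store -= k_settle * store    # settling of fine particles
        loads.append(store)
    return loads

daily_loads = simulate_sediment([5.0, 7.2, 20.5, 14.1, 6.3])
print([round(x, 2) for x in daily_loads])
```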

Relevance:

30.00%

Publisher:

Abstract:

Growing legislative pressures and increasing stakeholder awareness of environmental issues are pushing the property market to consider high-performance, low-impact retail buildings. The office sector is relatively advanced in its apparent appreciation of such buildings; however, the retail sector is slow to recognize these benefits. In exploring the business case for high-performance design adoption in the retail sector, this paper examines the overlaps between office and retail sector benefits and considers the potential benefits peculiar to retailers. Barriers to high-performance design adoption are then addressed through case research, interviews with key representatives from the retail property market and a questionnaire survey of FTSE listed retail company property departments. The paper concludes that information gaps are a significant hindrance to high-performance property development and that they can be reduced, to some extent, by the forthcoming introduction of the BREEAM Retail environmental assessment tool. Copyright © 2003 John Wiley & Sons, Ltd and ERP Environment.

Relevance:

30.00%

Publisher:

Abstract:

High-resolution descriptions of plant distribution have utility for many ecological applications but are especially useful for predictive modeling of gene flow from transgenic crops. Difficulty lies in the extrapolation errors that occur when limited ground survey data are scaled up to the landscape or national level. This problem is epitomized by the wide confidence limits generated in a previous attempt to describe the national abundance of riverside Brassica rapa (a wild relative of cultivated rapeseed) across the United Kingdom. Here, we assess the value of airborne remote sensing to locate B. rapa over large areas and so reduce the need for extrapolation. We describe results from flights over the river Nene in England acquired using Airborne Thematic Mapper (ATM) and Compact Airborne Spectrographic Imager (CASI) imagery, together with ground truth data. It proved possible to detect 97% of flowering B. rapa on the basis of spectral profiles. This included all stands of plants that occupied >2 m square (>5 plants), which were detected using single-pixel classification. It also included very small populations (<5 flowering plants, 1–2 m square) that generated mixed pixels, which were detected using spectral unmixing. The high detection accuracy for flowering B. rapa was coupled with a rather large false positive rate (43%). The latter could be reduced by using the image detections to target fieldwork to confirm species identity, or by acquiring additional remote sensing data such as laser altimetry or multitemporal imagery.
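Spectral unmixing of mixed pixels, as used here for the smallest populations, can be framed as a non-negative least-squares problem: each pixel spectrum is modelled as a weighted sum of endmember spectra. A minimal sketch, with made-up reflectances standing in for real ATM/CASI bands:

```python
# Linear spectral-unmixing sketch: estimate the fractional cover of flowering
# B. rapa in a mixed pixel. Endmember spectra and the pixel spectrum are
# invented numbers; real inputs would come from the airborne imagery.
import numpy as np
from scipy.optimize import nnls

# Columns: reflectance of each endmember in four hypothetical bands.
endmembers = np.array([
    [0.12, 0.45],   # band 1: [vegetation background, flowering B. rapa]
    [0.18, 0.52],   # band 2
    [0.25, 0.61],   # band 3
    [0.30, 0.40],   # band 4
])

pixel = np.array([0.20, 0.27, 0.34, 0.32])   # observed mixed-pixel spectrum

abundances, _ = nnls(endmembers, pixel)   # non-negative least squares
abundances /= abundances.sum()            # normalise to fractional cover
print(f"Estimated flowering B. rapa fraction: {abundances[1]:.2f}")
```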

Relevance:

30.00%

Publisher:

Abstract:

Analyses of high-density single-nucleotide polymorphism (SNP) data, such as genetic mapping and linkage disequilibrium (LD) studies, require phase-known haplotypes to allow for the correlation between tightly linked loci. However, current SNP genotyping technology cannot determine phase, which must be inferred statistically. In this paper, we present a new Bayesian Markov chain Monte Carlo (MCMC) algorithm for population haplotype frequency estimation, particularly in the context of LD assessment. The novel feature of the method is the incorporation of a log-linear prior model for population haplotype frequencies. We present simulations to suggest that (1) the log-linear prior model is more appropriate than the standard coalescent process in the presence of recombination (>0.02 cM between adjacent loci), and (2) there is substantial inflation in measures of LD obtained by a "two-stage" approach to the analysis that treats the "best" haplotype configuration as correct, without regard to uncertainty in the recombination process. Genet Epidemiol 25:106-114, 2003. (C) 2003 Wiley-Liss, Inc.
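A heavily simplified sketch of MCMC haplotype frequency estimation for two biallelic loci follows. It replaces the paper's log-linear prior with a flat Dirichlet prior and handles only the classic double-heterozygote phase ambiguity, so it illustrates the sampling idea rather than the published algorithm.

```python
# Toy Gibbs sampler for two-locus haplotype frequencies from unphased
# genotypes. Alternates between (a) sampling the phase of each ambiguous
# individual given current frequencies and (b) sampling frequencies from a
# Dirichlet given the implied haplotype counts. Flat prior; illustrative only.
import random

HAPS = ["00", "01", "10", "11"]  # haplotypes over two biallelic loci

def gibbs_haplotype_freqs(n_double_het, known_counts, iters=2000, seed=1):
    """known_counts: phase-known haplotype counts; double heterozygotes are
    ambiguous between phases 00+11 ('cis') and 01+10 ('trans')."""
    rng = random.Random(seed)
    freqs = [0.25] * 4
    for _ in range(iters):
        counts = known_counts[:]
        for _ in range(n_double_het):       # sample each ambiguous phase
            p_cis = freqs[0] * freqs[3]     # P(phase is 00/11)
            p_trans = freqs[1] * freqs[2]   # P(phase is 01/10)
            if rng.random() < p_cis / (p_cis + p_trans):
                counts[0] += 1; counts[3] += 1
            else:
                counts[1] += 1; counts[2] += 1
        # Dirichlet(1 + counts) draw via independent Gamma variates.
        draws = [rng.gammavariate(1 + c, 1.0) for c in counts]
        total = sum(draws)
        freqs = [d / total for d in draws]
    return dict(zip(HAPS, (round(f, 3) for f in freqs)))  # final posterior draw

print(gibbs_haplotype_freqs(n_double_het=10, known_counts=[30, 12, 9, 4]))
```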

Relevance:

30.00%

Publisher:

Abstract:

Research shows that poor indoor air quality (IAQ) in school buildings can cause a reduction in students' performance as assessed by short-term computer-based tests, whereas good air quality in classrooms can enhance children's concentration and teachers' productivity. Investigation of air quality in classrooms helps us to characterise pollutant levels and implement corrective measures. Outdoor pollution, ventilation equipment, furnishings, and human activities affect IAQ. In school classrooms, occupancy density is high (1.8–2.4 m²/person) compared to offices (10 m²/person). Ventilation systems expend energy, and there is a trend to save energy by reducing ventilation rates; we therefore need to establish the minimum acceptable level of fresh air required for the health of the occupants. This paper describes a project that aims to investigate the effect of IAQ and ventilation rates on pupils' performance and health using psychological tests, with the goal of recommending suitable ventilation rates for classrooms and examining the suitability of existing air quality guidelines for classrooms. The air quality, ventilation rates and pupils' performance in classrooms will be evaluated in parallel measurements. In addition, Visual Analogue Scales will be used to assess subjective perception of the classroom environment and SBS symptoms. Pupil performance will be measured with Computerised Assessment Tests (CAT) and Pen and Paper Performance Tasks, while physical parameters of the classroom environment will be recorded using an advanced data logging system. A total of 20 primary schools in the Reading area are expected to participate in the present investigation, and the pupils taking part will be aged 9–11 years. On completion of the project, recommendations for suitable ventilation rates for schools will be formulated based on the overall data.

Relevance:

30.00%

Publisher:

Abstract:

The management of information in engineering organisations faces a particular challenge in the ever-increasing volume of information. It has been recognised that an effective methodology is required to evaluate information in order to avoid information overload and to retain the right information for reuse. Using a number of current tools and techniques that attempt to obtain "the value" of information as a starting point, it is proposed that an assessment or filter mechanism for information needs to be developed. This paper addresses this issue firstly by briefly reviewing the information overload problem, the definition of value, and related research work on the value of information in various areas. A characteristic-based framework of information evaluation is then introduced, using the key characteristics identified from related work as an example. A Bayesian network diagram method is introduced to the framework to build the linkage between the characteristics and information value, in order to quantitatively calculate the quality and value of information. The training and verification process for the model is then described using 60 real engineering documents as a sample. The model gives reasonably accurate results, and the differences between the model calculations and the training judgements are summarised and their potential causes discussed. Finally, several further issues, including challenges to the framework and implementations of this evaluation method, are raised.
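The linkage such a framework builds can be pictured as a small Bayesian network in which characteristic nodes feed a value node through a conditional probability table. A toy sketch, with two invented characteristics and made-up probabilities rather than the paper's trained model:

```python
# Minimal Bayesian-network-style scoring: document characteristics feed a
# conditional probability table (CPT) that yields a value score. The two
# characteristics and all probabilities are illustrative assumptions.

# P(information is high-value | completeness, relevance), each True/False.
cpt = {
    (True,  True):  0.90,
    (True,  False): 0.45,
    (False, True):  0.55,
    (False, False): 0.10,
}

def p_high_value(p_complete: float, p_relevant: float) -> float:
    """Marginalise over the (assumed independent) characteristic states."""
    total = 0.0
    for complete in (True, False):
        for relevant in (True, False):
            weight = (p_complete if complete else 1 - p_complete) * \
                     (p_relevant if relevant else 1 - p_relevant)
            total += weight * cpt[(complete, relevant)]
    return total

# A document judged 80% likely complete and 60% likely relevant:
print(f"P(high value) = {p_high_value(0.8, 0.6):.3f}")
```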

Relevance:

30.00%

Publisher:

Abstract:

As control systems have developed and the implications of poor hygienic practices have become better known, the evaluation of the hygienic status of premises has become more critical. The assessment of the overall status of premises hygiene can provide useful management data indicating whether the premises are improving or whether, whilst still meeting legal requirements, they might be failing to maintain previously high standards. Since the creation of the Meat Hygiene Service (MHS) for the United Kingdom, one of the aims of the service has been to monitor hygiene on different premises to provide a means of comparing standards and to identify and encourage improvements. This desire led to the implementation of a scoring system known as the hygiene assessment system (HAS). This paper analyses English slaughterhouses' HAS scores between 1998 and 2005, outlining the main incidents throughout this period. Although rising initially, the later results display a clear decrease in the general hygiene scores. These revealing results coincide with the start of a new meat inspection system in which, after several years of discussion, risk-based inspection is finally becoming a reality within Europe. The paper considers the implications of these changes for the way hygiene standards will be monitored in the future.

Relevance:

30.00%

Publisher:

Abstract:

Background & aims: Long term parenteral nutrition rarely supplies the long chain n-3 polyunsaturated fatty acids (PUFA), eicosapentaenoic acid (EPA), docosapentaenoic acid (DPA) and docosahexaenoic acid (DHA). The aim of this study was to assess long chain n-3 PUFA status in patients receiving home parenteral nutrition (HPN). Methods: Plasma phospholipid fatty acids were measured in 64 adult HPN patients and compared with 54 age-, sex- and BMI-matched controls. Logistic regression analysis was used to identify factors related to plasma fatty acid fractions in the HPN patients, and to identify factors associated with the risk of clinical complications. Results: Plasma phospholipid fractions of EPA, DPA and DHA were significantly lower in patients receiving HPN. Factors independently associated with low fractions included high parenteral energy provision, low parenteral lipid intake, low BMI and prolonged duration of HPN. Long chain n-3 PUFA fractions were not associated with incidence of either central venous catheter associated infection or central venous thrombosis. However, the fraction of EPA was inversely associated with plasma alkaline phosphatase concentrations. Conclusions: This study demonstrates abnormal long chain n-3 PUFA profiles in patients receiving HPN. Reduced fatty acid intake may be partly responsible. Fatty acid metabolism may also be altered. (C) 2008 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.
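The logistic-regression step can be sketched as below; the feature names mirror the factors reported above, but the data are randomly generated stand-ins, not the study's measurements.

```python
# Sketch of a logistic regression relating patient factors to the odds of a
# low plasma EPA fraction. All data are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 64
# Columns: parenteral energy (kcal/kg/d), lipid intake (g/kg/d), BMI, HPN years.
X = np.column_stack([
    rng.normal(25, 5, n),
    rng.normal(0.8, 0.3, n),
    rng.normal(21, 3, n),
    rng.exponential(4, n),
])
y = rng.integers(0, 2, n)   # 1 = low EPA fraction (placeholder labels)

model = LogisticRegression(max_iter=1000).fit(X, y)
for name, coef in zip(["energy", "lipid", "BMI", "duration"], model.coef_[0]):
    print(f"{name}: odds ratio ~ {np.exp(coef):.2f}")
```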

Relevance:

30.00%

Publisher:

Abstract:

Baking and 2-g mixograph analyses were performed for 55 cultivars (19 spring and 36 winter wheat) from various quality classes from the 2002 harvest in Poland. An instrumented 2-g direct-drive mixograph was used to study the mixing characteristics of the wheat cultivars. A number of parameters were extracted automatically from each mixograph trace and correlated with baking volume and flour quality parameters (protein content and high molecular weight glutenin subunit [HMW-GS] composition by SDS-PAGE) using multiple linear regression statistical analysis. Principal component analysis of the mixograph data discriminated between four flour quality classes, and predictions of baking volume were obtained using several selected mixograph parameters, chosen using a best subsets regression routine, giving R² values of 0.862–0.866. In particular, three new spring wheat strains (CHD 502a-c) recently registered in Poland were highly discriminated and predicted to give high baking volume on the basis of two mixograph parameters: peak bandwidth and 10-min bandwidth.
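The best-subsets flavour of the regression analysis can be sketched as follows, using synthetic stand-ins for the mixograph parameters and baking volumes:

```python
# Pick the 2-parameter subset of mixograph traits that best predicts baking
# volume by multiple linear regression, and report its R^2. All numbers are
# synthetic placeholders, not the study's 55-cultivar data set.
import numpy as np
from itertools import combinations
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
n = 55
params = {
    "peak_bandwidth": rng.normal(20, 4, n),
    "bandwidth_10min": rng.normal(12, 3, n),
    "peak_time": rng.normal(3.0, 0.6, n),
}
volume = (400 + 6 * params["peak_bandwidth"]
          + 9 * params["bandwidth_10min"] + rng.normal(0, 15, n))

def r2(pair):
    X = np.column_stack([params[p] for p in pair])
    return LinearRegression().fit(X, volume).score(X, volume)

best = max(combinations(params, 2), key=r2)
print(f"Best 2-parameter subset: {best}, R^2 = {r2(best):.3f}")
```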

Relevance:

30.00%

Publisher:

Abstract:

The possibilities and need for adaptation and mitigation depend on uncertain future developments with respect to socio-economic factors and the climate system. Scenarios are used to explore the impacts of different strategies under uncertainty. In this chapter, some scenarios used in the ADAM project for this purpose are presented. One scenario explores developments with no mitigation, and thus with high temperature increase and high reliance on adaptation (leading to a 4°C increase by 2100 compared to pre-industrial levels). A second scenario explores an ambitious mitigation strategy (leading to a 2°C increase by 2100 compared to pre-industrial levels). In the latter scenario, stringent mitigation strategies effectively reduce the risks of climate change, but given uncertainties in the climate system a temperature increase of 3°C or more cannot be excluded. The analysis shows that, in many cases, adaptation and mitigation are not trade-offs but complements. For example, the number of people exposed to increased water resource stress due to climate change can be substantially reduced in the mitigation scenario, but even then adaptation will be required for the remaining large numbers of people exposed to increased stress. Another example is sea level rise, for which adaptation is more cost-effective than mitigation, but mitigation can help reduce damages and the cost of adaptation. For agriculture, finally, only the scenario based on a combination of adaptation and mitigation is able to avoid serious climate change impacts.

Relevance:

30.00%

Publisher:

Abstract:

An assessment of aerosol-cloud interactions (ACI) from ground-based remote sensing under coastal stratiform clouds is presented. The assessment utilizes a long-term, high temporal resolution data set from the Atmospheric Radiation Measurement (ARM) Program deployment at Pt. Reyes, California, United States, in 2005 to provide statistically robust measures of ACI and to characterize the variability of the measures based on variability in environmental conditions and observational approaches. The average ACI_N (= d ln(Nd)/d ln(a), the change in cloud drop number concentration with aerosol concentration) is 0.48, within a physically plausible range of 0–1.0. Values vary between 0.18 and 0.69 with dependence on (1) the assumption of constant cloud liquid water path (LWP), (2) the relative value of cloud LWP, (3) methods for retrieving Nd, (4) aerosol size distribution, (5) updraft velocity, and (6) the scale and resolution of observations. The sensitivity of the local, diurnally averaged radiative forcing to this variability in ACI_N values, assuming an aerosol perturbation of 500 cm−3 relative to a background concentration of 100 cm−3, ranges between −4 and −9 W m−2. Further characterization of ACI and its variability is required to reduce uncertainties in global radiative forcing estimates.
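The ACI_N measure itself is straightforward to compute from paired observations as the slope of ln(Nd) against ln(a); a minimal illustration with invented sample values:

```python
# ACI_N = d ln(Nd) / d ln(a): slope of cloud-drop number concentration versus
# aerosol concentration in log-log space, estimated by least squares.
# Sample values are invented for illustration.
import numpy as np

aerosol = np.array([100., 150., 220., 330., 500.])   # aerosol concentration a (cm^-3)
n_drop  = np.array([ 90., 110., 135., 160., 195.])   # drop concentration Nd (cm^-3)

aci_n = np.polyfit(np.log(aerosol), np.log(n_drop), 1)[0]
print(f"ACI_N = {aci_n:.2f}")   # physically plausible values lie in 0-1.0
```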

Relevance:

30.00%

Publisher:

Abstract:

SCIENTIFIC SUMMARY

Globally averaged total column ozone has declined over recent decades due to the release of ozone-depleting substances (ODSs) into the atmosphere. Now, as a result of the Montreal Protocol, ozone is expected to recover from the effects of ODSs as ODS abundances decline in the coming decades. However, a number of factors in addition to ODSs have led to and will continue to lead to changes in ozone. Discriminating between the causes of past and projected ozone changes is necessary, not only to identify the progress in ozone recovery from ODSs, but also to evaluate the effectiveness of climate and ozone protection policy options.

Factors Affecting Future Ozone and Surface Ultraviolet Radiation

• At least for the next few decades, the decline of ODSs is expected to be the major factor affecting the anticipated increase in global total column ozone. However, several factors other than ODSs will affect the future evolution of ozone in the stratosphere. These include changes in (i) stratospheric circulation and temperature due to changes in long-lived greenhouse gas (GHG) abundances, (ii) stratospheric aerosol loading, and (iii) source gases of highly reactive stratospheric hydrogen and nitrogen compounds. Factors that amplify the effects of ODSs on ozone (e.g., stratospheric aerosols) will likely decline in importance as ODSs are gradually eliminated from the atmosphere.

• Increases in GHG emissions can both positively and negatively affect ozone. Carbon dioxide (CO2)-induced stratospheric cooling elevates middle and upper stratospheric ozone and decreases the time taken for ozone to return to 1980 levels, while projected GHG-induced increases in tropical upwelling decrease ozone in the tropical lower stratosphere and increase ozone in the extratropics. Increases in nitrous oxide (N2O) and methane (CH4) concentrations also directly impact ozone chemistry, but the effects differ from region to region.

• The Brewer-Dobson circulation (BDC) is projected to strengthen over the 21st century and thereby affect ozone amounts. Climate models consistently predict an acceleration of the BDC or, more specifically, of the upwelling mass flux in the tropical lower stratosphere of around 2% per decade as a consequence of GHG abundance increases. A stronger BDC would decrease the abundance of tropical lower stratospheric ozone, increase poleward transport of ozone, and could reduce the atmospheric lifetimes of long-lived ODSs and other trace gases. While simulations showing faster ascent in the tropical lower stratosphere to date are a robust feature of chemistry-climate models (CCMs), this has not been confirmed by observations and the responsible mechanisms remain unclear.

• Substantial ozone losses could occur if stratospheric aerosol loading were to increase in the next few decades while halogen levels are high. Stratospheric aerosol increases may be caused by sulfur contained in volcanic plumes entering the stratosphere or from human activities; the latter might include attempts to geoengineer the climate system by enhancing the stratospheric aerosol layer. The ozone losses mostly result from enhanced heterogeneous chemistry on stratospheric aerosols. Enhanced aerosol heating within the stratosphere also leads to changes in temperature and circulation that affect ozone.

• Surface ultraviolet (UV) levels will not be affected solely by ozone changes but also by the effects of climate change and by air quality change in the troposphere. These tropospheric effects include changes in clouds, tropospheric aerosols, surface reflectivity, and tropospheric sulfur dioxide (SO2) and nitrogen dioxide (NO2). The uncertainties in projections of these factors are large. Projected increases in tropospheric ozone are more certain and may lead to reductions in surface erythemal ("sunburning") irradiance of up to 10% by 2100. Changes in clouds may lead to decreases or increases in surface erythemal irradiance of up to 15% depending on latitude.

Expected Future Changes in Ozone

Full ozone recovery from the effects of ODSs and return of ozone to historical levels are not synonymous. In this chapter a key target date is chosen to be 1980, in part to retain the connection to previous Ozone Assessments. Noting, however, that decreases in ozone may have occurred in some regions of the atmosphere prior to 1980, 1960 return dates are also reported. The projections reported in this chapter are taken from a recent compilation of CCM simulations. The ozone projections, which also form the basis for the UV projections, are limited in their representativeness of possible futures since they mostly come from CCM simulations based on a single GHG emissions scenario (scenario A1B of Emissions Scenarios: A Special Report of Working Group III of the Intergovernmental Panel on Climate Change, Cambridge University Press, 2000) and a single ODS emissions scenario (adjusted A1 of the previous (2006) Ozone Assessment). Throughout this century, the vertical, latitudinal, and seasonal structure of the ozone distribution will be different from what it was in 1980. For this reason, ozone changes in different regions of the atmosphere are considered separately.

• The projections of changes in ozone and surface clear-sky UV are broadly consistent with those reported in the 2006 Assessment.

• The capability of making projections and attribution of future ozone changes has been improved since the 2006 Assessment. Use of CCM simulations from an increased number of models extending through the entire period of ozone depletion and recovery from ODSs (1960–2100), as well as sensitivity simulations, has allowed more robust projections of long-term changes in the stratosphere and of the relative contributions of ODSs and GHGs to those changes.

• Global annually averaged total column ozone is projected to return to 1980 levels before the middle of the century and earlier than when stratospheric halogen loading returns to 1980 levels. CCM projections suggest that this early return is primarily a result of GHG-induced cooling of the upper stratosphere because the effects of circulation changes on tropical and extratropical ozone largely cancel. Global (90°S–90°N) annually averaged total column ozone will likely return to 1980 levels between 2025 and 2040, well before the return of stratospheric halogens to 1980 levels between 2045 and 2060.

• Simulated changes in tropical total column ozone from 1960 to 2100 are generally small. The evolution of tropical total column ozone in models depends on the balance between upper stratospheric increases and lower stratospheric decreases. The upper stratospheric increases result from declining ODSs and a slowing of ozone destruction resulting from GHG-induced cooling. Ozone decreases in the lower stratosphere mainly result from an increase in tropical upwelling. From 1960 until around 2000, a general decline is simulated, followed by a gradual increase to values typical of 1980 by midcentury. Thereafter, although total column ozone amounts decline slightly again toward the end of the century, by 2080 they are no longer expected to be affected by ODSs. Confidence in tropical ozone projections is compromised by the fact that simulated decreases in column ozone to date are not supported by observations, suggesting that significant uncertainties remain.

• Midlatitude total column ozone is simulated to evolve differently in the two hemispheres. Over northern midlatitudes, annually averaged total column ozone is projected to return to 1980 values between 2015 and 2030, while for southern midlatitudes the return to 1980 values is projected to occur between 2030 and 2040. The more rapid return to 1980 values in northern midlatitudes is linked to a more pronounced strengthening of the poleward transport of ozone due to the effects of increased GHG levels, and to the effects of Antarctic ozone depletion on southern midlatitudes. By 2100, midlatitude total column ozone is projected to be above 1980 values in both hemispheres.

• October-mean Antarctic total column ozone is projected to return to 1980 levels after midcentury, later than in any other region, and yet earlier than when stratospheric halogen loading is projected to return to 1980 levels. The slightly earlier return of ozone to 1980 levels (2045–2060) results primarily from upper stratospheric cooling and resultant increases in ozone. The return of polar halogen loading to 1980 levels (2050–2070) in CCMs is earlier than in empirical models that exclude the effects of GHG-induced changes in circulation. Our confidence in the drivers of changes in Antarctic ozone is higher than for other regions because (i) ODSs exert a strong influence on Antarctic ozone, (ii) the effects of changes in GHG abundances are comparatively small, and (iii) projections of ODS emissions are more certain than those for GHGs. Small Antarctic ozone holes (areas of ozone <220 Dobson units, DU) could persist to the end of the 21st century.

• March-mean Arctic total column ozone is projected to return to 1980 levels two to three decades before polar halogen loading returns to 1980 levels, and to exceed 1980 levels thereafter. While CCM simulations project a return to 1980 levels between 2020 and 2035, most models tend not to capture observed low temperatures and thus underestimate present-day Arctic ozone loss, so it is possible that this return date is biased early. Since the strengthening of the Brewer-Dobson circulation through the 21st century leads to increases in springtime Arctic column ozone, by 2100 Arctic ozone is projected to lie well above 1960 levels.

Uncertainties in Projections

• Conclusions dependent on future GHG levels are less certain than those dependent on future ODS levels, since ODS emissions are controlled by the Montreal Protocol. For the six GHG scenarios considered by a few CCMs, the simulated differences in stratospheric column ozone over the second half of the 21st century are largest in the northern midlatitudes and the Arctic, with maximum differences of 20–40 DU between the six scenarios in 2100.

• There remain sources of uncertainty in the CCM simulations. These include the use of prescribed ODS mixing ratios instead of emission fluxes as lower boundary conditions, the range of sea surface temperatures and sea ice concentrations, missing tropospheric chemistry, model parameterizations, and model climate sensitivity.

• Geoengineering schemes for mitigating climate change by continuous injections of sulfur-containing compounds into the stratosphere, if implemented, would substantially affect stratospheric ozone, particularly in polar regions. Ozone losses observed following large volcanic eruptions support this prediction. However, sporadic volcanic eruptions provide limited analogs to the effects of continuous sulfur emissions. Preliminary model simulations reveal large uncertainties in assessing the effects of continuous sulfur injections.

Expected Future Changes in Surface UV

While a number of factors, in addition to ozone, affect surface UV irradiance, the focus in this chapter is on the effects of changes in stratospheric ozone on surface UV. For this reason, clear-sky surface UV irradiance is calculated from ozone projections from CCMs.

• Projected increases in midlatitude ozone abundances during the 21st century, in the absence of changes in other factors, in particular clouds, tropospheric aerosols, and air pollutants, will result in decreases in surface UV irradiance. Clear-sky erythemal irradiance is projected to return to 1980 levels on average in 2025 for the northern midlatitudes, and in 2035 for the southern midlatitudes, and to fall well below 1980 values by the second half of the century. However, actual changes in surface UV will be affected by a number of factors other than ozone.

• In the absence of changes in other factors, changes in tropical surface UV will be small because changes in tropical total column ozone are projected to be small. By the middle of the 21st century, the model projections suggest surface UV will be slightly higher than in the 1960s, very close to values in 1980, and slightly lower than in 2000. The projected decrease in tropical total column ozone through the latter half of the century will likely result in clear-sky surface UV remaining above 1960 levels. Average UV irradiance is already high in the tropics due to naturally occurring low total ozone columns and high solar elevations.

• The magnitude of UV changes in the polar regions is larger than elsewhere because ozone changes in polar regions are larger. For the next decades, surface clear-sky UV irradiance, particularly in the Antarctic, will continue to be higher than in 1980. Future increases in ozone and decreases in clear-sky UV will occur at slower rates than those associated with the ozone decreases and UV increases that occurred before 2000. In Antarctica, surface clear-sky UV is projected to return to 1980 levels between 2040 and 2060, while in the Arctic this is projected to occur between 2020 and 2030. By 2100, October surface clear-sky erythemal irradiance in Antarctica is likely to be between 5% below and 25% above 1960 levels, with considerable uncertainty. This is consistent with multi-model-mean October Antarctic total column ozone not returning to 1960 levels by 2100. In contrast, by 2100, surface clear-sky UV in the Arctic is projected to be 0–10% below 1960 levels.

Relevance:

30.00%

Publisher:

Abstract:

The method of entropy has been useful in evaluating inconsistency in human judgments. This paper illustrates the application of an entropy-based decision support system, called e-FDSS, to multicriterion risk and decision analysis in projects of construction small and medium enterprises (SMEs). It is optimized and solved by fuzzy logic, entropy, and genetic algorithms. A case study demonstrated the use of entropy in e-FDSS for analyzing multiple risk criteria in the predevelopment stage of SME projects. Survey data on the degree of impact of selected project risk criteria on different projects were input into the system in order to evaluate the preidentified project risks in an impartial environment. The results showed that, when the amount of uncertainty embedded in the evaluation process is not taken into account, all decision vectors are indeed full of bias; the deviations of the decisions are finally quantified, providing a more objective decision and risk assessment profile to project stakeholders in order to search for and screen the most profitable projects.
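The entropy component of such a system is typically the Shannon entropy weighting method, in which criteria whose scores vary little across projects receive low weight. A minimal sketch with invented survey scores (the fuzzy-logic and genetic-algorithm stages are omitted):

```python
# Shannon entropy weighting for multicriteria scoring: a criterion whose
# scores are nearly uniform across projects has high entropy and therefore
# low discriminating power (low weight). Scores are invented placeholders.
import math

# Rows: projects; columns: impact scores for three risk criteria (1-9 scale).
scores = [
    [7, 3, 5],
    [6, 4, 8],
    [8, 2, 6],
    [5, 5, 7],
]

n, m = len(scores), len(scores[0])
weights = []
for j in range(m):
    col = [row[j] for row in scores]
    total = sum(col)
    probs = [v / total for v in col]
    entropy = -sum(p * math.log(p) for p in probs) / math.log(n)  # in [0, 1]
    weights.append(1 - entropy)   # low entropy -> high discriminating power

s = sum(weights)
print([round(w / s, 3) for w in weights])   # normalised criterion weights
```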

Relevance:

30.00%

Publisher:

Abstract:

Climate controls upland habitats, soils and their associated ecosystem services; therefore, understanding possible changes in upland climatic conditions can provide a rapid assessment of climatic vulnerability over the next century. We used three different climatic indices that were optimised to fit the upland area classified by the EU as a Severely Disadvantaged Area (SDA) over 1961–1990. Upland areas within the SDA covered all altitudinal ranges, whereas the maximum altitude of lowland areas outside the SDA was ca. 300 m. In general, the climatic index based on the ratio between annual accumulated temperature (as a measure of growing season length) and annual precipitation predicted 96% of the mapped SDA area, which was slightly better than the indices based on annual or seasonal water deficit. Overall, all climatic indices showed that upland environments were exposed to some degree of change by 2071–2100 under UKCIP02 climate projections for the high and low emissions scenarios. The projected area declined by 13 to 51% across the three indices for the low emissions scenario and by 24 to 84% for the high emissions scenario. The mean altitude of the upland area increased by +11 to +86 m for the low scenario and by +21 to +178 m for the high scenario. Low-altitude areas in eastern and southern Great Britain were most vulnerable to change. These projected climatic changes are likely to affect upland habitat composition, long-term soil carbon storage and wider ecosystem service provision, although it is not yet possible to determine the rate at which this might occur.
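A ratio-type index of this kind can be sketched directly: accumulated temperature (a growing-degree-day style sum above a base temperature) divided by annual precipitation. The base temperature and the toy temperature series below are illustrative assumptions, not the study's calibrated values.

```python
# Ratio-type climatic index: annual accumulated temperature (degree-days above
# a base temperature, a proxy for growing season length) per mm of annual
# precipitation. Base temperature and inputs are illustrative assumptions.

def climatic_index(daily_mean_temps, annual_precip_mm, base_temp=0.0):
    """Accumulated temperature (degree-days above base) per mm of precipitation."""
    accumulated = sum(max(t - base_temp, 0.0) for t in daily_mean_temps)
    return accumulated / annual_precip_mm

# Hypothetical upland cell: cool year, high rainfall (crude 365-day series).
temps = [4.0] * 120 + [9.0] * 150 + [2.0] * 95
print(f"Index = {climatic_index(temps, annual_precip_mm=1800.0):.2f}")
```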