94 results for Sensitivity Analysis
Abstract:
Effective disaster risk management relies on science-based solutions to close the gap between prevention and preparedness measures. The consultation on the United Nations post-2015 framework for disaster risk reduction highlights the need for cross-border early warning systems to strengthen the preparedness phases of disaster risk management, in order to save lives and property and reduce the overall impact of severe events. Continental and global scale flood forecasting systems provide vital early flood warning information to national and international civil protection authorities, who can use this information to make decisions on how to prepare for upcoming floods. Here the potential monetary benefits of early flood warnings are estimated based on the forecasts of the continental-scale European Flood Awareness System (EFAS) using existing flood damage cost information and calculations of potential avoided flood damages. The benefits are of the order of 400 Euro for every 1 Euro invested. A sensitivity analysis is performed in order to test the uncertainty in the method and develop an envelope of potential monetary benefits of EFAS warnings. The results provide clear evidence that there is likely a substantial monetary benefit in this cross-border continental-scale flood early warning system. This supports the wider drive to implement early warning systems at the continental or global scale to improve our resilience to natural hazards.
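A minimal sketch of how such a benefit envelope could be built: sample uncertain inputs (avoided damage per warned event, number of effective warnings per year, annual system cost) and accumulate the distribution of benefit-cost ratios. All parameter names and ranges below are illustrative assumptions, not EFAS figures.

```python
import random

def benefit_cost_ratio(avoided_per_event, events_per_year, annual_cost):
    """Potential avoided flood damages per Euro of system cost."""
    return avoided_per_event * events_per_year / annual_cost

random.seed(1)
ratios = sorted(
    benefit_cost_ratio(
        random.uniform(5e6, 5e7),    # EUR avoided per warned event (assumed)
        random.uniform(5, 30),       # effective warnings per year (assumed)
        random.uniform(1e6, 3e6),    # annual system cost in EUR (assumed)
    )
    for _ in range(10_000)
)
print(f"5th-95th percentile ratio: {ratios[500]:.0f}-{ratios[9500]:.0f} to 1")
```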
Abstract:
Partial budgeting was used to estimate the net benefit of blending Jersey milk with Holstein-Friesian milk for Cheddar cheese production. Jersey milk increases Cheddar cheese yield; however, the cost of Jersey milk is also higher, so determining the balance of profitability is necessary, including consideration of seasonal effects. Input variables were based on a pilot plant experiment run from 2012 to 2013 and on industry milk and cheese prices during this period. When Jersey milk was used at an increasing rate with Holstein-Friesian milk (25, 50, 75, and 100% Jersey milk), it resulted in an increase in average net profit of 3.41, 6.44, 8.57, and 11.18 pence per kilogram of milk, respectively, and this additional profit was constant throughout the year. Sensitivity analysis showed that the most influential input on additional profit was cheese yield, whereas cheese price and milk price had a small effect. The minimum increase in yield necessary for the use of Jersey milk to be profitable was 2.63, 7.28, 9.95, and 12.37% at 25, 50, 75, and 100% Jersey milk, respectively. Including Jersey milk did not affect the quantity of whey butter and powder produced. Although further research is needed to ascertain the amount of additional profit that would be found on a commercial scale, the results indicate that using Jersey milk for Cheddar cheese making would lead to an improvement in profit for cheese makers, especially at higher inclusion rates.
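The partial-budget logic reduces to added returns (extra cheese yield times cheese price) minus added costs (the Jersey milk premium at the chosen inclusion rate). A hedged sketch, with all figures as placeholders rather than the study's inputs:

```python
def net_benefit_per_kg(yield_gain, cheese_price, jersey_premium, inclusion):
    """Partial budget in pence per kg of milk processed.
    yield_gain     : extra kg of cheese per kg of milk from blending
    cheese_price   : pence per kg of cheese
    jersey_premium : extra cost of Jersey over Holstein-Friesian milk, p/kg
    inclusion      : Jersey fraction of the blend (0-1)"""
    added_returns = yield_gain * cheese_price
    added_costs = jersey_premium * inclusion
    return added_returns - added_costs

def break_even_yield_gain(cheese_price, jersey_premium, inclusion):
    """Yield gain at which added returns just offset added costs."""
    return jersey_premium * inclusion / cheese_price

# e.g. net_benefit_per_kg(0.02, 550, 4.0, 0.25), all placeholder values
```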
Abstract:
A mathematical model for insect-vectored spread of Banana Xanthomonas Wilt (BXW) is presented. The model incorporates inflorescence infection and vertical transmission from the mother corm to attached suckers, but not tool-based transmission by humans. Expressions for the basic reproduction number R0 are obtained, and it is verified that the disease persists, at a unique endemic level, when R0 > 1. Sensitivity analysis showed that the inflorescence infection rate and the roguing rate were the parameters with the most influence on disease persistence and equilibrium level; vertical transmission parameters had less effect on persistence threshold values. Parameter values were estimated approximately from field data. The model indicates that single stem removal is a feasible approach to eradication if spread is mainly via inflorescence infection. This requires continuous surveillance and debudding, such that a 50% reduction in inflorescence infection combined with a surveillance interval of 2-3 weeks would eventually lead to full recovery of banana plantations and hence improved production.
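As an illustration of the threshold behaviour described, here is a deliberately simplified compartmental sketch (not the authors' exact equations) in which inflorescence infection, vertical transmission, roguing and mat turnover set R0; note that halving the inflorescence infection rate pushes R0 below 1, while the vertical-transmission term enters only through the small turnover rate:

```python
# S, I: susceptible / infected fractions of banana mats.
# beta: inflorescence infection rate; p: fraction of suckers from an
# infected mother corm that inherit infection (vertical transmission);
# rho: roguing (single stem removal) rate; mu: natural mat turnover.

def r0(beta, p, rho, mu):
    # New infections per infected mat over its mean infectious lifetime.
    return (beta + p * mu) / (rho + mu)

def simulate(beta, p, rho, mu, weeks=520, dt=0.1):
    S, I = 0.99, 0.01
    for _ in range(int(weeks / dt)):
        new_inf = beta * S * I + p * mu * I    # horizontal + vertical
        dS = mu * (1 - p) * I - beta * S * I   # replanted suckers - infections
        dI = new_inf - (rho + mu) * I          # minus roguing and turnover
        S, I = S + dt * dS, I + dt * dI
    return I

print(r0(beta=0.10, p=0.3, rho=0.05, mu=0.01))  # ~1.7 > 1: disease persists
print(r0(beta=0.05, p=0.3, rho=0.05, mu=0.01))  # ~0.9 < 1: eradication feasible
```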
Abstract:
For the tracking of extrema associated with weather systems to be applied to a broad range of fields, it is necessary to remove a background field that represents the slowly varying, large spatial scales. The sensitivity of the tracking analysis to the form of background field removed is explored for the Northern Hemisphere winter storm tracks for three contrasting fields from an integration of the U.K. Met Office's (UKMO) Hadley Centre Climate Model (HadAM3). Several methods are explored for the removal of a background field, from the simple subtraction of the climatology to the more sophisticated removal of the planetary scales. Two temporal filters are also considered, in the form of a 2-6-day Lanczos filter and a 20-day high-pass Fourier filter. The analysis indicates that the simple subtraction of the climatology tends to change the nature of the systems to the extent that there is a redistribution of the systems relative to the climatological background, resulting in very similar statistical distributions for both positive and negative anomalies. The optimal planetary wave filter removes total wavenumbers less than or equal to a number in the range 5-7, resulting in distributions more easily related to particular types of weather system. Of the temporal filters, the 2-6-day bandpass filter is found to have a detrimental impact on the individual weather systems, resulting in the storm tracks having a weak waveguide type of behavior. The 20-day high-pass temporal filter is less aggressive than the 2-6-day filter and produces results falling between those of the climatological and 2-6-day filters.
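For reference, 2-6-day Lanczos band-pass weights of the kind discussed can be constructed as follows; this is the standard Lanczos-window construction, shown as an illustrative sketch:

```python
import numpy as np

def lanczos_bandpass(window, fc_low, fc_high):
    """Band-pass Lanczos filter weights.
    window: odd number of weights; fc_low, fc_high: cutoff frequencies in
    cycles per time step (e.g. 1/6 and 1/2 for a 2-6-day band, daily data)."""
    n = (window - 1) // 2
    k = np.arange(-n, n + 1)
    sigma = np.sinc(k / n)  # Lanczos smoothing window, tapers the weights
    w = (2 * fc_high * np.sinc(2 * fc_high * k)
         - 2 * fc_low * np.sinc(2 * fc_low * k))
    return w * sigma

# Apply to a daily time series (placeholder `series`):
# filtered = np.convolve(series, lanczos_bandpass(61, 1/6, 1/2), mode="same")
```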
Abstract:
Reanalysis data obtained from data assimilation are increasingly used for diagnostic studies of the general circulation of the atmosphere, for the validation of modelling experiments and for estimating energy and water fluxes between the Earth's surface and the atmosphere. Because fluxes are not specifically observed, but determined by the data assimilation system, they are influenced not only by the utilized observations but also by model physics and dynamics and by the assimilation method. In order to better understand the relative importance of humidity observations for the determination of the hydrological cycle, in this paper we describe an assimilation experiment using the ERA40 reanalysis system in which all humidity data have been excluded from the observational database. The surprising result is that the model, driven by the time evolution of wind, temperature and surface pressure, is able to almost completely reconstitute the large-scale hydrological cycle of the control assimilation without the use of any humidity data. In addition, analysis of the individual weather systems in the extratropics and tropics using an objective feature tracking analysis indicates that the humidity data have very little impact on these systems. We include a discussion of these results and possible consequences for the way moisture information is assimilated, as well as the potential consequences for the design of observing systems for climate monitoring. It is further suggested, with support from a simple assimilation study with another model, that model physics and dynamics play a decisive role for the hydrological cycle, stressing the need to better understand these aspects of model parametrization.
Abstract:
A new method for assessing forecast skill and predictability that involves the identification and tracking of extratropical cyclones has been developed and implemented to obtain detailed information about the prediction of cyclones that cannot be obtained from more conventional analysis methodologies. The cyclones were identified and tracked along the forecast trajectories, and statistics were generated to determine the rate at which the position and intensity of the forecast storms diverge from the analyzed tracks as a function of forecast lead time. The results show a higher level of skill in predicting the position of extratropical cyclones than the intensity. They also show that there is potential to improve the skill in predicting the position by 1-1.5 days and the intensity by 2-3 days via improvements to the forecast model. Further analysis shows that forecast storms move at a slower speed than analyzed storms on average and that there is a larger error in the predicted amplitudes of intense storms than of weaker storms. The results also show that some storms can be predicted up to 3 days before they are identified as an 850-hPa vorticity center in the analyses. In general, the results show a higher level of skill in the Northern Hemisphere (NH) than in the Southern Hemisphere (SH); however, the rapid growth of NH winter storms is not very well predicted. The impact that observations of different types have on the prediction of the extratropical cyclones has also been explored, using forecasts integrated from analyses that were constructed from reduced observing systems. Terrestrial, satellite-based, and surface-based systems were investigated, and the results showed that the predictive skill of the terrestrial system was superior to that of the satellite system in the NH. Further analysis showed that the satellite system was not very good at predicting the growth of the storms. In the SH the terrestrial system had significantly less skill than the satellite system, highlighting the dominance of satellite observations in this hemisphere. The surface system had very poor predictive skill in both hemispheres.
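A sketch of the divergence statistic described: given forecast and analysed cyclone positions already matched by the tracker, compute the mean great-circle separation as a function of forecast lead time. The data layout is an assumption for illustration:

```python
import numpy as np

def great_circle_km(lat1, lon1, lat2, lon2, radius=6371.0):
    """Great-circle separation in km between two points given in degrees."""
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    cos_d = (np.sin(lat1) * np.sin(lat2)
             + np.cos(lat1) * np.cos(lat2) * np.cos(lon1 - lon2))
    return radius * np.arccos(np.clip(cos_d, -1.0, 1.0))

def position_error_by_lead(matched_points):
    """matched_points: iterable of (lead_h, fc_lat, fc_lon, an_lat, an_lon)
    for forecast/analysis track points already matched by the tracking."""
    errors = {}
    for lead, flat, flon, alat, alon in matched_points:
        errors.setdefault(lead, []).append(
            great_circle_km(flat, flon, alat, alon))
    return {lead: float(np.mean(v)) for lead, v in sorted(errors.items())}
```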
Abstract:
The interannual variability of the hydrological cycle is diagnosed from the Hadley Centre and Geophysical Fluid Dynamics Laboratory (GFDL) climate models, both of which are forced by observed sea surface temperatures. The models produce a similar sensitivity of clear-sky outgoing longwave radiation to surface temperature of ∼2 W m−2 K−1, indicating a consistent and positive clear-sky radiative feedback. However, differences between changes in the temperature lapse-rate and the height dependence of moisture fluctuations suggest that contrasting mechanisms bring about this result. The GFDL model appears to give a weaker water vapor feedback (i.e., changes in specific humidity). This is counteracted by a smaller upper tropospheric temperature response to surface warming, which implies a compensating positive lapse-rate feedback.
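The diagnosed sensitivity of ~2 W m−2 K−1 amounts to a regression slope of clear-sky OLR anomalies on surface temperature anomalies; a minimal sketch, assuming global-mean anomaly time series are available as arrays:

```python
import numpy as np

def clear_sky_olr_sensitivity(ts_anom, olr_cs_anom):
    """Slope of clear-sky OLR anomalies regressed on surface temperature
    anomalies, in W m-2 K-1 (inputs are placeholder global-mean series)."""
    slope, _intercept = np.polyfit(ts_anom, olr_cs_anom, 1)
    return slope  # ~2 W m-2 K-1 for both models, per the abstract
```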
Abstract:
One of the major uncertainties in the ability to predict future climate change, and hence its impacts, is the lack of knowledge of the earth's climate sensitivity. Here, data from the 1985-96 Earth Radiation Budget Experiment (ERBE) are combined with surface temperature change information and estimates of radiative forcing to diagnose the climate sensitivity. Importantly, the estimate is completely independent of climate model results. A climate feedback parameter of 2.3 ± 1.4 W m−2 K−1 is found. This corresponds to a 1.0-4.1 K range for the equilibrium warming due to a doubling of carbon dioxide (assuming Gaussian errors in observable parameters, which is approximately equivalent to a uniform "prior" in feedback parameter). The uncertainty range is due to a combination of the short time period for the analysis as well as uncertainties in the surface temperature time series and radiative forcing time series, mostly the former. Radiative forcings may not all be fully accounted for; however, an argument is presented that the estimate of climate sensitivity is still likely to be representative of longer-term climate change. The methodology can be used to 1) retrieve shortwave and longwave components of climate feedback and 2) suggest clear-sky and cloud feedback terms. There is preliminary evidence of a neutral or even negative longwave feedback in the observations, suggesting that current climate models may not be representing some processes correctly if they give a net positive longwave feedback.
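The quoted warming range follows directly from dT = F_2x / λ, taking the canonical doubled-CO2 forcing of ~3.7 W m−2 (an assumed standard value, not stated in the abstract):

```python
# dT = F_2x / lambda; F_2x ~ 3.7 W m-2 is an assumed canonical value.
F_2X = 3.7                    # W m-2, doubled-CO2 radiative forcing
lam, dlam = 2.3, 1.4          # W m-2 K-1, diagnosed feedback parameter

best = F_2X / lam             # ~1.6 K
low = F_2X / (lam + dlam)     # 3.7 / 3.7 = 1.0 K
high = F_2X / (lam - dlam)    # 3.7 / 0.9 ~ 4.1 K
print(f"equilibrium warming: {low:.1f}-{high:.1f} K (best {best:.1f} K)")
```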
Abstract:
This report forms part of a larger research programme on 'Reinterpreting the Urban-Rural Continuum', which conceptualises and investigates current knowledge and research gaps concerning 'the role that ecosystem services play in the livelihoods of the poor in regions undergoing rapid change'. The report aims to conduct a baseline appraisal of water-dependent ecosystem services, the roles they play within desakota livelihood systems and their potential sensitivity to climate change. The appraisal is conducted at three spatial scales: global, regional (four consortium areas), and meso scale (case studies within the four regions). At all three scales of analysis, water resources form the interweaving theme, because water provides a vital provisioning service for people, supports all other ecosystem processes, and is forecast to be severely affected under climate change scenarios. This report, combined with an EndNote library of over 1100 scientific papers, provides an annotated bibliography of water-dependent ecosystem services, the roles they play within desakota livelihood systems and their potential sensitivity to climate change. After an introductory section, Section 2 of the report defines water-related ecosystem services and how these are affected by human activities. Current knowledge and research gaps are then explored in relation to global-scale climate and related hydrological changes (e.g. floods, droughts, flow regimes) (Section 3). The report then discusses the impacts of climate changes on the ESPA regions, emphasising potential responses of biomes to the combined effects of climate change and human activities (particularly land use and management), and how these effects, coupled with water store and flow regime manipulation by humans, may affect the functioning of catchments and their ecosystem services (Section 4). Finally, at the meso scale, case studies are presented from within the ESPA regions to illustrate the close coupling of human activities and catchment performance in the context of environmental change (Section 5). At the end of each section, research needs are identified and justified. These research needs are then amalgamated in Section 6.
Abstract:
A multivariate fit to the variation in global mean surface air temperature anomaly over the past half century is presented. The fit procedure allows for the effect of response time on the waveform, amplitude and lag of each radiative forcing input, and each is allowed to have its own time constant. It is shown that the contribution of solar variability to the temperature trend since 1987 is small and downward; the best estimate is -1.3%, and the 2σ confidence level sets the uncertainty range at -0.7 to -1.9%. The result is the same whether one quantifies the solar variation using galactic cosmic ray fluxes (for which the analysis can be extended back to 1953) or the most accurate total solar irradiance data composite. The rise in global mean surface air temperatures is predominantly associated with a linear increase that represents the combined effects of changes in anthropogenic well-mixed greenhouse gases and aerosols, although, in recent decades, there is also a considerable contribution from a relative lack of major volcanic eruptions. The best estimate is that the anthropogenic factors contribute 75% of the rise since 1987, with an uncertainty range (set by the 2σ confidence level using an AR(1) noise model) of 49-160%; thus, the uncertainty is large, but we can state that at least half of the temperature trend comes from the linear term and that this term could explain the entire rise. The results are consistent with the Intergovernmental Panel on Climate Change (IPCC) estimates of the changes in radiative forcing (given for 1961-1995) and are here combined with those estimates to find the response times, equilibrium climate sensitivities and pertinent heat capacities (i.e. the depth into the oceans to which a given radiative forcing variation penetrates) of the quasi-periodic (decadal-scale) input forcing variations. As shown by previous studies, the decadal-scale variations do not penetrate as deeply into the oceans as the longer-term drifts and have shorter response times. Hence, conclusions about the response to century-scale forcing changes (and hence the associated equilibrium climate sensitivity and the temperature rise commitment) cannot be made from studies of the response to shorter-period forcing changes.
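A sketch of the fit idea: each forcing series is convolved with an exponential response of its own time constant, and the amplitudes are then found by least squares. This is an illustrative reconstruction of the described procedure, not the authors' code, and the forcing inputs are placeholders:

```python
import numpy as np

def exp_response(forcing, tau, dt=1.0):
    """Convolve a forcing series with a unit-area exponential response
    exp(-t/tau)/tau, which lags and smooths the input waveform."""
    t = np.arange(0.0, 10.0 * tau, dt)
    kernel = np.exp(-t / tau) / tau * dt
    return np.convolve(forcing, kernel)[: len(forcing)]

def fit_temperature(temp, forcings, taus, dt=1.0):
    """Least-squares amplitudes of the lagged responses; in the full
    analysis the time constants themselves are also optimised."""
    cols = [exp_response(f, tau, dt) for f, tau in zip(forcings, taus)]
    X = np.column_stack(cols + [np.ones(len(temp))])  # plus an intercept
    coeffs, *_ = np.linalg.lstsq(X, temp, rcond=None)
    return coeffs
```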
Abstract:
Ozone and its precursors were measured on board the Facility for Airborne Atmospheric Measurements (FAAM) BAe 146 Atmospheric Research Aircraft during the monsoon season of 2006 as part of the African Monsoon Multidisciplinary Analysis (AMMA) campaign. One of the main features observed in the West African boundary layer is the increase of the ozone mixing ratios from 25 ppbv over the forested area (south of 12° N) up to 40 ppbv over the Sahelian area. We employ a two-dimensional (latitudinal versus vertical) meteorological model coupled with an O3-NOx-VOC chemistry scheme to simulate the distribution of trace gases over West Africa during the monsoon season and to analyse the processes involved in the establishment of such a gradient. Including an additional source of NO over the Sahelian region, to account for NO emitted by soils, we simulate a mean NOx concentration of 0.7 ppbv at 16° N versus 0.3 ppbv over the vegetated region further south, in reasonable agreement with the observations. As a consequence, ozone is photochemically produced at a rate of 0.25 ppbv h−1 over the vegetated region, whilst the rate reaches up to 0.75 ppbv h−1 at 16° N. We find that the modelled gradient is due to a combination of enhanced deposition to vegetation, which decreases the ozone levels by up to 11 ppbv, and the aforementioned enhanced photochemical production north of 12° N. The peroxy radicals required for this enhanced production in the north come from the oxidation of background CO and CH4 as well as from VOCs. Sensitivity studies reveal that both the background CH4 and partially oxidised VOCs, produced from the oxidation of isoprene emitted from the vegetation in the south, contribute around 5-6 ppbv to the ozone gradient. These results suggest that the northward transport of trace gases by the monsoon flux, especially during nighttime, can have a significant, though secondary, role in determining the ozone gradient in the boundary layer. Convection, anthropogenic emissions and NO produced from lightning do not contribute to the establishment of the discussed ozone gradient.
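As a rough consistency check of the abstract's numbers (assuming a 12-hour photochemical day and neglecting transport, which the abstract finds secondary), the production-rate difference plus the deposition term is of the same order as the observed 15 ppbv gradient:

```python
# Rates are the abstract's values; the daylight duration and the neglect
# of transport/mixing are assumptions for this back-of-envelope check.
prod_north, prod_south = 0.75, 0.25      # ppbv h-1 photochemical production
daylight_hours = 12                      # assumed
production_term = (prod_north - prod_south) * daylight_hours  # 6 ppbv
deposition_term = 11                     # ppbv removed over vegetation
print(production_term + deposition_term) # ~17 ppbv vs observed 40 - 25 = 15
```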
Abstract:
Techniques for obtaining quantitative values of the temperatures and concentrations of remote hot gaseous effluents from their measured passive emission spectra have been examined in laboratory experiments. The high sensitivity of the spectrometer in the vicinity of the 2397 cm−1 band head region of CO2 has allowed the gas temperature to be calculated from the relative intensities of the observed rotational lines. The spatial distribution of the CO2 in a methane flame has been reconstructed tomographically using a matrix inversion technique. The spectrometer has been calibrated against a blackbody source at different temperatures, and a self-absorption correction has been applied to the data, avoiding the need to measure the transmission directly. Reconstruction artifacts have been reduced by applying a smoothing routine to the inversion matrix.
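The tomographic step amounts to solving a linear system y = Ax relating measured line-of-sight intensities to the unknown spatial distribution; a smoothing (Tikhonov-style) penalty plays the role of the smoothing routine applied to the inversion matrix. A hedged sketch, with the geometry matrix A assumed given:

```python
import numpy as np

def reconstruct(A, y, alpha=1e-2):
    """Solve min ||A x - y||^2 + alpha ||x||^2 (Tikhonov-regularised
    inversion); A maps the unknown CO2 distribution x onto the measured
    line-of-sight intensities y, and alpha controls the smoothing."""
    n = A.shape[1]
    lhs = A.T @ A + alpha * np.eye(n)
    rhs = A.T @ y
    return np.linalg.solve(lhs, rhs)
```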
Abstract:
The paper considers meta-analysis of diagnostic studies that use a continuous score for classification of study participants into healthy or diseased groups. Classification is often done on the basis of a threshold or cut-off value, which might vary between studies. Consequently, conventional meta-analysis methodology focusing solely on separate analyses of sensitivity and specificity might be confounded by a potentially unknown variation of the cut-off value. To cope with this phenomenon, it is suggested to use instead an overall estimate of the misclassification error, previously suggested and known as Youden's index; furthermore, it is argued that this index is less prone to between-study variation of cut-off values. A simple Mantel-Haenszel estimator is suggested as a summary measure of the overall misclassification error, which adjusts for a potential study effect. The measure of the misclassification error based on Youden's index is advantageous in that it easily allows an extension to a likelihood approach, which is then able to cope with unobserved heterogeneity via a nonparametric mixture model. All methods are illustrated with an example of a diagnostic meta-analysis on duplex Doppler ultrasound, with angiography as the standard, for stroke prevention.
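In code, the per-study Youden index and a Mantel-Haenszel-style pooled summary look like the sketch below; the weight n1*n0/(n1+n0) is the usual MH weight for a difference of two proportions, though the paper's exact estimator may differ:

```python
def youden(tp, fn, tn, fp):
    """Youden's index J = sensitivity + specificity - 1."""
    return tp / (tp + fn) + tn / (tn + fp) - 1

def pooled_youden(studies):
    """Mantel-Haenszel-style pooled Youden index.
    studies: list of (tp, fn, tn, fp) 2x2 counts, one per study."""
    num = den = 0.0
    for tp, fn, tn, fp in studies:
        n1, n0 = tp + fn, tn + fp        # diseased / healthy totals
        w = n1 * n0 / (n1 + n0)          # MH weight (assumed form)
        num += w * youden(tp, fn, tn, fp)
        den += w
    return num / den

# Example: two studies with different cut-offs but similar J (~0.7).
print(pooled_youden([(45, 5, 80, 20), (30, 10, 95, 5)]))
```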