130 results for Robustness
Abstract:
Wind-generated waves at the sea surface are of outstanding importance, both for their practical relevance in many respects, such as coastal erosion, coastal protection, and safety of navigation, and for their scientific relevance in modifying fluxes at the air-sea interface. So far, long-term changes in ocean wave climate have been studied mostly from a regional perspective, with global dynamical studies emerging only recently. Here a global wave climate study is presented, in which a global wave model (WAM) is driven by atmospheric forcing from a global climate model (ECHAM5) for present-day and potential future climate conditions represented by the IPCC (Intergovernmental Panel on Climate Change) A1B emission scenario. It is found that changes in mean and extreme wave climate towards the end of the twenty-first century are small to moderate, with the largest signals being a poleward shift in the annual mean and extreme significant wave heights in the mid-latitudes of both hemispheres, more pronounced in the Southern Hemisphere, and most likely associated with a corresponding shift in the mid-latitude storm tracks. These changes are broadly consistent with results from the few studies available so far. The projected changes in the mean wave periods, associated with the changes in the wave climate in the mid to high latitudes, are also shown, revealing a moderate increase on the equatorial eastern side of the ocean basins. This study presents a step towards the larger ensemble of global wave climate projections required to better assess the robustness and uncertainty of potential future wave climate change.
Abstract:
In this paper we have proposed and analyzed a simple mathematical model consisting of four variables, viz., nutrient concentration, toxin-producing phytoplankton (TPP), non-toxic phytoplankton (NTP), and toxin concentration. Limitation in the concentration of the extracellular nutrient has been incorporated as an environmental stress condition for the plankton population, and the liberation of toxic chemicals has been described by a monotonic function of the extracellular nutrient. The model is analyzed and simulated to reproduce the experimental findings of Graneli and Johansson [Graneli, E., Johansson, N., 2003. Increase in the production of allelopathic Prymnesium parvum cells grown under N- or P-deficient conditions. Harmful Algae 2, 135–145]. The robustness of the numerical experiments is tested by a formal parameter sensitivity analysis. Ours is the first theoretical model consistent with the experiment of Graneli and Johansson (2003), and our results demonstrate that, when nutrient-deficient conditions are favorable for the TPP population to release toxic chemicals, the TPP species controls the bloom of the other, non-toxic phytoplankton species. Consistent with the observations of Graneli and Johansson (2003), our model overcomes a limitation of several other plankton-dynamics models, namely their failure to incorporate the effect of nutrient-limited toxin production.
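The kind of four-variable system described above can be sketched numerically. All functional forms and parameter values below are illustrative assumptions, not the equations analyzed in the paper; only the qualitative mechanism (toxin release increasing as nutrient becomes scarce, suppressing the non-toxic species) follows the abstract.

```python
# Hypothetical sketch of a nutrient-TPP-NTP-toxin system, integrated with
# forward Euler. Michaelis-Menten uptake and the nutrient-dependent toxin
# release function are assumptions for illustration only.
def simulate(n=1.0, p1=0.1, p2=0.1, c=0.0, dt=0.01, steps=20_000):
    for _ in range(steps):
        uptake1 = n / (1.0 + n) * p1      # Michaelis-Menten uptake by TPP
        uptake2 = n / (1.0 + n) * p2      # ... and by NTP
        release = 0.5 / (1.0 + n)         # toxin release grows as nutrient falls
        dn = 0.3 * (2.0 - n) - uptake1 - uptake2   # nutrient supply minus uptake
        dp1 = 0.8 * uptake1 - 0.1 * p1             # TPP growth minus mortality
        dp2 = 0.8 * uptake2 - 0.1 * p2 - c * p2    # NTP is also killed by toxin
        dc = release * p1 - 0.2 * c                # toxin release minus decay
        n, p1, p2, c = (max(v + dt * dv, 0.0) for v, dv in
                        zip((n, p1, p2, c), (dn, dp1, dp2, dc)))
    return n, p1, p2, c   # long-run state: nutrient, TPP, NTP, toxin
```

Because the two species are identical except for the extra toxin mortality on NTP, the sketch reproduces the qualitative outcome described above: the toxin-producing species ends up dominating.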
Abstract:
We have optimised the atmospheric radiation algorithm of the FAMOUS climate model on several hardware platforms. The optimisation involved translating the Fortran code to C and restructuring the algorithm around the computation of a single air column. Instead of the existing MPI-based domain decomposition, we used a task queue and a thread pool to schedule the computation of individual columns on the available processors. Finally, four air columns are packed together in a single data structure and computed simultaneously using Single Instruction Multiple Data operations. The modified algorithm runs more than 50 times faster on the CELL’s Synergistic Processing Elements than on its main PowerPC processing element. On Intel-compatible processors, the new radiation code runs 4 times faster. On the tested graphics processor, using OpenCL, we find a speed-up of more than 2.5 times as compared to the original code on the main CPU. Because the radiation code takes more than 60% of the total CPU time, FAMOUS executes more than twice as fast. Our version of the algorithm returns bit-wise identical results, which demonstrates the robustness of our approach. We estimate that this project required around two and a half man-years of work.
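The scheduling pattern described above can be sketched in Python: air columns are packed four at a time (mimicking the 4-wide SIMD lanes, here via NumPy vectorization) and a pool of worker threads consumes the queue of packed tasks. The kernel below is a trivial stand-in for the real radiation code, and all names are illustrative, not taken from FAMOUS.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

LEVELS, LANES = 22, 4   # levels per column; columns packed per task

def compute_packed(cols):
    # cols: (LEVELS, LANES) array, one column per "SIMD lane".
    # Placeholder physics: cumulative attenuation down the column.
    return np.cumprod(np.exp(-cols), axis=0)

def run_columns(columns, workers=4):
    # columns: (LEVELS, n) array with n a multiple of LANES.
    packs = [columns[:, i:i + LANES] for i in range(0, columns.shape[1], LANES)]
    with ThreadPoolExecutor(max_workers=workers) as pool:   # the thread pool
        results = list(pool.map(compute_packed, packs))     # the task queue
    return np.concatenate(results, axis=1)
```

Since each column is computed independently, the threaded result matches a serial reference exactly, which mirrors the bit-wise identity check the paper uses to validate its restructuring.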
Abstract:
Radiative forcing and climate sensitivity have been widely used as concepts to understand climate change. This work performs climate change experiments with an intermediate general circulation model (IGCM) to examine the robustness of the radiative forcing concept for carbon dioxide and solar constant changes. This IGCM has been specifically developed as a computationally fast model, but one that allows an interaction between physical processes and large-scale dynamics; the model allows many long integrations to be performed relatively quickly. It employs a fast and accurate radiative transfer scheme, as well as simple convection and surface schemes and a slab ocean, to model the effects of climate change mechanisms on atmospheric temperatures and dynamics with a reasonable degree of complexity. The climatology of the IGCM, run at T21 resolution with 22 levels, is compared to European Centre for Medium-Range Weather Forecasts (ECMWF) reanalysis data. The response of the model to changes in carbon dioxide and solar output is examined when these changes are applied globally and when constrained geographically (e.g. over land only). The CO2 experiments have a roughly 17% higher climate sensitivity than the solar experiments. It is also found that a forcing at high latitudes causes a 40% higher climate sensitivity than a forcing applied only at low latitudes. Despite differences in the model feedbacks, climate sensitivity is roughly constant over a range of distributions of CO2 and solar forcings. Hence, in the IGCM at least, the radiative forcing concept is capable of predicting global surface temperature changes to within 30% for the perturbations described here. It is concluded that radiative forcing remains a useful tool for assessing the natural and anthropogenic impact of climate change mechanisms on surface temperature.
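The radiative forcing concept examined above predicts the surface temperature response as a sensitivity parameter times the forcing. As an illustration not taken from this paper, the sketch below combines this with the widely used simplified CO2 forcing fit F = 5.35 ln(C/C0) W m⁻²; the sensitivity value is an assumption.

```python
import math

# Sketch of the radiative forcing concept: Delta_T = lambda * F.
# The logarithmic CO2 forcing fit is a standard simplified expression,
# and lambda = 0.8 K per (W m^-2) is an assumed illustrative value.
def co2_forcing(c_ppm, c0_ppm=280.0):
    """Radiative forcing in W m^-2 for a CO2 concentration of c_ppm."""
    return 5.35 * math.log(c_ppm / c0_ppm)

def delta_t(forcing, lam=0.8):
    """Predicted global surface temperature change in K."""
    return lam * forcing

f2x = co2_forcing(560.0)   # doubled CO2: roughly 3.7 W m^-2
dt2x = delta_t(f2x)        # roughly 3 K with the assumed lambda
```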
Abstract:
What determines the emergence and survival of democracy? The authors apply extreme bounds analysis to test the robustness of fifty-nine factors proposed in the literature, evaluating over three million regressions with data from 165 countries from 1976 to 2002. The most robust determinants of the transition to democracy are gross domestic product (GDP) growth (a negative effect), past transitions (a positive effect), and Organisation for Economic Co-operation and Development membership (a positive effect). There is some evidence that fuel exporters and Muslim countries are less likely to see democracy emerge, although the latter finding is driven entirely by oil-producing Muslim countries. Regarding the survival of democracy, the most robust determinants are GDP per capita (a positive effect) and past transitions (a negative effect). There is some evidence that having a former military leader as the chief executive has a negative effect, while having other democracies as neighbors has a reinforcing effect.
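Extreme bounds analysis of the sort applied above can be sketched as follows: the coefficient on a focus variable is re-estimated over every small subset of "doubtful" controls, and the variable is called robust if the extreme bounds share one sign. The subset size, the two-standard-error bounds, and the synthetic data are illustrative simplifications, not the authors' exact procedure.

```python
import itertools
import numpy as np

# Minimal sketch of Leamer-style extreme bounds analysis for one focus
# variable x against a pool of doubtful controls Z.
def extreme_bounds(y, x, Z, max_controls=3):
    n = len(y)
    lo, hi = np.inf, -np.inf
    for k in range(max_controls + 1):
        for subset in itertools.combinations(range(Z.shape[1]), k):
            X = np.column_stack([np.ones(n), x] + [Z[:, j] for j in subset])
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            resid = y - X @ beta
            sigma2 = resid @ resid / (n - X.shape[1])       # error variance
            se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
            lo = min(lo, beta[1] - 2.0 * se)                # lower extreme bound
            hi = max(hi, beta[1] + 2.0 * se)                # upper extreme bound
    return lo, hi, bool(lo > 0.0 or hi < 0.0)   # robust if bounds share a sign
```

The study's millions of regressions arise simply because this loop is repeated for each of the fifty-nine candidate variables over a much larger pool of control subsets.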
Abstract:
We assess the robustness of previous findings on the determinants of terrorism. Using extreme bounds analysis, the three most comprehensive terrorism datasets, and focusing on the three most commonly analyzed aspects of terrorist activity, i.e., location, victim, and perpetrator, we re-assess the effect of 65 proposed correlates. Evaluating around 13.4 million regressions, we find 18 variables to be robustly associated with the number of incidents occurring in a given country-year, 15 variables with attacks against citizens from a particular country in a given year, and six variables with attacks perpetrated by citizens of a particular country in a given year.
Abstract:
Tests for business cycle asymmetries are developed for Markov-switching autoregressive models. The tests of deepness, steepness, and sharpness are Wald statistics, which have standard asymptotics. For the standard two-regime model of expansions and contractions, deepness is shown to imply sharpness (and vice versa), whereas the process is always nonsteep. Two- and three-state models of U.S. GNP growth are used to illustrate the approach, along with models of U.S. investment and consumption growth. The robustness of the tests to model misspecification, and the effects of regime-dependent heteroscedasticity, are investigated.
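The Wald statistics behind such tests take the generic form W = (Rθ − r)′[RVR′]⁻¹(Rθ − r), asymptotically chi-square with rank(R) degrees of freedom. A minimal sketch with purely illustrative numbers follows; the deepness, steepness, and sharpness restrictions themselves are model-specific and not reproduced here.

```python
import numpy as np

# Generic Wald statistic for linear restrictions R theta = r on an
# estimator theta with covariance matrix V.
def wald_stat(theta, V, R, r):
    d = R @ theta - r
    return float(d @ np.linalg.solve(R @ V @ R.T, d))

# Example: one restriction, equality of two regime means (theta_1 = theta_2).
theta = np.array([1.2, -0.4])                 # estimated regime means (made up)
V = np.array([[0.04, 0.0], [0.0, 0.09]])      # their covariance matrix (made up)
R = np.array([[1.0, -1.0]])                   # restriction matrix
r = np.array([0.0])
W = wald_stat(theta, V, R, r)  # compare with the chi2(1) 5% value, 3.84
```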
Abstract:
A number of studies have found an asymmetric response of consumer price index inflation to the output gap in the US in simple Phillips curve models. We consider whether there are similar asymmetries in mark-up pricing models, that is, whether the mark-up over producers' costs also depends upon the sign of the (adjusted) output gap. The robustness of our findings to the price series is assessed, and also whether price-output responses in the UK are asymmetric.
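One common way to specify such an asymmetry is to give the positive and negative parts of the output gap separate slopes. The sketch below, fitted on synthetic data, illustrates this generic sign-split specification; it is not necessarily the authors' exact model.

```python
import numpy as np

# Regress inflation (or the mark-up) on the positive and negative parts of
# the output gap separately; asymmetry shows up as unequal slopes.
def fit_asymmetric(pi, gap):
    X = np.column_stack([np.ones_like(gap),
                         np.maximum(gap, 0.0),    # positive part of the gap
                         np.minimum(gap, 0.0)])   # negative part of the gap
    beta, *_ = np.linalg.lstsq(X, pi, rcond=None)
    return beta   # [intercept, slope_pos, slope_neg]
```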
Abstract:
Climate model ensembles are widely heralded for their potential to quantify uncertainties and generate probabilistic climate projections. However, such technical improvements to modeling science will do little to deliver on their ultimate promise of improving climate policymaking and adaptation unless the insights they generate can be effectively communicated to decision makers. While some of these communicative challenges are unique to climate ensembles, others are common to hydrometeorological modeling more generally, and to the tensions arising between the imperatives for saliency, robustness, and richness in risk communication. The paper reviews emerging approaches to visualizing and communicating climate ensembles and compares them to the more established and thoroughly evaluated communication methods used in the numerical weather prediction domains of day-to-day weather forecasting (in particular probabilities of precipitation), hurricane and flood warning, and seasonal forecasting. This comparative analysis informs recommendations on best practice for climate modelers, as well as prompting some further thoughts on key research challenges to improve the future communication of climate change uncertainties.
Abstract:
With advances in technology, terahertz imaging and spectroscopy are beginning to move out of the laboratory and find applications in areas as diverse as security screening, medicine, art conservation and field archaeology. Nevertheless, there is still a need to improve upon the performance of existing terahertz systems to achieve greater compactness and robustness, enhanced spatial resolution, more rapid data acquisition times and operation at greater standoff distances. This chapter will review recent technological developments in this direction that make use of nanostructures in the generation, detection and manipulation of terahertz radiation. The chapter will also explain how terahertz spectroscopy can be used as a tool to characterize the ultrafast carrier dynamics of nanomaterials.
Abstract:
This paper presents the PETS2009 outdoor crowd image analysis surveillance dataset and the performance evaluation of people counting, detection and tracking results using the dataset submitted to five IEEE Performance Evaluation of Tracking and Surveillance (PETS) workshops. The evaluation was carried out using well established metrics developed in the Video Analysis and Content Extraction (VACE) programme and the CLassification of Events, Activities, and Relationships (CLEAR) consortium. The comparative evaluation highlights the detection and tracking performance of the authors’ systems in areas such as precision, accuracy and robustness and provides a brief analysis of the metrics themselves to provide further insights into the performance of the authors’ systems.
Abstract:
In this paper, we study the role of the volatility risk premium for the forecasting performance of implied volatility. We introduce a non-parametric and parsimonious approach to adjust the model-free implied volatility for the volatility risk premium and implement this methodology using more than 20 years of options and futures data on three major energy markets. Using regression models and statistical loss functions, we find compelling evidence to suggest that the risk premium adjusted implied volatility significantly outperforms other models, including its unadjusted counterpart. Our main finding holds for different choices of volatility estimators and competing time-series models, underlining the robustness of our results.
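A simpler, cruder variant of a risk-premium adjustment can be sketched as follows; the paper's non-parametric procedure is not reproduced here, and the trailing-mean estimator below is an illustrative assumption.

```python
import numpy as np

# Estimate the volatility risk premium as the trailing mean gap between
# implied and subsequently realized variance, then subtract it from the
# current implied variance before using it as a forecast.
def vrp_adjusted_iv(iv_hist, rv_hist, iv_now, window=60):
    # iv_hist[i]: implied vol quoted at time i;
    # rv_hist[i]: volatility subsequently realized over the matching horizon.
    premium = np.mean(iv_hist[-window:] ** 2 - rv_hist[-window:] ** 2)
    return np.sqrt(max(iv_now ** 2 - premium, 0.0))
```

With a stable positive premium, the adjusted forecast shrinks implied volatility back toward the level actually realized, which is the intuition behind the paper's finding that the adjusted series forecasts better than the raw one.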
Abstract:
Realistic representation of sea ice in ocean models involves the use of a non-linear free surface, a real freshwater flux, and observance of the requisite conservation laws. We show here that these properties can be achieved in practice through the use of a rescaled vertical coordinate "z*" in z-coordinate models, which allows one to follow undulations in the free surface under sea ice loading. In particular, the adoption of "z*" avoids the difficult issue of vanishing levels under thick ice. Details of the implementation within MITgcm are provided. A high-resolution global ocean-sea ice simulation illustrates the robustness of the z* formulation and reveals a source of oceanic variability associated with sea ice dynamics and ice-loading effects. The use of the z* coordinate allows one to achieve perfect conservation of fresh water, heat, and salt, as shown in an extended integration of the coupled ocean-sea ice-atmosphere model.
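One common form of such a rescaled coordinate, following z* formulations in the literature (sign and scaling conventions may differ from the actual MITgcm implementation), maps the instantaneous water column onto a fixed range:

```python
# z* = H (z - eta) / (H + eta), where eta is the free-surface height
# (depressed under ice loading) and H the resting depth, so z* always
# spans [-H, 0] regardless of eta.
def z_star(z, eta, H):
    return H * (z - eta) / (H + eta)

def z_from_z_star(zs, eta, H):
    return eta + zs * (H + eta) / H   # inverse mapping

# The surface z = eta maps to z* = 0 and the bottom z = -H to z* = -H,
# which is what lets the model keep well-defined levels under thick ice.
```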
Abstract:
Intense winter cyclones often lead to hazardous weather over Europe. Previous studies have pointed to a link between the North Atlantic Oscillation (NAO) and strong European windstorms. However, the robustness of this relation for cyclones of varying intensities remains largely unexplored. In this paper, the bi-directional relation between the NAO and cyclones impacting Europe is analyzed for the period 1950–2010 focusing on the sensitivity to storm intensity. Evidence is given that explosive (EC) and non-explosive cyclones (NoEC) predominantly develop under different large-scale circulation conditions over the North Atlantic. Whereas NoEC evolve more frequently under negative and neutral NAO phases, the number of EC is larger under a positive NAO phase, typically characterized by an intensified jet toward Western Europe. Important differences are also found on the dynamics of NAO evolution after peak intensity for both cyclone populations.
Abstract:
Understanding observed changes to the global water cycle is key to predicting future climate changes and their impacts. While many datasets document crucial variables such as precipitation, ocean salinity, runoff, and humidity, most are too uncertain to determine long-term changes reliably. In situ networks provide long time series over land but are sparse in many regions, particularly the tropics. Satellite and reanalysis datasets provide global coverage, but their long-term stability is lacking. However, comparisons of changes among related variables can give insights into the robustness of observed changes. For example, ocean salinity, interpreted with an understanding of ocean processes, can help cross-validate precipitation. Observational evidence for human influences on the water cycle is emerging, but uncertainties resulting from internal variability and observational errors are too large to determine whether the observed and simulated changes are consistent. Improvements to the in situ and satellite observing networks that monitor the changing water cycle are required, yet continued data coverage is threatened by funding reductions. Uncertainty, both in the role of anthropogenic aerosols and due to large internal climate variability, presently limits confidence in the attribution of observed changes.