173 results for Global analysis (Mathematics)
Abstract:
Observed global ocean heat content anomalies over the past five decades agree well with an anthropogenically forced simulation using the European Center/Hamburg coupled general circulation model (GCM) ECHAM4/OPYC3 considering increasing greenhouse gas concentrations, the direct and indirect effect of sulphate aerosols, and anthropogenic changes in tropospheric ozone. An optimal detection and attribution analysis confirms that the simulated climate change signal can be detected in the observations in both the upper 300 m and 3000 m of the water column and that the observed changes in ocean heat content are consistent with those expected from the anthropogenically forced GCM integration. This suggests that anthropogenic forcing is a likely explanation for the observed global ocean warming over the past five decades.
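The optimal detection and attribution step described above amounts, in its simplest form, to regressing observations onto a model-simulated signal pattern and checking whether the scaling factor is significantly positive (detection) and consistent with unity (attribution). A minimal single-signal sketch on synthetic data (no optimal covariance weighting; all numbers are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical model-simulated heat-content signal over five decades,
# and synthetic "observations" = signal plus internal-variability noise
signal = np.linspace(0.0, 1.0, 50)
obs = signal + rng.normal(0.0, 0.1, 50)

# Least-squares scaling factor beta in obs ~ beta * signal
beta = (signal @ obs) / (signal @ signal)

# Standard error of beta from the regression residuals
resid = obs - beta * signal
se = np.sqrt((resid @ resid) / (len(obs) - 1) / (signal @ signal))
lo, hi = beta - 2.0 * se, beta + 2.0 * se

# Detection: the interval for beta excludes zero;
# consistency with the forced signal: the interval contains one
print(round(beta, 3), lo > 0.0)
```

In a full analysis the regression is weighted by an estimate of the internal-variability covariance; the unweighted version above only illustrates the logic.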
Abstract:
Upper air observations from radiosondes and microwave satellite instruments do not indicate any global warming during the last 19 years, contrary to surface measurements, where a warming trend has reportedly been found. This result is difficult to reconcile, since climate model experiments indicate the reverse trend, namely, that upper tropospheric air should warm faster than the surface. To contribute toward an understanding of this difficulty, we have undertaken specific experiments to study the effect on climate of the decrease in stratospheric ozone and the Mount Pinatubo eruption in 1991. The associated forcing was added to the forcing from greenhouse gases, sulfate aerosols (direct and indirect effect), and tropospheric ozone, which was investigated in a separate series of experiments. Furthermore, we have undertaken an ensemble study in order to explore the natural variability of an advanced climate model exposed to such a forcing over 19 years. The results show that the reduction of stratospheric ozone cools not only the lower stratosphere but also the troposphere, in particular its upper and middle parts. In the upper troposphere, the cooling from stratospheric ozone leads to a significant reduction of greenhouse warming. The modeled stratospheric aerosols from Mount Pinatubo generate a climate response (stratospheric warming and tropospheric cooling) in good agreement with microwave satellite measurements. Finally, analysis of a series of experiments with both stratospheric ozone and the Mount Pinatubo effect shows considerable variability in the climate response, suggesting that an evolution with no warming over the period is as likely as one showing modest warming. However, the observed combination of no warming in the midtroposphere and clear warming at the surface is not found in the model simulations.
Abstract:
The currently available model-based global data sets of atmospheric circulation are a by-product of the daily requirement of producing initial conditions for numerical weather prediction (NWP) models. These data sets have been quite useful for studying fundamental dynamical and physical processes, and for describing the nature of the general circulation of the atmosphere. However, due to limitations in the early data assimilation systems and inconsistencies caused by numerous model changes, the available model-based global data sets may not be suitable for studying global climate change. A comprehensive analysis of global observations based on a four-dimensional data assimilation system with a realistic physical model should be undertaken to integrate space and in situ observations to produce internally consistent, homogeneous, multivariate data sets for the earth's climate system. The concept is equally applicable for producing data sets for the atmosphere, the oceans, and the biosphere, and such data sets will be quite useful for studying global climate change.
Abstract:
The societal need for reliable climate predictions and a proper assessment of their uncertainties is pressing. Uncertainties arise not only from initial conditions and forcing scenarios, but also from model formulation. Here, we identify and document three broad classes of problems, each representing what we regard to be an outstanding challenge in the area of mathematics applied to the climate system. First, there is the problem of the development and evaluation of simple physically based models of the global climate. Second, there is the problem of the development and evaluation of the components of complex models such as general circulation models. Third, there is the problem of the development and evaluation of appropriate statistical frameworks. We discuss these problems in turn, emphasizing the recent progress made by the papers presented in this Theme Issue. Many pressing challenges in climate science require closer collaboration between climate scientists, mathematicians and statisticians. We hope the papers contained in this Theme Issue will act as inspiration for such collaborations and for setting future research directions.
Abstract:
We synthesize existing sedimentary charcoal records to reconstruct Holocene fire history at regional, continental and global scales. The reconstructions are compared with the two potential controls of burning at these broad scales – changes in climate and human activities – to assess their relative importance on trends in biomass burning. Here we consider several hypotheses that have been advanced to explain the Holocene record of fire, including climate, human activities and synergies between the two. Our results suggest that 1) episodes of high fire activity were relatively common in the early Holocene and were consistent with climate changes despite low global temperatures and low levels of biomass burning globally; 2) there is little evidence from the paleofire record to support the Early Anthropocene Hypothesis of human modification of the global carbon cycle; 3) there was a nearly-global increase in fire activity from 3 to 2 ka that is difficult to explain with either climate or humans, but the widespread and synchronous nature of the increase suggests at least a partial climate forcing; and 4) burning during the past century generally decreased but was spatially variable; it declined sharply in many areas, but there were also large increases (e.g., Australia and parts of Europe). Our analysis does not exclude an important role for human activities on global biomass burning during the Holocene, but instead provides evidence for a pervasive influence of climate across multiple spatial and temporal scales.
Abstract:
The increasing use of social media, applications or platforms that allow users to interact online, ensures that this environment will provide a useful source of evidence for the forensics examiner. Current tools for the examination of digital evidence find this data problematic as they are not designed for the collection and analysis of online data. Therefore, this paper presents a framework for the forensic analysis of user interaction with social media. In particular, it presents an inter-disciplinary approach for the quantitative analysis of user engagement to identify relational and temporal dimensions of evidence relevant to an investigation. This framework enables the analysis of large data sets from which a (much smaller) group of individuals of interest can be identified. In this way, it may be used to support the identification of individuals who might be ‘instigators’ of a criminal event orchestrated via social media, or a means of potentially identifying those who might be involved in the ‘peaks’ of activity. In order to demonstrate the applicability of the framework, this paper applies it to a case study of actors posting to a social media Web site.
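The "peaks of activity" aspect of the framework can be illustrated with a simple per-interval post count and a variance-based threshold. This is only a sketch on synthetic timestamps; the burst time, hourly bin width, and 2-sigma threshold are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic post timestamps (seconds within one day): uniform background
# activity plus a burst around 12:30, standing in for an orchestrated event
background = rng.uniform(0, 86400, 500)
burst = rng.normal(45000, 600, 200)
timestamps = np.concatenate([background, burst])

# Bin posts into hourly intervals and flag hours well above typical activity
counts, _ = np.histogram(timestamps, bins=24, range=(0, 86400))
threshold = counts.mean() + 2.0 * counts.std()
peak_hours = np.flatnonzero(counts > threshold)

print(peak_hours)
```

In the framework's terms, the accounts posting within the flagged hours form the much smaller group of individuals examined for possible involvement in the peak.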
Abstract:
Offshoring and outsourcing in global value chains have been extensively analyzed from a strategic management perspective (Gereffi & Li, 2012; Gereffi, Humphrey & Sturgeon, 2005; Mudambi & Venzin, 2010). This paper examines these issues from an internalization theory perspective by summarizing the contribution of internalization theory to supply chain analysis; considering how a division of labor is coordinated and comparing coordination by management with coordination by the market; and discussing the formal models of supply chains developed by economists. Supply chain researchers possessing an interest in economic principles and good mathematical skills can make an important contribution to internalization theory, and it is hoped that this paper will encourage them to do so.
Abstract:
Analysis of single forcing runs from CMIP5 (the fifth Coupled Model Intercomparison Project) simulations shows that the mid-twentieth century temperature hiatus, and the coincident decrease in precipitation, is likely to have been influenced strongly by anthropogenic aerosol forcing. Models that include a representation of the indirect effect of aerosol better reproduce inter-decadal variability in historical global-mean near-surface temperatures, particularly the cooling in the 1950s and 1960s, compared to models with representation of the aerosol direct effect only. Models with the indirect effect also show a more pronounced decrease in precipitation during this period, which is in better agreement with observations, and greater inter-decadal variability in the inter-hemispheric temperature difference. This study demonstrates the importance of representing aerosols, and their indirect effects, in general circulation models, and suggests that inter-model diversity in aerosol burden and representation of aerosol–cloud interaction can produce substantial variation in simulations of climate variability on multi-decadal timescales.
Abstract:
This study focuses on the analysis of winter (October-November-December-January-February-March; ONDJFM) storm events and their changes due to increased anthropogenic greenhouse gas concentrations over Europe. In order to assess uncertainties that are due to model formulation, 4 regional climate models (RCMs) with 5 high resolution experiments, and 4 global general circulation models (GCMs) are considered. Firstly, cyclone systems as synoptic scale processes in winter are investigated, as they are a principal cause of the occurrence of extreme, damage-causing wind speeds. This is achieved by use of an objective cyclone identification and tracking algorithm applied to GCMs. Secondly, changes in extreme near-surface wind speeds are analysed. Based on percentile thresholds, the studied extreme wind speed indices allow a consistent analysis over Europe that takes systematic deviations of the models into account. Relative changes in both intensity and frequency of extreme winds and their related uncertainties are assessed and related to changing patterns of extreme cyclones. A common feature of all investigated GCMs is a reduced track density over central Europe under climate change conditions, if all systems are considered. If only extreme (i.e. the strongest 5%) cyclones are taken into account, an increasing cyclone activity for western parts of central Europe is apparent; however, the climate change signal reveals a reduced spatial coherency when compared to all systems, which exposes partially contrary results. With respect to extreme wind speeds, significant positive changes in intensity and frequency are obtained over at least 3 and 20% of the European domain under study (35–72°N and 15°W–43°E), respectively. Location and extension of the affected areas (up to 60 and 50% of the domain for intensity and frequency, respectively), as well as levels of changes (up to +15 and +200% for intensity and frequency, respectively) are shown to be highly dependent on the driving GCM, whereas differences between RCMs when driven by the same GCM are relatively small.
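A percentile-threshold wind index of the kind described above can be sketched as follows: each grid point gets its threshold from its own simulated distribution, so systematic model biases largely cancel when exceedance frequencies are compared between periods. Synthetic data; the Weibull winds, grid size, and 98th-percentile level are illustrative assumptions, not the study's settings:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic daily wind speeds, shape (time, lat, lon), for a control period
# and a scenario period with winds strengthened by 10 percent everywhere
control = rng.weibull(2.0, (3600, 10, 12)) * 8.0
scenario = rng.weibull(2.0, (3600, 10, 12)) * 8.8

# Each grid point gets its own threshold from the control distribution,
# so systematic per-grid-point model biases cancel out of the comparison
p98 = np.percentile(control, 98, axis=0)

freq_control = (control > p98).mean(axis=0)    # close to 0.02 by construction
freq_scenario = (scenario > p98).mean(axis=0)

# Relative change in the frequency of extreme winds at each grid point
rel_change = (freq_scenario - freq_control) / freq_control
print(rel_change.mean())
```

Because extremes sit in the tail of the distribution, even a modest 10 percent intensity increase roughly doubles the exceedance frequency here, which mirrors why the study reports much larger relative changes in frequency than in intensity.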
Abstract:
Interactions between different convection modes can be investigated using an energy-cycle description under a framework of mass-flux parameterization. The present paper systematically investigates this system by taking a limit of two modes: shallow and deep convection. Shallow convection destabilizes itself as well as the other convective modes by moistening and cooling the environment, whereas deep convection stabilizes itself as well as the other modes by drying and warming the environment. As a result, shallow convection leads to a runaway growth process in its stand-alone mode, whereas deep convection simply damps out. Interaction between these two convective modes becomes a rich problem, even when it is limited to the case with no large-scale forcing, because of these opposing tendencies. Only if the two modes are coupled at a proper level can a self-sustaining system arise, exhibiting a periodic cycle. The present study establishes the conditions for self-sustaining periodic solutions. It carefully documents the behaviour of the two-mode system in order to facilitate the interpretation of global model behaviours when this energy cycle is implemented as a closure in a convection parameterization in the future.
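The qualitative behaviour described above (shallow convection grows unstably on its own, deep convection damps out on its own, and coupling the two yields a self-sustaining periodic cycle) is shared by the classic Lotka-Volterra system, which can serve as a toy analogue. The equations and coefficients below are not the paper's energy cycle, only an illustration that such opposing tendencies can sustain an oscillation:

```python
# Toy Lotka-Volterra analogue: S stands in for shallow-convection energy,
# D for deep-convection energy. Coefficients are arbitrary illustrative
# choices, not parameters from the paper.
ALPHA, BETA, DELTA, GAMMA = 1.0, 0.5, 0.5, 1.0

def rhs(s, d):
    ds = ALPHA * s - BETA * s * d    # shallow: runaway growth, damped by deep
    dd = DELTA * s * d - GAMMA * d   # deep: fed by shallow, damps on its own
    return ds, dd

# Classical fourth-order Runge-Kutta integration of the coupled pair
s, d, dt = 1.0, 1.0, 0.01
traj = [(s, d)]
for _ in range(4000):
    k1s, k1d = rhs(s, d)
    k2s, k2d = rhs(s + 0.5 * dt * k1s, d + 0.5 * dt * k1d)
    k3s, k3d = rhs(s + 0.5 * dt * k2s, d + 0.5 * dt * k2d)
    k4s, k4d = rhs(s + dt * k3s, d + dt * k3d)
    s += dt / 6.0 * (k1s + 2 * k2s + 2 * k3s + k4s)
    d += dt / 6.0 * (k1d + 2 * k2d + 2 * k3d + k4d)
    traj.append((s, d))

# Neither mode dies out or blows up: the coupled pair cycles periodically
print(min(p[0] for p in traj), max(p[0] for p in traj))
```

Decoupling the modes (BETA = DELTA = 0) recovers the stand-alone behaviours from the abstract: S grows without bound and D decays to zero.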
First-order k-th moment finite element analysis of nonlinear operator equations with stochastic data
Abstract:
We develop and analyze a class of efficient Galerkin approximation methods for uncertainty quantification of nonlinear operator equations. The algorithms are based on sparse Galerkin discretizations of tensorized linearizations at nominal parameters. Specifically, we consider abstract, nonlinear, parametric operator equations J(α, u) = 0 for random input α(ω) with almost sure realizations in a neighborhood of a nominal input parameter α₀. Under some structural assumptions on the parameter dependence, we prove existence and uniqueness of a random solution, u(ω) = S(α(ω)). We derive a multilinear, tensorized operator equation for the deterministic computation of k-th order statistical moments of the random solution's fluctuations u(ω) − S(α₀). We introduce and analyze sparse tensor Galerkin discretization schemes for the efficient, deterministic computation of the k-th statistical moment equation. We prove a shift theorem for the k-point correlation equation in anisotropic smoothness scales and deduce that sparse tensor Galerkin discretizations of this equation converge, in accuracy versus complexity, at a rate which equals, up to logarithmic terms, that of the Galerkin discretization of a single instance of the mean field problem. We illustrate the abstract theory for nonstationary diffusion problems in random domains.
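In schematic form, the linearization and moment equations described in this abstract can be written as follows. This is a sketch in simplified notation: the derivative operators D_u J and D_α J and the first-order truncation are assumptions for illustration, not the paper's exact formulation.

```latex
% Fluctuations of the random input and solution about nominal values
\alpha'(\omega) = \alpha(\omega) - \alpha_0, \qquad
u'(\omega) = u(\omega) - S(\alpha_0)

% First-order linearization of J(\alpha, u) = 0
% at the nominal point (\alpha_0, u_0 = S(\alpha_0)):
D_u J(\alpha_0, u_0)\, u' = - D_\alpha J(\alpha_0, u_0)\, \alpha'

% Tensorizing and taking expectations yields a deterministic equation
% for the k-th moment \mathcal{M}^k v := \mathbb{E}[v^{\otimes k}]:
\bigl(D_u J(\alpha_0, u_0)\bigr)^{\otimes k} \mathcal{M}^k u'
  = \bigl(-D_\alpha J(\alpha_0, u_0)\bigr)^{\otimes k} \mathcal{M}^k \alpha'
```

The tensorized equation is deterministic and posed on a k-fold tensor product space, which is what the sparse tensor Galerkin discretization is designed to handle at near single-instance cost.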
Abstract:
Quantitative simulations of the global-scale benefits of climate change mitigation are presented, using a harmonised, self-consistent approach based on a single set of climate change scenarios. The approach draws on a synthesis of output from both physically-based and economics-based models, and incorporates uncertainty analyses. Previous studies have projected global and regional climate change and its impacts over the 21st century but have generally focused on analysis of business-as-usual scenarios, with no explicit mitigation policy included. This study finds that both the economics-based and physically-based models indicate that early, stringent mitigation would avoid a large proportion of the impacts of climate change projected for the 2080s. However, it also shows that not all the impacts can now be avoided, so that adaptation would also therefore be needed to avoid some of the potential damage. Delay in mitigation substantially reduces the percentage of impacts that can be avoided, providing strong new quantitative evidence for the need for stringent and prompt global mitigation action on greenhouse gas emissions, combined with effective adaptation, if large, widespread climate change impacts are to be avoided. Energy technology models suggest that such stringent and prompt mitigation action is technologically feasible, although the estimated costs vary depending on the specific modelling approach and assumptions.
Abstract:
In this paper, dual-hop amplify-and-forward (AF) cooperative systems in the presence of high-power amplifier (HPA) nonlinearity at semi-blind relays are investigated. Based on the modified AF cooperative system model taking into account the HPA nonlinearity, the expression for the output signal-to-noise ratio (SNR) at the destination node is derived, where the interference due to both the AF relaying mechanism and the HPA nonlinearity is characterized. The performance of the AF cooperative system under study is evaluated in terms of average symbol error probability (SEP), which is derived using the moment-generating function (MGF) approach, considering transmissions over Nakagami-m fading channels. Numerical results are provided and show the effects of system parameters, such as the HPA parameters, the number of relays, the quadrature amplitude modulation (QAM) order, and the Nakagami parameters, on the system performance.
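The MGF approach mentioned above averages the conditional error probability over the fading distribution via the SNR's moment-generating function. A minimal single-hop sketch for BPSK over Nakagami-m fading (ideal amplifier, so the HPA nonlinearity and relaying of the paper are not modeled; the function name and parameter choices are invented):

```python
import numpy as np
from scipy.integrate import quad

def sep_bpsk_nakagami(snr_db, m):
    """Average BPSK error probability over Nakagami-m fading via the
    MGF approach: P = (1/pi) * Int_0^{pi/2} M_gamma(-1/sin^2 t) dt,
    where M_gamma(s) = (1 - s*gbar/m)^(-m) is the MGF of the SNR."""
    gbar = 10.0 ** (snr_db / 10.0)  # average SNR (linear scale)
    integrand = lambda t: (1.0 + gbar / (m * np.sin(t) ** 2)) ** (-m)
    val, _ = quad(integrand, 0.0, np.pi / 2.0)
    return val / np.pi

# More severe fading (smaller m) gives a higher error probability
for m in (0.5, 1.0, 2.0, 4.0):   # m = 1 is Rayleigh fading
    print(m, sep_bpsk_nakagami(10.0, m))
```

For m = 1 this reproduces the closed-form Rayleigh result 0.5 * (1 - sqrt(gbar / (1 + gbar))), a convenient sanity check; the paper's analysis additionally folds the relay-path SNR and HPA distortion terms into the MGF.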