995 results for Multivariate processes
Abstract:
Droughts tend to evolve slowly and affect large areas simultaneously, which suggests that an improved understanding of the spatial coherence of drought would enable better mitigation of drought impacts through enhanced monitoring and forecasting strategies. This study employs an up-to-date dataset of over 500 river flow time series from 11 European countries, along with a gridded precipitation dataset, to examine the spatial coherence of drought in Europe using regional indicators of precipitation and streamflow deficit. The drought indicators were generated for 24 homogeneous regions and, for selected regions, historical drought characteristics were corroborated with previous work. The spatial coherence of drought characteristics was then examined at a European scale. Historical droughts generally have distinctive signatures in their spatio-temporal development, so there was limited scope for using the evolution of historical events to inform forecasting. Rather, relationships were explored between time series of drought indicators in different regions. Correlations were generally low, but multivariate analyses revealed broad continental-scale patterns, which appear to be related to large-scale atmospheric circulation indices (in particular, the North Atlantic Oscillation and the East Atlantic/West Russia pattern). A novel methodology for forecasting was developed (and demonstrated with reference to the United Kingdom), which predicts drought from drought, i.e. it uses the spatial coherence of drought to facilitate early warning of drought in a target region from drought that is developing elsewhere in Europe. Whilst the skill of the methodology is relatively modest at present, this approach presents a potential new avenue for forecasting, which offers significant advantages in that it allows prediction for all seasons, and also shows some potential for forecasting the termination of drought conditions.
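As an illustration of the "drought from drought" idea, the sketch below (Python, not the authors' code) fits a simple lagged linear model between a remote region's drought indicator and a target region's indicator. The synthetic series, the three-month lead time and the regions are all assumptions made for the example.

```python
# Minimal sketch of a lagged "drought from drought" predictor: regress a
# target region's drought indicator on a remote region's indicator at a lead
# of `lag` months. All data here are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)
n_months = 480
remote = rng.standard_normal(n_months).cumsum() * 0.1   # stand-in indicator series
lag = 3                                                  # assumed lead time (months)
target = 0.4 * np.roll(remote, lag) + rng.standard_normal(n_months) * 0.5
target[:lag] = np.nan                                    # rolled values undefined here

x, y = remote[:-lag], target[lag:]                       # align remote[t] with target[t+lag]
slope, intercept = np.polyfit(x, y, 1)                   # simple lagged linear model
forecast = intercept + slope * remote[-1]                # predict `lag` months ahead
skill = np.corrcoef(x, y)[0, 1] ** 2                     # variance explained
print(f"forecast={forecast:.2f}, r^2={skill:.2f}")
```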
Abstract:
The case is made for a more careful analysis of the large-time asymptotics of infinite particle systems in the thermodynamic limit beyond zero density. The insufficiency of the current analysis, even in the model case of free particles, is demonstrated. Recent advances based on more sophisticated analytical tools, such as functions of mean variation and Hardy spaces, are sketched.
Abstract:
This paper examines the biogas innovation system and processes in two farming communities in Davao del Sur, Philippines. Innovation histories were traced through workshops, semi-structured interviews, observations and document analysis. The paper shows that there were diverse innovation actors from both the public and private sectors. Restrictive attitudes and practices resulted in weak and limited interactions among actors. Multi-actor interaction was weak, signifying a lack of innovation actors focused on creating, developing and strengthening linkages, networks and partnerships. The lack of support in the socio-organisational institutions that constitute the enabling environment within which innovation actors operate may lead to systemic failure.
Abstract:
The currently available model-based global data sets of atmospheric circulation are a by-product of the daily requirement of producing initial conditions for numerical weather prediction (NWP) models. These data sets have been quite useful for studying fundamental dynamical and physical processes, and for describing the nature of the general circulation of the atmosphere. However, due to limitations in the early data assimilation systems and inconsistencies caused by numerous model changes, the available model-based global data sets may not be suitable for studying global climate change. A comprehensive analysis of global observations based on a four-dimensional data assimilation system with a realistic physical model should be undertaken to integrate space and in situ observations to produce internally consistent, homogeneous, multivariate data sets for the Earth's climate system. The concept is equally applicable for producing data sets for the atmosphere, the oceans, and the biosphere, and such data sets will be quite useful for studying global climate change.
Abstract:
The purpose of this lecture is to review recent developments in data analysis, initialization and data assimilation. The development of three-dimensional multivariate schemes has been very timely because of their suitability for handling the many different types of observations available during FGGE. Great progress has taken place in the initialization of global models with the aid of the non-linear normal mode technique. However, in spite of this great progress, several fundamental problems remain unsatisfactorily resolved. Of particular importance are the initialization of the divergent wind fields in the Tropics and the search for proper ways to initialize weather systems driven by non-adiabatic processes. The unsatisfactory ways in which such processes are currently initialized lead to excessively long spin-up times.
Abstract:
In this paper we report on a study conducted using the Middle Atmospheric Nitrogen TRend Assessment (MANTRA) balloon measurements of stratospheric constituents and temperature and the Canadian Middle Atmosphere Model (CMAM). Three different kinds of data are used to assess the inter-consistency of the combined dataset: single profiles of long-lived species from MANTRA 1998, sparse climatologies from the ozonesonde measurements during the four MANTRA campaigns and from HALOE satellite measurements, and the CMAM climatology. In doing so, we evaluate the ability of the model to reproduce the measured fields and thereby test our ability to describe mid-latitude summertime stratospheric processes. The MANTRA campaigns were conducted at Vanscoy, Saskatchewan, Canada (52° N, 107° W) in late August and early September of 1998, 2000, 2002 and 2004. During late summer at mid-latitudes, the stratosphere is close to photochemical control, providing an ideal scenario for the study reported here. From this analysis we find that: (1) reducing the value of the vertical diffusion coefficient in CMAM to a more physically reasonable value results in the model better reproducing the measured profiles of long-lived species; (2) the existence of compact correlations among the constituents, as expected from independent measurements in the literature and from models, confirms the self-consistency of the MANTRA measurements; and (3) the 1998 measurements show structures in the chemical species profiles that can be associated with transport, adding to the growing evidence that the summertime stratosphere can be much more disturbed than anticipated. The mechanisms responsible for such disturbances need to be understood in order to assess the representativeness of the measurements and to isolate long-term trends.
Abstract:
We discuss the modeling of dielectric responses of electromagnetically excited networks which are composed of a mixture of capacitors and resistors. Such networks can be employed as lumped-parameter circuits to model the response of composite materials containing conductive and insulating grains. The dynamics of the excited network systems are studied using a state space model derived from a randomized incidence matrix. Time and frequency domain responses from synthetic data sets generated from state space models are analyzed for the purpose of estimating the fraction of capacitors in the network. Good results were obtained by using either the time-domain response to a pulse excitation or impedance data at selected frequencies. A chemometric framework based on the Successive Projections Algorithm (SPA) enables the construction of multiple linear regression (MLR) models which can efficiently determine the ratio of conductive to insulating components in composite material samples. The proposed method avoids restrictions commonly associated with Archie's law, the application of percolation theory or Kohlrausch-Williams-Watts models, and is applicable to experimental results generated by either time domain transient spectrometers or continuous-wave instruments. Furthermore, it is quite generic and applicable to tomography and acoustics, as well as other spectroscopies such as nuclear magnetic resonance and electron paramagnetic resonance, and should therefore be of general interest across the dielectrics community.
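The SPA-plus-MLR pipeline can be sketched as follows. This is a minimal illustration, not the authors' implementation: the synthetic impedance matrix, the number of selected variables and the starting column are all assumptions.

```python
# Minimal sketch of SPA variable selection followed by MLR, standing in for
# the chemometric framework described above. Rows of X are synthetic
# "impedance spectra"; y plays the role of the capacitor fraction.
import numpy as np

def spa_select(X, n_select, start=0):
    """Successive Projections Algorithm: greedily chain variables whose
    columns have the largest norm after projection onto the orthogonal
    complement of the variables already chosen (minimises collinearity)."""
    Xp = X.astype(float).copy()
    selected = [start]
    for _ in range(n_select - 1):
        v = Xp[:, selected[-1]]
        # Project every column onto the complement of the last selected one.
        Xp = Xp - np.outer(v, v @ Xp) / (v @ v)
        norms = np.linalg.norm(Xp, axis=0)
        norms[selected] = -1.0            # exclude already-selected variables
        selected.append(int(norms.argmax()))
    return selected

rng = np.random.default_rng(1)
n_samples, n_freqs = 60, 40
X = rng.standard_normal((n_samples, n_freqs))       # stand-in impedance data
y = X[:, [5, 20, 33]] @ np.array([0.5, 0.3, 0.2])   # assumed "true" dependence

cols = spa_select(X, n_select=3)
A = np.column_stack([np.ones(n_samples), X[:, cols]])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)        # MLR on selected variables
print("selected frequencies:", cols, " coefficients:", np.round(coef, 3))
```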
Abstract:
Anthropogenic midden deposits are remarkably well preserved at the Neolithic settlement of Çatalhöyük and provide significant archaeological information on the types and nature of activities occurring at the site. To decipher their complex stratigraphy and to investigate formation processes, a combination of geoarchaeological techniques was used. Deposits were investigated from the early ceramic to late Neolithic levels, targeting continuous sequences to examine high-resolution and broader-scale changes in deposition. Thin-section micromorphology combined with targeted phytolith and geochemical analyses indicates that they are composed of a diverse range of ashes and other charred and siliceous plant materials, with inputs of decayed plants and organic matter, fecal waste, and sedimentary aggregates, each with diverse depositional pathways. Activities identified include in situ burning, with a range of different fuel types that may be associated with different activities. The complexity and heterogeneity of the midden deposits are demonstrated, and with them the necessity of employing an integrated microstratigraphic approach as a prerequisite for cultural and palaeoenvironmental reconstructions.
Abstract:
During long-range transport, many distinct processes – including photochemistry, deposition, emissions and mixing – contribute to the transformation of air mass composition. Partitioning the effects of different processes can be useful when considering the sensitivity of chemical transformation to, for example, a changing environment or anthropogenic influence. However, transformation is not observed directly, since mixing ratios are measured, and models must be used to relate changes to processes. Here, four cases from the ITCT-Lagrangian 2004 experiment are studied. In each case, aircraft intercepted a distinct air mass several times during transport over the North Atlantic, providing a unique dataset that quantifies the net changes in composition from all processes. A new framework is presented to deconstruct the change in O3 mixing ratio (ΔO3) into its component processes, which were not measured directly, taking into account the uncertainty in measurements, initial air mass variability and its time evolution. The results show that the net chemical processing (ΔO3chem) over the whole simulation is greater than the net physical processing (ΔO3phys) in all cases. This is in part explained by cancellation effects associated with mixing. In contrast, each case is in a regime of either net photochemical destruction (lower tropospheric transport) or production (an upper tropospheric biomass burning case). However, physical processes influence O3 indirectly through the addition or removal of precursor gases, so that changes to physical parameters in a model can have a larger effect on ΔO3chem than on ΔO3phys. Despite its smaller magnitude, the physical processing distinguishes the lower tropospheric export cases, since the net photochemical O3 change is −5 ppbv per day in all three cases. Processing is quantified using a Lagrangian photochemical model with a novel method for simulating mixing through an ensemble of trajectories and a background profile that evolves with them. The model is able to simulate the magnitude and variability of the observations (of O3, CO, NOy and some hydrocarbons) and is consistent with the time-averaged OH following air masses inferred from hydrocarbon measurements alone (by Arnold et al., 2007). Therefore, it is a useful new method to simulate air mass evolution and variability, and its sensitivity to process parameters.
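One plausible way to formalize the budget decomposition described above (the notation is assumed, not taken from the paper):

```latex
% Following an air mass from t0 to t1, the observed change splits into
% chemical and physical contributions,
\[
  \Delta \mathrm{O_3}
  = \underbrace{\int_{t_0}^{t_1} \bigl(P - L\bigr)\,dt}_{\Delta \mathrm{O_3^{chem}}}
  + \underbrace{\int_{t_0}^{t_1} \bigl(M - D\bigr)\,dt}_{\Delta \mathrm{O_3^{phys}}}
\]
% where P and L are photochemical production and loss rates, M is the rate
% of change from mixing with the background, and D from deposition.
```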
Abstract:
The agility of an inter-organizational process represents the ability of a virtual enterprise to respond rapidly to a changing market environment. Many theories and methodologies concerning inter-organizational processes have been developed, but dynamic agility has seldom been addressed. A virtual enterprise whose process has high dynamic agility will be able to adjust to a changing environment in a short time and at low cost. This paper analyzes the agility of inter-organizational processes from a dynamic perspective. Two indexes are proposed to evaluate dynamic agility: time and cost. Furthermore, a method to measure dynamic agility using simulation is studied. Finally, a case study is given to illustrate the method.
Abstract:
A novel version of the classical surface pressure tendency equation (PTE) is applied to ERA-Interim reanalysis data to quantitatively assess the contribution of diabatic processes to the deepening of extratropical cyclones relative to the effects of temperature advection and vertical motions. The five cyclone cases selected, Lothar and Martin in December 1999, Kyrill in January 2007, Klaus in January 2009, and Xynthia in February 2010, all showed explosive deepening and brought considerable damage to parts of Europe. For Xynthia, Klaus and Lothar, diabatic processes contribute more to the observed surface pressure fall than horizontal temperature advection during their respective explosive deepening phases, while Kyrill and Martin appear to be more baroclinically driven storms. The powerful new diagnostic tool presented here can easily be applied to large numbers of cyclones and will help to better understand the role of diabatic processes in future changes in extratropical storminess.
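For orientation, a standard hydrostatic starting point for a surface pressure tendency equation is sketched below; the paper's exact formulation is not reproduced here.

```latex
% With surface pressure given hydrostatically by the weight of the
% overlying column, p_s = p_{top} + g \int \rho\,dz, its tendency is
\[
  \frac{\partial p_s}{\partial t}
  = \frac{\partial p_{top}}{\partial t}
  + g \int_{z_s}^{z_{top}} \frac{\partial \rho}{\partial t}\, dz ,
  \qquad \rho = \frac{p}{R_d T_v},
\]
% so column warming from diabatic heating or warm-air advection reduces
% density and hence surface pressure; a PTE partitions the density
% tendency into such contributions together with vertical motions.
```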
Abstract:
For a Lévy process ξ = (ξt)t≥0 drifting to −∞, we define the so-called exponential functional as Iξ = ∫0^∞ e^{ξt} dt. Under mild conditions on ξ, we show that the factorization of exponential functionals Iξ =(d) IH− × IY holds, where =(d) denotes equality in distribution, × stands for the product of independent random variables, H− is the descending ladder height process of ξ and Y is a spectrally positive Lévy process with a negative mean constructed from its ascending ladder height process. As a by-product, we generate an integral or power series representation for the law of Iξ for a large class of Lévy processes with two-sided jumps and also derive some new distributional properties. The proof of our main result relies on a fine Markovian study of a class of generalized Ornstein–Uhlenbeck processes, which is itself of independent interest. We use and refine an alternative approach of studying the stationary measure of a Markov process which avoids some technicalities and difficulties that appear in the classical method of employing the generator of the dual Markov process.
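A concrete classical instance (a well-known special case, not a result of this paper): for ξt = 2(Bt − μt) with μ > 0, a Lévy process drifting to −∞, Dufresne's identity gives Iξ =(d) 1/(2Γμ) with Γμ a standard gamma variable of shape μ. The quick Monte Carlo below checks this; the step size, horizon and sample count are arbitrary choices.

```python
# Monte Carlo check of Dufresne's identity for the exponential functional
# I = \int_0^\infty exp(2(B_t - mu*t)) dt, which should match 1/(2*Gamma(mu))
# in distribution. Both means should be near 1/(2*(mu-1)) = 1.0 for mu = 1.5.
import numpy as np

rng = np.random.default_rng(2)
mu, dt, T, n_paths = 1.5, 0.002, 20.0, 1000
n_steps = int(T / dt)

dB = rng.standard_normal((n_paths, n_steps)) * np.sqrt(dt)
xi = 2.0 * (np.cumsum(dB, axis=1) - mu * dt * np.arange(1, n_steps + 1))
I = np.exp(xi).sum(axis=1) * dt               # Riemann sum for the integral

ref = 1.0 / (2.0 * rng.gamma(mu, 1.0, n_paths))  # Dufresne reference law
print("simulated mean:", I.mean(), " reference mean:", ref.mean())
```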
Abstract:
We prove Chung-type laws of the iterated logarithm for general Lévy processes at zero. In particular, we provide tools to translate small deviation estimates directly into laws of the iterated logarithm. This reveals laws of the iterated logarithm for Lévy processes at small times in many concrete examples. In some cases, exotic norming functions are derived.
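The classical prototype of such a result is Chung's law of the iterated logarithm for Brownian motion, stated here at zero for orientation (a standard fact, not a result of this paper):

```latex
\[
  \liminf_{t \to 0}\;
  \sqrt{\frac{\log\log(1/t)}{t}}\,
  \sup_{0 \le s \le t} |B_s|
  \;=\; \frac{\pi}{\sqrt{8}} \qquad \text{a.s.}
\]
% The paper identifies analogous norming functions for general Levy
% processes at small times, including cases with exotic norming functions.
```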
Abstract:
It is widely accepted that some of the most accurate Value-at-Risk (VaR) estimates are based on an appropriately specified GARCH process. But when the forecast horizon is greater than the frequency of the GARCH model, such predictions have typically required time-consuming simulations of the aggregated returns distributions. This paper shows that fast, quasi-analytic GARCH VaR calculations can be based on new formulae for the first four moments of aggregated GARCH returns. Our extensive empirical study compares the Cornish–Fisher expansion with the Johnson SU distribution for fitting distributions to the analytic moments of normal and Student t, symmetric and asymmetric (GJR) GARCH processes fitted to returns data on different financial assets, for the purpose of deriving accurate GARCH VaR forecasts over multiple horizons and significance levels.
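The quasi-analytic step can be illustrated in a few lines: given the first four moments of the aggregated returns, a Cornish–Fisher quantile yields VaR directly. The moment values below are placeholders; the paper derives them analytically from the fitted GARCH process.

```python
# Minimal sketch: VaR from the first four moments via the Cornish-Fisher
# expansion (standard third-order form; inputs are illustrative).
from scipy.stats import norm

def cornish_fisher_var(mean, var, skew, exkurt, alpha=0.01):
    """VaR at level alpha from mean, variance, skewness, excess kurtosis."""
    z = norm.ppf(alpha)                      # Gaussian quantile, left tail
    z_cf = (z
            + (z**2 - 1) * skew / 6
            + (z**3 - 3*z) * exkurt / 24
            - (2*z**3 - 5*z) * skew**2 / 36) # Cornish-Fisher adjustment
    return -(mean + var**0.5 * z_cf)         # loss quantile, reported > 0

# Assumed moments for a 10-day aggregated return (not from the paper):
print(cornish_fisher_var(mean=0.001, var=0.004, skew=-0.6, exkurt=1.8))
```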