880 results for Forecasting and replenishment (CPFR)


Relevance:

30.00%

Publisher:

Abstract:

Prediction of the Sun's magnetic activity is important because of its effect on the space environment and climate. However, recent efforts to predict the amplitude of the solar cycle have resulted in diverging forecasts with no consensus. Yeates et al. have shown that the dynamical memory of the solar dynamo mechanism governs predictability, and that this memory differs between advection- and diffusion-dominated solar convection zones. Using stochastically forced, kinematic dynamo simulations, we demonstrate that including downward turbulent pumping of magnetic flux reduces the memory of both advection- and diffusion-dominated solar dynamos to only one cycle; stronger pumping degrades this memory further. Our results thus reconcile the diverging dynamo-model-based forecasts for the amplitude of solar cycle 24. We conclude that reliable predictions of the maximum of solar activity can be made only at the preceding minimum, allowing about five years of advance planning for space weather. For more accurate predictions, sequential data assimilation would be necessary in forecasting models to account for the Sun's short memory.


Research has been undertaken to ascertain the predictability of non-stationary time series using wavelet- and Empirical Mode Decomposition (EMD)-based time series models. Methods have long existed to decompose a time series into components; forecasting these components and combining them with the random component can yield predictions. Following this approach, wavelet and EMD analyses are incorporated separately, each decomposing a time series into independent orthogonal components with both time and frequency localization. The component series are fitted with specific auto-regressive models to obtain forecasts, which are then combined to obtain the actual predictions. Four non-stationary streamflow sites (USGS data resources) with monthly total volumes and two non-stationary gridded rainfall sites (IMD) with monthly total rainfall are considered for the study. Predictability is checked for six- and twelve-month-ahead forecasts under both methodologies. Based on performance measures, the wavelet-based method shows better prediction capability than the EMD-based method, despite some limitations of time series methods and of the manner in which the decomposition takes place. The study concludes that the wavelet-based time series algorithm can be used to model events such as droughts with reasonable accuracy. Modifications that could extend the model's applicability to other areas of hydrology are also discussed. (C) 2013 Elsevier B.V. All rights reserved.
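The decompose-forecast-recombine scheme described in this abstract can be sketched in a few lines. The sketch below is an illustrative stand-in, not the study's code: a single-level additive (Haar-style) split plays the role of the full wavelet decomposition, a plain least-squares AR fit plays the role of the component-specific auto-regressive models, and all numerical choices (lag depth, horizon) are arbitrary.

```python
import numpy as np

def ar_forecast(y, lags, horizon):
    """Least-squares AR(lags) fit, then recursive multi-step forecasting."""
    y = np.asarray(y, dtype=float)
    target = y[lags:]
    # Column i holds lag-(i+1) values, so row t is [y[t-1], ..., y[t-lags]]
    X = np.column_stack([y[lags - i: len(y) - i] for i in range(1, lags + 1)])
    X = np.column_stack([np.ones(len(target)), X])
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    hist = list(y)
    out = []
    for _ in range(horizon):
        row = np.concatenate(([1.0], hist[-1:-lags - 1:-1]))
        out.append(float(row @ beta))
        hist.append(out[-1])          # feed forecasts back in for multi-step
    return np.array(out)

def decomposed_forecast(x, horizon=6, lags=12):
    """Split the series into smooth + detail parts (a Haar-style additive
    decomposition), forecast each with its own AR model, and sum the
    component forecasts — the recombination idea described in the text."""
    x = np.asarray(x, dtype=float)
    smooth = np.empty_like(x)
    smooth[0] = x[0]
    smooth[1:] = 0.5 * (x[1:] + x[:-1])   # two-point moving average
    detail = x - smooth                   # exact additive complement
    return ar_forecast(smooth, lags, horizon) + ar_forecast(detail, lags, horizon)
```

With a monthly lag depth (`lags=12`) this recovers seasonal structure well on a synthetic periodic series; a real application would substitute a proper multi-level wavelet transform for the two-line split used here.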


Streamflow forecasts at the daily time scale are necessary for effective management of water resources systems. Typical applications include flood control, water quality management, water supply to multiple stakeholders, hydropower, and irrigation systems. Conventionally, physically based conceptual models and data-driven models are used for forecasting streamflows. Conceptual models require a detailed understanding of the physical processes governing the system being modeled; major constraints in developing effective conceptual models are sparse hydrometric gauge networks and short historical records that limit this understanding. Data-driven models, on the other hand, rely solely on previous hydrological and meteorological data without directly accounting for the underlying physical processes. Among data-driven models, Auto-Regressive Integrated Moving Average (ARIMA) models and Artificial Neural Networks (ANNs) are the most widely used techniques. The present study assesses the performance of ARIMA and ANN methods for one- to seven-day-ahead forecasts of daily streamflows at the Basantpur stream-gauge site, situated upstream of Hirakud Dam in the Mahanadi river basin, India. The ANNs considered are the Feed-Forward back-propagation Neural Network (FFNN) and the Radial Basis Neural Network (RBNN). Daily streamflow forecasts at the Basantpur site find use in managing water from the Hirakud reservoir. (C) 2015 The Authors. Published by Elsevier B.V.
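As a rough illustration of the FFNN variant mentioned above (our construction, not the study's implementation), the sketch below builds lagged input-target pairs from a daily flow series and trains a one-hidden-layer network with batch gradient descent. The lag depth, hidden-layer size, and learning rate are arbitrary choices for the example.

```python
import numpy as np

def make_lagged(flow, lags=7, lead=1):
    """Pairs (X, y): predict flow `lead` days ahead from the previous `lags` days."""
    X = np.column_stack([flow[i: len(flow) - lags - lead + 1 + i] for i in range(lags)])
    y = flow[lags + lead - 1:]
    return X, y

class FFNN:
    """One-hidden-layer feed-forward network trained by back-propagation."""
    def __init__(self, n_in, n_hidden=8, lr=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, (n_hidden, 1))
        self.b2 = 0.0
        self.lr = lr

    def forward(self, X):
        self.h = np.tanh(X @ self.W1 + self.b1)   # cache hidden activations
        return (self.h @ self.W2).ravel() + self.b2

    def train_step(self, X, y):
        pred = self.forward(X)
        err = pred - y                            # gradient of 0.5*MSE w.r.t. pred
        n = len(y)
        dW2 = self.h.T @ err[:, None] / n
        db2 = err.mean()
        dh = (err[:, None] @ self.W2.T) * (1.0 - self.h ** 2)  # tanh' = 1 - tanh^2
        dW1 = X.T @ dh / n
        db1 = dh.mean(axis=0)
        self.W1 -= self.lr * dW1
        self.b1 -= self.lr * db1
        self.W2 -= self.lr * dW2
        self.b2 -= self.lr * db2
        return float((err ** 2).mean())
```

For multi-day-ahead forecasts (the study's one- to seven-day range), one would either set `lead` to the desired horizon (direct forecasting) or feed one-step predictions back in recursively.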


This study examines differences in surface black carbon (BC) aerosol loading between the Bay of Bengal (BoB) and the Arabian Sea (AS) and identifies the dominant sources of BC in South Asia and surrounding regions during the March-May 2006 Integrated Campaign for Aerosols, Gases and Radiation Budget (ICARB). A total of 13 BC tracers are introduced into the Weather Research and Forecasting Model coupled with Chemistry (WRF-Chem) to address these objectives. The model reproduced the temporal and spatial variability of the BC distribution observed over the AS and the BoB during the ICARB ship cruise and captured spatial variability at the inland sites. In general, the model underestimates the observed BC mass concentrations; however, the model-observation discrepancy in this study is smaller than in previous studies. Model results show that the ICARB measurements were fairly representative of the AS and the BoB during the pre-monsoon season. Elevated BC mass concentrations in the BoB are due to a five-times-stronger influence of anthropogenic emissions on the BoB compared to the AS. Biomass burning in Burma also affects the BoB much more strongly than the AS. Anthropogenic and biomass burning emissions accounted for 60% and 37%, respectively, of the average ± standard deviation (representing spatial and temporal variability) BC mass concentration (1341 ± 2353 ng m^-3) in South Asia. BC emissions from the residential (61%) and industrial (23%) sectors are the major anthropogenic sources, except in the Himalayas, where vehicular emissions dominate. We find that regional-scale transport of anthropogenic emissions contributes up to 25% of BC mass concentrations in western and eastern India, suggesting that surface BC mass concentrations cannot be linked directly to local emissions in different regions of South Asia.
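The source-attribution bookkeeping behind the tagged-tracer approach described above is simple to illustrate (this is a toy, not WRF-Chem): if each emission source is carried as a separate passive tracer, the modelled BC concentration decomposes additively and each source's percentage contribution follows directly. The source names and concentrations below are made-up example values.

```python
def source_contributions(tracer_conc):
    """Percent contribution of each tagged source to the total BC concentration.

    tracer_conc: dict mapping source name -> modelled concentration (e.g. ng m^-3)
    from that source's dedicated tracer. Since the tracers are passive, they sum
    to the total, and shares are just each tracer's fraction of that sum."""
    total = sum(tracer_conc.values())
    return {src: 100.0 * c / total for src, c in tracer_conc.items()}
```

This additivity is what lets the study split the simulated BC burden into residential, industrial, vehicular, and biomass-burning shares at every grid point.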


Northeast India and its adjoining areas are characterized by very high seismic activity. According to the Indian seismic code, the region falls in seismic zone V, which represents the highest seismic-hazard level in the country. The region has experienced a number of great earthquakes, such as the Assam (1950) and Shillong (1897) earthquakes, that caused huge devastation across the entire northeast and adjacent areas through flooding, landslides, liquefaction, and damage to roads and buildings. In this study, an attempt has been made to find the probability of occurrence of a major earthquake (Mw > 6) in this region using an updated earthquake catalog compiled from different sources. The catalog was divided into six seismic regions based on tectonic features and seismogenic factors, and the probability of occurrence was estimated for each using three models: the lognormal, Weibull, and gamma distributions. We calculated the logarithm of the likelihood function (ln L) for all six regions, and for the entire northeast, under all three stochastic models; a higher value of ln L indicates a better-fitting model. The results show that different models suit different seismic zones, but the majority favor the lognormal distribution. The Weibull model shows the highest conditional probabilities among the three models for both small and large elapsed times T and time intervals t, whereas the lognormal model shows the lowest and the gamma model intermediate probabilities. Only for elapsed time T = 0 does the lognormal model show the highest conditional probabilities at smaller time intervals (t = 3-15 yr); the opposite holds at larger time intervals (t = 15-25 yr), where the Weibull model shows the highest probabilities. Based on this study, the Indo-Burma Range and the Eastern Himalaya show a high probability (>90%) of an occurrence in the 5 yr period 2012-2017.
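The model comparison described here — fit candidate recurrence-time distributions, rank them by ln L, and compute conditional probabilities — can be sketched as follows. This is an illustrative reconstruction of the general procedure, not the study's code; the synthetic inter-event times and the choice to fix the location parameter at zero are our assumptions.

```python
import numpy as np
from scipy import stats

def fit_and_compare(intervals):
    """Fit lognormal, Weibull, and gamma models to inter-event times and
    return {name: (frozen_distribution, ln_likelihood)}; higher ln L is a
    better-fitting model."""
    models = {}
    for name, dist in [("lognormal", stats.lognorm),
                       ("weibull", stats.weibull_min),
                       ("gamma", stats.gamma)]:
        params = dist.fit(intervals, floc=0)     # fix location at zero
        frozen = dist(*params)
        lnL = float(np.sum(frozen.logpdf(intervals)))
        models[name] = (frozen, lnL)
    return models

def conditional_probability(frozen, T, t):
    """P(event in (T, T + t] | no event during the elapsed time T)."""
    return float((frozen.cdf(T + t) - frozen.cdf(T)) / frozen.sf(T))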


Both earthquake prediction and failure prediction of disordered brittle media are difficult and complicated problems and they might have something in common. In order to search for clues for earthquake prediction, the common features of failure in a simple nonlinear dynamical model resembling disordered brittle media are examined. It is found that the failure manifests evolution-induced catastrophe (EIC), i.e., the abrupt transition from globally stable (GS) accumulation of damage to catastrophic failure. A distinct feature is the significant uncertainty of catastrophe, called sample-specificity. Consequently, it is impossible to make a deterministic prediction macroscopically. This is similar to the question of predictability of earthquakes. However, our model shows that strong stress fluctuations may be an immediate precursor of catastrophic failure statistically. This might provide clues for earthquake forecasting.


Rupture in the heterogeneous crust appears to be a catastrophe transition. Catastrophic rupture sensitively depends on the details of heterogeneity and stress transfer on multiple scales. These are difficult to identify and deal with. As a result, the threshold of earthquake-like rupture presents uncertainty. This may be the root of the difficulty of earthquake prediction. Based on a coupled pattern mapping model, we represent critical sensitivity and trans-scale fluctuations associated with catastrophic rupture. Critical sensitivity means that a system may become significantly sensitive near catastrophe transition. Trans-scale fluctuations mean that the level of stress fluctuations increases strongly and the spatial scale of stress and damage fluctuations evolves from the mesoscopic heterogeneity scale to the macroscopic scale as the catastrophe regime is approached. The underlying mechanism behind critical sensitivity and trans-scale fluctuations is the coupling effect between heterogeneity and dynamical nonlinearity. Such features may provide clues for prediction of catastrophic rupture, like material failure and great earthquakes. Critical sensitivity may be the physical mechanism underlying a promising earthquake forecasting method, the load-unload response ratio (LURR).
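The load-unload response ratio (LURR) mentioned at the end of this abstract has a simple operational form, sketched below under strong simplifying assumptions (our toy definition, not the method's full specification): "loading" is any step where stress increases, the response is the accompanying change in some damage proxy, and LURR is the ratio of response per unit load during loading versus unloading. Values well above 1 indicate that loading produces disproportionate damage, which the text links to the approach of catastrophic rupture.

```python
import numpy as np

def lurr(stress, response, eps=1e-12):
    """Ratio of mean response per unit stress change during loading vs. unloading."""
    d_stress = np.diff(stress)
    d_resp = np.diff(response)
    loading = d_stress > 0
    x_plus = np.abs(d_resp[loading]).sum() / (np.abs(d_stress[loading]).sum() + eps)
    x_minus = np.abs(d_resp[~loading]).sum() / (np.abs(d_stress[~loading]).sum() + eps)
    return x_plus / (x_minus + eps)
```

For a purely elastic (reversible) medium the ratio is 1; irreversible damage accumulated only on loading pushes it above 1.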


We introduce a conceptual model for the in-plane physics of an earthquake fault. The model employs cellular automaton techniques to simulate tectonic loading, earthquake rupture, and strain redistribution. The impact of a hypothetical crustal elastodynamic Green's function is approximated by a long-range strain redistribution law with an r^(-p) dependence. We investigate the influence of the effective elastodynamic interaction range upon the dynamical behaviour of the model by conducting experiments with different values of the exponent p. The results indicate that this model has two distinct, stable modes of behaviour. The first mode produces a characteristic earthquake distribution, with moderate to large events preceded by an interval of time in which the rate of energy release accelerates. A correlation function analysis reveals that accelerating sequences are associated with a systematic, global evolution of strain energy correlations within the system. The second stable mode produces Gutenberg-Richter (GR) statistics, with near-linear energy release and no significant global correlation evolution. A model with effectively short-range interactions preferentially displays Gutenberg-Richter behaviour, whereas models with long-range interactions appear to switch between the characteristic and GR modes. As the range of elastodynamic interactions is increased, characteristic behaviour begins to dominate GR behaviour. These models demonstrate that evolution of strain energy correlations may occur within systems with a fixed elastodynamic interaction range. If similar mode-switching behaviour occurs within earthquake faults, then intermediate-term forecasting of large earthquakes may be feasible for some earthquakes but not for others, in alignment with certain empirical seismological observations. Further numerical investigation of dynamical models of this type may lead to advances in earthquake forecasting research and theoretical seismology.
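A minimal toy version of a cellular-automaton fault with r^(-p) strain redistribution looks like the following. This is our illustrative sketch, not the authors' model: cells accumulate strain under uniform loading, fail above a fixed threshold, and redistribute a fraction of their strain to all other cells with weights falling off as r^(-p); every parameter value is an arbitrary choice for the example.

```python
import numpy as np

def run_fault_ca(n=64, p=2.0, steps=2000, alpha=0.9, seed=0):
    """1-D cellular-automaton fault: returns the rupture size (number of cell
    failures) for each loading step that produced an event."""
    rng = np.random.default_rng(seed)
    strain = rng.uniform(0.0, 1.0, n)          # random initial strain field
    thresh = np.ones(n)                        # uniform failure threshold
    # Precompute r^-p redistribution weights between every pair of cells
    idx = np.arange(n)
    r = np.abs(idx[:, None] - idx[None, :]).astype(float)
    np.fill_diagonal(r, np.inf)                # no self-transfer
    w = r ** -p
    w /= w.sum(axis=1, keepdims=True)
    events = []
    for _ in range(steps):
        strain += 0.01                         # uniform tectonic loading
        failed_total = 0
        while True:                            # cascade until no cell exceeds threshold
            failing = np.where(strain >= thresh)[0]
            if failing.size == 0:
                break
            for i in failing:
                released = alpha * strain[i]   # fraction alpha is redistributed
                strain += released * w[i]      # long-range r^-p redistribution
                strain[i] = 0.0                # failed cell relaxes to zero
            failed_total += failing.size
        if failed_total:
            events.append(failed_total)
    return events
```

Because a fraction (1 - alpha) of each failing cell's strain leaves the system, every cascade terminates; varying `p` changes the effective interaction range, which is the knob the abstract's experiments turn.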


Also published as: Documento de Trabajo, Banco de España, 0504/2005.


The main objective of this paper is to analyse the value of the information contained in prices of options on the IBEX 35 index at the Spanish Stock Exchange. The forward-looking information is extracted using implied risk-neutral density functions estimated from a mixture of two lognormals and three alternative risk adjustments: the classic power and exponential utility functions, and a habit-based specification that allows for counter-cyclical variation of risk aversion. Our results show that, for the period between October 1996 and March 2000, we can reject the hypothesis that the risk-neutral densities provide accurate predictions of the distributions of future realisations of the IBEX 35 index at a four-week horizon. When forecasting with risk-adjusted densities, performance over this period is statistically improved and we no longer reject that hypothesis. All risk-adjusted densities generate similar forecasting statistics, so, at least at a four-week horizon, the particular risk adjustment does not seem to be the issue. By contrast, at the one-week horizon, risk-adjusted densities do not improve the forecasting ability of their risk-neutral counterparts.
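The density family at the heart of this approach — a mixture of two lognormals — is easy to write down explicitly. The sketch below is illustrative only: the parameter values in the usage are made up, not estimates from IBEX 35 option prices, and in practice the five parameters would be calibrated to observed option quotes.

```python
import numpy as np

def lognormal_pdf(x, mu, sigma):
    """Density of a lognormal with log-mean mu and log-st.dev. sigma (x > 0)."""
    return np.exp(-(np.log(x) - mu) ** 2 / (2.0 * sigma ** 2)) \
        / (x * sigma * np.sqrt(2.0 * np.pi))

def mixture_pdf(x, w, mu1, sig1, mu2, sig2):
    """Implied risk-neutral density modelled as a two-lognormal mixture:
    w * LN(mu1, sig1) + (1 - w) * LN(mu2, sig2)."""
    return w * lognormal_pdf(x, mu1, sig1) + (1.0 - w) * lognormal_pdf(x, mu2, sig2)
```

The mixture mean has the closed form w·exp(mu1 + sig1²/2) + (1 - w)·exp(mu2 + sig2²/2), which is a convenient check on any calibrated fit.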


Over the past four decades, the state of Hawaii has developed a system of eleven Marine Life Conservation Districts (MLCDs) to conserve and replenish marine resources around the state. Initially established to provide opportunities for public interaction with the marine environment, these MLCDs vary in size, habitat quality, and management regime, providing an excellent opportunity to test hypotheses concerning marine protected area (MPA) design and function using multiple discrete sampling units. The NOAA/NOS/NCCOS Center for Coastal Monitoring and Assessment's Biogeography Team developed digital benthic habitat maps for all MLCDs and adjacent habitats. These maps were used to evaluate the efficacy of existing MLCDs for biodiversity conservation and fisheries replenishment, using a spatially explicit, stratified random sampling design. Coupling the distribution of habitats with species habitat affinities using GIS technology elucidates species habitat-utilization patterns at scales commensurate with ecosystem processes and is useful in defining essential fish habitat and biologically relevant boundaries for MPAs. Analysis of benthic cover validated the a priori classification of habitat types and justified using these habitat strata for stratified random sampling and analyses of fish habitat-utilization patterns. Results showed that the abundance and distribution of species and assemblages exhibited strong correlations with habitat types. Fish assemblages in the colonized and uncolonized hardbottom habitats were the most similar among all habitat types. Much of the macroalgae habitat sampled was macroalgae growing on hard substrate and, as a result, showed similarities with the other hardbottom assemblages. Fish assemblages in the sand habitats were highly variable but distinct from those of the other habitat types. Management regime also played an important role in the abundance and distribution of fish assemblages.
MLCDs had higher values for most fish assemblage characteristics (e.g. biomass, size, diversity) than adjacent fished areas and Fisheries Management Areas (FMAs) across all habitat types. In addition, apex predators and other targeted resource species were more abundant and larger in the MLCDs, illustrating the effectiveness of these closures in conserving fish populations. Habitat complexity, quality, size, and level of protection from fishing were important determinants of MLCD effectiveness with respect to their associated fish assemblages. (PDF contains 217 pages)
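The allocation step of a stratified random sampling design like the one described above can be sketched as follows. This is our illustrative construction, not the Biogeography Team's protocol: survey sites are allocated to habitat strata in proportion to mapped stratum area, with a minimum per stratum so rare habitats are still sampled; the stratum names and areas in the usage are hypothetical.

```python
import numpy as np

def allocate_sites(strata_areas, n_sites, min_per_stratum=1):
    """Proportional allocation of survey sites to habitat strata.

    strata_areas: dict mapping stratum name -> mapped area (any consistent unit).
    Returns a dict mapping stratum name -> number of random sites to place in it."""
    names = list(strata_areas)
    areas = np.array([strata_areas[k] for k in names], dtype=float)
    alloc = np.maximum(min_per_stratum,
                       np.round(n_sites * areas / areas.sum()).astype(int))
    return dict(zip(names, alloc.tolist()))
```

Sites are then placed at random coordinates within each stratum's mapped polygons, which is what makes the design spatially explicit.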