97 results for Event Procedure
in CentAUR: Central Archive University of Reading - UK
Abstract:
In this paper we introduce a new testing procedure for evaluating the rationality of fixed-event forecasts based on a pseudo-maximum likelihood estimator. The procedure is designed to be robust to departures from the normality assumption. A model is introduced to show that such departures are likely when forecasters experience a credibility loss when they make large changes to their forecasts. The test is illustrated using monthly fixed-event forecasts produced by four UK institutions. Use of the robust test leads to the conclusion that certain forecasts are rational, while use of the Gaussian-based test implies that certain forecasts are irrational. The difference in the results is due to the nature of the underlying data. Copyright © 2001 John Wiley & Sons, Ltd.
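As a hedged illustration of the kind of rationality check involved (not the authors' pseudo-maximum-likelihood statistic), the sketch below regresses successive fixed-event forecast revisions on their lag and uses heteroskedasticity-robust (White/HC0) standard errors as a stand-in for robustness to non-normal revisions; the data, sample size, and covariance choice are assumptions for illustration only.

```python
# Illustrative sketch only -- not the paper's pseudo-maximum-likelihood test.
# Weak-efficiency check: under rationality, successive forecast revisions are
# uncorrelated, so the slope of the current revision on the previous one is zero.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical monthly fixed-event forecasts for a single target (e.g. annual GDP growth)
forecasts = np.cumsum(rng.standard_t(df=4, size=24) * 0.1) + 2.0
revisions = np.diff(forecasts)             # r_t = f_t - f_{t-1}

y, x = revisions[1:], revisions[:-1]
X = np.column_stack([np.ones_like(x), x])  # intercept + lagged revision

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

# Heteroskedasticity-robust (HC0) covariance of the OLS estimates
XtX_inv = np.linalg.inv(X.T @ X)
cov = XtX_inv @ (X.T * resid**2) @ X @ XtX_inv
t_slope = beta[1] / np.sqrt(cov[1, 1])
p_value = 2 * stats.t.sf(abs(t_slope), df=len(y) - 2)
print(f"slope={beta[1]:.3f}, t={t_slope:.2f}, p={p_value:.3f}")
```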
Abstract:
The stratospheric sudden warming in the Southern Hemisphere (SH) in September 2002 was unexpected for two reasons. First, planetary wave activity in the Southern Hemisphere is very weak, and midwinter warmings have never been observed, at least not since observations of the upper stratosphere became regularly available. Second, the warming occurred in a west phase of the quasi-biennial oscillation (QBO) in the lower stratosphere. This is unexpected because warmings are usually considered to be more likely in the east phase of the QBO, when a zero wind line is present in the winter subtropics and hence confines planetary wave propagation to higher latitudes closer to the polar vortex. At first sight, this evidence suggests that the sudden warming must simply have been a result of anomalously strong planetary wave forcing from the troposphere. However, recent model studies have suggested that the midwinter polar vortex may also be sensitive to the equatorial winds in the upper stratosphere, the region dominated by the semiannual oscillation. In this paper, the time series of equatorial zonal winds from two different data sources, the 40-yr ECMWF Re-Analysis (ERA-40) and the Met Office assimilated dataset, are reviewed. Both suggest that the equatorial winds in the upper stratosphere above 10 hPa were anomalously easterly in 2002. Idealized model experiments are described in which the modeled equatorial winds were relaxed toward these observations for various years to examine whether the anomalous easterlies in 2002 could influence the timing of a warming event. It is found that the 2002 equatorial winds speed up the evolution of a warming event in the model. Therefore, this study suggests that the anomalous easterlies in the 1–10-hPa region may have been a contributory factor in the development of the observed SH warming. However, it is concluded that it is unlikely that the anomalous equatorial winds alone can explain the 2002 warming event.
Abstract:
The performance of a 2D numerical model of flood hydraulics is tested for a major event in Carlisle, UK, in 2005. This event is associated with a unique data set, with GPS-surveyed wrack lines and flood extent surveyed 3 weeks after the flood. The Simple Finite Volume (SFV) model is used to solve the 2D Saint-Venant equations over an unstructured mesh of 30,000 elements representing channel and floodplain, and allowing detailed hydraulics of flow around bridge piers and other influential features to be represented. The SFV model is also used to corroborate flows recorded for the event at two gauging stations. Calibration of Manning's n is performed with a two-stage strategy, with channel values determined by calibration of the gauging station models, and floodplain values determined by optimising the fit between model results and observed water levels and flood extent for the 2005 event. RMS error for the calibrated model compared with surveyed water levels is ~±0.4 m, the same order of magnitude as the estimated error in the survey data. The study demonstrates the ability of unstructured mesh hydraulic models to represent important hydraulic processes across a range of scales, with potential applications to flood risk management.
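A minimal sketch of the two calibration diagnostics implied above, with made-up numbers rather than the Carlisle data: RMS error against surveyed water levels, and a binary fit score between modelled and observed flood extent. The F-style extent score is a common choice in flood-model calibration and is assumed here, not taken from the paper.

```python
import numpy as np

# Hypothetical wrack-line water levels and corresponding model output (m)
surveyed = np.array([12.41, 12.88, 13.05, 12.67, 12.95])
modelled = np.array([12.10, 13.20, 12.80, 12.55, 13.30])
rmse = np.sqrt(np.mean((modelled - surveyed) ** 2))

# Flood-extent fit on a rasterised domain: F = wet-in-both / wet-in-either
obs_wet = np.array([[0, 1, 1], [1, 1, 0], [0, 1, 0]], dtype=bool)
mod_wet = np.array([[0, 1, 1], [1, 0, 0], [1, 1, 0]], dtype=bool)
fit = (obs_wet & mod_wet).sum() / (obs_wet | mod_wet).sum()

print(f"RMSE = {rmse:.2f} m, extent fit F = {fit:.2f}")
```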
Abstract:
Our understanding of the ancient ocean-atmosphere system has focused on oceanic proxies. However, the study of terrestrial proxies is equally necessary to constrain our understanding of ancient climates and linkages between the terrestrial and oceanic carbon reservoirs. We have analyzed carbon-isotope ratios from fossil plant material through the Valanginian and Lower Hauterivian from a shallow-marine, ammonite-constrained succession in the Crimean Peninsula of southern Ukraine in order to determine if the Upper Valanginian positive carbon-isotope excursion is expressed in the atmosphere. δ13C(plant) values fluctuate around -23‰ to -22‰ for the Valanginian-Hauterivian, except during the Upper Valanginian, where δ13C(plant) values record a positive excursion to approximately -18‰. Based upon ammonite biostratigraphy from Crimea, and in conjunction with a composite Tethyan marine δ13C(carb) curve, several conclusions can be drawn: (1) the δ13C(plant) record indicates that the atmospheric carbon reservoir was affected; (2) the defined ammonite correlations between Europe and Crimea are synchronous; and (3) a change in photosynthetic carbon-isotope fractionation, caused by a decrease in atmospheric pCO2, occurred during the Upper Valanginian positive δ13C excursion. Our new data, combined with other paleoenvironmental and paleoclimatic information, indicate that the Upper Valanginian was a cool period (icehouse) and highlight that the Cretaceous period was interrupted by periods of cooling and was not an equable climate as previously thought. © 2005 Elsevier B.V. All rights reserved.
Abstract:
The problem of modeling solar energetic particle (SEP) events is important to both space weather research and forecasting, and yet it has seen relatively little progress. Most important SEP events are associated with coronal mass ejections (CMEs) that drive coronal and interplanetary shocks. These shocks can continuously produce accelerated particles from the ambient medium to well beyond 1 AU. This paper describes an effort to model real SEP events using a Center for Integrated Space Weather Modeling (CISM) MHD solar wind simulation including a cone model of CMEs to initiate the related shocks. In addition to providing observation-inspired shock geometry and characteristics, this MHD simulation describes the time-dependent observer field line connections to the shock source. As a first approximation, we assume a shock jump-parameterized source strength and spectrum, and that scatter-free transport occurs outside of the shock source, thus emphasizing the role the shock evolution plays in determining the modeled SEP event profile. Three halo CME events on May 12, 1997, November 4, 1997, and December 13, 2006 are used to test the modeling approach. While challenges arise in the identification and characterization of the shocks in the MHD model results, this approach illustrates the importance to SEP event modeling of globally simulating the underlying heliospheric event. The results also suggest the potential utility of such a model for forecasting and for interpretation of separated multipoint measurements such as those expected from the STEREO mission.
Abstract:
One of the primary goals of the Center for Integrated Space Weather Modeling (CISM) effort is to assess and improve prediction of the solar wind conditions in near-Earth space, arising from both quasi-steady and transient structures. We compare 8 years of L1 in situ observations to predictions of the solar wind speed made by the Wang-Sheeley-Arge (WSA) empirical model. The mean-square error (MSE) between the observed and model predictions is used to reach a number of useful conclusions: there is no systematic lag in the WSA predictions, the MSE is found to be highest at solar minimum and lowest during the rise to solar maximum, and the optimal lead time for 1 AU solar wind speed predictions is found to be 3 days. However, MSE is shown to frequently be an inadequate “figure of merit” for assessing solar wind speed predictions. A complementary, event-based analysis technique is developed in which high-speed enhancements (HSEs) are systematically selected and associated from observed and model time series. The WSA model is validated using comparisons of the number of hit, missed, and false HSEs, along with the timing and speed magnitude errors between the forecasted and observed events. Morphological differences between the different HSE populations are investigated to aid interpretation of the results and improvements to the model. Finally, by defining discrete events in the time series, model predictions from above and below the ecliptic plane can be used to estimate an uncertainty in the predicted HSE arrival times.
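The event-based verification idea can be illustrated with a short sketch; the event lists, association window, and matching rule below are hypothetical stand-ins, not the paper's HSE selection algorithm.

```python
# Forecast high-speed enhancements (HSEs) are matched to observed ones within a
# time window, then counted as hits, misses, or false alarms, and timing errors
# are accumulated for the hits.
import numpy as np

obs_hse = np.array([12.0, 55.5, 130.2, 201.7])          # observed HSE onset times (days)
fc_hse = np.array([13.1, 56.0, 150.0, 200.5, 220.3])    # forecast HSE onset times (days)
window = 2.0                                            # association window (days)

hits, timing_errors, matched_fc = 0, [], set()
for t_obs in obs_hse:
    dt = np.abs(fc_hse - t_obs)
    j = int(np.argmin(dt))
    if dt[j] <= window and j not in matched_fc:
        hits += 1
        matched_fc.add(j)
        timing_errors.append(fc_hse[j] - t_obs)

misses = len(obs_hse) - hits
false_alarms = len(fc_hse) - hits
print(f"hits={hits}, misses={misses}, false alarms={false_alarms}")
print(f"mean timing error = {np.mean(timing_errors):.2f} days")
```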
Abstract:
Despite the success of studies attempting to integrate remotely sensed data and flood modelling, and the need both to provide near-real-time data routinely on a global scale and to set up online data archives, there is to date a lack of spatially and temporally distributed hydraulic parameters to support ongoing modelling efforts. The objective of this project is therefore to provide a global evaluation and benchmark data set of floodplain water stages, with uncertainties, and their assimilation in a large-scale flood model using space-borne radar imagery. An algorithm is developed for the automated retrieval of water stages with uncertainties from a sequence of radar imagery, and the data are assimilated in a flood model using the Tewkesbury 2007 flood event as a feasibility study. The retrieval method that we employ is based on possibility theory, an extension of fuzzy set theory that encompasses probability theory. We first attempt to identify the main sources of uncertainty in the retrieval of water stages from radar imagery, for which we define physically meaningful ranges of parameter values. Possibilities of values are then computed for each parameter using a triangular 'membership' function. This procedure allows the computation of possible values of water stages at maximum flood extents along a river at many different locations. At a later stage in the project these data are used in the assimilation, calibration or validation of a flood model. The application is subsequently extended to a global scale using wide-swath radar imagery and a simple global flood forecasting model, thereby providing improved river discharge estimates to update the latter.
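A minimal sketch of a triangular possibility ('membership') function of the kind described, applied to one hypothetical uncertain parameter; the physically meaningful range and mode below are assumptions, and the full retrieval involves more than this single step.

```python
import numpy as np

def triangular_possibility(x, lo, mode, hi):
    """Possibility of value x given a physically meaningful range [lo, hi]
    and a most-plausible value `mode` (1 at the mode, 0 outside the range)."""
    x = np.asarray(x, dtype=float)
    left = (x - lo) / (mode - lo)
    right = (hi - x) / (hi - mode)
    return np.clip(np.minimum(left, right), 0.0, 1.0)

# Hypothetical range of water stages (m) consistent with a radar-derived shoreline
stages = np.linspace(10.0, 12.0, 9)
poss = triangular_possibility(stages, lo=10.2, mode=11.0, hi=11.8)
for s, p in zip(stages, poss):
    print(f"stage {s:5.2f} m -> possibility {p:.2f}")
```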
Abstract:
We use proper orthogonal decomposition (POD) to study a transient teleconnection event at the onset of the 2001 planet-encircling dust storm on Mars, in terms of empirical orthogonal functions (EOFs). There are several differences between this and previous studies of atmospheric events using EOFs. First, instead of using a single variable such as surface pressure or geopotential height on a given pressure surface, we use a dataset describing the evolution in time of global and fully three-dimensional atmospheric fields such as horizontal velocity and temperature. These fields are produced by assimilating Thermal Emission Spectrometer observations from NASA's Mars Global Surveyor spacecraft into a Mars general circulation model. We use total atmospheric energy (TE) as a physically meaningful quantity which weights the state variables. Second, instead of adopting the EOFs to define teleconnection patterns as planetary-scale correlations that explain a large portion of long time-scale variability, we use EOFs to understand transient processes due to localised heating perturbations that have implications for the atmospheric circulation over distant regions. The localised perturbation is given by anomalous heating due to the enhanced presence of dust around the northern edge of the Hellas Planitia basin on Mars. We show that the localised disturbance is seemingly restricted to a small number (a few tens) of EOFs. These can be classified as low-order, transitional, or high-order EOFs according to the TE amount they explain throughout the event. Despite the global character of the EOFs, they show the capability of accounting for the localised effects of the perturbation via the presence of specific centres of action. We finally discuss possible applications for the study of terrestrial phenomena with similar characteristics.
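For readers unfamiliar with POD/EOF analysis, the sketch below shows the generic energy-weighted decomposition via a singular value decomposition on synthetic snapshots; the weights and fields are placeholders, not the assimilated Mars Global Surveyor fields or the total-energy norm used in the study.

```python
# Snapshots of a state vector are weighted so the inner product approximates an
# energy norm, then decomposed with an SVD; each EOF's 'explained energy'
# fraction comes from the squared singular values.
import numpy as np

rng = np.random.default_rng(1)
n_state, n_time = 200, 60
snapshots = rng.standard_normal((n_state, n_time))          # columns = times
snapshots += np.outer(np.sin(np.linspace(0, 3, n_state)),   # add a dominant mode
                      np.cos(np.linspace(0, 12, n_time))) * 5

weights = np.full(n_state, 0.7)             # stand-in for energy weights
anoms = snapshots - snapshots.mean(axis=1, keepdims=True)
weighted = np.sqrt(weights)[:, None] * anoms

U, s, Vt = np.linalg.svd(weighted, full_matrices=False)
eofs = U / np.sqrt(weights)[:, None]        # EOFs back in physical units
explained = s**2 / np.sum(s**2)
print("energy fraction of first 5 EOFs:", np.round(explained[:5], 3))
```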
Abstract:
We perform a numerical study of the evolution of a coronal mass ejection (CME) and its interaction with the coronal magnetic field, based on the 12 May 1997 CME event, using a global magnetohydrodynamic (MHD) model for the solar corona. The ambient solar wind steady-state solution is driven by photospheric magnetic field data, while the solar eruption is obtained by superimposing an unstable flux rope onto the steady-state solution. During the initial stage of CME expansion, the core flux rope reconnects with the neighboring field, which facilitates lateral expansion of the CME footprint in the low corona. The flux rope field also reconnects with the oppositely orientated overlying magnetic field in the manner of the breakout model. During this stage of the eruption, the simulated CME rotates counter-clockwise to achieve an orientation that is in agreement with the interplanetary flux rope observed at 1 AU. A significant component of the CME that expands into interplanetary space comprises one of the side lobes created mainly as a result of reconnection with the overlying field. Within 3 hours, reconnection effectively modifies the CME connectivity from the initial condition, where both footpoints are rooted in the active region, to a situation where one footpoint is displaced into the quiet Sun, at a significant distance (≈1 R⊙) from the original source region. The expansion and rotation due to interaction with the overlying magnetic field stop when the CME reaches the outer edge of the helmet streamer belt, where the field is organized on a global scale. The simulation thus offers a new view of the role reconnection plays in rotating a CME flux rope and transporting its footpoints while preserving its core structure.
Abstract:
Reducing carbon conversion of ruminally degraded feed into methane increases feed efficiency and reduces emission of this potent greenhouse gas into the environment. Accurate, yet simple, predictions of the methane production of ruminants on any feeding regime are important in the nutrition of ruminants and in modeling the methane produced by them. The current work investigated feed intake, digestibility and methane production by open-circuit respiration measurements in sheep fed 15 untreated, sodium hydroxide (NaOH) treated and anhydrous ammonia (NH3) treated wheat, barley and oat straws. In vitro fermentation characteristics of the straws were obtained from incubations using the Hohenheim gas production system, which measured gas production, true substrate degradability, short-chain fatty acid (SCFA) production and the efficiency of microbial production from the ratio of truly degraded substrate to gas volume. In the 15 straws, organic matter (OM) intake and in vivo OM digestibility ranged from 563 to 1201 g and from 0.464 to 0.643, respectively. Total daily methane production ranged from 13.0 to 34.4 l, whereas methane produced per kg OM apparently digested in vivo varied from 35.0 to 61.8 l. The OM intake was positively related to total methane production (R2 = 0.81, P<0.0001), and in vivo OM digestibility was also positively associated with methane production (R2 = 0.67, P<0.001), but negatively associated with methane production per kg digestible OM intake (R2 = 0.61, P<0.001). In the in vitro incubations of the 15 straws, the ratio of acetate to propionate ranged from 2.3 to 2.8 (P<0.05) and efficiencies of microbial production ranged from 0.21 to 0.37 (P<0.05) at half-asymptotic gas production. Total daily methane production, calculated from the in vitro fermentation characteristics (i.e., true degradability, SCFA ratio and efficiency of microbial production) and OM intake, compared well with methane measured in the open-circuit respiration chamber (y = 2.5 + 0.86x, R2 = 0.89, P<0.0001, Sy.x = 2.3). Methane production from forage-fed ruminants can therefore be predicted accurately by simple in vitro incubations combining true substrate degradability and gas volume measurements, if feed intake is known.
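An illustrative calculation only, with hypothetical values: scaling in vitro fermentation characteristics up to a daily methane estimate from measured OM intake, in the spirit of the prediction described above. The fermentation balance used here is the classical Wolin-type SCFA stoichiometry, an assumption for illustration rather than the authors' exact equation.

```python
om_intake_g = 900.0          # daily OM intake (g)
true_degradability = 0.55    # in vitro truly degraded fraction of OM
microbial_efficiency = 0.30  # fraction of degraded OM incorporated into microbes

# mmol SCFA produced per g of OM fermented (hypothetical in vitro profile)
acetate, propionate, butyrate = 4.0, 1.5, 0.6

# OM actually fermented to SCFA (degraded OM not used for microbial growth)
om_fermented_g = om_intake_g * true_degradability * (1 - microbial_efficiency)

# Wolin-type balance: mmol CH4 = 0.5*Ac - 0.25*Pr + 0.5*Bu (per g fermented)
ch4_mmol_per_g = 0.5 * acetate - 0.25 * propionate + 0.5 * butyrate
ch4_mol = om_fermented_g * ch4_mmol_per_g / 1000.0
ch4_litres = ch4_mol * 22.4  # ideal-gas molar volume at STP

print(f"predicted daily methane ~ {ch4_litres:.1f} L")
```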
Abstract:
Fixed transactions costs that prohibit exchange engender bias in supply analysis due to censoring of the sample observations. The associated bias in conventional regression procedures applied to censored data, and the construction of robust methods for mitigating such bias, have been preoccupations of applied economists since Tobin [Econometrica 26 (1958) 24]. This literature assumes that the true point of censoring in the data is zero; when this is not the case, the assumption imparts a bias to parameter estimates of the censored regression model. We conjecture that this bias can be significant; affirm this from experiments; and suggest techniques for mitigating this bias using Bayesian procedures. The bias-mitigating procedures are based on modifications of the key step that facilitates Bayesian estimation of the censored regression model; are easy to implement; work well in both small and large samples; and lead to significantly improved inference in the censored regression model. These findings are important in light of the widespread use of the zero-censored Tobit regression, and we investigate their consequences using data on milk-market participation in the Ethiopian highlands. © 2004 Elsevier B.V. All rights reserved.
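The 'key step' in Bayesian estimation of the censored regression model is, in general, data augmentation of the latent outcomes for censored observations. The sketch below shows a minimal Gibbs sampler for a Tobit-type model censored at a non-zero point, with synthetic data, a flat prior, and a known error variance assumed for brevity; it is not the authors' bias-mitigating modification.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n, censor_point = 500, 1.0          # note: censoring at c = 1, not at zero
beta_true = np.array([0.5, 1.2])
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y_star = X @ beta_true + rng.normal(scale=1.0, size=n)  # latent outcome
y = np.maximum(y_star, censor_point)                    # observed, censored from below
censored = y <= censor_point

draws, y_aug = [], y.copy()
XtX_inv = np.linalg.inv(X.T @ X)
for it in range(1500):
    # 1) draw beta | augmented data (flat prior, sigma^2 = 1)
    beta_hat = XtX_inv @ X.T @ y_aug
    beta = rng.multivariate_normal(beta_hat, XtX_inv)
    # 2) draw latent y* for censored cases from a normal truncated above at c
    mu_c = X[censored] @ beta
    b = censor_point - mu_c           # upper bound in standardised units
    y_aug[censored] = stats.truncnorm.rvs(-np.inf, b, loc=mu_c, scale=1.0,
                                          random_state=rng)
    if it >= 500:                     # discard burn-in
        draws.append(beta)

print("posterior mean of beta:", np.round(np.mean(draws, axis=0), 2))
```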
Abstract:
This article illustrates the usefulness of applying bootstrap procedures to total factor productivity Malmquist indices, derived with data envelopment analysis (DEA), for a sample of 250 Polish farms during 1996-2000. The confidence intervals constructed as in Simar and Wilson suggest that the common portrayal of productivity decline in Polish agriculture may be misleading. However, a cluster analysis based on bootstrap confidence intervals reveals that important policy conclusions can be drawn regarding productivity enhancement.
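As a simplified stand-in for the Simar and Wilson procedure (which bootstraps DEA efficiency scores with smoothing), the sketch below shows how a percentile bootstrap confidence interval can be used to judge whether a productivity index differs from one; the data and the index construction are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical per-period productivity ratios for one farm
ratios = np.array([0.95, 1.02, 0.97, 1.08, 0.99])

def productivity_index(r):
    return np.exp(np.mean(np.log(r)))   # geometric mean of period ratios

point = productivity_index(ratios)
boot = np.array([
    productivity_index(rng.choice(ratios, size=len(ratios), replace=True))
    for _ in range(5000)
])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"index = {point:.3f}, 95% CI = [{lo:.3f}, {hi:.3f}]")
print("productivity decline significant?", hi < 1.0)
```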
Abstract:
The conventional method for the assessment of acute dermal toxicity (OECD Test Guideline 402, 1987) uses the death of animals as an endpoint to identify the median lethal dose (LD50). A new OECD Test Guideline called the dermal fixed dose procedure (dermal FDP) is being prepared to provide an alternative to Test Guideline 402. In contrast to Test Guideline 402, the dermal FDP does not provide a point estimate of the LD50, but aims to identify the dose of the substance under investigation that causes clear signs of nonlethal toxicity. This is then used to assign classification according to the new Globally Harmonised System of Classification and Labelling (GHS) scheme. The dermal FDP has been validated using statistical modelling rather than by in vivo testing. The statistical modelling approach enables calculation of the probability of each GHS classification, and of the expected numbers of deaths and animals used in the test, for imaginary substances with a range of LD50 values and dose-response curve slopes. This paper describes the dermal FDP and reports the results from the statistical evaluation. It is shown that the procedure will be completed with considerably less death and suffering than Test Guideline 402, and will classify substances either in the same or a more stringent GHS class than that assigned on the basis of the LD50 value.
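A hedged sketch of the kind of dose-response calculation such a statistical evaluation rests on: the probit model, LD50, slope, and test doses below are assumptions for illustration, not the guideline's actual fixed doses or the paper's fitted parameters.

```python
# Probability of death at a given dose, from an assumed probit log-dose response,
# as a building block for simulating expected deaths and GHS classifications.
import numpy as np
from scipy import stats

def p_death(dose, ld50, slope):
    """Probit model: P(death) = Phi(slope * log10(dose / LD50))."""
    return stats.norm.cdf(slope * np.log10(dose / ld50))

ld50, slope = 800.0, 2.0                              # mg/kg and probit slope (hypothetical)
test_doses = np.array([50.0, 200.0, 1000.0, 2000.0])  # illustrative fixed doses (mg/kg)
for d in test_doses:
    print(f"dose {d:7.0f} mg/kg -> P(death) = {p_death(d, ld50, slope):.3f}")

# Expected number of deaths if 5 animals were dosed at 1000 mg/kg
print("expected deaths at 1000 mg/kg:", 5 * p_death(1000.0, ld50, slope))
```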