53 results for Decomposition of Ranked Models
Abstract:
Various methods of assessment have been applied to the One Dimensional Time to Explosion (ODTX) apparatus and experiments with the aim of estimating the comparative violence of the explosion event. Non-mechanical methods used were a simple visual inspection, measuring the increase in the void volume of the anvils following an explosion, and measuring the velocity of the sound produced by the explosion over 1 metre. Mechanical methods used included monitoring piezo-electric devices inserted in the frame of the machine and measuring the rotational velocity of a rotating bar placed on the top of the anvils after it had been displaced by the shock wave. This last method, which resembles original Hopkinson Bar experiments, seemed the easiest to apply and analyse, giving relative rankings of violence and offering the possibility of calculating a “detonation” pressure.
Abstract:
A One-Dimensional Time to Explosion (ODTX) apparatus has been used to study the times to explosion of a number of compositions based on RDX and HMX over a range of contact temperatures. The times to explosion at any given temperature tend to increase from RDX to HMX and with the proportion of HMX in the composition. Thermal ignition theory has been applied to time to explosion data to calculate kinetic parameters. The apparent activation energy for all of the compositions lay between 127 kJ mol−1 and 146 kJ mol−1. There were, however, large differences in the pre-exponential factor, and it was this factor, rather than the activation energy, that controlled the time to explosion.
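The kinetic analysis described above rests on the Arrhenius-type relation between time to explosion and contact temperature. A minimal sketch of how such parameters can be extracted is given below; the temperatures, times, and the simple linearized fit are illustrative assumptions, not data or methods from the paper.

```python
# Hypothetical sketch: estimating an apparent activation energy from
# time-to-explosion data by linearizing t = A * exp(Ea / (R*T)),
# i.e. ln t = ln A + (Ea/R) * (1/T). Example data are invented.
import math

R = 8.314  # gas constant, J mol^-1 K^-1

# (contact temperature K, time to explosion s) -- illustrative values only
data = [(500.0, 120.0), (520.0, 45.0), (540.0, 18.0), (560.0, 8.0)]

xs = [1.0 / T for T, _ in data]          # 1/T
ys = [math.log(t) for _, t in data]      # ln t

# Ordinary least-squares line through (1/T, ln t)
n = len(data)
xbar = sum(xs) / n
ybar = sum(ys) / n
slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)
intercept = ybar - slope * xbar

Ea = slope * R            # apparent activation energy, J mol^-1
A = math.exp(intercept)   # pre-exponential factor, s

print(f"Ea = {Ea / 1000:.0f} kJ mol^-1, A = {A:.2e} s")
```

With real ODTX data the fit would be done per composition, allowing the apparent activation energies and pre-exponential factors of the RDX/HMX formulations to be compared directly.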
Abstract:
This study evaluated the effects of fat and sugar levels on the surface properties of Lactobacillus rhamnosus GG during storage in food model systems, simulating yogurt and ice cream, and related them to the ability of the bacterial cells to adhere to Caco-2 cells. Freeze-dried L. rhamnosus GG cells were added to the model food systems and stored for 7 days. The bacterial cells were analyzed for cell viability, hydrophobicity, ζ potential, and their ability to adhere to Caco-2 cells. The results indicated that the food type and its composition affected the surface and adhesion properties of the bacterial cells during storage, with yogurt being a better delivery vehicle than ice cream in terms of bacterial adhesion to Caco-2 cells. The most important factor influencing bacterial adhesion was the storage time rather than the levels of fats and sugars, indicating that conformational changes were taking place on the surface of the bacterial cells during storage.
Abstract:
Tropical cyclones (TCs) are normally not studied at the individual level with Global Climate Models (GCMs), because the coarse grid spacing is often deemed insufficient for a realistic representation of the basic underlying processes. GCMs are indeed routinely deployed at low resolution, in order to enable sufficiently long integrations, which means that only large-scale TC proxies are diagnosed. A new class of GCMs is emerging, however, which is capable of simulating TC-type vortices by retaining a horizontal resolution similar to that of operational NWP GCMs; their integration on the latest supercomputers enables the completion of long-term integrations. The UK-Japan Climate Collaboration and the UK-HiGEM projects have developed climate GCMs which can be run routinely for decades (with grid spacing of 60 km) or centuries (with grid spacing of 90 km); when coupled to the ocean GCM, a mesh of 1/3 degrees provides eddy-permitting resolution. The 90 km resolution model has been developed entirely by the UK-HiGEM consortium (together with its 1/3 degree ocean component); the 60 km atmospheric GCM has been developed by UJCC, in collaboration with the Met Office Hadley Centre.
Abstract:
This paper extends the singular value decomposition to a path of matrices E(t). An analytic singular value decomposition of a path of matrices E(t) is an analytic path of factorizations E(t) = X(t)S(t)Y(t)^T where X(t) and Y(t) are orthogonal and S(t) is diagonal. To maintain differentiability, the diagonal entries of S(t) are allowed to be either positive or negative and to appear in any order. This paper investigates the existence and uniqueness of analytic SVDs and develops an algorithm for computing them. We show that a real analytic path E(t) always admits a real analytic SVD, and that a full-rank, smooth path E(t) with distinct singular values admits a smooth SVD. We derive a differential equation for the left factor, develop Euler-like and extrapolated Euler-like numerical methods for approximating an analytic SVD, and prove that the Euler-like method converges.
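The key idea above, that differentiability can be preserved by letting diagonal entries of S(t) change sign, can be illustrated with a naive sign-alignment sketch. This is not the paper's algorithm (which uses a differential equation for the left factor); it simply computes a standard SVD at each sample point and flips signs so the left factor varies continuously, absorbing each flip into a signed diagonal entry. The function name and example path are assumptions for demonstration.

```python
# Naive illustration: continuous left factor along a matrix path E(t),
# with signed "singular values" as allowed by an analytic SVD.
import numpy as np

def continuous_svd_path(E_of_t, ts):
    """At each t, factor E(t) = X @ diag(s) @ Y.T. If a column of X flips
    sign relative to the previous step, flip it back and negate the matching
    entry of s, which leaves the product unchanged."""
    factors = []
    prev_X = None
    for t in ts:
        X, s, Yt = np.linalg.svd(E_of_t(t))
        X, s = X.copy(), s.astype(float).copy()
        if prev_X is not None:
            for j in range(len(s)):
                if X[:, j] @ prev_X[:, j] < 0:
                    X[:, j] *= -1.0  # realign with the previous step
                    s[j] *= -1.0     # absorb the flip into a signed entry
        prev_X = X
        factors.append((X, s, Yt.T))
    return factors

# Path whose singular value passes through zero: E(t) = [[t]].
ts = np.linspace(-1.0, 1.0, 5)
factors = continuous_svd_path(lambda t: np.array([[t]]), ts)
```

A fuller treatment would also align the right factor and handle crossing singular values, which is where the paper's analytic machinery is needed.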
Abstract:
Scoring rules are an important tool for evaluating the performance of probabilistic forecasting schemes. A scoring rule is called strictly proper if its expectation is optimal if and only if the forecast probability represents the true distribution of the target. In the binary case, strictly proper scoring rules allow for a decomposition into terms related to the resolution and the reliability of a forecast. This fact is particularly well known for the Brier Score. In this article, this result is extended to forecasts for finite-valued targets. Both resolution and reliability are shown to have a positive effect on the score. It is demonstrated that resolution and reliability are directly related to forecast attributes that are desirable on grounds independent of the notion of scores. This finding can be considered an epistemological justification of measuring forecast quality by proper scoring rules. A link is provided to the original work of DeGroot and Fienberg, extending their concepts of sufficiency and refinement. The relation to the conjectured sharpness principle of Gneiting et al. is elucidated.
Abstract:
Proper scoring rules provide a useful means to evaluate probabilistic forecasts. Independently of scoring rules, it has been argued that reliability and resolution are desirable forecast attributes. The mathematical expectation value of the score allows for a decomposition into reliability- and resolution-related terms, demonstrating a relationship between scoring rules and reliability/resolution. A similar decomposition holds for the empirical (i.e. sample average) score over an archive of forecast–observation pairs. This empirical decomposition, though, provides too optimistic an estimate of the potential score (i.e. the optimum score which could be obtained through recalibration), showing that a forecast assessment based solely on the empirical resolution and reliability terms will be misleading. The differences between the theoretical and empirical decompositions are investigated, and specific recommendations are given on how to obtain better estimators of reliability and resolution in the case of the Brier and Ignorance scoring rules.
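The empirical decomposition discussed above is, for the binary Brier score, the classical reliability/resolution/uncertainty split obtained by grouping an archive of forecast–observation pairs by forecast value. A minimal sketch follows; the data and the function name are illustrative assumptions, and the estimator shown is exactly the naive empirical one whose optimism the abstract warns about.

```python
# Empirical Brier score decomposition: BS = REL - RES + UNC,
# exact when pairs are binned by their exact forecast value.
from collections import defaultdict

def brier_decomposition(forecasts, outcomes):
    n = len(forecasts)
    bins = defaultdict(list)
    for f, o in zip(forecasts, outcomes):
        bins[f].append(o)
    obar = sum(outcomes) / n  # climatological base rate
    reliability = sum(len(os) * (f - sum(os) / len(os)) ** 2
                      for f, os in bins.items()) / n
    resolution = sum(len(os) * (sum(os) / len(os) - obar) ** 2
                     for os in bins.values()) / n
    uncertainty = obar * (1 - obar)
    return reliability, resolution, uncertainty

# Invented archive of forecast-observation pairs
forecasts = [0.1, 0.1, 0.8, 0.8, 0.8, 0.5]
outcomes  = [0,   0,   1,   1,   0,   1]

rel, res, unc = brier_decomposition(forecasts, outcomes)
bs = sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)
# The three terms recombine exactly to the average Brier score.
```

On small samples, `rel` and `res` computed this way are biased, which is precisely why the article recommends better estimators of the two terms.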
Abstract:
The extra-tropical response to El Niño in configurations of a coupled model with increased horizontal resolution in the oceanic component is shown to be more realistic than in configurations with a low resolution oceanic component. This general conclusion is independent of the atmospheric resolution. Resolving small-scale processes in the ocean produces a more realistic oceanic mean state, with a reduced cold tongue bias, which in turn allows the atmospheric model component to be forced more realistically. A realistic atmospheric basic state is critical in order to represent Rossby wave propagation in response to El Niño, and hence the extra-tropical response to El Niño. Through the use of high and low resolution configurations of the forced atmospheric-only model component we show that, in isolation, atmospheric resolution does not significantly affect the simulation of the extra-tropical response to El Niño. It is demonstrated, through perturbations to the SST forcing of the atmospheric model component, that biases in the climatological SST field typical of coupled model configurations with low oceanic resolution can account for the erroneous atmospheric basic state seen in these coupled model configurations. These results highlight the importance of resolving small-scale oceanic processes in producing a realistic large-scale mean climate in coupled models, and suggest that it may be possible to “squeeze out” valuable extra performance from coupled models through increases to oceanic resolution alone.
Abstract:
There is large uncertainty about the magnitude of warming and how rainfall patterns will change in response to any given scenario of future changes in atmospheric composition and land use. The models used for future climate projections were developed and calibrated using climate observations from the past 40 years. The geologic record of environmental responses to climate changes provides a unique opportunity to test model performance outside this limited climate range. Evaluation of model simulations against palaeodata shows that models reproduce the direction and large-scale patterns of past changes in climate, but tend to underestimate the magnitude of regional changes. As part of the effort to reduce model-related uncertainty and produce more reliable estimates of twenty-first century climate, the Palaeoclimate Modelling Intercomparison Project is systematically applying palaeoevaluation techniques to simulations of the past run with the models used to make future projections. This evaluation will provide assessments of model performance, including whether a model is sufficiently sensitive to changes in atmospheric composition, as well as providing estimates of the strength of biosphere and other feedbacks that could amplify the model response to these changes and modify the characteristics of climate variability.
Abstract:
The ability of the climate models participating in phase 5 of the Coupled Model Intercomparison Project (CMIP5) to simulate North Atlantic extratropical cyclones in winter [December–February (DJF)] and summer [June–August (JJA)] is investigated in detail. Cyclones are identified as maxima in T42 vorticity at 850 hPa and their propagation is tracked using an objective feature-tracking algorithm. By comparing the historical CMIP5 simulations (1976–2005) and the ECMWF Interim Re-Analysis (ERA-Interim; 1979–2008), the authors find that systematic biases affect the number and intensity of North Atlantic cyclones in CMIP5 models. In DJF, the North Atlantic storm track tends to be either too zonal or displaced southward, thus leading to too few and weak cyclones over the Norwegian Sea and too many cyclones in central Europe. In JJA, the position of the North Atlantic storm track is generally well captured but some CMIP5 models underestimate the total number of cyclones. The dynamical intensity of cyclones, as measured by either T42 vorticity at 850 hPa or mean sea level pressure, is too weak in both DJF and JJA. The intensity bias has a hemispheric character, and it cannot be simply attributed to the representation of the North Atlantic large-scale atmospheric state. Despite these biases, the representation of Northern Hemisphere (NH) storm tracks has improved since CMIP3 and some CMIP5 models are able to represent well both the number and the intensity of North Atlantic cyclones. In particular, some of the higher-atmospheric-resolution models tend to have a better representation of the tilt of the North Atlantic storm track and of the intensity of cyclones in DJF.
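The first step of the tracking approach described above, identifying cyclone centres as maxima of 850-hPa vorticity, can be sketched as a local-maximum search on a gridded field. This is a simplified illustration, not the objective feature-tracking algorithm the study uses (which additionally filters to T42 resolution and links maxima into tracks); the function name, threshold, and synthetic field are assumptions.

```python
# Illustrative sketch: candidate cyclone centres as grid points that
# exceed all eight neighbours and a vorticity threshold.
import numpy as np

def local_maxima(field, threshold=0.0):
    """Return (i, j) interior grid points that are strict local maxima
    of `field` and exceed `threshold`."""
    maxima = []
    for i in range(1, field.shape[0] - 1):
        for j in range(1, field.shape[1] - 1):
            window = field[i - 1:i + 2, j - 1:j + 2]
            is_peak = (field[i, j] == window.max()
                       and (window == field[i, j]).sum() == 1)
            if field[i, j] > threshold and is_peak:
                maxima.append((i, j))
    return maxima

# Synthetic relative-vorticity field (s^-1) with one vortex centre
vort = np.zeros((5, 5))
vort[2, 2] = 3e-5
print(local_maxima(vort, threshold=1e-5))  # → [(2, 2)]
```

In a real tracker, the maxima found at each time step would then be joined into trajectories, which is the part handled by the objective feature-tracking algorithm.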
Abstract:
We utilize energy budget diagnostics from the Coupled Model Intercomparison Project phase 5 (CMIP5) to evaluate the models' climate forcing since preindustrial times employing an established regression technique. The climate forcing evaluated this way, termed the adjusted forcing (AF), includes a rapid adjustment term associated with cloud changes and other tropospheric and land-surface changes. We estimate a 2010 total anthropogenic and natural AF from CMIP5 models of 1.9 ± 0.9 W m−2 (5–95% range). The projected AFs of the Representative Concentration Pathway simulations are lower than their expected radiative forcing (RF) in 2095 but agree well with efficacy weighted forcings from integrated assessment models. The smaller AF, compared to RF, is likely due to cloud adjustment. Multimodel time series of temperature change and AF from 1850 to 2100 have large intermodel spreads throughout the period. The intermodel spread of temperature change is principally driven by forcing differences in the present day and climate feedback differences in 2095, although forcing differences are still important for model spread at 2095. We find no significant relationship between the equilibrium climate sensitivity (ECS) of a model and its 2003 AF, in contrast to that found in older models where higher ECS models generally had less forcing. Given the large present-day model spread, there is no indication of any tendency by modelling groups to adjust their aerosol forcing in order to produce observed trends. Instead, some CMIP5 models have a relatively large positive forcing and overestimate the observed temperature change.
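The "established regression technique" referred to above can be sketched in its simplest form: regress the top-of-atmosphere net flux imbalance N against the surface temperature change dT over a forced run, and read the adjusted forcing off the intercept and the feedback parameter off the slope. The sketch below is a hedged illustration under that assumption; the numbers are synthetic, not CMIP5 diagnostics.

```python
# Sketch of a forcing-feedback regression: N = F + lam * dT,
# where F is the adjusted forcing (W m^-2) and lam the feedback
# parameter (W m^-2 K^-1, negative for a stable climate).

def fit_line(xs, ys):
    """Ordinary least-squares fit; returns (intercept, slope)."""
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
            sum((x - xbar) ** 2 for x in xs)
    return ybar - slope * xbar, slope

# Synthetic annual means from an idealized forced run
dT = [0.8, 1.5, 2.1, 2.6, 3.0]           # K, warming through the run
N  = [3.9 - 1.2 * t for t in dT]         # W m^-2, exactly linear here

F, lam = fit_line(dT, N)
# F recovers 3.9 W m^-2 and lam recovers -1.2 W m^-2 K^-1 by construction.
```

With real model output the scatter about the regression line is substantial, and the intercept then only estimates the adjusted forcing, including its rapid-adjustment component.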
Abstract:
Longitudinal flow bursts observed by the European Incoherent Scatter (EISCAT) radar, in association with dayside auroral transients observed from Svalbard, have been interpreted as resulting from pulses of enhanced reconnection at the dayside magnetopause. However, an alternative model has recently been proposed for a steady rate of magnetopause reconnection, in which the bursts of longitudinal flow are due to increases in the field line curvature force, associated with the By component of the magnetosheath field. We here evaluate these two models, using observations on January 20, 1990, by EISCAT and a 630-nm all-sky camera at Ny Ålesund. For both models, we predict the behavior of both the dayside flows and the 630-nm emissions on newly opened field lines. It is shown that the signatures of steady reconnection and magnetosheath By changes could possibly resemble the observed 630-nm auroral events, but only for certain locations of the observing site, relative to the ionospheric projection of the reconnection X line: however, in such cases, the flow bursts would be seen between the 630-nm transients and not within them. On the other hand, the model of reconnection rate pulses predicts that the flows will be enhanced within each 630-nm transient auroral event. The observations on January 20, 1990, are shown to be consistent with the model of enhanced reconnection rate pulses over a background level and inconsistent with the effects of periodic enhancements of the magnitude of the magnetosheath By component. We estimate that the reconnection rate within the pulses would have to be at least an order of magnitude larger than the background level between the pulses.
Abstract:
Little is known about the effect of edaphic conditions on the decomposition of buried mammalian tissues. To address this, we set up a replicated incubation study with three fresh soils of contrasting pH: a Podsol (acidic), a Cambisol (neutral), and a Rendzina (alkaline), in which skeletal muscle tissue (SMT) of known mass was allowed to decompose. Our results clearly demonstrated that soil type had a considerable effect on the decomposition of SMT buried in soil. The rate of decomposition differed by up to a factor of three between the Podsol and the Rendzina. The rate of microbial respiration was correlated to the rate of soft tissue loss, which suggests that the decomposition of SMT is dependent on the microbial community present in the soil. Decompositional by-products caused the pH of the immediate soil environment to change, becoming more alkaline at first, before acidifying. Our results demonstrate the need for greater consideration of soil type in future taphonomic studies.
Abstract:
The ecology of soils associated with dead mammals (i.e. cadavers) is poorly understood. Although temperature and soil type are well known to influence the decomposition of other organic resource patches, the effect of these variables on the degradation of cadavers in soil has received little experimental investigation. To address this, cadavers of juvenile rats (Rattus rattus) were buried in one of three contrasting soils (Sodosol, Rudosol, and Vertosol) from tropical savanna ecosystems in Queensland, Australia and incubated at 29 °C, 22 °C, or 15 °C in a laboratory setting. Cadavers and soils were destructively sampled at intervals of 7 days over an incubation period of 28 days. Measurements of decomposition included cadaver mass loss, carbon dioxide–carbon (CO2–C) evolution, microbial biomass carbon (MBC), protease activity, phosphodiesterase activity, and soil pH, which were all significantly positively affected by cadaver burial. A temperature effect was observed whereby peaks or differences in decomposition that occurred at higher temperatures occurred at later sample periods at lower temperatures. Soil type also had an important effect on some measured parameters. These findings have important implications for a largely unexplored area of soil ecology and nutrient cycling, which are significant for forensic science, cemetery planning and livestock carcass disposal.