27 results for Fixed smeared crack model


Relevance:

30.00%

Publisher:

Abstract:

Successful quantitative precipitation forecasts under convectively unstable conditions depend on the ability of the model to capture the location, timing and intensity of convection. Ensemble forecasts of two mesoscale convective outbreaks over the UK are examined with a view to understanding the nature and extent of their predictability. In addition to a control forecast, twelve ensemble members are run for each case with the same boundary conditions but with perturbations added to the boundary layer. The intention is to introduce perturbations of appropriate magnitude and scale so that the large-scale behaviour of the simulations is not changed. In one case, convection was in statistical equilibrium with the large-scale flow. This placed a constraint on the total precipitation, but the location and intensity of individual storms varied. In contrast, the other case was characterised by a large-scale capping inversion. As a result, the location of individual storms was fixed, but their intensities and the total precipitation varied strongly. The ensemble shows case-to-case variability in the nature of the predictability of convection in a mesoscale model, and provides additional useful information for quantitative precipitation forecasting.
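To make the perturbation strategy concrete, here is a minimal sketch of one common way to build such an ensemble: smooth white noise to a chosen horizontal scale and rescale it to a small amplitude before adding it to a boundary-layer field. The field, amplitude and length scale are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def perturb_field(field, amplitude_k=0.1, scale_gridpoints=8.0, seed=0):
    """Add a smooth random perturbation of fixed amplitude to a 2-D field."""
    rng = np.random.default_rng(seed)
    noise = gaussian_filter(rng.standard_normal(field.shape), scale_gridpoints)
    noise *= amplitude_k / noise.std()       # set the perturbation amplitude (K)
    return field + noise

theta_bl = np.full((100, 100), 290.0)        # idealised boundary-layer theta (K)
ensemble = [perturb_field(theta_bl, seed=m) for m in range(12)]
```

Keeping the amplitude small and the spatial scale short is what leaves the large-scale flow of each member unchanged while still displacing individual storms.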

Relevance:

30.00%

Publisher:

Abstract:

In this paper we introduce a new testing procedure for evaluating the rationality of fixed-event forecasts, based on a pseudo-maximum likelihood estimator. The procedure is designed to be robust to departures from the normality assumption. A model is introduced to show that such departures are likely when forecasters experience a credibility loss when they make large changes to their forecasts. The test is illustrated using monthly fixed-event forecasts produced by four UK institutions. Use of the robust test leads to the conclusion that certain forecasts are rational, while use of the Gaussian-based test implies that the same forecasts are irrational. The difference in the results is due to the nature of the underlying data.
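As a hedged illustration of the flavour of such a test: one standard rationality check for fixed-event forecasts is that forecast revisions are unpredictable, so a minimal version tests whether revisions have zero mean. Robust (sandwich) standard errors stand in here for the paper's pseudo-maximum likelihood estimator, and the fat-tailed data are synthetic.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
revisions = rng.standard_t(df=3, size=23)      # synthetic fat-tailed forecast revisions

model = sm.OLS(revisions, np.ones_like(revisions))
print(model.fit().t_test([1.0]))               # Gaussian-based test of zero-mean revisions
print(model.fit(cov_type="HC1").t_test([1.0])) # robust test, less sensitive to fat tails
```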

Relevance:

30.00%

Publisher:

Abstract:

In this paper, ensembles of forecasts (of up to six hours) are studied from a convection-permitting model with a representation of model error due to unresolved processes. The ensemble prediction system (EPS) used is an experimental convection-permitting version of the UK Met Office’s 24-member Global and Regional Ensemble Prediction System (MOGREPS). The method of representing model error variability, which perturbs parameters within the model’s parameterisation schemes, has been modified, and we investigate the impact of applying this scheme in different ways. These are: a control ensemble where all ensemble members have the same parameter values; an ensemble where the parameters are different between members, but fixed in time; and ensembles where the parameters are updated randomly every 30 or 60 min. The choice of parameters and their ranges of variability have been determined from expert opinion and parameter sensitivity tests. A case of frontal rain over the southern UK has been chosen, which has a multi-banded rainfall structure. The consequences of including model error variability in the case studied are mixed and are summarised as follows. The multiple banding, evident in the radar, is not captured by any single member. However, the single band is positioned in some members where a secondary band is present in the radar. This is found for all ensembles studied. Adding model error variability with parameters fixed in time does increase the ensemble spread for near-surface variables like wind and temperature, but can actually decrease the spread of the rainfall. Perturbing the parameters periodically throughout the forecast does not further increase the spread and exhibits “jumpiness” in the spread at the times when the parameters are perturbed. Adding model error variability gives an improvement in forecast skill after the first 2–3 h of the forecast for near-surface temperature and relative humidity. For precipitation skill scores, adding model error variability improves the skill in the first 1–2 h of the forecast, but reduces it after that. Complementary experiments were performed where the only difference between members was the set of parameter values (i.e. no initial condition variability). The resulting spread was found to be significantly less than the spread from initial condition variability alone.
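A minimal sketch of the parameter-perturbation logic described above, covering both the fixed-in-time and the periodically updated variants; the parameter names and ranges are illustrative assumptions, not the scheme's actual choices.

```python
import numpy as np

# Assumed ranges; the real scheme uses expert-determined parameters and bounds.
PARAM_RANGES = {"entrainment_rate": (0.5e-3, 2.0e-3),
                "cloud_droplet_number": (5.0e7, 3.0e8)}

def draw_parameters(rng):
    return {name: rng.uniform(lo, hi) for name, (lo, hi) in PARAM_RANGES.items()}

def run_member(member_id, forecast_minutes=360, update_every=60):
    """update_every=None: parameters fixed in time; 30 or 60: periodic updates."""
    rng = np.random.default_rng(member_id)
    params = draw_parameters(rng)
    for minute in range(0, forecast_minutes, 30):
        if update_every is not None and minute % update_every == 0:
            params = draw_parameters(rng)    # stochastic update of the parameters
        # ... advance the model by 30 min using `params` ...
    return params
```

Seeding each member differently gives the between-member parameter variability; the "jumpiness" in spread noted above corresponds to the discrete update times in this loop.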

Relevance:

30.00%

Publisher:

Abstract:

In recent years both developed and developing countries have experienced an increasing number of government initiatives dedicated to reducing the administrative costs (AC) imposed on businesses by regulation. We use a bi-linear fixed-effects model to analyze the extent to which government initiatives to reduce AC through the Standard Cost Model (SCM) attract Foreign Direct Investment (FDI) among 32 developing countries. Controlling for standard determinants of FDI, we find that the SCM in most cases leads to higher FDI and that the benefits are more significant where the SCM has been implemented for a longer period.
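A hedged sketch of what a two-way fixed-effects panel regression of FDI on an SCM-adoption indicator can look like; the variable names and the synthetic panel are illustrative assumptions, not the paper's data or exact specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
panel = pd.DataFrame({"country": np.repeat(np.arange(32), 10),
                      "year": np.tile(np.arange(2000, 2010), 32)})
panel["scm"] = (panel["year"] >= 2004).astype(int) * (panel["country"] % 2)
panel["fdi"] = 1.0 + 0.3 * panel["scm"] + rng.standard_normal(len(panel))

# Country and year dummies absorb the two fixed effects; cluster by country.
fit = smf.ols("fdi ~ scm + C(country) + C(year)", data=panel).fit(
    cov_type="cluster", cov_kwds={"groups": panel["country"]})
print(fit.params["scm"], fit.bse["scm"])
```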

Relevance:

30.00%

Publisher:

Abstract:

The MATLAB model is contained within the compressed folders (versions are available as .zip and .tgz). This model uses MERRA reanalysis data (>34 years available) to estimate the hourly aggregated wind power generation for a predefined (fixed) distribution of wind farms. A ready-made example is included for the wind farm distribution of Great Britain, April 2014 ("CF.dat"). This consists of an hourly time series of the GB-total capacity factor spanning the period 1980-2013 inclusive. Given the global nature of reanalysis data, the model can be applied to any specified distribution of wind farms in any region of the world. Users are, however, strongly advised to bear in mind the limitations of reanalysis data when using this model/data. This is discussed in our paper: Cannon, Brayshaw, Methven, Coker, Lenaghan. "Using reanalysis data to quantify extreme wind power generation statistics: a 33 year case study in Great Britain". Submitted to Renewable Energy in March 2014. Additional information about the model is contained in the model code itself, in the accompanying ReadMe file, and on our website: http://www.met.reading.ac.uk/~energymet/data/Cannon2014/
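The core aggregation step such a model performs can be sketched as follows: map hourly hub-height wind speeds at each farm through a power curve and form a capacity-weighted total capacity factor. The cut-in/rated/cut-out speeds and the synthetic winds below are illustrative assumptions, not the model's actual curve or MERRA fields.

```python
import numpy as np

def power_curve(v, cut_in=3.5, rated=13.0, cut_out=25.0):
    """Fraction of rated power produced at wind speed v (m/s)."""
    frac = np.clip(((v - cut_in) / (rated - cut_in)) ** 3, 0.0, 1.0)
    return np.where((v < cut_in) | (v > cut_out), 0.0, frac)

rng = np.random.default_rng(3)
winds = rng.weibull(2.0, size=(8760, 50)) * 9.0   # hours x farms, synthetic (m/s)
capacities = rng.uniform(10.0, 200.0, size=50)    # installed MW per farm

# Hourly aggregate capacity factor, analogous in shape to the CF.dat time series.
cf = power_curve(winds) @ capacities / capacities.sum()
```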

Relevance:

30.00%

Publisher:

Abstract:

A numerical model embodying the concepts of the Cowley-Lockwood (Cowley and Lockwood, 1992, 1997) paradigm has been used to produce a simple Cowley–Lockwood-type expanding flow pattern and to calculate the resulting change in ion temperature. Cross-correlation, fixed-threshold analysis and a threshold relative to the peak are used to determine the phase speed of the change in the convection pattern in response to a change in applied reconnection. Each of these methods fails to fully recover the expansion of the onset of the convection response that is inherent in the simulations. The results of this study indicate that any expansion of the convection pattern will be best observed in time-series data using a threshold which is a fixed fraction of the peak response. We show that these methods used to determine the expansion velocity can be used to discriminate between the two main models for the convection response to a change in reconnection.
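The two threshold methods compared above can be sketched as follows: onset time from a fixed absolute threshold versus a threshold set at a fixed fraction of each series' peak response. The synthetic response curve is an illustrative assumption.

```python
import numpy as np

def onset_time(t, signal, threshold):
    """First time the signal exceeds `threshold` (nan if it never does)."""
    above = np.nonzero(signal > threshold)[0]
    return t[above[0]] if above.size else np.nan

t = np.linspace(0.0, 600.0, 601)                          # seconds
response = np.tanh((t - 200.0) / 60.0).clip(0.0) * 150.0  # delayed rise in ion temperature

t_fixed = onset_time(t, response, threshold=50.0)         # fixed absolute threshold
t_rel = onset_time(t, response, 0.33 * response.max())    # threshold relative to peak
```

The relative threshold adapts to weak responses far from the onset region, which is why it best tracks an expanding pattern in time-series data.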

Relevance:

30.00%

Publisher:

Abstract:

We present a simple, generic model of annual tree growth, called "T". This model accepts input from a first-principles light-use efficiency model (the "P" model). The P model provides values for gross primary production (GPP) per unit of absorbed photosynthetically active radiation (PAR). Absorbed PAR is estimated from the current leaf area. GPP is allocated to foliage, transport tissue, and fine-root production and respiration in such a way as to satisfy well-understood dimensional and functional relationships. Our approach thereby integrates two modelling approaches separately developed in the global carbon-cycle and forest-science literature. The T model can represent both ontogenetic effects (the impact of ageing) and the effects of environmental variations and trends (climate and CO2) on growth. Driven by local climate records, the model was applied to simulate ring widths during the period 1958–2006 for multiple trees of Pinus koraiensis from the Changbai Mountains in northeastern China. Each tree was initialised at its actual diameter at the time when local climate records started. The model produces realistic simulations of the interannual variability in ring width for different age cohorts (young, mature, and old). Both the simulations and observations show a significant positive response of tree-ring width to growing-season total photosynthetically active radiation (PAR0) and the ratio of actual to potential evapotranspiration (α), and a significant negative response to mean annual temperature (MAT). The slopes of the simulated and observed relationships with PAR0 and α are similar; the negative response to MAT is underestimated by the model. Comparison of simulations with fixed and changing atmospheric CO2 concentration shows that CO2 fertilisation over the past 50 years is too small to be distinguished in the ring-width data, given ontogenetic trends and interannual variability in climate.
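A heavily simplified sketch of the model chain described above: absorbed PAR from leaf area via Beer's law, GPP from a light-use efficiency, and the stem's share of the carbon balance converted to a diameter increment and hence a ring width. All parameter values and allometric forms here are illustrative assumptions, not the published T model.

```python
import numpy as np

def ring_widths(diameter_cm, par0, lue=0.02, k=0.5, years=10):
    """Return simulated ring widths (cm) for `years` growing seasons."""
    widths = []
    for _ in range(years):
        leaf_area = 0.8 * diameter_cm ** 1.6                 # assumed crown allometry
        apar = par0 * (1.0 - np.exp(-k * leaf_area / 50.0))  # Beer's-law light capture
        gpp = lue * apar                                     # assumed light-use efficiency
        stem_growth = 0.3 * gpp                              # share allocated to the stem
        dd = stem_growth / (0.05 * diameter_cm + 1.0)        # ontogeny: large trees thicken less
        widths.append(dd / 2.0)                              # ring width = radius increment
        diameter_cm += dd
    return np.array(widths)

print(ring_widths(diameter_cm=12.0, par0=2500.0))
```

Initialising at the observed diameter, as the study does, fixes the ontogenetic state so that the simulated interannual variability reflects the climate drivers.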

Relevance:

30.00%

Publisher:

Abstract:

An FTC-DOJ study argues that state laws and regulations may inhibit the unbundling of real estate brokerage services in response to new technology. Our data show that 18 states have changed laws in ways that promote unbundling since 2000. We model brokerage costs, as measured by the number of agents, in a state-level annual panel vector autoregressive framework, a novel way of analyzing wasteful competition. Our findings support a positive relationship between brokerage costs and lagged house prices and transactions. We find that the number of full-service brokers responds negatively (by well over two percentage points per year) to legal changes facilitating unbundling.
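A hedged sketch of the kind of one-lag panel autoregression described above: agent numbers responding to lagged prices, lagged transactions and a law-change indicator, with state fixed effects. All variable names and the synthetic panel are illustrative assumptions, not the study's data or full VAR system.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
df = pd.DataFrame({"state": np.repeat(np.arange(50), 15),
                   "year": np.tile(np.arange(2000, 2015), 50)})
df["price"] = rng.standard_normal(len(df)).cumsum()
df["transactions"] = rng.standard_normal(len(df))
df["unbundle"] = (df["year"] >= 2005).astype(int) * (df["state"] % 3 == 0)
df["agents"] = 0.5 * df["price"] - 2.5 * df["unbundle"] + rng.standard_normal(len(df))

df = df.sort_values(["state", "year"])
for col in ["agents", "price", "transactions"]:           # one-lag structure
    df[f"{col}_lag"] = df.groupby("state")[col].shift(1)

fit = smf.ols("agents ~ agents_lag + price_lag + transactions_lag + unbundle + C(state)",
              data=df.dropna()).fit()
print(fit.params[["price_lag", "unbundle"]])
```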

Relevance:

30.00%

Publisher:

Abstract:

Initializing the ocean for decadal predictability studies is a challenge, as it requires reconstructing the little-observed subsurface trajectory of ocean variability. In this study we explore to what extent surface nudging using well-observed sea surface temperature (SST) can reconstruct the deeper ocean variations for the 1949–2005 period. An ensemble is made with a nudged version of the IPSLCM5A model and compared to ocean reanalyses and reconstructed datasets. The SST is restored to observations using a physically based relaxation coefficient, in contrast to earlier studies, which use a much larger value. The assessment is restricted to the regions where the ocean reanalyses agree, i.e. in the upper 500 m of the ocean, although this can be latitude and basin dependent. Significant reconstruction of the subsurface is achieved in specific regions, namely regions of subduction in the subtropical Atlantic, below the thermocline in the equatorial Pacific and, in some cases, in the North Atlantic deep convection regions. Beyond the mean correlations, ocean integrals are used to explore the time evolution of the correlation over 20-year windows. Classical fixed-depth heat content diagnostics do not exhibit any significant reconstruction between the different existing observation-based references and can therefore not be used to assess global average time-varying correlations in the nudged simulations. Using the physically based average temperature above an isotherm (14 °C) alleviates this issue in the tropics and subtropics and shows significant reconstruction of these quantities in the nudged simulations for several decades. This skill is attributed to the wind stress reconstruction in the tropics, as already demonstrated in a perfect model study using the same model. Thus, we also show here the robustness of this result in a historical and observational context.
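The nudging itself can be sketched in a few lines: a restoring heat flux -gamma*(SST - SST_obs) converted to a temperature tendency via the heat capacity of the model's top layer. The values of gamma and the layer depth below are illustrative assumptions of a physically plausible order of magnitude, not the study's exact settings.

```python
import numpy as np

RHO_CP = 1025.0 * 3990.0     # seawater density x specific heat (J m-3 K-1)

def nudge_sst(sst, sst_obs, dt=3600.0, gamma=40.0, top_layer_m=10.0):
    """One nudging step: gamma in W m-2 K-1, dt in seconds."""
    tendency = -gamma * (sst - sst_obs) / (RHO_CP * top_layer_m)  # K s-1
    return sst + dt * tendency

sst = np.full((90, 180), 288.0)          # model SST field (K)
sst_obs = sst + 0.5                      # observations, warmer by 0.5 K
sst = nudge_sst(sst, sst_obs)
```

A "physically based" gamma of a few tens of W m-2 K-1 restores SST on a time scale of weeks for a shallow mixed layer, much gentler than the near-instantaneous restoring implied by the much larger coefficients of earlier studies.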

Relevance:

30.00%

Publisher:

Abstract:

Genome-wide association studies (GWAS) have been widely used in the genetic dissection of complex traits. However, common methods are all based on a fixed-SNP-effect mixed linear model (MLM) and single-marker analysis, such as efficient mixed model analysis (EMMA). These methods require Bonferroni correction for multiple tests, which is often too conservative when the number of markers is extremely large. To address this concern, we proposed a random-SNP-effect MLM (RMLM) and a multi-locus RMLM (MRMLM) for GWAS. The RMLM simply treats the SNP effect as random, but it allows a modified Bonferroni correction to be used to calculate the threshold p value for significance tests. The MRMLM is a multi-locus model including markers selected from the RMLM method with a less stringent selection criterion. Due to its multi-locus nature, no multiple-test correction is needed. Simulation studies show that the MRMLM is more powerful in QTN detection and more accurate in QTN effect estimation than the RMLM, which in turn is more powerful and accurate than the EMMA. To demonstrate the new methods, we analyzed six flowering-time-related traits in Arabidopsis thaliana and detected more genes than previously reported using the EMMA. Therefore, the MRMLM provides an alternative for multi-locus GWAS.
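For flavour, here is a hedged sketch of a single-marker scan against a Bonferroni-style threshold; a plain per-SNP regression stands in for the mixed-model machinery (the kinship random effect of EMMA, the random SNP effect of the RMLM) that actually distinguishes the methods above. Genotypes and the trait are synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n, m = 200, 5000
genotypes = rng.integers(0, 3, size=(n, m)).astype(float)  # 0/1/2 allele coding
trait = 0.8 * genotypes[:, 42] + rng.standard_normal(n)    # one causal SNP

# Per-SNP slope t-tests via correlation (equivalent to simple regression).
r = np.array([stats.pearsonr(genotypes[:, j], trait)[0] for j in range(m)])
tstat = r * np.sqrt((n - 2) / (1 - r**2))
pvals = 2 * stats.t.sf(np.abs(tstat), df=n - 2)

alpha = 0.05 / m                                           # classical Bonferroni threshold
print(np.nonzero(pvals < alpha)[0])                        # should include index 42
```

The 0.05/m threshold illustrates the conservatism the abstract criticises: with very large m it becomes extremely stringent, which is what the modified correction of the RMLM relaxes.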

Relevance:

30.00%

Publisher:

Abstract:

The impact of extreme sea ice initial conditions on modelled climate is analysed for a fully coupled atmosphere-ocean-sea ice general circulation model, the Hadley Centre climate model HadCM3. A control run is chosen as the reference experiment, with greenhouse gas concentrations fixed at preindustrial levels. Sensitivity experiments show an almost complete recovery from total removal or a strong increase of sea ice after four years. Thus, uncertainties in initial sea ice conditions seem to be unimportant for climate modelling on decadal or longer time scales. When the initial conditions of the ocean mixed layer were adjusted to ice-free conditions, a few substantial differences remained for more than 15 model years. But these differences are clearly smaller than the uncertainty spanned by the HadCM3 run and the 19 other IPCC Fourth Assessment Report climate model preindustrial runs. Improving the simulation of past sea ice variability in climate models remains an important task for enabling reliable projections for the 21st century.

Relevance:

30.00%

Publisher:

Abstract:

Field observations of new particle formation and the subsequent particle growth are typically only possible at a fixed measurement location, and hence do not follow the temporal evolution of an air parcel in a Lagrangian sense. Standard analysis for determining formation and growth rates requires that the time-dependent formation rate and growth rate of the particles are spatially invariant; air parcel advection means that the observed temporal evolution of the particle size distribution at a fixed measurement location may not represent the true evolution if there are spatial variations in the formation and growth rates. Here we present a zero-dimensional aerosol box model coupled with one-dimensional atmospheric flow to describe the impact of advection on the evolution of simulated new particle formation events. Wind speed, particle formation rates and growth rates are input parameters that can vary as a function of time and location, using wind speed to connect location to time. The output simulates measurements at a fixed location; formation and growth rates of the particle mode can then be calculated from the simulated observations at a stationary point for different scenarios and be compared with the ‘true’ input parameters. Hence, we can investigate how spatial variations in the formation and growth rates of new particles would appear in observations of particle number size distributions at a fixed measurement site. We show that the particle size distribution and growth rate at a fixed location is dependent on the formation and growth parameters upwind, even if local conditions do not vary. We also show that different input parameters used may result in very similar simulated measurements. Erroneous interpretation of observations in terms of particle formation and growth rates, and the time span and areal extent of new particle formation, is possible if the spatial effects are not accounted for.
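A minimal sketch of the coupling idea, under crude assumptions: particles form at an initial time in an upwind region, grow at a constant rate while advecting with a uniform wind, and a fixed site samples whichever air parcel is overhead, so position maps to formation time. The formation region, wind speed and growth rate are illustrative, not the paper's aerosol microphysics.

```python
import numpy as np

wind = 5.0                                # uniform wind speed (m/s)
site_x = 50_000.0                         # fixed measurement site (m)
times = np.arange(0, 6 * 3600, 600.0)     # sampling times since formation (s)

def parcel_diameter(x0, t):
    """Diameter (nm) at age t of the parcel that started at x0."""
    formed = x0 < 20_000.0                # formation only in an upwind region
    growth = 3.0 / 3600.0                 # assumed growth rate (~3 nm per hour)
    return (1.5 + growth * t) if formed else np.nan

# The parcel overhead at time t started at x0 = site_x - wind * t.
observed = [parcel_diameter(site_x - wind * t, t) for t in times]
print(np.round(observed, 1))              # apparent event as seen at the fixed site
```

Even in this toy case the site sees nothing until parcels from the formation region arrive, then an apparently growing mode; with spatially varying formation and growth rates, the apparent rates at the site can differ from the true input values, which is the ambiguity the abstract describes.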