916 results for Multilevel Linear Models


Relevance: 30.00%

Abstract:

This paper introduces a new blind equalisation algorithm for pulse amplitude modulation (PAM) data transmitted through nonminimum phase (NMP) channels. The algorithm is based on a noncausal AR model of the communication channel and on the second- and fourth-order cumulants of the received data series, of which only the diagonal slices are used. The AR parameters are adjusted at each sample by a successive over-relaxation (SOR) scheme, a variant of the ordinary LMS scheme with a faster convergence rate and greater robustness to the choice of step size. Computer simulations are carried out for both linear time-invariant (LTI) and linear time-variant (LTV) NMP channels, and the results show that the proposed algorithm converges quickly and has the potential to track LTV NMP channels.
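
To make the flavour of a per-sample SOR update concrete, the sketch below tracks the coefficients of a toy AR(2) series by applying one over-relaxed Gauss-Seidel sweep to recursively estimated normal equations at each sample. It is a minimal illustration only: the paper's cumulant-slice statistics, the blind (noncausal) channel setting and the equaliser itself are not reproduced, and all function names and parameter values (lam, omega) are invented for the example.

import numpy as np

def adapt_ar_sor(x, p=2, lam=0.99, omega=1.2):
    """Per-sample AR(p) tracking: recursive normal equations plus one SOR sweep.
    Illustrative only; not the cumulant-based blind algorithm of the paper."""
    R = np.eye(p) * 1e-3          # running (regularised) autocorrelation matrix
    r = np.zeros(p)               # running cross-correlation vector
    a = np.zeros(p)               # current AR coefficient estimate
    for n in range(p, len(x)):
        phi = x[n - p:n][::-1]    # regressor [x[n-1], ..., x[n-p]]
        R = lam * R + np.outer(phi, phi)
        r = lam * r + x[n] * phi
        for i in range(p):        # one over-relaxed Gauss-Seidel (SOR) sweep on R a = r
            sigma = R[i] @ a - R[i, i] * a[i]
            a[i] = (1.0 - omega) * a[i] + omega * (r[i] - sigma) / R[i, i]
    return a

# toy usage on a synthetic AR(2) series driven by white noise
rng = np.random.default_rng(0)
x = np.zeros(5000)
for n in range(2, len(x)):
    x[n] = 0.6 * x[n - 1] - 0.3 * x[n - 2] + rng.standard_normal()
print("estimated AR coefficients:", adapt_ar_sor(x).round(2))   # approximately [0.6, -0.3]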

Relevance: 30.00%

Abstract:

This paper analyzes the use of linear and neural network models for financial distress classification, with emphasis on input variable selection and model pruning. A data-driven method for selecting input variables (financial ratios, in this case) is proposed. A case study involving 60 British firms in the period 1997-2000 is used for illustration. It is shown that Optimal Brain Damage pruning can considerably improve the generalization ability of a neural model. Moreover, the set of financial ratios obtained with the proposed selection procedure is shown to be an appropriate alternative to the ratios usually employed by practitioners.
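
As a rough illustration of what Optimal Brain Damage does, the sketch below computes the standard OBD saliencies from a weight vector and a diagonal Hessian approximation and zeroes the least salient connections. The weights, the Hessian estimate and the 20% pruning fraction are synthetic placeholders; in practice the saliencies come from a trained network and pruning is followed by retraining.

import numpy as np

def obd_prune(weights, hessian_diag, prune_frac=0.2):
    # OBD saliency: estimated increase in loss from removing weight k,
    # s_k = 0.5 * H_kk * w_k**2 (second-order Taylor expansion, diagonal Hessian)
    saliency = 0.5 * hessian_diag * weights ** 2
    n_prune = int(prune_frac * weights.size)
    prune_idx = np.argsort(saliency)[:n_prune]   # least salient weights
    pruned = weights.copy()
    pruned[prune_idx] = 0.0                      # remove them from the network
    return pruned, prune_idx

# toy usage with random numbers standing in for a trained model
rng = np.random.default_rng(1)
w = rng.normal(size=100)
h = rng.uniform(0.1, 1.0, size=100)              # stand-in diagonal Hessian estimate
w_pruned, removed = obd_prune(w, h)
print(f"zeroed {removed.size} of {w.size} weights")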

Relevance: 30.00%

Abstract:

A significant challenge in predicting climate change impacts on ecosystems and biodiversity is quantifying the sources of uncertainty that emerge within and between different models. Statistical species niche models have grown in popularity, yet no single best technique has been identified, reflecting their differing performance in different situations. Our aim was to quantify the uncertainties associated with the application of two complementary modelling techniques. Generalised linear mixed models (GLMM) and generalised additive mixed models (GAMM) were used to model the realised niche of ombrotrophic Sphagnum species in British peatlands. These models were then used to predict changes in Sphagnum cover between 2020 and 2050 based on projections of climate change and atmospheric deposition of nitrogen and sulphur. Over 90% of the variation in the GLMM predictions was due to niche model parameter uncertainty, dropping to 14% for the GAMM. After covarying out other factors, average variation in predicted Sphagnum cover across UK peatlands was the next largest source of variation (8% for the GLMM and 86% for the GAMM). The better performance of the GAMM needs to be weighed against its tendency to overfit the training data. While our niche models are only a first approximation, we used them to undertake a preliminary evaluation of the relative importance of climate change and nitrogen and sulphur deposition, and of the geographic locations of the largest expected changes in Sphagnum cover. Predicted changes in cover were all small (generally <1% in an average 4 m2 unit area) but also highly uncertain. The peatlands expected to be most affected by climate change in combination with atmospheric pollution were Dartmoor, the Brecon Beacons and the western Lake District.
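
To show how niche model parameter uncertainty can be propagated into predictions, the sketch below fits a plain binomial GLM (with statsmodels) to synthetic presence/absence data, draws coefficient vectors from the fitted sampling distribution, and reports the spread this induces in predictions at new sites. It is a deliberately simplified stand-in: the paper uses GLMMs and GAMMs with random effects, and the covariates and data here are invented.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 400
X = sm.add_constant(np.column_stack([
    rng.normal(8, 2, n),      # e.g. mean annual temperature (degC), hypothetical
    rng.normal(15, 5, n),     # e.g. N deposition (kg N/ha/yr), hypothetical
]))
true_beta = np.array([0.5, 0.2, -0.15])
p = 1.0 / (1.0 + np.exp(-(X @ true_beta)))
y = rng.binomial(1, p)                     # synthetic presence/absence of Sphagnum

fit = sm.GLM(y, X, family=sm.families.Binomial()).fit()

# draw parameter vectors from the estimated sampling distribution and look at
# the spread of predictions this induces at new sites
beta_draws = rng.multivariate_normal(fit.params, fit.cov_params(), size=1000)
X_new = sm.add_constant(np.column_stack([rng.normal(8, 2, 50),
                                         rng.normal(15, 5, 50)]))
pred = 1.0 / (1.0 + np.exp(-(X_new @ beta_draws.T)))   # 50 sites x 1000 draws
print("mean predictive sd from parameter uncertainty:",
      pred.std(axis=1).mean().round(3))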

Relevance: 30.00%

Abstract:

New ways of combining observations with numerical models are discussed in which the size of the state space can be very large and the model can be highly nonlinear. The observations of the system can also be related to the model variables in highly nonlinear ways, making this data-assimilation (or inverse) problem highly nonlinear. First we discuss the connection between data assimilation and inverse problems, including regularization. We then explore the choice of proposal density in a particle filter and show how the 'curse of dimensionality' might be beaten. In the standard particle filter, ensembles of model runs are propagated forward in time until observations are encountered, rendering it a pure Monte Carlo method. In large-dimensional systems this is very inefficient, and very large numbers of model runs are needed to solve the data-assimilation problem realistically. In our approach we steer all model runs towards the observations, resulting in a much more efficient method. By further ensuring almost equal weights, we avoid performing model runs that turn out to be useless. Results are shown for the 40- and 1000-dimensional Lorenz 1995 model.
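
The sketch below illustrates the idea of steering model runs towards upcoming observations on the 40-variable Lorenz 1995 (Lorenz-96) model: each particle is propagated with a simple relaxation (nudging) term towards the next observation before being weighted and resampled. It is a toy version only; the paper's proposal-density formulation, the associated weight correction and the almost-equal-weight construction are not reproduced, and the noise levels and nudging strength tau are illustrative choices.

import numpy as np

def lorenz96(x, F=8.0):
    # dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F  (cyclic indices)
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def step(x, dt=0.01):
    # fourth-order Runge-Kutta step
    k1 = lorenz96(x)
    k2 = lorenz96(x + 0.5 * dt * k1)
    k3 = lorenz96(x + 0.5 * dt * k2)
    k4 = lorenz96(x + dt * k3)
    return x + dt / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

rng = np.random.default_rng(3)
n, n_part, obs_err, tau = 40, 100, 1.0, 0.1
truth = rng.normal(size=n)
particles = truth + rng.normal(size=(n_part, n))

for cycle in range(20):
    for _ in range(10):                         # advance the truth over one window
        truth = step(truth)
    obs = truth + rng.normal(0.0, obs_err, n)   # observation at the end of the window
    for i in range(n_part):
        for _ in range(10):                     # steer each particle towards that observation
            particles[i] = step(particles[i]) + tau * (obs - particles[i])
    # importance weights from the observation likelihood (the proposal-density
    # correction required for a rigorous filter is omitted here for brevity)
    d2 = ((particles - obs) ** 2).sum(axis=1)
    w = np.exp(-0.5 * (d2 - d2.min()) / obs_err ** 2)
    w /= w.sum()
    particles = particles[rng.choice(n_part, n_part, p=w)]
    rmse = np.sqrt(((particles.mean(axis=0) - truth) ** 2).mean())
    print(f"cycle {cycle}: ensemble-mean RMSE = {rmse:.2f}")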

Relevance: 30.00%

Abstract:

Integrated simulation models can be useful tools in farming systems research. This chapter reviews three commonly used approaches: linear programming, system dynamics and agent-based models. Applications of each approach are presented and their strengths and drawbacks discussed. We argue that, despite some challenges, mainly related to the integration of different approaches, model validation and the representation of human agents, integrated simulation models contribute important insights to the analysis of farming systems. They help unravel the complex and dynamic interactions and feedbacks among biophysical, socio-economic and institutional components across scales and levels in farming systems. In addition, they can provide a platform for integrative research and can support transdisciplinary research by functioning as learning platforms in participatory processes.
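
As a concrete example of the first of these approaches, the sketch below sets up a toy farm-planning linear programme with scipy: choose the crop areas that maximise gross margin subject to land and labour limits. The two activities, the margins and the resource limits are invented for illustration.

import numpy as np
from scipy.optimize import linprog

# decision variables: hectares of wheat and of potatoes (hypothetical activities)
gross_margin = np.array([600.0, 1500.0])     # EUR per ha, illustrative values
# constraints: total land <= 50 ha; labour of 20 h/ha and 80 h/ha within 2400 h
A_ub = [[1.0, 1.0],
        [20.0, 80.0]]
b_ub = [50.0, 2400.0]

# linprog minimises, so negate the gross margins to maximise them
res = linprog(c=-gross_margin, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 2)
print("optimal plan (ha):", res.x.round(1), " gross margin:", -res.fun)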

Relevance: 30.00%

Abstract:

Trends in the position of the DJF Austral jet have been analysed for multi-model ensemble simulations of a subset of high- and low-top models for the periods 1960-2000, 2000-2050, and 2050-2098 under the CMIP5 historical, RCP4.5, and RCP8.5 scenarios. Comparison with ERA-Interim, CFSR and the NCEP/NCAR reanalysis shows that the DJF and annual mean jet positions in CMIP5 models are equatorward of reanalyses for the 1979-2006 mean. Under the RCP8.5 scenario, the mean jet position in the high-top models moves 3 degrees poleward of its 1860-1900 position by 2098, compared to just over 2 degrees for the low-top models. Changes in jet position are linked to changes in the meridional temperature gradient. Compared to low-top models, the high-top models predict greater warming in the tropical upper troposphere due to increased greenhouse gases for all periods considered: up to 0.28 K/decade more in the period 2050-2098 under the RCP8.5 scenario. Larger polar lower-stratospheric cooling is seen in high-top models: -1.64 K/decade compared to -1.40 K/decade in the period 1960-2000, mainly in response to ozone depletion, and -0.41 K/decade compared to -0.12 K/decade in the period 2050-2098, mainly in response to increases in greenhouse gases. Analysis suggests that there may be a linear relationship between the trend in jet position and meridional temperature gradient, even under strong forcing. There were no clear indications of an approach to a geometric limit on the absolute magnitude of the poleward shift by 2100.
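
For readers unfamiliar with the diagnostic, the sketch below shows one common way to define the jet position and its trend: the latitude of the maximum zonal-mean zonal wind at 850 hPa, refined by a quadratic fit around the gridded maximum and regressed against time. The synthetic wind field stands in for model or reanalysis output; this is not claimed to be the paper's exact procedure.

import numpy as np

rng = np.random.default_rng(4)
lats = np.arange(-70.0, -29.0, 1.0)          # southern-hemisphere latitudes
years = np.arange(1960, 2001)

# synthetic zonal-mean zonal wind: a Gaussian jet that drifts slowly poleward
jet_centre = -50.0 - 0.02 * (years - years[0]) + rng.normal(0, 0.5, years.size)
u850 = 10.0 * np.exp(-((lats[None, :] - jet_centre[:, None]) / 10.0) ** 2)

def jet_latitude(u, lats):
    # refine the gridded maximum with a quadratic fit around it
    i = np.clip(np.argmax(u), 1, lats.size - 2)
    a, b, _ = np.polyfit(lats[i - 1:i + 2], u[i - 1:i + 2], 2)
    return -b / (2.0 * a)

jet_lat = np.array([jet_latitude(u, lats) for u in u850])
trend = np.polyfit(years, jet_lat, 1)[0] * 10.0
print(f"jet position trend: {trend:+.2f} degrees latitude per decade")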

Relevance: 30.00%

Abstract:

The estimation of the long-term wind resource at a prospective site based on a relatively short on-site measurement campaign is an indispensable task in the development of a commercial wind farm. The typical industry approach is based on the measure-correlate-predict (MCP) method, where a relational model between the site wind velocity data and the data obtained from a suitable reference site is built from concurrent records. In a subsequent step, a long-term prediction for the prospective site is obtained from a combination of the relational model and the historic reference data. In the present paper, a systematic study is presented where three new MCP models, together with two published reference models (a simple linear regression and the variance ratio method), have been evaluated based on concurrent synthetic wind speed time series for two sites, simulating the prospective and the reference site. The synthetic method has the advantage of generating time series with the desired statistical properties, including Weibull scale and shape factors, required to evaluate the five methods under all plausible conditions. In this work, first a systematic discussion of the statistical fundamentals behind MCP methods is provided, and three new models, one based on a nonlinear regression and two (termed kernel methods) derived from the use of conditional probability density functions, are proposed. All models are evaluated by using five metrics under a wide range of values of the correlation coefficient, the Weibull scale, and the Weibull shape factor. Only one of the models, a kernel method based on bivariate Weibull probability functions, is capable of accurately predicting all performance metrics studied.
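
The variance ratio method mentioned above admits a very short implementation: the regression slope is replaced by the ratio of the standard deviations of the concurrent site and reference data, so that the predicted series reproduces both the mean and the variance of the site measurements. The sketch below uses synthetic series as placeholders for measured data.

import numpy as np

rng = np.random.default_rng(5)
ref_long = rng.weibull(2.0, 20000) * 8.0                       # long-term reference record
ref_conc = ref_long[:2000]                                     # concurrent reference period
site_conc = 0.9 * ref_conc + rng.normal(0, 0.8, 2000)          # concurrent site measurements

def variance_ratio_mcp(site, ref, ref_longterm):
    # slope is the ratio of standard deviations, intercept matches the means
    slope = site.std() / ref.std()
    intercept = site.mean() - slope * ref.mean()
    return intercept + slope * ref_longterm

site_long = variance_ratio_mcp(site_conc, ref_conc, ref_long)
print("predicted long-term mean wind speed at the site:", site_long.mean().round(2))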

Relevance: 30.00%

Abstract:

The traditional Mediterranean diet is thought to represent a healthy lifestyle, especially given that the incidence of several cancers, including colorectal cancer, is lower in Mediterranean countries than in Northern Europe. Olive oil, a central component of the Mediterranean diet, is believed to beneficially affect numerous biological processes. We used phenols extracted from virgin olive oil in a series of in vitro systems that model important stages of colon carcinogenesis. The effect of the extract on DNA damage induced by hydrogen peroxide was measured in HT29 cells using single-cell microgel electrophoresis. A significant anti-genotoxic linear trend (p=0.011) was observed when HT29 cells were pre-incubated with olive oil phenols (0, 5, 10, 25, 50, 75, 100 microg/ml) for 24 hr and then challenged with hydrogen peroxide. The olive oil phenols (50, 100 microg/ml) significantly (p=0.004, p=0.002) improved the barrier function of Caco-2 cells after 48 hr, as measured by trans-epithelial resistance. Significant inhibition of HT115 invasion (p<0.01) was observed at olive oil phenol concentrations of 25, 50, 75 and 100 microg/ml using the Matrigel invasion assay. No effect was observed on HT115 viability over the concentration range 0, 25, 50, 75, 100 microg/ml after 24 hr, although 75 and 100 microg/ml olive oil phenols significantly inhibited HT115 cell attachment (p=0.011, p=0.006). Olive oil phenols had no significant effect on metastasis-related gene expression in HT115 cells. We have demonstrated that phenols extracted from virgin olive oil are capable of inhibiting several stages of colon carcinogenesis in vitro.

Relevance: 30.00%

Abstract:

We consider the relation between so-called continuous localization models (i.e. non-linear stochastic Schrödinger evolutions) and the discrete GRW model of wave function collapse. The former can be understood as a scaling limit of the GRW process. The proof relies on a stochastic Trotter formula, which is of interest in its own right. Our Trotter formula also allows us to complement results on the existence theory of stochastic Schrödinger evolutions by Holevo and Mora/Rebolledo.
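
For orientation, the discrete GRW model referred to above prescribes random localization events ("hits") occurring at Poisson-distributed times with rate lambda per particle; in its standard textbook form a single hit on particle i, with localization width r_C, acts as follows (this is the usual formulation, not a statement of the paper's scaling-limit result):

\[
  \psi \;\longrightarrow\; \frac{L_x \psi}{\lVert L_x \psi \rVert},
  \qquad
  L_x = \left(\pi r_C^2\right)^{-3/4}
        \exp\!\left(-\frac{(\hat q_i - x)^2}{2 r_C^2}\right),
\]
where the collapse centre $x$ is random with probability density
\[
  p(x) = \lVert L_x \psi \rVert^2 .
\]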

Relevance: 30.00%

Abstract:

Logistic models are studied as a tool to convert dynamical forecast information (deterministic and ensemble) into probability forecasts. A logistic model is obtained by setting the logarithmic odds ratio equal to a linear combination of the inputs. As with any statistical model, logistic models will suffer from overfitting if the number of inputs is comparable to the number of forecast instances. Computational approaches to avoid overfitting by regularization are discussed, and efficient techniques for model assessment and selection are presented. A logit version of the lasso (originally a linear regression technique) is discussed. In lasso models, less important inputs are identified and the corresponding coefficients are set to zero, providing an efficient and automatic model reduction procedure. For the same reason, lasso models are particularly appealing for diagnostic purposes.
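
A minimal sketch of such a logit lasso, using scikit-learn's L1-penalised logistic regression with a cross-validated regularisation strength, is given below. The "ensemble mean" and "ensemble spread" predictors and the extra noise inputs are synthetic placeholders for real forecast data.

import numpy as np
from sklearn.linear_model import LogisticRegressionCV

rng = np.random.default_rng(6)
n = 500
ens_mean = rng.normal(size=n)                          # stand-in ensemble-mean forecast
ens_spread = np.abs(rng.normal(1.0, 0.3, n))           # stand-in ensemble spread
noise = rng.normal(size=(n, 5))                        # unrelated candidate inputs
X = np.column_stack([ens_mean, ens_spread, noise])
event = rng.binomial(1, 1.0 / (1.0 + np.exp(-2.0 * ens_mean)))   # observed binary event

# L1 penalty: log-odds = linear combination of inputs, unimportant coefficients -> 0
model = LogisticRegressionCV(penalty="l1", solver="liblinear", Cs=10, cv=5)
model.fit(X, event)
print("coefficients:", model.coef_.round(2))           # most noise terms shrink to zero
print("forecast probabilities:", model.predict_proba(X[:3])[:, 1].round(2))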

Relevance: 30.00%

Abstract:

Human-made transformations of the environment, and in particular of the land surface, are having a large impact on the distribution (in both time and space) of rainfall, upon which all life is reliant. Focusing on precipitation, soil moisture and near-surface temperature, we compare data from Phase 5 of the Coupled Model Intercomparison Project (CMIP5), as well as blended observational–satellite data, to see how the interaction between rainfall and the land surface differs (or agrees) between the models and reality at daily timescales. As expected, the results suggest a strong positive relationship between precipitation and soil moisture when precipitation leads and is concurrent with soil moisture estimates, for the tropics as a whole. Conversely, a negative relationship is shown when soil moisture leads rainfall by a day or more. A weak positive relationship between precipitation and temperature is shown when either leads by one day, whereas a weak negative relationship is shown over the same period between soil moisture and temperature. Temporally, in terms of lag and lead relationships, the models appear to agree on the overall patterns of correlation between rainfall and soil moisture. However, in terms of spatial patterns, a comparison across all available models reveals considerable variability in their ability to reproduce the correlations between precipitation and soil moisture. There is also a difference in the timing of the correlations, with some models showing the highest positive correlations when precipitation leads soil moisture by one day. Finally, the results suggest that there are 'hotspots' of high linear gradients between precipitation and soil moisture, corresponding to regions experiencing heavy rainfall. These results point to an inability of the CMIP5 models to simulate a positive feedback between soil moisture and precipitation at daily timescales. Longer-timescale comparisons and experiments at higher spatial resolutions, where the impact of the spatial heterogeneity of rainfall on the initiation of convection and the supply of moisture is included, would be expected to improve process understanding further.
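
The lead-lag analysis described above boils down to correlations of the kind sketched below, where daily precipitation and soil moisture series are correlated at a range of leads and lags. The synthetic series (a gamma-distributed rainfall record and a simple bucket model for soil moisture) are placeholders for CMIP5 or observational data.

import numpy as np

rng = np.random.default_rng(7)
ndays = 3000
precip = rng.gamma(shape=0.8, scale=3.0, size=ndays)          # synthetic daily rainfall
soil = np.zeros(ndays)
for t in range(1, ndays):                                     # simple bucket model
    soil[t] = 0.95 * soil[t - 1] + 0.05 * precip[t]

def lag_correlation(x, y, lag):
    """corr(x[t], y[t + lag]); positive lag means x leads y."""
    if lag >= 0:
        return np.corrcoef(x[:ndays - lag], y[lag:])[0, 1]
    return np.corrcoef(x[-lag:], y[:ndays + lag])[0, 1]

for lag in range(-3, 4):
    r = lag_correlation(precip, soil, lag)
    print(f"precipitation leads soil moisture by {lag:+d} day(s): r = {r:.2f}")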

Relevance: 30.00%

Abstract:

A method to solve a quasi-geostrophic two-layer model including the variation of static stability is presented. The divergent part of the wind is incorporated by means of an iterative procedure. The procedure is rather fast: the computation time is only 60-70% longer than for the usual two-layer model. The method of solution is justified by the conservation of the difference between the gross static stability and the kinetic energy. To eliminate side-boundary conditions, the experiments have been performed on a zonal channel model. The investigation falls into three main parts. The first part (section 5) discusses the significance of some physically inconsistent approximations. It is shown that the physical inconsistencies are rather serious: for the inconsistent models studied, the total kinetic energy increased faster than the gross static stability. The next part (section 6) studies the effect of a Jacobian difference operator that conserves the total kinetic energy. The use of this operator in two-layer models gives a slight improvement but probably has no practical use in short-period forecasts. It is also shown that the energy-conservative operator changes the wave speed erroneously if the wave number or the grid length is large in the meridional direction. The final part (section 7) investigates the behaviour of baroclinic waves for several different initial states and for two energy-consistent models, one with constant and one with variable static stability. According to linear theory, the waves adjust rather rapidly so that the temperature wave lags behind the pressure wave regardless of the initial configuration. Thus both models give rise to a baroclinic development even if the initial state is quasi-barotropic. The effect of the variation of static stability is very small; qualitative differences in the development are observed only during the first 12 hours. For an amplifying wave there is a stabilization over the troughs and a destabilization over the ridges.
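
The best-known example of a kinetic-energy-conserving Jacobian difference operator of the kind discussed in section 6 is Arakawa's (1966) nine-point form, sketched below for a doubly periodic grid. It is shown for orientation only and is not claimed to be the exact operator used in this study.

import numpy as np

def arakawa_jacobian(p, q, d):
    """Arakawa's nine-point approximation to J(p, q) = p_x q_y - p_y q_x
    on a doubly periodic grid of spacing d (axis 0 = x, axis 1 = y)."""
    def sh(a, i, j):                      # field shifted by i cells in x and j in y
        return np.roll(np.roll(a, -i, axis=0), -j, axis=1)

    j1 = ((sh(p, 1, 0) - sh(p, -1, 0)) * (sh(q, 0, 1) - sh(q, 0, -1))
          - (sh(p, 0, 1) - sh(p, 0, -1)) * (sh(q, 1, 0) - sh(q, -1, 0)))
    j2 = (sh(p, 1, 0) * (sh(q, 1, 1) - sh(q, 1, -1))
          - sh(p, -1, 0) * (sh(q, -1, 1) - sh(q, -1, -1))
          - sh(p, 0, 1) * (sh(q, 1, 1) - sh(q, -1, 1))
          + sh(p, 0, -1) * (sh(q, 1, -1) - sh(q, -1, -1)))
    j3 = (sh(q, 0, 1) * (sh(p, 1, 1) - sh(p, -1, 1))
          - sh(q, 0, -1) * (sh(p, 1, -1) - sh(p, -1, -1))
          - sh(q, 1, 0) * (sh(p, 1, 1) - sh(p, 1, -1))
          + sh(q, -1, 0) * (sh(p, -1, 1) - sh(p, -1, -1)))
    # averaging the three second-order forms gives the conservative combination
    return (j1 + j2 + j3) / (12.0 * d * d)

# the discrete analogues of integral(psi * J) = integral(zeta * J) = 0 should hold
# up to round-off; these identities underlie energy and enstrophy conservation
rng = np.random.default_rng(8)
psi, zeta = rng.normal(size=(64, 64)), rng.normal(size=(64, 64))
J = arakawa_jacobian(psi, zeta, 1.0)
print("sum(psi*J) =", (psi * J).sum(), "  sum(zeta*J) =", (zeta * J).sum())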

Relevance: 30.00%

Abstract:

Neural field models describe the coarse-grained activity of populations of interacting neurons. Because of the laminar structure of real cortical tissue, they are often studied in two spatial dimensions, where they are well known to generate rich patterns of spatiotemporal activity. Such patterns have been interpreted in a variety of contexts, ranging from the understanding of visual hallucinations to the generation of electroencephalographic signals. Typical patterns include localized solutions in the form of traveling spots, as well as intricate labyrinthine structures. These patterns are naturally defined by the interface between low and high states of neural activity. Here we derive the equations of motion for such interfaces and show, for a Heaviside firing rate, that the normal velocity of an interface is given in terms of a nonlocal Biot-Savart-type interaction over the boundaries of the high-activity regions. This exact, but dimensionally reduced, system of equations is solved numerically and shown to be in excellent agreement with the full nonlinear integral equation defining the neural field. We develop a linear stability analysis for the interface dynamics that allows us to understand the mechanisms of pattern formation arising from instabilities of spots, rings, stripes and fronts. We further show how to analyze neural field models with linear adaptation currents, and determine the conditions for the dynamic instability of spots that can give rise to breathers and traveling waves.
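
A minimal sketch of the underlying full model (a planar neural field of Amari type with a Heaviside firing rate, u_t = -u + w * H(u - h), where * denotes spatial convolution) is given below, solved by Euler stepping with an FFT-based convolution on a periodic square. The Mexican-hat kernel, threshold and initial condition are illustrative choices; the interface equations and the Biot-Savart reduction of the paper are not implemented here.

import numpy as np

n, L, h, dt = 128, 20.0, 0.1, 0.05
x = np.linspace(-L / 2, L / 2, n, endpoint=False)
X, Y = np.meshgrid(x, x)
r = np.sqrt(X ** 2 + Y ** 2)

# "Mexican hat" connectivity: local excitation, broader inhibition
w = np.exp(-r ** 2) - 0.25 * np.exp(-r ** 2 / 4.0)
w_hat = np.fft.fft2(np.fft.ifftshift(w)) * (L / n) ** 2   # kernel transform times cell area

u = -0.2 + np.exp(-(X ** 2 + Y ** 2))                     # localized initial bump
for _ in range(400):
    firing = (u > h).astype(float)                        # Heaviside firing rate H(u - h)
    conv = np.real(np.fft.ifft2(w_hat * np.fft.fft2(firing)))
    u += dt * (-u + conv)                                 # Euler update of the field

print("active area (fraction of domain):", (u > h).mean().round(3))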

Relevance: 30.00%

Abstract:

Both historical and idealized climate model experiments are performed with a variety of Earth system models of intermediate complexity (EMICs) as part of a community contribution to the Intergovernmental Panel on Climate Change Fifth Assessment Report. Historical simulations start at 850 CE and continue through to 2005. The standard simulations include changes in forcing from solar luminosity, Earth's orbital configuration, CO2, additional greenhouse gases, land use, and sulphate and volcanic aerosols. In spite of very different modelled pre-industrial global surface air temperatures, the overall 20th century trends in surface air temperature and carbon uptake are reasonably well simulated when compared to observed trends. Land carbon fluxes show much more variation between models than ocean carbon fluxes, and recent land fluxes appear to be slightly underestimated. It is possible that recent modelled climate trends or climate–carbon feedbacks are overestimated, resulting in too much land carbon loss, or that carbon uptake due to CO2 and/or nitrogen fertilization is underestimated. Several 1000-year-long, idealized 2× and 4× CO2 experiments are used to quantify standard model characteristics, including transient and equilibrium climate sensitivities and climate–carbon feedbacks. The values from EMICs generally fall within the range given by general circulation models. Seven additional historical simulations, each including a single specified forcing, are used to assess the contributions of different climate forcings to the overall climate and carbon cycle response. The surface air temperature response is the linear sum of the responses to the individual forcings, while the carbon cycle response shows a non-linear interaction between land-use change and CO2 forcing for some models. Finally, the pre-industrial portions of the last millennium simulations are used to assess historical carbon–climate feedbacks in the models. Given the specified forcing, there is a tendency for the EMICs to underestimate the drop in surface air temperature and CO2 between the Medieval Climate Anomaly and the Little Ice Age estimated from palaeoclimate reconstructions. This could in turn be a result of unforced variability within the climate system, uncertainty in the reconstructions of temperature and CO2, errors in the reconstructions of the forcing used to drive the models, or the incomplete representation of certain processes within the models. Given the forcing datasets used in this study, the models calculate significant land-use emissions over the pre-industrial period. This implies that land-use emissions might need to be taken into account when making estimates of climate–carbon feedbacks from palaeoclimate reconstructions.

Relevance: 30.00%

Abstract:

This paper aims to understand the physical processes causing the large spread in the storm track projections of the CMIP5 climate models. In particular, the relationship between the climate change responses of the storm tracks, as measured by the 2–6 day mean sea level pressure variance, and the equator-to-pole temperature differences at upper- and lower-tropospheric levels is investigated. In the southern hemisphere the responses of the upper- and lower-tropospheric temperature differences are correlated across the models, and as a result they share similar associations with the storm track responses. There are large regions in which the storm track responses are correlated with the temperature difference responses, and a simple linear regression model based on the temperature differences at either level captures the spatial pattern of the mean storm track response as well as explaining between 30 and 60% of the inter-model variance of the storm track responses. In the northern hemisphere the responses of the two temperature differences are not significantly correlated, and their associations with the storm track responses are more complicated. In summer, the responses of the lower-tropospheric temperature differences dominate the inter-model spread of the storm track responses. In winter, the responses of the upper- and lower-tropospheric temperature differences both play a role. The results suggest that there is potential to reduce the spread in storm track responses by constraining the relative magnitudes of the warming in the tropical and polar regions.
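
The regression step described above amounts to a fit across models of the kind sketched below, where each model's storm track response is regressed on its upper- and lower-tropospheric temperature-difference responses and the explained inter-model variance is reported. The per-model numbers are synthetic placeholders, not CMIP5 diagnostics.

import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(9)
n_models = 30
dT_upper = rng.normal(4.0, 1.0, n_models)     # upper-tropospheric gradient response (K), synthetic
dT_lower = rng.normal(-1.0, 0.8, n_models)    # lower-tropospheric gradient response (K), synthetic
storm_resp = 0.6 * dT_upper + 0.4 * dT_lower + rng.normal(0, 0.8, n_models)

X = np.column_stack([dT_upper, dT_lower])
reg = LinearRegression().fit(X, storm_resp)
print("regression coefficients:", reg.coef_.round(2))
print("inter-model variance explained (R^2):", reg.score(X, storm_resp).round(2))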