939 results for General linear models


Relevance: 90.00%

Publisher:

Abstract:

Lately, several researchers have pointed out that climate change is expected to increase temperatures and lower rainfall in Mediterranean regions, while simultaneously increasing the intensity of extreme rainfall events. These changes could have consequences for the rainfall regime, erosion, sediment transport and water quality, soil management, and new designs of diversion ditches. Climate change is expected to result in increasingly unpredictable and variable rainfall, in amount and timing, changing seasonal patterns and increasing the frequency of extreme weather events. Consequently, the evolution of the frequency and intensity of drought periods is of utmost importance, as many processes in agro-ecosystems will be affected by them. Recognising the complex and important consequences of an increasing frequency of extreme droughts in the Ebro River basin, our aim is to study the evolution of drought events at this site statistically, with emphasis on their occurrence and intensity. For this purpose, fourteen meteorological stations were selected, based on the length of their rainfall series and the climatic classification, to obtain a representative untreated dataset for the river basin. Daily rainfall series from 1957 to 2002 were obtained from each meteorological station, and the frequency of no-rain periods, defined as the number of consecutive dry days, was extracted. Based on these data, we study changes in the probability distribution over several sub-periods. Moreover, we used the Standardized Precipitation Index (SPI) to identify drought events on an annual scale, and then fitted log-linear models to the contingency tables relating the SPI drought classes to the sub-periods; this fit is assessed with the help of ANOVA-type inference.
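For illustration, the log-linear modelling step described above can be sketched with standard GLM tooling; the contingency counts, drought classes, and the two sub-periods below are hypothetical stand-ins, not the Ebro basin data.

```python
# Minimal sketch: log-linear (Poisson) model for a contingency table of
# SPI drought classes versus sub-periods. The counts below are illustrative,
# not the Ebro basin data.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

table = pd.DataFrame({
    "spi_class":  ["moderate", "severe", "extreme"] * 2,
    "sub_period": ["1957-1979"] * 3 + ["1980-2002"] * 3,
    "count":      [42, 17, 5, 48, 23, 9],   # hypothetical frequencies
})

# Independence model: main effects only; adding an interaction term would
# capture a change in the drought-class distribution across sub-periods.
model = smf.glm("count ~ C(spi_class) + C(sub_period)",
                data=table, family=sm.families.Poisson()).fit()
print(model.summary())
print("Residual deviance:", model.deviance)  # basis for an ANOVA-type test
```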

Relevance: 90.00%

Publisher:

Abstract:

Two experiments were conducted to estimate the standardized ileal digestible (SID) Trp:Lys ratio requirement for growth performance of nursery pigs. Experimental diets were formulated to ensure that lysine was the second limiting AA throughout the experiments. In Exp. 1 (6 to 10 kg BW), 255 nursery pigs (PIC 327 × 1050, initially 6.3 ± 0.15 kg, mean ± SD) arranged in pens of 6 or 7 pigs were blocked by pen weight and assigned to experimental diets (7 pens/diet) consisting of SID Trp:Lys ratios of 14.7%, 16.5%, 18.4%, 20.3%, 22.1%, and 24.0% for 14 d with 1.30% SID Lys. In Exp. 2 (11 to 20 kg BW), 1,088 pigs (PIC 337 × 1050, initially 11.2 ± 1.35 kg BW, mean ± SD) arranged in pens of 24 to 27 pigs were blocked by average pig weight and assigned to experimental diets (6 pens/diet) consisting of SID Trp:Lys ratios of 14.5%, 16.5%, 18.0%, 19.5%, 21.0%, 22.5%, and 24.5% for 21 d with 30% dried distillers grains with solubles and 0.97% SID Lys. Each experiment was analyzed using general linear mixed models with heterogeneous residual variances. Competing heteroskedastic models included broken-line linear (BLL), broken-line quadratic (BLQ), and quadratic polynomial (QP). For each response, the best-fitting model was selected using the Bayesian information criterion. In Exp. 1 (6 to 10 kg BW), increasing SID Trp:Lys ratio linearly increased (P < 0.05) ADG and G:F. For ADG, the best-fitting model was a QP in which the maximum ADG was estimated at 23.9% (95% confidence interval [CI]: [<14.7%, >24.0%]) SID Trp:Lys ratio. For G:F, the best-fitting model was a BLL in which the maximum G:F was estimated at 20.4% (95% CI: [14.3%, 26.5%]) SID Trp:Lys. In Exp. 2 (11 to 20 kg BW), increasing SID Trp:Lys ratio increased (P < 0.05) ADG and G:F in a quadratic manner. For ADG, the best-fitting model was a QP in which the maximum ADG was estimated at 21.2% (95% CI: [20.5%, 21.9%]) SID Trp:Lys. For G:F, BLL and BLQ models had comparable fit and estimated SID Trp:Lys requirements at 16.6% (95% CI: [16.0%, 17.3%]) and 17.1% (95% CI: [16.6%, 17.7%]), respectively. In conclusion, the estimated SID Trp:Lys requirement in Exp. 1 ranged from 20.4% for maximum G:F to 23.9% for maximum ADG, whereas in Exp. 2 it ranged from 16.6% for maximum G:F to 21.2% for maximum ADG. These results suggest that standard NRC (2012) recommendations may underestimate the SID Trp:Lys requirement for nursery pigs from 11 to 20 kg BW.
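The dose-response comparison described above (broken-line versus polynomial models ranked by BIC) can be sketched as follows; the growth data, starting values, and simple BIC formula are simulated and illustrative, not the trial results or the heteroskedastic mixed-model fit used in the paper.

```python
# Sketch: fit a broken-line linear (BLL) and a quadratic polynomial (QP)
# model to ADG versus SID Trp:Lys ratio and compare them by BIC.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
trp_lys = np.repeat([14.5, 16.5, 18.0, 19.5, 21.0, 22.5, 24.5], 6)  # % of Lys
adg = 400 + 8 * np.minimum(trp_lys, 21.0) + rng.normal(0, 10, trp_lys.size)

def bll(x, plateau, slope, breakpoint):
    # Below the breakpoint the response rises linearly; above it, it plateaus.
    return plateau + slope * np.minimum(x - breakpoint, 0.0)

def qp(x, a, b, c):
    return a + b * x + c * x ** 2

def bic(y, yhat, n_params):
    n = y.size
    rss = np.sum((y - yhat) ** 2)
    return n * np.log(rss / n) + n_params * np.log(n)

p_bll, _ = curve_fit(bll, trp_lys, adg, p0=[570, 8, 20])
p_qp, _ = curve_fit(qp, trp_lys, adg)

print("BLL breakpoint (requirement estimate): %.1f%%" % p_bll[2])
print("BIC  BLL: %.1f   QP: %.1f" % (bic(adg, bll(trp_lys, *p_bll), 3),
                                     bic(adg, qp(trp_lys, *p_qp), 3)))
```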

Relevance: 90.00%

Publisher:

Abstract:

The visual responses of neurons in the cerebral cortex were first adequately characterized in the 1960s by D. H. Hubel and T. N. Wiesel [(1962) J. Physiol. (London) 160, 106-154; (1968) J. Physiol. (London) 195, 215-243] using qualitative analyses based on simple geometric visual targets. Over the past 30 years, it has become common to consider the properties of these neurons by attempting to make formal descriptions of the transformations they execute on the visual image. Most such models have their roots in linear-systems approaches pioneered in the retina by C. Enroth-Cugell and J. R. Robson [(1966) J. Physiol. (London) 187, 517-552], but it is clear that purely linear models of cortical neurons are inadequate. We present two related models: one designed to account for the responses of simple cells in primary visual cortex (V1) and one designed to account for the responses of pattern direction selective cells in MT (or V5), an extrastriate visual area thought to be involved in the analysis of visual motion. These models share a common structure that operates in the same way on different kinds of input, and instantiate the widely held view that computational strategies are similar throughout the cerebral cortex. Implementations of these models for Macintosh microcomputers are available and can be used to explore the models' properties.
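A minimal numerical sketch of the linear-nonlinear structure such models share is given below: an oriented, Gabor-like linear receptive field followed by a rectifying nonlinearity, applied to synthetic grating input. It illustrates the shared structure only; it is not the published V1 or MT model.

```python
# Minimal sketch of a linear-nonlinear simple-cell model: an oriented
# Gabor-like receptive field is applied linearly to an image patch and the
# output is half-wave rectified. Illustrative only.
import numpy as np

def gabor(size=32, wavelength=8.0, theta=0.0, sigma=5.0):
    half = size // 2
    y, x = np.mgrid[-half:half, -half:half]
    xr = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2))
    carrier = np.cos(2 * np.pi * xr / wavelength)
    return envelope * carrier

def simple_cell_response(image_patch, rf):
    drive = np.sum(image_patch * rf)       # linear stage
    return max(drive, 0.0)                 # half-wave rectification (nonlinear stage)

# Gratings at the preferred and orthogonal orientations.
rf = gabor(theta=0.0)
y, x = np.mgrid[-16:16, -16:16]
print("response to preferred grating:",
      simple_cell_response(np.cos(2 * np.pi * x / 8.0), rf))
print("response to orthogonal grating:",
      simple_cell_response(np.cos(2 * np.pi * y / 8.0), rf))
```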

Relevance: 90.00%

Publisher:

Abstract:

This paper describes how factor markets are represented in applied equilibrium models and how we plan to improve and extend the representation of factor markets in two specific models: MAGNET and ESIM. We do not argue that partial equilibrium models should become more ‘general’ in the sense of integrating all factor markets, but that the shift of agricultural income policies in the EU to decoupled payments linked to land necessitates the inclusion of land markets in policy-relevant modelling tools. To this end, this paper outlines options to integrate land markets in partial equilibrium models. A special feature of general equilibrium models is the inclusion of fully integrated factor markets in the system of equations describing the functioning of a single country or a group of countries. Thus, this paper focuses on the implementation and improved representation of agricultural factor markets (land, labour and capital) in computable general equilibrium (CGE) models. The paper reviews the representation of factor markets in currently applied CGE models and describes selected options to improve and extend the current factor market modelling in the MAGNET model, which also uses the results and empirical findings of our partners in this FP project.

Relevance: 90.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06

Relevance: 90.00%

Publisher:

Abstract:

Coral reefs are the most diverse marine ecosystem and encompass possibly millions of plant, animal and protist species. Mutualistic symbioses are a fundamental feature of coral reefs that have been used to explain their structure, biodiversity and existence. Complex inter-relationships between hosts, habitats and symbionts belie closely coupled nutrient and community dynamics that create the circumstances for something from nothing (or the oasis in a nutrient desert). The flip side of these dynamics is a close dependency between species, which results in a series of non-linear relationships as conditions change. These responses are being highlighted as anthropogenic influences increase across the world's tropical and subtropical coastlines. Caribbean as well as Indo-Pacific coral populations are now in serious decline in many parts of the world. This has resulted in a significant reorganization of how coral reef ecosystems function. Among the spectrum of changes brought about by humans is rapid climate change. Mass coral bleaching - the loss of the dinoflagellate symbionts from reef-building corals - and mortality have affected the world's coral reefs with increasing frequency and intensity since the late 1970s. Mass bleaching events, which often cover thousands of square kilometres of coral reefs, are triggered by small increases (+1-3 °C) in water temperature. These increases in sea temperature are often seen during warm-phase weather conditions (e.g. ENSO) and are increasing in size and magnitude. The loss of living coral cover (e.g. 16% globally in 1998, an exceptionally warm year) is resulting in an as yet unspecified reduction in the abundance of a myriad of other species. Projections from general circulation models (GCMs) used to project changes in global temperature indicate that conditions even under the mildest greenhouse gas emission scenarios may exceed the thermal tolerances of most reef-building coral communities. Research must now explore key issues such as the extent to which the thermal tolerances of corals and their symbionts are dynamic; whether bleaching and disease are linked; how the loss of high densities of reef-building coral will affect other dependent species; and how the loss of coral populations will affect the millions of people globally who depend on coral reefs for their daily survival.

Relevance: 90.00%

Publisher:

Abstract:

Determining the dimensionality of G provides an important perspective on the genetic basis of a multivariate suite of traits. Since the introduction of Fisher's geometric model, the number of genetically independent traits underlying a set of functionally related phenotypic traits has been recognized as an important factor influencing the response to selection. Here, we show how the effective dimensionality of G can be established, using a method for the determination of the dimensionality of the effect space from a multivariate general linear model introduced by Amemiya (1985). We compare this approach with two other available methods, factor-analytic modeling and bootstrapping, using a half-sib experiment that estimated G for eight cuticular hydrocarbons of Drosophila serrata. In our example, eight pheromone traits were shown to be adequately represented by only two underlying genetic dimensions by Amemiya's approach and factor-analytic modeling of the covariance structure at the sire level. In contrast, bootstrapping identified four dimensions with significant genetic variance. A simulation study indicated that while the performance of Amemiya's method was more sensitive to power constraints, it performed as well as or better than factor-analytic modeling in correctly identifying the original genetic dimensions at moderate to high levels of heritability. The bootstrap approach consistently overestimated the number of dimensions in all cases and performed less well than Amemiya's method at subspace recovery.
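The general idea of asking how many dimensions of a covariance matrix carry non-negligible variance can be sketched with a simple eigenvalue bootstrap on simulated data; the data, the threshold rule, and the bootstrap scheme below are illustrative and are neither Amemiya's (1985) procedure nor the factor-analytic REML approach used in the paper.

```python
# Sketch: estimate the effective dimensionality of a covariance matrix by
# bootstrapping its eigenvalues on simulated "traits" that truly occupy
# only two dimensions. Illustrative heuristic only.
import numpy as np

rng = np.random.default_rng(1)

n_families, n_traits, true_dim = 200, 8, 2
loadings = rng.normal(size=(n_traits, true_dim))
breeding_values = rng.normal(size=(n_families, true_dim)) @ loadings.T
phenotypes = breeding_values + rng.normal(scale=1.0, size=(n_families, n_traits))

def n_significant_dims(data, n_boot=500, alpha=0.05):
    eigvals = np.linalg.eigvalsh(np.cov(data, rowvar=False))[::-1]
    boot = np.empty((n_boot, data.shape[1]))
    for b in range(n_boot):
        sample = data[rng.integers(0, data.shape[0], data.shape[0])]
        boot[b] = np.linalg.eigvalsh(np.cov(sample, rowvar=False))[::-1]
    lower = np.percentile(boot, 100 * alpha / 2, axis=0)
    # Kaiser-style rule: count eigenvalues whose bootstrap lower bound
    # exceeds the mean eigenvalue.
    return int(np.sum(lower > eigvals.mean()))

print("estimated dimensions:", n_significant_dims(phenotypes))
```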

Relevance: 90.00%

Publisher:

Abstract:

The development of models in Earth Sciences, e.g. for earthquake prediction and for the simulation of mantle convection, is far from being finalized. Therefore there is a need for a modelling environment that allows scientists to implement and test new models in an easy but flexible way. After being verified, the models should be easy to apply within their scope, typically by setting input parameters through a GUI or web services. It should be possible to link certain parameters to external data sources, such as databases and other simulation codes. Moreover, as typically large-scale meshes have to be used to achieve appropriate resolutions, the computational efficiency of the underlying numerical methods is important. Conceptually, this leads to a software system with three major layers: the application layer, the mathematical layer, and the numerical algorithm layer. The latter is implemented as a C/C++ library to solve a basic, computationally intensive linear problem, such as a linear partial differential equation. The mathematical layer allows the model developer to define his model and to implement high-level solution algorithms (e.g. Newton-Raphson scheme, Crank-Nicolson scheme) or to choose these algorithms from an algorithm library. The kernels of the model are generic, typically linear, solvers provided through the numerical algorithm layer. Finally, to provide an easy-to-use application environment, a web interface is (semi-automatically) built to edit the XML input file for the modelling code. In the talk, we will discuss the advantages and disadvantages of this concept in more detail. We will also present the modelling environment escript, which is a prototype implementation working towards such a software system in Python (see www.python.org). Key components of escript are the Data class and the PDE class. Objects of the Data class allow generating, holding, accessing, and manipulating data in such a way that the representation that is best in the particular context is transparent to the user. They are also the key to establishing connections with external data sources. PDE class objects describe (linear) partial differential equations to be solved by a numerical library. The current implementation of escript has been linked to the finite element code Finley to solve general linear partial differential equations. We will give a few simple examples which illustrate the usage of escript. Moreover, we show the usage of escript together with Finley for the modelling of interacting fault systems and for the simulation of mantle convection.
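The three-layer split described above can be illustrated with a small, purely hypothetical sketch; the function names and the toy diffusion problem below are invented for illustration and are not the escript/Finley API.

```python
# Hypothetical sketch of the three-layer architecture: a generic linear
# solver kernel, a high-level time-stepping scheme built on it, and an
# application-level model driven by user parameters.
import numpy as np

# --- numerical algorithm layer: a generic linear solver kernel -------------
def solve_linear(A, b):
    """Solve the basic, computationally intensive linear problem A u = b."""
    return np.linalg.solve(A, b)

# --- mathematical layer: a high-level solution algorithm -------------------
def crank_nicolson_step(M, K, u, dt):
    """One Crank-Nicolson step for M du/dt = -K u, using the generic kernel."""
    lhs = M + 0.5 * dt * K
    rhs = (M - 0.5 * dt * K) @ u
    return solve_linear(lhs, rhs)

# --- application layer: model setup driven by user-level parameters --------
def run_diffusion_model(n=50, dt=0.01, steps=100):
    """Toy 1-D diffusion model assembled on a uniform grid."""
    K = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)   # stiffness-like matrix
    M = np.eye(n)                                          # mass matrix
    u = np.zeros(n)
    u[n // 2] = 1.0                                        # initial spike
    for _ in range(steps):
        u = crank_nicolson_step(M, K, u, dt)
    return u

print(run_diffusion_model()[:5])
```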

Relevance: 90.00%

Publisher:

Abstract:

This thesis addresses data assimilation, which typically refers to the estimation of the state of a physical system given a model and observations, and its application to short-term precipitation forecasting. A general introduction to data assimilation is given, both from a deterministic and a stochastic point of view. Data assimilation algorithms are reviewed, first in the static case (when no dynamics are involved), then in the dynamic case. A double experiment on two non-linear models, the Lorenz 63 and the Lorenz 96 models, is run and the comparative performance of the methods is discussed in terms of quality of the assimilation, robustness in the non-linear regime and computational time. Following the general review and analysis, data assimilation is discussed in the particular context of very short-term rainfall forecasting (nowcasting) using radar images. An extended Bayesian precipitation nowcasting model is introduced. The model is stochastic in nature and relies on the spatial decomposition of the rainfall field into rain "cells". Radar observations are assimilated using a Variational Bayesian method in which the true posterior distribution of the parameters is approximated by a more tractable distribution. The motion of the cells is captured by a 2D Gaussian process. The model is tested on two precipitation events, the first dominated by convective showers, the second by precipitation fronts. Several deterministic and probabilistic validation methods are applied and the model is shown to retain reasonable prediction skill at up to 3 hours lead time. Extensions to the model are discussed.
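A twin experiment of the kind mentioned above can be sketched on the Lorenz 63 model with a basic stochastic ensemble Kalman filter; the parameter values are the classic ones, but the assimilation settings are illustrative and not those compared in the thesis.

```python
# Sketch of a Lorenz 63 twin experiment with a stochastic ensemble Kalman
# filter (EnKF). Illustrative settings only.
import numpy as np

rng = np.random.default_rng(42)

def lorenz63(x, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    return np.array([sigma * (x[1] - x[0]),
                     x[0] * (rho - x[2]) - x[1],
                     x[0] * x[1] - beta * x[2]])

def step(x, dt=0.01):
    # Fourth-order Runge-Kutta step.
    k1 = lorenz63(x)
    k2 = lorenz63(x + 0.5 * dt * k1)
    k3 = lorenz63(x + 0.5 * dt * k2)
    k4 = lorenz63(x + dt * k3)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def enkf_analysis(ensemble, obs, obs_var):
    # Observe the full state; perturb observations for each member.
    anomalies = ensemble - ensemble.mean(axis=0)
    P = anomalies.T @ anomalies / (ensemble.shape[0] - 1)
    K = P @ np.linalg.inv(P + obs_var * np.eye(3))            # Kalman gain
    perturbed = obs + rng.normal(0, np.sqrt(obs_var), ensemble.shape)
    return ensemble + (perturbed - ensemble) @ K.T

truth = np.array([1.0, 1.0, 1.0])
ensemble = truth + rng.normal(0, 2.0, size=(20, 3))
obs_var = 1.0

for t in range(1000):
    truth = step(truth)
    ensemble = np.array([step(m) for m in ensemble])
    if (t + 1) % 25 == 0:                                     # assimilate every 25 steps
        obs = truth + rng.normal(0, np.sqrt(obs_var), 3)
        ensemble = enkf_analysis(ensemble, obs, obs_var)

print("final RMSE:", np.sqrt(np.mean((ensemble.mean(axis=0) - truth) ** 2)))
```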

Relevance: 90.00%

Publisher:

Abstract:

Background - The binding between peptide epitopes and major histocompatibility complex proteins (MHCs) is an important event in the cellular immune response. Accurate prediction of the binding between short peptides and the MHC molecules has long been a principal challenge for immunoinformatics. Recently, the modeling of MHC-peptide binding has come to emphasize quantitative predictions: instead of categorizing peptides as "binders" or "non-binders" or as "strong binders" and "weak binders", recent methods seek to make predictions about precise binding affinities. Results - We developed a quantitative support vector machine regression (SVR) approach, called SVRMHC, to model peptide-MHC binding affinities. As a non-linear method, SVRMHC was able to generate models that out-performed existing linear models, such as the "additive method". By adopting a new "11-factor encoding" scheme, SVRMHC takes into account similarities in the physicochemical properties of the amino acids constituting the input peptides. When applied to MHC-peptide binding data for three mouse class I MHC alleles, the SVRMHC models produced more accurate predictions than those produced previously. Furthermore, comparisons based on Receiver Operating Characteristic (ROC) analysis indicated that SVRMHC was able to out-perform several prominent methods in identifying strongly binding peptides. Conclusion - As a method with demonstrated performance in the quantitative modeling of MHC-peptide binding and in identifying strong binders, SVRMHC is a promising immunoinformatics tool with not inconsiderable future potential.
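The regression setup described above can be sketched with a standard support vector regression; the one-hot peptide encoding and random affinities below are illustrative stand-ins for the paper's "11-factor encoding" and real binding measurements.

```python
# Sketch of SVR-based modelling of peptide binding affinities on simulated
# 9-mer peptides. Illustrative only; not the SVRMHC encoding or data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR

rng = np.random.default_rng(7)
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def encode(peptide):
    # One-hot encode a 9-mer peptide into a flat numeric vector.
    vec = np.zeros((len(peptide), len(AMINO_ACIDS)))
    for i, aa in enumerate(peptide):
        vec[i, AMINO_ACIDS.index(aa)] = 1.0
    return vec.ravel()

# Simulated 9-mer peptides with synthetic log-affinities.
peptides = ["".join(rng.choice(list(AMINO_ACIDS), 9)) for _ in range(300)]
X = np.array([encode(p) for p in peptides])
y = X @ rng.normal(0, 0.5, X.shape[1]) + rng.normal(0, 0.2, len(peptides))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X_tr, y_tr)
print("test R^2:", round(model.score(X_te, y_te), 3))
```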

Relevance: 90.00%

Publisher:

Abstract:

Prognostic procedures can be based on ranked linear models. Ranked regression-type models are designed on the basis of feature vectors combined with a set of relations defined on selected pairs of these vectors. Feature vectors are composed of numerical results of measurements on particular objects or events. Ranked relations defined on selected pairs of feature vectors represent additional knowledge and can reflect experts' opinions about the objects considered. Ranked models have the form of linear transformations of feature vectors onto a line which preserve a given set of relations in the best manner possible. Ranked models can be designed through the minimization of a special type of convex and piecewise-linear (CPL) criterion function. Some sets of ranked relations cannot be well represented by one ranked model. Decomposition of the global model into a family of local ranked models can improve the representation. A procedure for the decomposition of ranked models is described in this paper.
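The idea of designing a ranked linear model by minimising a convex, piecewise-linear criterion can be sketched as a linear programme over hinge-type slacks; the data, margin, and bounds below are illustrative and not the paper's exact CPL formulation.

```python
# Sketch: learn a linear transformation w so that for each ranked pair
# (i preferred over j) we have w.x_i >= w.x_j + 1, penalising violations
# with slack variables and minimising their sum (a CPL criterion).
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(3)
X = rng.normal(size=(30, 4))                       # feature vectors
true_w = np.array([1.0, -0.5, 0.0, 2.0])
scores = X @ true_w

# Ranked relations: random pairs where the first element truly outranks the second.
pairs = []
for _ in range(60):
    i, j = rng.integers(0, 30, 2)
    if scores[i] > scores[j]:
        pairs.append((i, j))

n_feat, n_pairs = X.shape[1], len(pairs)
# Variables: [w (n_feat), slacks (n_pairs)]; minimise the sum of slacks.
c = np.concatenate([np.zeros(n_feat), np.ones(n_pairs)])
A_ub, b_ub = [], []
for k, (i, j) in enumerate(pairs):
    row = np.zeros(n_feat + n_pairs)
    row[:n_feat] = -(X[i] - X[j])                  # -(w.(x_i - x_j)) - s_k <= -1
    row[n_feat + k] = -1.0
    A_ub.append(row)
    b_ub.append(-1.0)

bounds = [(-10, 10)] * n_feat + [(0, None)] * n_pairs
res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub), bounds=bounds)
w_hat = res.x[:n_feat]
print("recovered direction:", np.round(w_hat / np.linalg.norm(w_hat), 2))
```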

Relevance: 90.00%

Publisher:

Abstract:

2000 Mathematics Subject Classification: 62H12, 62P99

Relevance: 90.00%

Publisher:

Abstract:

The analysis of risk measures associated with price series movements and their prediction is of strategic importance in financial markets as well as to policy makers, in particular for short- and long-term planning and for setting economic growth targets. For example, oil price risk management focuses primarily on when and how an organization can best prevent costly exposure to price risk. Value-at-Risk (VaR) is the commonly practised instrument to measure risk and is evaluated by analysing the negative/positive tail of the probability distribution of the returns (profit or loss). In modelling applications, least-squares estimation (LSE)-based linear regression models are often employed for modeling and analyzing correlated data. These linear models are optimal and perform relatively well under conditions such as errors following normal or approximately normal distributions, being free of large outliers and satisfying the Gauss-Markov assumptions. However, often in practical situations, the LSE-based linear regression models fail to provide optimal results, for instance in non-Gaussian situations, especially when the errors follow fat-tailed distributions and may not possess a finite variance. This is the situation in risk analysis, which involves analyzing tail distributions. Thus, applications of the LSE-based regression models may be questioned for appropriateness and may have limited applicability. We have carried out a risk analysis of Iranian crude oil price data based on Lp-norm regression models and have noted that the LSE-based models do not always perform best. We discuss results from the L1-, L2- and L∞-norm based linear regression models. ACM Computing Classification System (1998): B.1.2, F.1.3, F.2.3, G.3, J.2.
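The comparison of the three norms can be sketched on simulated fat-tailed returns, together with a simple historical VaR estimate; the data and specification below are illustrative, not the Iranian crude oil series or the paper's models.

```python
# Sketch comparing L1-, L2- and Linf-norm linear fits on fat-tailed "return"
# data, plus a simple historical VaR estimate.
import numpy as np
from scipy.optimize import linprog
import statsmodels.api as sm

rng = np.random.default_rng(11)
n = 300
x = rng.normal(size=n)
returns = 0.5 * x + rng.standard_t(df=3, size=n) * 0.8     # fat-tailed errors
X = sm.add_constant(x)

# L2-norm fit (ordinary least squares).
beta_l2 = sm.OLS(returns, X).fit().params

# L1-norm fit (median regression, i.e. quantile regression at q = 0.5).
beta_l1 = sm.QuantReg(returns, X).fit(q=0.5).params

# Linf-norm (Chebyshev) fit via a linear programme: minimise the maximum residual.
#   variables: [beta_0, beta_1, t],  constraints: |y - X beta| <= t
c = np.array([0.0, 0.0, 1.0])
A_ub = np.vstack([np.column_stack([X, -np.ones(n)]),
                  np.column_stack([-X, -np.ones(n)])])
b_ub = np.concatenate([returns, -returns])
beta_linf = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * 3).x[:2]

print("L1   coefficients:", np.round(beta_l1, 3))
print("L2   coefficients:", np.round(beta_l2, 3))
print("Linf coefficients:", np.round(beta_linf, 3))

# Historical 5% Value-at-Risk of the return series (loss quantile).
print("5% VaR:", round(-np.quantile(returns, 0.05), 3))
```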

Relevance: 90.00%

Publisher:

Abstract:

The purpose of this study was to analyze the behavior of sell-side analysts and to propose a classification of analysts, considering the performance of their price forecasts and recommendations (sell-hold-buy) in the Brazilian stock market. The first step was to analyze the analysts' consensus in order to understand the importance of this collective intervention in the market; the second was to analyze analysts individually to understand how they improve their analyses over time; the third was to understand the main ranking methods used in markets; finally, we propose a form of classification that reflects the aspects discussed previously. To investigate the hypotheses proposed in the study, linear panel models were used to capture effects over time. The data on individual and consensus price forecasts and analyst recommendations, for the period 2005-2013, were obtained from Bloomberg®. The main results were: (i) superior performance of consensus recommendations compared with individual analyses; (ii) the association between the number of analysts issuing recommendations and improved accuracy suggests that this number may be associated with increased consensus strength and hence accuracy; (iii) the anchoring effect in analysts' revisions of the consensus biases their predictions, overvaluing the assets; (iv) analysts need to exercise greater caution in times of economic turbulence, also watching foreign markets such as the USA, since these may produce changes in bias between optimism and pessimism; (v) changes in bias, such as increased pessimism, can cause an excessive increase in the number of buy recommendations, in which case analysts should be more cautious in their analyses, mainly regarding the consistency between the recommendation and the expected price; (vi) the analyst's experience with the asset and with the asset's economic sector contributes to improved forecasts, whereas overall experience showed the opposite evidence; (vii) the optimism associated with overall experience shows, over time, a behavior similar to overconfidence, which could reduce accuracy; (viii) the conflicting effect of general experience on accuracy versus observed return suggests that, over time, the analyst is subject to effects similar to an endowment bias on assets, which would result in conflicting recommendations and forecasts; (ix) although focusing on fewer sectors contributes to accuracy, the same does not occur with focusing on fewer assets, so analysts may enjoy economies of scale when covering more assets within the same industry; and finally, (x) it was possible to develop a proposed classification of analysts that considers both the returns and the consistency of their predictions, called the Analysis Coefficient; this ranking produced better results in terms of return relative to standard deviation.
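A linear panel specification of the kind used above can be sketched with two-way fixed effects; the variable names and data below are hypothetical, not the Bloomberg dataset or the study's exact model.

```python
# Sketch of a linear panel model: forecast accuracy regressed on analyst
# experience, with analyst and year fixed effects via dummy variables.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
analysts = [f"analyst_{i}" for i in range(40)]
years = list(range(2005, 2014))

rows = []
for a in analysts:
    skill = rng.normal()
    for t, year in enumerate(years):
        experience = t + rng.integers(0, 3)                 # years covering the asset
        accuracy = 0.3 * experience + skill + rng.normal()  # synthetic relationship
        rows.append({"analyst": a, "year": year,
                     "experience": experience, "accuracy": accuracy})
panel = pd.DataFrame(rows)

# Two-way fixed effects (within-analyst, within-year).
fit = smf.ols("accuracy ~ experience + C(analyst) + C(year)", data=panel).fit()
print("experience effect:", fit.params["experience"],
      "std. error:", fit.bse["experience"])
```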

Relevance: 90.00%

Publisher:

Abstract:

Background: Several theories, such as biological width formation, inflammatory reactions due to contamination of the implant-abutment microgap, and periimplant stress/strain concentration causing bone microdamage accumulation, have been suggested to explain early periimplant bone loss. However, it is not yet well understood to what extent the implant-abutment connection type may influence the remodeling process around dental implants. Aim: to evaluate clinical, bacteriological, and biomechanical parameters related to periimplant bone loss at the crestal region, comparing external hexagon (EH) and Morse-taper (MT) connections. Materials and methods: Twelve patients with totally edentulous mandibles received four custom-made Ø 3.8 x 13 mm implants in the interforaminal region of the mandible, with the same design but different prosthetic connections (two of them EH or MT, randomly placed based on a split-mouth design), and an immediate implant-supported prosthesis. Clinical parameters (periimplant probing pocket depth, modified gingival index and mucosal thickness) were evaluated at 6 sites around the implants at a 12-month follow-up. The distance from the top of the implant to the first bone-to-implant contact (IT-FBIC) was evaluated on standardized digital peri-apical radiographs acquired at 1, 3, 6 and 12 months of follow-up. Samples of the subgingival microbiota were collected 1, 3 and 6 months after implant loading. DNA was extracted and used for the quantification of Tannerella forsythia, Porphyromonas gingivalis, Aggregatibacter actinomycetemcomitans, Prevotella intermedia and Fusobacterium nucleatum. Comparisons among multiple observation periods were performed using repeated-measures analysis of variance (ANOVA), followed by a Tukey post-hoc test, while two-period comparisons were made using a paired t-test. Further, 36 computed-tomography-based finite element (FE) models were built, simulating each patient under 3 loading conditions. The results for the peak equivalent (EQV) strain in periimplant bone were interpreted by means of a general linear model (ANOVA). Results: The variation in periimplant bone loss assessed by means of radiographs was significantly different between the connection types (P<0.001). Mean IT-FBIC was 1.17±0.44 mm for EH and 0.17±0.54 mm for MT, considering all evaluated time periods. All clinical parameters presented no significant differences. No significant microbiological differences could be observed between the two connection types. Most of the collected samples had very few pathogens, meaning that these regions were healthy from a microbiological point of view. In the FE analysis, a significantly higher peak EQV strain (P=0.005) was found for the EH (mean 3438.65 µε) compared to the MT (mean 840.98 µε) connection. Conclusions: Varying the implant-abutment connection type results in different periimplant bone remodeling, regardless of clinical and microbiological conditions. This is most likely attributable to the distinct load transmission through different implant-abutment connections to the periimplant bone. The present findings suggest that the Morse-taper connection is more efficient in preventing periimplant bone loss than an external hexagon connection.
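The statistical comparison described above can be sketched on a simulated split-mouth dataset: a repeated-measures ANOVA across follow-up times and connection types, followed by a paired comparison; the data, effect sizes, and variable names below are illustrative, not the trial measurements.

```python
# Sketch: repeated-measures ANOVA and paired t-test on simulated split-mouth
# bone-loss data (one EH and one MT value per patient per follow-up month).
import numpy as np
import pandas as pd
from scipy.stats import ttest_rel
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(9)
patients = range(12)
months = [1, 3, 6, 12]

rows = []
for p in patients:
    for connection, drift in [("EH", 0.30), ("MT", 0.05)]:
        for m in months:
            bone_loss = drift * np.log1p(m) + rng.normal(0, 0.1)  # mm, synthetic
            rows.append({"patient": p, "connection": connection,
                         "month": m, "bone_loss": bone_loss})
df = pd.DataFrame(rows)

# Repeated-measures ANOVA: within-subject factors are connection and time.
anova = AnovaRM(df, depvar="bone_loss", subject="patient",
                within=["connection", "month"]).fit()
print(anova.anova_table)

# Paired comparison of the two connections at 12 months (split-mouth design).
eh = df.query("connection == 'EH' and month == 12").sort_values("patient")["bone_loss"]
mt = df.query("connection == 'MT' and month == 12").sort_values("patient")["bone_loss"]
print(ttest_rel(eh, mt))
```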