916 results for Multilevel Linear Models


Relevance: 30.00%

Abstract:

Interaction effects are usually modeled by means of moderated regression analysis. Structural equation models with non-linear constraints make it possible to estimate interaction effects while correcting for measurement error. From the various specifications, Jöreskog and Yang's (1996, 1998), likely the most parsimonious, has been chosen and further simplified. Up to now, only direct effects have been specified, thus wasting much of the capability of the structural equation approach. This paper presents and discusses an extension of Jöreskog and Yang's specification that can handle direct, indirect and interaction effects simultaneously. The model is illustrated by a study of the effects of an interactive style of use of budgets on both company innovation and performance.
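
A minimal sketch of the moderated regression approach mentioned in the opening sentence: the interaction is estimated as a product term in ordinary least squares. The variable names and simulated data are illustrative only, and, unlike the SEM specification discussed here, this ignores measurement error.

    # Moderated regression: the coefficient on x:z estimates the interaction.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 200
    df = pd.DataFrame({"x": rng.normal(size=n), "z": rng.normal(size=n)})
    df["y"] = 0.5 * df["x"] + 0.3 * df["z"] + 0.4 * df["x"] * df["z"] \
              + rng.normal(scale=0.5, size=n)

    # 'x:z' adds the product term to the design matrix.
    fit = smf.ols("y ~ x + z + x:z", data=df).fit()
    print(fit.params)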

Relevance: 30.00%

Abstract:

We want to shed some light on the development of person mobility by analysing the repeated cross-sectional data of the four National Travel Surveys (NTS) that have been conducted in Germany since the mid-seventies. The above-mentioned driving forces operate on different levels of the system that generates the spatial behaviour we observe: travel demand is derived from the needs and desires of individuals to participate in spatially separated activities. Individuals organise their lives in an interactive process within the context they live in, using the given infrastructure. Essential determinants of their demand are the individual's socio-demographic characteristics, but the opportunities and constraints defined by the household and the environment are also relevant for the behaviour that can ultimately be realised. In order to fully capture the context that determines individual behaviour, the (nested) hierarchy of persons within households within spatial settings has to be considered. The data we use for our analysis contain information on these three levels. With the analysis of this micro-data we attempt to improve our understanding of the macro developments summarised above. In addition, we investigate the predictive power of a few classic socio-demographic variables for the daily travel distance of individuals in the four NTS data sets, with a focus on the evolution of this predictive power. The additional task of correctly measuring distances travelled by means of the NTS is threatened by the fact that, although these surveys measure the same variables, different sampling designs and data collection procedures were used. The aim of the analysis is therefore also to detect variables whose control corrects for the known measurement error, as a prerequisite for applying appropriate models to better understand the development of individual travel behaviour in a multilevel context. This task is complicated by the fact that variables that inform on survey procedures and outcomes are only provided with the data set for 2002 (see Infas and DIW Berlin, 2003).
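
A hedged sketch of the three-level structure (persons within households within spatial settings) as a random-intercept model in Python/statsmodels; the data are synthetic and none of the column names are actual NTS variables.

    # Persons nested in households nested in regions: the region intercept is
    # a random effect, household intercepts enter as a variance component.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    region = np.repeat(np.arange(20), 30)            # 20 regions, 30 persons each
    household = region * 10 + rng.integers(0, 10, size=region.size)
    age = rng.integers(18, 80, size=region.size)
    u_region = rng.normal(scale=2.0, size=20)[region]
    u_house = rng.normal(scale=1.0, size=200)[household]
    distance = 20 + 0.1 * age + u_region + u_house \
               + rng.normal(scale=3.0, size=region.size)
    df = pd.DataFrame({"distance": distance, "age": age,
                       "region": region, "household": household})

    md = smf.mixedlm("distance ~ age", data=df, groups="region",
                     re_formula="1", vc_formula={"household": "0 + C(household)"})
    print(md.fit().summary())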

Relevance: 30.00%

Abstract:

In this work we develop a viscoelastic bar element that can handle multiple rheological laws with non-linear elastic and non-linear viscous material models. The bar element is built by joining in series an elastic and a viscous bar, constraining the middle node position to the bar axis with a reduction method, and statically condensing the internal degrees of freedom. We apply the methodology to the modelling of reversible softening with stiffness recovery both in 2D and 3D, a phenomenology also experimentally observed during stretching cycles on epithelial lung cell monolayers.
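
In the linear 1D limit, an elastic and a viscous bar in series reduce to the classical Maxwell model; the sketch below time-steps that reduced case with a backward-Euler update. It is a baseline illustration only; the element described above handles non-linear laws and other rheologies.

    # 1D Maxwell element (spring and dashpot in series) under prescribed strain:
    # total strain rate = sigma_dot/E + sigma/eta, discretized with backward Euler.
    import numpy as np

    E, eta, dt = 1.0, 5.0, 0.01           # stiffness, viscosity, time step
    t = np.arange(0.0, 10.0, dt)
    strain = 0.1 * np.sin(0.5 * t)        # prescribed total strain history
    stress = np.zeros_like(t)

    for k in range(1, t.size):
        d_eps = strain[k] - strain[k - 1]
        # sigma_k * (1 + E*dt/eta) = sigma_{k-1} + E * d_eps
        stress[k] = (stress[k - 1] + E * d_eps) / (1.0 + E * dt / eta)

    print(f"peak stress: {stress.max():.4f}")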

Relevance: 30.00%

Abstract:

This article aims to verify the factors associated with the development of human resource management (HRM) competences in foreign subsidiaries of Brazilian multinationals. These competences are essential in that they allow foreign units to adopt HRM practices that are consistent with the countries or markets in which they operate. A multilevel study was conducted, involving headquarters and subsidiaries of major Brazilian companies; the empirical analysis employed hierarchical linear modelling. Despite the recurrent debate on global standardisation versus local adaptation, it was found that the integration of international HRM policies (addressing global guidelines and local responsiveness simultaneously) may stimulate the development of competences. In addition, interaction with external networks in the host country may enhance the development of HRM competences in the subsidiaries. However, specific cultural factors of the company may inhibit development activity in units abroad.
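
For illustration, a minimal two-level hierarchical linear model with subsidiaries nested in parent companies might look as follows; the data are synthetic and the variable names are hypothetical, not those of the study.

    # Subsidiary-level outcome with a random intercept per multinational.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    firm = np.repeat(np.arange(30), 8)           # 30 multinationals, 8 subsidiaries each
    integration = rng.normal(size=firm.size)     # subsidiary-level predictor
    u_firm = rng.normal(scale=0.8, size=30)[firm]
    competence = 1.0 + 0.5 * integration + u_firm \
                 + rng.normal(scale=0.5, size=firm.size)
    df = pd.DataFrame({"competence": competence,
                       "integration": integration, "firm": firm})

    fit = smf.mixedlm("competence ~ integration", data=df, groups="firm").fit()
    print(fit.summary())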

Relevance: 30.00%

Abstract:

An active strain formulation for orthotropic constitutive laws arising in cardiac mechanics modeling is introduced and studied. The passive mechanical properties of the tissue are described by the Holzapfel-Ogden relation. In the active strain formulation, the Euler-Lagrange equations for minimizing the total energy are written in terms of active and passive deformation factors, where the active part is assumed to depend, at the cell level, on the electrodynamics and on the specific orientation of the cardiac cells. The well-posedness of the linear system derived from a generic Newton iteration of the original problem is analyzed and different mechanical activation functions are considered. In addition, the active strain formulation is compared with the classical active stress formulation from both numerical and modeling perspectives. Taylor-Hood and MINI finite elements are employed to discretize the mechanical problem. The results of several numerical experiments show that the proposed formulation is mathematically consistent and is able to represent the key features of the phenomenon, while allowing savings in computational costs.
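
In active strain formulations the deformation gradient is commonly factored multiplicatively into elastic and active parts. As a generic sketch (written here with a single fibre activation $\gamma_f$ for brevity; the orthotropic setting above also activates sheet and normal directions):

$$\mathbf{F} = \mathbf{F}_E\,\mathbf{F}_A, \qquad \mathbf{F}_A = \mathbf{I} + \gamma_f\,\mathbf{f}_0\otimes\mathbf{f}_0, \qquad \Psi(\mathbf{F}) = \Psi_{\mathrm{passive}}\big(\mathbf{F}\,\mathbf{F}_A^{-1}\big),$$

where $\mathbf{f}_0$ is the reference fibre direction, the activation $\gamma_f$ is driven by the cell-level electrodynamics, and $\Psi_{\mathrm{passive}}$ is here the Holzapfel-Ogden energy.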

Relevance: 30.00%

Abstract:

"Sitting between your past and your future doesn't mean you are in the present." (Dakota Skye)

Complex systems science is an interdisciplinary field grouping under the same umbrella dynamical phenomena from the social, natural and mathematical sciences. The emergence of a higher-order organization or behavior, transcending that expected of the linear addition of the parts, is a key factor shared by all these systems. Most complex systems can be modeled as networks that represent the interactions amongst the system's components. In addition to the actual nature of the parts' interactions, the intrinsic topological structure of the underlying network is believed to play a crucial role in the remarkable emergent behaviors exhibited by the systems. Moreover, the topology is also a key factor in explaining the extraordinary flexibility and resilience to perturbations of these systems in transmission and diffusion phenomena. In this work, we study the effect of different network structures on the performance and on the fault tolerance of systems in two different contexts.

In the first part, we study cellular automata, which are a simple paradigm for distributed computation. Cellular automata are made of basic Boolean computational units, the cells, relying on simple rules and information from the surrounding cells to perform a global task. The limited visibility of the cells can be modeled as a network, where interactions amongst cells are governed by an underlying structure, usually a regular one. In order to increase the performance of cellular automata, we chose to change their topology. We applied computational principles inspired by Darwinian evolution, called evolutionary algorithms, to alter the system's topological structure starting from either a regular or a random one. The outcome is remarkable, as the resulting topologies share properties of both regular and random networks, and display similarities with the Watts-Strogatz small-world networks found in social systems. Moreover, the performance and tolerance to probabilistic faults of our small-world-like cellular automata surpass those of regular ones.

In the second part, we use the context of biological genetic regulatory networks and, in particular, Kauffman's random Boolean networks model. In some ways, this model is close to cellular automata, although it is not expected to perform any task. Instead, it simulates the time evolution of genetic regulation within living organisms under strict conditions. The original model, though very attractive in its simplicity, suffered from important shortcomings unveiled by recent advances in genetics and biology. We propose to use these new discoveries to improve the original model. Firstly, we have used artificial topologies believed to be closer to those of gene regulatory networks. We have also studied actual biological organisms, and used parts of their genetic regulatory networks in our models. Secondly, we have addressed the improbable full synchronicity of the events taking place on Boolean networks and proposed a more biologically plausible cascading update scheme. Finally, we tackled the actual Boolean functions of the model, i.e. the specifics of how genes activate according to the activity of upstream genes, and presented a new update function that takes into account the actual promoting and repressing effects of one gene on another. Our improved models demonstrate the expected, biologically sound behavior of the previous GRN model, yet with superior resistance to perturbations. We believe they are one step closer to the biological reality.
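
As a reference point, a classical Kauffman random Boolean network with the fully synchronous update (the scheme the thesis replaces by a cascading one) can be sketched in a few lines; N, K and the wiring below are arbitrary choices.

    # Kauffman random Boolean network: N nodes, each reading K random inputs
    # through a random Boolean function; all nodes update synchronously.
    import numpy as np

    rng = np.random.default_rng(3)
    N, K = 16, 2
    inputs = rng.integers(0, N, size=(N, K))         # wiring: K inputs per node
    tables = rng.integers(0, 2, size=(N, 2 ** K))    # random truth tables
    state = rng.integers(0, 2, size=N)

    def step(state):
        # each node indexes its truth table by the binary word of its inputs
        idx = (state[inputs] * (2 ** np.arange(K))).sum(axis=1)
        return tables[np.arange(N), idx]

    for _ in range(20):
        state = step(state)
    print(state)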

Relevance: 30.00%

Abstract:

The paper proposes an approach aimed at detecting optimal model parameter combinations to achieve the most representative description of uncertainty in the model performance. A classification problem is posed to find the regions of good-fitting models according to the values of a cost function. Support Vector Machine (SVM) classification in the parameter space is applied to decide whether a forward model simulation is to be computed for a particular generated model. SVM is particularly designed to tackle classification problems in high-dimensional space in a non-parametric and non-linear way. SVM decision boundaries determine the regions that are subject to the largest uncertainty in the cost function classification, and, therefore, provide guidelines for further iterative exploration of the model space. The proposed approach is illustrated by a synthetic example of fluid flow through porous media, which features a highly variable response due to the combination of parameter values.
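
A hedged sketch of this workflow in Python/scikit-learn; the quadratic cost function and the probability band used to flag boundary points are illustrative assumptions, not the paper's choices.

    # Classify "good" vs "poor" parameter combinations, then flag candidates
    # near the SVM decision boundary for the next round of forward simulations.
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(4)
    params = rng.uniform(-1, 1, size=(300, 2))       # sampled model parameters
    cost = (params ** 2).sum(axis=1)                 # stand-in cost function
    good = (cost < 0.5).astype(int)                  # behavioural threshold

    clf = SVC(kernel="rbf", probability=True).fit(params, good)

    candidates = rng.uniform(-1, 1, size=(1000, 2))
    p_good = clf.predict_proba(candidates)[:, 1]
    uncertain = candidates[np.abs(p_good - 0.5) < 0.1]   # closest to the boundary
    print(f"{len(uncertain)} candidates flagged for exact simulation")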

Relevance: 30.00%

Abstract:

We consider the application of normal theory methods to the estimation and testing of a general type of multivariate regression models with errors-in-variables, in the case where various data sets are merged into a single analysis and the observable variables deviate possibly from normality. The various samples to be merged can differ on the set of observable variables available. We show that there is a convenient way to parameterize the model so that, despite the possible non-normality of the data, normal-theory methods yield correct inferences for the parameters of interest and for the goodness-of-fit test. The theory described encompasses both the functional and structural model cases, and can be implemented using standard software for structural equations models, such as LISREL, EQS, LISCOMP, among others. An illustration with Monte Carlo data is presented.

Relevance: 30.00%

Abstract:

Standard methods for the analysis of linear latent variable models often rely on the assumption that the vector of observed variables is normally distributed. This normality assumption (NA) plays a crucial role in assessing optimality of estimates, in computing standard errors, and in designing an asymptotic chi-square goodness-of-fit test. The asymptotic validity of NA inferences when the data deviate from normality has been called asymptotic robustness. In the present paper we extend previous work on asymptotic robustness to a general context of multi-sample analysis of linear latent variable models, with a latent component of the model allowed to be fixed across (hypothetical) sample replications, and with the asymptotic covariance matrix of the sample moments not necessarily finite. We will show that, under certain conditions, the matrix $\Gamma$ of asymptotic variances of the analyzed sample moments can be substituted by a matrix $\Omega$ that is a function only of the cross-product moments of the observed variables. The main advantage of this is that inferences based on $\Omega$ are readily available in standard software for covariance structure analysis, and do not require the computation of sample fourth-order moments. An illustration with simulated data in the context of regression with errors in variables will be presented.

Relevance: 30.00%

Abstract:

Two main approaches are commonly used to empirically evaluate linear factor pricing models: regression and SDF methods, with centred and uncentred versions of the latter. We show that, unlike standard two-step or iterated GMM procedures, single-step estimators such as continuously updated GMM yield numerically identical values for prices of risk, pricing errors, Jensen's alphas and overidentifying restrictions tests irrespective of the model's validity. Therefore, there is arguably a single approach regardless of the factors being traded or not, or the use of excess or gross returns. We illustrate our results by revisiting Lustig and Verdelhan's (2007) empirical analysis of currency returns.
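
For reference, the continuously updated GMM estimator mentioned above re-evaluates the weighting matrix at every trial parameter value, which is what delivers the numerical invariance; in generic notation:

$$\hat{\theta}_{CU} = \arg\min_{\theta}\ \bar{g}_T(\theta)'\,\hat{S}_T(\theta)^{-1}\,\bar{g}_T(\theta), \qquad \bar{g}_T(\theta) = \frac{1}{T}\sum_{t=1}^{T} g(x_t,\theta),$$

where $\hat{S}_T(\theta)$ is the sample covariance matrix of the moment conditions evaluated at the same $\theta$; two-step GMM instead fixes the weighting matrix at a first-stage estimate $\hat{S}_T(\hat{\theta}_1)^{-1}$.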

Relevance: 30.00%

Abstract:

Purpose: Cardiac 18F-FDG PET is considered the gold standard for assessing myocardial metabolism and infarct size. The myocardial demand for glucose can be influenced by fasting and/or pharmacological preparation. In the rat, it has previously been shown that fasting combined with preconditioning with acipimox, a nicotinic acid derivative and lipid-lowering agent, dramatically increased 18F-FDG uptake in the myocardium. Strategies aimed at reducing infarct scar are evaluated in a variety of mouse models, and PET would be particularly useful for assessing cardiac viability in the mouse. However, prior knowledge of the best preparation protocol is a prerequisite for accurate measurement of glucose uptake in mice. Therefore, we studied the effect of different protocols on 18F-FDG uptake in the mouse heart.
Methods: Mice (n = 15) were separated into three treatment groups according to preconditioning and underwent an 18F-FDG PET scan. Group 1: no preconditioning (n = 3); Group 2: overnight fasting (n = 8); Group 3: overnight fasting and acipimox (25 mg/kg SC) (n = 4). MicroPET images were processed with PMOD to determine the 18F-FDG mean standardized uptake value (SUV) at 30 min for the whole left ventricle (LV) and for each region of the 17-segment AHA model. For comparisons, we used the Mann-Whitney test and multilevel mixed-effects linear regression (Stata 11.0).
Results: In total, 27 microPET scans were performed successfully in 15 animals. Overnight fasting led to a dramatic increase in LV-SUV compared to mice without preconditioning (8.6±0.7 g/mL vs. 3.7±1.1 g/mL, P<0.001). In addition, LV-SUV was slightly but not significantly higher in animals treated with acipimox compared to animals with overnight fasting alone (10.2±0.5 g/mL, P = 0.06). Fasting increased segmental SUV by 5.1±0.5 g/mL compared to free-feeding mice (from 3.7±0.8 g/mL to 8.8±0.4 g/mL, P<0.001); segmental SUV also significantly increased after administration of acipimox (from 8.8±0.4 g/mL to 10.1±0.4 g/mL, P<0.001).
Conclusion: Overnight fasting leads to myocardial glucose deprivation and increases 18F-FDG myocardial uptake. Additional administration of acipimox enhances myocardial 18F-FDG uptake, at least at the segmental level. Thus, preconditioning with acipimox may provide better image quality and may help in assessing segmental myocardial metabolism.
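
A rough Python analogue of the reported statistics, i.e. a Mann-Whitney test on whole-LV SUV and a mixed-effects model on segmental SUV with a random intercept per animal; the numbers are synthetic, seeded from the group means in the abstract.

    # Whole-LV comparison (fasted vs. no preconditioning), then a segmental
    # mixed model: 17 AHA segments nested within each animal.
    import numpy as np
    import pandas as pd
    from scipy.stats import mannwhitneyu
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(5)
    fasted = rng.normal(8.6, 0.7, size=8)
    unprepared = rng.normal(3.7, 1.1, size=3)
    print(mannwhitneyu(fasted, unprepared))

    animal = np.repeat(np.arange(12), 17)        # 12 animals x 17 segments
    acipimox = (animal >= 6).astype(int)         # 0 = fasting, 1 = fasting + acipimox
    suv = 8.8 + 1.3 * acipimox + rng.normal(scale=0.3, size=12)[animal] \
          + rng.normal(scale=0.4, size=animal.size)
    df = pd.DataFrame({"suv": suv, "acipimox": acipimox, "animal": animal})
    print(smf.mixedlm("suv ~ acipimox", data=df, groups="animal").fit().summary())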

Relevance: 30.00%

Abstract:

Objective: The aim is to analyze and compare individual BMI growth patterns of adults from Switzerland and the U.S.
Methods: The analyses are based on data from two population-representative longitudinal household surveys, one from Switzerland, the other from the U.S. Each data set contains up to four data points for each adult individual. We use multilevel models for growth.
Results: Growth patterns differ across cohorts in the two countries: there are only small growth differences in the youngest and oldest cohorts, but large differences in the middle-aged ones. The individual BMI increase of middle-aged Swiss amounts to only half of that of comparable U.S. individuals.
Conclusion: Given the much higher BMI level especially in the youngest cohort, this points to severe obesity problems in the U.S. middle-aged population in the near future. A positive correlation between individual BMI level and growth may aggravate this development.
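
A hedged sketch of such a multilevel growth model: up to four synthetic BMI measurements per person, with person-specific random intercepts and slopes, so that the level-growth correlation mentioned in the conclusion appears as the covariance of the two random effects.

    # Growth model: random intercept and random slope over survey waves.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(6)
    person = np.repeat(np.arange(150), 4)          # 150 persons, 4 waves each
    wave = np.tile(np.arange(4), 150)
    b0 = rng.normal(25.0, 3.0, size=150)[person]   # person-level starting BMI
    b1 = rng.normal(0.3, 0.15, size=150)[person]   # person-level growth rate
    bmi = b0 + b1 * wave + rng.normal(scale=0.5, size=person.size)
    df = pd.DataFrame({"bmi": bmi, "wave": wave, "person": person})

    # re_formula="~wave" adds the random slope alongside the random intercept.
    fit = smf.mixedlm("bmi ~ wave", data=df, groups="person",
                      re_formula="~wave").fit()
    print(fit.summary())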

Relevance: 30.00%

Abstract:

In groundwater applications, Monte Carlo methods are employed to model the uncertainty on geological parameters. However, their brute-force application becomes computationally prohibitive for highly detailed geological descriptions, complex physical processes, and a large number of realizations. The Distance Kernel Method (DKM) overcomes this issue by clustering the realizations in a multidimensional space based on the flow responses obtained by means of an approximate (computationally cheaper) model; the uncertainty is then estimated from the exact responses that are computed only for one representative realization per cluster (the medoid). Usually, DKM is employed to decrease the size of the sample of realizations that are considered to estimate the uncertainty. We propose to use the information from the approximate responses for uncertainty quantification. The subset of exact solutions provided by DKM is then employed to construct an error model and correct the potential bias of the approximate model. Two error models are devised that both employ the difference between approximate and exact medoid solutions, but differ in the way medoid errors are interpolated to correct the whole set of realizations. The Local Error Model rests upon the clustering defined by DKM and can be seen as a natural way to account for intra-cluster variability; the Global Error Model employs a linear interpolation of all medoid errors regardless of the cluster to which the single realization belongs. These error models are evaluated for an idealized pollution problem in which the uncertainty of the breakthrough curve needs to be estimated. For this numerical test case, we demonstrate that the error models improve the uncertainty quantification provided by the DKM algorithm and are effective in correcting the bias of the estimate computed solely from the MsFV (multiscale finite-volume) results. The framework presented here is not specific to the methods considered and can be applied to other combinations of approximate models and techniques to select a subset of realizations.
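
The Global Error Model variant can be sketched as follows; KMeans stands in for the distance-kernel clustering and a linear fit of medoid errors on the approximate response stands in for the interpolation, both simplifying assumptions.

    # Cluster realizations on the cheap (approximate) response, evaluate the
    # exact model only at the medoids, then regress medoid errors on the
    # approximate response to correct the whole set of realizations.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(7)
    y_approx = rng.normal(size=500)                    # approximate responses
    y_exact = y_approx + 0.3 * y_approx ** 2 + 0.1     # stand-in exact model

    km = KMeans(n_clusters=10, n_init=10, random_state=0).fit(y_approx[:, None])
    medoids = np.array([np.argmin(np.abs(y_approx - c)
                                  + np.where(km.labels_ == i, 0.0, np.inf))
                        for i, c in enumerate(km.cluster_centers_.ravel())])

    err = y_exact[medoids] - y_approx[medoids]         # exact model run 10 times only
    reg = LinearRegression().fit(y_approx[medoids, None], err)
    y_corrected = y_approx + reg.predict(y_approx[:, None])
    print("MAE before:", np.abs(y_approx - y_exact).mean(),
          " after:", np.abs(y_corrected - y_exact).mean())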

Relevance: 30.00%

Abstract:

Statistical models allow the representation of data sets and the estimation and/or prediction of the behavior of a given variable through its interaction with the other variables involved in a phenomenon. Among the various statistical models are autoregressive state-space models (ARSS) and linear regression models (LR), which allow the quantification of the relationships among soil-plant-atmosphere system variables. To compare the quality of the ARSS and LR models for modeling the relationships between soybean yield and soil physical properties, Akaike's Information Criterion (AIC), which provides a coefficient for the selection of the best model, was used in this study. The data sets were sampled in a Rhodic Acrudox soil, along a spatial transect with 84 points spaced 3 m apart. At each sampling point, soybean samples were collected for yield quantification. At the same site, soil penetration resistance was also measured and soil samples were collected to measure soil bulk density in the 0-0.10 m and 0.10-0.20 m layers. Results showed autocorrelation and a cross-correlation structure between soybean yield and soil penetration resistance data. Soil bulk density data, however, were only autocorrelated in the 0-0.10 m layer and not cross-correlated with soybean yield. By Akaike's Information Criterion, the autoregressive state-space models proved more efficient than the equivalent simple and multiple linear regression models: the resulting AIC values were consistently lower than those obtained by the regression models, for all combinations of explanatory variables.
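
Akaike's criterion is $AIC = 2k - 2\ln L$ for a model with $k$ parameters and maximized likelihood $L$, so lower values indicate the preferred model. A sketch of the regression side of the comparison, on synthetic stand-ins for the transect data (the state-space counterpart is omitted here):

    # Compare simple vs. multiple linear regression by AIC (lower is better).
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(8)
    n = 84                                   # 84 transect points, as in the study
    pr = rng.normal(size=n)                  # soil penetration resistance (stand-in)
    bd = rng.normal(size=n)                  # soil bulk density (stand-in)
    yield_ = 3.0 - 0.4 * pr + rng.normal(scale=0.3, size=n)
    df = pd.DataFrame({"yield_": yield_, "pr": pr, "bd": bd})

    m1 = smf.ols("yield_ ~ pr", data=df).fit()
    m2 = smf.ols("yield_ ~ pr + bd", data=df).fit()
    print(f"AIC simple: {m1.aic:.1f}   AIC multiple: {m2.aic:.1f}")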

Relevance: 30.00%

Abstract:

Systematic trends in the properties of a linear split-gate heterojunction are studied by solving iteratively the Poisson and Schrödinger equations for different gate potentials and temperatures. A two-dimensional approximation is presented that is much simpler in the numerical implementation and that accurately reproduces all significant trends. In deriving this approximation, we provide a rigorous and quantitative basis for the formulation of models that assume a two-dimensional character for the electron gas at the junction.
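
The self-consistent loop referred to above couples, in generic effective-mass notation (a sketch, not the paper's exact system):

$$-\frac{\hbar^2}{2m^*}\nabla^2\psi_i + V(\mathbf{r})\,\psi_i = E_i\,\psi_i, \qquad \nabla\cdot\big(\varepsilon(\mathbf{r})\,\nabla\phi\big) = -\rho(\mathbf{r}), \qquad V = -e\phi + \Delta E_c,$$

where the electron density $\rho$ is assembled from the occupied eigenstates $\psi_i$ and inserted back into Poisson's equation; the two equations are solved alternately until the potential and density stop changing, once per gate potential and temperature.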