952 results for Mixed-effect models


Relevance: 100.00%

Abstract:

The motivation for this paper is to present procedures for automatically creating idealised finite element models from the 3D CAD solid geometry of a component. The procedures produce an accurate and efficient analysis model with little effort on the part of the user. The technique is applicable to thin walled components with local complex features and automatically creates analysis models where 3D elements representing the complex regions in the component are embedded in an efficient shell mesh representing the mid-faces of the thin sheet regions. As the resulting models contain elements of more than one dimension, they are referred to as mixed dimensional models. Although these models are computationally more expensive than some of the idealisation techniques currently employed in industry, they do allow the structural behaviour of the model to be analysed more accurately, which is essential if appropriate design decisions are to be made. Also, using these procedures, analysis models can be created automatically whereas the current idealisation techniques are mostly manual, have long preparation times, and are based on engineering judgement. In the paper the idealisation approach is first applied to 2D models that are used to approximate axisymmetric components for analysis. For these models 2D elements representing the complex regions are embedded in a 1D mesh representing the midline of the cross section of the thin sheet regions. Also discussed is the coupling, which is necessary to link the elements of different dimensionality together. Analysis results from a 3D mixed dimensional model created using the techniques in this paper are compared to those from a stiffened shell model and a 3D solid model to demonstrate the improved accuracy of the new approach. 
At the end of the paper a quantitative analysis of the reduction in computational cost due to shell meshing thin sheet regions demonstrates that the reduction in degrees of freedom is proportional to the square of the aspect ratio of the region, and for long slender solids, the reduction can be proportional to the aspect ratio of the region if appropriate meshing algorithms are used.
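The quoted scaling can be illustrated with a back-of-the-envelope sketch. The element counts and DOF-per-node figures below are illustrative assumptions (a structured hexahedral mesh with one element through the thickness), not the paper's actual meshes; the point is only that the solid/shell DOF ratio grows roughly with the square of the aspect ratio.

```python
def solid_dof(aspect, dof_per_node=3):
    # Solid elements must stay near-cubic, so the in-plane element size is
    # tied to the sheet thickness: an aspect x aspect grid of hexahedra with
    # one element through the thickness (two layers of nodes).
    nodes = (aspect + 1) * (aspect + 1) * 2
    return nodes * dof_per_node

def shell_dof(n_elems=4, dof_per_node=6):
    # Shell elements have no thickness constraint, so a coarse mid-face mesh
    # suffices; shell nodes carry 3 translations + 3 rotations.
    nodes = (n_elems + 1) ** 2
    return nodes * dof_per_node

# Doubling the aspect ratio roughly quadruples the solid-mesh DOF while the
# shell-mesh DOF stays fixed.
print(solid_dof(20) / shell_dof(), solid_dof(40) / shell_dof())
```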

Relevance: 100.00%

Abstract:

As in any field of scientific inquiry, advancements in the field of second language acquisition (SLA) rely in part on the interpretation and generalizability of study findings using quantitative data analysis and inferential statistics. While statistical techniques such as ANOVA and t-tests are widely used in second language research, this article reviews a class of newer statistical models that has not yet been widely adopted in the field but has garnered interest in other fields of language research. The class of statistical models called mixed-effects models is introduced, and the potential benefits of these models for the second language researcher are discussed. A simple example of mixed-effects data analysis using the statistical software package R (R Development Core Team, 2011) is provided as an introduction to the use of these statistical techniques and to exemplify how such analyses can be reported in research articles. It is concluded that mixed-effects models provide the second language researcher with a powerful tool for the analysis of a variety of types of second language acquisition data.
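The abstract's R walkthrough is not reproduced here; as a language-neutral stand-in, the following Python sketch simulates the kind of subject-clustered data a mixed-effects model targets and recovers the two variance components by a simple method-of-moments (one-way random-effects ANOVA) calculation. All numbers (subjects, items, variances) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical SLA-style data: a score for 100 subjects x 20 items, with a
# by-subject random intercept (sd = 2) plus residual noise (sd = 1).
n_subj, n_item = 100, 20
subj_eff = rng.normal(0.0, 2.0, n_subj)
y = 50.0 + subj_eff[:, None] + rng.normal(0.0, 1.0, (n_subj, n_item))

# Method-of-moments variance components (one-way random-effects ANOVA):
within_var = y.var(axis=1, ddof=1).mean()                        # approx. sigma^2 = 1
between_var = y.mean(axis=1).var(ddof=1) - within_var / n_item   # approx. tau^2 = 4
print(within_var, between_var)
```

In practice one would fit this with a dedicated mixed-model routine (e.g. `lmer` in R's lme4 package, as the abstract's example presumably does); the moment calculation above just makes the between/within decomposition explicit.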

Relevance: 100.00%

Abstract:

Linear mixed effects models are frequently used to analyse longitudinal data, due to their flexibility in modelling the covariance structure between and within observations. Further, it is easy to deal with unbalanced data, either with respect to the number of observations per subject or per time period, and with varying time intervals between observations. In most applications of mixed models to biological sciences, a normal distribution is assumed both for the random effects and for the residuals. This, however, makes inferences vulnerable to the presence of outliers. Here, linear mixed models employing thick-tailed distributions for robust inferences in longitudinal data analysis are described. Specific distributions discussed include the Student-t, the slash and the contaminated normal. A Bayesian framework is adopted, and the Gibbs sampler and the Metropolis-Hastings algorithms are used to carry out the posterior analyses. An example with data on orthodontic distance growth in children is discussed to illustrate the methodology. Analyses based on either the Student-t distribution or on the usual Gaussian assumption are contrasted. The thick-tailed distributions provide an appealing robust alternative to the Gaussian process for modelling distributions of the random effects and of residuals in linear mixed models, and the MCMC implementation allows the computations to be performed in a flexible manner.
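The thick-tailed families mentioned (Student-t, slash, contaminated normal) are all scale mixtures of normals, which is what makes the Gibbs sampler convenient. A minimal sketch of the Student-t case, with arbitrary illustrative values of nu and sample size:

```python
import numpy as np

rng = np.random.default_rng(1)
nu, n = 6.0, 200_000

# Student-t via its scale-mixture-of-normals representation:
# w ~ Gamma(nu/2, rate nu/2), t = z / sqrt(w) with z standard normal.
w = rng.gamma(shape=nu / 2.0, scale=2.0 / nu, size=n)
t = rng.normal(size=n) / np.sqrt(w)

# The variance of a t_nu variable is nu / (nu - 2) = 1.5 for nu = 6, and the
# tails are heavier than Gaussian (positive excess kurtosis).
print(t.var())
```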

Relevance: 100.00%

Abstract:

Linear mixed effects models have been widely used in analysis of data where responses are clustered around some random effects, so it is not reasonable to assume independence between observations in the same cluster. In most biological applications, it is assumed that the distributions of the random effects and of the residuals are Gaussian. This makes inferences vulnerable to the presence of outliers. Here, linear mixed effects models with normal/independent residual distributions for robust inferences are described. Specific distributions examined include univariate and multivariate versions of the Student-t, the slash and the contaminated normal. A Bayesian framework is adopted and Markov chain Monte Carlo is used to carry out the posterior analysis. The procedures are illustrated using birth weight data on rats in a toxicological experiment. Results from the Gaussian and robust models are contrasted, and it is shown how the implementation can be used for outlier detection. The thick-tailed distributions provide an appealing robust alternative to the Gaussian process in linear mixed models, and they are easily implemented using data augmentation and MCMC techniques.
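A concrete look at how such models support outlier detection: under a Student-t error model, each observation's latent mixing weight has conditional posterior mean (nu + 1) / (nu + r_i^2) for a standardized residual r_i, so gross outliers receive small weights and can be flagged. The residual vector below is hypothetical.

```python
import numpy as np

nu = 4.0
resid = np.array([0.3, -0.8, 1.1, -0.2, 6.5])  # last entry is a gross outlier

# Conditional posterior mean of the Student-t mixing weight for each case;
# small values mark observations the robust fit discounts.
weights = (nu + 1.0) / (nu + resid**2)
print(weights)
```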

Relevance: 100.00%

Abstract:

In this paper we extend semiparametric mixed linear models with normal errors to elliptical errors in order to permit distributions with heavier and lighter tails than the normal ones. Penalized likelihood equations are applied to derive the maximum penalized likelihood estimates (MPLEs) which appear to be robust against outlying observations in the sense of the Mahalanobis distance. A reweighted iterative process based on the back-fitting method is proposed for the parameter estimation and the local influence curvatures are derived under some usual perturbation schemes to study the sensitivity of the MPLEs. Two motivating examples preliminarily analyzed under normal errors are reanalyzed considering some appropriate elliptical errors. The local influence approach is used to compare the sensitivity of the model estimates.
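Robustness here is phrased in terms of the Mahalanobis distance; a minimal sketch of that metric on toy 2-D data (the values are arbitrary, chosen so the last row is an outlier in the metric of the sample covariance):

```python
import numpy as np

# Mahalanobis distance of each row of X from the sample mean.
X = np.array([[1.0, 2.0], [2.0, 3.0], [3.0, 5.0], [2.0, 2.5], [10.0, 1.0]])
mu = X.mean(axis=0)
S_inv = np.linalg.inv(np.cov(X, rowvar=False))
diff = X - mu
d = np.sqrt(np.einsum('ij,jk,ik->i', diff, S_inv, diff))
print(d)  # the last point is farthest in this metric
```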

Relevance: 100.00%

Abstract:

BACKGROUND There are concerns about the effects of in utero exposure to antiretroviral drugs (ARVs) on the development of HIV-exposed but uninfected (HEU) children. The aim of this study was to evaluate whether in utero exposure to ARVs is associated with lower birth weight/height and reduced growth during the first 2 years of life. METHODS This cohort study was conducted among HEU infants born between 1996 and 2010 in a tertiary children's hospital in Rio de Janeiro, Brazil. Weight was measured with a mechanical scale, and height with a measuring board. Z-scores for weight-for-age (WAZ), length-for-age (LAZ) and weight-for-length were calculated. We modeled trajectories by mixed-effects models and adjusted for mother's age, CD4 cell count, viral load, year of birth and family income. RESULTS A total of 588 HEU infants were included, of whom 155 (26%) were not exposed to ARVs, 114 (19%) were exposed early (first trimester) and 319 (54%) later. WAZ were lower among infants exposed early compared with infants exposed later: adjusted differences were -0.52 (95% confidence interval [CI]: -0.99 to -0.04, P = 0.02) at birth and -0.22 (95% CI: -0.47 to 0.04, P = 0.10) during follow-up. LAZ were lower during follow-up: -0.35 (95% CI: -0.63 to -0.08, P = 0.01). There were no differences in weight-for-length scores. Z-scores of infants exposed late during pregnancy were similar to unexposed infants. CONCLUSIONS In HEU children, early exposure to ARVs was associated with lower WAZ at birth and lower LAZ up to 2 years of life. Growth of HEU children needs to be monitored closely.
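The z-scores used throughout are simple standardizations against a growth reference: z = (observed - reference median) / reference SD. The reference values below are hypothetical, not WHO standards.

```python
def weight_for_age_z(weight_kg, ref_median, ref_sd):
    # Weight-for-age z-score (WAZ) against a growth reference.
    return (weight_kg - ref_median) / ref_sd

# A hypothetical infant 1 kg below the reference median, with reference SD 1 kg:
z = weight_for_age_z(8.6, ref_median=9.6, ref_sd=1.0)
print(z)
```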

Relevance: 100.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06

Relevance: 100.00%

Abstract:

2000 Mathematics Subject Classification: 62H12, 62P99

Relevance: 90.00%

Abstract:

The Poisson distribution has often been used for count data such as accident counts. The Negative Binomial (NB) distribution has been adopted for count data to address the over-dispersion problem. However, Poisson and NB distributions are incapable of accounting for unobserved heterogeneity due to spatial and temporal effects in accident data. To overcome this problem, random effect models have been developed. Another challenge with existing traffic accident prediction models is the excess of zero accident observations in some accident data. Although the Zero-Inflated Poisson (ZIP) model is capable of handling the dual-state system in accident data with excess zero observations, it does not accommodate the within-location and between-location correlation heterogeneities that are the basic motivations for random effect models. This paper proposes an effective way of fitting a ZIP model with location-specific random effects, and Bayesian analysis is recommended for model calibration and assessment.
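A minimal sketch of the dual-state idea (without the random effects): the ZIP pmf mixes a point mass at zero with a Poisson count, so zero is more probable than under a plain Poisson with the same rate. The values of pi and lam are arbitrary illustrative choices.

```python
import math

def zip_pmf(k, pi, lam):
    # Zero-inflated Poisson: structural zero with probability pi, otherwise
    # an ordinary Poisson(lam) count.
    poisson = math.exp(-lam) * lam**k / math.factorial(k)
    return pi * (k == 0) + (1.0 - pi) * poisson

pi, lam = 0.3, 2.5
p0_zip = zip_pmf(0, pi, lam)
p0_poisson = math.exp(-lam)
print(p0_zip, p0_poisson)  # the ZIP puts much more mass at zero
```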

Relevance: 90.00%

Abstract:

In order to drive sustainable financial profitability, service firms make significant investments in creating service environments that consumers will prefer over the environments of their competitors. To date, servicescape research is over-focused on understanding consumers’ emotional and physiological responses to servicescape attributes, rather than taking a holistic view of how consumers cognitively interpret servicescapes. This thesis argues that consumers will cognitively ascribe symbolic meanings to servicescapes and then evaluate if those meanings are congruent with their sense of Self in order to form a preference for a servicescape. Consequently, this thesis takes a Self Theory approach to servicescape symbolism to address the following broad research question: How do ascribed symbolic meanings influence servicescape preference? Using a three-study, mixed-method approach, this thesis investigates the symbolic meanings consumers ascribe to servicescapes and empirically tests whether the joint effects of congruence between consumer Self and the symbolic meanings ascribed to servicescapes influence consumers’ servicescape preference. First, Study One identifies the symbolic meanings ascribed to salient servicescape attributes using a combination of repertory tests and laddering techniques within 19 semi-structured individual depth interviews. Study Two modifies an existing scale to create a symbolic servicescape meaning scale in order to measure the symbolic meanings ascribed to servicescapes. Finally, Study Three utilises the Self-Congruity Model to empirically examine the joint effects of consumer Self and servicescape on consumers’ preference for servicescapes. Using polynomial regression with response surface analysis, 14 joint effect models demonstrate that both Self-Servicescape incongruity and congruity influence consumers’ preference for servicescapes. 
Combined, the findings of three studies suggest that the symbolic meanings ascribed to servicescapes and their (in)congruities with consumers’ sense of self can be used to predict consumers’ preferences for servicescapes. These findings have several key theoretical and practical contributions to services marketing.
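The polynomial-regression-with-response-surface step can be sketched on synthetic data. The variable names (self-image S, ascribed servicescape meaning M, preference P) and all coefficients below are hypothetical; the sketch only shows the quadratic-with-interaction design matrix that this kind of joint-effect model relies on.

```python
import numpy as np

rng = np.random.default_rng(2)

# Polynomial (response-surface) regression:
# P = b0 + b1*S + b2*M + b3*S^2 + b4*S*M + b5*M^2 + noise.
n = 500
S, M = rng.normal(size=n), rng.normal(size=n)
true_b = np.array([1.0, 0.5, -0.3, 0.2, 0.8, 0.1])
X = np.column_stack([np.ones(n), S, M, S**2, S * M, M**2])
P = X @ true_b + rng.normal(0.0, 0.1, n)

# Ordinary least squares recovers the surface coefficients.
b_hat, *_ = np.linalg.lstsq(X, P, rcond=None)
print(np.round(b_hat, 2))
```

The fitted surface is then examined along the S = M (congruity) and S = -M (incongruity) lines, which is the usual response-surface reading of self-congruity effects.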

Relevance: 90.00%

Abstract:

Background: Associations between sitting-time and physical activity (PA) with depression are unclear. Purpose: To examine concurrent and prospective associations between both sitting-time and PA with prevalent depressive symptoms in mid-aged Australian women. Methods: Data were from 8,950 women, aged 50-55 years in 2001, who completed mail surveys in 2001, 2004, 2007 and 2010. Depressive symptoms were assessed using the Center for Epidemiological Studies Depression questionnaire. Associations between sitting-time (≤4, >4-7, >7 hrs/day) and PA (none, some, meeting guidelines) with depressive symptoms (symptoms/no symptoms) were examined in 2011 in concurrent and lagged mixed effect logistic modeling. Both main effects and interaction models were developed. Results: In main effects modeling, women who sat >7 hrs/day (OR 1.47, 95%CI 1.29-1.67) and women who did no PA (OR 1.99, 95%CI 1.75-2.27) were more likely to have depressive symptoms than women who sat ≤4 hrs/day and who met PA guidelines, respectively. In interaction modeling, the likelihood of depressive symptoms in women who sat >7 hrs/day and did no PA was triple that of women who sat ≤4 hrs/day and met PA guidelines (OR 2.96, 95%CI 2.37-3.69). In prospective main effects and interaction modeling, sitting-time was not associated with depressive symptoms, but women who did no PA were more likely than those who met PA guidelines to have future depressive symptoms (OR 1.26, 95%CI 1.08-1.47). Conclusions: Increasing PA to a level commensurate with PA guidelines can alleviate current depression symptoms and prevent future symptoms in mid-aged women. Reducing sitting-time may ameliorate current symptoms.
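The odds ratios reported above come from logistic models; converting a coefficient and standard error to an OR with a 95% CI is a one-liner. The beta and se below are back-calculated to land near the abstract's sitting-time OR and are purely illustrative, not the study's actual estimates.

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    # Exponentiate a logistic-regression coefficient and its Wald 95% limits.
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

or_, lo, hi = odds_ratio_ci(beta=0.386, se=0.066)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```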

Relevance: 90.00%

Abstract:

A computationally efficient sequential Monte Carlo algorithm is proposed for the sequential design of experiments for the collection of block data described by mixed effects models. The difficulty in applying a sequential Monte Carlo algorithm in such settings is the need to evaluate the observed data likelihood, which is typically intractable for all but linear Gaussian models. To overcome this difficulty, we propose to unbiasedly estimate the likelihood, and perform inference and make decisions based on an exact-approximate algorithm. Two estimates are proposed: using Quasi Monte Carlo methods and using the Laplace approximation with importance sampling. Both of these approaches can be computationally expensive, so we propose exploiting parallel computational architectures to ensure designs can be derived in a timely manner. We also extend our approach to allow for model uncertainty. This research is motivated by important pharmacological studies related to the treatment of critically ill patients.
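The key difficulty named above, unbiasedly estimating an intractable observed-data likelihood by integrating out the random effects, can be sketched in a toy model where the answer is known. Here u ~ N(0, tau^2) and y|u ~ N(u, sigma^2), so p(y) is N(0, tau^2 + sigma^2) in closed form and a plain Monte Carlo estimate can be checked against it. All parameter values are illustrative.

```python
import math
import random

random.seed(3)

tau, sigma, y = 1.0, 0.5, 0.7

def normal_pdf(x, mu, sd):
    return math.exp(-((x - mu) ** 2) / (2.0 * sd**2)) / (sd * math.sqrt(2.0 * math.pi))

# Unbiased estimate of p(y) = E_u[ p(y | u) ] by sampling the random effect.
n = 200_000
est = sum(normal_pdf(y, random.gauss(0.0, tau), sigma) for _ in range(n)) / n
exact = normal_pdf(y, 0.0, math.hypot(tau, sigma))
print(est, exact)
```

The quasi-Monte Carlo and Laplace-with-importance-sampling estimators in the abstract refine this same basic identity to cut the variance.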

Relevance: 90.00%

Abstract:

Introduction: Built environment interventions designed to reduce non-communicable diseases and health inequity complement urban planning agendas focused on creating more ‘liveable’, compact, pedestrian-friendly, less automobile dependent and more socially inclusive cities. However, what constitutes a ‘liveable’ community is not well defined. Moreover, there appears to be a gap between the concept and delivery of ‘liveable’ communities. The recently funded NHMRC Centre of Research Excellence (CRE) in Healthy Liveable Communities, established in early 2014, has defined ‘liveability’ from a social determinants of health perspective. Using purpose-designed multilevel longitudinal data sets, it addresses five themes that address key evidence-base gaps for building healthy and liveable communities. The CRE in Healthy Liveable Communities seeks to generate and exchange new knowledge about: 1) measurement of policy-relevant built environment features associated with leading non-communicable disease risk factors (physical activity, obesity) and health outcomes (cardiovascular disease, diabetes) and mental health; 2) causal relationships and thresholds for built environment interventions using data from longitudinal studies and natural experiments; 3) thresholds for built environment interventions; 4) economic benefits of built environment interventions designed to influence health and wellbeing outcomes; and 5) factors, tools, and interventions that facilitate the translation of research into policy and practice. This evidence is critical to inform future policy and practice in health, land use, and transport planning. Moreover, to ensure policy-relevance and facilitate research translation, the CRE in Healthy Liveable Communities builds upon ongoing, and has established new, multi-sector collaborations with national and state policy-makers and practitioners. 
The symposium will commence with a brief introduction to embed the research within an Australian health and urban planning context, as well as providing an overall outline of the CRE in Healthy Liveable Communities, its structure and team. Next, an overview of the five research themes will be presented. Following these presentations, the Discussant will consider the implications of the research and opportunities for translation and knowledge exchange. Theme 2 will establish whether and to what extent the neighbourhood environment (built and social) is causally related to physical and mental health and associated behaviours and risk factors. In particular, research conducted as part of this theme will use data from large-scale, longitudinal-multilevel studies (HABITAT, RESIDE, AusDiab) to examine relationships that meet causality criteria via statistical methods such as longitudinal mixed-effect and fixed-effect models, multilevel and structural equation models; analyse data on residential preferences to investigate confounding due to neighbourhood self-selection and to use measurement and analysis tools such as propensity score matching and ‘within-person’ change modelling to address confounding; analyse data about individual-level factors that might confound, mediate or modify relationships between the neighbourhood environment and health and well-being (e.g., psychosocial factors, knowledge, perceptions, attitudes, functional status), and; analyse data on both objective neighbourhood characteristics and residents’ perceptions of these objective features to more accurately assess the relative contribution of objective and perceptual factors to outcomes such as health and well-being, physical activity, active transport, obesity, and sedentary behaviour. 
At the completion of the Theme 2, we will have demonstrated and applied statistical methods appropriate for determining causality and generated evidence about causal relationships between the neighbourhood environment, health, and related outcomes. This will provide planners and policy makers with a more robust (valid and reliable) basis on which to design healthy communities.

Relevance: 90.00%

Abstract:

Permeation of gases through single surfactant stabilized aqueous films has previously been studied in view of the potentiality of foam to separate gaseous mixtures. The earlier analysis assumed that the gas phase was well mixed and that the mass-transfer process was completely controlled by the liquid film. Permeabilities evaluated from single film data based on such analysis failed to predict the mass-transfer data obtained on permeation through two films. It is shown that the neglect of gas-phase resistance and the effect of film movement is the reason for the failure of the well-mixed gas models. An exact analysis of diffusion through two films is presented. It successfully predicts the experimental data on two films based on parameters evaluated from single film data.
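The correction described amounts to treating the gas phase and the film as mass-transfer resistances in series: 1/k = 1/k_gas + 1/k_film, so neglecting the gas-phase term overstates the overall coefficient. The coefficient values below are arbitrary illustrative numbers, not the paper's data.

```python
def overall_coefficient(k_gas, k_film):
    # Two mass-transfer resistances in series.
    return 1.0 / (1.0 / k_gas + 1.0 / k_film)

k_gas, k_film = 2.0e-3, 5.0e-4  # arbitrary units
k = overall_coefficient(k_gas, k_film)
print(k)  # smaller than either individual coefficient
```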

Relevance: 90.00%

Abstract:

This article presents the details of estimation of fracture parameters for high strength concrete (HSC, HSC1) and ultra high strength concrete (UHSC). Brief details about characterization of the ingredients of HSC, HSC1 and UHSC have been provided. Experiments have been carried out on beams made of HSC, HSC1 and UHSC considering various sizes and notch depths. Fracture characteristics such as size independent fracture energy (G(f)), size of fracture process zone (C-f), fracture toughness (K-IC) and crack tip opening displacement (CTODc) have been estimated based on the experimental observations. From the studies, it is observed that (i) UHSC has high fracture energy and ductility in spite of having a very low value of C-f; (ii) UHSC is relatively much more homogeneous than other concretes, because of the absence of coarse aggregates and the presence of well-graded smaller size particles; (iii) the critical SIF (K-IC) values increase with beam depth and decrease with notch depth. Generally, there is a significant increase in fracture toughness and CTODc: about 7 times higher in HSC1 and about 10 times higher in UHSC compared to HSC; (iv) for a notch-to-depth ratio of 0.1, Bazant's size effect model slightly overestimates the maximum failure loads compared to experimental observations, while Karihaloo's model slightly underestimates them. For notch-to-depth ratios from 0.2 to 0.4 in UHSC, both size effect models predict similar maximum failure loads compared to the corresponding experimental values.
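Bazant's size effect law, referenced in finding (iv), predicts nominal strength sigma_N = B f_t / sqrt(1 + D / D0) for structure size D (here, beam depth). The parameter values below are hypothetical, chosen only to show the monotone size effect, not fitted to the article's data.

```python
import math

def nominal_strength(D, B=1.5, f_t=5.0, D0=100.0):
    # Bazant's size effect law: nominal strength decreases smoothly with size D,
    # interpolating between a strength criterion (small D) and LEFM scaling
    # (large D).
    return B * f_t / math.sqrt(1.0 + D / D0)

depths = [100.0, 200.0, 400.0]
strengths = [nominal_strength(D) for D in depths]
print(strengths)  # strictly decreasing with beam depth
```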