906 results for computational models


Relevance:

30.00%

Publisher:

Abstract:

In epidemiological work, outcomes are frequently non-normal, sample sizes may be large, and effects are often small. To relate health outcomes to geographic risk factors, fast and powerful methods for fitting spatial models, particularly for non-normal data, are required. We focus on binary outcomes, with the risk surface a smooth function of space. We compare penalized likelihood models, including the penalized quasi-likelihood (PQL) approach, and Bayesian models based on fit, speed, and ease of implementation. A Bayesian model using a spectral basis representation of the spatial surface provides the best tradeoff of sensitivity and specificity in simulations, detecting real spatial features while limiting overfitting and being more efficient computationally than other Bayesian approaches. One of the contributions of this work is further development of this underused representation. The spectral basis model outperforms the penalized likelihood methods, which are prone to overfitting, but is slower to fit and not as easily implemented. Conclusions based on a real dataset of cancer cases in Taiwan are similar albeit less conclusive with respect to comparing the approaches. The success of the spectral basis with binary data and similar results with count data suggest that it may be generally useful in spatial models and more complicated hierarchical models.
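The spectral-basis idea can be sketched in a few lines: represent the smooth spatial risk surface with a low-rank Fourier basis and fit a penalized logistic regression on the basis coefficients. This is a hedged illustration of the general approach, not the paper's implementation; the basis size, ridge penalty, and plain gradient-ascent fitter are all illustrative choices.

```python
import numpy as np

def spectral_basis(coords, K=3):
    """Low-rank Fourier (spectral) basis for a smooth surface on [0, 1]^2.
    Columns: an intercept plus sin/cos terms of the first K frequencies per axis."""
    cols = [np.ones(len(coords))]
    for k in range(1, K + 1):
        for u in (coords[:, 0], coords[:, 1]):
            cols.append(np.sin(2 * np.pi * k * u))
            cols.append(np.cos(2 * np.pi * k * u))
    return np.column_stack(cols)

def fit_penalized_logistic(B, z, penalty=1.0, iters=200, lr=0.5):
    """Ridge-penalized logistic regression on the basis coefficients,
    fit by plain gradient ascent on the penalized log-likelihood."""
    beta = np.zeros(B.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-B @ beta))
        grad = B.T @ (z - p) - penalty * beta
        beta += lr * grad / len(z)
    return beta

rng = np.random.default_rng(0)
coords = rng.uniform(size=(500, 2))                     # spatial locations
true_logit = np.sin(2 * np.pi * coords[:, 0])           # smooth risk surface
z = rng.binomial(1, 1.0 / (1.0 + np.exp(-true_logit)))  # binary outcomes

B = spectral_basis(coords)
beta = fit_penalized_logistic(B, z)
p_hat = 1.0 / (1.0 + np.exp(-B @ beta))                 # estimated risk surface
```

The low-rank basis is what keeps such a model computationally efficient: the number of coefficients grows with the number of retained frequencies, not with the number of observations.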


Generalized linear mixed models (GLMMs) provide an elegant framework for the analysis of correlated data. Because the likelihood has no closed form, GLMMs are often fit by computational procedures such as penalized quasi-likelihood (PQL). Special cases of these models are generalized linear models (GLMs), which are often fit using algorithms like iteratively weighted least squares (IWLS). High computational costs and memory constraints often make it difficult to apply these iterative procedures to data sets with a very large number of cases. This paper proposes a computationally efficient strategy based on the Gauss-Seidel algorithm that iteratively fits sub-models of the GLMM to subsetted versions of the data. Additional gains in efficiency are achieved for Poisson models, commonly used in disease mapping problems, because their special collapsibility property allows data reduction through summaries. Convergence of the proposed iterative procedure is guaranteed for canonical link functions. The strategy is applied to investigate the relationship between ischemic heart disease, socioeconomic status, and age/gender category in New South Wales, Australia, based on outcome data consisting of approximately 33 million records. A simulation study demonstrates the algorithm's reliability in analyzing a data set with 12 million records for a (non-collapsible) logistic regression model.
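As a toy illustration of two ingredients mentioned above (not the paper's Gauss-Seidel strategy itself), the following sketch fits a Poisson GLM by IWLS and then demonstrates the collapsibility property: aggregating counts within covariate patterns, with a log cell-size offset, yields the same estimates as the record-level fit. All names and parameter values are illustrative.

```python
import numpy as np

def iwls_poisson(X, y, offset=None, iters=25):
    """IWLS for a Poisson GLM with canonical log link (optional log-exposure offset).
    Each step solves a weighted least-squares problem on the working response."""
    off = np.zeros(len(y)) if offset is None else offset
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        eta = X @ beta + off
        mu = np.exp(eta)
        z = (eta - off) + (y - mu) / mu       # working response
        XtW = X.T * mu                        # canonical-link weights: Var(y) = mu
        beta = np.linalg.solve(XtW @ X, XtW @ z)
    return beta

rng = np.random.default_rng(1)
g = rng.integers(0, 2, size=10_000)                    # binary covariate
X = np.column_stack([np.ones(10_000), g])
y = rng.poisson(np.exp(0.3 + 0.7 * g))

beta_full = iwls_poisson(X, y)                         # record-level fit

# Collapsibility: aggregate counts per covariate pattern, offset by log cell size
Xc = np.array([[1.0, 0.0], [1.0, 1.0]])
yc = np.array([y[g == 0].sum(), y[g == 1].sum()])
nc = np.array([(g == 0).sum(), (g == 1).sum()])
beta_collapsed = iwls_poisson(Xc, yc, offset=np.log(nc.astype(float)))
```

The two fits solve the same score equations, so the collapsed version recovers the record-level estimates from just two summary rows; this is the data reduction that makes the approach attractive for tens of millions of records.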


In the simultaneous estimation of a large number of related quantities, multilevel models provide a formal mechanism for efficiently using the ensemble of information to derive individual estimates. In this article we investigate the ability of the likelihood to identify the relationship between signal and noise in multilevel linear mixed models; specifically, we consider its ability to diagnose conjugacy or independence between the signals and noises. Our work was motivated by the analysis of data from high-throughput experiments in genomics. The proposed model leads to a more flexible family of models. However, we further demonstrate that adequately capitalizing on the benefits of a well-fitting, fully specified likelihood in terms of gene ranking is difficult.


In linear mixed models, model selection frequently includes the selection of random effects. Two versions of the Akaike information criterion (AIC) have been used, based either on the marginal or on the conditional distribution. We show that the marginal AIC is no longer an asymptotically unbiased estimator of the Akaike information, and in fact favours smaller models without random effects. For the conditional AIC, we show that ignoring estimation uncertainty in the random effects covariance matrix, as is common practice, induces a bias that leads to the selection of any random effect not predicted to be exactly zero. We derive an analytic representation of a corrected version of the conditional AIC, which avoids the high computational cost and imprecision of available numerical approximations. An implementation in an R package is provided. All theoretical results are illustrated in simulation studies, and their impact in practice is investigated in an analysis of childhood malnutrition in Zambia.
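The gap between the marginal and conditional perspectives can be made concrete by counting degrees of freedom. In the sketch below (an illustration of the general idea, not the paper's corrected criterion), a random-intercept fit is viewed as penalized least squares; the effective degrees of freedom tr(H) used by a conditional AIC in the style of Vaida and Blanchard interpolate between the fixed-effects count and the full fixed-plus-random count as shrinkage varies. The design and the fixed variance ratio are assumed for illustration.

```python
import numpy as np

def effective_df(X, Z, lam):
    """Effective degrees of freedom tr(H) of a linear mixed model fit, viewing the
    random effects as ridge-penalized coefficients with penalty lam = s2_e / s2_u."""
    C = np.column_stack([X, Z])                          # combined [X Z] design
    D = np.diag([0.0] * X.shape[1] + [lam] * Z.shape[1]) # penalize only random effects
    H = C @ np.linalg.solve(C.T @ C + D, C.T)            # hat matrix of penalized LS
    return np.trace(H)

rng = np.random.default_rng(2)
groups = np.repeat(np.arange(10), 20)                    # 10 groups, 20 obs each
Z = (groups[:, None] == np.arange(10)).astype(float)     # random-intercept design
X = np.column_stack([np.ones(200), rng.normal(size=200)])

df_strong = effective_df(X, Z, lam=100.0)  # heavy shrinkage: near the 2 fixed effects
df_weak = effective_df(X, Z, lam=0.01)     # light shrinkage: near fixed + random count
```

A conditional AIC then charges roughly `-2 * conditional_loglik + 2 * (tr(H) + 1)`; the bias discussed in the abstract arises because `lam` is itself estimated from the data, while this count treats it as known.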


Clustered data analysis is characterized by the need to describe both systematic variation in a mean model and cluster-dependent random variation in an association model. Marginalized multilevel models embrace the robustness and interpretations of a marginal mean model, while retaining the likelihood inference capabilities and flexible dependence structures of a conditional association model. Although there has been increasing recognition of the attractiveness of marginalized multilevel models, there has been a gap in their practical application arising from a lack of readily available estimation procedures. We extend the marginalized multilevel model to allow for nonlinear functions in both the mean and association aspects. We then formulate marginal models through conditional specifications to facilitate estimation with mixed model computational solutions already in place. We illustrate this approach on a cerebrovascular deficiency crossover trial.


The degree of polarization of a reflected field from active laser illumination can be used for object identification and classification. The goal of this study is to investigate methods for estimating the degree of polarization for reflected fields with active laser illumination, which involves the measurement and processing of two orthogonal field components (complex amplitudes), two orthogonal intensity components, and the total field intensity. We propose to replace interferometric optical apparatuses with a computational approach that estimates the degree of polarization from two orthogonal intensity measurements and total intensity measurements. Cramer-Rao bounds for each of the three sensing modalities are computed under various noise models. Algebraic estimators and maximum-likelihood (ML) estimators are proposed; an active-set algorithm and an expectation-maximization (EM) algorithm are used to compute the ML estimates. The performances of the estimators are compared with each other and with their corresponding Cramer-Rao bounds. Estimators for four-channel polarimeter (intensity interferometer) sensing outperform the orthogonal-intensity and total-intensity estimators. Processing the four intensity channels from a polarimeter, however, requires complicated optical devices, alignment, and four CCD detectors, whereas processing orthogonal-intensity or total-intensity data requires only one or two detectors and a computer. The bounds and estimator performances demonstrate that reasonable estimates can still be obtained from orthogonal-intensity or total-intensity data, making computational sensing a promising way to estimate the degree of polarization.
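A minimal numerical illustration of why intensity-only sensing loses information (an assumed simulation setup, not the study's estimators): from the full field components one can form the coherency matrix and recover the degree of polarization, while the algebraic estimator built from the two mean orthogonal intensities alone is only a lower bound, and collapses entirely when the polarized part lies at 45 degrees to the measurement axes.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 200_000

# Partially polarized field: a fully polarized part at 45 degrees plus an
# unpolarized part (circular complex Gaussian in each component). Assumed model.
a_pol, a_unp = 1.0, 0.7
noise = lambda: (a_unp / np.sqrt(2)) * (rng.normal(size=N) + 1j * rng.normal(size=N))
Ex = a_pol / np.sqrt(2) + noise()
Ey = a_pol / np.sqrt(2) + noise()

# Field-based estimate via the coherency matrix J: P = sqrt(1 - 4 det J / (tr J)^2)
Jxx, Jyy = np.mean(np.abs(Ex) ** 2), np.mean(np.abs(Ey) ** 2)
Jxy = np.mean(Ex * np.conj(Ey))
tr_J = Jxx + Jyy
det_J = Jxx * Jyy - np.abs(Jxy) ** 2
P_field = np.sqrt(1.0 - 4.0 * det_J / tr_J**2)

# Intensity-only algebraic estimate |Ix - Iy| / (Ix + Iy): a lower bound on P,
# blind to the off-diagonal coherence -- here it collapses to about zero.
P_intensity = abs(Jxx - Jyy) / tr_J
```

With these amplitudes the true degree of polarization is 1 / 1.98 (polarized intensity 1 against unpolarized intensity 0.98), which the field-based estimate recovers while the intensity-only one does not.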


Experimental studies on epoxies report that the microstructure consists of highly crosslinked localized regions connected by a dispersed phase of low crosslink density. The various thermo-mechanical properties of epoxies may be affected by this crosslink distribution. Because experiments cannot report the exact number of crosslinked covalent bonds present in the structure, molecular dynamics is used in this work to determine the influence of crosslink distribution on thermo-mechanical properties. Molecular dynamics and molecular mechanics simulations are used to establish well-equilibrated molecular models of the EPON 862-DETDA epoxy system with a range of crosslink densities and various crosslink distributions. Crosslink distributions are varied by forming differently crosslinked localized clusters and then by forming different numbers of crosslinks interconnecting the clusters. Simulations are subsequently used to predict the volume shrinkage, thermal expansion coefficients, and elastic properties of each of the crosslinked systems. The results indicate that elastic properties increase with increasing overall crosslink density and that the thermal expansion coefficient decreases with overall crosslink density, both above and below the glass transition temperature. Elastic moduli and coefficients of linear thermal expansion were found to differ between systems with the same overall crosslink density but different crosslink distributions, indicating an effect of the epoxy nanostructure on physical properties. The thermo-mechanical properties of all the crosslinked systems fall within the range of values reported in the literature.


As an important civil engineering material, asphalt concrete (AC) is commonly used to build road surfaces, airports, and parking lots. With traditional laboratory tests and theoretical equations, it is a challenge to fully understand such a random composite material. Based on the discrete element method (DEM), this research seeks to develop and implement computer models for improving understanding of AC microstructure-based mechanics. Three categories of approaches were developed or employed to simulate the microstructure of AC materials: randomly generated models, idealized models, and image-based models. The image-based models were recommended for accurately predicting AC performance, while the other models were recommended as research tools for obtaining deeper insight into AC microstructure-based mechanics. A viscoelastic micromechanical model was developed to capture viscoelastic interactions within the AC microstructure. Four types of constitutive models were built to address the four categories of interactions within an AC specimen. Each constitutive model consists of three parts representing three different interaction behaviors: a stiffness model (force-displacement relation), a bonding model (shear and tensile strengths), and a slip model (frictional property). Three techniques were developed to reduce the computational time for AC viscoelastic simulations; for typical three-dimensional models, the computational time was reduced from years or months to days or hours. Dynamic modulus and creep stiffness tests were simulated, and methodologies were developed to determine the viscoelastic parameters. The DE models successfully predicted dynamic modulus, phase angles, and creep stiffness over a wide range of frequencies, temperatures, and time spans.
Mineral aggregate morphology characteristics (sphericity, orientation, and angularity) were studied to investigate their impact on AC creep stiffness, and it was found that aggregate characteristics significantly affect creep stiffness. Pavement responses and pavement-vehicle interactions were investigated by simulating pavement sections under a rolling wheel; wheel acceleration, steady rolling, and deceleration were found to significantly affect contact forces. A summary and recommendations are provided in the last chapter, and part of the computer programming code is provided in the appendices.
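Viscoelastic contact laws of the kind described above are commonly parameterized with Burgers (Maxwell plus Kelvin-Voigt) elements. The sketch below, with assumed illustrative parameters rather than values from this work, computes the creep compliance and the dynamic modulus and phase angle of a single Burgers element, the quantities the simulated creep and dynamic modulus tests target.

```python
import numpy as np

def burgers_creep(t, E1, eta1, E2, eta2):
    """Creep compliance J(t) of a Burgers element: a Maxwell unit (E1, eta1)
    in series with a Kelvin-Voigt unit (E2, eta2)."""
    return 1.0 / E1 + t / eta1 + (1.0 / E2) * (1.0 - np.exp(-E2 * t / eta2))

def burgers_dynamic(omega, E1, eta1, E2, eta2):
    """Complex modulus E*(omega) = 1 / J*(omega); returns (|E*|, phase angle, deg)."""
    J_star = 1.0 / E1 + 1.0 / (1j * omega * eta1) + 1.0 / (E2 + 1j * omega * eta2)
    E_star = 1.0 / J_star
    return np.abs(E_star), np.degrees(np.angle(E_star))

# Illustrative parameters only (e.g. moduli in MPa, viscosities in MPa*s)
params = dict(E1=20.0, eta1=500.0, E2=5.0, eta2=50.0)

t = np.array([0.0, 1.0, 10.0, 100.0])
J = burgers_creep(t, **params)                    # compliance grows with load time

omega = np.array([0.1, 1.0, 10.0])
mag, phase = burgers_dynamic(omega, **params)     # stiffer, more elastic at high freq
```

The expected trends are the ones the abstract reports across frequencies and time spans: compliance increases with time, while the dynamic modulus increases and the phase angle decreases with loading frequency.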


Ferroic materials, as notable members of the family of smart materials, have been widely used in sensing, actuation, and control applications. The macroscopic property change of ferroic materials may become remarkably large during a ferroic phase transition, so the macroscopic properties can be tuned by carefully applying a suitable external field (electric, magnetic, or stress). To obtain enhanced physical and/or mechanical properties, different kinds of ferroic composites have been fabricated. The properties of a ferroic composite are determined not only by the properties and relative amounts of the constituent phases, but also by the microstructure of the individual phases, such as phase connectivity, phase size, shape, and spatial arrangement. This dissertation focuses on the computational study of microstructure-property-mechanism relations in two representative ferroic composites: a two-phase particulate magnetoelectric (ME) composite and a polymer matrix ferroelectric composite. The former is a prime example of a ferroic composite exhibiting a new property and functionality that neither constituent phase possesses individually; the latter represents ferroic composites offering property combinations better than those of existing materials. Phase field modeling was employed as the computational tool, and the required models for ferroic composites were developed from existing models for monolithic materials. Extensive simulations were performed to investigate the microstructure-property relations and the underlying mechanisms. In particular, it is found that 0-3 connectivity (an isolated magnetostrictive phase) is necessary for the ME composite to exhibit the ME effect, and that a small but finite electrical conductivity of the isolated magnetic phase can beneficially enhance the ME effect.
It is revealed that the longitudinal and transverse ME coefficients of isotropic 0-3 particulate composites can be effectively tailored by controlling magnetic domain structures, without resort to anisotropic two-phase microstructures. Simulations also show that the macroscopic properties of the ferroelectric-polymer composites depend critically on the ferroelectric phase connectivity but are not sensitive to the sizes and internal grain structures of the ceramic particles. Texturing is found to be critical for exploiting the paraelectric-ferroelectric phase transition and nonlinear polarization behavior in a paraelectric polycrystal and its polymer matrix composite. Additionally, a Diffuse Interface Field model was developed to simulate particle packing and motion in the liquid phase, which is promising for studying the fabrication of particulate-polymer composites.
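As a generic illustration of the phase field machinery (a textbook Allen-Cahn relaxation, not the ferroic composite models developed in the dissertation), the following sketch evolves a one-dimensional order parameter with a double-well bulk energy and a gradient energy term; the mobility M and gradient coefficient kappa are assumed, illustrative values.

```python
import numpy as np

# Allen-Cahn relaxation of domain walls between two variants (phi = -1 and +1)
# on a periodic 1-D grid: d(phi)/dt = -M * (phi^3 - phi - kappa * phi_xx)
n, dx, dt = 128, 1.0, 0.1
M, kappa = 1.0, 1.0
x = np.arange(n) * dx
phi = np.sign(x - n * dx / 2)   # sharp wall at mid-domain (a second wall
                                # sits at the periodic boundary)
for _ in range(2000):
    lap = (np.roll(phi, 1) - 2.0 * phi + np.roll(phi, -1)) / dx**2
    phi -= dt * M * ((phi**3 - phi) - kappa * lap)
# phi now shows smooth tanh-like walls separating well-ordered domains
```

The same ingredients, with polarization or magnetization as the order parameter and electrostatic, magnetostatic, and elastic energies added, underlie the composite models described above.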


What does it mean for a curriculum to be interactive? It encourages student engagement and active participation in both individual and group work. It offers teachers a coherent set of materials to choose from that can enhance their classes. It is the product of ongoing development and continuous improvement based on research and feedback from the field. This paper introduces work in progress from the Center for Excellence in Education, Science, and Technology (CELEST), an NSF Science of Learning Center. Among its many goals, CELEST is developing a unique educational curriculum: an interactive curriculum based upon models of mind and brain. Teachers, administrators, and governments are naturally concerned with how students learn; students are greatly concerned with how minds work, including how to learn. CELEST aims to introduce curricula that not only meet current U.S. standards in mathematics, science, and psychology but also inform plans to improve those standards. Software and support materials are in development and available at http://cns.bu.edu/celest/private/. Interested parties are invited to contact the author for access.


Ontologies and Methods for Interoperability of Engineering Analysis Models (EAMs) in an e-Design Environment. September 2007. Neelima Kanuri, B.S., Birla Institute of Technology and Science, Pilani, India; M.S., University of Massachusetts Amherst. Directed by: Professor Ian Grosse.

Interoperability is the ability of two or more systems to exchange and reuse information efficiently. This thesis presents new techniques for interoperating engineering tools using ontologies as the basis for representing, visualizing, reasoning about, and securely exchanging abstract engineering knowledge between software systems. The specific engineering domain that is the primary focus of this report is the modeling knowledge associated with the development of engineering analysis models (EAMs). This abstract modeling knowledge has been used to support the integration of analysis and optimization tools in iSIGHT-FD, a commercial engineering environment. ANSYS, a commercial FEA tool, has been wrapped as an analysis service available inside iSIGHT-FD. An engineering analysis modeling (EAM) ontology has been developed and instantiated to form a knowledge base for representing analysis modeling knowledge; the instances of the knowledge base are the analysis models of real-world applications. To illustrate how abstract modeling knowledge can be exploited for useful purposes, a cantilever I-beam design optimization problem has been used as a proof-of-concept test bed. Two distinct finite element models of the I-beam are available to analyze a given beam design: a beam-element finite element model with potentially lower accuracy but significantly reduced computational cost, and a high-fidelity, high-cost shell-element finite element model. The goal is to obtain an optimized I-beam design at minimum computational expense. An intelligent knowledge-base tool was developed and implemented in FiPER.
This tool reasons about the modeling knowledge to intelligently shift between the beam and shell element models during an optimization process, selecting the best analysis model for a given optimization design state. In addition to improved interoperability and design optimization, methods are developed and presented that demonstrate the ability to operate on ontological knowledge bases to perform important engineering tasks. One such method is an automatic technical report generation method, which converts the modeling knowledge associated with an analysis model into a flat technical report. The second is a secure knowledge sharing method, which allocates permissions to portions of knowledge in order to control knowledge access and sharing. Acting together, the two methods enable recipient-specific, fine-grained, controlled knowledge viewing and sharing in an engineering workflow integration environment such as iSIGHT-FD, helping to reduce the large-scale inefficiencies in current product design and development cycles caused by poor knowledge sharing and reuse between people and software engineering tools. This work is a significant advance in both the understanding and the application of knowledge integration in a distributed engineering design framework.