916 results for Computational Catastrophes


Relevance: 10.00%

Abstract:

The paper presents a theory for modeling flow in anisotropic, viscous rock. The theory was originally developed for the simulation of large-deformation processes, including the folding and kinking of multi-layered visco-elastic rock (Muhlhaus et al. [1,2]). The orientation of slip planes, in the context of crystallographic slip, is determined by the normal vector (the director) of these surfaces. The model is applied to simulate anisotropic mantle convection. We compare the evolution of flow patterns, Nusselt number and director orientations for isotropic and anisotropic rheologies. In the simulations we utilize two different finite element methodologies: the Lagrangian Integration Point method of Moresi et al. [8], and an Eulerian formulation, which we implemented in the finite-element-based PDE solver Fastflo (www.cmis.csiro.au/Fastflo/). We used two different finite element codes firstly to study the influence of an anisotropic power-law rheology, which is not currently implemented in the Lagrangian Integration Point scheme [8], and secondly to compare the numerical performance of the Eulerian (Fastflo) and Lagrangian [8] integration schemes. It turned out that, whereas in the Lagrangian method the Nusselt number versus time plot reached only a quasi-steady state, in which the Nusselt number oscillates around a steady-state value, the Eulerian scheme reaches exact steady states and produces a high degree of alignment (director orientation locally orthogonal to the velocity vector almost everywhere in the computational domain). In the simulations, emergent anisotropy was strongest, in terms of modulus contrast, in the up- and down-welling plumes. Mechanisms for anisotropic material behavior in the mantle dynamics context are discussed by Christensen [3]. The dominant mineral phases in the mantle generally do not exhibit strong elastic anisotropy, but they may still be oriented by the convective flow. Thus viscous anisotropy (the main focus of this paper) may or may not correlate with elastic or seismic anisotropy.

Relevance: 10.00%

Abstract:

We introduced a spectral clustering algorithm based on the bipartite graph model for the Manufacturing Cell Formation problem in [Oliveira S, Ribeiro JFF, Seok SC. A spectral clustering algorithm for manufacturing cell formation. Computers and Industrial Engineering. 2007 [submitted for publication]]. It constructs two similarity matrices: one for parts and one for machines. The algorithm then runs spectral clustering on each matrix separately to find families of parts and cells of machines. The similarity measure in that approach used only limited information between parts and between machines. This paper reviews several well-known similarity measures that have been used for Group Technology. Computational clustering results are compared by various performance measures. (C) 2008 The Society of Manufacturing Engineers. Published by Elsevier Ltd. All rights reserved.
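The two-step scheme the abstract describes (build a similarity matrix, then cluster it spectrally, for machines and parts separately) can be sketched in a few lines. This is a minimal illustration on a made-up machine-part incidence matrix, using Jaccard similarity and a two-way split by the sign of the Fiedler vector; the paper's own similarity measures and clustering details differ.

```python
import numpy as np

# Toy machine-part incidence matrix (rows: machines, cols: parts); hypothetical data.
A = np.array([
    [1, 1, 0, 0, 0],
    [1, 1, 1, 0, 0],
    [0, 0, 0, 1, 1],
    [0, 0, 1, 1, 1],
])

def jaccard_similarity(X):
    """Pairwise Jaccard similarity between the rows of a binary matrix."""
    inter = X @ X.T
    row_sums = X.sum(axis=1)
    union = row_sums[:, None] + row_sums[None, :] - inter
    return inter / np.maximum(union, 1)

def fiedler_bipartition(S):
    """Two-way split by the sign of the Fiedler vector of L = D - S."""
    D = np.diag(S.sum(axis=1))
    L = D - S
    vals, vecs = np.linalg.eigh(L)     # eigenvalues in ascending order
    fiedler = vecs[:, 1]               # eigenvector of the 2nd-smallest eigenvalue
    return (fiedler >= 0).astype(int)

machine_cells = fiedler_bipartition(jaccard_similarity(A))
part_families = fiedler_bipartition(jaccard_similarity(A.T))
print(machine_cells, part_families)
```

Rows that receive the same label form a machine cell; the same split on the transposed matrix yields the part families.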

Relevance: 10.00%

Abstract:

A graph clustering algorithm constructs groups of closely related parts and machines separately. After the groups are matched to minimize intercell moves, a refining process runs on the initial cell formation to decrease the number of intercell moves further. A simple modification of this main approach can handle some practical constraints, such as the popular constraint of bounding the maximum number of machines in a cell. Our approach yields a large improvement in computational time. More importantly, the number of intercell moves also improves when the computational results are compared with the best known solutions from the literature. (C) 2009 Elsevier Ltd. All rights reserved.
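The objective above, the number of intercell moves, has a simple concrete reading: count the operations (ones in the machine-part incidence matrix) whose machine sits in a different cell than the part's family. A small sketch with hypothetical data and assignments:

```python
import numpy as np

# Toy incidence matrix: A[m, p] == 1 if part p needs an operation on machine m.
A = np.array([
    [1, 1, 0, 0],
    [1, 1, 1, 0],
    [0, 0, 1, 1],
])
cell = np.array([0, 0, 1])       # cell assigned to each machine (hypothetical)
family = np.array([0, 0, 1, 1])  # family assigned to each part (hypothetical)

def intercell_moves(A, cell, family):
    """Count operations falling outside the cell of the part's family."""
    mismatch = cell[:, None] != family[None, :]
    return int(np.sum(A * mismatch))

print(intercell_moves(A, cell, family))
```

Here only the operation of part 2 on machine 1 crosses cells, so the count is 1; a refinement step would try reassignments that lower this count.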

Relevance: 10.00%

Abstract:

Background: A major goal in the post-genomic era is to identify and characterise disease susceptibility genes and to apply this knowledge to disease prevention and treatment. Rodents and humans have remarkably similar genomes and share closely related biochemical, physiological and pathological pathways. In this work we utilised the latest information on the mouse transcriptome as revealed by the RIKEN FANTOM2 project to identify novel human disease-related candidate genes. We define a new term, patholog, to mean a homolog of a human disease-related gene encoding a product (transcript, anti-sense or protein) potentially relevant to disease. Rather than just focus on Mendelian inheritance, we applied the analysis to all potential pathologs regardless of their inheritance pattern. Results: Bioinformatic analysis and human curation of 60,770 RIKEN full-length mouse cDNA clones produced 2,578 sequences that showed similarity (70-85% identity) to known human-disease genes. Using a newly developed biological information extraction and annotation tool (FACTS) in parallel with human expert analysis of 17,051 MEDLINE scientific abstracts, we identified 182 novel potential pathologs. Of these, 36 were identified by computational tools only, 49 by human expert analysis only and 97 by both methods. These pathologs were related to neoplastic (53%), hereditary (24%), immunological (5%), cardio-vascular (4%) or other (14%) disorders. Conclusions: Large-scale genome projects continue to produce a vast amount of data with potential application to the study of human disease. For this potential to be realised we need intelligent strategies for data categorisation and the ability to link sequence data with relevant literature. This paper demonstrates the power of combining human expert annotation with FACTS, a newly developed bioinformatics tool, to identify novel pathologs from within large-scale mouse transcript datasets.

Relevance: 10.00%

Abstract:

A two-component survival mixture model is proposed to analyse a set of ischaemic stroke-specific mortality data. The survival experience of stroke patients after the index stroke may be described by a subpopulation of patients in the acute condition and another subpopulation of patients in the chronic phase. To adjust for the inherent correlation of observations due to random hospital effects, a mixture model of two survival functions with random effects is formulated. Assuming a Weibull hazard in both components, an EM algorithm is developed for the estimation of fixed effect parameters and variance components. A simulation study is conducted to assess the performance of the two-component survival mixture model estimators. Simulation results confirm the applicability of the proposed model in a small sample setting. Copyright (C) 2004 John Wiley & Sons, Ltd.
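The model structure described above, a two-component mixture of Weibull survival functions (an acute and a chronic subpopulation), can be sketched as follows. The parameters and the mixing weight are hypothetical, and the sketch omits the random hospital effects and the EM estimation step:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters: an "acute" component with short survival and a
# "chronic" component with long survival, mixed with weight p.
p = 0.3                       # proportion of acute-phase patients
shape_a, scale_a = 1.5, 2.0   # acute Weibull component
shape_c, scale_c = 1.2, 12.0  # chronic Weibull component

def mixture_survival(t):
    """S(t) = p * S_acute(t) + (1 - p) * S_chronic(t), Weibull components."""
    s_a = np.exp(-(t / scale_a) ** shape_a)
    s_c = np.exp(-(t / scale_c) ** shape_c)
    return p * s_a + (1 - p) * s_c

def sample(n):
    """Draw survival times: pick a component, then invert its Weibull CDF."""
    acute = rng.random(n) < p
    u = rng.random(n)
    shape = np.where(acute, shape_a, shape_c)
    scale = np.where(acute, scale_a, scale_c)
    return scale * (-np.log(u)) ** (1.0 / shape)

times = sample(50_000)
# The empirical survival fraction at t = 5 should track the model value.
print(mixture_survival(5.0), np.mean(times > 5.0))
```

An EM fit would alternate between posterior component memberships for each patient (E-step) and weighted Weibull maximum-likelihood updates (M-step).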

Relevance: 10.00%

Abstract:

Knowledge of thermochemical parameters such as the enthalpy of formation, gas-phase basicity, and proton affinity may be the key to understanding molecular reactivity. Obtaining these thermochemical parameters from theoretical chemical models may be advantageous when experimental measurements are difficult to accomplish. The development of ab initio composite models represents a major advance in the calculation of these thermochemical parameters, but these methods do not always lead to accurate values. Aiming at a comparison between the ab initio models and the hybrid models based on density functional theory (DFT), we have studied gamma-butyrolactone and 2-pyrrolidinone with the goal of obtaining high-quality thermochemical parameters, using the composite chemical models G2, G2MP2, MP2, G3, CBS-Q, CBS-4, and CBS-QB3; the DFT methods B3LYP, B3P86, PW91PW91, mPW1PW, and B98; and the basis sets 6-31G(d), 6-31+G(d), 6-31G(d,p), 6-31+G(d,p), 6-31++G(d,p), 6-311G(d), 6-311+G(d), 6-311G(d,p), 6-311+G(d,p), 6-311++G(d,p), aug-cc-pVDZ, and aug-cc-pVTZ. Values obtained for the enthalpies of formation, proton affinity, and gas-phase basicity of the two target molecules were compared to the experimental data reported in the literature. The best results were achieved with the DFT models, and the B3LYP method led to the most accurate data.

Relevance: 10.00%

Abstract:

We construct the Drinfeld twists (factorizing F-matrices) for the supersymmetric t-J model. Working in the basis provided by the F-matrix (i.e. the so-called F-basis), we obtain completely symmetric representations of the monodromy matrix and the pseudo-particle creation operators of the model. These enable us to resolve the hierarchy of the nested Bethe vectors for the gl(2|1) invariant t-J model.

Relevance: 10.00%

Abstract:

PREDBALB/c is a computational system that predicts peptides binding to the major histocompatibility complex of the H2^d haplotype of the BALB/c mouse, an important laboratory model organism. The predictions cover the complete set of H2^d class I (H2-K^d, H2-L^d and H2-D^d) and class II (I-E^d and I-A^d) molecules. The prediction system utilizes quantitative matrices, which were rigorously validated using experimentally determined binders and non-binders and also by in vivo studies using viral proteins. The prediction performance of PREDBALB/c is highly accurate. To our knowledge, this is the first online server for the prediction of peptides binding to a complete set of major histocompatibility complex molecules in a model organism (H2^d haplotype). PREDBALB/c is available at http://antigen.i2r.a-star.edu.sg/predBalbc/.
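Scoring a peptide against a quantitative matrix, as the system above does, amounts to summing a per-position contribution for each residue. A minimal sketch with a made-up three-position matrix (the values and the penalty for unseen residues are hypothetical, not PREDBALB/c's):

```python
# Illustrative only: a made-up 3-position quantitative matrix,
# not the actual PREDBALB/c matrices.
matrix = {  # position -> residue -> score contribution (hypothetical values)
    0: {"A": 1.2, "G": -0.5, "L": 0.3},
    1: {"A": -0.2, "G": 0.8, "L": 1.5},
    2: {"A": 0.1, "G": -1.0, "L": 0.9},
}

def score_peptide(peptide, qm, default=-2.0):
    """Sum the per-position matrix scores; unseen residues get `default`."""
    return sum(qm[i].get(res, default) for i, res in enumerate(peptide))

print(score_peptide("ALL", matrix))
```

A binding threshold on this score, calibrated against known binders and non-binders, then separates predicted binders from the rest.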

Relevance: 10.00%

Abstract:

Carbon monoxide, the chief killer in fires, and other species are modelled for a series of enclosure fires. The conditions emulate building fires in which CO is formed in the rich, turbulent, nonpremixed flame and is transported frozen to lean mixtures by the ceiling jet, which is cooled by radiation and dilution. Conditional moment closure modelling is used, and computational-domain minimisation criteria are developed which reduce the computational cost of this method. The predictions give good agreement for CO and other species in the lean, quenched-gas stream, holding promise that this method may provide a practical means of modelling real, three-dimensional fire situations. (c) 2005 The Combustion Institute. Published by Elsevier Inc. All rights reserved.

Relevance: 10.00%

Abstract:

The main problem with current approaches to quantum computing is the difficulty of establishing and maintaining entanglement. A Topological Quantum Computer (TQC) aims to overcome this by using different physical processes that are topological in nature and less susceptible to disturbance by the environment. In a (2+1)-dimensional system, pseudoparticles called anyons have statistics that fall somewhere between bosons and fermions. The exchange of two anyons, an effect called braiding in knot theory, can occur in two different ways. The quantum states corresponding to the two elementary braids constitute a two-state system, allowing the definition of a computational basis. Quantum gates can be built up from patterns of braids, and for quantum computing it is essential that the operator describing the braiding, the R-matrix, be unitary. The physics of anyonic systems is governed by quantum groups, in particular the quasi-triangular Hopf algebras obtained from finite groups by the application of the Drinfeld quantum double construction. Their representation theory has been described in detail by Gould and Tsohantjis, and in this review article we relate the work of Gould to TQC schemes, particularly that of Kauffman.
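The unitarity requirement on the braiding operator can be checked numerically. As an illustration, the widely quoted Fibonacci-anyon matrices (one common phase convention; conventions vary across the literature, and this sketch is not tied to the quantum-double construction discussed in the article) give a unitary R-matrix that also satisfies the braid relation sigma1 sigma2 sigma1 = sigma2 sigma1 sigma2:

```python
import numpy as np

phi = (1 + np.sqrt(5)) / 2   # golden ratio
tau = 1 / phi                # satisfies tau + tau**2 == 1

# Elementary braid of two Fibonacci anyons, diagonal in the fusion basis
# (phases follow one common convention from the literature).
R = np.diag([np.exp(-4j * np.pi / 5), np.exp(3j * np.pi / 5)])

# Basis change between the two fusion orders; F is real, symmetric, F @ F = I.
F = np.array([[tau, np.sqrt(tau)],
              [np.sqrt(tau), -tau]])

sigma1 = R            # braid anyons 1 and 2
sigma2 = F @ R @ F    # braid anyons 2 and 3, conjugated into the same basis

def is_unitary(U):
    return np.allclose(U.conj().T @ U, np.eye(len(U)))

print(is_unitary(sigma1), is_unitary(sigma2))
print(np.allclose(sigma1 @ sigma2 @ sigma1, sigma2 @ sigma1 @ sigma2))
```

Products of sigma1, sigma2 and their inverses remain unitary, which is what allows long braid patterns to act as quantum gates.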

Relevance: 10.00%

Abstract:

Colonius suggests that, in using standard set theory as the language in which to express our computational-level theory of human memory, we would need to violate the axiom of foundation in order to express meaningful memory bindings in which a context is identical to an item in the list. We circumvent Colonius's objection by allowing that a list item may serve as a label for a context without being identical to that context. This debate serves to highlight the value of specifying memory operations in set theoretic notation, as it would have been difficult if not impossible to formulate such an objection at the algorithmic level.

Relevance: 10.00%

Abstract:

The refinement calculus provides a framework for the stepwise development of imperative programs from specifications. In this paper we study a refinement calculus for deriving logic programs. Dealing with logic programs rather than imperative programs has the dual advantages that, due to the expressive power of logic programs, the final program is closer to the original specification, and each refinement step can achieve more. Together these reduce the overall number of derivation steps. We present a logic programming language extended with specification constructs (including general predicates, assertions, and types and invariants) to form a wide-spectrum language. General predicates allow non-executable properties to be included in specifications. Assertions, types and invariants make assumptions about the intended inputs of a procedure explicit, and can be used during refinement to optimize the constructed logic program. We provide a semantics for the extended logic programming language and derive a set of refinement laws. Finally we apply these to an example derivation.

Relevance: 10.00%

Abstract:

Transport in bidisperse adsorbents is investigated here, incorporating a two-dimensional model for adsorbate diffusion in the microparticles. This treatment permits consideration of the macropore concentration variation around the microparticle surface, and thereby predicts an adsorbate through-flux on the macroscopic coordinate. Such a through-flux has been postulated earlier in the literature, but with an unrealistic mechanistic justification. The new model therefore resolves the existing ambiguity in this regard, and covers the entire spectrum of behaviour between microparticle and macropore diffusion control. Computational results show that if the macroscopic adsorbate flux, which is ignored in the conventional analysis, makes a significant contribution to the total flux under macropore-control conditions, then it is always important, even when the microparticle diffusion resistance is not negligible. The effect of various parameters, such as relative microparticle size and isotherm heterogeneity, on the uptake is also studied and discussed. (C) 1997 Elsevier Science Ltd.

Relevance: 10.00%

Abstract:

Nursing diagnoses associated with alterations of urinary elimination require different interventions. Nurses who are not specialists require support to diagnose and manage patients with disturbances of urine elimination. The aim of this study was to present a model based on fuzzy logic for the differential diagnosis of alterations in urinary elimination, considering the nursing diagnoses approved by the North American Nursing Diagnosis Association, 2001-2002. Fuzzy relations and the maximum-minimum composition approach were used to develop the system. The model's performance was evaluated on 195 cases from the database of a previous study, resulting in 79.0% total concordance and 19.5% partial concordance when compared with the panel of experts. Total discordance was observed in only three cases (1.5%). The agreement between the model and the experts was excellent (kappa = 0.98, P < .0001) or substantial (kappa = 0.69, P < .0001) when considering overestimative accordance (accordance when at least one diagnosis was equal) and underestimative discordance (discordance when at least one diagnosis was different), respectively. The model presented here showed good performance and a simple theoretical structure, therefore demanding few computational resources.
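The maximum-minimum composition the system relies on combines a patient's symptom membership degrees with a symptom-diagnosis fuzzy relation. A minimal sketch, with an entirely hypothetical relation matrix (not the values from the study):

```python
import numpy as np

# Hypothetical fuzzy relation between signs/symptoms (rows) and
# diagnoses (columns); membership degrees in [0, 1], illustrative only.
R = np.array([
    [0.9, 0.2, 0.1],
    [0.4, 0.8, 0.3],
    [0.1, 0.5, 0.7],
])

def max_min_compose(x, R):
    """Fuzzy max-min composition: mu_j = max_i min(x_i, R[i, j])."""
    return np.max(np.minimum(x[:, None], R), axis=0)

patient = np.array([0.8, 0.3, 0.6])   # degree to which each symptom is present
memberships = max_min_compose(patient, R)
diagnosis = int(np.argmax(memberships))
print(memberships, diagnosis)
```

The output is a membership degree for each candidate diagnosis; taking the maximum gives the differential diagnosis, which is why the method needs so few computational resources.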

Relevance: 10.00%

Abstract:

In this paper, we present a fuzzy approach to the Reed-Frost model for epidemic spreading that takes into account uncertainties in the diagnosis of the infection. The heterogeneity of the infected group is based on the clinical signals of the individuals (symptoms, laboratory exams, medical findings, etc.), which are incorporated into the dynamics of the epidemic. The infectivity level is time-varying, and the classification of the individuals is performed through fuzzy relations. Simulations of a real problem, with data from a viral epidemic in a children's daycare centre, are performed, and the results are compared with a stochastic Reed-Frost generalization.
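For reference, the classical Reed-Frost recursion that the fuzzy approach generalizes can be sketched in its deterministic (expected-value) form; the fuzzy version replaces the fixed contact probability with a time-varying, classification-dependent infectivity. The parameters below are arbitrary:

```python
# Classical (deterministic) Reed-Frost recursion: each susceptible escapes
# infection from each current case independently with probability 1 - p.
def reed_frost(s0, c0, p, steps):
    s, c = float(s0), float(c0)
    history = [(s, c)]
    for _ in range(steps):
        risk = 1.0 - (1.0 - p) ** c   # prob. of at least one infectious contact
        new_cases = s * risk
        s, c = s - new_cases, new_cases
        history.append((s, c))
    return history

traj = reed_frost(s0=30, c0=1, p=0.05, steps=10)
print(traj[-1])
```

The susceptible pool can only shrink, and each generation's cases are drawn entirely from it, which the assertions below reflect.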