956 results for Computer models
Abstract:
Two studies examined relations between groups (humanities and math-science students) that implicitly or explicitly share a common superordinate category (university student). In Experiment 1, 178 participants performed a noninteractive decision-making task during which category salience was manipulated in a 2 (superordinate category salience) x 2 (subordinate category salience) between-groups design. Consistent with the mutual intergroup differentiation model, participants for whom both categories were salient exhibited the lowest levels of bias, whereas bias was strongest when the superordinate category alone was made salient. This pattern of results was replicated in Experiment 2 (N = 135). In addition, Experiment 2 demonstrated that members of subgroups that are nested within a superordinate category are more sensitive to how the superordinate category is represented than are members of subgroups that extend beyond the boundaries of the superordinate category.
Abstract:
Sausage is a protein sequence threading program with remarkable run-time flexibility. Using different scripts, it can calculate protein sequence-structure alignments, search structure libraries, swap force fields, create models from alignments, convert file formats and analyse results. There are several different force fields which might be classed as knowledge-based, although they do not rely on Boltzmann statistics. Different force fields are used for alignment calculations and for the subsequent ranking of calculated models.
Abstract:
A method is presented for including path propagation effects in models of radiofrequency resonators for use in magnetic resonance imaging. The method is based on the use of Helmholtz retarded potentials and extends our previous work on current density models of resonators based on novel inverse finite Hilbert transform solutions to the requisite integral equations. Radiofrequency phase retardation effects are most pronounced at high field strengths (frequencies), as are static field perturbations due to the magnetic materials in the resonators themselves. Both of these effects are investigated, and a novel resonator structure is presented for use in magnetic resonance microscopy.
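The retarded potentials referred to above have the standard Lorenz-gauge form of classical electrodynamics (given here as a general reminder, not as the paper's specific resonator equations):

```latex
\phi(\mathbf{r},t) = \frac{1}{4\pi\epsilon_0}
  \int \frac{\rho\!\left(\mathbf{r}',\, t - |\mathbf{r}-\mathbf{r}'|/c\right)}
            {|\mathbf{r}-\mathbf{r}'|}\, d^3 r'
\qquad
\mathbf{A}(\mathbf{r},t) = \frac{\mu_0}{4\pi}
  \int \frac{\mathbf{J}\!\left(\mathbf{r}',\, t - |\mathbf{r}-\mathbf{r}'|/c\right)}
            {|\mathbf{r}-\mathbf{r}'|}\, d^3 r'
```

The evaluation of the source at the retarded time t - |r - r'|/c is precisely what introduces the phase retardation effects that grow with frequency.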
Abstract:
The concept of rainfall erosivity is extended to the estimation of catchment sediment yield and its variation over time. Five different formulations of rainfall erosivity indices, using annual, monthly and daily rainfall data, are proposed and tested on two catchments in the humid tropics of Australia. Rainfall erosivity indices, using simple power functions of annual and daily rainfall amounts, were found to be adequate in describing the interannual and seasonal variation of catchment sediment yield. The parameter values of these rainfall erosivity indices for catchment sediment yield are broadly similar to those for rainfall erosivity models in relation to the R-factor in the Universal Soil Loss Equation.
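An index of the simple power-function type described above can be sketched as follows; the coefficients `a` and `b` and the rainfall values are illustrative placeholders, not the fitted values from the study.

```python
# Hedged sketch: an erosivity-style index as a power function of daily
# rainfall, E = a * sum(P_d ** b). Coefficients are invented for illustration.
def erosivity_index(daily_rainfall_mm, a=0.1, b=1.5):
    """Sum a simple power law of positive daily rainfall totals (mm)."""
    return a * sum(p ** b for p in daily_rainfall_mm if p > 0)

wet_year = [12.0, 40.0, 5.0, 60.0]   # hypothetical daily totals (mm)
dry_year = [3.0, 8.0, 1.0]
```

Because b > 1, the index weights large daily falls disproportionately, which is what lets it track the seasonal variation of sediment yield better than raw rainfall totals.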
Abstract:
We shall be concerned with the problem of determining quasi-stationary distributions for Markovian models directly from their transition rates Q. We shall present simple conditions for a mu-invariant measure m for Q to be mu-invariant for the transition function, so that if m is finite, it can be normalized to produce a quasi-stationary distribution. (C) 2000 Elsevier Science Ltd. All rights reserved.
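A minimal numerical sketch of the idea (not the paper's conditions): for a birth-death chain absorbed at 0, a quasi-stationary distribution can be approximated by the left eigenvector of the generator Q restricted to the transient states, taken at the eigenvalue with maximal real part. The rates and truncation level below are invented for illustration.

```python
import numpy as np

# Birth-death chain on {0, 1, ..., N}, absorbed at 0; Q is restricted
# to the transient states {1, ..., N} (truncated at N for computation).
N = 20
birth, death = 0.4, 0.5                  # assumed per-individual rates

Q = np.zeros((N, N))
for i in range(N):
    state = i + 1
    if i + 1 < N:
        Q[i, i + 1] = birth * state      # birth: state -> state + 1
    if i - 1 >= 0:
        Q[i, i - 1] = death * state      # death: state -> state - 1
    Q[i, i] = -(birth + death) * state   # exit rate (death from 1 absorbs)

eigvals, vecs = np.linalg.eig(Q.T)       # left eigenvectors of Q
k = int(np.argmax(eigvals.real))         # decay parameter is -eigvals[k]
m = np.abs(vecs[:, k].real)
m /= m.sum()                             # normalised: a mu-invariant measure
```

Here `m` is finite by construction, so normalising it produces the quasi-stationary distribution directly from the transition rates, as in the abstract.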
Abstract:
In this paper we present a model of specification-based testing of interactive systems. This model provides the basis for a framework to guide such testing. Interactive systems are traditionally decomposed into a functionality component and a user interface component; this distinction is termed dialogue separation and is the underlying basis for conceptual and architectural models of such systems. Correctness involves both proper behaviour of the user interface and proper computation by the underlying functionality. Specification-based testing is one method used to increase confidence in correctness, but it has had limited application to interactive system development to date.
Abstract:
An extensive research program focused on the characterization of various metallurgical complex smelting and coal combustion slags is being undertaken. The research combines both experimental and thermodynamic modeling studies. The approach is illustrated by work on the PbO-ZnO-Al2O3-FeO-Fe2O3-CaO-SiO2 system. Experimental measurements of the liquidus and solidus have been undertaken under oxidizing and reducing conditions using equilibration, quenching, and electron probe X-ray microanalysis. The experimental program has been planned so as to obtain data for thermodynamic model development as well as for pseudo-ternary liquidus diagrams that can be used directly by process operators. Thermodynamic modeling has been carried out using the computer system FACT, which contains thermodynamic databases with over 5000 compounds and evaluated solution models. The FACT package is used for the calculation of multiphase equilibria in multicomponent systems of industrial interest. A modified quasi-chemical solution model is used for the liquid slag phase. New optimizations have been carried out, which significantly improve the accuracy of the thermodynamic models for lead/zinc smelting and coal combustion processes. Examples of experimentally determined and calculated liquidus diagrams are presented. These examples provide information of direct relevance to various metallurgical smelting and coal combustion processes.
Abstract:
Three kinds of integrable Kondo problems in one-dimensional extended Hubbard models are studied by means of the boundary graded quantum inverse scattering method. The boundary K matrices depending on the local moments of the impurities are presented as a nontrivial realization of the graded reflection equation algebras acting in a (2s_alpha + 1)-dimensional impurity Hilbert space. Furthermore, these models are solved using the algebraic Bethe ansatz method, and the Bethe ansatz equations are obtained.
Abstract:
Testing ecological models for management is an increasingly important part of the maturation of ecology as an applied science. Consequently, we need to work at applying fair tests of models with adequate data. We demonstrate that a recent test of a discrete time, stochastic model was biased towards falsifying the predictions. If the model was a perfect description of reality, the test falsified the predictions 84% of the time. We introduce an alternative testing procedure for stochastic models, and show that it falsifies the predictions only 5% of the time when the model is a perfect description of reality. The example is used as a point of departure to discuss some of the philosophical aspects of model testing.
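The 84% and 5% figures quoted above are, in effect, estimated false-rejection rates of a test applied to a model that is true by construction. That idea can be sketched with a stand-in model (a Poisson count model and a simulation-based prediction interval, not the authors' ecological model or their actual testing procedure):

```python
import numpy as np

# Estimate how often a test falsifies a model that is, by construction,
# a perfect description of the data-generating process.
rng = np.random.default_rng(0)
lam, n_reps = 20.0, 2000

# Build a 95% prediction interval by simulating from the model itself.
sims = rng.poisson(lam, 100_000)
lo, hi = np.quantile(sims, [0.025, 0.975])

rejections = 0
for _ in range(n_reps):
    obs = rng.poisson(lam)          # data generated by the "true" model
    if not (lo <= obs <= hi):
        rejections += 1
false_rate = rejections / n_reps    # should sit near the nominal 5%
```

A fair test of a true model should reject at roughly its nominal level; a rate like 84% signals that the testing procedure, not the model, is at fault.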
Abstract:
Normal mixture models are being increasingly used to model the distributions of a wide variety of random phenomena and to cluster sets of continuous multivariate data. However, for a set of data containing a group or groups of observations with longer than normal tails or atypical observations, the use of normal components may unduly affect the fit of the mixture model. In this paper, we consider a more robust approach by modelling the data by a mixture of t distributions. The use of the ECM algorithm to fit this t mixture model is described and examples of its use are given in the context of clustering multivariate data in the presence of atypical observations in the form of background noise.
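The ECM fit described above can be sketched in a minimal univariate, two-component form with fixed degrees of freedom (an illustrative re-implementation, not the authors' software; all settings are assumptions):

```python
import numpy as np
from math import gamma, pi, sqrt

def t_pdf(z, nu):
    """Standard Student-t density, vectorised over z."""
    c = gamma((nu + 1) / 2) / (sqrt(nu * pi) * gamma(nu / 2))
    return c * (1.0 + z ** 2 / nu) ** (-(nu + 1) / 2)

def fit_t_mixture(x, nu=4.0, n_iter=200):
    """ECM-style fit of a two-component univariate t mixture, fixed nu."""
    props = np.array([0.5, 0.5])              # mixing proportions
    mu = np.quantile(x, [0.25, 0.75])         # deterministic initial centres
    sigma = np.array([x.std(), x.std()])
    for _ in range(n_iter):
        z = (x[None, :] - mu[:, None]) / sigma[:, None]
        dens = t_pdf(z, nu) / sigma[:, None]
        tau = props[:, None] * dens
        tau /= tau.sum(axis=0, keepdims=True)  # E-step: responsibilities
        u = (nu + 1.0) / (nu + z ** 2)         # E-step: robustness weights
        props = tau.mean(axis=1)               # CM-step: proportions
        w = tau * u
        mu = (w * x).sum(axis=1) / w.sum(axis=1)            # CM-step: means
        d2 = (x[None, :] - mu[:, None]) ** 2
        sigma = np.sqrt((w * d2).sum(axis=1) / tau.sum(axis=1))  # CM-step: scales
    return props, mu, sigma
```

The weights `u` shrink toward zero for points far from a component's centre, which is exactly how the t mixture resists the atypical observations that would distort a normal mixture.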
Abstract:
We investigate the internal dynamics of two cellular automaton models with heterogeneous strength fields and differing nearest neighbour laws. One model is a crack-like automaton, transferring all stress from a rupture zone to the surroundings. The other is a partial stress drop automaton, transferring only a fraction of the stress within a rupture zone to the surroundings. To study the evolution of stress, the mean spectral density f(k_r) of a stress deficit field is examined prior to, and immediately following, ruptures in both models. Both models display a power-law relationship between f(k_r) and spatial wavenumber k_r of the form f(k_r) ~ k_r^(-beta). In the crack model, the evolution of stress deficit is consistent with cyclic approach to, and retreat from, a critical state in which large events occur. The approach to criticality is driven by tectonic loading. Short-range stress transfer in the model does not affect the approach to criticality of broad regions in the model. The evolution of stress deficit in the partial stress drop model is consistent with small fluctuations about a mean state of high stress, behaviour indicative of a self-organised critical system. Despite statistics similar to those of natural earthquakes, these simplified models lack a physical basis. Physically motivated models of earthquakes also display dynamical complexity similar to that of a critical point system. Studies of dynamical complexity in physical models of earthquakes may lead to advancement towards a physical theory for earthquakes.
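A toy one-dimensional analogue of the two nearest-neighbour laws can be sketched as follows (all parameters invented, with a small dissipation added so cascades terminate; this is not the authors' implementation):

```python
import numpy as np

def run_automaton(n_cells=100, steps=500, drop_fraction=1.0, seed=0):
    """Toy 1-D stress-transfer automaton with a heterogeneous strength field.

    drop_fraction = 1.0 mimics the crack-like rule (all stress dropped);
    drop_fraction < 1.0 mimics the partial stress drop rule.
    """
    rng = np.random.default_rng(seed)
    strength = rng.uniform(1.0, 2.0, n_cells)   # heterogeneous strengths
    stress = np.zeros(n_cells)
    event_sizes = []
    for _ in range(steps):
        stress += 0.01                           # uniform tectonic loading
        failed = stress >= strength
        size = 0
        while failed.any():                      # cascade of failures
            size += int(failed.sum())
            drop = drop_fraction * stress[failed]
            stress[failed] -= drop
            idx = np.flatnonzero(failed)
            transfer = 0.4 * drop                # 40% to each neighbour,
            stress[(idx - 1) % n_cells] += transfer   # 20% dissipated
            stress[(idx + 1) % n_cells] += transfer
            failed = stress >= strength
        if size:
            event_sizes.append(size)
    return event_sizes
```

Comparing event-size sequences for `drop_fraction=1.0` against a partial value such as `0.25` gives a cheap way to see how the two rules shape the resulting statistics.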
Abstract:
The evolution of event time and size statistics in two heterogeneous cellular automaton models of earthquake behavior is studied and compared to the evolution of these quantities during observed periods of accelerating seismic energy release prior to large earthquakes. The two automata have different nearest neighbor laws, one of which produces self-organized critical (SOC) behavior (PSD model) and the other of which produces quasi-periodic large events (crack model). In the PSD model, periods of accelerating energy release before large events are rare. In the crack model, many large events are preceded by periods of accelerating energy release. When compared to randomized event catalogs, accelerating energy release before large events occurs more often than random in the crack model but less often than random in the PSD model; it is easier to tell the crack and PSD model results apart from each other than to tell either model apart from a random catalog. The evolution of event sizes during the accelerating energy release sequences in all models is compared to that of observed sequences. The accelerating energy release sequences in the crack model consist of an increase in the rate of events of all sizes, consistent with observations from a small number of natural cases, but inconsistent with a larger number of cases in which there is an increase in the rate of only moderate-sized events. On average, no increase in the rate of events of any size is seen before large events in the PSD model.
Abstract:
A case sensitive intelligent model editor has been developed for constructing consistent lumped dynamic process models and for simplifying them using modelling assumptions. The approach is based on a systematic assumption-driven modelling procedure and on the syntax and semantics of process models and the simplifying assumptions.
Abstract:
Examples from the Murray-Darling basin in Australia are used to illustrate different methods of disaggregation of reconnaissance-scale maps. One approach to disaggregation revolves around the de-convolution of the soil-landscape paradigm elaborated during a soil survey. The descriptions of soil map units and block diagrams in a soil survey report detail soil-landscape relationships or soil toposequences that can be used to disaggregate map units into component landscape elements. Toposequences can be visualised on a computer by combining soil maps with digital elevation data. Expert knowledge or statistics can be used to implement the disaggregation. Use of a restructuring element and k-means clustering are illustrated. Another approach to disaggregation uses training areas to develop rules to extrapolate detailed mapping into other, larger areas where detailed mapping is unavailable. A two-level decision tree example is presented. At one level, the decision tree method is used to capture mapping rules from the training area; at another level, it is used to define the domain over which those rules can be extrapolated. (C) 2001 Elsevier Science B.V. All rights reserved.
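The k-means step can be sketched on invented terrain attributes (a NumPy-only version, not the software used in the study; in practice the points would be DEM-derived attributes such as elevation and slope for cells within one map unit):

```python
import numpy as np

def kmeans(points, k, n_iter=50, seed=0):
    """Minimal k-means: cluster attribute vectors into k landscape elements."""
    rng = np.random.default_rng(seed)
    pts = np.asarray(points, dtype=float)
    centres = pts[rng.choice(len(pts), k, replace=False)].copy()
    for _ in range(n_iter):
        # assign each point to its nearest centre
        d = np.linalg.norm(pts[:, None, :] - centres[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # recompute each centre as the mean of its assigned points
        for j in range(k):
            if np.any(labels == j):
                centres[j] = pts[labels == j].mean(axis=0)
    return labels, centres
```

Each resulting label set partitions the map unit into clusters of similar terrain position, which is the component-landscape-element split the disaggregation needs.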