36 results for Discrete choice analysis
Abstract:
Map algebra is a data model and simple functional notation for studying the distribution and patterns of spatial phenomena. It uses a uniform representation of space as discrete grids, which are organized into layers. This paper discusses extensions to map algebra that handle neighborhood operations with a new data type called a template. Templates provide general windowing operations on grids to enable spatial models for cellular automata, mathematical morphology, and local spatial statistics. A programming language for map algebra, called MapScript, that incorporates templates and special processing constructs is described. Example program scripts are presented that perform diverse neighborhood analyses for descriptive, model-based and process-based analysis.
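MapScript itself is not reproduced in the abstract; as a generic illustration of the template (focal windowing) operation it describes, here is a minimal Python/NumPy sketch, with all function and variable names illustrative:

```python
import numpy as np

def focal_apply(grid, template):
    """Neighborhood (focal) operation: for each cell, sum the products of
    the template weights with the cells of the surrounding window."""
    th, tw = template.shape
    ph, pw = th // 2, tw // 2
    # Zero-pad so edge cells also have a full window.
    padded = np.pad(grid, ((ph, ph), (pw, pw)), mode="constant")
    out = np.empty(grid.shape, dtype=float)
    for i in range(grid.shape[0]):
        for j in range(grid.shape[1]):
            out[i, j] = np.sum(padded[i:i + th, j:j + tw] * template)
    return out

grid = np.arange(9, dtype=float).reshape(3, 3)
mean3x3 = focal_apply(grid, np.full((3, 3), 1.0 / 9))  # 3x3 focal mean
```

Cellular automata, morphological operators and local spatial statistics differ mainly in the template weights and the reduction applied to each window.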
Abstract:
The acousto-ultrasonic (AU) input-output characteristics for contact-type transmitting and receiving transducers coupled to composite laminated plates are considered in this paper. Combining a multiple integral transform method, an ordinary discrete layer theory for the laminates and some simplifying assumptions for the electro-mechanical transduction behaviour of the transducers, an analytical solution is developed which can deal with all the wave processes involved in the AU measurement system, i.e., wave generation, wave propagation and wave reception. The spectral response of the normal contact pressure sensed by the receiving transducer due to an arbitrary input pulse excited by the transmitting transducer is obtained. To validate the new analytical-numerical spectral technique in the low-frequency regime, the results are compared with Mindlin plate theory solutions. Based on the analytical results, numerical calculations are carried out to investigate the influence of various external parameters such as frequency content of the input pulse, transmitter/receiver spacing and transducer aperture on the output of the measurement system. The results show that the presented analytical-numerical procedure is an effective tool for understanding the input-output characteristics of the AU technique for laminated plates. (C) 2001 Elsevier Science Ltd. All rights reserved.
Abstract:
For the improvement of genetic material suitable for on-farm use under low-input conditions, participatory and formal plant breeding strategies are frequently presented as competing options. A common frame of reference to phrase mechanisms and purposes related to breeding strategies will facilitate clearer descriptions of similarities and differences between participatory plant breeding and formal plant breeding. In this paper an attempt is made to develop such a common framework by means of a statistically inspired language that acknowledges the importance of both on-farm trials and research centre trials as sources of information for on-farm genetic improvement. Key concepts are the genetic correlation between environments, and the heterogeneity of phenotypic and genetic variance over environments. Classic selection response theory is taken as the starting point for the comparison of selection trials (on-farm and research centre) with respect to the expected genetic improvement in a target environment (low-input farms). The variance-covariance parameters that form the input for selection response comparisons traditionally come from a mixed model fit to multi-environment trial data. In this paper we propose a recently developed class of mixed models, namely multiplicative mixed models, also called factor-analytic models, for modelling genetic variances and covariances (correlations). Multiplicative mixed models allow genetic variances and covariances to be dependent on quantitative descriptors of the environment, and confer a high flexibility in the choice of variance-covariance structure, without requiring the estimation of a prohibitively high number of parameters. As a result, detailed considerations regarding selection response comparisons are facilitated. The statistical machinery involved is illustrated on an example data set consisting of barley trials from the International Center for Agricultural Research in the Dry Areas (ICARDA).
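The classic selection response theory this abstract starts from can be summarised by the textbook correlated-response formula (standard quantitative-genetics notation, not necessarily the paper's own):

```latex
% Expected correlated response in target environment T
% when selection is performed in environment S:
CR_T \;=\; i \, h_S \, r_G \, \sigma_{G_T}
```

where $i$ is the selection intensity, $h_S$ the square root of the heritability in the selection environment, $r_G$ the genetic correlation between the selection and target environments, and $\sigma_{G_T}$ the genetic standard deviation in the target environment. Comparing on-farm with research-centre selection amounts to comparing these products across candidate selection environments.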
Analysis of the example data showed that participatory plant breeding and formal plant breeding are better interpreted as providing complementary rather than competing information.
Abstract:
This paper tests the explanatory capacities of different versions of new institutionalism by examining the Australian case of a general transition in central banking practice and monetary politics: namely, the increased emphasis on low inflation and central bank independence. Standard versions of rational choice institutionalism largely dominate the literature on the politics of central banking, but this approach (here termed RC1) fails to account for Australian empirics. RC1 has a tendency to establish actor preferences exogenously to the analysis; actors' motives are also assumed a priori; actors' preferences are depicted in relatively static, ahistorical terms. And there is the tendency, even a methodological requirement, to assume relatively simple motives and preference sets among actors, in part because of the game theoretic nature of RC1 reasoning. It is possible to build a more accurate rational choice model by re-specifying and essentially updating the context, incentives and choice sets that have driven rational choice in this case. Enter RC2. However, this move subtly introduces methodological shifts and new theoretical challenges. By contrast, historical institutionalism uses an inductive methodology. Compared with deduction, it is arguably better able to deal with complexity and nuance. It also utilises a dynamic, historical approach, and specifies (dynamically) endogenous preference formation by interpretive actors. Historical institutionalism is also able to more easily incorporate a wider set of key explanatory variables and incorporate wider social aggregates. Hence, it is argued that historical institutionalism is the preferred explanatory theory and methodology in this case.
Abstract:
Chemical engineers are turning to multiscale modelling to extend traditional modelling approaches into new application areas and to achieve higher levels of detail and accuracy. There is, however, little advice available on the best strategy to use in constructing a multiscale model. This paper presents a starting point for the systematic analysis of multiscale models by defining several integrating frameworks for linking models at different scales. It briefly explores how the nature of the information flow between the models at the different scales is influenced by the choice of framework, and presents some restrictions on model-framework compatibility. The concepts are illustrated with reference to the modelling of a catalytic packed bed reactor. (C) 2004 Elsevier Ltd. All rights reserved.
Abstract:
Path analysis of attitudinal, motivational, demographic and behavioural factors influencing food choice among Australian consumers who had consumed at least some organic food in the preceding 12 months showed that concern with the naturalness of food and the sensory and emotional experience of eating were the major determinants of increasing levels of organic consumption. Increasing consumption was also related to other 'green consumption' behaviours such as recycling and to lower levels of concern with convenience in the purchase and preparation of food. Most of these factors were, in turn, strongly affected by gender and the level of responsibility taken by respondents for food provisioning within their households, a responsibility dominated by women. Education had a slightly negative effect on the levels of concern for sensory and emotional appeal due to lower levels of education among women. Income, age, political and ecological values and willingness to pay a premium for safe and environmentally friendly foods all had extremely minor effects. (C) 2004 Elsevier Ltd. All rights reserved.
Abstract:
The sources of covariation among cognitive measures of Inspection Time, Choice Reaction Time, Delayed Response Speed and Accuracy, and IQ were examined in a classical twin design that included 245 monozygotic (MZ) and 298 dizygotic (DZ) twin pairs. Results indicated that a factor model comprising additive genetic and unique environmental effects was the most parsimonious. In this model, a general genetic cognitive factor emerged with factor loadings ranging from 0.28 to 0.64. Three other genetic factors explained the remaining genetic covariation between various speed and Delayed Response measures with IQ. However, a large proportion of the genetic variation in verbal (54%) and performance (25%) IQ was unrelated to these lower order cognitive measures. The independent genetic IQ variation may reflect information processes not captured by the elementary cognitive tasks (Inspection Time and Choice Reaction Time) or by our working memory task (Delayed Response). Unique environmental effects were mostly nonoverlapping, and partly represented test measurement error.
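The classical twin design identifies genetic and environmental variance components from the contrast between MZ and DZ twin correlations; a hedged textbook sketch in standardised form (not the paper's exact parameterisation):

```latex
r_{MZ} = a^2 + c^2, \qquad r_{DZ} = \tfrac{1}{2}\,a^2 + c^2
\quad\Longrightarrow\quad a^2 = 2\,(r_{MZ} - r_{DZ})
```

Here $a^2$, $c^2$ and $e^2 = 1 - a^2 - c^2$ are the additive genetic, shared environmental and unique environmental proportions of variance. In the best-fitting model reported above the shared-environment term is dropped ($c^2 = 0$), leaving only additive genetic and unique environmental effects.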
Abstract:
We demonstrate a device that allows for the coherent analysis of a pair of optical frequency sidebands in an arbitrary basis. We show that our device is quantum noise limited, and hence applications for this scheme may be found in discrete and continuous variable optical quantum information experiments. (c) 2005 Optical Society of America.
Abstract:
Background: Protein tertiary structure can be partly characterized via each amino acid's contact number measuring how residues are spatially arranged. The contact number of a residue in a folded protein is a measure of its exposure to the local environment, and is defined as the number of C-beta atoms in other residues within a sphere around the C-beta atom of the residue of interest. Contact number is partly conserved between protein folds and thus is useful for protein fold and structure prediction. In turn, each residue's contact number can be partially predicted from primary amino acid sequence, assisting tertiary fold analysis from sequence data. In this study, we provide a more accurate contact number prediction method from protein primary sequence. Results: We predict contact number from protein sequence using a novel support vector regression algorithm. Using protein local sequences with multiple sequence alignments (PSI-BLAST profiles), we demonstrate a correlation coefficient between predicted and observed contact numbers of 0.70, which outperforms previously achieved accuracies. Including additional information about sequence weight and amino acid composition further improves prediction accuracies significantly with the correlation coefficient reaching 0.73. If residues are classified as being either contacted or non-contacted, the prediction accuracies are all greater than 77%, regardless of the choice of classification thresholds. Conclusion: The successful application of support vector regression to the prediction of protein contact number reported here, together with previous applications of this approach to the prediction of protein accessible surface area and B-factor profile, suggests that a support vector regression approach may be very useful for determining the structure-function relation between primary sequence and higher order consecutive protein structural and functional properties.
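The paper's novel support vector regression algorithm is not specified in the abstract; as a generic stand-in, here is a linear epsilon-insensitive SVR fitted by subgradient descent on synthetic window features (all shapes, constants and names are illustrative, not the authors' method):

```python
import numpy as np

def train_linear_svr(X, y, C=1.0, eps=0.1, lr=0.01, epochs=500):
    """Linear epsilon-insensitive SVR fitted by subgradient descent:
    minimise 0.5*||w||^2 + C * mean(max(0, |Xw + b - y| - eps))."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        resid = X @ w + b - y
        # Per-sample subgradient of the epsilon-insensitive loss.
        g = np.where(resid > eps, 1.0, np.where(resid < -eps, -1.0, 0.0))
        w -= lr * (w + C * X.T @ g / n)
        b -= lr * C * g.mean()
    return w, b

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                      # toy per-residue window features
true_w = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ true_w + rng.normal(scale=0.05, size=200)  # toy contact numbers

w, b = train_linear_svr(X, y, C=10.0, eps=0.05, lr=0.05, epochs=2000)
r = np.corrcoef(X @ w + b, y)[0, 1]                # predicted vs observed
```

In the study itself the features are PSI-BLAST profile windows plus sequence weight and composition terms, and the reported correlation on real data is 0.70 to 0.73.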
Abstract:
In this paper we utilise a stochastic address model of broadcast oligopoly markets to analyse the Australian broadcast television market. In particular, we examine the effect of the presence of a single government market participant in this market. An examination of the dynamics of the simulations demonstrates that the presence of a government market participant can simultaneously generate positive outcomes for viewers as well as for other market suppliers. Further examination of simulation dynamics indicates that privatisation of the government market participant results in reduced viewer choice and diversity. We also demonstrate that additional private market participants would not result in significant benefits to viewers.
Abstract:
Selection of machine learning techniques requires a certain sensitivity to the requirements of the problem. In particular, the problem can be made more tractable by deliberately using algorithms that are biased toward solutions of the requisite kind. In this paper, we argue that recurrent neural networks have a natural bias toward a problem domain of which biological sequence analysis tasks are a subset. We use experiments with synthetic data to illustrate this bias. We then demonstrate that this bias can be exploited using a data set of protein sequences containing several classes of subcellular localization targeting peptides. The results show that, compared with feed-forward networks, recurrent neural networks will generally perform better on sequence analysis tasks. Furthermore, as the patterns within the sequence become more ambiguous, the choice of specific recurrent architecture becomes more critical.
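The architectural bias claimed above can be made concrete with a toy contrast (untrained, randomly initialised weights; purely illustrative): a recurrent hidden state distinguishes the same residues presented in different orders, while a feed-forward bag-of-features summary is blind to order.

```python
import numpy as np

rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.5, size=(4, 3))  # input -> hidden weights
W_h = rng.normal(scale=0.5, size=(4, 4))   # hidden -> hidden (the recurrence)

def rnn_state(seq):
    """Final hidden state of an Elman-style recurrence over a sequence."""
    h = np.zeros(4)
    for x in seq:
        h = np.tanh(W_in @ x + W_h @ h)
    return h

tokens = [np.eye(3)[i] for i in (0, 1, 2)]  # three one-hot 'residues'
h_fwd = rnn_state(tokens)                   # order-sensitive summary
h_rev = rnn_state(tokens[::-1])             # same residues, reversed
bag_fwd = sum(tokens)                       # feed-forward bag of features:
bag_rev = sum(tokens[::-1])                 # identical for both orders
```

The recurrent summaries `h_fwd` and `h_rev` differ, while the bag-of-features summaries coincide: the recurrence builds order into its inductive bias before any training takes place.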
Abstract:
Stirred mills are becoming increasingly used for fine and ultra-fine grinding. This technology is still poorly understood when used in the mineral processing context. This makes process optimisation of such devices problematic. 3D DEM simulations of the flow of grinding media in pilot scale tower mills and pin mills are carried out in order to investigate the relative performance of these stirred mills. In the first part of this paper, media flow patterns and energy absorption rates and distributions were analysed to provide a good understanding of the media flow and the collisional environment in these mills. In this second part we analyse steady state coherent flow structures, liner stress and wear by impact and abrasion. We also examine mixing and transport efficiency. Together these provide a comprehensive understanding of all the key processes operating in these mills and a clear understanding of the relative performance issues. (C) 2006 Elsevier Ltd. All rights reserved.
Abstract:
Stirred mills are becoming increasingly used for fine and ultra-fine grinding. This technology is still poorly understood when used in the mineral processing context. This makes process optimisation of such devices problematic. 3D DEM simulations of the flow of grinding media in pilot scale tower mills and pin mills are carried out in order to investigate the relative performance of these stirred mills. Media flow patterns and energy absorption rates and distributions are analysed here. In the second part of this paper, coherent flow structures, equipment wear and mixing and transport efficiency are analysed. (C) 2006 Published by Elsevier Ltd.
Abstract:
The genetic analysis of mate choice is fraught with difficulties. Males produce complex signals and displays that can consist of a combination of acoustic, visual, chemical and behavioural phenotypes. Furthermore, female preferences for these male traits are notoriously difficult to quantify. During mate choice, genes not only affect the phenotypes of the individual they are in, but can influence the expression of traits in other individuals. How can genetic analyses be conducted to encompass this complexity? Tighter integration of classical quantitative genetic approaches with modern genomic technologies promises to advance our understanding of the complex genetic basis of mate choice.
Abstract:
Discrete stochastic simulations are a powerful tool for understanding the dynamics of chemical kinetics when there are small-to-moderate numbers of certain molecular species. In this paper we introduce delays into the stochastic simulation algorithm, thus mimicking delays associated with transcription and translation. We then show that this process may well explain more faithfully than continuous deterministic models the observed sustained oscillations in expression levels of hes1 mRNA and Hes1 protein.
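The paper's exact reaction network and rate constants are not given in the abstract; as a minimal illustration of a delayed stochastic simulation algorithm, here is a single-species sketch with a fixed transcriptional delay (all parameter values and names are illustrative):

```python
import heapq
import random

def delayed_ssa(k=2.0, d=0.1, tau=5.0, t_end=100.0, seed=1):
    """Gillespie-style SSA with a fixed delay: an initiation event drawn at
    time t yields a finished mRNA only at t + tau; each existing mRNA
    degrades with propensity d * M (rejection variant of the delayed SSA)."""
    random.seed(seed)
    t, M = 0.0, 0
    pending = []                    # completion times of in-progress transcripts
    while t < t_end:
        a = k + d * M               # total propensity (initiation + degradation)
        dt = random.expovariate(a)
        if pending and pending[0] <= t + dt:
            # A delayed transcript finishes before the next reaction fires;
            # jump to it (valid by memorylessness of the exponential clock).
            t = heapq.heappop(pending)
            M += 1
            continue
        t += dt
        if random.random() < k / a:
            heapq.heappush(pending, t + tau)  # schedule delayed completion
        else:
            M -= 1                            # immediate degradation
    return M

M_final = delayed_ssa()
```

With these toy rates the copy number fluctuates around the steady-state mean k/d = 20; inserting such delays into a feedback loop is what produces the sustained oscillations discussed above, which deterministic delay-free models fail to reproduce.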