54 results for generalized assignment
Abstract:
The experiment examined the influence of memory for prior instances on aircraft conflict detection. Participants saw pairs of similar aircraft repeatedly conflict with each other. Performance improvements suggest that participants credited the conflict status of familiar aircraft pairs to repeated static features such as speed, and to dynamic features such as aircraft relative position. Participants missed conflicts when a conflict pair resembled a pair that had repeatedly passed safely. Participants either did not attend to the bearing of the aircraft or did not interpret it correctly, as a result of false memory-based expectations. Implications for instance models and situational awareness in dynamic systems are discussed.
Abstract:
Formal Concept Analysis is an unsupervised machine learning technique that has successfully been applied to document organisation by considering documents as objects and keywords as attributes. The basic algorithms of Formal Concept Analysis then allow an intelligent information retrieval system to cluster documents according to keyword views. This paper investigates the scalability of this idea. In particular, we present the results of applying spatial data structures to large datasets in Formal Concept Analysis. Our experiments are motivated by the application of the Formal Concept Analysis idea to a virtual filesystem [11,17,15], in particular the libferris [1] Semantic File System. This paper presents customizations to an RD-Tree Generalized Index Search Tree based index structure to better support the application of Formal Concept Analysis to large data sources.
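A minimal sketch of the formal-context idea this abstract builds on (documents as objects, keywords as attributes, concepts as closed object/attribute pairs), assuming a tiny hypothetical context; it is not the paper's RD-Tree/GiST-backed index, only an illustration of what that index has to accelerate.

```python
# A toy formal context for FCA-based document organisation: documents are
# objects, keywords are attributes, and a formal concept is a maximal
# (documents, keywords) pair closed under the two derivation operators.
# The documents and keywords below are hypothetical.
from itertools import combinations

# Hypothetical incidence relation: document -> set of keywords it contains.
context = {
    "doc1": {"filesystem", "semantic"},
    "doc2": {"filesystem", "index"},
    "doc3": {"semantic", "index", "retrieval"},
}
objects = set(context)
attributes = set().union(*context.values())

def common_attributes(docs):
    """Attributes shared by every document in `docs` (A -> A')."""
    return set.intersection(*(context[d] for d in docs)) if docs else set(attributes)

def common_objects(attrs):
    """Documents containing every attribute in `attrs` (B -> B')."""
    return {d for d in objects if attrs <= context[d]}

def concepts():
    """Enumerate all formal concepts by closing every attribute subset."""
    seen = set()
    for r in range(len(attributes) + 1):
        for attrs in combinations(sorted(attributes), r):
            extent = common_objects(set(attrs))
            intent = common_attributes(extent)
            if frozenset(extent) not in seen:
                seen.add(frozenset(extent))
                yield sorted(extent), sorted(intent)

for extent, intent in concepts():
    print(extent, intent)
```

The brute-force closure of every attribute subset is exponential in the number of keywords, which is exactly the scalability problem the paper's spatial indexing targets.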
Abstract:
There are two main types of data sources of income distributions in China: household survey data and grouped data. Household survey data are typically available for isolated years and individual provinces. In comparison, aggregate or grouped data are typically available more frequently and usually have national coverage. In principle, grouped data allow investigation of the change of inequality over longer, continuous periods of time, and the identification of patterns of inequality across broader regions. Nevertheless, a major limitation of grouped data is that only mean (average) income and income shares of quintile or decile groups of the population are reported. Directly using grouped data reported in this format is equivalent to assuming that all individuals in a quintile or decile group have the same income. This potentially distorts the estimate of inequality within each region. The aim of this paper is to apply an improved econometric method designed to use grouped data to study income inequality in China. A generalized beta distribution is employed to model income inequality in China at various levels and periods of time. The generalized beta distribution is more general and flexible than the lognormal distribution that has been used in past research, and also relaxes the assumption of a uniform distribution of income within quintile and decile groups of populations. The paper studies the nature and extent of inequality in rural and urban China over the period 1978 to 2002. Income inequality in the whole of China is then modeled using a mixture of province-specific distributions. The estimated results are used to study the trends in national inequality, and to discuss the empirical findings in the light of economic reforms, regional policies, and globalization of the Chinese economy.
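A small sketch of the limitation described above, using hypothetical quintile shares rather than actual Chinese data: if everyone within a quintile is assumed to earn that quintile's mean income, the Lorenz curve is piecewise linear and the implied Gini coefficient is a lower bound on the true Gini, which is the within-group distortion the generalized beta approach is meant to remove. This is not the paper's estimator, only an illustration of the problem it addresses.

```python
# Gini coefficient implied by grouped data under the equal-income-within-group
# assumption. Because within-group inequality is ignored, this understates the
# true Gini. The quintile shares below are illustrative, not actual data.

def gini_from_group_shares(income_shares, pop_shares=None):
    """Gini from group income shares, ordered from poorest to richest group.

    pop_shares: population shares of the groups (equal by default, e.g. quintiles).
    """
    n = len(income_shares)
    pop_shares = pop_shares or [1.0 / n] * n
    area = 0.0        # area under the piecewise-linear Lorenz curve
    cum_income = 0.0
    for p, s in zip(pop_shares, income_shares):
        # trapezoid between consecutive Lorenz-curve points
        area += p * (cum_income + cum_income + s) / 2.0
        cum_income += s
    return 1.0 - 2.0 * area

# Illustrative quintile income shares (poorest to richest).
print(round(gini_from_group_shares([0.06, 0.11, 0.16, 0.23, 0.44]), 3))
```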
Abstract:
The quasi mode theory of macroscopic quantization in quantum optics and cavity QED developed by Dalton, Barnett and Knight is generalized. This generalization allows for cases in which two or more quasi permittivities, along with their associated mode functions, are needed to describe the classical optics device. It brings problems such as reflection and refraction at a dielectric boundary, the linear coupler, and the coupling of two optical cavities within the scope of the theory. For the most part, the results that are obtained here are simple generalizations of those obtained in previous work. However, the coupling constants, which are of great importance in applications of the theory, are shown to contain significant additional terms which cannot be 'guessed' from the simpler forms. The expressions for the coupling constants suggest that the critical factor in determining the strength of coupling between a pair of quasi modes is their degree of spatial overlap. In an accompanying paper a fully quantum theoretic derivation of the laws of reflection and refraction at a boundary is given as an illustration of the generalized theory. The quasi mode picture of this process involves the annihilation of a photon travelling in the incident region quasi mode, and the subsequent creation of a photon in either the incident region or transmitted region quasi modes.
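The role of spatial overlap can be illustrated with the generic overlap-integral form familiar from coupled-mode theory; this is a schematic only, not the paper's coupling constants, which the abstract notes contain significant additional terms.

```latex
% Schematic coupled-mode overlap integral (generic form, not the paper's exact
% expressions): \Delta\epsilon is the deviation of the true permittivity from
% the quasi permittivities and U_i, U_j are quasi mode functions, so the
% coupling \kappa_{ij} is appreciable only where the two modes overlap.
\kappa_{ij} \propto \int \Delta\epsilon(\mathbf{r})\,
    U_i^{*}(\mathbf{r})\, U_j(\mathbf{r})\, \mathrm{d}^{3}r
```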
Abstract:
The generalization of the quasi mode theory of macroscopic quantization in quantum optics and cavity QED presented in the previous paper is applied to provide a fully quantum theoretic derivation of the laws of reflection and refraction at a boundary. The quasi mode picture of this process involves the annihilation of a photon travelling in the incident region quasi mode, and the subsequent creation of a photon in either the incident region or transmitted region quasi modes. The derivation of the laws of reflection and refraction is achieved through the dual application of the quasi mode theory and a quantum scattering theory based on the Heisenberg picture. Formal expressions from scattering theory are given for the reflection and transmission coefficients. The behaviour of the intensity for a localized one photon wave packet coming in at time minus infinity from the incident direction is examined and it is shown that at time plus infinity, the light intensity is only significant where the classical laws of reflection and refraction predict. The occurrence of both refraction and reflection is dependent upon the quasi mode theory coupling constants between incident and transmitted region quasi modes being nonzero, and it is seen that the contributions to such coupling constants come from the overlap of the mode functions in the boundary layer region, as might be expected from a microscopic theory.
Abstract:
We assessed the effectiveness of two generalized visual training programmes in enhancing visual and motor performance for racquet sports. Forty young participants were assigned equally to groups undertaking visual training using Revien and Gabor's Sports Vision programme (Group 1), visual training using Revien's Eyerobics (Group 2), a placebo condition involving reading (Group 3) and a control condition involving physical practice only (Group 4). Measures of basic visual function and of sport-specific motor performance were obtained from all participants before and immediately after a 4-week training period. Significant pre- to post-training differences were evident on some of the measures; however, these were not group-dependent. Contrary to the claims made by proponents of generalized visual training, we found no evidence that the visual training programmes led to improvements in either vision or motor performance above and beyond those resulting simply from test familiarity.
Abstract:
Sympatric individuals of Rattus fuscipes and Rattus leucopus, two Australian native rats from the tropical wet forests of north Queensland, are difficult to distinguish morphologically and are often confused in the field. When we started a study on fine-scale movements of these species, using microsatellite markers, we found that the species as identified in the field did not form coherent genetic groups. In this study, we examined the potential of an iterative process of genetic assignment to separate specimens from distinct (e.g. species, populations) natural groups. Five loci with extensive overlap in allele distributions between species were used for the iterative process. Samples were randomly distributed into two starting groups of equal size and then subjected to the test. At each iteration, misassigned samples switched groups, and the output groups from a given round of assignment formed the input groups for the next round. All samples were assigned correctly on the 10th iteration, in which two genetic groups were clearly separated. Mitochondrial DNA sequences were obtained from samples from each genetic group identified by assignment, together with those of museum voucher specimens, to assess which species corresponded to which genetic group. The iterative procedure was also used to resolve groups within species, adequately separating the genetically identified R. leucopus from our two sampling sites. These results show that the iterative assignment process can correctly differentiate samples into their appropriate natural groups when diagnostic genetic markers are not available, which allowed us to accurately resolve the two species, R. leucopus and R. fuscipes. Our approach provides an analytical tool that may be applicable to a broad variety of situations where genetic groups need to be resolved.
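A schematic of the iterative assignment loop described above, using toy data structures and a simplified Hardy-Weinberg likelihood (no leave-one-out correction): samples are split at random into two groups, each sample is scored against both groups' allele frequencies, misassigned samples switch groups, and the output groups feed the next round until no sample moves. The function names, the floor frequency for unseen alleles, and the genotype encoding are assumptions for illustration, not the authors' implementation.

```python
# Schematic iterative genetic assignment: random starting split, then repeated
# rounds in which misassigned samples switch groups until the groups stabilise.
import math
import random
from collections import Counter

FLOOR = 0.01  # frequency assumed for alleles unseen in a group

def allele_freqs(group, genotypes, n_loci):
    """Per-locus allele frequencies within one group of sample names."""
    freqs = []
    for locus in range(n_loci):
        counts = Counter(allele for s in group for allele in genotypes[s][locus])
        total = sum(counts.values())
        freqs.append({a: c / total for a, c in counts.items()})
    return freqs

def log_likelihood(sample, freqs, genotypes):
    """Hardy-Weinberg log-likelihood of one sample's multilocus genotype."""
    ll = 0.0
    for locus, (a1, a2) in enumerate(genotypes[sample]):
        p = freqs[locus].get(a1, FLOOR)
        q = freqs[locus].get(a2, FLOOR)
        ll += math.log(p * q * (2.0 if a1 != a2 else 1.0))
    return ll

def iterative_assignment(genotypes, n_loci, max_rounds=20, seed=0):
    """Random start, then repeatedly move misassigned samples between two groups."""
    samples = sorted(genotypes)
    random.seed(seed)
    random.shuffle(samples)
    groups = [set(samples[: len(samples) // 2]), set(samples[len(samples) // 2:])]
    for _ in range(max_rounds):
        freqs = [allele_freqs(g, genotypes, n_loci) for g in groups]
        switched = False
        for s in samples:
            current = 0 if s in groups[0] else 1
            best = max((0, 1), key=lambda g: log_likelihood(s, freqs[g], genotypes))
            if best != current and len(groups[current]) > 1:
                groups[current].remove(s)   # misassigned: move to the other group
                groups[best].add(s)
                switched = True
        if not switched:                    # no sample moved: groups are stable
            break
    return groups
```

Here `genotypes` maps each sample name to a list with one (allele, allele) tuple per locus; the abstract reports that, on the authors' data, the groups stabilised by the 10th iteration.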
Abstract:
Some results are obtained, for non-compact cases in topological vector spaces, on the existence of solutions of set-valued variational inequalities with quasi-monotone and lower hemi-continuous operators, and with quasi-semi-monotone and upper hemi-continuous operators. Some applications are given in non-reflexive Banach spaces to these solution-existence problems and to perturbation problems for set-valued variational inequalities with quasi-monotone and quasi-semi-monotone operators.
Abstract:
Most finite element packages use the Newmark algorithm for time integration of structural dynamics. Various algorithms have been proposed to better optimize the high-frequency dissipation of this algorithm. Hulbert and Chung proposed both implicit and explicit forms of the generalized alpha method. The algorithms optimize high-frequency dissipation effectively, and despite recent work on algorithms that possess momentum-conserving/energy-dissipative properties in a non-linear context, the generalized alpha method remains an efficient way to solve many problems, especially with adaptive timestep control. However, the implicit and explicit algorithms use incompatible parameter sets and cannot be used together in a spatial partition, whereas this can be done for the Newmark algorithm, as Hughes and Liu demonstrated, and for the HHT-alpha algorithm developed from it. The present paper shows that the explicit generalized alpha method can be rewritten so that it becomes compatible with the implicit form. All four algorithmic parameters can be matched between the explicit and implicit forms. An element interface between implicit and explicit partitions can then be used, analogous to that devised by Hughes and Liu to extend the Newmark method. The stability of the explicit/implicit algorithm is examined in a linear context and found to exceed that of the explicit partition. The element partition is significantly less dissipative of intermediate frequencies than one using the HHT-alpha method. The explicit algorithm can also be rewritten so that the discrete equation of motion evaluates forces from displacements and velocities found at the predicted mid-point of a cycle. Copyright (C) 2003 John Wiley & Sons, Ltd.
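For background on the method the abstract discusses, here is a minimal sketch of the implicit generalized alpha integrator of Chung and Hulbert for a linear single-degree-of-freedom system, using the standard parametrization by the high-frequency spectral radius; it is not the paper's explicit form or its implicit/explicit element partition.

```python
# Implicit generalized-alpha time integration for m*a + c*v + k*d = f(t),
# a single degree of freedom. Parameters follow the usual Chung-Hulbert
# parametrization by the high-frequency spectral radius rho_inf.

def generalized_alpha_params(rho_inf):
    """Standard parameters for a target high-frequency spectral radius."""
    alpha_m = (2.0 * rho_inf - 1.0) / (rho_inf + 1.0)
    alpha_f = rho_inf / (rho_inf + 1.0)
    gamma = 0.5 - alpha_m + alpha_f
    beta = 0.25 * (1.0 - alpha_m + alpha_f) ** 2
    return alpha_m, alpha_f, beta, gamma

def integrate(m, c, k, f, d0, v0, dt, n_steps, rho_inf=0.8):
    alpha_m, alpha_f, beta, gamma = generalized_alpha_params(rho_inf)
    d, v = d0, v0
    a = (f(0.0) - c * v - k * d) / m           # consistent initial acceleration
    history = [(d, v, a)]
    for n in range(n_steps):
        t_mid = (n + 1.0 - alpha_f) * dt       # force evaluated at the alpha_f point
        # predictors (parts of d_{n+1}, v_{n+1} not involving a_{n+1})
        d_pred = d + dt * v + dt * dt * (0.5 - beta) * a
        v_pred = v + dt * (1.0 - gamma) * a
        # solve the scalar balance equation for a_{n+1}
        lhs = (1.0 - alpha_m) * m + (1.0 - alpha_f) * (gamma * dt * c + beta * dt * dt * k)
        rhs = (f(t_mid) - alpha_m * m * a
               - (1.0 - alpha_f) * (c * v_pred + k * d_pred)
               - alpha_f * (c * v + k * d))
        a_new = rhs / lhs
        d = d_pred + beta * dt * dt * a_new
        v = v_pred + gamma * dt * a_new
        a = a_new
        history.append((d, v, a))
    return history

# Example: free vibration of an undamped oscillator.
print(integrate(m=1.0, c=0.0, k=4.0, f=lambda t: 0.0, d0=1.0, v0=0.0,
                dt=0.05, n_steps=5)[-1])
```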
Abstract:
We present two integrable spin ladder models which possess a general free parameter besides the rung coupling J. The models are exactly solvable by means of the Bethe ansatz method and we present the Bethe ansatz equations. An analysis of the elementary excitations reveals that both models have a gap that depends on the free parameter. (C) 2003 American Institute of Physics.
Abstract:
A recent development of the Markov chain Monte Carlo (MCMC) technique is the emergence of MCMC samplers that allow transitions between different models. Such samplers make possible a range of computational tasks involving models, including model selection, model evaluation, model averaging and hypothesis testing. An example of this type of sampler is the reversible jump MCMC sampler, which is a generalization of the Metropolis-Hastings algorithm. Here, we present a new MCMC sampler of this type. The new sampler is a generalization of the Gibbs sampler, but somewhat surprisingly, it also turns out to encompass as particular cases all of the well-known MCMC samplers, including those of Metropolis, Barker, and Hastings. Moreover, the new sampler generalizes the reversible jump MCMC. It therefore appears to be a very general framework for MCMC sampling. This paper describes the new sampler and illustrates its use in three applications in Computational Biology, specifically determination of consensus sequences, phylogenetic inference and delineation of isochores via multiple change-point analysis.
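For reference, a minimal sketch of the Metropolis-Hastings step, the fixed-dimension sampler that the abstract says the reversible jump sampler generalizes and that the new sampler encompasses as a particular case; the standard-normal target and Gaussian random-walk proposal are illustrative choices, not anything from the paper.

```python
# Minimal Metropolis-Hastings sampler with a symmetric random-walk proposal.
import math
import random

def target(x):
    """Unnormalized target density: standard normal."""
    return math.exp(-0.5 * x * x)

def metropolis_hastings(n_samples, step=1.0, x0=0.0, seed=0):
    random.seed(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        proposal = x + random.gauss(0.0, step)      # symmetric random-walk proposal
        # acceptance ratio; the proposal densities cancel because it is symmetric
        accept_prob = min(1.0, target(proposal) / target(x))
        if random.random() < accept_prob:
            x = proposal
        samples.append(x)
    return samples

draws = metropolis_hastings(10000)
print(sum(draws) / len(draws))   # should be near 0 for the standard normal
```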
Abstract:
Genetic assignment methods use genotype likelihoods to draw inference about where individuals were or were not born, potentially allowing direct, real-time estimates of dispersal. We used simulated data sets to test the power and accuracy of Monte Carlo resampling methods in generating statistical thresholds for identifying F0 immigrants in populations with ongoing gene flow, and hence for providing direct, real-time estimates of migration rates. The identification of accurate critical values required that resampling methods preserved the linkage disequilibrium deriving from recent generations of immigrants and reflected the sampling variance present in the data set being analysed. A novel Monte Carlo resampling method taking into account these aspects was proposed and its efficiency was evaluated. Power and error were relatively insensitive to the frequency assumed for missing alleles. Power to identify F0 immigrants was improved by using a large sample size (up to about 50 individuals) and by sampling all populations from which migrants may have originated. A combination of plotting genotype likelihoods and calculating mean genotype likelihood ratios (DLR) appeared to be an effective way to predict whether F0 immigrants could be identified for a particular pair of populations using a given set of markers.
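A schematic of the resampling idea above, with hypothetical allele frequencies and loci: simulate resident genotypes from the focal population's allele frequencies, take a lower quantile of their genotype log-likelihoods as the critical value, and flag sampled individuals falling below it as putative F0 immigrants. Unlike the method evaluated in the paper, this naive scheme does not preserve linkage disequilibrium or the sampling variance of the real data set, which the abstract identifies as essential for accurate critical values.

```python
# Naive Monte Carlo threshold for flagging putative F0 immigrants from
# genotype log-likelihoods. All frequencies and genotypes are hypothetical.
import math
import random

# Hypothetical per-locus allele frequencies of the focal population.
ALLELE_FREQS = [
    {"a": 0.6, "b": 0.3, "c": 0.1},
    {"a": 0.5, "b": 0.5},
    {"a": 0.7, "b": 0.2, "c": 0.1},
]
MISSING = 0.01  # frequency assumed for alleles absent from the reference sample

def genotype_loglik(genotype):
    """Hardy-Weinberg log-likelihood of a multilocus genotype."""
    ll = 0.0
    for locus, (a1, a2) in enumerate(genotype):
        p = ALLELE_FREQS[locus].get(a1, MISSING)
        q = ALLELE_FREQS[locus].get(a2, MISSING)
        ll += math.log(p * q * (2.0 if a1 != a2 else 1.0))
    return ll

def simulate_resident():
    """Draw one resident genotype from the population allele frequencies."""
    genotype = []
    for freqs in ALLELE_FREQS:
        alleles, weights = zip(*freqs.items())
        genotype.append(tuple(random.choices(alleles, weights, k=2)))
    return genotype

def critical_value(n_sim=10000, alpha=0.01, seed=0):
    """Lower alpha-quantile of simulated resident log-likelihoods."""
    random.seed(seed)
    logliks = sorted(genotype_loglik(simulate_resident()) for _ in range(n_sim))
    return logliks[int(alpha * n_sim)]

threshold = critical_value()
observed = [("a", "b"), ("a", "a"), ("c", "c")]  # one sampled individual
print(genotype_loglik(observed) < threshold)     # True flags a putative F0 immigrant
```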