111 results for Empirical Algorithm Analysis


Relevance: 30.00%

Abstract:

Capturing the pattern of structural change is an important task in applied demand analysis, since consumer preferences may vary significantly over time. Filtering and smoothing techniques have recently played an increasingly prominent role. A dynamic Almost Ideal Demand System with random-walk parameters is estimated in order to detect modifications in consumer habits and preferences, as well as changes in the behavioural response to prices and income. System-wise estimation, consistent with the underlying constraints from economic theory, is achieved through the EM algorithm. The proposed model is applied to UK aggregate consumption of alcohol and tobacco, using quarterly data from 1963 to 2003. Increased alcohol consumption is explained by a preference shift, addictive behaviour and a lower price elasticity. The dynamic, time-varying specification is consistent with the theoretical requirements imposed at each sample point.
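Random-walk parameters define a linear state-space model, so this kind of EM estimation typically alternates a Kalman filter/smoother pass with closed-form variance updates. The sketch below shows only the filtering pass for a single-equation regression with time-varying coefficients; it is a minimal illustration, not the paper's full multi-equation AIDS estimator, and the smoothing-based M-step is omitted.

```python
import numpy as np

def kalman_filter_tvp(y, X, sigma_eps, sigma_eta):
    """Kalman filter for y_t = x_t' beta_t + eps_t with random-walk
    coefficients beta_t = beta_{t-1} + eta_t. Returns the filtered
    coefficient paths and the log-likelihood used by EM."""
    T, k = X.shape
    beta = np.zeros(k)                  # state mean
    P = np.eye(k) * 1e4                 # diffuse initial covariance
    betas, loglik = np.zeros((T, k)), 0.0
    for t in range(T):
        x = X[t]
        P = P + np.eye(k) * sigma_eta   # predict: random walk inflates P
        v = y[t] - x @ beta             # innovation
        f = x @ P @ x + sigma_eps       # innovation variance
        K = P @ x / f                   # Kalman gain
        beta = beta + K * v             # update state mean
        P = P - np.outer(K, x) @ P      # update state covariance
        loglik += -0.5 * (np.log(2 * np.pi * f) + v**2 / f)
        betas[t] = beta
    return betas, loglik
```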

Relevance: 30.00%

Abstract:

The aim of the study was to establish and verify a predictive vegetation model for plant community distribution in the alti-Mediterranean zone of the Lefka Ori massif, western Crete. Based on previous work, three variables were identified as significant determinants of plant community distribution, namely altitude, slope angle and geomorphic landform. The response of four community types to these variables was tested using classification tree analysis in order to model community type occurrence. V-fold cross-validation plots were used to determine the length of the best-fitting tree. The final nine-node tree selected classified 92.5% of the samples correctly. The results were used to provide decision rules for the construction of a spatial model for each community type. The model was implemented within a Geographical Information System (GIS) to predict the distribution of each community type in the study site. The evaluation of the model in the field using an error matrix gave an overall accuracy of 71%. The user's accuracy was higher for the Crepis-Cirsium (100%) and Telephium-Herniaria community types (66.7%) and relatively lower for the Peucedanum-Alyssum and Dianthus-Lomelosia community types (63.2% and 62.5%, respectively). Misclassification and field validation point to the need for improved geomorphological mapping and suggest the presence of transitional communities between existing community types.
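The tree-selection step described above can be reproduced with standard tools. The sketch below, assuming a hypothetical table of field plots with numerically coded predictors (the column names and file are illustrative, not from the study), picks the tree size by 10-fold cross-validation and refits the chosen tree:

```python
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

df = pd.read_csv("plots.csv")   # hypothetical plot data; landform numerically coded
X, y = df[["altitude", "slope", "landform"]], df["community"]

# Select tree size by V-fold cross-validation (V = 10 here), mirroring
# the cross-validation plots used to pick the best-fitting tree.
best_size = max(
    range(2, 16),
    key=lambda n: cross_val_score(
        DecisionTreeClassifier(max_leaf_nodes=n, random_state=0),
        X, y, cv=10).mean(),
)
tree = DecisionTreeClassifier(max_leaf_nodes=best_size, random_state=0).fit(X, y)
```

The fitted tree's split rules then play the role of the decision rules imported into the GIS layer.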

Relevance: 30.00%

Abstract:

Population size estimation with discrete or nonparametric mixture models is considered, and reliable ways of constructing the nonparametric mixture model estimator are reviewed and put into perspective. The maximum likelihood estimator of the mixing distribution is constructed for any number of components, up to the global nonparametric maximum likelihood bound, using the EM algorithm. In addition, the estimators of Chao and Zelterman are considered, along with some generalisations of Zelterman's estimator. All computations are done with CAMCR, software developed specifically for population size estimation with mixture models. Several examples and data sets are discussed and the estimators illustrated. Problems in using the mixture model-based estimators are highlighted.
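Of the estimators mentioned, Chao's lower bound is the simplest to state: with n distinct units observed, f1 of them seen exactly once and f2 exactly twice, the population size is estimated as n + f1^2 / (2 f2). A minimal sketch:

```python
def chao_estimate(freq_counts):
    """Chao's lower-bound estimator of population size. freq_counts
    maps k -> number of units observed exactly k times."""
    f1 = freq_counts.get(1, 0)          # singletons
    f2 = freq_counts.get(2, 0)          # doubletons
    n = sum(freq_counts.values())       # distinct units observed
    return n + f1**2 / (2 * f2) if f2 > 0 else float("inf")

# Example: 40 units seen once, 20 twice, 10 three times
print(chao_estimate({1: 40, 2: 20, 3: 10}))   # 70 + 1600/40 = 110.0
```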

Relevance: 30.00%

Abstract:

Nested clade phylogeographic analysis (NCPA) is a popular method for reconstructing the demographic history of spatially distributed populations from genetic data. Although some parts of the analysis are automated, there is no unique and widely followed algorithm for doing this in its entirety, beginning with the data, and ending with the inferences drawn from the data. This article describes a method that automates NCPA, thereby providing a framework for replicating analyses in an objective way. To do so, a number of decisions need to be made so that the automated implementation is representative of previous analyses. We review how the NCPA procedure has evolved since its inception and conclude that there is scope for some variability in the manual application of NCPA. We apply the automated software to three published datasets previously analyzed manually and replicate many details of the manual analyses, suggesting that the current algorithm is representative of how a typical user will perform NCPA. We simulate a large number of replicate datasets for geographically distributed, but entirely random-mating, populations. These are then analyzed using the automated NCPA algorithm. Results indicate that NCPA tends to give a high frequency of false positives. In our simulations we observe that 14% of the clades give a conclusive inference that a demographic event has occurred, and that 75% of the datasets have at least one clade that gives such an inference. This is mainly due to the generation of multiple statistics per clade, of which only one is required to be significant to apply the inference key. We survey the inferences that have been made in recent publications and show that the most commonly inferred processes (restricted gene flow with isolation by distance and contiguous range expansion) are those that are commonly inferred in our simulations. However, published datasets typically yield a richer set of inferences with NCPA than obtained in our random-mating simulations, and further testing of NCPA with models of structured populations is necessary to examine its accuracy.
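The multiple-testing effect identified above is easy to quantify in idealized form: if each clade yields k statistics tested at level alpha and a single significant statistic suffices to enter the inference key, the per-clade false-positive rate is 1 - (1 - alpha)^k. A toy calculation (assuming independent statistics, which NCPA's correlated statistics are not, so the numbers are only indicative):

```python
# Per-clade probability that at least one of k level-alpha tests fires.
alpha = 0.05
for k in (1, 2, 4, 8):
    print(k, round(1 - (1 - alpha) ** k, 3))   # 0.05, 0.098, 0.185, 0.337
```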

Relevance: 30.00%

Abstract:

Accurately and reliably identifying the number of clusters present within a dataset of gene expression profiles, when no additional information on cluster structure is available, is a problem addressed by few algorithms. GeneMCL transforms microarray data into a graph consisting of nodes connected by edges, where the nodes represent genes and the edges represent the similarity in expression of those genes, as given by a proximity measurement. This measurement is taken to be the Pearson correlation coefficient combined with a local non-linear rescaling step. The resulting graph is input to the Markov Cluster (MCL) algorithm, an elegant, deterministic, non-specific and scalable method that models stochastic flow through the graph. The algorithm is inherently sensitive to any cluster structure present and rapidly decomposes a graph into cohesive clusters. The potential of the GeneMCL algorithm is demonstrated with a 5730-gene subset (IGS) of the Van't Veer breast cancer database, for which the clusterings are shown to reflect underlying biological mechanisms.
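The MCL step itself alternates two matrix operations on a column-stochastic graph matrix: expansion (a matrix power, letting flow spread) and inflation (an elementwise power followed by renormalisation, strengthening strong flows). A minimal sketch of that core loop, leaving out GeneMCL's correlation-based graph construction and rescaling:

```python
import numpy as np

def mcl(adj, expansion=2, inflation=2.0, iters=50):
    """Core Markov Cluster loop on an undirected adjacency matrix."""
    M = adj + np.eye(len(adj))                    # add self-loops
    M = M / M.sum(axis=0)                         # column-stochastic
    for _ in range(iters):
        M = np.linalg.matrix_power(M, expansion)  # expansion: flow spreads
        M = M ** inflation                        # inflation: strong flows win
        M = M / M.sum(axis=0)
    # each surviving ("attractor") row indexes one cluster's members
    return [np.nonzero(row > 1e-6)[0].tolist()
            for row in M if row.sum() > 1e-6]
```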

Relevance: 30.00%

Abstract:

Inferring the spatial expansion dynamics of invading species from molecular data is notoriously difficult due to the complexity of the processes involved. For these demographic scenarios, genetic data obtained from highly variable markers may be profitably combined with specific sampling schemes and information from other sources using a Bayesian approach. The geographic range of the introduced toad Bufo marinus is still expanding in eastern and northern Australia, in each case from isolates established around 1960. A large amount of demographic and historical information is available on both expansion areas. In each area, samples were collected along a transect representing populations of different ages and genotyped at 10 microsatellite loci. Five demographic models of expansion, differing in the dispersal pattern for migrants and founders and in the number of founders, were considered. Because the demographic history is complex, we used an approximate Bayesian method, based on a rejection-regression algorithm, to formally test the relative likelihoods of the five models of expansion and to infer demographic parameters. A stepwise migration-foundation model with founder events was statistically better supported than the other four models in both expansion areas. Posterior distributions supported different dynamics of expansion in the two areas. Populations in the eastern expansion area have a lower stable effective population size and were founded by a smaller number of individuals than those in the northern expansion area. Once demographically stabilized, populations exchange a substantial number of effective migrants per generation in both expansion areas, and such exchanges are larger in northern than in eastern Australia. The effective number of migrants appears to be considerably lower than that of founders in both expansion areas. We found our inferences to be relatively robust to various assumptions on marker, demographic, and historical features. The method presented here is the only robust, model-based method available so far that allows inferring complex population dynamics over a short time scale. It also provides the basis for investigating the interplay between population dynamics, drift, and selection in invasive species.
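The rejection stage of such a rejection-regression scheme is short to state: sample parameters from the prior, simulate summary statistics, and keep the draws closest to the observed statistics. A minimal sketch (the local-linear regression adjustment applied to the accepted draws is omitted; simulate is assumed to return a vector of summary statistics):

```python
import numpy as np

def abc_rejection(observed_stats, simulate, prior_draw, n=100_000, keep=0.01):
    """Approximate Bayesian computation, rejection stage only."""
    thetas = np.array([prior_draw() for _ in range(n)])      # prior sample
    stats = np.array([simulate(t) for t in thetas])          # simulated summaries
    dist = np.linalg.norm(stats - observed_stats, axis=1)    # distance to data
    cutoff = np.quantile(dist, keep)                         # tolerance
    return thetas[dist <= cutoff]       # approximate posterior sample
```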

Relevance: 30.00%

Abstract:

Uncertainty plays a major part in the accuracy of a decision-making process, and its inconsistency is difficult to resolve with existing decision-making tools. Entropy has proved useful for evaluating the inconsistency of uncertainty among different respondents. This study demonstrates an entropy-based financial decision support system called e-FDSS. This integrated system provides decision support for evaluating the attributes (funding options and multiple risks) available in projects. Fuzzy logic theory is included in the system to deal with the qualitative aspect of these options and risks. An adaptive genetic algorithm (AGA) is also employed to solve the decision algorithm in the system in order to provide optimal and consistent rates for these attributes. Seven simplified, parallel projects from a Hong Kong construction small and medium enterprise (SME) were assessed to evaluate the system. The results show that the system calculates risk-adjusted discount rates (RADR) of projects in an objective way, and these rates discount project cash flows impartially. Inconsistency of uncertainty is also successfully evaluated by the entropy method. Finally, the system identifies the favourable funding options managed under the SME Loan Guarantee Scheme (SGS). Based on these results, resource allocation can be optimized and the best time to start a new project identified across the overall project life cycle.
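The entropy measure of respondent inconsistency referred to above is, in its simplest form, the Shannon entropy of the distribution of ratings given to one attribute. A minimal sketch (the rating scale and aggregation are illustrative, not e-FDSS's actual formulation):

```python
import numpy as np

def rating_entropy(ratings, scale=5):
    """Shannon entropy (bits) of respondents' ratings of one attribute;
    0 means full agreement, log2(scale) maximal disagreement."""
    counts = np.bincount(ratings, minlength=scale + 1)[1:]  # ratings 1..scale
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

print(rating_entropy([3, 3, 3, 3, 3]))   # 0.0   -> consistent respondents
print(rating_entropy([1, 2, 3, 4, 5]))   # ~2.32 -> maximally inconsistent
```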

Relevance: 30.00%

Abstract:

A desktop tool for replay and analysis of gaze-enhanced multiparty virtual collaborative sessions is described. We linked three CAVE™-like environments, creating a multiparty collaborative virtual space where avatars are animated with 3D gaze as well as head and hand motions in real time. Log files are recorded for subsequent playback and analysis using the proposed software tool. During replay the user can rotate the viewpoint and navigate in the simulated 3D scene. The playback mechanism relies on multiple distributed log files captured at every site. This structure enables an observer to experience latencies of movement and information transfer for every site, which is important for conversation analysis. Playback uses an event-replay algorithm, modified to allow fast traversal of the scene by selective rendering of nodes and to simulate fast random access. The tool's analysis module can show each participant's 3D gaze points and areas where gaze has been concentrated.
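The skeleton of an event-replay mechanism with simulated random access is compact: keep the log sorted by timestamp, step through it for playback, and binary-search to seek. A hypothetical sketch (the class and method names are illustrative; the actual tool merges several per-site logs and renders nodes selectively):

```python
import bisect

class EventReplayer:
    """Replay a timestamped event log with sequential playback and seek."""
    def __init__(self, events):                          # [(time, payload), ...]
        self.events = sorted(events, key=lambda e: e[0])
        self.times = [t for t, _ in self.events]
        self.cursor = 0

    def seek(self, t):
        """Simulated fast random access: jump to the first event at/after t."""
        self.cursor = bisect.bisect_left(self.times, t)

    def step(self):
        """Sequential playback: return the next event for rendering."""
        if self.cursor < len(self.events):
            event = self.events[self.cursor]
            self.cursor += 1
            return event
```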

Relevance: 30.00%

Abstract:

Computer vision applications generally split their problem into multiple simpler tasks, and research likewise often combines algorithms into systems for evaluation purposes. Frameworks for modular vision provide interfaces and mechanisms for algorithm combination and network transparency. However, these frameworks do not provide interfaces that make efficient use of the slow memory in modern PCs. We investigate quantitatively how system performance varies with different patterns of memory usage by the framework, for an example vision system.
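The kind of effect being measured can be illustrated with a toy benchmark: traversing the same image buffer in memory order versus against it. A minimal sketch (not the paper's framework or benchmark):

```python
import time
import numpy as np

img = np.random.rand(2000, 2000)         # stand-in for a frame buffer

def row_major(a):                         # visits elements in memory order
    return sum(a[i, :].sum() for i in range(a.shape[0]))

def col_major(a):                         # strided: each access jumps a row
    return sum(a[:, j].sum() for j in range(a.shape[1]))

for fn in (row_major, col_major):
    t0 = time.perf_counter()
    fn(img)
    print(fn.__name__, f"{time.perf_counter() - t0:.3f}s")
```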

Relevance: 30.00%

Abstract:

This paper introduces a procedure for filtering electromyographic (EMG) signals. Its key element is the empirical mode decomposition, a novel digital signal processing technique that can decompose any time-series into a set of functions designated as intrinsic mode functions. The procedure for EMG signal filtering is compared to a related approach based on the wavelet transform. Results obtained from the analysis of synthetic and experimental EMG signals show that our method can be successfully and easily applied in practice to attenuate background activity in EMG signals.
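The core of EMD is the sifting loop: repeatedly subtract the mean of the upper and lower extrema envelopes until an intrinsic mode function remains. A toy sketch of one sifting pass (production implementations add stopping criteria and boundary handling; packages such as PyEMD provide complete versions):

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import argrelextrema

def sift_imf(x, t, n_sifts=10):
    """Extract one candidate IMF from signal x sampled at times t."""
    h = x.copy()
    for _ in range(n_sifts):
        maxima = argrelextrema(h, np.greater)[0]
        minima = argrelextrema(h, np.less)[0]
        if len(maxima) < 2 or len(minima) < 2:
            break                                      # too few extrema left
        upper = CubicSpline(t[maxima], h[maxima])(t)   # upper envelope
        lower = CubicSpline(t[minima], h[minima])(t)   # lower envelope
        h = h - (upper + lower) / 2                    # subtract local mean
    return h            # candidate IMF; the residual x - h is sifted next
```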

Relevance: 30.00%

Abstract:

Parametric software effort estimation models consisting of a single mathematical relationship suffer from poor adjustment and predictive characteristics when the historical database considered contains data coming from projects of a heterogeneous nature. Segmenting the input domain according to clusters obtained from the database of historical projects serves as a tool for building more realistic models that use several local estimation relationships. Nonetheless, it may be hypothesized that using clustering algorithms without prior consideration of the influence of well-known project attributes misses the opportunity to obtain more realistic segments. In this paper, we describe the results of an empirical study, using the ISBSG-8 database and the EM clustering algorithm, of the influence of two process-related attributes as drivers of the clustering process: the use of engineering methodologies and the use of CASE tools. The results provide evidence that considering them significantly conditions the final model obtained, even though the resulting predictive quality is of a similar magnitude.
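The segmented-estimation idea can be sketched as an EM-based Gaussian mixture fit followed by one local regression per cluster. The variable names and synthetic data below are illustrative, not the ISBSG-8 schema:

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
size = rng.normal(5.0, 1.0, (300, 1))                # stand-in for log project size
effort = 0.8 * size + rng.normal(0, 0.3, (300, 1))   # stand-in for log effort

# EM clustering of the input domain, then one local model per segment.
labels = GaussianMixture(n_components=3, random_state=0).fit_predict(size)
local_models = {
    k: LinearRegression().fit(size[labels == k], effort[labels == k])
    for k in np.unique(labels)
}
```

Adding attributes such as methodology or CASE-tool use to the clustering input is then a matter of widening the feature matrix, which is the manipulation the study evaluates.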

Relevance: 30.00%

Abstract:

Tremor is a clinical feature characterized by oscillations of a part of the body. The detection and study of tremor is an important step in investigations seeking to explain the underlying control strategies of the central nervous system under natural (or physiological) and pathological conditions. It is well established that tremorous activity is composed of deterministic and stochastic components. For this reason, the use of digital signal processing (DSP) techniques that take into account the nonlinearity and nonstationarity of such signals may bring new information into the signal analysis that is often obscured by traditional linear techniques (e.g. Fourier analysis). In this context, this paper introduces the application of the empirical mode decomposition (EMD) and the Hilbert spectrum (HS), relatively new DSP techniques for the analysis of nonlinear and nonstationary time-series, to the study of tremor. Our results, obtained from the analysis of experimental signals collected from 31 patients with different neurological conditions, showed that the EMD could automatically decompose acquired signals into basic components, called intrinsic mode functions (IMFs), representing tremorous and voluntary activity. The identification of a physical meaning for IMFs in the context of tremor analysis suggests an alternative and new way of detecting tremorous activity. These results may be relevant for applications requiring automatic detection of tremor. Furthermore, the energy of the IMFs was visualized as a function of time and frequency by means of the HS. This analysis showed that the variation in energy of tremorous and voluntary activity could be distinguished and characterized on the HS. Such results may be relevant for applications aiming to identify neurological disorders. In general, both the HS and the EMD proved very useful for objective analysis of any kind of tremor and can therefore potentially be used for functional assessment.
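Once the IMFs have been extracted (see the sifting sketch earlier in this listing), the Hilbert spectrum is built from each IMF's instantaneous amplitude and frequency. A minimal sketch using SciPy's Hilbert transform:

```python
import numpy as np
from scipy.signal import hilbert

def instantaneous_amp_freq(imf, fs):
    """Instantaneous amplitude and frequency (Hz) of one IMF; plotting
    amplitude over (time, frequency) for all IMFs gives the Hilbert
    spectrum used to separate tremorous from voluntary activity."""
    analytic = hilbert(imf)                      # analytic signal
    amp = np.abs(analytic)
    phase = np.unwrap(np.angle(analytic))
    freq = np.diff(phase) * fs / (2 * np.pi)     # one sample shorter than amp
    return amp, freq
```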

Relevance: 30.00%

Abstract:

We provide a system identification framework for the analysis of THz-transient data. The subspace identification algorithm for both deterministic and stochastic systems is used to model the time-domain responses of structures under broadband excitation. Structures with additional time delays can be modelled within the state-space framework using additional state variables. We compare the numerical stability of the commonly used least-squares ARX models to that of the subspace N4SID algorithm, using examples of fourth-order and eighth-order systems under pulse and chirp excitation conditions. These models correspond to structures having two and four simultaneously propagating modes, respectively. We show that chirp excitation combined with the subspace identification algorithm can provide a better identification of the underlying mode dynamics than the ARX model as the complexity of the system increases. The use of an identified state-space model for mode demixing, after transformation to a decoupled realization form, is illustrated. Applications of state-space models and the N4SID algorithm to THz transient spectroscopy as well as to optical systems are highlighted.
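For reference, the ARX baseline against which N4SID is compared reduces to an ordinary least-squares problem over lagged outputs and inputs. A minimal sketch (sign conventions and model orders are illustrative):

```python
import numpy as np

def fit_arx(y, u, na, nb):
    """Least-squares ARX fit of y[t] = sum_i a_i*y[t-i] + sum_j b_j*u[t-j]."""
    n = max(na, nb)
    Phi = np.array([
        np.concatenate((y[t - na:t][::-1], u[t - nb:t][::-1]))
        for t in range(n, len(y))
    ])                                            # lagged regressor matrix
    theta, *_ = np.linalg.lstsq(Phi, y[n:], rcond=None)
    return theta[:na], theta[na:]                 # AR and exogenous coefficients
```

The numerical-stability comparison then turns on how this normal-equations fit degrades, relative to the SVD-based subspace projections of N4SID, as model order grows.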

Relevance: 30.00%

Abstract:

Transient neural assemblies mediated by synchrony in particular frequency ranges are thought to underlie cognition. We propose a new approach to their detection using empirical mode decomposition (EMD), a data-driven approach that removes the need for arbitrary bandpass filter cut-offs. Phase locking is sought between modes. We explore the features of EMD, including making a quantitative assessment of its ability to preserve the phase content of signals, and proceed to develop a statistical framework with which to assess synchrony episodes. Furthermore, we propose a new approach to ensure signal decomposition using EMD. We adapt the Hilbert spectrum to a time-frequency representation of phase locking and are able to locate synchrony successfully in time and frequency between synthetic signals reminiscent of EEG. We compare our approach, which we call EMD phase-locking analysis (EMDPL), with existing methods and show it to offer improved time-frequency localisation of synchrony.
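The phase-locking measurement between two extracted modes reduces to a phase-locking value computed from their Hilbert phases. A minimal sketch (the statistical assessment of synchrony episodes described above is omitted):

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(mode_a, mode_b):
    """Phase-locking value between two modes (e.g. IMFs from two EEG
    channels): 1 = constant phase difference, 0 = no phase relation."""
    phase_a = np.angle(hilbert(mode_a))
    phase_b = np.angle(hilbert(mode_b))
    return float(np.abs(np.mean(np.exp(1j * (phase_a - phase_b)))))
```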