1000 results for 280499 Computation Theory and Mathematics not elsewhere classified
Abstract:
The quality of discovered features in relevance feedback (RF) is the key issue for effective search queries. Most existing feedback methods do not carefully address the issue of selecting features for noise reduction. As a result, extracted noisy features can easily degrade retrieval effectiveness. In this paper, we propose a novel feature extraction method for query formulation. This method first extracts term association patterns in RF as knowledge for feature extraction. Negative RF is then used to improve the quality of the discovered knowledge. A novel information filtering (IF) model is developed to evaluate the proposed method. Experimental results on the Reuters Corpus Volume 1 and TREC topics confirm that the proposed model achieves encouraging performance compared to state-of-the-art IF models.
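As a rough illustration of the kind of pattern-based feature extraction described here, the following Python sketch mines term-pair association patterns from positive feedback documents and uses negative feedback to discard noisy ones. The tokenization, support threshold, and filtering rule are illustrative assumptions, not the paper's actual method.

```python
from collections import Counter
from itertools import combinations

def extract_term_patterns(relevant_docs, irrelevant_docs, min_support=2):
    """Mine term-pair association patterns from positive RF documents,
    then use negative RF documents to filter out noisy patterns."""
    positive = Counter()
    for doc in relevant_docs:
        terms = sorted(set(doc.lower().split()))
        positive.update(combinations(terms, 2))

    negative = Counter()
    for doc in irrelevant_docs:
        terms = sorted(set(doc.lower().split()))
        negative.update(combinations(terms, 2))

    # Keep patterns frequent in relevant docs but absent from irrelevant ones.
    return {pair: count for pair, count in positive.items()
            if count >= min_support and negative[pair] == 0}

relevant = ["query formulation with term patterns",
            "term association patterns improve query quality"]
irrelevant = ["noisy features reduce quality"]
print(extract_term_patterns(relevant, irrelevant))
```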
Abstract:
Background: Predicting protein subnuclear localization is a challenging problem. Previous approaches based on non-sequence information, including Gene Ontology annotations and kernel fusion, have their respective limitations. The aim of this work is twofold: first, to propose a novel individual feature extraction method; second, to develop an ensemble method that improves prediction performance using comprehensive information represented as a high-dimensional feature vector obtained from 11 feature extraction methods. Methodology/Principal Findings: A novel two-stage multiclass support vector machine is proposed to predict protein subnuclear localizations. It considers only feature extraction methods based on amino acid classifications and physicochemical properties. To speed up the system, an automatic search method for the kernel parameter is used. The prediction performance of the method is evaluated on four datasets: the Lei dataset, a multi-localization dataset, the SNL9 dataset, and a new independent dataset. Under leave-one-out cross-validation, the overall accuracy is 75.2% for the 6 localizations of the Lei dataset, 72.1% for the 9 localizations of the SNL9 dataset, 71.7% for the multi-localization dataset, and 69.8% for the new independent dataset. Comparisons with existing methods show that our method performs better for both single-localization and multi-localization proteins and achieves more balanced sensitivities and specificities across large and small subcellular localizations. The overall accuracy improvements are 4.0% and 4.7% for single-localization proteins and 6.5% for multi-localization proteins. The reliability and stability of the classification model are further confirmed by permutation analysis. Conclusions: Our method is effective and valuable for predicting protein subnuclear localizations. A web server implementing the proposed method is freely available at http://bioinformatics.awowshop.com/snlpred_page.php.
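The two-stage ensemble itself is not reproduced below; as a minimal sketch of the general setup, the following Python code (assuming scikit-learn, with random stand-in feature vectors) trains a multiclass RBF-kernel SVM with an automated kernel-parameter search and evaluates it with leave-one-out cross-validation, the protocol behind the reported accuracies.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV, LeaveOneOut, cross_val_score

# Toy stand-in for the high-dimensional feature vectors produced by
# several feature extraction methods, concatenated per protein.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 40))      # 60 proteins, 40 features
y = rng.integers(0, 3, size=60)    # 3 hypothetical localization classes

# Automated search over the RBF kernel parameter (and C), standing in
# for the paper's automatic kernel-parameter search.
grid = GridSearchCV(
    SVC(kernel="rbf"),
    param_grid={"C": [1, 10, 100], "gamma": [1e-3, 1e-2, 1e-1]},
    cv=5,
)
grid.fit(X, y)
print("best params:", grid.best_params_)

# Leave-one-out evaluation, as used for the reported accuracies.
acc = cross_val_score(grid.best_estimator_, X, y, cv=LeaveOneOut()).mean()
print("LOOCV accuracy:", acc)
```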
Abstract:
A one-time program is a hypothetical device by which a user may evaluate a circuit on exactly one input of his choice, before the device self-destructs. One-time programs cannot be achieved by software alone, as any software can be copied and re-run. However, it is known that every circuit can be compiled into a one-time program using a very basic hypothetical hardware device called a one-time memory. At first glance it may seem that quantum information, which cannot be copied, might also allow for one-time programs. But it is not hard to see that this intuition is false: one-time programs for classical or quantum circuits based solely on quantum information do not exist, even with computational assumptions. This observation raises the question, "what assumptions are required to achieve one-time programs for quantum circuits?" Our main result is that any quantum circuit can be compiled into a one-time program assuming only the same basic one-time memory devices used for classical circuits. Moreover, these quantum one-time programs achieve statistical universal composability (UC-security) against any malicious user. Our construction employs methods for computation on authenticated quantum data, and we present a new quantum authentication scheme called the trap scheme for this purpose. As a corollary, we establish UC-security of a recent protocol for delegated quantum computation.
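To make the one-time memory primitive concrete, here is a toy, purely classical mock-up of its interface: it stores two secrets, reveals exactly one, and then refuses all further use. This is only an illustration of the functionality; as the abstract notes, a genuine OTM cannot be realized in software, since software can be copied and re-run.

```python
class OneTimeMemory:
    """Toy model of a one-time memory: stores two secrets, reveals
    exactly one, then refuses all further use. A software mock-up
    only; real OTMs require hardware, since software can be copied."""

    def __init__(self, secret0, secret1):
        self._secrets = (secret0, secret1)
        self._used = False

    def read(self, choice):
        if self._used:
            raise RuntimeError("one-time memory already consumed")
        self._used = True
        value = self._secrets[choice]
        self._secrets = None  # "self-destruct"
        return value

otm = OneTimeMemory(b"key-if-bit-0", b"key-if-bit-1")
print(otm.read(1))   # reveals exactly one secret
# otm.read(0)        # would raise: the device is consumed
```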
Abstract:
Evolutionary computation is an effective tool for solving optimization problems. However, its significant computational demand has limited its real-time and on-line applications, especially in embedded systems with limited computing resources, e.g., mobile robots. Heuristic methods such as genetic algorithm (GA) based approaches have been investigated for robot path planning in dynamic environments. However, research on the simulated annealing (SA) algorithm, another popular evolutionary computation method, for dynamic path planning remains limited, mainly due to its high computational demand. An enhanced SA approach, which integrates two additional mathematical operators and initial path selection heuristics into the standard SA, is developed in this work for robot path planning in dynamic environments with both static and dynamic obstacles. It significantly improves the computing performance of the standard SA while yielding an optimal or near-optimal robot path, making real-time and on-line applications possible. Using the classic, deterministic Dijkstra algorithm as a benchmark, comprehensive case studies demonstrate the performance of the enhanced SA and other SA algorithms in various dynamic path planning scenarios.
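A bare-bones version of simulated annealing for path planning on a toy static grid is sketched below in Python. The cost function, cooling schedule, and waypoint perturbation are illustrative assumptions; the paper's two additional mathematical operators, initial-path heuristics, and dynamic-obstacle handling are not reproduced.

```python
import math, random

# Grid world: 0 = free, 1 = obstacle.
GRID = [[0, 0, 0, 0, 0],
        [0, 1, 1, 1, 0],
        [0, 0, 0, 1, 0],
        [1, 1, 0, 1, 0],
        [0, 0, 0, 0, 0]]
START, GOAL = (0, 0), (4, 4)

def cost(path):
    """Path length plus a heavy penalty for waypoints inside obstacles."""
    length = sum(math.dist(path[i], path[i + 1]) for i in range(len(path) - 1))
    penalty = sum(100 for (r, c) in path if GRID[r][c] == 1)
    return length + penalty

def neighbor(path):
    """Perturb one interior waypoint to an adjacent cell."""
    new = list(path)
    i = random.randrange(1, len(path) - 1)
    r, c = new[i]
    dr, dc = random.choice([(-1, 0), (1, 0), (0, -1), (0, 1)])
    nr, nc = r + dr, c + dc
    if 0 <= nr < len(GRID) and 0 <= nc < len(GRID[0]):
        new[i] = (nr, nc)
    return new

def anneal(t0=10.0, cooling=0.995, steps=20000):
    # Naive straight-line initial path; the paper's approach adds
    # smarter initial-path selection heuristics at this step.
    n = 8
    path = [START] + [(round(START[0] + (GOAL[0] - START[0]) * k / n),
                       round(START[1] + (GOAL[1] - START[1]) * k / n))
                      for k in range(1, n)] + [GOAL]
    t, best = t0, path
    for _ in range(steps):
        cand = neighbor(path)
        delta = cost(cand) - cost(path)
        # Accept improvements always; accept worse moves with a
        # probability that shrinks as the temperature cools.
        if delta < 0 or random.random() < math.exp(-delta / t):
            path = cand
            if cost(path) < cost(best):
                best = path
        t *= cooling
    return best, cost(best)

random.seed(1)
path, c = anneal()
print("best path cost:", round(c, 2))
```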
Abstract:
In this paper, a polynomial time algorithm is presented for solving the Eden problem for graph cellular automata. The algorithm is based on our neighborhood elimination operation, which removes local neighborhood configurations that cannot appear in a pre-image of a given configuration. This paper presents a detailed derivation of our algorithm from first principles, together with a detailed complexity and accuracy analysis. For time complexity, it is shown that the average case of the algorithm is Θ(n²), while the best and worst cases are Ω(n) and O(n³) respectively. This represents a vast improvement in the upper bound over current methods, without compromising average case performance.
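The paper's graph-CA algorithm is not reproduced here, but the underlying pre-image existence question can be illustrated on elementary (1D, radius-1) cellular automata with cyclic boundaries. In the Python sketch below, forbidden local transitions are eliminated site by site, a simple stand-in for neighborhood elimination; a configuration is a Garden-of-Eden state exactly when no cyclic chain of surviving transitions remains.

```python
import numpy as np

def rule_table(rule_number):
    """Wolfram elementary CA rule as a lookup: (l, c, r) -> next bit."""
    return {(l, c, r): (rule_number >> (l * 4 + c * 2 + r)) & 1
            for l in range(2) for c in range(2) for r in range(2)}

def has_preimage(target, rule_number):
    """Does `target` (a cyclic 1D configuration) have a pre-image?

    States are adjacent cell pairs (x[i-1], x[i]); a transition
    (a, b) -> (b, c) survives at site i only if rule(a, b, c) equals
    target[i]. Eliminating forbidden transitions site by site is a
    simple analogue of neighborhood elimination; a pre-image exists
    iff some cyclic chain of surviving transitions remains.
    """
    rule = rule_table(rule_number)
    idx = lambda a, b: a * 2 + b
    prod = np.eye(4, dtype=int)
    for out_bit in target:
        m = np.zeros((4, 4), dtype=int)
        for a in range(2):
            for b in range(2):
                for c in range(2):
                    if rule[(a, b, c)] == out_bit:
                        m[idx(a, b), idx(b, c)] = 1
        prod = np.minimum(prod @ m, 1)   # boolean reachability product
    return np.trace(prod) > 0            # a cycle closes the pre-image

# Rule 110 on a ring of 8 cells: the all-ones configuration has a
# pre-image here; configurations without one are Garden-of-Eden states.
print(has_preimage([1] * 8, 110))
```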
Abstract:
This thesis presents an empirical study of the effects of topology on cellular automata rule spaces. The classical definition of a cellular automaton is restricted to a regular lattice, often with periodic boundary conditions. This definition is extended to allow for arbitrary topologies. The dynamics of cellular automata within the triangular tessellation were analysed when transformed to 2-manifolds of topological genus 0, genus 1 and genus 2. Cellular automata dynamics were analysed from a statistical mechanics perspective. The sample sizes required to obtain accurate entropy calculations were determined by an entropy error analysis, which tracked the error in the computed entropy as sample size increased. Each cellular automata rule space was sampled repeatedly, and the selected cellular automata were simulated over many thousands of trials for each topology, yielding an entropy distribution for each rule space. The computed entropy distributions are indicative of the distribution of cellular automata dynamical classes. Comparison of these dynamical class distributions using the E-statistic identified that such topological changes cause the distributions to alter. This is a significant result which implies that both global structure and local dynamics play an important role in defining the long-term behaviour of cellular automata.
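As a minimal sketch of the entropy error analysis described above, the following Python code estimates Shannon entropy at increasing sample sizes and measures the deviation from a large-sample reference; the biased binary source is a hypothetical stand-in for sampled cellular automaton states.

```python
import math, random
from collections import Counter

def shannon_entropy(samples):
    """Shannon entropy (bits) of an empirical distribution."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def entropy_error_curve(sample_source, sizes, reference_size=100_000):
    """Entropy error at increasing sample sizes versus a large-sample
    reference, mimicking the thesis's entropy error analysis."""
    reference = shannon_entropy([sample_source() for _ in range(reference_size)])
    return [(n, abs(shannon_entropy([sample_source() for _ in range(n)])
                    - reference)) for n in sizes]

random.seed(0)
# Hypothetical stand-in for sampled CA cell states (a biased binary source).
source = lambda: 1 if random.random() < 0.3 else 0
for n, err in entropy_error_curve(source, [100, 1000, 10000]):
    print(f"n={n:>6}  |H_n - H_ref| = {err:.4f}")
```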
Abstract:
This chapter introduces different theoretical approaches to negotiation and provides an explanation of these differing frameworks. While the action of a negotiation centres on the background research undertaken and what happens at the negotiation table, there is a need to know what principles and assumptions are informing these activities. Theories offer a way of understanding the underlying structures, processes and relationships of negotiation. Further, negotiation theories also assist with focusing attention on the 'basis of the bargain' and provide a standpoint from which to judge offers and counter-offers during the negotiation.
Abstract:
Purpose – The purpose of this paper is to review the theory and models underlying project management (PM) research degrees that encourage reflective learning. Design/methodology/approach – A review of the literature and reflection on the practice of being actively involved in conducting and supervising academic research and disseminating academic output. The paper argues the case for the potential usefulness of reflective academic research to PM practitioners. It also highlights theoretical drivers of, and barriers to, reflective academic research by PM practitioners. Findings – A reflective learning approach to research can drive practical results, although it requires a great deal of commitment and support from both academic and industry partners. Practical implications – This paper suggests how PM practitioners can engage in academic research that has practical outcomes, and how to be more effective at disseminating those outcomes. Originality/value – Advanced academic degrees, in particular those completed by PM practitioners, can provide a valuable source of innovative ideas and approaches that should be more quickly absorbed into the PM profession's body of knowledge. The value of this paper lies in critically reviewing such research and in helping reduce the time it takes for useful reflective academic research to be adopted by industry.
Abstract:
I agree with Costanza and Finkelstein (2015) that it is futile to further invest in the study of generational differences in the work context due to a lack of appropriate theory and methods. The key problem with the generations concept is that splitting continuous variables such as age or time into a few discrete units involves arbitrary cutoffs and atheoretical groupings of individuals (e.g., stating that all people born between the early 1960s and early 1980s belong to Generation X). As noted by methodologists, this procedure leads to a loss of information about individuals and reduced statistical power (MacCallum, Zhang, Preacher, & Rucker, 2002). Due to these conceptual and methodological limitations, I regard it as very difficult if not impossible to develop a “comprehensive theory of generations” (Costanza & Finkelstein, p. 20) and to rigorously examine generational differences at work in empirical studies.
Abstract:
The data structure of an information system can significantly impact the ability of end users to efficiently and effectively retrieve the information they need. This research develops a methodology for evaluating, ex ante, the relative desirability of alternative data structures for end user queries. It theorizes that the data structure yielding the lowest weighted average complexity for a representative sample of information requests is the most desirable data structure for end user queries. The theory was tested in an experiment that compared queries against two different relational database schemas. As theorized, end users querying the data structure associated with the less complex queries performed better. Complexity was measured using three different Halstead metrics, each of which provided excellent predictions of end user performance. This research supplies strong evidence that organizations can use complexity metrics to evaluate, ex ante, the desirability of alternative data structures, and can use these evaluations to enhance the efficient and effective retrieval of information by creating data structures that minimize end user query complexity.
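To make the approach concrete, here is a hedged Python sketch of Halstead-style metrics applied to SQL queries; the token classification and keyword list are crude illustrative assumptions, not the paper's instrument. Halstead volume is V = N log2(n), where N = N1 + N2 is the total count of operators and operands and n = n1 + n2 is the count of distinct ones.

```python
import math, re

# Illustrative assumption: SQL keywords and symbols count as operators,
# everything else (identifiers, literals) as operands.
SQL_KEYWORDS = {"select", "from", "where", "join", "on", "and", "or",
                "group", "by", "order", "having", "=", "<", ">", ",", "."}

def halstead(query):
    """Crude Halstead metrics for a SQL query string."""
    tokens = re.findall(r"[A-Za-z_]\w*|[=<>,.]", query.lower())
    operators = [t for t in tokens if t in SQL_KEYWORDS]
    operands = [t for t in tokens if t not in SQL_KEYWORDS]
    n1, n2 = len(set(operators)), len(set(operands))
    N1, N2 = len(operators), len(operands)
    volume = (N1 + N2) * math.log2(n1 + n2)
    difficulty = (n1 / 2) * (N2 / n2)
    return {"volume": round(volume, 1),
            "difficulty": round(difficulty, 1),
            "effort": round(volume * difficulty, 1)}

# The same hypothetical information request against two schemas: the
# schema requiring a join yields the higher-complexity query.
print(halstead("select name from customer where city = 'Paris'"))
print(halstead("select c.name from customer c join address a "
               "on c.addr_id = a.id where a.city = 'Paris'"))
```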
Abstract:
The mechanical behavior of the vertebrate skull is often modeled using free-body analysis of simple geometric structures and, more recently, finite-element (FE) analysis. In this study, we compare experimentally collected in vivo bone strain orientations and magnitudes from the cranium of the American alligator with those extrapolated from a beam model and extracted from an FE model. The strain magnitudes predicted from beam and FE skull models bear little similarity to relative and absolute strain magnitudes recorded during in vivo biting experiments. However, quantitative differences between principal strain orientations extracted from the FE skull model and recorded during the in vivo experiments were smaller, and both generally matched expectations from the beam model. The differences in strain magnitude between the data sets may be attributable to the level of resolution of the models, the material properties used in the FE model, and the loading conditions (i.e., external forces and constraints). This study indicates that FE models and modeling of skulls as simple engineering structures may give a preliminary idea of how these structures are loaded, but whenever possible, modeling results should be verified with either in vitro or preferably in vivo testing, especially if precise knowledge of strain magnitudes is desired.
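For reference, the beam idealization mentioned above predicts surface strain at a cross-section from the bending moment via eps = M*c/(E*I). The short Python computation below uses entirely hypothetical placeholder numbers (not alligator data) to show the order-of-magnitude character of such estimates.

```python
# Bending strain at the outer surface of an idealized beam: eps = M*c/(E*I).
# All numbers are illustrative placeholders, not measured alligator data.
bite_force = 200.0           # N, applied load at the tip (hypothetical)
lever_arm = 0.15             # m, distance from bite point to the section
width, height = 0.04, 0.05   # m, idealized rectangular cross-section
E = 1.0e10                   # Pa, cortical-bone-scale Young's modulus

M = bite_force * lever_arm   # bending moment, N*m
I = width * height**3 / 12   # second moment of area, m^4
c = height / 2               # distance from neutral axis to outer fibre, m
strain = M * c / (E * I)
print(f"surface strain ~ {strain * 1e6:.0f} microstrain")
```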
Abstract:
Words and Silences is the official online journal of the International Oral History Association. It is an internationally peer-reviewed, high-quality forum for oral historians from a wide range of disciplines and a means for the professional community to share projects and current trends in oral history from around the world. We are extremely pleased to release the first online issue of Words & Silences. This e-journal is the result of long-standing discussion and debate about the best way to publish a quality bilingual oral history journal (including a blind peer-reviewed section) as a viable solution to the mounting difficulties associated with publishing in print. We have discovered that an online version is not without its own challenges and requires tremendous, labor-intensive dedication. We strongly encourage members to assist us with small review process tasks in the future, so that we can ensure the sustainability of an annual W&S publication for our members and beyond.
Abstract:
The work of Italian-based photo-artist Patrick Nicholas is analysed to show how his re-workings of classic 'old-master' paintings can be seen as the art of 'redaction', shedding new light on the relationship between originality and copying. I argue that redactional creativity is both highly productive of new meanings and a reinvention of the role of the medieval Golden Legend (Lives of the Saints).