903 results for information theory
Abstract:
Despite many successes of conventional DNA sequencing methods, some DNAs remain difficult or impossible to sequence. Unsequenceable regions occur in the genomes of many biologically important organisms, including the human genome. Such regions range in length from tens to millions of bases, and may contain valuable information such as the sequences of important genes. The authors have recently developed a technique that renders a wide range of problematic DNAs amenable to sequencing. The technique is known as sequence analysis via mutagenesis (SAM). This paper presents a number of algorithms for analysing and interpreting data generated by this technique.
Abstract:
Despite the success of conventional Sanger sequencing, significant regions of many genomes still present major obstacles to sequencing. Here we propose a novel approach with the potential to alleviate a wide range of sequencing difficulties. The technique involves extracting the target DNA sequence from variants generated by the introduction of random mutations. The introduction of mutations does not destroy the original sequence information, but distributes it amongst multiple variants. Some of these variants lack the problematic features of the target and are more amenable to conventional sequencing. The technique has been successfully demonstrated at mutation levels up to an average of 18% base substitution and has been used to read previously intractable poly(A), AT-rich and GC-rich motifs.
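The abstract does not spell out the reconstruction algorithms, but the core idea it describes, i.e. that mutations redistribute rather than destroy sequence information, can be sketched with a hypothetical per-position majority vote over aligned variant reads (a simplification; the actual SAM algorithms are presented in the papers themselves):

```python
from collections import Counter

def consensus(variants):
    """Recover a target sequence by per-position majority vote
    over equal-length, aligned mutated variants."""
    return "".join(
        Counter(column).most_common(1)[0][0]
        for column in zip(*variants)
    )

# Three hypothetical variants of the target "GATTACA", each carrying
# random substitutions; no position is mutated in a majority of reads,
# so the vote recovers the original sequence.
reads = ["GATTACA", "GACTACA", "GATTAGA"]
print(consensus(reads))  # GATTACA
```

Because each variant is mutated independently, the chance that a majority of reads carry a substitution at the same position falls rapidly as the number of variants grows, which is why even an average 18% substitution rate leaves the target recoverable.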
Abstract:
This paper introduces the concept of religious information poverty in Australian state schools from an information science perspective. Information scientists have been theorising about the global information society for some time, along with its increased provision of vital information for the good of the world. Australian state schools see themselves as preparing children for effective participation in the information society, yet Australian children are currently suffering a religious illiteracy that undermines this goal. Some reasons and theories are offered to explain the existence of religious information poverty in state schools, and suggestions for professional stakeholders are offered for its alleviation.
Abstract:
“Closing the gap in curriculum development leadership” is a Carrick-funded University of Queensland project which is designed to address two related gaps in current knowledge and in existing professional development programs for academic staff. The first gap is in our knowledge of curriculum and pedagogical issues as they arise in relation to multi-year sequences of study, such as majors in generalist degrees, or core programs in more structured degrees. While there is considerable knowledge of curriculum and pedagogy at the course or individual unit of study level (e.g. Philosophy I), there is very little properly conceptualised, empirically informed knowledge about student learning (and teaching) over, say, a three-year major sequence in a traditional Arts or Sciences subject. The Carrick-funded project aims to (begin to) fill this gap through bottom-up curriculum development projects across the range of UQ’s offerings. The second gap is in our professional development programs and, indeed, in our recognition and support for the people who are in charge of such multi-year sequences of study. The major convener or program coordinator is not as well supported, in Australian and overseas professional development programs, as the lecturer in charge of a single course (or unit of study). Nor is her work likely to be taken account of in workload calculations or for the purposes of promotion and career advancement more generally. The Carrick-funded project aims to fill this gap by developing, in consultation with crucial stakeholders, amendments to existing university policies and practices. The attached documents provide a useful introduction to the project. For more information, please contact Fred D’Agostino at f.dagostino@uq.edu.au.
Abstract:
Power system real-time security assessment is one of the fundamental modules of electricity markets. Typically, when a contingency occurs, the security assessment and enhancement module is required to act within about 20 minutes to meet real-time requirements. The recent California blackout again highlighted the importance of system security. This paper proposes an approach to power system security assessment and enhancement based on information provided by a pre-defined system parameter space. The proposed scheme opens up an efficient way to perform real-time security assessment and enhancement in a competitive electricity market for the single-contingency case.
Abstract:
Data mining is the process of identifying valid, implicit, previously unknown, potentially useful and understandable information from large databases. It is an important step in the process of knowledge discovery in databases (Olaru & Wehenkel, 1999). In a data mining process, input data can be structured, semi-structured, or unstructured, and can take the form of text, categorical or numerical values. One of the important characteristics of data mining is its ability to deal with data that are large in volume, distributed, time-variant, noisy, and high-dimensional. A large number of data mining algorithms have been developed for different applications. For example, association rule mining can be useful for market basket problems, clustering algorithms can be used to discover trends in unsupervised learning problems, classification algorithms can be applied to decision-making problems, and sequential and time series mining algorithms can be used for predicting events, fault detection, and other supervised learning problems (Vapnik, 1999). Classification is among the most important tasks in data mining, particularly for data mining applications in engineering fields. Together with regression, classification is mainly used for predictive modelling. A number of classification algorithms are now in practical use. According to Sebastiani (2002), the main classification algorithms can be categorized as: decision tree and rule-based approaches such as C4.5 (Quinlan, 1996); probability methods such as the Bayesian classifier (Lewis, 1998); on-line methods such as Winnow (Littlestone, 1988) and CVFDT (Hulten, 2001); neural network methods (Rumelhart, Hinton & Williams, 1986); example-based methods such as k-nearest neighbours (Duda & Hart, 1973); and SVM (Cortes & Vapnik, 1995). Other important techniques for classification tasks include associative classification (Liu et al., 1998) and ensemble classification (Tumer, 1996).
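Of the example-based methods the abstract lists, k-nearest neighbours (Duda & Hart, 1973) is the simplest to illustrate. A minimal sketch, with made-up toy data, classifies a query point by majority label among its k closest training points:

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among the labels of its
    k nearest training points (Euclidean distance)."""
    neighbours = sorted(train, key=lambda p: math.dist(p[0], query))[:k]
    labels = [label for _, label in neighbours]
    return Counter(labels).most_common(1)[0][0]

# Toy 2-D training set: two well-separated clusters, "a" and "b".
train = [((0, 0), "a"), ((0, 1), "a"), ((1, 0), "a"),
         ((5, 5), "b"), ((5, 6), "b"), ((6, 5), "b")]

print(knn_predict(train, (0.5, 0.5)))  # a
print(knn_predict(train, (5.5, 5.5)))  # b
```

The choice of k trades off noise sensitivity (small k) against blurring of class boundaries (large k); an odd k avoids ties in two-class problems.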
Abstract:
Business process design is primarily driven by process improvement objectives. However, the role of control objectives stemming from regulations and standards is becoming increasingly important for businesses in light of recent events that led to some of the largest scandals in corporate history. As organizations strive to meet compliance agendas, there is an evident need for systematic approaches that assist in understanding the interplay between (often conflicting) business and control objectives during business process design. Our objective in this paper is twofold. We first present a research agenda in the space of business process compliance, identifying major technical and organizational challenges. We then tackle a part of the overall problem space: the effective modeling of control objectives and their subsequent propagation onto business process models. Control objective modeling is proposed through a specialized modal logic based on normative systems theory, and the visualization of control objectives on business process models is achieved procedurally. The proposed approach is demonstrated in the context of a purchase-to-pay scenario.
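The paper formalizes control objectives in a modal logic, which is not reproduced here. As a loose, hypothetical illustration of the kind of obligation involved, a precedence control ("payment is obliged to be preceded by invoice approval") can be checked against a single execution trace of a purchase-to-pay process; activity names below are invented:

```python
def satisfies_precedence(trace, control, task):
    """Check the control objective 'every occurrence of `task` must be
    preceded by an occurrence of `control`' on one execution trace.
    A simplified stand-in for a modal-logic obligation."""
    seen_control = False
    for activity in trace:
        if activity == control:
            seen_control = True
        elif activity == task and not seen_control:
            return False  # task fired before its required control
    return True

# Hypothetical purchase-to-pay traces.
compliant = ["create_po", "receive_goods", "approve_invoice", "pay_invoice"]
violating = ["create_po", "pay_invoice", "approve_invoice"]

print(satisfies_precedence(compliant, "approve_invoice", "pay_invoice"))  # True
print(satisfies_precedence(violating, "approve_invoice", "pay_invoice"))  # False
```

In the paper's setting such objectives are modelled declaratively and propagated onto process models at design time, rather than checked trace-by-trace at runtime; the sketch only conveys the flavour of a control objective as a constraint on process behaviour.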