992 results for Quantum process
Abstract:
A known limitation of the Probability Ranking Principle (PRP) is that it does not cater for dependence between documents. Recently, the Quantum Probability Ranking Principle (QPRP) has been proposed, which implicitly captures dependencies between documents through “quantum interference”. This paper explores whether this new ranking principle leads to improved performance for subtopic retrieval, where novelty and diversity are required. In a thorough empirical investigation, models based on the PRP, as well as other recently proposed ranking strategies for subtopic retrieval (i.e. Maximal Marginal Relevance (MMR) and Portfolio Theory (PT)), are compared against the QPRP. On the given task, it is shown that the QPRP outperforms these other ranking strategies. Unlike MMR and PT, the QPRP requires no parameter estimation or tuning, making it both simple and effective. This research demonstrates that the application of quantum theory to problems within information retrieval can lead to significant improvements.
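For context on the parameter dependence noted above, the following is a minimal sketch of the standard Maximal Marginal Relevance re-ranking criterion used as a baseline here; the trade-off weight lam is exactly the kind of parameter the QPRP does not need. The rel and sim functions are placeholders, not those used in the paper.

    # Minimal sketch of Maximal Marginal Relevance (MMR) greedy re-ranking.
    # The trade-off weight `lam` must be tuned; `rel` and `sim` are
    # placeholder relevance/similarity functions (assumed, not from the paper).
    def mmr_rank(query, docs, rel, sim, lam=0.5):
        selected, remaining = [], list(docs)
        while remaining:
            best = max(
                remaining,
                key=lambda d: lam * rel(d, query)
                - (1 - lam) * max((sim(d, s) for s in selected), default=0.0),
            )
            selected.append(best)
            remaining.remove(best)
        return selected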
Abstract:
Social tagging systems are shown to evidence a well-known cognitive heuristic, the guppy effect, which arises from the combination of different concepts. We present some empirical evidence of this effect, drawn from a popular social tagging Web service. The guppy effect is then described using a quantum-inspired formalism that has already been successfully applied to model the conjunction fallacy and probability judgement errors. Key to the formalism is the concept of interference, which is able to capture and quantify the strength of the guppy effect.
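As a rough illustration of how an interference term can quantify such a combination effect, the sketch below follows the common quantum-cognition convention in which the combined membership weight equals the classical average of the component weights plus an interference correction; the numbers are made up, and the paper's actual model may differ.

    # Illustrative only: quantify a guppy-like effect as the deviation of the
    # observed weight for the combined concept from the classical average of
    # the component weights (a common quantum-cognition formulation).
    def interference(mu_a, mu_b, mu_ab):
        return mu_ab - 0.5 * (mu_a + mu_b)

    # Hypothetical example: "pet" and "fish" separately vs. the combination "pet fish".
    print(interference(mu_a=0.5, mu_b=0.7, mu_ab=0.9))  # positive value -> overextension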
Abstract:
In this work, we summarise the development of a ranking principle based on quantum probability theory, called the Quantum Probability Ranking Principle (QPRP), and we also provide an overview of the initial experiments performed employing the QPRP. The main difference between the QPRP and the classic Probability Ranking Principle is that the QPRP implicitly captures the dependencies between documents by means of "quantum interference". Consequently, the optimal ranking of documents is not based solely on documents' probability of relevance but also on their interference with the previously ranked documents. Our research shows that the application of quantum theory to problems within information retrieval can lead to consistently better retrieval effectiveness, while still being simple, elegant and tractable.
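The greedy selection rule this describes can be sketched as follows; how the pairwise interference term is estimated (for instance, from inter-document similarity) is a modelling choice and is left as a placeholder here rather than reproduced from the paper.

    # Sketch of the greedy QPRP-style ranking rule described above: at each
    # step, pick the document whose probability of relevance plus its total
    # interference with the already-ranked documents is largest.
    # `prob_rel` and `interference` are placeholder estimators (assumptions).
    def qprp_rank(docs, prob_rel, interference):
        ranked, candidates = [], list(docs)
        while candidates:
            best = max(
                candidates,
                key=lambda d: prob_rel(d) + sum(interference(d, r) for r in ranked),
            )
            ranked.append(best)
            candidates.remove(best)
        return ranked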
Abstract:
Quantum-inspired models have recently attracted increasing attention in Information Retrieval. An intriguing characteristic of the mathematical framework of quantum theory is the presence of complex numbers. However, it is unclear what such numbers could or would actually represent or mean in Information Retrieval. The goal of this paper is to discuss the role of complex numbers within the context of Information Retrieval. First, we introduce how complex numbers are used in quantum probability theory. Then, we examine van Rijsbergen’s proposal of evoking complex-valued representations of information objects. We empirically show that such a representation is unlikely to be effective in practice (confuting its usefulness in Information Retrieval). We then explore alternative proposals which may be more successful at realising the power of complex numbers.
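To make the kind of representation under discussion concrete, here is a small, purely illustrative sketch (not the paper's specific model): documents and queries as complex amplitude vectors, scored by the squared modulus of their inner product, so that component phases can interfere.

    # Illustrative sketch of a complex-valued representation: score a
    # document against a query via the squared modulus of the inner
    # product of normalised complex amplitude vectors (Born-rule style).
    import numpy as np

    def born_score(query_amps, doc_amps):
        q = query_amps / np.linalg.norm(query_amps)
        d = doc_amps / np.linalg.norm(doc_amps)
        return float(np.abs(np.vdot(q, d)) ** 2)  # np.vdot conjugates the first argument

    q = np.array([1.0 + 0.0j, 0.5j, 0.2 + 0.1j])
    d = np.array([0.9 + 0.1j, 0.4j, 0.3 + 0.0j])
    print(born_score(q, d))  # changing only the phases changes the score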
Abstract:
In this thesis we investigate the use of quantum probability theory for ranking documents. Quantum probability theory is used to estimate the probability of relevance of a document given a user's query. We posit that quantum probability theory can lead to a better estimation of the probability of a document being relevant to a user's query than the common approach, i.e. the Probability Ranking Principle (PRP), which is based upon Kolmogorovian probability theory. Following our hypothesis, we formulate an analogy between the document retrieval scenario and a physical scenario, that of the double-slit experiment. Through the analogy, we propose a novel ranking approach, the quantum probability ranking principle (qPRP). Key to our proposal is the presence of quantum interference. Mathematically, this is the statistical deviation between empirical observations and the expected values predicted by the Kolmogorovian rule of additivity of probabilities of disjoint events, in configurations such as that of the double-slit experiment. We propose an interpretation of quantum interference in the document ranking scenario, and examine how quantum interference can be effectively estimated for document retrieval. To validate our proposal and to gain more insights about approaches for document ranking, we (1) analyse PRP, qPRP and other ranking approaches, exposing the assumptions underlying their ranking criteria and formulating the conditions for the optimality of the two ranking principles, (2) empirically compare three ranking principles (i.e. PRP, interactive PRP, and qPRP) and two state-of-the-art ranking strategies in two retrieval scenarios, those of ad-hoc retrieval and diversity retrieval, (3) analytically contrast the ranking criteria of the examined approaches, exposing similarities and differences, and (4) study the ranking behaviours of approaches alternative to PRP in terms of the kinematics they impose on relevant documents, i.e. by considering the extent and direction of the movements of relevant documents across the ranking recorded when comparing PRP against its alternatives. Our findings show that the effectiveness of the examined ranking approaches strongly depends upon the evaluation context. In the traditional evaluation context of ad-hoc retrieval, PRP is empirically shown to be better than or comparable to alternative ranking approaches. However, when we turn to evaluation contexts that account for interdependent document relevance (i.e. when the relevance of a document is assessed also with respect to other retrieved documents, as is the case in the diversity retrieval scenario), the use of quantum probability theory, and thus of qPRP, is shown to improve retrieval and ranking effectiveness over the traditional PRP and alternative ranking strategies, such as Maximal Marginal Relevance, Portfolio Theory, and Interactive PRP. This work represents a significant step forward regarding the use of quantum theory in information retrieval. It demonstrates that the application of quantum theory to problems within information retrieval can lead to improvements both in modelling power and retrieval effectiveness, allowing the construction of models that capture the complexity of information retrieval situations. Furthermore, the thesis opens up a number of lines for future research. These include: (1) investigating estimations and approximations of quantum interference in qPRP; (2) exploiting complex numbers for the representation of documents and queries; and (3) applying the concepts underlying qPRP to tasks other than document ranking.
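The deviation from Kolmogorovian additivity that the abstract refers to is usually written in its textbook double-slit form; the identity below is that standard form, quoted for orientation rather than taken from the thesis itself.

    % Textbook double-slit form of the interference term (not quoted from the thesis):
    % with both slits open, the observed probability deviates from the classical sum.
    \[
      p_{12}(x) \;=\; p_{1}(x) + p_{2}(x) + I_{12}(x),
      \qquad
      I_{12}(x) \;=\; 2\sqrt{p_{1}(x)\,p_{2}(x)}\,\cos\theta_{12}(x).
    \]
    % I_{12} = 0 recovers Kolmogorovian additivity; I_{12} != 0 is the quantum
    % interference that the qPRP carries over to interdependent document relevance.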
Abstract:
The measures by which major developments are officially approved for construction are, by common agreement, complex, time-consuming, and of questionable merit in terms of maintaining ecological viability.
Abstract:
This thesis presents novel techniques for addressing the problems of continuous change and inconsistencies in large process model collections. The developed techniques treat process models as a collection of fragments and facilitate version control, standardization and automated process model discovery using fragment-based concepts. Experimental results show that the presented techniques are beneficial in consolidating large process model collections, specifically when there is a high degree of redundancy.
Abstract:
In recent years, the imperative to communicate organisational impacts to a variety of stakeholders has gained increasing importance within all sectors. Despite growing external demands for evaluation and social impact measurement, there has been limited critically informed analysis about the presumed importance of these activities to organisational success and the practical challenges faced by organisations in undertaking such assessment. In this paper, we present the findings from an action research study of five Australian small to medium social enterprises’ practices and use of evaluation and social impact analysis. Our findings have implications for social enterprise operators, policy makers and social investors regarding when, why and at what level these activities contribute to organisational performance and the fulfilment of mission.
Abstract:
This paper critically evaluates the series of inquiries that the Australian Labor government undertook during 2011-2013 into reform of Australian media, communications and copyright laws. One important driver of policy reform was the government’s commitment to building a National Broadband Network (NBN), and the implications this had for existing broadcasting and telecommunications policy, as it would constitute a major driver of convergence of media and communications access devices and content platforms. These inquiries included: the Convergence Review of media and communications legislation; the Australian Law Reform Commission (ALRC) review of the National Classification Scheme; and the Independent Media Inquiry (Finkelstein Review) into Media and Media Regulation. One unusual feature of this review process was the degree to which academics were involved in the process, not simply as providers of expert opinion, but as review chairs seconded from their universities. This paper considers the role played by activist groups in all of these inquiries and their relationship to the various participants in the inquiries, as well as the implications of academics being engaged in such inquiries, not simply as activist-scholars, but as those primarily responsible for delivering policy review outcomes. The paper draws upon the concept of "policy windows" in order to better understand the context in which the inquiries took place, and their relative lack of legislative impact.
Abstract:
Temporary Traffic Control Plans (TCPs), which provide construction phasing to maintain traffic during construction operations, are an integral component of highway construction project design. Using the initial design, designers develop estimated quantities for the required TCP devices, which become the basis for bids submitted by highway contractors. However, actual as-built quantities are often significantly different from the engineer’s original estimate. The total cost of TCP phasing on highway construction projects amounts to 6–10% of the total construction cost. Variations between engineer-estimated quantities and final quantities contribute to reduced cost control, increased chances of cost-related litigation, and distorted bid rankings and selection. Statistical analyses of over 2000 highway construction projects were performed to determine the sources of variation, which were later used as the basis for the development of an automated hybrid prediction model that uses multiple regressions and heuristic rules to provide accurate TCP quantities and costs. The predictive accuracy of the model was demonstrated through several case studies.
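As a purely illustrative reading of the hybrid approach described above (the paper's actual predictors, coefficients, and rules are not reproduced here), a regression estimate adjusted by heuristic rules might look like this sketch.

    # Illustrative sketch only: a regression-based estimate of a TCP quantity,
    # post-processed by heuristic rules. All features, coefficients, and rules
    # here are hypothetical placeholders, not the model developed in the paper.
    def predict_tcp_quantity(project, coefficients, rules):
        estimate = coefficients.get("intercept", 0.0)
        for feature, beta in coefficients.items():
            if feature != "intercept":
                estimate += beta * project.get(feature, 0.0)
        for rule in rules:            # each rule inspects the project and returns a multiplier
            estimate *= rule(project)
        return max(estimate, 0.0)     # a device quantity cannot be negative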
Abstract:
The present article gives an overview of the reversible addition-fragmentation chain transfer (RAFT) process. RAFT is one of the most versatile living radical polymerization systems and yields polymers of predictable chain length and narrow molecular weight distribution. RAFT relies on the rapid exchange of thiocarbonylthio groups between growing polymeric chains. The key strengths of the RAFT process for polymer design are its high tolerance of monomer functionality and reaction conditions, the wide range of well-controlled polymeric architectures achievable, and its (in-principle) non-rate-retarding nature. This article introduces the mechanism of polymerization, the range of polymer molecular weights achievable, the range of monomers in which polymerization is controlled by RAFT, the various polymeric architectures that can be obtained, the type of end-group functionalities available to RAFT-made polymers, and the process of RAFT polymerization.
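The "predictable chain length" mentioned above is usually summarised by the standard RAFT design relation (a textbook expression, not quoted from this article), which assumes the number of chains is set by the RAFT agent and that initiator-derived chains are negligible.

    % Standard RAFT design relation (textbook form, assuming the RAFT agent
    % greatly outnumbers chains initiated directly by the radical initiator):
    \[
      \overline{M}_{n,\mathrm{theory}}
      \;\approx\;
      \frac{[\mathrm{M}]_{0}}{[\mathrm{RAFT}]_{0}}\;\times\;\mathrm{conversion}\;\times\;M_{\mathrm{monomer}}
      \;+\;
      M_{\mathrm{RAFT\,agent}} .
    \]
    % Narrow molecular weight distributions follow from the rapid exchange of the
    % thiocarbonylthio group, which shares growth evenly among chains.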
Abstract:
IT resources are indispensable in the management of Public Sector Organizations (PSOs) around the world. We investigate the factors that could leverage the IT resources in PSOs in developing economies. While research on ways to leverage IT resources in private sector organizations of developed countries is substantial, our understanding of ways to leverage IT resources in the public sector in developing countries is limited. The current study aspires to address this gap in the literature by seeking to determine the key factors required to create process value from public sector IT investments in developing countries. We draw on resource-centric theories to infer the nature of the factors that could leverage IT resources in the public sector. Employing an interpretive design, we identified three factors necessary for IT process value generation in the public sector. We discuss these factors and state their implications for theory and practice.
Abstract:
The previous chapters gave an insightful introduction into the various facets of Business Process Management. We now share a rich understanding of the essential ideas behind designing and managing processes for organizational purposes. We have also learned about the various streams of research and development that have influenced contemporary BPM. As a matter of fact, BPM has become a holistic management discipline. As such, it requires that a plethora of facets be addressed for its successful and sustainable application. This chapter provides a framework that consolidates and structures the essential factors that constitute BPM as a whole. Drawing from research in the field of maturity models, we suggest six core elements of BPM: strategic alignment, governance, methods, information technology, people, and culture. These six elements serve as the structure for this BPM Handbook.
Abstract:
Nuclei and electrons in condensed matter and/or molecules are usually entangled, due to the prevailing (mainly electromagnetic) interactions. However, the "environment" of a microscopic scattering system (e.g. a proton) causes ultrafast decoherence, thus making atomic and/or nuclear entanglement effects not directly accessible to experiments. However, our neutron Compton scattering experiments from protons (H-atoms) in condensed systems and molecules have a characteristic collisional time of about 100-1000 attoseconds. The quantum dynamics of an atom in this ultrashort, but finite, time window is governed by non-unitary time evolution due to the aforementioned decoherence. Unexpectedly, recent theoretical investigations have shown that decoherence can also have the following energetic consequences. Disentangling two subsystems A and B of a quantum system AB is tantamount to erasure of quantum phase relations between A and B. This erasure is widely believed to be an innocuous process, which e.g. does not affect the energies of A and B. However, two independent groups recently proved that disentangling two systems, within a sufficiently short time interval, causes an increase of their energies. This is also derivable from the simplest Lindblad-type master equation of one particle subject to pure decoherence. Our neutron-proton scattering experiments with H2 molecules provide for the first time experimental evidence of this effect. Our results reveal that the neutron-proton collision, leading to the cleavage of the H-H bond on the attosecond timescale, is accompanied by a larger energy transfer (by about 2-3%) than conventional theory predicts. Preliminary results from current investigations show qualitatively the same effect in neutron-deuteron Compton scattering from D2 molecules. We interpret the experimental findings by treating the neutron-proton (or neutron-deuteron) collisional system as an entangled open quantum system subject to fast decoherence caused by its "environment" (i.e., two electrons plus the second nucleus of H2 or D2). The presented results seem to be of generic nature, and may have considerable consequences for various processes in condensed matter and molecules, e.g. in elementary chemical reactions.
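For orientation, the "simplest Lindblad-type master equation" argument invoked above can be sketched in its textbook form (a standard derivation, not equations quoted from the paper): pure position-basis decoherence of a free particle raises its mean kinetic energy at a constant rate.

    % Textbook sketch: Lindblad equation for pure (position-basis) decoherence
    % of a free particle, and the resulting steady energy increase.
    \[
      \frac{d\rho}{dt} \;=\; -\frac{i}{\hbar}\,[H,\rho] \;-\; \Lambda\,[x,[x,\rho]],
      \qquad H = \frac{p^{2}}{2m},
    \]
    \[
      \frac{d\langle H\rangle}{dt}
      \;=\; -\frac{\Lambda}{2m}\,\mathrm{Tr}\!\big(p^{2}\,[x,[x,\rho]]\big)
      \;=\; \frac{\Lambda\hbar^{2}}{m} \;>\; 0,
    \]
    % i.e. erasing phase relations on a sufficiently short timescale is not
    % energetically innocuous, consistent with the effect described above.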
Abstract:
The Comment by Mayers and Reiter criticizes our work on two counts. Firstly, it is claimed that the quantum decoherence effects that we report in consequence of our experimental analysis of neutron Compton scattering from H in gaseous H2 are not, as we maintain, outside the framework of conventional neutron scattering theory. Secondly, it is claimed that we did not really observe such effects, owing to a faulty analysis of the experimental data, which are claimed to be in agreement with conventional theory. In this response we focus on the critical issue of the reliability of our experimental results and analysis. Using the same standard Vesuvio instrument programs used by Mayers et al., we show that, if the experimental results for H in gaseous H2 are in agreement with conventional theory, then those for D in gaseous D2 obtained in the same way cannot be, and vice versa. We expose a flaw in the calibration methodology used by Mayers et al. that leads to the present disagreement over the behaviour of H, namely the ad hoc adjustment of the measured H peak positions in TOF during the calibration of Vesuvio so that agreement is obtained with the expectation of conventional theory. We briefly address the question of the necessity of applying the theory of open quantum systems.