324 results for quantum measurement
Abstract:
In this work, we summarise the development of a ranking principle based on quantum probability theory, called the Quantum Probability Ranking Principle (QPRP), and we also provide an overview of the initial experiments performed employing the QPRP. The main difference between the QPRP and the classic Probability Ranking Principle is that the QPRP implicitly captures the dependencies between documents by means of quantum interference. Consequently, the optimal ranking of documents is not based solely on the documents' probability of relevance but also on their interference with the previously ranked documents. Our research shows that the application of quantum theory to problems within information retrieval can lead to consistently better retrieval effectiveness, while still being simple, elegant and tractable.
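For orientation, the interference-aware selection rule embodied by the QPRP can be written schematically as follows; the notation (P(d) for a document's probability of relevance, I_{d,d'} for its pairwise interference with an already ranked document d', and theta for a phase term) is introduced purely for illustration and is not quoted from the paper.

% Schematic QPRP-style selection (illustrative notation): at each rank
% position, choose the not-yet-ranked document d maximising its
% probability of relevance plus its total interference with the
% documents d' already placed in the ranked list R.
\[
  P(d) + \sum_{d' \in R} I_{d,d'},
  \qquad
  I_{d,d'} = 2\sqrt{P(d)\,P(d')}\,\cos\theta_{d,d'} .
\]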
Abstract:
Quantum-inspired models have recently attracted increasing attention in Information Retrieval. An intriguing characteristic of the mathematical framework of quantum theory is the presence of complex numbers. However, it is unclear what such numbers would actually represent or mean in Information Retrieval. The goal of this paper is to discuss the role of complex numbers within the context of Information Retrieval. First, we introduce how complex numbers are used in quantum probability theory. Then, we examine van Rijsbergen’s proposal of invoking complex-valued representations of information objects. We empirically show that such a representation is unlikely to be effective in practice (confuting its usefulness in Information Retrieval). We then explore alternative proposals which may be more successful at realising the power of complex numbers.
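As background to this discussion, the way complex numbers enter quantum probability can be summarised in two standard textbook relations; this is included only for orientation and is not the paper's specific proposal.

% Standard quantum-probability bookkeeping: events carry complex
% amplitudes, probabilities are squared moduli, and the phases phi_1,
% phi_2 become observable only through the interference (cross) term
% that appears when amplitudes are superposed.
\[
  \psi_k = r_k e^{i\varphi_k}, \qquad P_k = |\psi_k|^2 = r_k^2,
\]
\[
  |\psi_1 + \psi_2|^2 = P_1 + P_2 + 2\sqrt{P_1 P_2}\,\cos(\varphi_1 - \varphi_2).
\]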
Abstract:
In this thesis we investigate the use of quantum probability theory for ranking documents. Quantum probability theory is used to estimate the probability of relevance of a document given a user's query. We posit that quantum probability theory can lead to a better estimation of the probability of a document being relevant to a user's query than the common approach, i.e. the Probability Ranking Principle (PRP), which is based upon Kolmogorovian probability theory. Following our hypothesis, we formulate an analogy between the document retrieval scenario and a physical scenario, that of the double slit experiment. Through the analogy, we propose a novel ranking approach, the quantum probability ranking principle (qPRP). Key to our proposal is the presence of quantum interference. Mathematically, this is the statistical deviation between empirical observations and the expected values predicted by the Kolmogorovian rule of additivity of probabilities of disjoint events in configurations such as that of the double slit experiment. We propose an interpretation of quantum interference in the document ranking scenario, and examine how quantum interference can be effectively estimated for document retrieval. To validate our proposal and to gain more insights about approaches for document ranking, we (1) analyse PRP, qPRP and other ranking approaches, exposing the assumptions underlying their ranking criteria and formulating the conditions for the optimality of the two ranking principles, (2) empirically compare three ranking principles (i.e. PRP, interactive PRP, and qPRP) and two state-of-the-art ranking strategies in two retrieval scenarios, those of ad-hoc retrieval and diversity retrieval, (3) analytically contrast the ranking criteria of the examined approaches, exposing similarities and differences, (4) study the ranking behaviours of approaches alternative to PRP in terms of the kinematics they impose on relevant documents, i.e. by considering the extent and direction of the movements of relevant documents across the ranking that are recorded when comparing PRP against its alternatives. Our findings show that the effectiveness of the examined ranking approaches strongly depends upon the evaluation context. In the traditional evaluation context of ad-hoc retrieval, PRP is empirically shown to be better than or comparable to alternative ranking approaches. However, when we turn to evaluation contexts that account for interdependent document relevance (i.e. when the relevance of a document is assessed also with respect to other retrieved documents, as is the case in the diversity retrieval scenario), the use of quantum probability theory, and thus of qPRP, is shown to improve retrieval and ranking effectiveness over the traditional PRP and alternative ranking strategies, such as Maximal Marginal Relevance, Portfolio theory, and Interactive PRP. This work represents a significant step forward regarding the use of quantum theory in information retrieval. It demonstrates that the application of quantum theory to problems within information retrieval can lead to improvements both in modelling power and retrieval effectiveness, allowing the construction of models that capture the complexity of information retrieval situations. Furthermore, the thesis opens up a number of lines for future research.
These include: (1) investigating estimations and approximations of quantum interference in qPRP; (2) exploiting complex numbers for the representation of documents and queries; and (3) applying the concepts underlying qPRP to tasks other than document ranking.
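For readers unfamiliar with the terminology, the double-slit reading of quantum interference described above can be stated as a deviation from Kolmogorovian additivity; the symbols p_A, p_B, p_{AB} and I_{AB} below are ours and serve only as an illustration, not as the thesis's estimator.

% Double-slit reading of interference: with both slits (events) A and B
% available, the observed probability p_AB deviates from the
% Kolmogorovian sum of the individual probabilities, and that deviation
% is precisely the interference term I_AB.
\[
  p_{AB} = p_A + p_B + I_{AB}, \qquad I_{AB} = p_{AB} - (p_A + p_B).
\]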
Abstract:
In recent years, the imperative to communicate organisational impacts to a variety of stakeholders has gained increasing importance within all sectors. Despite growing external demands for evaluation and social impact measurement, there has been limited critically informed analysis of the presumed importance of these activities to organisational success and of the practical challenges faced by organisations in undertaking such assessment. In this paper, we present the findings from an action research study of five Australian small to medium social enterprises’ practices and use of evaluation and social impact analysis. Our findings have implications for social enterprise operators, policy makers and social investors regarding when, why and at what level these activities contribute to organisational performance and the fulfilment of mission.
Abstract:
Dynamic light scattering (DLS) has become a primary nanoparticle characterization technique, with applications ranging from materials characterization to biological and environmental detection. With the expansion of DLS use from homogeneous spheres to more complicated nanostructures comes a decrease in accuracy. Much research has been performed to develop different diffusion models that account for the vastly different structures, but little attention has been given to the effect of such structures on the light scattering properties relevant to DLS. In this work, small (core size < 5 nm) core-shell nanoparticles were used as a case study to measure, by DLS, the thickness of a dodecanethiol (DDT) capping layer on Au and ZnO nanoparticles. We find that the DDT shell has very little effect on the scattering properties of the inorganic core and hence can be ignored to a first approximation. However, this results in conventional DLS analysis overestimating the hydrodynamic size in the volume- and number-weighted distributions. By introducing a simple correction formula that more accurately yields hydrodynamic size distributions, a more precise determination of the molecular shell thickness is obtained. With this correction, the measured thickness of the DDT shell was found to be 7.3 ± 0.3 Å, much less than the extended chain length of 16 Å. This organic layer thickness suggests that, on both ZnO and Au nanoparticle surfaces, the DDT monolayer on small nanoparticles adopts a compact, disordered structure rather than an open, ordered one. These observations are in agreement with published molecular dynamics results.
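For readers less familiar with DLS sizing, the hydrodynamic diameter is obtained from the measured translational diffusion coefficient through the standard Stokes-Einstein relation, and the shell (capping-layer) thickness then follows from simple core-shell geometry; the relations below are background only and are not the correction formula introduced in the paper.

% Stokes-Einstein relation for the hydrodynamic diameter d_H (k_B:
% Boltzmann constant, T: temperature, eta: solvent viscosity, D:
% measured diffusion coefficient), and the shell thickness t from the
% difference between hydrodynamic and core diameters.
\[
  d_H = \frac{k_B T}{3 \pi \eta D}, \qquad t = \frac{d_H - d_{\mathrm{core}}}{2}.
\]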
Abstract:
The Comment by Mayers and Reiter criticizes our work on two counts. Firstly, it is claimed that the quantum decoherence effects that we report in consequence of our experimental analysis of neutron Compton scattering from H in gaseous H2 are not, as we maintain, outside the framework of conventional neutron scattering theory. Secondly, it is claimed that we did not really observe such effects, owing to a faulty analysis of the experimental data, which are claimed to be in agreement with conventional theory. We focus in this response on the critical issue of the reliability of our experimental results and analysis. Using the same standard Vesuvio instrument programs used by Mayers et al., we show that, if the experimental results for H in gaseous H2 are in agreement with conventional theory, then those for D in gaseous D2 obtained in the same way cannot be, and vice versa. We expose a flaw in the calibration methodology used by Mayers et al. that leads to the present disagreement over the behaviour of H, namely the ad hoc adjustment of the measured H peak positions in time-of-flight (TOF) during the calibration of Vesuvio so that agreement is obtained with the expectation of conventional theory. We briefly address the question of the necessity of applying the theory of open quantum systems.
Abstract:
Decoherence of quantum entangled particles is observed in most systems, and is usually caused by system-environment interactions. Disentangling two subsystems A and B of a quantum system AB is tantamount to erasure of the quantum phase relations between A and B. It is widely believed that this erasure is an innocuous process, which, for example, does not affect the energies of A and B. Surprisingly, recent theoretical investigations by different groups have shown that disentangling two systems, i.e. their decoherence, can cause an increase of their energies. Applying this result to the context of neutron Compton scattering from H2 molecules, we provide, for the first time, experimental evidence that supports this prediction. The results reveal that the neutron-proton collision leading to the cleavage of the H-H bond on the sub-femtosecond timescale is accompanied by a larger energy transfer (by about 3%) than conventional theory predicts. We propose to interpret the results by considering the neutron-proton collisional system as an entangled open quantum system subject to decoherence owing to interactions with the “environment” (i.e., the two electrons plus the second proton of H2).
Abstract:
Process compliance measurement is receiving increasing attention in companies due to stricter legal requirements and market pressure for operational excellence. On the other hand, metrics to quantify process compliance have only been defined recently. A major criticism is that existing measures appear to be unintuitive. In this paper, we trace this problem back to a more foundational question: which notion of behavioural equivalence is appropriate for discussing compliance? We present a quantification approach based on behavioural profiles, which are a process abstraction mechanism. Behavioural profiles can be regarded as weaker than existing equivalence notions such as trace equivalence, and they can be calculated efficiently. As a validation, we present an implementation that measures the compliance of logs against a normative process model. This implementation is being evaluated in a case study with an international service provider.
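To make the idea concrete, the following is a minimal Python sketch of a behavioural-profile-style compliance degree, assuming the profile is derived from a set of model traces and compliance is the fraction of ordered activity pairs in a log trace whose order does not contradict the model; the function names and the simple trace-based derivation are hypothetical and are not the authors' implementation, which operates on process models rather than example traces.

from itertools import product

def behavioural_profile(traces):
    """Derive a simple behavioural profile from a set of model traces.

    Each ordered activity pair (a, b) is classified as 'strict order'
    (a observed before b but never b before a), 'reverse order',
    'interleaving' (both orders observed) or 'exclusive' (never ordered).
    """
    activities = {a for trace in traces for a in trace}
    before = set()  # (a, b) such that a occurs before b in some trace
    for trace in traces:
        for i, a in enumerate(trace):
            for b in trace[i + 1:]:
                before.add((a, b))
    profile = {}
    for a, b in product(activities, repeat=2):
        if a == b:
            continue
        ab, ba = (a, b) in before, (b, a) in before
        if ab and ba:
            profile[(a, b)] = "interleaving"
        elif ab:
            profile[(a, b)] = "strict order"
        elif ba:
            profile[(a, b)] = "reverse order"
        else:
            profile[(a, b)] = "exclusive"
    return profile

def compliance_degree(model_traces, log_trace):
    """Fraction of ordered activity pairs in the log trace whose order
    is consistent with the model's behavioural profile."""
    profile = behavioural_profile(model_traces)
    log_before = {(a, b)
                  for i, a in enumerate(log_trace)
                  for b in log_trace[i + 1:]}
    # pairs involving activities unknown to the model are ignored
    pairs = [p for p in log_before if p in profile]
    if not pairs:
        return 1.0
    consistent = sum(1 for p in pairs
                     if profile[p] in ("strict order", "interleaving"))
    return consistent / len(pairs)

if __name__ == "__main__":
    model = [["register", "check", "approve", "archive"],
             ["register", "check", "reject", "archive"]]
    log = ["register", "approve", "check", "archive"]  # swaps check/approve
    print(f"compliance degree: {compliance_degree(model, log):.2f}")

On this toy example the log trace swaps "check" and "approve" relative to the model, so five of its six ordered activity pairs are consistent and the sketch reports a compliance degree of about 0.83.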
Abstract:
Magnetic resonance imaging (MRI) offers the opportunity to study biological tissues and processes in a non-disruptive manner. The technique shows promise for the study of the load-bearing performance (consolidation) of articular cartilage and of the changes in articular cartilage that accompany osteoarthritis. Characterizing the consolidation of articular cartilage involves recording two transient characteristics: the change of strain over time and the hydrostatic excess pore pressure (HEPP). MRI study of cartilage consolidation under mechanical load is limited by difficulties in measuring the HEPP in the presence of the strong magnetic fields associated with the MRI technique. Here we describe the use of MRI to image and characterize bovine articular cartilage deforming under load in an MRI-compatible consolidometer while monitoring pressure with a Fabry-Perot interferometer-based fiber-optic pressure transducer.
Abstract:
Introduction This study investigated the sensitivity of calculated stereotactic radiotherapy and radiosurgery doses to the accuracy of the beam data used by the treatment planning system. Methods Two sets of field output factors were acquired using fields smaller than approximately 1 cm², for inclusion in the beam data used by the iPlan treatment planning system (Brainlab, Feldkirchen, Germany). One set of output factors was measured using an Exradin A16 ion chamber (Standard Imaging, Middleton, USA). Although this chamber has a relatively small collecting volume (0.007 cm³), measurements made in small fields using this chamber are subject to the effects of volume averaging, electronic disequilibrium and chamber perturbations. The second, more accurate, set of measurements was obtained by applying perturbation correction factors, calculated using Monte Carlo simulations according to a method recommended by Cranmer-Sargison et al. [1], to measurements made using a 60017 unshielded electron diode (PTW, Freiburg, Germany). A series of 12 sample patient treatments was used to investigate the effects of beam data accuracy on the resulting planned dose. These treatments, which involved 135 fields, were planned for delivery via static conformal arc and 3DCRT techniques, to targets ranging from prostates (up to 8 cm across) to meningiomas (usually more than 2 cm across) to arteriovenous malformations, acoustic neuromas and brain metastases (often less than 2 cm across). Isocentre doses were calculated for all of these fields using iPlan, and the results of using the two different sets of beam data were evaluated. Results While the isocentre doses for many fields are identical (difference = 0.0 %), there is a general trend for the doses calculated using the data obtained from corrected diode measurements to exceed the doses calculated using the less accurate Exradin ion chamber measurements (difference < 0.0 %). There are several alarming outliers (circled in Fig. 1) where doses differ by more than 3 %, in beams from sample treatments planned for volumes up to 2 cm across. Discussion and conclusions These results demonstrate that treatment planning dose calculations for SRT/SRS treatments can be substantially affected when beam data for fields smaller than approximately 1 cm² are measured inaccurately, even when treatment volumes are up to 2 cm across.
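For context, the corrected output factors described in the Methods follow the general small-field dosimetry approach of multiplying a ratio of detector readings by a Monte Carlo derived correction factor; the notation below is the generic formalism and is not necessarily that used by the cited method.

% Generic small-field output factor: ratio of detector readings M in
% the clinical (small) field f_clin and the reference field f_ref,
% multiplied by a Monte Carlo derived correction factor k that accounts
% for volume averaging and detector perturbation effects.
\[
  \Omega = \frac{M^{f_{\mathrm{clin}}}}{M^{f_{\mathrm{ref}}}} \;
           k^{f_{\mathrm{clin}},\, f_{\mathrm{ref}}} .
\]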
Abstract:
Background Parents play a significant role in shaping youth physical activity (PA). However, interventions targeting PA parenting have been ineffective. Methodological inconsistencies related to the measurement of parental influences may be a contributing factor. The purpose of this article is to review the extant peer-reviewed literature related to the measurement of general and specific parental influences on youth PA. Methods A systematic review of studies measuring constructs of PA parenting was conducted. Computerized searches were completed using PubMed, MEDLINE, Academic Search Premier, SPORTDiscus, and PsycINFO. Reference lists of the identified articles, as well as the authors' personal collections, were manually reviewed. Articles were selected on the basis of strict inclusion criteria, and details regarding the measurement protocols were extracted. A total of 117 articles met the inclusion criteria. Methodological articles that evaluated the validity and reliability of PA parenting measures (n=10) were reviewed separately from parental influence articles (n=107). Results A significant percentage of studies used measures with indeterminate validity and reliability. A significant percentage of articles did not provide sample items, describe the response format, or report the possible range of scores. No studies were located that evaluated sensitivity to change. Conclusion The reporting of measurement properties and the use of valid and reliable measurement scales need to be improved considerably.
Abstract:
Physical activity (PA) parenting research has proliferated over the past decade, with findings verifying the influential role that parents play in children's emerging PA behaviors. This knowledge, however, has not translated into effective family-based PA interventions. During a preconference workshop to the 2012 International Society for Behavioral Nutrition and Physical Activity annual meeting, a PA parenting workgroup met to: (1) discuss challenges in PA parenting research that may limit its translation; (2) identify explanations or reasons for such challenges; and (3) recommend strategies for future research. Challenges discussed by the workgroup included a proliferation of disconnected and inconsistently measured constructs, a limited understanding of the dimensions of PA parenting, and a narrow conceptualization of hypothesized moderators of the relationship between PA parenting and child PA. Potential reasons for such challenges emphasized by the group included a disinclination to employ theory when developing measures and examining predictors and outcomes of PA parenting, as well as a lack of agreed-upon measurement standards. Suggested solutions focused on the need to link PA parenting research with general parenting research, define and adopt rigorous standards of measurement, and identify new methods to assess PA parenting. As an initial step toward implementing these recommendations, the workgroup developed a conceptual model that: (1) integrates parenting dimensions from the general parenting literature into the conceptualization of PA parenting; (2) draws on behavioral and developmental theory; and (3) emphasizes areas which have been neglected to date, including precursors to PA parenting and effect modifiers.
Abstract:
The concept of dispositional resistance to change has been introduced in a series of exploratory and confirmatory analyses through which the validity of the Resistance to Change (RTC) Scale has been established (S. Oreg, 2003). However, the vast majority of participants with whom the scale was validated were from the United States. The purpose of the present work was to examine the meaningfulness of the construct and the validity of the scale across nations. Measurement equivalence analyses of data from 17 countries, representing 13 languages and 4 continents, confirmed the cross-national validity of the scale. Equivalent patterns of relationships between personal values and RTC across samples extend the nomological net of the construct and provide further evidence that dispositional resistance to change holds equivalent meanings across nations.
Abstract:
Intended to bridge the gap between the latest methodological developments and cross-cultural research, this interdisciplinary resource presents the latest strategies for analyzing cross-cultural data. Techniques are demonstrated through applications that employ cross-national data sets such as the latest European Social Survey. With an emphasis on the generalized latent variable approach, internationally prominent researchers from a variety of fields explain how the methods work, how to apply them, and how they relate to other methods presented in the book. Syntax, along with graphical and verbal explanations of the techniques, is included. [from publisher's website]