998 results for Quantum statistics
Abstract:
We utilise the well-developed quantum decision models known to the QI community to create a higher order social decision making model. A simple Agent Based Model (ABM) of a society of agents with changing attitudes towards a social issue is presented, where the private attitudes of individuals in the system are represented using a geometric structure inspired by quantum theory. We track the changing attitudes of the members of that society, and their resulting propensities to act, or not, in a given social context. A number of new issues surrounding this "scaling up" of quantum decision theories are discussed, as well as new directions and opportunities.
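For readers outside the QI community, a minimal, hypothetical sketch of the kind of agent representation described above may help: each agent's private attitude is a unit state vector and its propensity to act is a Born-rule-style squared projection. All names (`Agent`, `propensity_to_act`, `nudge_towards`) and the influence rule are illustrative assumptions, not taken from the paper.

```python
import numpy as np

class Agent:
    """Toy agent whose private attitude is a unit vector (quantum-style state)."""
    def __init__(self, theta):
        # Attitude encoded as a 2-D real state: cos(theta)|act> + sin(theta)|not act>
        self.state = np.array([np.cos(theta), np.sin(theta)])

    def propensity_to_act(self):
        # Born-rule-style propensity: squared projection onto the 'act' basis vector
        return self.state[0] ** 2

    def nudge_towards(self, other, rate=0.1):
        # Simple (assumed) social influence: rotate the attitude slightly towards a neighbour
        blended = (1 - rate) * self.state + rate * other.state
        self.state = blended / np.linalg.norm(blended)

# A small society drifting under random pairwise influence
rng = np.random.default_rng(0)
society = [Agent(theta) for theta in rng.uniform(0, np.pi / 2, size=50)]
for _ in range(100):
    a, b = rng.choice(society, size=2, replace=False)
    a.nudge_towards(b)
print("mean propensity to act:", np.mean([ag.propensity_to_act() for ag in society]))
```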
Abstract:
This work is a theoretical investigation into the coupling of a single excited quantum emitter to the plasmon mode of a V groove waveguide. The V groove waveguide consists of a triangular channel milled in gold; the emitter is modeled as a dipole and could represent a quantum dot, a nitrogen vacancy in diamond, or similar. In this work the dependence of the coupling efficiency of the emitter to the plasmon mode is determined for various geometrical parameters of the emitter-waveguide system. Using the finite element method, the effect of the emitter position and orientation, groove angle, groove depth, and tip radius on the coupling efficiency is studied in detail. We demonstrate that all parameters, with the exception of groove depth, have a significant impact on the attainable coupling efficiency. Understanding the effect of various geometrical parameters on the coupling between emitters and the plasmonic mode of the waveguide is essential for the design and optimization of quantum dot–V groove devices.
Abstract:
Key establishment is a crucial primitive for building secure channels in a multi-party setting. Without quantum mechanics, key establishment can only be done under the assumption that some computational problem is hard. Since digital communication can be easily eavesdropped and recorded, it is important to consider the secrecy of information while anticipating future algorithmic and computational discoveries which could break the secrecy of past keys, violating the secrecy of the confidential channel. Quantum key distribution (QKD) can be used to generate secret keys that are secure against any future algorithmic or computational improvements. QKD protocols still require authentication of classical communication, although existing security proofs of QKD typically assume idealized authentication. It is generally considered folklore that QKD, when used with computationally secure authentication, is still secure against an unbounded adversary, provided the adversary did not break the authentication during the run of the protocol. We describe a security model for quantum key distribution extending classical authenticated key exchange (AKE) security models. Using our model, we characterize the long-term security of the BB84 QKD protocol with computationally secure authentication against an eventually unbounded adversary. By basing our model on traditional AKE models, we can more readily compare the relative merits of various forms of QKD and existing classical AKE protocols. This comparison illustrates in which types of adversarial environments different quantum and classical key agreement protocols can be secure.
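As context for the BB84 protocol mentioned above, here is a minimal sketch of its sifting step (keeping only positions where the two parties chose the same basis). It deliberately omits the quantum channel, eavesdropping detection and, importantly, the classical-channel authentication that the paper's security model addresses; all names are illustrative.

```python
import secrets

def bb84_sift(n_qubits=1024):
    """Toy BB84 sifting: keep only positions where Alice's and Bob's bases match.

    This sketches the raw-key step only; authentication of the classical
    channel (the focus of the abstract above) is not modelled here.
    """
    alice_bits  = [secrets.randbits(1) for _ in range(n_qubits)]
    alice_bases = [secrets.randbits(1) for _ in range(n_qubits)]  # 0 = rectilinear, 1 = diagonal
    bob_bases   = [secrets.randbits(1) for _ in range(n_qubits)]

    # With no eavesdropper and matching bases, Bob's measurement equals Alice's bit.
    sifted = [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases) if a == b]
    return sifted

key = bb84_sift()
print(f"sifted key length: {len(key)} of 1024 positions")
```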
Abstract:
Queensland University of Technology (QUT) was one of the first universities in Australia to establish an institutional repository. Launched in November 2003, the repository (QUT ePrints) uses the EPrints open source repository software (from Southampton) and has enjoyed the benefit of an institutional deposit mandate since January 2004. Currently (April 2012), the repository holds over 36,000 records, including 17,909 open access publications, with another 2,434 publications embargoed but with mediated access enabled via the ‘Request a copy’ button, which is a feature of the EPrints software. At QUT, the repository (QUT ePrints, http://eprints.qut.edu.au) is managed by the Library. The repository is embedded into a number of other systems at QUT, including the staff profile system and the University’s research information system. It has also been integrated into a number of critical processes related to Government reporting and research assessment. Internally, senior research administrators often look to the repository for information to assist with decision-making and planning. While some statistics could be drawn from the advanced search feature and the existing download statistics feature, they were rarely at the level of granularity or aggregation required. Getting the information from the ‘back end’ of the repository was very time-consuming for the Library staff. In 2011, the Library funded a project to enhance the range of statistics available from the public interface of QUT ePrints. The repository team conducted a series of focus groups and individual interviews to identify and prioritise functionality requirements for a new statistics ‘dashboard’. The participants included a mix of research administrators, early career researchers and senior researchers. The repository team identified a number of business criteria (e.g. extensibility, support available, skills required) and gave each a weighting. After considering all the known options available, five software packages (IRStats, ePrintsStats, AWStats, BIRT and Google Urchin/Analytics) were thoroughly evaluated against a list of 69 criteria to determine which would be most suitable. The evaluation revealed that IRStats was the best fit for our requirements; it was deemed capable of meeting 21 out of the 31 high-priority criteria. Consequently, IRStats was implemented as the basis for QUT ePrints’ new statistics dashboards, which were launched in Open Access Week, October 2011. Statistics dashboards are now available at four levels: whole-of-repository, organisational unit, individual author and individual item. The data available includes cumulative total deposits, time-series deposits, deposits by item type, % full texts, % open access, cumulative downloads, time-series downloads, downloads by item type, author ranking, paper ranking (by downloads), downloader geographic location, domains, internal vs external downloads, citation data (from Scopus and Web of Science), most popular search terms, and non-search referring websites. The data is displayed in chart, map and table formats. The new statistics dashboards are a great success. Feedback received from staff and students has been very positive. Individual researchers have said that they have found the information to be very useful when compiling a track record. It is now very easy for senior administrators (including the Deputy Vice Chancellor-Research) to compare the full-text deposit rates (i.e. mandate compliance rates) across organisational units. This has led to increased ‘encouragement’ from Heads of School and Deans in relation to the provision of full-text versions.
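A hypothetical illustration of the weighted-criteria evaluation described above; the package names come from the abstract, but the criteria, weights and scores below are invented for the example.

```python
# Illustrative weighted-criteria scoring of software options.
# Only the package names come from the abstract; everything else is assumed.
criteria_weights = {"extensibility": 3, "support available": 2, "skills required": 1}

scores = {  # hypothetical 0-5 rating of each package against each criterion
    "IRStats":                 {"extensibility": 4, "support available": 4, "skills required": 3},
    "AWStats":                 {"extensibility": 2, "support available": 3, "skills required": 4},
    "Google Urchin/Analytics": {"extensibility": 3, "support available": 5, "skills required": 2},
}

def weighted_total(package):
    # Sum of (criterion weight x package rating) across all criteria
    return sum(criteria_weights[c] * scores[package][c] for c in criteria_weights)

for package in sorted(scores, key=weighted_total, reverse=True):
    print(package, weighted_total(package))
```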
Abstract:
This article presents a methodology that integrates cumulative plots with probe vehicle data for the estimation of travel time statistics (average, quartiles) on urban networks. The integration reduces the relative deviation among the cumulative plots so that the classical analytical procedure of defining the area between the plots as the total travel time can be applied. For quartile estimation, a slicing technique is proposed. The methodology is validated with real data from Lucerne, Switzerland, and it is concluded that the travel time estimates from the proposed methodology are statistically equivalent to the observed values.
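A minimal sketch of the classical procedure the abstract relies on: the area between the upstream and downstream cumulative plots is the total travel time, so the average travel time is that area divided by the vehicle count. The synthetic data and function names are illustrative, not from the article.

```python
import numpy as np

def average_travel_time(t, upstream_cum, downstream_cum):
    """Average travel time from cumulative vehicle counts at two detectors.

    The area between the upstream and downstream cumulative plots is the
    total vehicle-time spent on the link; dividing by the number of
    vehicles gives the average travel time.
    """
    gap = np.asarray(upstream_cum, dtype=float) - np.asarray(downstream_cum, dtype=float)
    # Trapezoidal integration of the gap between the two cumulative curves
    total_vehicle_time = float(np.sum(np.diff(t) * (gap[:-1] + gap[1:]) / 2.0))
    n_vehicles = float(downstream_cum[-1])
    return total_vehicle_time / n_vehicles

# Synthetic check: a link that every vehicle takes about 60 s to traverse
t = np.arange(0.0, 600.0, 10.0)                                  # seconds
upstream = np.minimum(t / 2.0, 250.0)                            # cumulative count upstream
downstream = np.minimum(np.maximum(t - 60.0, 0.0) / 2.0, 250.0)  # same curve delayed by 60 s
print("average travel time [s]:", average_travel_time(t, upstream, downstream))
```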
Abstract:
Many donors, particularly those contemplating a substantial donation, consider whether their donation will be deductible from their taxable income. This motivation is not lost on fundraisers who conduct appeals before the end of the taxation year to capitalise on such desires. The motivation is also not lost on Treasury analysts who perceive the tax deduction as “lost” revenue and wonder if the loss is “efficient” in economic terms. Would it be more efficient for the government to give grants to deserving organisations, rather than permitting donor directed gifts? Better still, what about contracts that lock in the use of the money for a government priority? What place does tax deduction play in influencing a donor to give? Does the size of the gift bear any relationship to the size of the tax deduction? Could an increased level of donations take up an increasing shortfall in government welfare and community infrastructure spending? Despite these questions being asked regularly, little has been rigorously established about the effect of taxation deductions on a donor’s gifts.
Abstract:
Much of our understanding of human thinking is based on probabilistic models. This innovative book by Jerome R. Busemeyer and Peter D. Bruza argues that, actually, the underlying mathematical structures from quantum theory provide a much better account of human thinking than traditional models. They introduce the foundations for modelling probabilistic-dynamic systems using two aspects of quantum theory. The first, "contextuality", is a way to understand interference effects found with inferences and decisions under conditions of uncertainty. The second, "entanglement", allows cognitive phenomena to be modelled in non-reductionist ways. Employing these principles drawn from quantum theory allows us to view human cognition and decision in a totally new light...
Abstract:
The term “vagueness” describes a property of natural concepts, which normally have fuzzy boundaries, admit borderline cases, and are susceptible to Zeno’s sorites paradox. We will discuss the psychology of vagueness, especially experiments investigating the judgment of borderline cases and contradictions. In the theoretical part, we will propose a probabilistic model that describes the quantitative characteristics of the experimental findings and extends Alxatib’s and Pelletier’s (2011) theoretical analysis. The model is based on a Hopfield network for predicting truth values. Powerful as this classical perspective is, we show that it falls short of providing an adequate coverage of the relevant empirical results. In the final part, we will argue that a substantial modification of the analysis put forward by Alxatib and Pelletier and its probabilistic pendant is needed. The proposed modification replaces the standard notion of probabilities by quantum probabilities. The crucial phenomenon of borderline contradictions can then be explained as a quantum interference phenomenon.
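A small numerical illustration (with invented amplitudes and projection angles, not the authors' model parameters) of how quantum probabilities admit a nonzero judged probability for a borderline contradiction such as "x is tall and x is not tall", via non-commuting projectors evaluated sequentially.

```python
import numpy as np

# Illustrative (invented) amplitudes for a borderline case of "tall".
theta = np.pi / 4                                  # borderline individual: equal amplitudes
state = np.array([np.cos(theta), np.sin(theta)])   # amplitudes over the {tall, not tall} basis

P_tall = state[0] ** 2

# Sequential judgment "tall, then not tall": project onto |tall>, then onto a
# "not tall" direction from a basis rotated by phi (so the projectors do not commute).
phi = np.pi / 3
not_tall_rotated = np.array([-np.sin(phi), np.cos(phi)])
after_tall = np.array([state[0], 0.0])             # unnormalised state after judging "tall"
P_tall_then_not_tall = np.dot(after_tall, not_tall_rotated) ** 2

print("P(tall):", P_tall)
print("P(tall, then not tall):", P_tall_then_not_tall)  # nonzero: a borderline contradiction
```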
Abstract:
Purpose – The purpose of this paper is to summarise a successfully defended doctoral thesis. The main purpose of this paper is to provide a summary of the scope and main issues raised in the thesis so that readers undertaking studies in the same or connected areas may be aware of current contributions to the topic. The secondary aims are to frame the completed thesis in the context of doctoral-level research in project management as well as to offer ideas for further investigation which would serve to extend scientific knowledge on the topic. Design/methodology/approach – Research reported in this paper is based on a quantitative study using inferential statistics aimed at better understanding the actual and potential usage of earned value management (EVM) as applied to external projects under contract. Theories uncovered during the literature review were hypothesized and tested using experiential data collected from 145 EVM practitioners with direct experience on one or more external projects under contract that applied the methodology. Findings – The results of this research suggest that EVM is an effective project management methodology. The principles of EVM were shown to be significant positive predictors of project success on contracted efforts, and to be a relatively greater positive predictor of project success when using fixed-price versus cost-plus (CP) type contracts. Moreover, EVM's work-breakdown structure (WBS) utility was shown to positively contribute to the formation of project contracts. The contribution was not significantly different between fixed-price and CP contracted projects, with exceptions in the areas of schedule planning and payment planning. EVM's “S” curve benefited the administration of project contracts. The contribution of the S-curve was not significantly different between fixed-price and CP contracted projects. Furthermore, EVM metrics were shown to also be important contributors to the administration of project contracts. The relative contribution of EVM metrics to projects under fixed-price versus CP contracts was not significantly different, with one exception in the area of evaluating and processing payment requests. Practical implications – These results have important implications for project practitioners, EVM advocates, as well as corporate and governmental policy makers. EVM should be considered for all projects – not only for its positive contribution to project contract development and administration, but also for its contribution to project success, regardless of contract type. Contract type should not be the sole determining factor in the decision whether or not to use EVM. More particularly, the more fixed the contracted project cost, the more the principles of EVM explain the success of the project. EVM mechanics should also be used in all projects regardless of contract type. Payment planning using a WBS should be emphasized in fixed-price contracts using EVM in order to help mitigate performance risk. Schedule planning using a WBS should be emphasized in CP contracts using EVM in order to help mitigate financial risk. Similarly, EVM metrics should be emphasized in fixed-price contracts in evaluating and processing payment requests. Originality/value – This paper provides a summary of cutting-edge research work and a link to the published thesis that researchers can use to help them understand how the research methodology was applied as well as how it can be extended.
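For reference, the standard earned value management indices the thesis builds on can be sketched as follows; the figures in the example are invented.

```python
def evm_metrics(pv, ev, ac):
    """Standard earned value management indices from planned value (PV),
    earned value (EV) and actual cost (AC), all in the same currency units."""
    return {
        "cost variance (CV)": ev - ac,
        "schedule variance (SV)": ev - pv,
        "cost performance index (CPI)": ev / ac,
        "schedule performance index (SPI)": ev / pv,
    }

# Example: a contracted project that is slightly over budget and behind schedule
print(evm_metrics(pv=120_000, ev=100_000, ac=110_000))
```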
Abstract:
This study of English Coronial practice raises a number of questions, not only regarding state investigations of suicide, but also regarding the role of the Coroner itself. Following observations at over 20 inquests into possible suicides, and in-depth interviews with six Coroners, three main issues emerged: first, there exists considerable slippage between different Coroners over which deaths are likely to be classified as suicide; second, the high standard of proof required, and the immense pressure faced by Coroners from family members at inquest to reach any verdict other than suicide, can significantly depress likely suicide rates; and finally, Coroners feel no professional obligation, either individually or collectively, to contribute to the production of consistent and useful social data regarding suicide, arguably rendering comparative suicide statistics relatively worthless. These issues lead, ultimately, to a more important question about the role we expect Coroners to play within social governance, and within an effective, contemporary democracy.
Abstract:
A one-time program is a hypothetical device by which a user may evaluate a circuit on exactly one input of his choice, before the device self-destructs. One-time programs cannot be achieved by software alone, as any software can be copied and re-run. However, it is known that every circuit can be compiled into a one-time program using a very basic hypothetical hardware device called a one-time memory. At first glance it may seem that quantum information, which cannot be copied, might also allow for one-time programs. But it is not hard to see that this intuition is false: one-time programs for classical or quantum circuits based solely on quantum information do not exist, even with computational assumptions. This observation raises the question, "what assumptions are required to achieve one-time programs for quantum circuits?" Our main result is that any quantum circuit can be compiled into a one-time program assuming only the same basic one-time memory devices used for classical circuits. Moreover, these quantum one-time programs achieve statistical universal composability (UC-security) against any malicious user. Our construction employs methods for computation on authenticated quantum data, and we present a new quantum authentication scheme called the trap scheme for this purpose. As a corollary, we establish UC-security of a recent protocol for delegated quantum computation.
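A toy, purely classical simulation of the one-time memory primitive referred to above: a device holding two secrets that reveals exactly one of them and then self-destructs. This only illustrates the interface; the abstract's point is that such behaviour cannot be realised by software alone, so the class below is an assumption-laden stand-in, not a real device.

```python
import secrets

class OneTimeMemory:
    """Toy model of a one-time memory: stores two secrets, reveals exactly one,
    then self-destructs. A software simulation for illustration only."""
    def __init__(self, secret0: bytes, secret1: bytes):
        self._secrets = (secret0, secret1)
        self._used = False

    def read(self, choice: int) -> bytes:
        # Reveal the chosen secret once; any further read fails.
        if self._used:
            raise RuntimeError("one-time memory already consumed")
        self._used = True
        value = self._secrets[choice]
        self._secrets = None  # "self-destruct"
        return value

otm = OneTimeMemory(secrets.token_bytes(16), secrets.token_bytes(16))
key = otm.read(1)   # the user learns one secret...
# otm.read(0)       # ...and any second read raises RuntimeError
```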
Abstract:
According to social constructivists, learners are active participants in constructing new knowledge in a social process where they interact with others. In these social settings teachers or more knowledgeable peers provide support. This research study investigated the contribution that an online synchronous tutorial makes to support teaching and learning of undergraduate introductory statistics offered by an Australian regional university at a distance. The introductory statistics course, which served as the research setting in this study, was a requirement of a variety of programs at the University, including psychology, business and science. Often students in these programs perceive this course to be difficult and irrelevant to their programs of study. Negative attitudes and associated anxiety mean that students often struggle with the content. While asynchronous discussion forums have been shown to provide a level of interaction and support, it was anticipated that online synchronous tutorials would offer immediate feedback to move students forward through “stuck places.” At the beginning of the semester the researcher offered distance students in this course the opportunity to participate in a weekly online synchronous tutorial which was an addition to the usual support offered by the teaching team. This tutorial was restricted to 12 volunteers to allow sufficient interaction to occur for each of the participants. The researcher, as participant-observer, conducted the weekly tutorials using the University's interactive online learning platform, Wimba Classroom, whereby participants interacted using audio, text chat and a virtual whiteboard. Prior to the start of semester, participants were surveyed about their previous mathematical experiences, their perceptions of the introductory statistics course and why they wanted to participate in the online tutorial. During the semester, they were regularly asked pertinent research questions related to their personal outcomes from the tutorial sessions. These sessions were recorded using screen capture software and the participants were interviewed about their experiences at the end of the semester. Analysis of these data indicated that the perceived value of the online synchronous tutorial lies in the interaction with fellow students and a content expert and in the immediacy of feedback given. The collaborative learning environment offered the support required to maintain motivation, enhance confidence and develop problem-solving skills in these distance students of introductory statistics. Based on these findings a model of online synchronous learning is proposed.
Abstract:
Computer vision is increasingly becoming interested in the rapid estimation of object detectors. The canonical strategy of using Hard Negative Mining to train a Support Vector Machine is slow, since the large negative set must be traversed at least once per detector. Recent work has demonstrated that, with an assumption of signal stationarity, Linear Discriminant Analysis is able to learn comparable detectors without ever revisiting the negative set. Even with this insight, the time to learn a detector can still be on the order of minutes. Correlation filters, on the other hand, can produce a detector in under a second. However, this involves the unnatural assumption that the statistics are periodic, and requires the negative set to be re-sampled per detector size. These two methods differ chiefly in the structure which they impose on the covariance matrix of all examples. This paper is a comparative study which develops techniques (i) to assume periodic statistics without needing to revisit the negative set and (ii) to accelerate the estimation of detectors with aperiodic statistics. It is experimentally verified that periodicity is detrimental.
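A minimal sketch of the LDA-style detector estimation alluded to above, where the negative-set statistics (mean and covariance) are computed once and reused for every detector; the synthetic data and names are illustrative.

```python
import numpy as np

def lda_detector(pos_examples, neg_mean, neg_cov):
    """LDA-style detector template: w = Sigma^{-1} (mu_pos - mu_neg).

    neg_mean and neg_cov are precomputed once from the negative set (the
    stationarity assumption mentioned above); only the positives change per
    detector, so the negative set is never revisited.
    """
    mu_pos = pos_examples.mean(axis=0)
    return np.linalg.solve(neg_cov, mu_pos - neg_mean)

# Synthetic illustration with 100-dimensional features
rng = np.random.default_rng(0)
d = 100
neg = rng.normal(size=(5000, d))                          # "background" statistics, computed once
neg_mean = neg.mean(axis=0)
neg_cov = np.cov(neg, rowvar=False) + 1e-3 * np.eye(d)    # regularised covariance
pos = rng.normal(loc=0.5, size=(20, d))                   # a few positives for a new detector
w = lda_detector(pos, neg_mean, neg_cov)
print("mean positive score:", (pos @ w).mean())
```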
Abstract:
A cell classification algorithm that uses first, second and third order statistics of pixel intensity distributions over pre-defined regions is implemented and evaluated. A cell image is segmented into 6 regions extending from a boundary layer to an inner circle. First, second and third order statistical features are extracted from histograms of pixel intensities in these regions. The third order statistical features used are one-dimensional bispectral invariants. 108 features were considered as candidates for AdaBoost-based fusion. The best 10-stage fused classifier was selected for each class and a decision tree was constructed for the 6-class problem. The classifier is robust, accurate and fast by design.
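A hedged sketch of per-region, order-wise intensity statistics. Plain skewness is used here only as a stand-in for the third-order features; the paper's third-order features are one-dimensional bispectral invariants, which are not computed here, and the region layout is invented for the example.

```python
import numpy as np

def region_statistics(pixels):
    """First, second and third order moments of a region's pixel intensities
    (mean, variance, skewness). Skewness is a simplified stand-in for the
    bispectral invariants used in the abstract above."""
    pixels = np.asarray(pixels, dtype=float).ravel()
    mean = pixels.mean()
    var = pixels.var()
    skew = ((pixels - mean) ** 3).mean() / (var ** 1.5 + 1e-12)
    return mean, var, skew

# Example: split a toy cell image into 6 concentric regions and extract features
rng = np.random.default_rng(1)
image = rng.integers(0, 256, size=(64, 64))
yy, xx = np.indices(image.shape)
r = np.hypot(yy - 31.5, xx - 31.5)                 # distance of each pixel from the centre
bins = np.linspace(0, r.max(), 7)                  # 6 annular regions from centre to boundary
features = [region_statistics(image[(r >= bins[i]) & (r < bins[i + 1])]) for i in range(6)]
print(features[0])
```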
Abstract:
In this paper we model a quantum dot in close proximity to a gap plasmon waveguide to study quantum dot–plasmon interactions. Assuming that the waveguide is single mode, this paper is concerned with the dependence of the spontaneous emission rate of the quantum dot on waveguide dimensions such as width and height. We compare the coupling efficiency of a gap waveguide in symmetric and asymmetric configurations, illustrating that the symmetric waveguide has better coupling efficiency to the quantum dot. We also demonstrate that an optimally placed quantum dot near a symmetric waveguide with a 50 nm x 50 nm cross section can capture 80% of the spontaneous emission into a guided plasmon mode.
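A minimal sketch of the beta-factor definition of coupling efficiency implied above: the fraction of spontaneous emission captured by the guided plasmon mode. The decay rates in the example are invented, chosen only so that the result matches the quoted 80%.

```python
def coupling_efficiency(gamma_plasmon, gamma_radiative, gamma_nonradiative=0.0):
    """Beta factor: fraction of the emitter's spontaneous emission captured by
    the guided plasmon mode, given decay rates into the plasmon mode, into
    free-space radiation, and into non-radiative channels (same units, e.g.
    normalised to the free-space decay rate)."""
    return gamma_plasmon / (gamma_plasmon + gamma_radiative + gamma_nonradiative)

# Illustrative (invented) rates for an optimally placed dot near a 50 nm x 50 nm gap waveguide
print(coupling_efficiency(gamma_plasmon=4.0, gamma_radiative=0.8, gamma_nonradiative=0.2))  # 0.8
```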