990 results for Classical Information
Abstract:
We consider discrete-time versions of two classical problems in the optimal control of admission to a queueing system: i) optimal routing of arrivals to two parallel queues and ii) optimal acceptance/rejection of arrivals to a single queue. We extend the formulation of these problems to permit a k-step delay in the controller's observation of the queue lengths. For geometric inter-arrival times and geometric service times, the problems are formulated as controlled Markov chains with expected total discounted cost as the minimization objective. For problem i) we show that when k = 1, the optimal policy is to allocate an arrival to the queue with the smaller expected queue length (JSEQ: Join the Shortest Expected Queue). We also show that for this problem, for k ≥ 2, JSEQ is not optimal. For problem ii) we show that when k = 1, the optimal policy is a threshold policy. There are, however, two thresholds m(0) ≥ m(1) > 0, such that m(0) is used when the previous action was to reject, and m(1) is used when the previous action was to accept.
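A minimal sketch of the k = 1 JSEQ rule described above, assuming each queue completes one service per slot with some probability when non-empty; the service probabilities mu1/mu2 and the helper names are illustrative, not taken from the paper.

```python
# Illustrative sketch of the JSEQ (Join the Shortest Expected Queue) rule for k = 1.
# Assumptions (not from the paper): each queue completes a service in one slot with
# probability mu_i when non-empty, and the controller knows the queue lengths one
# slot ago plus its own previous routing decision.

def expected_length(x_prev: int, routed_here: bool, mu: float) -> float:
    """One-step-ahead expected queue length given the delayed observation."""
    x = x_prev + (1 if routed_here else 0)        # add the arrival routed last slot
    return x - mu * (1.0 if x > 0 else 0.0)       # subtract expected service completion

def jseq_route(x1_prev, x2_prev, prev_choice, mu1, mu2):
    """Return 1 or 2: the queue with the smaller expected length."""
    e1 = expected_length(x1_prev, prev_choice == 1, mu1)
    e2 = expected_length(x2_prev, prev_choice == 2, mu2)
    return 1 if e1 <= e2 else 2

if __name__ == "__main__":
    # Delayed observations (3, 2), last arrival sent to queue 1, service rates 0.6 / 0.4.
    print(jseq_route(3, 2, prev_choice=1, mu1=0.6, mu2=0.4))  # -> 2
```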
Abstract:
An attempt is made to present some challenging problems (mainly to the technically minded researchers) in the development of computational models for certain (visual) processes which are executed with apparently deceptive ease by the human visual system. However, in the interest of simplicity (and with a nonmathematical audience in mind), the presentation is almost completely devoid of mathematical formalism. Some of the findings in biological vision are presented in order to provoke some approaches to their computational models. The development of ideas is not complete, and the vast literature on biological and computational vision cannot be reviewed here. A related but rather specific aspect of computational vision (namely, detection of edges) has been discussed by Zucker, who brings out some of the difficulties experienced in the classical approaches. Space limitations here preclude any detailed analysis of even the elementary aspects of information processing in biological vision. However, the main purpose of the present paper is to highlight some of the fascinating problems in the frontier area of mathematically modelling the human visual system.
Abstract:
Onset and evolution of Rayleigh-Bénard (R-B) convection are investigated using the Information Preservation (IP) method. The information velocity and temperature are updated using the Octant Flux Splitting (OFS) model developed by Masters & Ye based on the Maxwell transport equation suggested by Sun & Boyd. Statistical noise inherent in particle approaches such as the direct simulation Monte Carlo (DSMC) method is effectively reduced by the IP method, and therefore the evolution from an initial quiescent fluid to a final steady state is shown clearly. An interesting phenomenon is observed: when the Rayleigh number (Ra) exceeds its critical value, there exists an obvious incubation stage. During the incubation stage, the vortex structure clearly appears and evolves, whereas the Nusselt number (Nu) of the lower plate remains close to unity. After the incubation stage, the vortex velocity and Nu rapidly increase, and the flow field quickly reaches a steady, convective state. The Nu-Ra relation given by IP agrees with those given by DSMC, classical theory and experimental data.
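For reference, the two dimensionless groups mentioned above are defined in the standard way (textbook definitions, not specific to this paper); the symbols below are the conventional ones, not the paper's notation.

```latex
% Standard definitions: Rayleigh number Ra (buoyancy vs. diffusion) and
% Nusselt number Nu (convective vs. conductive heat transfer).
\[
  Ra = \frac{g\,\beta\,\Delta T\, L^{3}}{\nu\,\alpha},
  \qquad
  Nu = \frac{h\,L}{k},
\]
% g: gravitational acceleration, beta: thermal expansion coefficient,
% Delta T: plate temperature difference, L: plate spacing, nu: kinematic viscosity,
% alpha: thermal diffusivity, h: heat transfer coefficient, k: thermal conductivity.
```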
Abstract:
Authority files serve to uniquely identify real-world 'things' or entities like documents, persons and organisations, and their properties, like relations and features. Already important in the classical library world, authority files are indispensable for adequate information retrieval and analysis in the computer age. This is because, even more than humans, computers are poor at handling ambiguity. Through authority files, people tell computers which terms, names or numbers refer to the same thing or have the same meaning by giving equivalent notions the same identifier. Authority files thus act as signposts on the internet, where these identifiers are interlinked on the basis of relevance. When executing a query, computers are able to navigate from identifier to identifier by following these links and to collect the queried information along these so-called 'crosswalks'. In this context, identifiers also go under the name of controlled access points. Identifiers become even more crucial now that massive data collections like library catalogues or research datasets are releasing their hitherto contained data directly to the internet. This development has been coined Linked Open Data. Accordingly, the internet is then referred to as the Web of Data instead of the classical Web of Documents.
Abstract:
An entangled two-mode coherent state is studied within the framework of a 2 x 2-dimensional Hilbert space. An entanglement concentration scheme based on joint Bell-state measurements is worked out. When the entangled coherent state is embedded in a vacuum environment, its entanglement is degraded but not totally lost. It is found that the larger the initial coherent amplitude, the faster the entanglement decreases. We investigate a scheme to teleport a coherent superposition state while considering a mixed quantum channel. We find that the decohered entangled coherent state may be useless for quantum teleportation, as it gives an optimal teleportation fidelity below the classical limit of 2/3.
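For orientation, the classical limit quoted above is the standard benchmark for teleporting an unknown qubit; the relation below is the well-known fidelity formula in terms of the fully entangled fraction f of the channel (a standard result, not taken from the abstract).

```latex
% Average teleportation fidelity achievable with a channel of fully entangled
% fraction f; the classical (measure-and-prepare) limit is recovered at f = 1/2.
\[
  \bar{F}(f) \;=\; \frac{2f + 1}{3},
  \qquad
  \bar{F}_{\mathrm{classical}} \;=\; \bar{F}\!\left(\tfrac{1}{2}\right) \;=\; \frac{2}{3}.
\]
```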
Abstract:
As semiconductor electronic devices scale to the nanometer range and quantum structures (molecules, fullerenes, quantum dots, nanotubes) are investigated for use in information processing and storage, it becomes useful to explore the limits imposed by quantum mechanics on classical computing. To formulate the problem of a quantum mechanical description of classical computing, electronic devices and logic gates are described as quantum sub-systems with inputs treated as boundary conditions, outputs expressed as operator expectation values, and transfer characteristics and logic operations expressed through the sub-system Hamiltonian, with constraints appropriate to the boundary conditions. This approach naturally leads to a description of the sub-systems in terms of density matrices. Application of the maximum entropy principle subject to the boundary conditions (inputs) allows for the determination of the density matrix (logic operation), and for calculation of expectation values of operators over a finite region (outputs). The method allows for an analysis of the static properties of quantum sub-systems.
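The maximum entropy construction referred to above takes the standard Jaynes form; the sketch below uses generic constraint operators A_i for illustration, not the specific boundary-condition operators of the paper.

```latex
% Jaynes' maximum entropy principle: maximize the von Neumann entropy S subject to the
% constraints (inputs), giving a generalized Gibbs density matrix.
\[
  S[\rho] = -\mathrm{Tr}\,(\rho \ln \rho), \qquad
  \mathrm{Tr}\,(\rho A_i) = a_i, \qquad \mathrm{Tr}\,\rho = 1
\]
\[
  \Longrightarrow \quad
  \rho = \frac{1}{Z} \exp\!\Big(-\sum_i \lambda_i A_i\Big), \qquad
  Z = \mathrm{Tr}\,\exp\!\Big(-\sum_i \lambda_i A_i\Big),
\]
% The Lagrange multipliers lambda_i are fixed by the constraint values a_i;
% outputs are then expectation values computed from rho.
```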
Abstract:
This paper explores relationships between classical and parametric measures of graph (or network) complexity. Classical measures are based on vertex decompositions induced by equivalence relations. Parametric measures, on the other hand, are constructed by using information functions to assign probabilities to the vertices. The inequalities established in this paper relating classical and parametric measures lay a foundation for systematic classification of entropy-based measures of graph complexity.
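A minimal sketch of a parametric (information-function-based) graph entropy of the kind discussed above; the choice of information function f (here, vertex degree) and the toy graph are illustrative only.

```python
# Illustrative parametric graph-entropy measure: an information function f assigns a
# positive value to each vertex, the values are normalized into probabilities, and the
# Shannon entropy of that distribution is reported. Using f = degree is just one example.
import math

def graph_entropy(adjacency: dict[str, set[str]], f=None) -> float:
    """Entropy of the vertex distribution induced by the information function f."""
    if f is None:
        f = lambda v: len(adjacency[v]) or 1  # default: vertex degree (kept positive)
    weights = {v: f(v) for v in adjacency}
    total = sum(weights.values())
    probs = [w / total for w in weights.values()]
    return -sum(p * math.log2(p) for p in probs if p > 0)

if __name__ == "__main__":
    # A path on four vertices: end vertices have degree 1, inner vertices degree 2.
    path = {"a": {"b"}, "b": {"a", "c"}, "c": {"b", "d"}, "d": {"c"}}
    print(round(graph_entropy(path), 4))  # entropy of the degree-induced distribution
```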
Abstract:
We consider two celebrated criteria for defining the nonclassicality of bipartite bosonic quantum systems, the first stemming from information theoretic concepts and the second from physical constraints on the quantum phase space. Consequently, two sets of allegedly classical states are singled out: (i) the set C composed of the so-called classical-classical (CC) states—separable states that are locally distinguishable and do not possess quantum discord; (ii) the set P of states endowed with a positive P representation (P-classical states)—mixtures of Glauber coherent states that, e.g., fail to show negativity of their Wigner function. By showing that C and P are almost disjoint, we prove that the two defining criteria are maximally inequivalent. Thus, the notions of classicality that they put forward are radically different. In particular, generic CC states show quantumness in their P representation, and vice versa, almost all P-classical states have positive quantum discord and, hence, are not CC. This inequivalence is further elucidated considering different applications of P-classical and CC states. Our results suggest that there are other quantum correlations in nature than those revealed by entanglement and quantum discord.
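For reference, the two state families compared above are conventionally defined as follows (standard definitions; the symbols are chosen here for illustration, not taken from the paper).

```latex
% Classical-classical (CC) states: diagonal in a product of local orthonormal bases,
% hence with zero quantum discord in both directions.
\[
  \rho_{CC} \;=\; \sum_{i,j} p_{ij}\, |i\rangle\langle i| \otimes |j\rangle\langle j|,
  \qquad p_{ij} \ge 0,\;\; \sum_{i,j} p_{ij} = 1 .
\]
% P-classical states: mixtures of two-mode Glauber coherent states with a
% non-negative Glauber-Sudarshan P function.
\[
  \rho_{P} \;=\; \int d^2\alpha\, d^2\beta\; P(\alpha,\beta)\,
  |\alpha\rangle\langle\alpha| \otimes |\beta\rangle\langle\beta|,
  \qquad P(\alpha,\beta) \ge 0 .
\]
```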
Abstract:
Parametric interactions in nonlinear crystals represent a powerful tool in the optical manipulation of information, both in the classical and the quantum regime. Here, we analyze in detail classical and quantum aspects of three- and five-mode parametric interactions in chi(2) nonlinear crystals. The equations of motion are explicitly derived and then solved within the parametric approximation. We describe several applications, including holography, all-optical gates, generation of entanglement, and telecloning. Experimental results on the photon distributions and correlations of the generated beams are also reported and discussed.
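As a point of reference for the three-mode case, the trilinear chi(2) interaction is commonly written as below, with the parametric approximation replacing the pump mode by a classical amplitude; the mode labels and the form shown are the generic textbook ones, not necessarily the paper's notation.

```latex
% Trilinear chi(2) interaction for signal (a_1), idler (a_2) and pump (a_3) modes,
% and its parametric-approximation form with the pump treated as a classical field gamma.
\[
  H_{\mathrm{int}} \;=\; i\hbar\,\chi\,
  \big( a_1^{\dagger} a_2^{\dagger} a_3 \;-\; a_1 a_2 a_3^{\dagger} \big)
  \;\;\xrightarrow[\;a_3 \to \gamma\;]{\text{parametric approx.}}\;\;
  i\hbar\,\chi\gamma\, \big( a_1^{\dagger} a_2^{\dagger} - a_1 a_2 \big).
\]
```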
Abstract:
Various scientific studies have explored the causes of violent behaviour from different perspectives, with psychological tests, in particular, applied to the analysis of crime factors. The relationships between pairs of factors have also been studied extensively, including the link between age and crime. In reality, many factors interact to contribute to criminal behaviour, and as such there is a need for a greater level of insight into its complex nature. In this article we analyse violent crime information systems containing data on psychological, environmental and genetic factors. Our approach combines elements of rough set theory with fuzzy logic and particle swarm optimisation to yield an algorithm and methodology that can effectively extract multi-knowledge from information systems. The experimental results show that our approach outperforms alternative genetic algorithm and dynamic reduct-based techniques for reduct identification and has the added advantage of identifying multiple reducts and hence multi-knowledge (rules). The identified rules are consistent with classical statistical analysis of violent crime data and also reveal new insights into the interaction between several factors. As such, the results are helpful in improving our understanding of the factors contributing to violent crime and in highlighting the existence of hidden and intangible relationships between crime factors.
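A minimal sketch of the rough-set machinery underlying reduct identification as mentioned above; the toy decision table, attribute names and data are invented for illustration, and the fuzzy-logic and particle-swarm components of the authors' method are not reproduced.

```python
# Illustrative rough-set reduct search: a set of condition attributes B is a reduct if it
# preserves the dependency degree gamma of the full attribute set and no proper subset
# already does so. The toy decision table below is invented purely for illustration.
from itertools import combinations

TABLE = [
    {"impulsivity": "high", "substance_use": "yes", "age": "young", "violent": "yes"},
    {"impulsivity": "high", "substance_use": "no",  "age": "young", "violent": "yes"},
    {"impulsivity": "low",  "substance_use": "yes", "age": "adult", "violent": "no"},
    {"impulsivity": "low",  "substance_use": "no",  "age": "young", "violent": "no"},
]
CONDITIONS = ["impulsivity", "substance_use", "age"]
DECISION = "violent"

def partition(attrs):
    """Indiscernibility classes: objects grouped by their values on attrs."""
    classes = {}
    for i, row in enumerate(TABLE):
        classes.setdefault(tuple(row[a] for a in attrs), set()).add(i)
    return list(classes.values())

def gamma(attrs):
    """Dependency degree: fraction of objects in the positive region of the decision."""
    decision_classes = partition([DECISION])
    positive = sum(len(c) for c in partition(attrs)
                   if any(c <= d for d in decision_classes))
    return positive / len(TABLE)

def reducts():
    """All minimal condition-attribute subsets preserving the full dependency degree."""
    full = gamma(CONDITIONS)
    found = []
    for r in range(1, len(CONDITIONS) + 1):
        for subset in combinations(CONDITIONS, r):
            if gamma(list(subset)) == full and not any(set(s) < set(subset) for s in found):
                found.append(subset)
    return found

if __name__ == "__main__":
    print(reducts())  # e.g. [('impulsivity',)] for this toy table
```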
Abstract:
Depending on the representation setting, different combination rules have been proposed for fusing information from distinct sources. Moreover, in each setting, different sets of axioms that combination rules should satisfy have been advocated, thus justifying the existence of alternative rules (usually motivated by situations where the behavior of other rules was found unsatisfactory). These sets of axioms are usually considered purely within their own settings, without in-depth analysis of the common properties essential to all of them. This paper introduces core properties that, once properly instantiated, are meaningful in different representation settings ranging from logic to imprecise probabilities. The following representation settings are especially considered: classical set representation, possibility theory, and evidence theory, the latter encompassing the two others as special cases. This unified discussion of combination rules across different settings is expected to provide a fresh look at some old but basic issues in information fusion.
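As one concrete instance of a combination rule in the evidence-theory setting mentioned above (a standard formula, quoted here for orientation rather than taken from the paper), Dempster's rule combines two mass functions m_1 and m_2 as follows.

```latex
% Dempster's rule of combination for two basic belief assignments m_1, m_2 on a frame;
% K is the conflicting mass that is renormalized away.
\[
  (m_1 \oplus m_2)(A) \;=\; \frac{1}{1-K} \sum_{B \cap C = A} m_1(B)\, m_2(C),
  \qquad A \neq \emptyset,
\]
\[
  K \;=\; \sum_{B \cap C = \emptyset} m_1(B)\, m_2(C),
  \qquad (m_1 \oplus m_2)(\emptyset) = 0 .
\]
```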
Abstract:
Background: We aimed to test whether the three classical hypotheses of the interaction between posttraumatic symptomatology and substance use (high risk of trauma exposure, susceptibility to posttraumatic symptomatology, and self-medication of symptoms) may be useful in understanding substance use among burn patients. Methods: We analysed substance use data (nicotine, alcohol, cannabis, amphetamines, cocaine, opiates, and tranquilizers) and psychopathology measures among burn patients admitted to a Burns Unit and enrolled in a longitudinal observational study. Lifetime substance use information (n = 246) was incorporated into the analyses testing the high risk hypothesis. Only patients assessed for psychopathology at a six-month follow-up (n = 183) were included in the prospective analyses testing the susceptibility and self-medication hypotheses. Results: Regarding the high risk hypothesis, results show a higher proportion of heroin and tranquilizer users compared to the general population. Furthermore, in line with the susceptibility hypothesis, higher levels of symptomatology were found in lifetime alcohol, tobacco and drug users during recovery. The self-medication hypothesis could be tested only partially due to the "cleaning" effect of the hospital stay, but severity of symptoms was linked to caffeine, nicotine, alcohol and cannabis use after discharge. Conclusions: We found that the three classical hypotheses could be used to understand the link between traumatic experiences and substance use, explaining different patterns of burn patients' risk for trauma exposure and emergence of symptomatology.
Abstract:
Seismic data are difficult to analyze, and classical mathematical tools reveal strong limitations in exposing hidden relationships between earthquakes. In this paper, we study earthquake phenomena from the perspective of complex systems. Global seismic data covering the period from 1962 up to 2011 are analyzed. The events, characterized by their magnitude, geographic location and time of occurrence, are divided into groups, either according to the Flinn-Engdahl (F-E) seismic regions of the Earth or using a rectangular grid based on latitude and longitude coordinates. Two methods of analysis are considered and compared in this study. In the first method, the distributions of magnitudes are approximated by Gutenberg-Richter (G-R) distributions and the fitted parameters are used to reveal the relationships among regions. In the second method, the mutual information is calculated and adopted as a measure of similarity between regions. In both cases, using clustering analysis, visualization maps are generated, providing an intuitive and useful representation of the complex relationships present among seismic data. Such relationships might not be perceived on classical geographic maps. Therefore, the generated charts are a valid alternative to other visualization tools for understanding the global behavior of earthquakes.
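A minimal sketch of the first type of per-region characterization described above, assuming the standard Gutenberg-Richter law log10 N(>=M) = a - b M and Aki's maximum-likelihood estimator for b; the region names and magnitude catalogues are invented for illustration, and the clustering step is not reproduced.

```python
# Illustrative per-region Gutenberg-Richter characterization: estimate the b-value of
# each region with Aki's maximum-likelihood formula b = log10(e) / (mean(M) - M_min),
# then compare regions by the distance between their b-values. Data are invented.
import math

def b_value(magnitudes: list[float], m_min: float) -> float:
    """Aki's maximum-likelihood estimate of the Gutenberg-Richter b-value."""
    events = [m for m in magnitudes if m >= m_min]
    mean_excess = sum(events) / len(events) - m_min
    return math.log10(math.e) / mean_excess

def b_distance(region_a: list[float], region_b: list[float], m_min: float) -> float:
    """A simple dissimilarity between two regions: absolute difference of b-values."""
    return abs(b_value(region_a, m_min) - b_value(region_b, m_min))

if __name__ == "__main__":
    region_1 = [4.1, 4.3, 4.0, 5.2, 4.6, 4.4, 6.1, 4.2]   # invented catalogues
    region_2 = [4.0, 4.1, 4.2, 4.0, 4.5, 4.3, 4.1, 4.8]
    print(round(b_value(region_1, m_min=4.0), 2))
    print(round(b_distance(region_1, region_2, m_min=4.0), 2))
```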
Abstract:
Since the introduction of quantum mechanics, several mysteries of nature have found their explanations. More and more, the concepts of quantum mechanics have become intertwined with those of computational complexity theory. New ideas and solutions have been discovered and developed with the aim of solving these computational problems. In particular, quantum mechanics has shaken several security proofs of classical protocols. In this thesis, we survey recent results on the implications of quantum mechanics for computational complexity, more precisely in the case of interactive classes. We present this research using the nomenclature of cooperative games with imperfect information. We set out the differences between the classical, quantum and non-signalling theories and illustrate them through the example of the odd-cycle game. We focus our attention on two main themes: the effect on a game of adding players, and the effect of parallel repetition. We observe that these modifications have very different consequences depending on the physical theory considered.
Abstract:
Quantum information theory studies the fundamental limits that the laws of physics impose on data-processing tasks such as data compression and data transmission over a noisy channel. This thesis presents general techniques that allow several fundamental problems of quantum information theory to be solved within a single framework. The central theorem of this thesis establishes the existence of a protocol for transmitting quantum data that the receiver already partially knows, using a single instance of a noisy quantum channel. Moreover, several central theorems of quantum information theory follow as immediate corollaries of this theorem. The subsequent chapters use this theorem to prove the existence of new protocols for two other types of quantum channels, namely quantum broadcast channels and quantum channels with side information at the transmitter. These protocols also deal with the transmission of quantum data partially known to the receiver using a single use of the channel, and yield as corollaries asymptotic versions with and without auxiliary entanglement. The asymptotic versions with auxiliary entanglement can, in both cases, be regarded as quantum versions of the best known coding theorems for the classical versions of these problems. The final chapter deals with a purely quantum phenomenon called locking: it is possible to encode a classical message in a quantum state in such a way that, by removing a subsystem of size logarithmic in the total size, one can ensure that no measurement has significant correlation with the message. The message is thus "locked" by a key of logarithmic size. This thesis presents the first locking protocol whose success criterion is that the trace distance between the joint distribution of the message and the measurement outcome and the product of their marginals be sufficiently small.
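The locking success criterion described at the end can be written symbolically as below; this is a paraphrase of the sentence above, with X the classical message, M any measurement performed on the state after the logarithmic-size key subsystem is discarded, and Y its outcome (the symbols are chosen here for illustration).

```latex
% Locking criterion (symbolic paraphrase): for every measurement M on the remaining state,
% the joint distribution of message X and outcome Y is close in trace distance to the
% product of its marginals.
\[
  \sup_{\mathcal{M}} \;\tfrac{1}{2}\,
  \big\lVert\, p_{XY}^{\mathcal{M}} \;-\; p_X \otimes p_Y^{\mathcal{M}} \,\big\rVert_{1}
  \;\le\; \varepsilon .
\]
```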