917 results for COHERENT
Abstract:
One of the great puzzles in the psychology of visual perception is that the visual world appears to be a coherent whole despite our viewing it through a temporally discontinuous series of eye fixations. The investigators attempted to explain this puzzle from the perspective of sequential visual information integration. In recent years, investigators have hypothesized that information maintained in visual short-term memory (VSTM) can gradually become a visual mental image during a delay in the visual buffer and be integrated with currently perceived information. Some preliminary studies have investigated the integration of VSTM with visual percepts, but further research is required to address several questions about the spatio-temporal characteristics, information representation, and mechanism of integrating sequential visual information. Based on the theory of similarity between visual mental imagery and visual perception, this research (comprising three studies) employed the temporal integration paradigm and the empty cell localization task to further explore the spatio-temporal characteristics, information representation, and mechanism of integrating sequential visual information (sequential arrays). Study 1 further explored the temporal characteristics of sequential visual information integration by examining how the encoding time of sequential stimuli affects integration. Study 2 further explored the spatial characteristics of integration by investigating the effects of changes in spatial characteristics. Study 3 explored the representation of information maintained in VSTM and the integration mechanism, using behavioral experiments and eye-tracking technology.
The results indicated that: (1) Sequential arrays could be integrated without strategic instruction. Increasing the duration of the first array improved performance, whereas increasing the duration of the second array did not. The temporal-correlation model could not explain sequential-array integration under long-ISI conditions. (2) Stimulus complexity influenced not only overall performance with sequential arrays but also the ISI at which performance reached asymptote. Sequential arrays could still be integrated when their spatial characteristics changed. During the ISI, constructing and manipulating the visual mental image of array 1 were two separate processing phases. (3) While integrating sequential arrays, people represented the pattern formed by the object images maintained in VSTM, and the topological characteristics of those images influenced fixation location. The image-perception integration hypothesis was supported when the number of dots in array 1 was smaller than the number of empty cells, and the convert-and-compare hypothesis was supported when the number of dots in array 1 was equal to or greater than the number of empty cells. These findings not only deepen our understanding of sequential visual information integration but also have practical applications in visual interface design.
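The logic of the empty cell localization task can be sketched as follows. This is an illustrative toy version with an invented grid size and helper names, not the authors' stimuli: a grid with all cells but one filled with dots is split into two partial arrays presented in sequence, and only an observer who integrates both arrays can report the empty cell.

```python
import random

def make_sequential_arrays(grid_size=5, seed=0):
    """Build a trial: every cell of a grid_size x grid_size grid holds a dot
    except one, and the dots are split into two partial arrays shown in turn."""
    rng = random.Random(seed)
    cells = [(r, c) for r in range(grid_size) for c in range(grid_size)]
    empty = rng.choice(cells)
    dots = [cell for cell in cells if cell != empty]
    rng.shuffle(dots)
    half = len(dots) // 2
    return set(dots[:half]), set(dots[half:]), empty

def locate_empty_cell(array1, array2, grid_size=5):
    """Only by combining both partial arrays can the single cell that never
    contained a dot be identified."""
    seen = array1 | array2
    missing = [(r, c) for r in range(grid_size) for c in range(grid_size)
               if (r, c) not in seen]
    return missing[0] if len(missing) == 1 else None

array1, array2, empty = make_sequential_arrays()
print(locate_empty_cell(array1, array2) == empty)  # True
```

Neither partial array alone determines the answer (each leaves many cells unseen), which is what makes the task a probe of cross-fixation integration.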
Abstract:
Research on naïve physics investigates children’s intuitive understanding of physical objects, phenomena and processes. Children, and also many adults, have been found to hold a misconception of inertia called the impetus theory. To investigate the development of this naïve concept and the mechanism underlying it, four age groups (5-year-olds, 2nd graders, 5th graders, and 8th graders) were included in this research. Modified experimental tasks were used to explore the effects of daily experience, perceptual cues and general information-processing ability on children’s understanding of inertia. The results of this research are: 1) Five- to thirteen-year-olds’ understanding of inertia problems involving two objects moving at the same speed follows an L-shaped developmental trend; children’s performance became worse as they got older, and their performance in the experiment did not necessarily improve with their cognitive abilities. 2) The L-shaped developmental curve suggests that children of different ages used different strategies to solve inertia problems: five- to eight-year-olds used only a heuristic strategy, while eleven- to thirteen-year-olds solved problems by analyzing the details of inertial motion. 3) The difference in performance between familiar and unfamiliar problems showed that older children were not able to spontaneously transfer their knowledge and experience from everyday action and observation of inertia to unfamiliar, abstract inertia problems. 4) Five- to eight-year-olds showed straight and fragmented response patterns, while more eleven- to thirteen-year-olds showed standard and revised impetus-theory patterns, indicating that younger children were influenced by perceptual cues and held a fragmented understanding of inertia, while older children held a coherent impetus theory. 5) When perceptual cues were controlled, as many as 40 percent of 5-year-olds showed the information-processing ability to analyze the distance, speed and time of two objects traveling in different directions at the same time, demonstrating that they had reached the level necessary to theorize their naïve concept of inertia.
Abstract:
A fundamental understanding of the information carrying capacity of optical channels requires the signal and physical channel to be modeled quantum mechanically. This thesis considers the problems of distributing multi-party quantum entanglement to distant users in a quantum communication system and determining the ability of quantum optical channels to reliably transmit information. A recent proposal for a quantum communication architecture that realizes long-distance, high-fidelity qubit teleportation is reviewed. Previous work on this communication architecture is extended in two primary ways. First, models are developed for assessing the effects of amplitude, phase, and frequency errors in the entanglement source of polarization-entangled photons, as well as fiber loss and imperfect polarization restoration, on the throughput and fidelity of the system. Second, an error model is derived for an extension of this communication architecture that allows for the production and storage of three-party entangled Greenberger-Horne-Zeilinger states. A performance analysis of the quantum communication architecture in qubit teleportation and quantum secret sharing communication protocols is presented. Recent work on determining the channel capacity of optical channels is extended in several ways. Classical capacity is derived for a class of Gaussian Bosonic channels representing the quantum version of classical colored Gaussian-noise channels. The proof is strongly motivated by the standard technique of whitening Gaussian noise used in classical information theory. Minimum output entropy problems related to these channel capacity derivations are also studied. These single-user Bosonic capacity results are extended to a multi-user scenario by deriving capacity regions for single-mode and wideband coherent-state multiple access channels.
An even larger capacity region is obtained when the transmitters use nonclassical Gaussian states, and an outer bound on the ultimate capacity region is presented.
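As context for the capacity results summarized above, the standard benchmark in this literature is the classical capacity of the single-mode pure-loss Bosonic channel with transmissivity $\eta$ and mean input photon number $\bar{N}$ (the colored-noise and multiple-access results of the thesis generalize well beyond this simple case):

```latex
C = g(\eta \bar{N}), \qquad g(x) = (x+1)\log_2(x+1) - x\log_2 x ,
```

where $g(x)$ is the von Neumann entropy of a thermal state with mean photon number $x$; this capacity is achieved with coherent-state encoding.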
Abstract:
Recent developments in microfabrication and nanotechnology will enable the inexpensive manufacturing of massive numbers of tiny computing elements with sensors and actuators. New programming paradigms are required for obtaining organized and coherent behavior from the cooperation of large numbers of unreliable processing elements that are interconnected in unknown, irregular, and possibly time-varying ways. Amorphous computing is the study of developing and programming such ultrascale computing environments. This paper presents an approach to programming an amorphous computer by spontaneously organizing an unstructured collection of processing elements into cooperative groups and hierarchies. This paper introduces a structure called an AC Hierarchy, which logically organizes processors into groups at different levels of granularity. The AC hierarchy simplifies programming of an amorphous computer through new language abstractions, facilitates the design of efficient and robust algorithms, and simplifies the analysis of their performance. Several example applications are presented that greatly benefit from the AC hierarchy. This paper introduces three algorithms for constructing multiple levels of the hierarchy from an unstructured collection of processors.
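One plausible sketch of how an unstructured collection of processors might self-organize into groups of bounded extent is given below. This is an invented illustration of the general idea (random leaders recruiting nearby nodes), not a reproduction of the paper's three hierarchy-construction algorithms; the function names and hop radius are assumptions.

```python
import random
from collections import deque

def form_groups(adjacency, hop_radius=2, seed=1):
    """Sketch of one hierarchy level: processors wake in random order; an
    unclaimed processor becomes a leader and recruits every still-unclaimed
    processor within hop_radius hops of it."""
    rng = random.Random(seed)
    nodes = list(adjacency)
    rng.shuffle(nodes)                 # random wake-up order
    group_of = {}
    for leader in nodes:
        if leader in group_of:
            continue                   # already recruited by someone else
        group_of[leader] = leader      # leader claims itself
        frontier, depth = deque([leader]), {leader: 0}
        while frontier:                # bounded BFS recruits members
            u = frontier.popleft()
            if depth[u] == hop_radius:
                continue
            for v in adjacency[u]:
                if v not in group_of:
                    group_of[v] = leader
                    depth[v] = depth[u] + 1
                    frontier.append(v)
    return group_of

# A small line network of six processors: 0-1-2-3-4-5
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 5], 5: [4]}
groups = form_groups(adj)
```

Running the same step on the graph of group leaders would yield the next, coarser level of the hierarchy, which is the sense in which groups nest into levels of increasing granularity.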
Abstract:
The Listener is an automated system that unintrusively performs knowledge acquisition from informal input. The Listener develops a coherent internal representation of a description from an initial set of disorganized, imprecise, incomplete, ambiguous, and possibly inconsistent statements. The Listener can produce a summary document from its internal representation to facilitate communication, review, and validation. A special purpose Listener, called the Requirements Apprentice (RA), has been implemented in the software requirements acquisition domain. Unlike most other requirements analysis tools, which start from a formal description language, the focus of the RA is on the transition between informal and formal specifications.
Abstract:
Parallel shared-memory machines with hundreds or thousands of processor-memory nodes have been built; in the future we will see machines with millions or even billions of nodes. Associated with such large systems is a new set of design challenges. Many problems must be addressed by an architecture in order for it to be successful; of these, we focus on three in particular. First, a scalable memory system is required. Second, the network messaging protocol must be fault-tolerant. Third, the overheads of thread creation, thread management and synchronization must be extremely low. This thesis presents the complete system design for Hamal, a shared-memory architecture which addresses these concerns and is directly scalable to one million nodes. Virtual memory and distributed objects are implemented in a manner that requires neither inter-node synchronization nor the storage of globally coherent translations at each node. We develop a lightweight fault-tolerant messaging protocol that guarantees message delivery and idempotence across a discarding network. A number of hardware mechanisms provide efficient support for massive multithreading and fine-grained synchronization. Experiments are conducted in simulation, using a trace-driven network simulator to investigate the messaging protocol and a cycle-accurate simulator to evaluate the Hamal architecture. We determine implementation parameters for the messaging protocol which optimize performance. A discarding network is easier to design and can be clocked at a higher rate, and we find that with this protocol its performance can approach that of a non-discarding network. Our simulations of Hamal demonstrate the effectiveness of its thread management and synchronization primitives. In particular, we find register-based synchronization to be an extremely efficient mechanism which can be used to implement a software barrier with a latency of only 523 cycles on a 512 node machine.
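The two guarantees named above, message delivery and idempotence over a discarding network, can be illustrated with a toy retransmit-and-deduplicate scheme. This is a generic textbook sketch with invented names and parameters, not Hamal's actual protocol: the sender retransmits until acknowledged, and the receiver uses sequence numbers so that duplicate retransmissions are acknowledged but never re-delivered.

```python
import random

class Receiver:
    """Delivers each message exactly once even when the network forces
    duplicate retransmissions (idempotence via sequence numbers)."""
    def __init__(self):
        self.next_seq = 0
        self.delivered = []

    def on_packet(self, seq, payload):
        if seq == self.next_seq:       # fresh message: deliver it once
            self.delivered.append(payload)
            self.next_seq += 1
        return seq                     # always ack, even for duplicates

def send_reliably(receiver, seq, payload, drop_prob=0.5, rng=None):
    """Retransmit until an ack for seq arrives, over a network that
    discards packets and acks independently with probability drop_prob."""
    rng = rng or random.Random(42)
    while True:
        if rng.random() >= drop_prob:          # data packet survives
            ack = receiver.on_packet(seq, payload)
            if rng.random() >= drop_prob:      # ack survives
                if ack == seq:
                    return

rx = Receiver()
rng = random.Random(7)
for i, msg in enumerate(["a", "b", "c"]):
    send_reliably(rx, i, msg, rng=rng)
```

A lost ack causes a retransmission of an already-delivered message, which the receiver acknowledges without delivering again; this is why a discarding network can be tolerated without duplicating side effects.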
Abstract:
The heated debate over the conflict between ethics and economics is often described as an epochal issue, an expression of present-day fragility resulting from the implosion of the development model that has characterised western society. The debate, however, exposes a paradox. Whilst, on the one hand, neoclassical economic theory is radically criticized, on the other, such criticism does not appear to delineate any solid, practicable alternative. Thus, mainstream economic theory is still taught, practised by individuals as well as institutions, and further developed by the prevailing academic research. For this reason, a viable alternative needs to be sought, along with a new research methodology that would make it possible to apply novel and more coherent theoretical assumptions to effective research and real cases. The theoretical instruments used to model human behaviour need to take into account the biological foundations of behaviour, expressed in terms of evolutionary genetics. The aim of this paper is to establish whether our moral knowledge of economics may claim any scientific objectivity in light of advances in subject areas that differ in their scope and methods: moral philosophy, economics, cognitive neuroscience and artificial intelligence, each of which makes a specific contribution to understanding the operation of the human mind and the formation of the moral values on which economic choice and action are founded. Given that the object of study of economic science is the analysis of complex systems, the most efficient method nowadays seems to be artificial-life simulation.
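To give a flavor of the artificial-life methodology the paper advocates, here is a minimal agent-based sketch in which a heritable behavioral disposition evolves under payoff-proportional selection. Every name and parameter below is invented for illustration; the paper's actual simulations are not reproduced.

```python
import random

def evolve_trait(n_agents=100, generations=50, benefit=3.0, cost=1.0, seed=0):
    """Agents carry a heritable probability of cooperating. Each generation,
    agents are paired at random, cooperation pays a benefit to the partner
    at a cost to the actor, and fitter traits reproduce with small mutation."""
    rng = random.Random(seed)
    traits = [rng.random() for _ in range(n_agents)]    # P(cooperate)
    for _ in range(generations):
        payoff = [0.0] * n_agents
        order = list(range(n_agents))
        rng.shuffle(order)
        for a, b in zip(order[::2], order[1::2]):       # random pairing
            ca = rng.random() < traits[a]
            cb = rng.random() < traits[b]
            payoff[a] += (benefit if cb else 0.0) - (cost if ca else 0.0)
            payoff[b] += (benefit if ca else 0.0) - (cost if cb else 0.0)
        base = min(payoff)                              # shift to nonnegative
        weights = [p - base + 1e-6 for p in payoff]
        parents = rng.choices(range(n_agents), weights=weights, k=n_agents)
        traits = [min(1.0, max(0.0, traits[p] + rng.gauss(0, 0.02)))
                  for p in parents]
    return sum(traits) / n_agents

mean_cooperation = evolve_trait()
```

The point of such models is methodological: hypotheses about the biological foundations of economic behaviour are made precise enough to run, and their population-level consequences are observed rather than assumed.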
Abstract:
Lee, M.H. (2000). 'Qualitative Modelling of Linear Networks in ECAD Applications', Expert Update, Vol. 3, No. 2, pp. 23–32, BCS SGES, Summer 2000. Also published as: Lee, M. (1999). 'Qualitative modelling of linear networks in ECAD applications', in Proceedings of the 13th International Workshop on Qualitative Reasoning (QR '99), pp. 146–152.
Abstract:
Urquhart, C. (editor for JUSTEIS team), Spink, S., Thomas, R., Yeoman, A., Durbin, J., Turner, J., Armstrong, A., Lonsdale, R. & Fenton, R. (2003). JUSTEIS (JISC Usage Surveys: Trends in Electronic Information Services) Strand A: survey of end users of all electronic information services (HE and FE), with Action research report. Final report 2002/2003 Cycle Four. Aberystwyth: Department of Information Studies, University of Wales Aberystwyth with Information Automation Ltd (CIQM). Sponsorship: JISC
Abstract:
Raufaste, C., Dollet, B., Cox, S., Jiang, Y. and Graner, F. (2007). Yield drag in a two-dimensional foam flow around a circular obstacle: Effect of liquid fraction. European Physical Journal E, 23 (2), 217–228. Sponsorship: Y.J. is supported by US DOE under contract No. DE-AC52-06NA25396. S.C. is supported by EPSRC (EP/D071127/1)
Abstract:
Bain, William, 'Are There Any Lessons of History?: The English School and the Activity of Being an Historian', International Politics (2007) 44(5) pp.513-530 RAE2008
Abstract:
Scully, Roger, Becoming Europeans? Attitudes, Roles and Socialisation in the European Parliament (Oxford: Oxford University Press, 2005), pp.vii+168 RAE2008
Abstract:
Bain, William, 'One Order, Two Laws: Recovering the 'Normative' in English School Theory', Review of International Studies, (2007) 33(4) pp.557-575 RAE2008
Abstract:
Slocombe, William, Nihilism and the Sublime Postmodern (New York: Routledge, 2005) RAE2008
Abstract:
Kargl, Florian; Meyer, A.; Koza, M.M.; Schober, H., (2006) 'Formation of channels for fast-ion diffusion in alkali silicate melts: A quasielastic neutron scattering study', Physical Review B: Condensed Matter and Materials Physics 74 pp.14304 RAE2008