905 results for Quantum computational complexity
Abstract:
Vehicle-emitted particles are of significant concern because of their potential to influence local air quality and human health. Transport microenvironments usually contain higher concentrations of vehicle emissions than other environments, and people spend a substantial amount of time in these microenvironments when commuting. Currently, there is limited scientific knowledge on particle concentrations, passenger exposure and the distribution of vehicle emissions in transport microenvironments, partially because the instrumentation required to conduct such measurements is not available in many research centres. Information on passenger waiting time and location in such microenvironments has also not been investigated, which makes it difficult to evaluate a passenger’s spatial-temporal exposure to vehicle emissions. Furthermore, current emission models are incapable of rapidly predicting emission distribution, given the complexity of variations in emission rates that result from changes in driving conditions, as well as the time spent in each driving condition within the transport microenvironment. In order to address these gaps in scientific knowledge, this work conducted, for the first time, a comprehensive statistical analysis of experimental data, along with multi-parameter assessment, exposure evaluation and comparison, and emission model development and application, in relation to traffic-interrupted transport microenvironments. The work aimed to quantify and characterise particle emissions and human exposure in transport microenvironments, with bus stations and a pedestrian crossing identified as suitable research locations representing a typical transport microenvironment. Firstly, two bus stations in Brisbane, Australia, with different designs, were selected for measurements of particle number size distributions, and of particle number and PM2.5 concentrations, during two different seasons. Traffic and meteorological parameters were monitored simultaneously, in order to quantify particle characteristics and investigate the impact of bus flow rate, station design and meteorological conditions on particle characteristics at the stations. The results showed higher concentrations of PN20-30 at the station situated in an open area (open station), which is likely attributable to the lower average daily temperature compared to the station with a canyon structure (canyon station). During precipitation events, particle number concentrations in the size range 25-250 nm decreased greatly, and the average daily reduction in PM2.5 concentration on rainy days compared to fine days was 44.2% and 22.6% at the open and canyon stations, respectively. The effect of ambient wind speed on particle number concentrations was also examined, and no relationship was found between particle number concentration and wind speed for the entire measurement period. In addition, 33 pairs of average half-hourly PN7-3000 concentrations were calculated and identified at the two stations, at the same time of day and under the same ambient wind speed and precipitation conditions. The results of a paired t-test showed that the average half-hourly PN7-3000 concentrations at the two stations were not significantly different at the 5% significance level (t = 0.06, p = 0.96), which indicates that station design was not a crucial factor influencing PN7-3000 concentrations.
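The station comparison above rests on a paired t-test over 33 matched half-hourly concentration pairs. A minimal sketch of that kind of test is given below using scipy; the concentration values are hypothetical placeholders, not the measured data.

```python
# Paired t-test comparing matched half-hourly PN7-3000 concentrations at two
# stations. The values here are hypothetical placeholders; the thesis used 33
# matched pairs recorded at the same time of day under the same wind and
# precipitation conditions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
open_station = rng.normal(1.2e4, 3.0e3, size=33)              # particles/cm3, hypothetical
canyon_station = open_station + rng.normal(0.0, 3.0e3, 33)    # paired observations

t_stat, p_value = stats.ttest_rel(open_station, canyon_station)
print(f"t = {t_stat:.2f}, p = {p_value:.2f}")
# p > 0.05 would indicate no significant difference between the stations at
# the 5% significance level, as reported in the abstract.
```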
A further assessment of passenger exposure to bus emissions on a platform was conducted at another bus station in Brisbane, Australia. Sampling was conducted over seven weekdays to investigate spatial-temporal variations in size-fractionated particle number and PM2.5 concentrations, as well as human exposure on the platform. Over the whole day, the average PN13-800 concentration was 1.3 x 10^4 and 1.0 x 10^4 particles/cm3 at the centre and end of the platform, respectively, of which PN50-100 accounted for the largest proportion of the total count. Furthermore, the contribution of exposure at the bus station to overall daily exposure was assessed using two assumed scenarios: a school student and an office worker. It was found that, although the daily time fraction (the percentage of time spent at a location over a whole day) at the station was only 0.8%, the daily exposure fractions (the percentage of the daily exposure accounted for by exposure at a location) at the station were 2.7% and 2.8% for exposure to PN13-800, and 2.7% and 3.5% for exposure to PM2.5, for the school student and the office worker, respectively. A new parameter, “exposure intensity” (the ratio of the daily exposure fraction to the daily time fraction), was also defined and calculated at the station, with values of 3.3 and 3.4 for exposure to PN13-800, and 3.3 and 4.2 for exposure to PM2.5, for the school student and the office worker, respectively. In order to quantify the enhanced emissions at critical locations and define the emission distribution for use in dispersion models of traffic-interrupted transport microenvironments, a composite line source emission (CLSE) model was developed to specifically quantify exposure levels and describe the spatial variability of vehicle emissions in traffic-interrupted microenvironments. The model took into account the complexity of vehicle movements in the queue, as well as the different emission rates relevant to various driving conditions (cruise, deceleration, idle and acceleration), and it utilised multiple representative segments to capture the accurate emission distribution of real vehicle flow. The model not only helped to quantify the enhanced emissions at critical locations, but also helped to define the emission source distribution of the disrupted steady flow for further dispersion modelling. The model was then applied to estimate particle number emissions at a bidirectional bus station used by diesel and compressed natural gas fuelled buses. It was found that the acceleration distance was of critical importance when estimating particle number emissions, since the highest emissions occurred in sections where most of the buses were accelerating, and no significant increases were observed at locations where they idled. It was also shown that emissions at the front end of the platform were 43 times greater than at the rear of the platform. The CLSE model was also applied at a signalised pedestrian crossing, in order to assess the increase in particle number emissions from motor vehicles forced to stop and accelerate from rest. The CLSE model was used to calculate the total emissions produced by a specific number and mix of light petrol cars and diesel passenger buses: 1 car travelling in 1 direction, 14 cars / 1 direction, 1 bus / 1 direction, 28 cars / 2 directions, 24 cars and 2 buses / 2 directions, and 20 cars and 4 buses / 2 directions.
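Since "exposure intensity" is defined as a simple ratio, the reported values can be checked directly; a minimal sketch using the fractions quoted above (small differences from the reported 3.3-4.2 values reflect rounding of the quoted percentages):

```python
# Exposure intensity = daily exposure fraction / daily time fraction.
# The fractions below are those quoted in the abstract; results differ slightly
# from the reported 3.3-4.2 values because the quoted percentages are rounded.
daily_time_fraction = 0.8  # % of the day spent at the station

exposure_fractions = {
    "school student, PN13-800": 2.7,  # %
    "office worker, PN13-800": 2.8,
    "school student, PM2.5": 2.7,
    "office worker, PM2.5": 3.5,
}

for scenario, exposure_fraction in exposure_fractions.items():
    intensity = exposure_fraction / daily_time_fraction
    print(f"{scenario}: exposure intensity = {intensity:.2f}")
```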
It was found that the total emissions produced while stopped at a red signal were significantly higher than when the traffic moved at a steady speed. Overall, total emissions due to the interruption of the traffic increased by factors of 13, 11, 45, 11, 41 and 43 for the above six cases, respectively. In summary, this PhD thesis presents the results of a comprehensive study of particle number and mass concentrations, together with particle size distributions, in a bus station transport microenvironment, and of their dependence on bus flow rates, meteorological conditions and station design. Passenger spatial-temporal exposure to bus-emitted particles was also assessed according to waiting time and location along the platform, as was the contribution of exposure at the bus station to overall daily exposure. Due to the complexity of interrupted traffic flow within transport microenvironments, a unique CLSE model was also developed, which is capable of quantifying emission levels at critical locations within the transport microenvironment, for the purpose of evaluating passenger exposure and conducting simulations of vehicle emission dispersion. The application of the CLSE model at a pedestrian crossing also demonstrated its applicability and simplicity for use in a real-world transport microenvironment.
An experimental and computational investigation of performance of Green Gully for reusing stormwater
Abstract:
A new stormwater quality improvement device (SQID) called ‘Green Gully’ has been designed and developed in this study with the aim of reusing stormwater for irrigating plants and trees. The main purpose of the Green Gully is to collect road runoff/stormwater, make it suitable for irrigation and provide an automated network system for watering roadside plants and irrigation areas. This paper presents the design and development of Green Gully, along with experimental and computational investigations of its performance. Performance (expressed as efficiency, i.e. the percentage of water flow passing through the gully grate) was first determined experimentally using a gully model in the laboratory; a three-dimensional numerical model was then developed and run to predict the efficiency of Green Gully as a function of flow rate. The Computational Fluid Dynamics (CFD) code FLUENT was used for the simulation, and GAMBIT was used for geometry creation and mesh generation. Experimental and simulation results are discussed and compared in this paper. The predicted efficiency was compared with the laboratory-measured efficiency, and it was found that the simulated results are in good agreement with the experimental results.
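As a rough illustration of the efficiency measure used above (the percentage of the approaching flow captured through the gully grate), the following sketch compares hypothetical measured and simulated capture rates; none of the numbers come from the paper.

```python
# Gully efficiency = captured flow / approaching flow, expressed as a percentage.
# All flow values below are hypothetical placeholders used only to illustrate
# how measured and simulated efficiencies would be compared.
approach_flow = [2.0, 4.0, 6.0, 8.0]        # L/s, hypothetical test flow rates
measured_capture = [1.9, 3.6, 5.1, 6.4]     # L/s, hypothetical laboratory results
simulated_capture = [1.85, 3.7, 5.0, 6.5]   # L/s, hypothetical CFD predictions

for q, m, s in zip(approach_flow, measured_capture, simulated_capture):
    print(f"Q = {q:.1f} L/s: measured {100 * m / q:.1f}%, simulated {100 * s / q:.1f}%")
```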
Abstract:
In John Frazer's seminal book An Evolutionary Architecture (1995), from which this essay is extracted, a fundamental approach is established for how natural systems can unfold mechanisms for negotiating the complex design space inherent in architectural systems. In this essay, which forms a critical part of the book, Frazer draws both correlations and distinctions between natural processes as emulated in design processes and form as an active manifestation within natural systems. Form is seen as an evolving agent generated via the rules of descriptive genetic coding, functioning as part of a metabolic environment. Frazer's process-model establishes the realm in which computation must manoeuvre to produce a valid solution space, including the operations of self-organisation, complexity and emergent behaviour. Addressing design as an authored practice, he extends the transference of 'creativity' from the explicit impression into form, to the investment of thought, organisation and strategy in the computational processes which produce form. Frazer's text concentrates astutely on the practising of the evolutionary paradigm, the output of which postulates an architecture born of its relationships to dynamic environmental and socio-economic contexts, and realised through morphogenetic materialisation.
Abstract:
This chapter argues that evolutionary economics should be founded upon complex systems theory rather than neo-Darwinian analogies concerning natural selection, which focus on supply side considerations and competition amongst firms and technologies. It suggests that conceptions such as production and consumption functions should be replaced by network representations, in which the preferences or, more correctly, the aspirations of consumers are fundamental and, as such, the primary drivers of economic growth. Technological innovation is viewed as a process that is intermediate between these aspirational networks, and the organizational networks in which goods and services are produced. Consumer knowledge becomes at least as important as producer knowledge in determining how economic value is generated. It becomes clear that the stability afforded by connective systems of rules is essential for economic flexibility to exist, but that too many rules result in inert and structurally unstable states. In contrast, too few rules result in a more stable state, but at a low level of ordered complexity. Economic evolution from this perspective is explored using random and scale free network representations of complex systems.
Abstract:
This position paper provides an overview of work conducted, and an outlook on future directions, within the field of Information Retrieval (IR) that aims to develop novel models, methods and frameworks inspired by Quantum Theory (QT).
Abstract:
Virtual environments can provide, through digital games and online social interfaces, extremely exciting forms of interactive entertainment. Because of their capability of displaying and manipulating information in natural and intuitive ways, such environments have found extensive applications in decision support, education and training in the health and science domains, amongst others. Currently, the burden of validating both the interactive functionality and the visual consistency of virtual environment content is carried out entirely by developers and play-testers. While considerable research has been conducted into assisting the design of virtual world content and mechanics, to date only limited contributions have been made regarding the automatic testing of the underpinning graphics software and hardware. The aim of this thesis is to determine whether the correctness of the images generated by a virtual environment can be quantitatively defined, and automatically measured, in order to facilitate the validation of the content. In an attempt to provide an environment-independent definition of visual consistency, a number of classification approaches were developed. First, a novel model-based object description was proposed in order to enable reasoning about the color and geometry change of virtual entities during a play-session. From such an analysis, two view-based connectionist approaches were developed to map from geometry and color spaces to a single, environment-independent, geometric transformation space; we used such a mapping to predict the correct visualization of the scene. Finally, an appearance-based aliasing detector was developed to show how incorrectness, too, can be quantified for debugging purposes. Since computer games rely heavily on the use of highly complex and interactive virtual worlds, they provide an excellent test bed against which to develop, calibrate and validate our techniques. Experiments were conducted on a game engine and other virtual world prototypes to determine the applicability and effectiveness of our algorithms. The results show that quantifying visual correctness in virtual scenes is a feasible enterprise, and that effective automatic bug detection can be performed through the techniques we have developed. We expect these techniques to find application in large 3D games and virtual world studios that require a scalable solution for testing their virtual world software and digital content.
Abstract:
We report on analysis of discussions in an online community of people with chronic illness using socio-cognitively motivated, automatically produced semantic spaces. The analysis aims to further the emerging theory of "transition" (how people can learn to incorporate the consequences of illness into their lives). An automatically derived representation of sense of self for individuals is created in the semantic space by the analysis of the email utterances of the community members. The movement over time of the sense of self is visualised, via projection, with respect to axes of "ordinariness" and "extra-ordinariness". Qualitative evaluation shows that the visualisation is paralleled by the transitions of people during the course of their illness. The research aims to progress tools for analysis of textual data to promote greater use of tacit knowledge as found in online virtual communities. We hope it also encourages further interest in representation of sense-of-self.
Abstract:
Purpose---The aim of this study is to identify complexity measures for building projects in the People’s Republic of China (PRC). Design/Methodology/Approach---A three-round Delphi questionnaire survey was conducted to identify the key parameters that measure the degree of project complexity. A complexity index (CI) was developed based on the identified measures and their relative importance. Findings---Six key measures of project complexity were identified, namely: (1) building structure & function; (2) construction method; (3) the urgency of the project schedule; (4) project size/scale; (5) geological condition; and (6) neighboring environment. Practical implications---These complexity measures help stakeholders assess the degree of project complexity and better manage the potential risks induced by different levels of project complexity. Originality/Value---The findings provide insightful perspectives for defining and understanding project complexity. For stakeholders, understanding and addressing complexity helps to improve project planning and implementation.
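One way to read the complexity index is as a weighted sum of the six measures, with weights reflecting their relative importance from the Delphi survey. The sketch below is hypothetical: the weights and ratings are illustrative only and are not taken from the paper.

```python
# Hypothetical complexity index (CI): a weighted sum of the six identified
# measures. Weights and project ratings below are illustrative placeholders,
# not the relative importances reported in the Delphi study.
weights = {
    "building structure & function": 0.20,
    "construction method": 0.18,
    "urgency of the project schedule": 0.17,
    "project size/scale": 0.16,
    "geological condition": 0.15,
    "neighboring environment": 0.14,
}  # weights sum to 1.0

ratings = {  # hypothetical project, rated 1 (simple) to 5 (very complex)
    "building structure & function": 4,
    "construction method": 3,
    "urgency of the project schedule": 5,
    "project size/scale": 4,
    "geological condition": 2,
    "neighboring environment": 3,
}

ci = sum(w * ratings[m] for m, w in weights.items())
print(f"Complexity index = {ci:.2f} (maximum possible 5.00)")
```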
Abstract:
Increasingly societies and their governments are facing important social issues that have science and technology as key features. A number of these socio-scientific issues have two features that distinguish them from the restricted contexts in which school science has traditionally been presented. Some of their science is uncertain and scientific knowledge is not the only knowledge involved. As a result, the concepts of uncertainty, risk and complexity become essential aspects of the science underlying these issues. In this chapter we discuss the nature and role of these concepts in the public understanding of science and consider their links with school science. We argue that these same concepts and their role in contemporary scientific knowledge need to be addressed in school science curricula. The new features for content, pedagogy and assessment of this urgent challenge for science educators are outlined. These will be essential if the goal of science education for citizenship is to be achieved with our students, who will increasingly be required to make personal and collective decisions on issues involving science and technology.
Abstract:
Consider the concept combination ‘pet human’. In word association experiments, human subjects produce the associate ‘slave’ in relation to this combination. The striking aspect of this associate is that it is not produced as an associate of ‘pet’, or of ‘human’, in isolation. In other words, the associate ‘slave’ seems to be emergent. Such emergent associations sometimes have a creative character, and cognitive science is largely silent about how we produce them. Departing from a dimensional model of human conceptual space, this article explores concept combinations and argues that emergent associations are a result of abductive reasoning within conceptual space, that is, below the symbolic level of cognition. A tensor-based approach is used to model concept combinations, allowing such combinations to be formalized as interacting quantum systems. Free association norm data are used to motivate the underlying basis of the conceptual space. It is shown by analogy how some concept combinations may behave like quantum-entangled (non-separable) particles. Two methods of analysis are presented for empirically validating the presence of non-separable concept combinations in human cognition. One method is based on quantum theory, and the other on comparing a joint (true theoretic) probability distribution with a distribution based on a separability assumption, using a chi-square goodness-of-fit test. Although these methods were inconclusive in relation to an empirical study of bi-ambiguous concept combinations, avenues for further refinement of these methods are identified.
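The second analysis method described above can be sketched as follows: build the joint distribution expected under a separability (independence) assumption from the marginals, and test the observed joint counts against it with a chi-square statistic. The counts below are hypothetical and scipy is assumed to be available.

```python
# Chi-square test of a separability assumption: compare observed joint counts
# for a concept combination against the counts expected if the two constituent
# concepts behaved independently. All counts are hypothetical placeholders.
import numpy as np
from scipy import stats

# Rows: association outcomes for the first word; columns: for the second word.
observed = np.array([[30.0, 10.0],
                     [5.0, 55.0]])

row_totals = observed.sum(axis=1, keepdims=True)
col_totals = observed.sum(axis=0, keepdims=True)
expected = row_totals * col_totals / observed.sum()   # separable (product) model

chi2 = ((observed - expected) ** 2 / expected).sum()
dof = (observed.shape[0] - 1) * (observed.shape[1] - 1)
p_value = stats.chi2.sf(chi2, dof)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")
# A small p-value is evidence against separability, i.e. the combination does
# not factorise into independent contributions from its constituent concepts.
```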
Abstract:
As computers approach the physical limits of information storable in memory, new methods will be needed to further improve information storage and retrieval. We propose a quantum inspired vector based approach, which offers a contextually dependent mapping from the subsymbolic to the symbolic representations of information. If implemented computationally, this approach would provide exceptionally high density of information storage, without the traditionally required physical increase in storage capacity. The approach is inspired by the structure of human memory and incorporates elements of Gardenfors’ Conceptual Space approach and Humphreys et al.’s matrix model of memory.
Abstract:
Compositionality is a frequently made assumption in linguistics, and yet many human subjects reveal highly non-compositional word associations when confronted with novel concept combinations. This article will show how a non-compositional account of concept combinations can be supplied by modelling them as interacting quantum systems.
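A minimal sketch of what "interacting quantum systems" means here, under the common tensor-product formulation: a compositional combination factorises as an outer product of the constituent concept states, while a non-compositional one does not. The vectors below are illustrative and not drawn from the article.

```python
# Separable vs. non-separable concept combinations in a tensor-product model.
# The two-dimensional "concept" states below are illustrative placeholders.
import numpy as np

pet = np.array([0.8, 0.6])     # hypothetical state for 'pet'
human = np.array([0.6, 0.8])   # hypothetical state for 'human'

# Compositional combination: outer (tensor) product of the constituent states.
separable = np.outer(pet, human)

# Non-compositional combination: a state that cannot be written as an outer
# product of any two single-concept states (an entangled-like state).
non_separable = np.array([[1.0, 0.0],
                          [0.0, 1.0]]) / np.sqrt(2)

# A separable state has matrix rank 1; a non-separable one has rank > 1.
print(np.linalg.matrix_rank(separable))      # -> 1
print(np.linalg.matrix_rank(non_separable))  # -> 2
```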
Abstract:
Cyclic nitroxide radicals represent promising alternatives to the iodine-based redox mediator commonly used in dye-sensitized solar cells (DSSCs). To date, DSSCs with nitroxide-based redox mediators have achieved energy conversion efficiencies of just over 5%, but efficiencies of over 15% might be achievable, given an appropriate mediator. The efficacy of the mediator depends upon two main factors: it must reversibly undergo one-electron oxidation, and it must possess an oxidation potential in the range of 0.600-0.850 V (vs. the standard hydrogen electrode (SHE) in acetonitrile at 25 °C). Herein, we have examined the effect that structural modifications have on the value of the oxidation potential of cyclic nitroxides, as well as on the reversibility of the oxidation process. These included alterations to the N-containing skeleton (pyrrolidine, piperidine, isoindoline, azaphenalene, etc.), as well as the introduction of different substituents (alkyl-, methoxy-, amino-, carboxy-, etc.) to the ring. Standard oxidation potentials were calculated using a high-level ab initio methodology that was demonstrated to be very accurate (with a mean absolute deviation from experimental values of only 16 mV). An optimal value of 1.45 for the electrostatic scaling factor for UAKS radii in acetonitrile solution was obtained. Established trends in the values of oxidation potentials were used to guide the molecular design of stable nitroxides with the desired E°ox, and a number of compounds were suggested for potential use as enhanced redox mediators in DSSCs. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
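The accuracy figure quoted above is a mean absolute deviation (MAD) between calculated and experimental oxidation potentials; a minimal sketch of that statistic, with hypothetical potentials rather than the compounds studied in the paper:

```python
# Mean absolute deviation (MAD) between calculated and experimental standard
# oxidation potentials. The potentials below are hypothetical placeholders.
import numpy as np

e_experimental = np.array([0.722, 0.810, 0.645, 0.790])  # V vs. SHE, hypothetical
e_calculated = np.array([0.735, 0.795, 0.660, 0.805])    # V vs. SHE, hypothetical

mad_mV = np.mean(np.abs(e_calculated - e_experimental)) * 1000.0
print(f"MAD = {mad_mV:.0f} mV")
```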
Abstract:
We utilise the well-developed quantum decision models known to the QI community to create a higher order social decision making model. A simple Agent Based Model (ABM) of a society of agents with changing attitudes towards a social issue is presented, where the private attitudes of individuals in the system are represented using a geometric structure inspired by quantum theory. We track the changing attitudes of the members of that society, and their resulting propensities to act, or not, in a given social context. A number of new issues surrounding this "scaling up" of quantum decision theories are discussed, as well as new directions and opportunities.
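A hypothetical sketch of the kind of representation described above: each agent's private attitude is a unit vector in a small geometric (quantum-inspired) space, and the propensity to act in a given context is the squared projection onto an "act" basis state. All names, dimensions and values are assumptions made for illustration.

```python
# Quantum-inspired attitude model: an agent's private attitude is a unit vector,
# and the propensity to act in a given social context is the squared projection
# (Born-rule style) onto an "act" basis state. Everything here is illustrative.
import numpy as np

rng = np.random.default_rng(1)

def random_attitude() -> np.ndarray:
    """Return a random two-dimensional unit vector representing an attitude."""
    theta = rng.uniform(0.0, np.pi)
    return np.array([np.cos(theta), np.sin(theta)])

act_state = np.array([1.0, 0.0])  # basis state for "acting" in this context

for i, attitude in enumerate(random_attitude() for _ in range(5)):
    propensity = float(np.dot(act_state, attitude) ** 2)
    print(f"agent {i}: propensity to act = {propensity:.2f}")
```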
Abstract:
This work is a theoretical investigation into the coupling of a single excited quantum emitter to the plasmon mode of a V groove waveguide. The V groove waveguide consists of a triangular channel milled in gold and the emitter is modeled as a dipole emitter, and could represent a quantum dot, nitrogen vacancy in diamond, or similar. In this work the dependence of coupling efficiency of emitter to plasmon mode is determined for various geometrical parameters of the emitter-waveguide system. Using the finite element method, the effect on coupling efficiency of the emitter position and orientation, groove angle, groove depth, and tip radius, is studied in detail. We demonstrate that all parameters, with the exception of groove depth, have a significant impact on the attainable coupling efficiency. Understanding the effect of various geometrical parameters on the coupling between emitters and the plasmonic mode of the waveguide is essential for the design and optimization of quantum dot–V groove devices.