246 results for zero-point quantum fluctuations
at Queensland University of Technology - ePrints Archive
Abstract:
This paper describes the socio-economic and environmental impacts of battery-driven auto-rickshaws in Rajshahi city, Bangladesh. Unemployment is one of the major problems in Bangladesh, with roughly 7 lakh (700,000) people unemployed; auto-rickshaws reduce this unemployment by about 2%. In this study, auto-rickshaw drivers were surveyed at different points in Rajshahi city, and the responses were analysed to assess their socio-economic conditions. The average number of passengers per auto-rickshaw was determined at various places in Rajshahi city (Talaimari mor, Hadir mor, Alupotti, Shaheb bazar zero point, Shodor Hospital mor, Fire brigade mor, CNB mor, Lakshipur mor, Bondo gate, Bornali, Panir tank, Rail gate, Rail Station, Bhodrar mor, Adorsha School mor). Air pollution is a great threat to human health, and one of its major causes is emission from vehicles powered by burning fossil fuels in internal combustion (IC) engines. Data on emissions from various power plants were collected from the Internet, and the emissions (CO2, NOx and PM) from each power plant were calculated in terms of kg/km, together with the energy required by an auto-rickshaw per km. A histogram of emissions from different vehicles in kg/km was then drawn. Analysis of the data and charts showed that battery-driven auto-rickshaws increase income, social status and comfort, and reduce unemployment.
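The abstract's per-kilometre emission calculation can be sketched as follows. All numbers below (energy drawn per km, plant emission factors) are illustrative assumptions, not values from the study: the idea is simply that a plant's kg/kWh factors, multiplied by the vehicle's kWh/km demand, give emissions in kg/km.

```python
# Sketch of converting power-plant emission factors (kg/kWh) into
# per-kilometre emissions (kg/km) for a battery-driven auto-rickshaw.
# All numeric values are illustrative assumptions.

energy_per_km_kwh = 0.05  # assumed battery energy drawn per km

# Assumed power-plant emission factors in kg per kWh generated
plant_factors = {
    "coal": {"CO2": 0.90, "NOx": 0.0030, "PM": 0.0008},
    "gas":  {"CO2": 0.45, "NOx": 0.0010, "PM": 0.0001},
}

def emissions_per_km(plant):
    """Convert a plant's kg/kWh factors into kg/km for the vehicle."""
    return {gas: factor * energy_per_km_kwh
            for gas, factor in plant_factors[plant].items()}

for plant in plant_factors:
    print(plant, emissions_per_km(plant))
```

Comparing these figures against tailpipe kg/km values for IC-engine vehicles is what the abstract's histogram does.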
Abstract:
This chapter briefly introduces the concepts and modeling of gas/isotope separation by two-dimensional carbon frameworks, i.e. porous graphene and carbon nanomeshes, based on a review of the recent literature. The small size of the evenly distributed pores on these carbon frameworks makes them ideal not only for the separation of small gas molecules but also for isotope separation, by exploiting the different zero-point energies induced by confinement within the pores. The related simulations were treated by transition state theory, an affordable yet precise method that can be adopted in combination with different levels of theory. Such a method could be employed to evaluate the performance, as well as to aid the design, of other 2D carbon frameworks toward the goal of gas/isotope separation in the future.
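The zero-point-energy mechanism can be illustrated with a toy transition-state-theory estimate. The barrier heights, temperature and prefactor below are made-up assumptions, not values from the chapter: confinement in a narrow pore raises the zero-point energy of the lighter isotope more, giving it a slightly higher effective barrier and hence a slower crossing rate.

```python
import math

# Toy TST-style estimate of isotope selectivity for a porous membrane.
# Barriers and temperature are illustrative assumptions only.

K_B = 8.617333e-5  # Boltzmann constant in eV/K

def rate(barrier_ev, temperature_k, prefactor=1.0):
    """Arrhenius/Eyring-style crossing rate: A * exp(-Ea / kT)."""
    return prefactor * math.exp(-barrier_ev / (K_B * temperature_k))

# Assumed effective barriers (electronic barrier + ZPE correction), eV
barrier_h2 = 0.230   # lighter isotope: larger ZPE shift inside the pore
barrier_d2 = 0.215   # heavier isotope: smaller ZPE shift

T = 77.0  # low temperature amplifies the quantum (ZPE) difference
selectivity = rate(barrier_d2, T) / rate(barrier_h2, T)
print(f"D2/H2 selectivity at {T} K: {selectivity:.1f}")
```

Because the barrier difference enters an exponential, even a 15 meV ZPE difference yields roughly an order-of-magnitude selectivity at 77 K, which is why the low-temperature quantum-sieving regime is attractive.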
Abstract:
The increasing interest in nanoscience and nanotechnology has prompted intense investigation into appropriate fabrication techniques. Self-organized, bottom-up growth of nanomaterials using plasma nanofabrication techniques [1-10] has proven to be one of the most promising approaches for the construction of precisely tailored nanostructure arrays (e.g., quantum dots [11-13], nanotubes [14-17], nanowires [18-20]). Thus the primary aim of this chapter is to show how plasmas may be used to achieve a high level of control during the self-organized growth of a range of nanomaterials, from zero-dimensional quantum dots (Section 15.2) to one- and two-dimensional nanomaterials (Section 15.3) to nanostructured films (Section 15.4)...
Abstract:
Aims: We combine measurements of weak gravitational lensing from the CFHTLS-Wide survey, Type Ia supernovae from CFHT SNLS and CMB anisotropies from WMAP5 to obtain joint constraints on cosmological parameters, in particular the dark-energy equation-of-state parameter w. We assess the influence of systematics in the data on the results and look for possible correlations with cosmological parameters. Methods: We implemented an MCMC algorithm to sample the parameter space of a flat CDM model with a dark-energy component of constant w. Systematics in the data are parametrised and included in the analysis. We determine the influence of photometric calibration of SNIa data on cosmological results by calculating the response of the distance modulus to photometric zero-point variations. The weak lensing data set is tested for anomalous field-to-field variations and a systematic shape measurement bias for high-redshift galaxies. Results: Ignoring photometric uncertainties for SNLS biases cosmological parameters by at most 20% of the statistical errors, using supernovae alone; the parameter uncertainties are underestimated by 10%. The weak-lensing field-to-field variance between 1 deg² MegaCam pointings is 5-15% higher than predicted from N-body simulations. We find no bias in the lensing signal at high redshift, within the framework of a simple model, and marginalising over cosmological parameters. Assuming a systematic underestimation of the lensing signal, the normalisation increases by up to 8%. Combining all three probes we obtain -0.10 < 1 + w < 0.06 at 68% confidence (-0.18 < 1 + w < 0.12 at 95%), including systematic errors. Our results are therefore consistent with the cosmological constant Λ. Systematics in the data increase the error bars by up to 35%; the best-fit values change by less than 0.15.
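The response of the distance modulus to zero-point variations has a simple form that a toy calculation can illustrate (the magnitudes and offset below are assumed, not the paper's values): a photometric zero-point offset shifts every apparent magnitude, hence every distance modulus mu = m - M, by the same amount, which becomes a coherent multiplicative bias on all inferred luminosity distances.

```python
# Toy illustration of zero-point error propagation into the SNIa
# distance modulus. All numeric values are assumed for illustration.

def distance_modulus(apparent_mag, absolute_mag):
    return apparent_mag - absolute_mag

def luminosity_distance_pc(mu):
    # mu = 5 * log10(d_L / 10 pc)  =>  d_L = 10 pc * 10**(mu / 5)
    return 10.0 * 10 ** (mu / 5.0)

m, M = 24.0, -19.3        # assumed apparent and absolute SNIa magnitudes
delta_zp = 0.02           # assumed photometric zero-point offset (mag)

mu_true = distance_modulus(m, M)
mu_biased = distance_modulus(m + delta_zp, M)

# The zero-point offset shifts mu one-to-one, biasing every distance
# by the same multiplicative factor 10**(delta_zp / 5):
bias = luminosity_distance_pc(mu_biased) / luminosity_distance_pc(mu_true)
print(round(mu_biased - mu_true, 6), round(bias, 4))
```

Because the bias is coherent across all supernovae rather than averaging away, even a few-hundredths-of-a-magnitude calibration error can shift the inferred w, which is why the abstract treats it as a systematic rather than a statistical error.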
Abstract:
In vector-space-based approaches to natural language processing, similarity is commonly measured by taking the angle between two vectors representing words or documents in a semantic space. This is natural from a mathematical point of view, as the angle between unit vectors is, up to constant scaling, the only unitarily invariant metric on the unit sphere. However, similarity judgement tasks reveal that human subjects fail to produce data which satisfy the symmetry and triangle-inequality requirements for a metric space. A possible conclusion, reached in particular by Tversky et al., is that some of the most basic assumptions of geometric models are unwarranted in the case of psychological similarity, a result which would impose strong limits on the validity and applicability of vector-space-based (and hence also quantum-inspired) approaches to the modelling of cognitive processes. This paper proposes a resolution to this fundamental criticism of the applicability of vector space models of cognition. We argue that a pair of words implies a context, which in turn induces a point of view, allowing a subject to estimate semantic similarity. Context is here introduced as a point-of-view vector (POVV), and the expected similarity is derived as a measure over the POVVs. Different pairs of words invoke different contexts and different POVVs; hence the triangle inequality ceases to be a valid constraint on the angles. We test the proposal on a few triples of words and outline further research.
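The core geometric point can be shown numerically. In this sketch the "words", dimensions and context vectors are all invented for illustration (the paper's POVV measure is richer): each pair of words is compared after reweighting dimensions by that pair's own context, so the three pairwise angles come from three different geometries and need not obey the triangle inequality.

```python
import math

# Toy sketch: context-dependent angles violating the triangle inequality.
# All vectors and contexts below are invented for illustration.

def angle(u, v):
    """Angle in radians between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return math.acos(max(-1.0, min(1.0, dot / (nu * nv))))

def contextual_angle(u, v, povv):
    """Angle after reweighting each dimension by a point-of-view vector."""
    uw = [a * w for a, w in zip(u, povv)]
    vw = [b * w for b, w in zip(v, povv)]
    return angle(uw, vw)

# Three "words" in a 3-dimensional semantic space
a, b, c = [1, 1, 0], [0, 1, 1], [1, 0, 1]

# Each pair evokes its own context (POVV) emphasising shared dimensions
th_ab = contextual_angle(a, b, [0.1, 1.0, 0.1])
th_bc = contextual_angle(b, c, [0.1, 0.1, 1.0])
th_ac = contextual_angle(a, c, [1.0, 1.0, 1.0])

# Under pairwise contexts the triangle inequality fails:
print(th_ab + th_bc < th_ac)   # True
```

Within any single fixed context the angles remain a metric; the apparent violation arises only because each judgement is made from a different point of view, which is exactly the paper's resolution.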
Abstract:
The flood flow in urbanised areas constitutes a major hazard to the population and infrastructure, as seen during the summer 2010-2011 floods in Queensland (Australia). Flood flows in urban environments have been studied only relatively recently, and no previous study considered the impact of turbulence in the flow. During the 12-13 January 2011 flood of the Brisbane River, turbulence measurements were conducted in an inundated urban environment in Gardens Point Road, next to Brisbane's central business district (CBD), at relatively high frequency (50 Hz). The properties of the sediment flood deposits were characterised, and the acoustic Doppler velocimeter unit was calibrated to obtain both instantaneous velocity components and suspended sediment concentration in the same sampling volume with the same temporal resolution. While the flow motion in Gardens Point Road was subcritical, the water elevations and velocities fluctuated with a distinctive period between 50 and 80 s. The low-frequency fluctuations were linked with local topographic effects: a local choke induced by an upstream constriction between stairwells caused slow oscillations with a period close to the natural sloshing period of the adjacent car park. The instantaneous velocity data were analysed using a triple decomposition, and the same triple decomposition was applied to the water depth, velocity flux, suspended sediment concentration and suspended sediment flux data. The velocity fluctuation data showed a large energy component in the slow fluctuation range. For the first two tests, at z = 0.35 m, the turbulence data suggested some isotropy; at z = 0.083 m, on the other hand, the findings indicated some flow anisotropy. The suspended sediment concentration (SSC) data presented a general trend of increasing SSC with decreasing water depth. During one test (T4), long-period oscillations were observed with a period of about 18 minutes. The cause of these oscillations remains unknown to the authors. The last test (T5) took place in very shallow water with high suspended sediment concentrations, and it is suggested that the flow in the car park was disconnected from the main channel. Overall, the flow conditions at the sampling sites corresponded to a specific momentum between 0.2 and 0.4 m², near the upper end of the scale for safe evacuation of individuals in flooded areas. The authors do not, however, believe that the evacuation of individuals in Gardens Point Road would have been safe, because of the intense water surges and flow turbulence. More generally, any criterion for safe evacuation based solely upon the flow velocity, water depth or specific momentum cannot account for the hazards caused by flow turbulence, water depth fluctuations and water surges.
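The triple decomposition mentioned above can be sketched on a synthetic record. The window length, frequencies and amplitudes below are invented for illustration, not the field data: an instantaneous signal V(t) is split into a time average, a slow low-frequency fluctuation (here extracted with a moving average), and a fast turbulent residual.

```python
import math

# Sketch of a triple decomposition V(t) = mean + V_slow(t) + v'(t)
# on a synthetic signal; all parameters are illustrative assumptions.

def moving_average(x, window):
    """Centred moving average; the window shrinks near the edges."""
    half = window // 2
    out = []
    for i in range(len(x)):
        lo, hi = max(0, i - half), min(len(x), i + half + 1)
        out.append(sum(x[lo:hi]) / (hi - lo))
    return out

def triple_decompose(signal, window):
    mean = sum(signal) / len(signal)
    smooth = moving_average(signal, window)
    slow = [s - mean for s in smooth]                # slow fluctuation
    fast = [x - s for x, s in zip(signal, smooth)]   # turbulent residual
    return mean, slow, fast

# Synthetic 50 Hz record: mean flow + 60 s oscillation + 5 Hz "turbulence"
fs = 50.0
t = [i / fs for i in range(int(fs) * 120)]           # two minutes
signal = [0.8
          + 0.2 * math.sin(2 * math.pi * ti / 60.0)  # slow, 60 s period
          + 0.05 * math.sin(2 * math.pi * 5.0 * ti)  # fast, 5 Hz
          for ti in t]

# A 10 s window passes the 60 s oscillation into `slow`
# and leaves the 5 Hz component in `fast`
mean, slow, fast = triple_decompose(signal, window=int(10 * fs))
print(round(mean, 3))
```

The choice of averaging window sets the cut-off between "slow" and "turbulent" fluctuations; with periods of 50-80 s riding on much faster turbulence, a window of a few seconds to tens of seconds separates the two cleanly.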
Abstract:
Flood flows in an inundated urban environment constitute a natural hazard. During the 12-13 January 2011 flood of the Brisbane River, detailed water elevation, velocity and suspended sediment data were recorded in an inundated street at the peak of the flood. The field observations highlighted a number of unusual flow interactions with the urban surroundings, including slow fluctuations in water elevation and velocity with distinctive periods between 50 and 100 s, caused by a local topographic effect (choking), superposed on fast turbulent fluctuations. The suspended sediment data revealed significant suspended sediment loads in the inundated zone.
Abstract:
A simple, effective, and innovative approach based on ion-assisted self-organization is proposed to synthesize size-selected Si quantum dots (QDs) on SiC substrates at low substrate temperatures. Using hybrid numerical simulations, the formation of Si QDs through a self-organization approach is investigated by considering two distinct cases of Si QD formation using the ionization energy approximation theory, which considers ionized influxes containing Si3+ and Si1+ ions in the presence of a microscopic nonuniform electric field induced by a variable surface bias. The results show that the highest percentage of surface coverage by 1 and 2 nm size-selected QDs was achieved using a bias of -20 V and ions in the lowest charge state, namely Si1+ ions, in a low substrate temperature range (227-327 °C). Low substrate temperatures (≤500 °C) are desirable from a technological point of view because (i) low-temperature deposition techniques are compatible with current thin-film Si-based solar cell fabrication and (ii) high processing temperatures can damage other components in electronic devices and destroy the tandem structure of Si QD-based third-generation solar cells; our results are therefore highly relevant to the development of third-generation all-Si tandem photovoltaic solar cells.
Abstract:
Traditional information retrieval (IR) systems respond to user queries with ranked lists of relevant documents. The separation of content and structure in XML documents allows individual XML elements to be selected in isolation. Thus, users expect XML-IR systems to return highly relevant results that are more precise than entire documents. In this paper we describe the implementation of a search engine for XML document collections. The system is keyword based and is built upon an XML inverted file system. We describe the approach that was adopted to meet the requirements of Content Only (CO) and Vague Content and Structure (VCAS) queries in INEX 2004.
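The element-level retrieval idea behind an XML inverted file can be sketched in a few lines. The element paths and documents below are invented for illustration, not from the INEX collection: each term maps to the XML elements containing it, so a keyword query can return individual elements rather than whole documents.

```python
from collections import defaultdict

# Minimal sketch of a keyword-based XML inverted file: postings point
# at element paths, not documents. Paths and texts are invented.

docs = {
    "doc1/article/title":    "xml information retrieval",
    "doc1/article/sec[1]":   "ranked lists of relevant elements",
    "doc2/article/abstract": "keyword search over xml collections",
}

index = defaultdict(set)
for element, text in docs.items():
    for term in text.split():
        index[term].add(element)

def search(query):
    """Return elements containing every query term (conjunctive)."""
    terms = query.split()
    results = index[terms[0]].copy() if terms else set()
    for term in terms[1:]:
        results &= index[term]
    return sorted(results)

print(search("xml"))   # elements from both documents
```

A real XML-IR engine would additionally rank the candidate elements (and, for VCAS queries, vaguely match structural constraints against the element paths), but the postings-to-elements mapping is the structural core.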
Abstract:
Cultural policy settings attempting to foster the growth and development of the Australian feature film industry in the era of globalisation are coming under increasing pressure. Global forces and emerging production and distribution models are challenging the “narrowness” of cultural policy, which mandates a particular film culture, circumscribes certain notions of value and limits the variety of films produced through cultural-policy-driven subvention models. Australian horror film production is an important case study. Horror films are a production strategy well suited to the financial limitations of the Australian film industry, with competitive advantages for producers against international competitors. However, emerging within a “national” cinema driven by public subsidy and social/cultural objectives, horror films, internationally oriented and with a low-culture status, have been severely marginalised within public funding environments. This paper introduces Australian horror film production, examines the limitations of cultural policy, and considers the implications of these questions for the Producer Offset.
Abstract:
Quantum key distribution (QKD) promises secure key agreement by using quantum mechanical systems. We argue that QKD will be an important part of future cryptographic infrastructures. It can provide long-term confidentiality for encrypted information without reliance on computational assumptions. Although QKD still requires authentication to prevent man-in-the-middle attacks, it can make use of either information-theoretically secure symmetric key authentication or computationally secure public key authentication: even when using public key authentication, we argue that QKD still offers stronger security than classical key agreement.
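The key-agreement step that QKD provides can be illustrated with a toy, purely classical simulation of BB84-style sifting (this is for intuition only and is not the paper's analysis; a real run involves quantum states, eavesdropper detection via error-rate estimation, and the authentication discussed above): Alice encodes random bits in random bases, Bob measures in random bases, and the two keep only the positions where their bases matched.

```python
import random

# Toy classical simulation of BB84 sifting; no eavesdropper, no noise.

random.seed(1)
n = 64

alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("+x") for _ in range(n)]  # + rectilinear, x diagonal
bob_bases   = [random.choice("+x") for _ in range(n)]

def measure(bit, send_basis, meas_basis):
    """Matching bases reproduce the bit; mismatched bases give a coin flip."""
    return bit if send_basis == meas_basis else random.randint(0, 1)

bob_bits = [measure(b, sb, mb)
            for b, sb, mb in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: publicly compare bases, keep positions where they agree
sifted_alice = [b for b, sa, sb in zip(alice_bits, alice_bases, bob_bases) if sa == sb]
sifted_bob   = [b for b, sa, sb in zip(bob_bits,  alice_bases, bob_bases) if sa == sb]

assert sifted_alice == sifted_bob   # keys agree under these ideal conditions
print(len(sifted_alice), "sifted key bits from", n, "signals")
```

The public basis comparison is exactly where the man-in-the-middle threat enters, which is why the abstract stresses that the classical channel must be authenticated.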
Abstract:
One of the perceived Achilles heels of online citizen journalism is its inability to conduct investigative and first-hand reporting. A number of projects have recently addressed this problem, with varying success: the U.S.-based Assignment Zero was described as "a highly satisfying failure" (Howe 2007), while the German MyHeimat.de appears to have been thoroughly successful in attracting a strong community of contributors, even to the point of being able to generate print versions of its content, distributed free of charge to households in selected German cities. In Australia, citizen journalism played a prominent part in covering the federal elections held on 24 November 2007; news bloggers and public opinion websites provided a strong counterpoint to the mainstream media coverage of the election campaign (Bruns et al., 2007). Youdecide2007.org, a collaboration between researchers at Queensland University of Technology and media practitioners at the public service broadcaster SBS, the public opinion site On Line Opinion, and technology company Cisco Systems, was developed as a dedicated space for specifically hyperlocal coverage of the election campaign in each of Australia's 150 electorates, from the urban sprawls of Sydney and Brisbane to the sparsely populated remote regions of outback Australia. YD07 provided training materials for would-be citizen journalists and encouraged them to contribute electorate profiles, interview candidates, and conduct vox-pops with citizens in their local area. The site developed a strong following, especially in its home state of Queensland, and its interviewers influenced national public debate by uncovering the sometimes controversial personal views of mainstream and fringe candidates. At the same time, the success of YD07 was limited by external constraints determined by campaign timing and institutional frameworks.
As part of a continuing action research cycle, lessons learnt from Youdecide2007.org will be translated into further iterations of the project, which will cover the local government elections in the Australian state of Queensland, to be held in March 2008, and developments subsequent to these elections. This paper will present research outcomes from the Youdecide2007.org project. In particular, it will examine the roles of staff contributors and citizen journalists in attracting members, providing information, promoting discussion, and fostering community on the site: early analysis of interaction data on the site indicates notably different contribution patterns and effects for staff and citizen participants, which may point towards the possibility of developing more explicit pro-am collaboration models in line with the Pro-Am phenomenon outlined by Leadbeater & Miller (2004). The paper will outline strengths and weaknesses of the Youdecide model and highlight requirements for the successful development of active citizen journalism communities. In doing so, it will also evaluate the feasibility of hyperlocal citizen journalism approaches, and their interrelationship with broader regional, state, and national journalism in both its citizen and industrial forms.
Abstract:
The structures of the anhydrous 1:1 proton-transfer compounds of 4,5-dichlorophthalic acid (DCPA) with the monocyclic heteroaromatic Lewis bases 2-aminopyrimidine, 3-(aminocarbonyl)pyridine (nicotinamide) and 4-(aminocarbonyl)pyridine (isonicotinamide), namely 2-aminopyrimidinium 2-carboxy-4,5-dichlorobenzoate C4H6N3+ C8H3Cl2O4- (I), 3-(aminocarbonyl)pyridinium 2-carboxy-4,5-dichlorobenzoate C6H7N2O+ C8H3Cl2O4- (II) and the unusual salt adduct 4-(aminocarbonyl)pyridinium 2-carboxy-4,5-dichlorobenzoate 2-carboxymethyl-4,5-dichlorobenzoic acid (1/1/1) C6H7N2O+ C8H3Cl2O4-·C9H6Cl2O4 (III), have been determined at 130 K. Compound (I) forms discrete centrosymmetric hydrogen-bonded cyclic bis(cation–anion) units having both R2/2(8) and R2/1(4) N-H...O interactions. In compound (II) the primary N-H...O linked cation–anion units are extended into a two-dimensional sheet structure via amide-carboxyl and amide-carbonyl N-H...O interactions. The structure of (III) reveals the presence of an unusual and unexpected self-synthesized methyl monoester of the acid as an adduct molecule, giving one-dimensional hydrogen-bonded chains. In all three structures the hydrogen phthalate anions are
Abstract:
Monitoring unused or dark IP addresses offers opportunities to extract useful information about both on-going and new attack patterns. In recent years, different techniques have been used to analyze such traffic, including sequential analysis, where a change in traffic behavior, for example a change in the mean, is used as an indication of malicious activity. Change points themselves say little about the detected change; further data processing is necessary to extract useful information and to identify the exact cause of the detected change, which is difficult given the size and nature of the observed traffic. In this paper, we address the problem of analyzing a large volume of such traffic by correlating change points identified in different traffic parameters. The significance of the proposed technique is two-fold. Firstly, it automatically extracts information related to change points by correlating change points detected across multiple traffic parameters. Secondly, it validates a detected change point through the simultaneous presence of another change point in a different parameter. Using a real network trace collected from unused IP addresses, we demonstrate that the proposed technique enables us not only to validate the change point but also to extract useful information about the causes of the change points.
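The correlation idea can be sketched as follows. The detector, threshold, window and traces below are all invented for illustration (the paper's sequential-analysis machinery is more sophisticated): change points are detected independently in two traffic parameters with a simple CUSUM-style test on the mean, and a change point is treated as validated when the other parameter changes at nearly the same time.

```python
# Sketch of cross-parameter change-point correlation; detector,
# threshold, window and synthetic traces are illustrative assumptions.

def detect_change_points(series, threshold):
    """One-sided CUSUM on deviations from the running mean."""
    points, cusum, mean, n = [], 0.0, 0.0, 0
    for i, x in enumerate(series):
        n += 1
        mean += (x - mean) / n              # running mean so far
        cusum = max(0.0, cusum + x - mean)  # accumulate positive drift
        if cusum > threshold:
            points.append(i)
            cusum, mean, n = 0.0, 0.0, 0    # restart after a detection
    return points

def correlated(points_a, points_b, window):
    """Pairs of change points from the two parameters within `window` samples."""
    return [(a, b) for a in points_a for b in points_b if abs(a - b) <= window]

# Synthetic traces: packet rate and distinct-source count both jump near t=50
packets = [10.0] * 50 + [30.0] * 50
sources = [5.0] * 52 + [20.0] * 48

cp_packets = detect_change_points(packets, threshold=40.0)
cp_sources = detect_change_points(sources, threshold=40.0)
print(correlated(cp_packets, cp_sources, window=10))
```

A simultaneous rise in packet rate and in the number of distinct sources suggests a different cause (e.g. scanning) than a packet-rate rise alone, which is the kind of cause-related information the correlation step is meant to surface.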