483 results for Scintillation counters
Abstract:
This paper describes a novel probabilistic approach to incorporating odometric information into appearance-based SLAM systems, without performing metric map construction or calculating relative feature geometry. The proposed system, dubbed Continuous Appearance-based Trajectory SLAM (CAT-SLAM), represents location as a probability distribution along a trajectory, and represents appearance continuously over the trajectory rather than at discrete locations. The distribution is evaluated using a Rao-Blackwellised particle filter, which weights particles based on local appearance and odometric similarity and explicitly models both the likelihood of revisiting previous locations and visiting new locations. A modified resampling scheme counters particle deprivation and allows loop closure updates to be performed in constant time regardless of map size. We compare the performance of CAT-SLAM to FAB-MAP (an appearance-only SLAM algorithm) in an outdoor environment, demonstrating a threefold increase in the number of correct loop closures detected by CAT-SLAM.
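The abstract does not give the filter equations; purely as an illustration of the kind of particle weighting and resampling it describes (all names here are hypothetical, and the paper's modified resampling scheme is not reproduced), a Python sketch:

```python
import random

class Particle:
    def __init__(self, index, weight):
        self.index = index      # hypothesised position along the stored trajectory
        self.weight = weight

def update_weights(particles, appearance_sim, motion_sim, p_new=0.1):
    """Weight each particle by local appearance and odometric similarity.

    appearance_sim[i] / motion_sim[i]: similarity of the current camera
    frame / odometry to trajectory node i (hypothetical inputs); p_new
    is a prior probability of being somewhere not yet mapped.
    """
    for p in particles:
        p.weight *= (1.0 - p_new) * appearance_sim[p.index] * motion_sim[p.index]
    total = sum(p.weight for p in particles) or 1.0
    for p in particles:
        p.weight /= total

def resample(particles):
    """Low-variance (systematic) resampling over normalized weights."""
    n = len(particles)
    out, u = [], random.uniform(0.0, 1.0 / n)
    c, i = particles[0].weight, 0
    for _ in range(n):
        while u > c and i < n - 1:
            i += 1
            c += particles[i].weight
        out.append(Particle(particles[i].index, 1.0 / n))
        u += 1.0 / n
    return out
```

Resampling replaces low-weight hypotheses with copies of high-weight ones; the low-variance variant shown keeps the particle set diverse, which is the same particle-deprivation problem the paper's modified scheme addresses.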
Abstract:
Arguing that Baz Luhrmann's "Australia" (2008) is a big-budget, non-independent film espousing a left-leaning political ideology in its non-racist representations of Aborigines on film, this paper suggests that the addition of a 'fourth formation' to Moore and Muecke's 1984 model is warranted. According to their theorising, racist "first formation" films promote policies of assimilation, whereas "second formation" films avoid overt political statements in favour of a more acceptable multicultural liberalism. Moore and Muecke's seemingly ultimate "third formation" films, however, blatantly foreground the director's leftist political dogma in a necessarily low-budget, independent production. "Australia", on the other hand, is an advance on the third formation because its feminised Aboriginal voice is safely backed by a colossal production budget and indicates a transformation in public perceptions of Aboriginal issues. Furthermore, this paper argues that the use of low-cost post-production techniques in "Australia", such as voice-over narration by racially appropriate individuals and the use of diegetic song, works to ensure the positive reception of the left-leaning message regarding the Stolen Generations. With these devices Luhrmann effectively counters the claims of right-wing denialists such as Andrew Bolt and Keith Windschuttle.
Abstract:
Computer resource allocation represents a significant challenge, particularly for multiprocessor systems, which consist of shared computing resources to be allocated among co-runner processes and threads. While efficient resource allocation yields a highly efficient and stable overall multiprocessor system and good individual thread performance, poor resource allocation causes significant performance bottlenecks even on systems with ample computing resources. This thesis proposes a cache-aware adaptive closed-loop scheduling framework as an efficient resource allocation strategy for this highly dynamic resource management problem, which requires instant estimation of highly uncertain and unpredictable resource patterns. Many different approaches to this problem have been developed, but neither the dynamic nature nor the time-varying and uncertain characteristics of the resource allocation problem are well considered. These approaches employ either static or dynamic optimization methods or advanced scheduling algorithms such as the Proportional Fair (PFair) scheduling algorithm. Some approaches that do consider the dynamic nature of multiprocessor systems apply only a basic closed-loop system; hence, they fail to take the time-varying and uncertain character of the system into account. Therefore, further research into multiprocessor resource allocation is required. Our closed-loop cache-aware adaptive scheduling framework takes resource availability and resource usage patterns into account by measuring time-varying factors such as cache miss counts, stalls, and instruction counts. More specifically, the cache usage pattern of a thread is identified using the QR recursive least squares (RLS) algorithm and cache-miss-count time-series statistics. For the identified cache resource dynamics, the framework enforces instruction fairness for the threads. Fairness, in the context of this project, is defined as resource allocation equity, which reduces co-runner thread dependence in a shared resource environment. In this way, instruction count degradation due to shared cache resource conflicts is overcome. The framework contributes to the research field in two major and three minor respects. The two major contributions lead to the cache-aware scheduling system. The first major contribution is the development of the execution fairness algorithm, which mitigates the co-runner cache impact on thread performance. The second is the development of the relevant mathematical models, such as the thread execution pattern and cache access pattern models, which formulate the execution fairness algorithm in terms of mathematical quantities. Following the development of the cache-aware scheduling system, an adaptive self-tuning control framework is constructed to add an adaptive closed-loop aspect to it. This control framework consists of two main components: the parameter estimator and the controller design module. The first minor contribution is the development of the parameter estimators; the QR RLS algorithm is applied within the framework to estimate the highly uncertain and time-varying cache resource patterns of threads.
The second minor contribution is the design of the controller module; an algebraic controller design method, pole placement, is used to design a controller able to provide the desired time-varying control action. The adaptive self-tuning control framework and the cache-aware scheduling system together constitute the final framework: the closed-loop cache-aware adaptive scheduling framework. The third minor contribution is the validation of this framework's efficiency in overcoming co-runner cache dependency. Time-series statistical counters are developed for the M-Sim multi-core simulator, and the theoretical findings and mathematical formulations are implemented as MATLAB m-file code. In this way, the overall framework is tested and the experimental outcomes are analyzed. From these outcomes it is concluded that the closed-loop cache-aware adaptive scheduling framework successfully drives a co-runner cache-dependent thread's instruction count to its co-runner-independent instruction count, with an error margin of up to 25% when the cache is highly utilized. In addition, thread cache access patterns are estimated with 75% accuracy.
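The QR-RLS details live in the thesis body rather than the abstract; the plain covariance-form recursion below sketches the online estimation idea (the thesis applies the numerically more robust QR-factorised variant, and the regressor contents are an assumption):

```python
import numpy as np

class RLSEstimator:
    """Recursive least squares with exponential forgetting.

    Refines parameter estimates online as each new sample (e.g. a
    cache-miss count for the latest scheduling quantum) arrives.
    """
    def __init__(self, order, lam=0.98, delta=100.0):
        self.theta = np.zeros(order)     # estimated model parameters
        self.P = delta * np.eye(order)   # inverse correlation matrix
        self.lam = lam                   # forgetting factor (tracks time variation)

    def update(self, phi, y):
        """phi: regressor vector (e.g. recent miss counts); y: new measurement."""
        k = self.P @ phi / (self.lam + phi @ self.P @ phi)   # gain vector
        err = y - phi @ self.theta                            # prediction error
        self.theta = self.theta + k * err
        self.P = (self.P - np.outer(k, phi @ self.P)) / self.lam
        return self.theta
```

Feeding a sliding window of recent per-quantum miss counts as the regressor identifies an autoregressive model of a thread's cache access pattern, which a pole-placement controller can then act on.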
Abstract:
This paper describes a new system, dubbed Continuous Appearance-based Trajectory Simultaneous Localisation and Mapping (CAT-SLAM), which augments sequential appearance-based place recognition with local metric pose filtering to improve the frequency and reliability of appearance-based loop closure. As in other approaches to appearance-based mapping, loop closure is performed without calculating global feature geometry or performing 3D map construction. Loop-closure filtering uses a probabilistic distribution of possible loop closures along the robot's previous trajectory, which is represented as a linked list of previously visited locations connected by odometric information. Sequential appearance-based place recognition and local metric pose filtering are evaluated simultaneously using a Rao–Blackwellised particle filter, which weights particles based on appearance matching over sequential frames and the similarity of robot motion along the trajectory. The particle filter explicitly models both the likelihood of revisiting previous locations and of exploring new locations. A modified resampling scheme counters particle deprivation and allows loop-closure updates to be performed in constant time for a given environment. We compare the performance of CAT-SLAM with FAB-MAP (a state-of-the-art appearance-only SLAM algorithm) using multiple real-world datasets, demonstrating an increase in the number of correct loop closures detected by CAT-SLAM.
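A minimal sketch of the trajectory representation described above, a linked list of visited places joined by odometric links (field names hypothetical):

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class TrajectoryNode:
    """One previously visited place on the robot's trajectory."""
    appearance: object                    # e.g. a bag-of-words image descriptor
    odom_to_next: Tuple[float, float, float] = (0.0, 0.0, 0.0)  # (dx, dy, dtheta)
    next: Optional["TrajectoryNode"] = None

def append_place(tail: TrajectoryNode, appearance, odom) -> TrajectoryNode:
    """Extend the trajectory when the robot visits a new place."""
    node = TrajectoryNode(appearance)
    tail.odom_to_next = odom              # odometry lives on the link
    tail.next = node
    return node
```

Because odometry sits on the links rather than the nodes, a filter hypothesis can be interpolated continuously along an edge instead of being snapped to discrete places.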
Abstract:
Housing demand and supply are persistently out of alignment. The Australian situation counters the experience of many other parts of the world in the aftermath of the Global Financial Crisis, with residential housing prices proving particularly resilient. A seemingly inexorable housing demand remains a critical issue affecting the socio-economic landscape. Underpinned by high levels of population growth fuelled by immigration, and further buoyed by sustained historically low interest rates, increasing income levels, and increased government assistance for first home buyers, this strong demand ensures that problems related to housing affordability continue almost unabated. A significant but less visible factor affecting housing affordability is holding costs. Although only one contributor in the housing affordability matrix, the nature and extent of their impact requires elucidation: for example, the computation and methodology behind the calculation of holding costs varies widely, and in some instances holding costs are ignored completely. In addition, ambiguity exists over which elements comprise holding costs, affecting the assessment of their relative contribution. Such anomalies may be explained by the fact that assessment is conducted over time in an ever-changing environment. A strong relationship with opportunity cost, in turn dependent inter alia upon prevailing inflation and/or interest rates, adds further complexity. Extending research in the general area of housing affordability, this thesis provides a detailed investigation of the elements of holding costs in the context of mid-sized (i.e. 15-200 lots) greenfield residential property developments in South East Queensland. With the dimensions of holding costs and their influence over housing affordability determined, the null hypothesis H0, that holding costs are not passed on, can be addressed. Arriving at these conclusions involves the development of robust economic and econometric models which seek to clarify the component impacts of the holding cost elements. An explanatory sequential research design is adopted, whereby the compilation and analysis of quantitative data and the development of an economic model are informed by the subsequent collection and analysis of primarily qualitative data derived from surveying development-related organisations. Ultimately, there are significant policy implications for the framework used in Australian jurisdictions to promote, retain, or otherwise maximise the opportunities for affordable housing.
Abstract:
The overall aim of this project was to contribute to existing knowledge regarding methods for measuring the characteristics of airborne nanoparticles and controlling occupational exposure to them, and to gather data on nanoparticle emission and transport in various workplaces. The scope of the study involved investigating the characteristics and behaviour of particles arising from the operation of six nanotechnology processes, subdivided into nine processes for measurement purposes. It did not include toxicological evaluation of the aerosols, and therefore no direct conclusions were drawn regarding the health effects of exposure to these particles. The research included real-time measurement of sub- and supermicrometre particle number and mass concentrations, count median diameter, and alveolar deposited surface area using condensation particle counters, an optical particle counter, a DustTrak photometer, a scanning mobility particle sizer, and a nanoparticle surface area monitor, respectively. Off-line particle analysis included scanning and transmission electron microscopy, energy-dispersive X-ray spectrometry, and thermal-optical analysis of elemental carbon. Sources of both fibrous and non-fibrous particles were included.
Abstract:
A frame-rate stereo vision system, based on non-parametric matching metrics, is described. Traditional metrics, such as normalized cross-correlation, are expensive in terms of logic. Non-parametric measures require only simple, parallelizable, functions such as comparators, counters and exclusive-or, and are thus very well suited to implementation in reprogrammable logic.
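The abstract does not name the specific metric; the census transform is a standard non-parametric measure built from exactly these primitives: comparators produce a per-pixel bit string, and the matching cost is a Hamming distance (exclusive-or followed by a counter). A software sketch of that idea, assuming grayscale numpy images:

```python
import numpy as np

def census_transform(img, r=2):
    """Census transform: each pixel becomes a bit string recording which
    neighbours in a (2r+1)x(2r+1) window are darker than the centre,
    so computing it needs only comparators (24 bits for r=2)."""
    out = np.zeros(img.shape, dtype=np.uint64)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            if dy == 0 and dx == 0:
                continue
            neighbour = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
            out = (out << np.uint64(1)) | (neighbour < img).astype(np.uint64)
    return out

def hamming_cost(census_left, census_right, disparity):
    """Matching cost: XOR the codes and count the differing bits, i.e.
    the exclusive-or-plus-counter datapath the abstract refers to."""
    shifted = np.roll(census_right, disparity, axis=1)
    diff = census_left ^ shifted
    return np.unpackbits(diff[..., None].view(np.uint8), axis=-1).sum(axis=-1)
```

In hardware the same cost reduces to a bank of XOR gates feeding a population counter per candidate disparity, which is why such metrics map so cheaply onto reprogrammable logic.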
Abstract:
An accurate evaluation of the airborne particle dose-response relationship requires detailed measurements of the actual particle concentration levels that people are exposed to in every microenvironment in which they reside. The aim of this work was to perform an exposure assessment of children in relation to two different aerosol species: ultrafine particles (UFPs) and black carbon (BC). To this purpose, personal exposure measurements of UFP and BC concentrations were performed on 103 children aged 8-11 years (10.1 ± 1.1 years) using hand-held particle counters and aethalometers. Simultaneously, a time-activity diary and a portable GPS were used to determine the children's daily time-activity patterns and estimate their inhaled doses of UFPs and BC. The median concentrations to which the study population was exposed were comparable to the high levels typically detected in urban traffic microenvironments, in terms of both particle number (2.2×10⁴ part. cm⁻³) and BC (3.8 μg m⁻³). Daily inhaled doses were also relatively high, equal to 3.35×10¹¹ part. day⁻¹ for UFPs and 3.92×10¹ μg day⁻¹ for BC. When normalized by their corresponding time contributions, cooking and transportation were recognized as the main activities contributing to overall daily exposure for UFPs and BC, respectively. UFPs and BC could therefore serve as tracers of children's exposure to particulate pollution from indoor cooking activities and transportation microenvironments, respectively.
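The abstract does not state the dose formula; a common time-activity estimate multiplies the concentration measured in each microenvironment by an activity-specific inhalation rate and the time spent there. A sketch with made-up values (not the paper's data):

```python
# Illustrative time-activity dose estimate: dose = sum_i C_i * IR_i * t_i.
# All concentrations, inhalation rates, and durations below are hypothetical.
activities = [
    # (activity, UFP conc [part/cm^3], inhalation rate [m^3/h], hours/day)
    ("sleeping",        8.0e3, 0.30, 9.0),
    ("school",          1.5e4, 0.55, 6.0),
    ("transport",       4.0e4, 0.55, 1.0),
    ("cooking/eating",  8.0e4, 0.50, 2.0),
]

CM3_PER_M3 = 1e6
daily_dose = sum(c * CM3_PER_M3 * ir * t for _, c, ir, t in activities)
print(f"estimated UFP dose: {daily_dose:.2e} particles/day")
```

Normalizing each term by its time share t_i is what lets short, intense activities such as cooking stand out as dominant contributors.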
Abstract:
According to a study conducted by the International Maritime Organization (IMO), the shipping sector is responsible for 3.3% of global greenhouse gas (GHG) emissions. The 1997 Kyoto Protocol calls upon states to pursue the limitation or reduction of GHG emissions from marine bunker fuels, working through the IMO. In 2011, 14 years after the adoption of the Kyoto Protocol, the Marine Environment Protection Committee (MEPC) of the IMO adopted mandatory energy efficiency measures for international shipping, which can be treated as the first ever mandatory global GHG reduction instrument for an international industry. The MEPC approved an amendment to Annex VI of the 1973 International Convention for the Prevention of Pollution from Ships (MARPOL 73/78) to introduce a mandatory Energy Efficiency Design Index (EEDI) for new ships and a Ship Energy Efficiency Management Plan (SEEMP) for all ships. Considering the growth projections for human population and world trade, these technical and operational measures may not be able to reduce GHG emissions from international shipping to a satisfactory level. The IMO is therefore considering the introduction of market-based mechanisms that may serve two purposes: providing a fiscal incentive for the maritime industry to invest in more energy-efficient ways, and offsetting growing ship emissions. Some leading developing countries have already voiced serious reservations about the newly adopted IMO regulations, stating that by imposing the same obligation on all countries, irrespective of their economic status, the amendment rejects the principle of common but differentiated responsibility (the CBDR principle), which has always been a cornerstone of international climate change law discourses. They also claim that negotiation of a market-based mechanism should not continue without a clear commitment from the developed countries to promote technical co-operation and the transfer of technology relating to the improvement of the energy efficiency of ships. Against this backdrop, this article explores the challenges facing developing countries in implementing the already adopted technical and operational measures.
Abstract:
The development of global navigation satellite systems (GNSS) nowadays provides solutions to many applied problems with increasingly high quality and accuracy. Research carried out by the Bavarian Academy of Sciences and Humanities in Munich (BAW) in the field of airborne gravimetry is based on sophisticated processing of data from high-frequency GNSS receivers for kinematic aircraft positioning. The applied algorithms for inertial acceleration determination rely on the high sampling rate (50 Hz) and on the reduction of factors such as ionospheric scintillation and multipath due to aircraft/antenna near-field effects. The quality of the GNSS-derived kinematic heights is also studied by intercomparison with lift height variations collected by a precise high-sampling-rate vertical scale [1]. This work is aimed at more accurate determination of mini-aircraft altitude by means of high-frequency GNSS receivers, in particular by considering their dynamic behaviour.
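As a much-simplified illustration of one ingredient of such processing, vertical acceleration can be recovered by double differentiation of the 50 Hz height series; the actual algorithms, including scintillation and near-field multipath mitigation and low-pass filtering, are not reproduced here:

```python
import numpy as np

def vertical_acceleration(heights, fs=50.0):
    """Second central difference of a kinematic GNSS height series.

    heights: height samples [m] at sampling rate fs [Hz]. Returns the
    vertical acceleration [m/s^2]; endpoints are simply padded.
    """
    dt = 1.0 / fs
    h = np.asarray(heights, dtype=float)
    acc = np.empty_like(h)
    acc[1:-1] = (h[2:] - 2.0 * h[1:-1] + h[:-2]) / dt**2
    acc[0], acc[-1] = acc[1], acc[-2]
    return acc
```

Double differentiation amplifies high-frequency noise quadratically, which is why the high sampling rate and careful error reduction are essential in the actual processing.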
Abstract:
This paper describes the results of experiments conducted in the vicinity of EHV overhead lines to investigate sources of clouds of charged particles, using simultaneously recording arrays of electric field meters to measure the direct electric fields produced under ion clouds. E-field measurements, made at one metre above ground level, are correlated with wind speed and direction, and with measurements from ionisation counters and audible corona effects, to identify possible positions of corona sources on adjacent power lines. Measurements made in dry conditions on EHV lines in flat, remote locations with no adjacent buildings or large vegetation indicate the presence of discrete ion sources associated with high-stress points on some types of line hardware, such as connectors and conductor spacers. Faulty line components such as insulators and line fittings are also found to be a possible source of ion clouds.
Abstract:
Corona discharge is responsible for the flux of small ions from overhead power lines, and is capable of modifying the ambient electrical environment, such as air ion concentrations at ground level. Once produced, small ions quickly attach to aerosol particles in the air, producing 'large ions' of approximately 1 nm to 1 µm in diameter. However, very few studies have measured air ion concentrations directly near high voltage transmission lines. In the present study, positive and negative small ion concentrations (<1.6 nm), net large ion concentration (2 nm-5 μm) and particle number concentration (10 nm-2 μm) were measured simultaneously using air ion counters and an aerosol electrometer at four power line sites. Measurements at sites 1 and 2 were conducted on both the upwind and downwind sides. The results showed that total ion concentrations on the downwind side were 3-5 times higher than on the upwind side, while particle number concentrations did not show a significant difference, indicating that a large number of ions were emitted from the power lines at sites 1 and 2. Furthermore, ions of both polarities were observed across the power line sites. Positive ions dominated at site 1, with a concentration of 4.4×10³ ions cm⁻³, ten times higher than on the upwind side. Contrary to site 1, sites 2 to 4 showed negative ion emissions, with concentrations of -1.2×10³, -460 and -410 ions cm⁻³, respectively; these magnitudes exceed the background urban negative ion concentration of 400 cm⁻³. At sites 1 and 2, the net ion concentration and net particle charge concentration on the downwind side of the lines showed the same polarity. The correlation between net ion concentration and net particle charge concentration 20 m downwind of the power lines at site 2 was also investigated: the two parameters showed a correlation coefficient of 0.72, indicating that a substantial number of ions attach to particles and affect particle charge within a short distance of the source.
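The reported 0.72 is a standard Pearson correlation coefficient; for reference, a minimal computation on made-up paired samples:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two paired series."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return (xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc))

# Hypothetical paired samples: net ion concentration [ions/cm^3] and
# net particle charge concentration [e/cm^3] 20 m downwind of a line.
ions   = [1200, 950, 1400, 800, 1100, 1600]
charge = [ 420, 310,  500, 260,  400,  560]
print(round(pearson_r(ions, charge), 2))
```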
Abstract:
Previous studies have shown that a significant fraction of the particles present in indoor air is generated by cooking activities, and measured particle concentrations and exposures have been used to estimate the related human dose. Dose evaluation can be affected by the particle charge level, which is usually not considered in particle deposition models. To this purpose, this paper reports, for the first time, the electric charge of particles generated during cooking activities, thereby extending particle charge characterization, so far essentially focused on outdoor environments, to indoor microenvironments. Particle number and positive and negative cluster ion concentrations were monitored during different cooking events using a condensation particle counter and two air ion counters, respectively. The positively-charged particle distribution fractions during gas combustion, bacon grilling, and eggplant grilling events were measured by two Scanning Mobility Particle Sizer spectrometers, used with and without a neutralizer. Finally, a Tandem Differential Mobility Analyzer was used to measure the charge-specific particle distributions in the bacon and eggplant grilling experiments, selecting particles of 30, 50, 80 and 100 nm in mobility diameter. The total fraction of positively-charged particles was 4.0%, 7.9%, and 5.6% for the gas combustion, bacon grilling, and eggplant grilling events, respectively, lower than for typical outdoor combustion-generated particles.
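The with/without-neutralizer comparison implies the charged fraction is a ratio of the two measured distributions; a simplified sketch (the real analysis must also invert for the bipolar equilibrium charge distribution imposed by the neutralizer, which is omitted here):

```python
import numpy as np

def positively_charged_fraction(n_no_neutralizer, n_with_neutralizer):
    """Total positively-charged particle fraction (illustrative only).

    n_no_neutralizer: per-size-bin concentrations when the SMPS samples
    the aerosol directly, so only natively positively-charged particles
    are classified; n_with_neutralizer: the full inverted distribution
    measured with the neutralizer in line.
    """
    n1 = np.asarray(n_no_neutralizer, float)
    n2 = np.asarray(n_with_neutralizer, float)
    return n1.sum() / n2.sum()
```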
Abstract:
Throughout their lives, people are exposed to the pollutants present in indoor air. Recently, electronic nicotine delivery systems, commonly known as electronic cigarettes, have been widely commercialized: they deliver particles into the lungs of users, but a "second-hand smoke" has yet to be associated with this indoor source. At the same time, the naturally occurring radioactive gas radon represents a significant risk factor for lung cancer, and the cumulative action of these two agents could be worse than either agent acting separately. To investigate the interaction between radon progeny and second-hand aerosol from different types of cigarettes, an experimental study was designed and carried out by generating aerosol from e-cigarette vaping, as well as from second-hand traditional smoke, inside a walk-in radon chamber at the National Institute of Ionizing Radiation Metrology (INMRI) of Italy. In this chamber, the radon in the air emanates naturally from the floor and the ambient conditions are controlled. To characterize the sidestream smoke emitted by the cigarettes, condensation particle counters and a scanning mobility particle sizer were used. Radon concentration in the air was measured with an AlphaGUARD ionization chamber, whereas radon decay products in the air were measured with the Tracelab BWLM Plus-2S radon daughter monitor. An increase of the potential alpha-energy concentration (PAEC), due to radon decay products attaching to the aerosol, was found at higher particle number concentrations: from 7.47 ± 0.34 MeV L⁻¹ to 12.6 ± 0.26 MeV L⁻¹ (69%) for the e-cigarette and, at the same radon concentration, from 14.1 ± 0.43 MeV L⁻¹ to 18.6 ± 0.19 MeV L⁻¹ (31%) for the traditional cigarette. The equilibrium factor also increases, from 23.4% ± 1.1% to 29.5% ± 0.26% for the e-cigarette and from 30.9% ± 1.0% to 38.1% ± 0.88% for the traditional cigarette. These increases persist for a long time after combustion, prolonging the exposure risk.
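For context, these standard definitions (not spelled out in the abstract) connect the two reported quantities: the PAEC fixes an equilibrium-equivalent radon concentration, and the equilibrium factor compares it with the actual radon concentration:

```latex
% F: equilibrium factor; c_eq: equilibrium-equivalent radon
% concentration, proportional to the measured PAEC; c_Rn: actual
% radon concentration in the chamber.
\[
  F = \frac{c_{\mathrm{eq}}}{c_{\mathrm{Rn}}},
  \qquad
  c_{\mathrm{eq}} \propto \mathrm{PAEC}
\]
```

So at fixed radon concentration, the measured rise in PAEC translates directly into the reported rise in the equilibrium factor, and hence into a larger dose per unit radon exposure.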