842 results for 160699 Political Science not elsewhere classified
A modified inverse integer Cholesky decorrelation method and its performance on ambiguity resolution
Abstract:
One of the research focuses in the integer least squares problem is the decorrelation technique, which reduces the number of integer parameter search candidates and improves the efficiency of the integer parameter search method. Decorrelation remains a challenging issue in carrier-phase ambiguity determination and plays a critical role in the future of high-precision GNSS positioning. Currently, three main decorrelation techniques are employed: integer Gaussian decorrelation, the Lenstra–Lenstra–Lovász (LLL) algorithm and the inverse integer Cholesky decorrelation (IICD) method. Although the performance of these three state-of-the-art methods has been demonstrated, there is still potential for further improvement. To measure the performance of decorrelation techniques, the condition number is usually used as the criterion. Additionally, the number of grid points in the search space can be used directly as a performance measure, since it denotes the size of the search space. However, a smaller initial volume of the search ellipsoid does not always correspond to a smaller number of candidates. This research proposes a modified inverse integer Cholesky decorrelation (MIICD) method which improves decorrelation performance over the other three techniques. The decorrelation performance of these methods was evaluated using the condition number of the decorrelation matrix, the number of search candidates and the initial volume of the search space. Additionally, the success rate of the decorrelated ambiguities was calculated for all methods to investigate ambiguity validation performance. The methods were tested and compared using both simulated and real data. The simulation scenarios employ an isotropic probabilistic model with a predetermined eigenvalue and without any geometry or weighting-system constraints. The MIICD method outperformed the other three methods, improving conditioning over the LAMBDA method by 78.33% and 81.67% without and with the eigenvalue constraint, respectively. The real-data scenarios cover both single-constellation and dual-constellation cases. Experimental results demonstrate that, compared with LAMBDA, the MIICD method reduces the condition number by 78.65% and 97.78% in the single-constellation and dual-constellation cases, respectively. It also reduces the number of search candidates by 98.92% and 100% in the single-constellation and dual-constellation cases.
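To illustrate the condition-number criterion discussed above, the sketch below applies a single integer Gaussian decorrelation sweep (one of the three baseline techniques mentioned, not the proposed MIICD method) to a hypothetical 2×2 ambiguity covariance matrix and reports the condition number before and after; all values are made up.

```python
# Minimal sketch of scoring a decorrelation step by its condition number.
# This is plain integer Gaussian decorrelation on a made-up 2x2 covariance
# matrix, NOT the paper's MIICD implementation.
import numpy as np

Q = np.array([[6.290, 5.978],
              [5.978, 6.292]])  # hypothetical, highly correlated covariance

def gauss_decorrelate(Q):
    """One integer Gaussian decorrelation sweep on a 2x2 covariance matrix."""
    Z = np.eye(2)
    for i, j in [(0, 1), (1, 0)]:
        mu = round(Q[i, j] / Q[j, j])  # nearest-integer correlation estimate
        T = np.eye(2)
        T[i, j] = -mu                  # unimodular (integer, |det| = 1) transform
        Q = T @ Q @ T.T
        Z = T @ Z
    return Q, Z

Qz, Z = gauss_decorrelate(Q)
print("condition number before:", np.linalg.cond(Q))   # ~39: elongated ellipsoid
print("condition number after :", np.linalg.cond(Qz))  # ~10: easier search
```

Because the transform Z is unimodular, the integer nature of the ambiguities is preserved while the search ellipsoid becomes much rounder.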
Abstract:
Early this year the Australian Department of Environment and Heritage commissioned a desktop literature review with a focus on ultrafine particles, including analysis of the health impacts of the particles as well as the impact of the sulphur content of diesel fuel on ultrafine particle emissions. This paper summarizes the findings of the report on the link between the sulphur content of diesel fuels and the number of ultrafine particles in diesel emissions. The literature search on this topic returned over 150 publications. The majority of these publications, although investigating different aspects of the influence of fuel sulphur level on diesel vehicle emissions, were not directly concerned with ultrafine particle emissions. A specific focus of the paper is on:
- a summary of the state of knowledge established by the review, and
- a summary of recommendations on the research priorities for Australia to address the information gaps for this issue, and on the appropriate management responses.
Abstract:
As part of a larger indoor environmental study, residential indoor and outdoor levels of nitrogen dioxide (NO2) were measured for 14 houses in a suburb of Brisbane, Queensland, Australia. Passive samplers were used for 48-h sampling periods during the winter of 1999. The average indoor and outdoor NO2 levels were 13.8 ± 6.3 and 16.7 ± 4.2 ppb, respectively. The indoor/outdoor NO2 concentration ratio ranged from 0.4 to 2.3, with a median value of 0.82. Statistical analyses indicated that there was no significant correlation between indoor and outdoor NO2 concentrations, or between indoor and fixed-site NO2 monitoring station concentrations. However, there was a significant correlation between outdoor and fixed-site NO2 monitoring station concentrations. There was also a significant correlation between indoor NO2 concentrations and indoor submicrometre (0.007–0.808 μm) aerosol particle number concentrations. The results of this study indicate that indoor NO2 levels are significantly affected by indoor NO2 sources, such as gas stoves and cigarette smoking, implying that the outdoor or fixed-site monitoring concentration alone is a poor predictor of indoor NO2 concentration.
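As a worked illustration of the two statistics reported above (the indoor/outdoor ratio and the pairwise correlations), here is a minimal sketch with hypothetical readings; the study's actual data and tests are not reproduced.

```python
# Sketch of the I/O ratio and Pearson correlation computations described
# above. All readings are hypothetical stand-ins, not the study's data.
import numpy as np
from scipy.stats import pearsonr

indoor  = np.array([10.2, 25.1, 8.7, 14.3, 30.5, 12.0, 9.8])   # ppb, hypothetical
outdoor = np.array([15.0, 16.2, 14.1, 17.3, 15.8, 16.9, 14.5])  # ppb, hypothetical

io_ratio = indoor / outdoor
print("median I/O ratio:", np.median(io_ratio))

r, p = pearsonr(indoor, outdoor)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")  # conventional cutoff: p < 0.05
```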
Abstract:
The INEX 2010 Focused Relevance Feedback track offered a refined approach to the evaluation of focused relevance feedback algorithms through simulated exhaustive user feedback. As in traditional approaches, we simulated a user-in-the-loop by re-using the assessments of ad-hoc retrieval obtained from real users who assessed focused ad-hoc retrieval submissions. The evaluation was extended in several ways: the use of exhaustive relevance feedback over entire runs; the evaluation of focused retrieval where both the retrieval results and the feedback are focused; evaluation over a closed set of documents and complete focused assessments; evaluation over executable implementations of relevance feedback algorithms; and, finally, a fully reusable evaluation platform. We present the evaluation methodology, its implementation, and experimental results obtained for nine submissions from three participating organisations.
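For readers unfamiliar with the kind of algorithm being evaluated, the sketch below shows a classic Rocchio-style relevance feedback update; it is a generic textbook illustration, not one of the nine submitted systems, and all vectors and weights are hypothetical.

```python
# Generic Rocchio relevance-feedback update, shown only to illustrate the
# class of algorithm such a track evaluates; NOT one of the submissions.
import numpy as np

def rocchio(query, relevant, nonrelevant, alpha=1.0, beta=0.75, gamma=0.15):
    """Move the query vector toward judged-relevant documents and away
    from judged-nonrelevant ones."""
    q = alpha * query
    if len(relevant):
        q += beta * np.mean(relevant, axis=0)
    if len(nonrelevant):
        q -= gamma * np.mean(nonrelevant, axis=0)
    return np.clip(q, 0, None)  # keep term weights non-negative

query  = np.array([1.0, 0.0, 0.5])                    # toy term-weight vector
rel    = np.array([[0.9, 0.1, 0.8], [0.8, 0.0, 0.9]])  # judged relevant
nonrel = np.array([[0.1, 0.9, 0.0]])                   # judged nonrelevant
print(rocchio(query, rel, nonrel))
```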
Abstract:
Bridges are valuable assets of every nation. They deteriorate with age and are often subjected to additional loads or load patterns different from those they were originally designed for. These changes in loading can cause localized distress and may result in bridge failure if not corrected in time. Early detection of damage and appropriate retrofitting help prevent bridge failures, and large amounts of money are spent on bridge maintenance around the world. A need therefore exists for a reliable technology capable of monitoring the structural health of bridges, ensuring that they operate safely and efficiently throughout their intended lives. Bridge monitoring has traditionally been done by visual inspection, but visual inspection alone cannot locate and identify all signs of damage; hence a variety of structural health monitoring (SHM) techniques are now regularly used to monitor performance and assess the condition of bridges for early damage detection. Acoustic emission (AE) is one technique finding increasing use in SHM applications for bridges around the world. The chapter starts with a brief introduction to structural health monitoring and the techniques commonly used for monitoring purposes. The acoustic emission technique, the wave nature of the AE phenomenon, previous applications, and the limitations and challenges of its use as an SHM technique are also discussed. The scope of the project and the work carried out are then explained, followed by recommendations for future work.
Abstract:
Throughout this workshop session we have looked at various configurations of Sage as well as using the Sage UI to run Sage applications (e.g. the image viewer). More advanced usage of Sage has been demonstrated using a Sage-compatible version of Paraview, highlighting the potential of parallel rendering. The aim of this tutorial session is to give a practical introduction to developing visual content for a tiled display using the Sage libraries. After completing this tutorial you should have the basic tools required to develop your own custom Sage applications. This tutorial is designed for software developers; intermediate programming knowledge is assumed, along with some introductory OpenGL. You will be required to write small portions of C/C++ code to complete this worksheet. However, if you do not feel comfortable writing code (or have never written in C or C++), we will be on hand throughout this session, so feel free to ask for help. We have a number of machines in this lab running a VNC client to a virtual machine running Fedora 12. You should all be able to log in with the username “escience” and password “escience10”. Some of the commands in this worksheet require you to run them as the root user, so note the password as you may need to use it a few times. If you need to access the Internet, use the username “qpsf01”, password “escience10”.
Abstract:
Scalable high-resolution tiled display walls are becoming increasingly important to decision makers and researchers because high pixel counts in combination with large screen areas facilitate content-rich, simultaneous display of computer-generated visualization information and high-definition video data from multiple sources. This tutorial is designed to cater for new users as well as researchers who are currently operating tiled display walls or 'OptiPortals'. We will discuss the current and future applications of display wall technology and explore opportunities for participants to collaborate and contribute in a growing community. Multiple tutorial streams will cover hands-on practical development, as well as policy and method design for embedding these technologies in the research process. Attendees will gain an understanding of how to get started with developing similar systems themselves, in addition to becoming familiar with typical applications and large-scale visualisation techniques. Presentations in this tutorial will describe current implementations of tiled display walls that highlight the effective use of screen real estate with various visualization datasets, including collaborative applications such as visualcasting, classroom learning and video conferencing. A feature presentation for this tutorial will be given by Jurgen Schulze from Calit2 at the University of California, San Diego. Jurgen is an expert in scientific visualization in virtual environments, human-computer interaction, real-time volume rendering, and graphics algorithms on programmable graphics hardware.
Abstract:
In 2006, the Faculty of Built Environment and Engineering introduced the first faculty-wide unit dedicated to sustainability at any Australian university. BEB200 Introducing Sustainability has semester enrolments of up to 1500 students. Instruments such as lectures, readings, field visits, group projects and structured tutorial activities are used, and these have evolved over the last five years in response to student and staff feedback and attempts to better engage students. More than seventy staff have taught in the unit, which is in its final offering in this form in 2010. This paper reflects on the experiences of five academics who have played key roles in the development and teaching of this unit over the last five years. They argue that sustainability is not an end in itself but a paradigm that allows students to explore other ways of knowing as they engage with issues in a complex world. From the students' perspective, grappling with such issues enables them to move towards a context in which they can understand their own discipline and its role in a contradictory and rapidly changing professional world. Insights are offered into how sustainability units may be developed in the future.
Abstract:
This technical report is concerned with one aspect of environmental monitoring: the detection and analysis of acoustic events in sound recordings of the environment. Sound recordings offer ecologists the advantage of cheaper and increased sampling, but make available so much data that automated analysis becomes essential. The report describes a number of tools for automated analysis of recordings, including noise removal from spectrograms, acoustic event detection, event pattern recognition, spectral peak tracking, syntactic pattern recognition applied to call syllables, and oscillation detection. These algorithms are applied to a number of animal call recognition tasks, chosen because they illustrate quite different modes of analysis: (1) the detection of diffuse events caused by wind and rain, which are frequent contaminants of recordings of the terrestrial environment; (2) the detection of bird calls; and (3) the preparation of acoustic maps for whole-ecosystem analysis. This last task utilises the temporal distribution of events over a daily, monthly or yearly cycle.
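As a generic illustration of one step named above (noise removal from spectrograms), the sketch below subtracts a per-frequency-bin noise profile estimated from the median energy of each bin; this is a common approach, not necessarily the report's algorithm, and the synthetic signal stands in for a field recording.

```python
# Generic per-bin spectrogram noise-reduction sketch (a common approach;
# not necessarily the report's algorithm). The synthetic "call" below is a
# 3 kHz tone burst buried in broadband noise.
import numpy as np
from scipy.signal import spectrogram

fs = 22050
t = np.arange(0, 2.0, 1 / fs)
call = np.sin(2 * np.pi * 3000 * t) * (t > 0.8) * (t < 1.2)
audio = call + 0.5 * np.random.randn(t.size)

f, times, S = spectrogram(audio, fs=fs, nperseg=512)
S_db = 10 * np.log10(S + 1e-12)

# Estimate the background noise profile as the median energy of each
# frequency bin over time, then subtract it and clip at zero.
noise_profile = np.median(S_db, axis=1, keepdims=True)
S_clean = np.clip(S_db - noise_profile, 0, None)

# Crude event detection: flag frames whose residual energy is high.
frame_energy = S_clean.sum(axis=0)
events = times[frame_energy > frame_energy.mean() + 2 * frame_energy.std()]
print("frames flagged as acoustic events (s):", events)
```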
Abstract:
The purpose of this work is to validate and automate the use of DYNJAWS, a new component module (CM) in the BEAMnrc Monte Carlo (MC) user code. The DYNJAWS CM simulates dynamic wedges and can be used in three modes: dynamic, step-and-shoot and static. The step-and-shoot and dynamic modes require an additional input file defining the positions of the jaw that constitutes the dynamic wedge at regular intervals during its motion. A method for automating the generation of the input file is presented, which will allow for more efficient use of the DYNJAWS CM. Wedged profiles have been measured and simulated for 6 and 10 MV photons at three field sizes (5 cm × 5 cm, 10 cm × 10 cm and 20 cm × 20 cm), four wedge angles (15, 30, 45 and 60 degrees), at dmax and at 10 cm depth. Results of this study show agreement between the measured and MC profiles to within 3% of absolute dose or 3 mm distance to agreement for all wedge angles at both energies and depths. The gamma analysis suggests that the dynamic mode is more accurate than the step-and-shoot mode. The DYNJAWS CM is an important addition to the BEAMnrc code and will enable MC verification of patient treatments involving dynamic wedges.
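The 3%/3 mm comparison quoted above is the standard gamma-index criterion; the 1D sketch below, with made-up dose profiles, shows how such an agreement test is computed (clinical gamma analysis is normally done with dedicated tools).

```python
# Minimal 1D gamma-index sketch illustrating the 3% / 3 mm criterion used
# to compare measured and Monte Carlo profiles. Dose values are made up.
import numpy as np

def gamma_1d(x, d_ref, d_eval, dose_tol=0.03, dist_tol=3.0):
    """Return the gamma value at each reference point (gamma <= 1 passes)."""
    gammas = []
    for xi, di in zip(x, d_ref):
        dist = (x - xi) / dist_tol                       # distance, in units of 3 mm
        dose = (d_eval - di) / (dose_tol * d_ref.max())  # dose, in units of 3% of max
        gammas.append(np.sqrt(dist**2 + dose**2).min())
    return np.array(gammas)

x = np.linspace(-50, 50, 101)              # positions in mm
d_meas = np.exp(-(x / 30.0) ** 2)          # hypothetical measured profile
d_mc = np.exp(-((x - 1.0) / 30.0) ** 2)    # hypothetical MC profile, 1 mm shift
g = gamma_1d(x, d_meas, d_mc)
print(f"pass rate: {100 * (g <= 1).mean():.1f}%")
```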
Abstract:
We all live in a yellow submarine… When I go to work in the morning, in the office building that hosts our BPM research group, on the way up to our level I come by this big breakout room that hosts a number of computer scientists, working away at the next generation software algorithms and iPad applications (I assume). I have never actually been in that room, but every now and then the door is left ajar for a while and I can spot couches, lots (I mean, lots!) of monitors, the odd scientist, a number of Lara Croft posters, and the usual room equipment you’d probably expect from computer scientists (and, no, it’s not like that evil Dennis guy from the Jurassic Park movie, buried in chips, coke, and flickering code screens… It’s also not like the command room from the Nebuchadnezzar, Neo’s hovercraft in the Matrix movies, although I still strongly believe these green lines of code make a good screensaver).
Abstract:
We present the findings of a study into the implementation of explicitly criterion-referenced assessment in undergraduate courses in mathematics. We discuss students' concepts of criterion referencing and also the various interpretations that this concept has among mathematics educators. Our primary goal was to move towards a classification of criterion-referencing models in quantitative courses. A secondary goal was to investigate whether explicitly presenting assessment criteria to students was useful to them and guided them in responding to assessment tasks. The data and feedback from students indicate that while students found the criteria easy to understand and useful in informing them as to how they would be graded, the criteria did not alter the way they actually approached the assessment activity.
Abstract:
In order to achieve meaningful reductions in individual ecological footprints, individuals must dramatically alter their day-to-day behaviours. Effective interventions will need to be evidence-based, and there is a need for the rapid transfer, or communication, of information from the point of research into policy and practice. A number of health disciplines, including psychology and public health, share a common mission to promote health and well-being, and it is becoming clear that the most practical pathway to achieving this mission is through interdisciplinary collaboration. This paper argues that an interdisciplinary collaborative approach will facilitate research that results in the rapid transfer of findings into policy and practice. The application of this approach is described in relation to the Green Living project, which explored the psycho-social predictors of environmentally friendly behaviour. Following a qualitative pilot study, and in consultation with an expert panel comprising academics, industry professionals and government representatives, a self-administered mail survey was distributed to a random sample of 3000 residents of Brisbane and Moreton Bay (Queensland, Australia). The Green Living survey explored specific beliefs, including attitudes, norms, perceived control, intention and behaviour, as well as a number of other constructs such as environmental concern and altruism. This research has two beneficial outcomes. First, it will inform a practical model for predicting sustainable-living behaviours, and a number of local councils have already expressed an interest in using the results as part of their ongoing community engagement programs. Second, it provides an example of how a collaborative interdisciplinary project can provide a more comprehensive approach to research than can be accomplished by a single-discipline project.
Abstract:
This paper links real-time schedulability directly to the Quality of Control (QoC), the ultimate goal of a control system, and proposes a hierarchical feedback QoC management framework, with the Fixed-Priority (FP) and Earliest-Deadline-First (EDF) policies as plug-ins, for real-time control systems with multiple control tasks. The framework uses a task decomposition model for continuous QoC evaluation even under overload, and then employs heuristic rules to adjust the period of each control task for QoC improvement. If the total requested workload exceeds the desired value, global adaptation of control periods is triggered to maintain the workload. A sufficient stability condition is derived for a class of control systems with delay and with period switching under the heuristic rules. Examples are given to demonstrate the proposed approach.
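As background for the workload-maintenance step described above, the sketch below applies the classic EDF utilization bound (sum of C_i/T_i ≤ 1) and uniformly stretches task periods when the requested workload exceeds a target; the task set is hypothetical and the scaling rule is a naive stand-in, not the paper's heuristic.

```python
# Background sketch: the classic EDF utilization test plus a naive global
# period scaling when workload exceeds a target. The task set is
# hypothetical and the scaling rule is NOT the paper's heuristic.
tasks = [  # (worst-case execution time C_i, period T_i), in ms
    (2.0, 10.0),
    (6.0, 20.0),
    (15.0, 30.0),
]

def utilization(tasks):
    return sum(c / t for c, t in tasks)

target = 0.9  # desired total workload
u = utilization(tasks)
print(f"requested utilization: {u:.2f}")  # 1.00 here: EDF bound exceeded

if u > target:
    # Stretch every control period by the same factor so U drops to target.
    scale = u / target
    tasks = [(c, t * scale) for c, t in tasks]
    print(f"scaled periods by {scale:.2f}; new U = {utilization(tasks):.2f}")
```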
Abstract:
The number of chondrogenic cells available locally is an important factor in the repair process for cartilage defects. Previous studies demonstrated that the number of transplanted rabbit perichondrial cells (PC) remaining in a cartilage defect in vivo, after being carried into the site in a polylactic acid (PLA) scaffold, declined markedly within two days. This study examined the ability of in vitro culture of PC/PLA constructs to enhance the subsequent biomechanical stability of the cells and the matrix content in an in vitro screening assay. PC/PLA constructs were analyzed after 1 h, 1 week and 2 weeks of culture. The biomechanical adherence of PC to the PLA scaffold was tested by subjecting the PC/PLA constructs to a range of flow velocities (0.25-25 mm/s), spanning the range estimated to occur under conditions of construct insertion in vivo. The adhesion of PC to the PLA carrier was increased significantly by 1 and 2 weeks of incubation, with 25 mm/s flow causing 57% detachment of cells after 1 h of seeding, but only 7% and 16% after 1 and 2 weeks of culture, respectively (p < 0.001). This adherence was associated with marked deposition of glycosaminoglycan and collagen. These findings suggest that pre-incubation of PC-laden PLA scaffolds markedly enhances the stability of the indwelling cells. © 2003 Orthopaedic Research Society. Published by Elsevier Science Ltd. All rights reserved.