919 results for Run
Abstract:
Reliable robotic perception and planning are critical to performing autonomous actions in uncertain, unstructured environments. In field robotic systems, automation is achieved by interpreting exteroceptive sensor information to infer something about the world. This is then mapped to provide a consistent spatial context, so that actions can be planned around the predicted future interaction of the robot and the world. The whole system is as reliable as the weakest link in this chain. In this paper, the term mapping is used broadly to describe the transformation of range-based exteroceptive sensor data (such as LIDAR or stereo vision) to a fixed navigation frame, so that it can be used to form an internal representation of the environment. The coordinate transformation from the sensor frame to the navigation frame is analyzed to produce a spatial error model that captures the dominant geometric and temporal sources of mapping error. This allows the mapping accuracy to be calculated at run time. A generic extrinsic calibration method for exteroceptive range-based sensors is then presented to determine the sensor location and orientation. This allows systematic errors in individual sensors to be minimized, and when multiple sensors are used, it minimizes the systematic contradiction between them to enable reliable multisensor data fusion. The mathematical derivations at the core of this model are not particularly novel or complicated, but the rigorous analysis and application to field robotics seems to be largely absent from the literature to date. The techniques in this paper are simple to implement, and they offer a significant improvement to the accuracy, precision, and integrity of mapped information. Consequently, they should be employed whenever maps are formed from range-based exteroceptive sensor data. © 2009 Wiley Periodicals, Inc.
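The mapping operation described above is, at its core, a chain of rigid-body transforms: a range return is first taken from the sensor frame into the vehicle body frame using the extrinsic calibration, and then into the navigation frame using the vehicle pose at the measurement time. The sketch below only illustrates that chain (it is not the paper's spatial error model); the frame names, angles and offsets are assumed for the example.

```python
import numpy as np

def rot_rpy(roll, pitch, yaw):
    """Rotation matrix from roll-pitch-yaw Euler angles (Z-Y-X convention)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def sensor_to_nav(p_sensor, R_bs, t_bs, R_nb, t_nb):
    """Map a point from the sensor frame to the navigation frame.

    R_bs, t_bs : extrinsic calibration (sensor frame -> vehicle body frame)
    R_nb, t_nb : vehicle pose at the measurement time (body -> navigation frame)
    """
    p_body = R_bs @ p_sensor + t_bs            # sensor -> body
    return R_nb @ p_body + t_nb                # body -> navigation

# Example with illustrative values: a LIDAR return 10 m ahead of the sensor.
p_sensor = np.array([10.0, 0.0, 0.0])
R_bs = rot_rpy(0.0, np.deg2rad(2.0), 0.0)      # small calibration pitch offset
t_bs = np.array([1.2, 0.0, 1.5])               # sensor mounted forward of and above the body origin
R_nb = rot_rpy(0.0, 0.0, np.deg2rad(30.0))     # vehicle heading 30 degrees
t_nb = np.array([100.0, 50.0, 0.0])            # vehicle position in the navigation frame
print(sensor_to_nav(p_sensor, R_bs, t_bs, R_nb, t_nb))
```

Errors in the extrinsic calibration (R_bs, t_bs), in the vehicle pose (R_nb, t_nb) and in the timestamp used to look up that pose all propagate through this chain, which is what a run-time spatial error model has to account for.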
Abstract:
Energy auditing is an effective but costly approach for reducing the long-term energy consumption of buildings. When well executed, energy loss can be quickly identified in the building structure and its subsystems. This then presents opportunities for improving energy efficiency. We present a low-cost, portable technology called "HeatWave" which allows non-experts to generate detailed 3D surface temperature models for energy auditing. This handheld 3D thermography system consists of two commercially available imaging sensors and a set of software algorithms that can be run on a laptop. The 3D model can be visualized in real time by the operator so that they can monitor their degree of coverage as the sensors are used to capture data. In addition, results can be analyzed offline using the proposed "Spectra" multispectral visualization toolbox. The presence of surface temperature data in the generated 3D model enables the operator to easily identify and measure thermal irregularities such as thermal bridges, insulation leaks, moisture build-up and HVAC faults. Moreover, 3D models generated from subsequent audits of the same environment can be automatically compared to detect changes in conditions and energy use over time.
Abstract:
In 2008, Weeks et al. published the results of a postal survey that explored the views of the Society of Hospital Pharmacists of Australia’s (SHPA) members on collaborative prescribing and the extent of de facto prescribing in their institution. Since then, significant work has been undertaken on non-medical prescribing, such as pilots of pharmacist prescribing across Australia and a National Health Workforce report on developing a nationally consistent approach to prescribing by non-medical health professionals. The first stage of the Health Workforce Australia Health Practitioner Prescribing Pathway project is complete, and the recommendations for implementation were approved by the Standing Council in November 2013. New Zealand pharmacists obtained prescribing rights in 2013, and the first cohort of 14 prescribers has completed the postgraduate pharmacist prescribing course (jointly run by the Universities of Otago and Auckland).
Abstract:
“Supermax” prisons, conceived by the United States in the early 1980s, are typically reserved for convicted political criminals such as terrorists and spies and for other inmates who are considered to pose a serious ongoing threat to the wider community, to the security of correctional institutions, or to the safety of other inmates. Prisoners are usually restricted to their cells for up to twenty-three hours a day and typically have minimal contact with other inmates and correctional staff. Not only does the Federal Bureau of Prisons operate one of these facilities, but almost every state has either a supermax wing or a stand-alone supermax prison. The Globalization of Supermax Prisons examines why nine advanced industrialized countries have adopted the supermax prototype, paying particular attention to the economic, social, and political processes that have affected each state. Featuring essays that look at the U.S.-run prisons of Abu Ghraib and Guantánamo, this collection seeks to determine whether the American model is the basis for the establishment of these facilities and considers such issues as the support for or opposition to the building of a supermax and why opposition efforts failed; allegations of human rights abuses within these prisons; and the extent to which the decision to build a supermax was influenced by developments in the United States. Additionally, contributors address such domestic matters as the role of crime rates, media sensationalism, and terrorism in each country’s decision to build a supermax prison.
Abstract:
Many model-based investigation techniques, such as sensitivity analysis, optimization, and statistical inference, require a large number of model evaluations to be performed at different input and/or parameter values. This limits the application of these techniques to models that can be implemented in computationally efficient computer codes. Emulators, by providing efficient interpolation between outputs of deterministic simulation models, can considerably extend the field of applicability of such computationally demanding techniques. So far, the dominant approach to developing emulators has been to use priors in the form of Gaussian stochastic processes (GASP) conditioned on a design data set of inputs and corresponding model outputs. In the context of dynamic models, this approach has two essential disadvantages: (i) these emulators do not consider our knowledge of the structure of the model, and (ii) they run into numerical difficulties if there are a large number of closely spaced input points, as is often the case in the time dimension of dynamic models. To address both of these problems, a new concept of developing emulators for dynamic models is proposed. This concept is based on a prior that combines a simplified linear state space model of the temporal evolution of the dynamic model with Gaussian stochastic processes for the innovation terms as functions of model parameters and/or inputs. These innovation terms are intended to correct the error of the linear model at each output step. Conditioning this prior on the design data set is done by Kalman smoothing. This leads to an efficient emulator that, because it incorporates knowledge of the dominant mechanisms built into the simulation model, can be expected to outperform purely statistical emulators, at least in cases in which the design data set is small. The feasibility and potential difficulties of the proposed approach are demonstrated by the application to a simple hydrological model.
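To make the conditioning step concrete, the sketch below conditions a scalar linear-Gaussian state-space prior on a set of design outputs with a Kalman filter followed by a Rauch-Tung-Striebel smoother. It is only a minimal illustration of the Kalman-smoothing machinery mentioned above; the Gaussian-process dependence of the innovation terms on model parameters and inputs is omitted, and all matrices, noise variances and data are assumed for the example.

```python
import numpy as np

def kalman_rts(y, a, q, r, m0, p0):
    """Condition the scalar linear-Gaussian state-space prior
        x[t] = a * x[t-1] + w[t],  w ~ N(0, q)
        y[t] = x[t] + v[t],        v ~ N(0, r)
    on observed design outputs y via a Kalman filter followed by a
    Rauch-Tung-Striebel smoother. Returns smoothed means and variances.
    """
    n = len(y)
    mf, pf = np.empty(n), np.empty(n)        # filtered means / variances
    mp, pp = np.empty(n), np.empty(n)        # one-step predicted means / variances
    m, p = m0, p0
    for t in range(n):
        mp[t], pp[t] = a * m, a * a * p + q  # predict
        k = pp[t] / (pp[t] + r)              # Kalman gain
        m = mp[t] + k * (y[t] - mp[t])       # update with design output y[t]
        p = (1.0 - k) * pp[t]
        mf[t], pf[t] = m, p
    ms, ps = mf.copy(), pf.copy()            # smoothed means / variances
    for t in range(n - 2, -1, -1):           # backward RTS pass
        g = pf[t] * a / pp[t + 1]
        ms[t] = mf[t] + g * (ms[t + 1] - mp[t + 1])
        ps[t] = pf[t] + g * g * (ps[t + 1] - pp[t + 1])
    return ms, ps

# Illustrative use: smooth noisy outputs of a slowly decaying signal.
t = np.arange(50)
y = np.exp(-t / 20.0) + 0.05 * np.random.randn(50)
means, variances = kalman_rts(y, a=0.95, q=1e-3, r=0.05**2, m0=1.0, p0=1.0)
```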
Abstract:
Aim The aim of this paper was to explore the concept of expertise in nursing from the perspective of how it relates to current driving forces in health care, and to discuss the potential barriers to acceptance of nursing expertise in a climate in which quantification of value and cost containment run high on agendas. Background Expert nursing practice can be argued to be central to high quality, holistic, individualized patient care. However, changes in government policy, which have led to the inception of comprehensive guidelines or protocols of care, are in danger of relegating the ‘expert nurse’ to being an icon of the past. Indeed, it could be argued that expert nurses are an expensive commodity within the nursing workforce. Consequently, this shift to the use of clinical guidelines calls into question how expert nursing practice will develop within this framework of care. Method The article critically reviews the evidence related to the role of the expert nurse in an attempt to identify the key concepts and ideas, and to examine the implications that the inception of care protocols has for this role. Conclusion Nursing expertise which focuses on the provision of individualized, holistic care and is based largely on intuitive decision making cannot, and should not, be reduced to being articulated in positivist terms. However, the dominant power and decision-making focus in health care means that nurses must be confident in articulating the value of a concept which may be outside the scope of knowledge of those with whom they are debating. Relevance to clinical practice The principles of abduction or fuzzy logic may be useful in assisting nurses to explain, in terms which others can comprehend, the value of nursing expertise.
Abstract:
This paper explores the concept of expertise in intensive care nursing practice from the perspective of its relationship to the current driving forces in healthcare. It discusses the potential barriers to acceptance of nursing expertise in a climate in which quantification of value and cost containment run high on agendas. It argues that nursing expertise which focuses on the provision of individualised, holistic care and which is based largely on intuitive decision-making cannot and should not be reduced to being articulated in positivist terms. The principles of abduction or fuzzy logic, derived from computer science, may be useful in assisting nurses to explain, in terms which others can comprehend, the value of nursing expertise.
Jacobian-free Newton-Krylov methods with GPU acceleration for computing nonlinear ship wave patterns
Abstract:
The nonlinear problem of steady free-surface flow past a submerged source is considered as a case study for three-dimensional ship wave problems. Of particular interest is the distinctive wedge-shaped wave pattern that forms on the surface of the fluid. By reformulating the governing equations with a standard boundary-integral method, we derive a system of nonlinear algebraic equations that enforce a singular integro-differential equation at each midpoint on a two-dimensional mesh. Our contribution is to solve the system of equations with a Jacobian-free Newton-Krylov method together with a banded preconditioner that is carefully constructed with entries taken from the Jacobian of the linearised problem. Further, we are able to utilise graphics processing unit acceleration to significantly increase the grid refinement and decrease the run-time of our solutions in comparison to schemes that are presently employed in the literature. Our approach provides opportunities to explore the nonlinear features of three-dimensional ship wave patterns, such as the shape of steep waves close to their limiting configuration, in a manner that has been possible in the two-dimensional analogue for some time.
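The key ingredient of a Jacobian-free Newton-Krylov solver is that the Krylov method only needs Jacobian-vector products, which can be approximated by finite differences of the residual. The sketch below illustrates this idea with SciPy's GMRES on a small toy system; it does not reproduce the boundary-integral equations, the banded preconditioner or the GPU acceleration described above, and the tolerances and demo residual are assumptions.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

def jfnk(residual, x0, newton_tol=1e-8, max_newton=20, fd_eps=1e-7):
    """Solve residual(x) = 0 with a Jacobian-free Newton-Krylov method.

    The Jacobian-vector product J(x) v is approximated by the forward
    difference (residual(x + eps*v) - residual(x)) / eps, so only residual
    evaluations are required.
    """
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(max_newton):
        F = residual(x)
        if np.linalg.norm(F) < newton_tol:
            break

        def jac_vec(v, x=x, F=F):
            eps = fd_eps * (1.0 + np.linalg.norm(x)) / max(np.linalg.norm(v), 1e-30)
            return (residual(x + eps * v) - F) / eps

        J = LinearOperator((x.size, x.size), matvec=jac_vec)
        dx, info = gmres(J, -F)              # Krylov solve for the Newton step
        if info != 0:
            raise RuntimeError("GMRES did not converge")
        x += dx
    return x

# Illustrative use: a small nonlinear system (not the ship-wave equations).
def demo_residual(x):
    return np.array([x[0]**2 + x[1] - 3.0, x[0] + x[1]**2 - 5.0])

print(jfnk(demo_residual, np.array([1.0, 1.0])))  # converges to (1, 2)
```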
Abstract:
In this paper we describe the approaches adopted to generate the five runs submitted to ImageClefPhoto 2009 by the University of Glasgow. The aim of our methods is to exploit document diversity in the rankings. All our runs used text statistics extracted from the captions associated with each image in the collection, except one run which combines the textual statistics with visual features extracted from the provided images. The results suggest that our methods based on text captions significantly improve the performance of the respective baselines, while the approach that combines visual features with text statistics shows lower levels of improvement.
Abstract:
This paper presents the results of task 3 of the ShARe/CLEF eHealth Evaluation Lab 2013. This evaluation lab focuses on improving access to medical information on the web. The task objective was to investigate the effect of using additional information, such as discharge summaries, and external resources, such as medical ontologies, on information retrieval (IR) effectiveness. The participants were allowed to submit up to seven runs: one mandatory run using no additional information or external resources, three runs using the discharge summaries, and three runs not using them.
Abstract:
We examine the security of the 64-bit lightweight block cipher PRESENT-80 against related-key differential attacks. With a computer search we are able to prove that, for any related-key differential characteristic on full-round PRESENT-80, the probability of the characteristic in the 64-bit state alone is not higher than 2⁻⁶⁴. To overcome the computational complexity of the search, which is exponential in the state and key sizes, we use truncated differences; however, as the key schedule is not nibble oriented, we switch to actual differences and apply early-abort techniques to prune the tree-based search. With a new method, called the extended split approach, we are able to make the whole search feasible, and we implement and run it in real time. Our approach targets the PRESENT-80 cipher; however, with small modifications it can be reused for other lightweight ciphers as well.
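The early-abort pruning mentioned above can be illustrated with a generic depth-first search that abandons any branch whose accumulated probability can no longer beat the best characteristic found so far (or the 2⁻⁶⁴ bound). The sketch below is only that generic skeleton: the transitions() callable is a hypothetical stand-in for the cipher-specific propagation rules, and it is not an implementation of the PRESENT-80 search or of the extended split approach.

```python
def best_characteristic(start_diff, rounds, transitions, bound_log2=-64.0):
    """Depth-first search for the most probable differential characteristic,
    pruning any branch whose accumulated log2 probability has already fallen
    to the bound or below (the 'early abort' idea). Returns the best log2
    probability found; a result equal to the bound means no characteristic
    better than the bound exists for the given transition rules.
    """
    best = [bound_log2]                        # best log2 probability so far

    def dfs(diff, r, acc_log2):
        if acc_log2 <= best[0]:                # early abort: cannot beat current best
            return
        if r == rounds:
            best[0] = acc_log2
            return
        for out_diff, p_log2 in transitions(diff):
            dfs(out_diff, r + 1, acc_log2 + p_log2)

    dfs(start_diff, 0, 0.0)
    return best[0]

# Purely made-up toy transition rule: each 4-bit difference has two possible
# successors, costing 2 and 3 bits of probability respectively.
def toy_transitions(diff):
    return [((diff * 3) & 0xF, -2.0), ((diff ^ 0x9) & 0xF, -3.0)]

print(best_characteristic(0x1, rounds=10, transitions=toy_transitions))  # -20.0
```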
Abstract:
Classical results in unconditionally secure multi-party computation (MPC) protocols with a passive adversary indicate that every n-variate function can be computed by n participants, such that no set of size t < n/2 participants learns any additional information other than what they could derive from their private inputs and the output of the protocol. We study unconditionally secure MPC protocols in the presence of a passive adversary in the trusted setup (‘semi-ideal’) model, in which the participants are supplied with some auxiliary information (which is random and independent from the participant inputs) ahead of the protocol execution (such information can be purchased as a “commodity” well before a run of the protocol). We present a new MPC protocol in the trusted setup model, which allows the adversary to corrupt an arbitrary number t < n of participants. Our protocol makes use of a novel subprotocol for converting an additive secret sharing over a field to a multiplicative secret sharing, and can be used to securely evaluate any n-variate polynomial G over a field F, with inputs restricted to non-zero elements of F. The communication complexity of our protocol is O(ℓ · n²) field elements, where ℓ is the number of non-linear monomials in G. Previous protocols in the trusted setup model require communication proportional to the number of multiplications in an arithmetic circuit for G; thus, our protocol may offer savings over previous protocols for functions with a small number of monomials but a large number of multiplications.
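The basic primitive referred to above, additive secret sharing over a field, can be sketched in a few lines: the secret is split into n random field elements that sum to it, so any n − 1 shares reveal nothing. The snippet below is only that primitive (with an illustrative prime modulus and party count); it does not reproduce the paper's additive-to-multiplicative conversion subprotocol or the trusted-setup commodities.

```python
import secrets

P = 2**61 - 1                       # illustrative prime modulus for the field F_p

def additive_share(secret, n):
    """Split `secret` into n additive shares whose sum mod P equals the secret.
    Any n-1 shares are uniformly random and reveal nothing about the secret."""
    shares = [secrets.randbelow(P) for _ in range(n - 1)]
    shares.append((secret - sum(shares)) % P)
    return shares

def reconstruct(shares):
    return sum(shares) % P

# Shared values can be added locally: each party adds its shares of x and y.
x_shares = additive_share(123456789, n=5)
y_shares = additive_share(987654321, n=5)
sum_shares = [(a + b) % P for a, b in zip(x_shares, y_shares)]
assert reconstruct(sum_shares) == (123456789 + 987654321) % P
```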
Abstract:
Despite board meetings representing the main arena where directors discharge their duties and make critical corporate decisions, we know little about what occurs in the boardroom. Consequently, there is increasing academic interest in understanding how meetings are run and how directors participate. This study contributes to this emerging literature by exploring the impact of board meeting arrangements on directors’ interactions and perceptions of meeting effectiveness. We video-taped board meetings at two Australian corporations operating in the same industry and used an in-depth analysis of interactions and board processes to reveal that a rather small difference in meeting arrangements (i.e. the timing and length of meetings) had a significant influence on interaction patterns. Specifically, given significant amounts of environmental turbulence in the sector, director inclusiveness and participation were reduced as time pressure increased due to shorter meetings, lowering director perceptions of meeting effectiveness.
Abstract:
The parabolic trough concentrator collector is the most mature, proven and widespread technology for the exploitation of solar energy on a large scale for middle-temperature applications. The assessment of the opportunities and possibilities of the collector system relies on its optical performance. A reliable Monte Carlo ray-tracing model of a parabolic trough collector is developed using Zemax software. The optical performance of an ideal collector depends on the solar spectral distribution, the sunshape, and the spectral selectivity of the associated components. Therefore, each step of the model, including the spectral distribution of the solar energy, the trough reflectance, the glazing anti-reflection coating and the absorber selective coating, is explained and verified. The radiation flux distribution around the receiver and the optical efficiency, two basic aspects of the optical simulation, are calculated using the model and verified against a widely accepted analytical profile and measured values, respectively. Reasonably good agreement is obtained. Further investigations are carried out to analyse the characteristics of the radiation distribution around the receiver tube for different insolation, envelope conditions and receiver selective coatings, as well as the impact of light scattered from the receiver surface on the efficiency. In addition, the model has the capability to analyse the optical performance for variable sunshape, tracking error, collector imperfections (including absorber misalignment with the focal line and de-focusing of the absorber), different rim angles, and geometric concentrations. The current optical model can play a significant role in understanding the optical aspects of a trough collector, and can be employed to extract useful information on the optical performance. In the long run, this optical model will pave the way for the construction of a low-cost standalone photovoltaic and thermal hybrid collector in Australia for small-scale domestic hot water and electricity production.
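As a rough illustration of the Monte Carlo ray-tracing idea (and not of the Zemax model described above), the sketch below traces rays in the two-dimensional cross-section of a parabolic trough: incident rays are perturbed by a simple Gaussian sunshape, specularly reflected off the mirror y = x²/(4f), and counted as hits if they pass within the receiver tube radius of the focal line. All geometric and optical parameters are illustrative assumptions.

```python
import numpy as np

def intercept_factor(n_rays=100_000, focal=1.71, aperture=5.77,
                     receiver_radius=0.035, sunshape_sigma_mrad=2.8, seed=0):
    """Monte Carlo estimate of the intercept factor of a parabolic trough in
    2D cross-section: the fraction of rays that, after specular reflection
    off the parabola y = x^2 / (4 f), pass within the receiver tube radius of
    the focal point (0, f). Default geometry (metres) and sunshape width
    (milliradians) are illustrative, not a specific collector.
    """
    rng = np.random.default_rng(seed)
    x = rng.uniform(-aperture / 2, aperture / 2, n_rays)     # hit points on mirror
    y = x**2 / (4.0 * focal)

    # Incoming rays travel in -y, perturbed by a small random angle (radians).
    theta = rng.normal(0.0, sunshape_sigma_mrad * 1e-3, n_rays)
    d = np.stack([np.sin(theta), -np.cos(theta)], axis=1)    # ray directions

    # Unit surface normal of y - x^2/(4f) = 0 is proportional to (-x/(2f), 1).
    n = np.stack([-x / (2.0 * focal), np.ones_like(x)], axis=1)
    n /= np.linalg.norm(n, axis=1, keepdims=True)

    # Specular reflection: r = d - 2 (d . n) n
    r = d - 2.0 * np.sum(d * n, axis=1, keepdims=True) * n

    # Perpendicular distance from the focal point (0, f) to each reflected ray.
    to_focus = np.stack([-x, focal - y], axis=1)
    dist = np.abs(to_focus[:, 0] * r[:, 1] - to_focus[:, 1] * r[:, 0])

    return np.mean(dist <= receiver_radius)

print(intercept_factor())
```

Binning the hit positions around the receiver circumference, rather than only counting hits, would give the kind of flux distribution discussed above.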
Abstract:
The goDesign Express 2011 Workshop was a design immersion workshop run by the Queensland University of Technology (QUT) Built Environment and Engineering Faculty during three weeks of 70-minute art class periods/sessions, in August/September 2011 at Morayfield State High School for 80 Grade 10 and 64 Grade 11 art students and two teachers, and in October 2011 at Narangba Valley State High School for 60 Grade 10 and 30 Grade 11 art students and two teachers. Funded and administered through QUT’s Widening Participation Program, which supports outreach activities to increase tertiary enrolments for underrepresented groups (such as low-SES, rural and indigenous students), the program utilised two activities from Day 1 of the highly successful 3-day goDesign Travelling Workshop Program for Regional Secondary Students (http://eprints.qut.edu.au/47747/). In contrast to this program, which was facilitated by two tertiary design educators, the goDesign Express 2011 Workshop was facilitated primarily by three tertiary interior design/architecture students, with assistance from a design educator. This action research study aimed to facilitate an awareness in young people of the value of design thinking skills in generating strategies to solve local community challenges. It also aimed to investigate the value of collaboration between secondary school students and teachers, and tertiary design students and educators, in inspiring post-secondary pathways for school students, professional development for schoolteachers, and alternative career prospects and leadership skills for tertiary design students. During the workshop, secondary students and teachers explored, analysed and reimagined their local community through a series of scaffolded problem-solving activities around the theme of ‘place’. Students worked individually and in groups designing graphics, fashion and products, and utilising sketching, making, communication, collaboration and presentation skills to improve their design process, while considering social, cultural and environmental opportunities for their local community. The workshop was mentioned in a news article in the local Caboolture Shire Herald newspaper.