Abstract:
This paper presents an investigation into event detection in crowded scenes, where the event of interest co-occurs with other activities and only binary labels at the clip level are available. The proposed approach incorporates a fast feature descriptor from the MPEG domain and a novel multiple instance learning (MIL) algorithm using sparse approximation and random sensing. MPEG motion vectors are used to build particle trajectories that represent the motion of objects in uniform video clips, and the MPEG DCT coefficients are used to compute a foreground map to remove background particles. Trajectories are transformed into the Fourier domain, and the Fourier representations are quantized into visual words using the K-Means algorithm. The proposed MIL algorithm models the scene as a linear combination of independent events, where each event is a distribution of visual words. Experimental results show that the proposed approach achieves promising results for event detection compared with the state of the art.
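The trajectory-to-visual-word step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the descriptor length, toy trajectory and two-word codebook are all hypothetical, and in practice the codebook would be learned with K-Means over many clips.

```python
import numpy as np

def fourier_descriptor(trajectory, n_coeffs=4):
    """Treat (x, y) points as complex numbers, take the FFT, and keep the
    low-frequency magnitudes. Dropping the DC term makes the descriptor
    translation-invariant."""
    z = trajectory[:, 0] + 1j * trajectory[:, 1]
    spectrum = np.fft.fft(z)
    return np.abs(spectrum[1:n_coeffs + 1])

def quantize(descriptor, codebook):
    """Assign a descriptor to its nearest codeword (visual word id)."""
    dists = np.linalg.norm(codebook - descriptor, axis=1)
    return int(np.argmin(dists))

# A toy trajectory: 8 particle positions moving along x.
traj = np.stack([np.arange(8.0), np.zeros(8)], axis=1)
desc = fourier_descriptor(traj)

# A toy 2-word codebook (in practice learned with K-Means).
codebook = np.stack([desc, desc + 10.0])
word = quantize(desc, codebook)
```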
Abstract:
Safety concerns in the operation of autonomous aerial systems require that safe-landing protocols be followed in situations where a mission must be aborted due to mechanical or other failure. On-board cameras provide information that can be used to determine potential landing sites, which are continually updated and ranked to prevent injury and minimize damage. Pulse Coupled Neural Networks (PCNNs) have been used to detect features in images that assist in the classification of vegetation and can be used to minimize damage to the aerial vehicle. However, a significant drawback of PCNNs is that they are computationally expensive and have been better suited to off-line applications on conventional computing architectures. As heterogeneous computing architectures become more common, an OpenCL implementation of a PCNN feature generator is presented and its performance is compared across OpenCL kernels designed for CPU, GPU and FPGA platforms. The comparison examines the compute times required for network convergence on a variety of images obtained during unmanned aerial vehicle trials to determine the feasibility of real-time feature detection.
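For readers unfamiliar with PCNNs, a greatly simplified iteration of the model can be sketched as below. This is an illustrative NumPy version, not the OpenCL kernels the paper benchmarks; the linking strength, decay and threshold constants are arbitrary choices.

```python
import numpy as np

def pcnn_step(stimulus, Y, theta, beta=0.2, decay=0.8, v_theta=20.0):
    """One simplified PCNN iteration: each neuron's internal activity
    combines its feeding input with linking from neighbors; neurons fire
    when activity exceeds a threshold that rises sharply after firing."""
    # 4-neighbor linking input from previously fired neurons
    L = (np.roll(Y, 1, 0) + np.roll(Y, -1, 0) +
         np.roll(Y, 1, 1) + np.roll(Y, -1, 1))
    U = stimulus * (1.0 + beta * L)              # modulated internal activity
    Y_new = (U > theta).astype(float)            # pulse output
    theta_new = decay * theta + v_theta * Y_new  # threshold jumps after firing
    return Y_new, theta_new

# Uniform stimulus: every neuron fires on the first pass, then the raised
# threshold suppresses firing on the second.
stim = np.ones((4, 4))
Y1, theta1 = pcnn_step(stim, np.zeros((4, 4)), np.zeros((4, 4)))
Y2, theta2 = pcnn_step(stim, Y1, theta1)
```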
Abstract:
The purpose of this scoping paper is to provide an overview of the literature and to determine the development to date of residential real estate agency academic and career education with respect to Foreign Direct Investment (FDI) transactions and their implications in Australia. The paper reviews studies on foreign real estate ownership and FDI in Australian real estate markets to establish the current state of knowledge on residential real estate agency practice, career education and real estate licensing requirements in Australia. What distinguishes real estate education from that of other professions, such as accounting, law and finance, is the intensity of professional career training undertaken before or after formal academic study. Real estate education could be delivered to relatively higher standards in terms of licensing requirements and career and academic education. Because FDI in the Australian real estate market is a complex globalisation and economic phenomenon, a simple residential real estate training and education curriculum may not build the capacity needed to manage foreign residential property market transactions properly. The preliminary summary of the literature on residential real estate agency education, and on its current and emerging licensing requirements, focuses on their role, effectiveness and impact in the residential real estate market. Particular attention is directed to FDI-related residential real estate agency transactions and practices, which have been strongly shaped by current residential real estate market conditions and agency practices. Taken together, there are many opportunities for future research to extend our understanding and to improve residential real estate agency education and training on Foreign Direct Investment in the Australian residential real estate sector.
Abstract:
Real Estate Agency education in Australia has undergone many changes during the last 40 years, primarily due to the changing nature of consumer protection provided by government and the expectation of increased fields of knowledge relating to real estate transactions. Each state and territory within Australia has a range of regulatory bodies to oversee consumer protection and the distinct licences and educational requirements that comprise their regimes. Since the 1970s, minimum educational requirements have been prescribed for real estate agency work in New South Wales. However, very little research exists in the Australian literature that evaluates these changes, including course content, hours allocated to each subject, assessment criteria for each subject, the educational qualification attained, and the requirement for industry experience as a prerequisite for licensing approval. It is argued that the changes to educational requirements have impacted negatively on consumers, with an increase in consumer complaints, particularly during the last 10 years.
Abstract:
As teacher/researchers interested in the pursuit of socially just outcomes in early childhood education, the form and function of language occupies a special position in our work. We believe that mastering a range of literacy competences includes not only the technical skills for learning, but also the resources for viewing and constructing the world (Freire and Macedo, 1987). Rather than seeing knowledge about language as the accumulation of technical skills alone, the viewpoint to which we subscribe treats knowledge about language as a dialectic that evolves from, is situated in, and contributes to a social arena (Halliday, 1978). We do not shy away from this position just because children are in the early years of schooling. In ‘Playing with Grammar’, we focus on the Foundation to Year 2 grouping, in line with the Australian Curriculum, Assessment and Reporting Authority’s (hereafter ACARA) advice on the ‘nature of learners’ (ACARA, 2013). With our focus on the early years of schooling comes our acknowledgement of the importance and complexity of play. At a time when accountability in education has moved many teachers to a sense of urgency to prove language and literacy achievement (Genishi and Dyson, 2009), we encourage space to revisit what we know about literature choices and learning experiences and bring these together to facilitate language learning. We can neither ignore, nor overemphasise, the importance of play for the development of language through: the opportunities presented for creative use and practice; social interactions for real purposes; and identifying and solving problems in the lives of young children (Marsh and Hallet, 2008). We argue that by engaging young children in opportunities to play with language we are ultimately empowering them to be active in their language learning and in the process fostering a love of language and the intricacies it holds.
Our goal in this publication is to provide a range of highly practical strategies for scaffolding young children through some of the Content Descriptions from the Australian Curriculum English Version 5.0, hereafter AC:E V5.0 (ACARA, 2013). This recently released curriculum offers a new theoretical approach to building children’s knowledge about language. The AC:E V5.0 uses selected traditional terms through an approach developed in systemic functional linguistics (see Halliday and Matthiessen, 2004) to highlight the dynamic forms and functions of multimodal language in texts. For example, the following statement, taken from the ‘Language: Knowing about the English language’ strand, states: English uses standard grammatical terminology within a contextual framework, in which language choices are seen to vary according to the topics at hand, the nature and proximity of the relationships between the language users, and the modalities or channels of communication available (ACARA, 2013). Put simply, traditional grammar terms are used within a functional framework made up of field, tenor, and mode. An understanding of genre is noted with the reference to a ‘contextual framework’. The ‘topics at hand’ concern the field or subject matter of the text. The ‘relationships between the language users’ is a description of tenor. There is reference to ‘modalities’, such as spoken, written or visual text. We posit that this innovative approach is necessary for working with contemporary multimodal and cross-cultural texts (see Exley and Mills, 2012). We believe there is enormous power in using literature to expose children to the richness of language and in turn develop language and literacy skills. Taking time to look at language patterns within actual literature is a pathway to ‘…capture interest, stir the imagination and absorb the [child]’ into the world of language and literacy (Saxby, 1993, p. 55).
In the following three sections, we have tried to remain faithful to our interpretation of the AC:E V5.0 Content Descriptions without giving an exhaustive explanation of the grammatical terms. Other excellent tomes, such as Derewianka (2011), Humphrey, Droga and Feez (2012), and Rossbridge and Rushton (2011) provide these more comprehensive explanations, as does the AC:E V5.0 Glossary. We’ve reproduced some of the AC:E V5.0 Glossary at the end of this publication. Our focus is on the structure and unfolding of the learning experiences. We outline strategies for working with children in Foundation, Year 1 and Year 2 by providing some demonstration learning experiences based on texts we’ve selected, but maintain that the affordances of these strategies will only be realised when teaching and learning is purposively tied to authentic projects in local contexts. We strongly encourage you not to use only the resource texts we’ve selected, but to capitalise upon your skill for identifying the language features in the texts you and the children are studying and adapt some of the strategies we have outlined. Each learning experience is connected to one of the Content Descriptions from the AC:E V5.0 and contains an experience-specific purpose, a suggested resource text and a sequence for the experience that always commences with an orientation to text followed by an examination of a particular grammatical resource. We expect that each of these learning experiences will take a couple, if not a few, teaching episodes to work through, especially if children are meeting a concept for the first time. We hope you use as much, or as little, of each experience as is needed. Our plans allow for focused discussion, shared exploration and opportunities to revisit the same text for the purpose of enhancing meaning making. We do not want the teaching of grammar to slip into a crisis of irrelevance or to be seen as a series of worksheet drills with finite answers.
Strategies for effective practice, however, have much portability. We are both very keen to hear from teachers who are adopting and adapting these learning experiences in their classrooms. Please email us on b.exley@qut.edu.au or lkervin@uow.edu.au. We’d love to continue the conversation with you over time.
Abstract:
This paper adopts data envelopment analysis (DEA)-based Malmquist total factor productivity (TFP) index methods to evaluate the effect of mergers and acquisitions (M&As) on acquirers over short-term and long-term windows. Based on an analysis of 32 M&A deals conducted by Chinese real estate firms between 2000 and 2011, the results demonstrate that the effect of M&A on developers’ performance is positive. Through M&A, the developers’ Malmquist TFP experienced steady growth; their technology progressed noticeably immediately after acquisition; and their technical efficiency suffered a slight decrease in the short term after acquisition, but then achieved a marked increase in the long term once integration and synergy were realized. However, there is no evidence that the real estate firms achieved scale efficiency improvement after M&A in either the short term or the long term.
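The Malmquist TFP intuition can be illustrated in a degenerate single-input, single-output case, where the index reduces to a ratio of productivity levels across two periods. This is a hypothetical simplification, not the paper's DEA computation over multiple inputs and outputs.

```python
def tfp_change(out0, in0, out1, in1):
    """Single-input, single-output TFP change between period 0 and period 1:
    the ratio of productivity levels (output per unit input)."""
    return (out1 / in1) / (out0 / in0)

# A firm that doubles output using 25% more input: TFP grows by 60%.
growth = tfp_change(out0=100.0, in0=50.0, out1=200.0, in1=62.5)
```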
Abstract:
Price-based techniques are one way to handle increases in peak demand and to deal with voltage violations in residential distribution systems. This paper proposes an improved real-time pricing scheme for residential customers with a demand response option. Smart meters and in-home display units are used to broadcast the price and appropriate load adjustment signals. Customers are given the opportunity to respond to the signals and adjust their loads. The scheme helps distribution companies deal with overloading problems and voltage issues more efficiently. In addition, variations in wholesale electricity prices are passed on to customers so that collective measures can be taken to reduce network peak demand. The scheme ensures that both customers and the utility benefit.
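The customer-side response to a broadcast price signal might be sketched as follows. This is a hypothetical illustration of the general idea, not the paper's scheme; the threshold rule and all values are invented.

```python
def respond_to_price(price, baseline_kw, deferrable_kw, threshold):
    """Return adjusted household demand (kW) for a broadcast price signal."""
    if price > threshold:
        # Price above the customer's comfort threshold: defer flexible
        # appliances (e.g. pool pump, dishwasher).
        return baseline_kw - deferrable_kw
    return baseline_kw

off_peak = respond_to_price(0.10, 3.0, 1.0, threshold=0.25)  # no change
peak = respond_to_price(0.40, 3.0, 1.0, threshold=0.25)      # load shed
```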
Abstract:
Introduction Falls are the most frequent adverse event reported in hospitals. Approximately 30% of in-hospital falls lead to an injury and up to 2% result in a fracture. A large randomised trial found that a trained health professional providing individualised falls prevention education to older inpatients reduced falls in a cognitively intact subgroup. This study aims to investigate whether this efficacious intervention can reduce falls and be clinically useful and cost-effective when delivered in the real-life clinical environment. Methods A stepped-wedge cluster randomised trial will be used across eight subacute units (clusters), which will be randomised to one of four dates to start the intervention. Usual care on these units includes patient screening, assessment and implementation of individualised falls prevention strategies, ongoing staff training and environmental strategies. Patients with better levels of cognition (Mini-Mental State Examination >23/30) will receive the individualised education from a trained health professional in addition to usual care, and patient feedback received during education sessions will be provided to unit staff. Unit staff will receive training to assist in intervention delivery and to enhance uptake of strategies by patients. Falls data will be collected by two methods: case note audit by research assistants and the hospital falls reporting system. Cluster-level data, including patient admissions, length of stay and diagnosis, will be collected from hospital systems. Data will be analysed allowing for correlation of outcomes (clustering) within units. An economic analysis will be undertaken, including an incremental cost-effectiveness analysis. Ethics and dissemination The study was approved by The University of Notre Dame Australia Human Research Ethics Committee and local hospital ethics committees.
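The stepped-wedge allocation described in the Methods, eight units randomised to one of four intervention start dates, can be sketched as below; the unit names and seed are illustrative only.

```python
import random

def stepped_wedge_allocation(units, n_steps, seed=0):
    """Randomly assign clusters, in equal groups, to intervention start steps."""
    rng = random.Random(seed)  # fixed seed for a reproducible roster
    shuffled = list(units)
    rng.shuffle(shuffled)
    per_step = len(shuffled) // n_steps
    return {step: shuffled[step * per_step:(step + 1) * per_step]
            for step in range(n_steps)}

# Eight subacute units randomised to four start dates, two per step.
alloc = stepped_wedge_allocation([f"unit{i}" for i in range(1, 9)], n_steps=4)
```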
Results The results will be disseminated through local site networks and will inform future funding and delivery of falls prevention programmes within WA Health. Results will also be disseminated through peer-reviewed publications and medical conferences.
Abstract:
This paper describes the theory and practice of stable haptic teleoperation of a flying vehicle. It extends the passivity-based control framework for haptic teleoperation of aerial vehicles to the longest intercontinental setting, which presents great challenges. The practicality of the control architecture has been shown in maneuvering and obstacle-avoidance tasks over the internet in the presence of significant time-varying delays and packet losses. Experimental results are presented for teleoperation of a slave quadrotor in Australia from a master station in the Netherlands. The results show that the remote operator is able to safely maneuver the flying vehicle through a structure using haptic feedback of the state of the slave and the perceived obstacles.
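One standard building block of passivity-based teleoperation is a time-domain passivity observer, which integrates the power exchanged over the delayed channel; if the running energy ever goes negative, the channel has become active and damping must be injected. A minimal sketch (not the paper's controller, with invented signal values) is:

```python
def passivity_observer(forces, velocities, dt):
    """Running energy E[k] = sum of f*v*dt over samples; the channel is
    passive while every partial sum stays non-negative."""
    energy, trace = 0.0, []
    for f, v in zip(forces, velocities):
        energy += f * v * dt
        trace.append(energy)
    return trace

# Constant positive power flow: the channel never becomes active.
trace = passivity_observer([1.0, 1.0, 1.0], [0.5, 0.5, 0.5], dt=0.01)
```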
Abstract:
The ability to measure surface temperature and represent it on a metrically accurate 3D model has proven applications in many areas such as medical imaging, building energy auditing, and search and rescue. A system is proposed that enables this task to be performed with a handheld sensor and, for the first time, with results able to be visualized and analyzed in real time. A device comprising a thermal-infrared camera and a range sensor is calibrated geometrically and used for data capture. The device is localized using a combination of ICP and video-based pose estimation from the thermal-infrared video footage, which is shown to reduce the occurrence of failure modes. Furthermore, the problem of misregistration, which can introduce severe distortions in assigned surface temperatures, is avoided through the use of a risk-averse neighborhood weighting mechanism. Results demonstrate that the system is more stable and accurate than previous approaches, and can be used to accurately model complex objects and environments for practical tasks.
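The idea of risk-averse neighborhood weighting can be sketched as follows: samples that disagree with their neighborhood are down-weighted so a misregistered measurement contributes little to the fused temperature. This is a hypothetical illustration, not the system's actual mechanism; the Gaussian weighting and values are invented.

```python
import numpy as np

def risk_averse_weights(temps, sigma=1.0):
    """Down-weight samples that disagree with the neighborhood median, so a
    misregistered reading contributes little to the fused value."""
    spread = np.abs(temps - np.median(temps))
    return np.exp(-(spread / sigma) ** 2)

def fuse_temperature(samples):
    temps = np.asarray(samples, dtype=float)
    w = risk_averse_weights(temps)
    return float(np.sum(w * temps) / np.sum(w))

# Four consistent observations and one misregistered outlier.
fused = fuse_temperature([36.5, 36.6, 36.4, 36.5, 45.0])
```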
Abstract:
Technological advances have led to an influx of affordable hardware that supports sensing, computation and communication. This hardware is increasingly deployed in public and private spaces, tracking and aggregating a wealth of real-time environmental data. Although these technologies are the focus of several research areas, there is a lack of research dealing with the problem of making these capabilities accessible to everyday users. This thesis represents a first step towards developing systems that will allow users to leverage the available infrastructure and create custom-tailored solutions. It explores how this notion can be utilized in the context of energy monitoring to improve conventional approaches. The project adopted a user-centered design process to inform the development of a flexible system for real-time data stream composition and visualization. This system features an extensible architecture and defines a unified API for heterogeneous data streams. Rather than displaying the data in a predetermined fashion, it makes this information available as building blocks that can be combined and shared. It is based on the insight that individual users have diverse information needs and presentation preferences. Therefore, it allows users to compose rich information displays, incorporating personally relevant data from an extensive information ecosystem. The prototype was evaluated in an exploratory study to observe its natural use in a real-world setting, gathering empirical usage statistics and conducting semi-structured interviews. The results show that a high degree of customization does not guarantee sustained usage. Other factors were identified, yielding recommendations for increasing the impact on energy consumption.
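The notion of a unified API over heterogeneous data streams might look like the following sketch. The class and field names are hypothetical, not the thesis's actual interface.

```python
# Hypothetical sketch: heterogeneous sources expose the same read()
# contract, so information displays can be composed from any mix of them.

class DataStream:
    def read(self):
        raise NotImplementedError

class PowerMeter(DataStream):
    def __init__(self, watts):
        self._watts = watts
    def read(self):
        return {"unit": "W", "value": self._watts}

class Thermometer(DataStream):
    def __init__(self, celsius):
        self._celsius = celsius
    def read(self):
        return {"unit": "degC", "value": self._celsius}

def compose(streams):
    """Combine arbitrary streams into one display-ready snapshot."""
    return [s.read() for s in streams]

snapshot = compose([PowerMeter(420.0), Thermometer(21.5)])
```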
Abstract:
Impaction bone grafting for reconstitution of bone stock in revision hip surgery has been used for nearly 30 years. We used this technique, in combination with a cemented acetabular component, in the acetabula of 304 hips in 292 patients revised for aseptic loosening between 1995 and 2001. The only additional supports used were stainless steel meshes placed against the medial wall or laterally around the acetabular rim to contain the graft. All Paprosky grades of defect were included. Clinical and radiographic outcomes were collected in surviving patients at a minimum of 10 years following the index operation. Mean follow-up was 12.4 years (SD 1.5; range 10.0-16.0). Kaplan-Meier survivorship with revision for aseptic loosening as the endpoint was 85.9% (95% CI 81.0 to 90.8%) at 13.5 years. Clinical scores for pain relief remained satisfactory, and there was no difference in clinical scores between cups that appeared stable and those that appeared loose radiographically.
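The survivorship figure reported above is a Kaplan-Meier estimate; the estimator itself can be sketched as below (toy follow-up data, assuming distinct event times).

```python
def kaplan_meier(times, events):
    """times: follow-up in years; events: 1 = revision, 0 = censored.
    Returns [(t, S(t))]: survival is the running product of
    (1 - d_i / n_i) over event times."""
    at_risk = len(times)
    survival, curve = 1.0, []
    for t, e in sorted(zip(times, events)):
        if e:
            survival *= (at_risk - 1) / at_risk
            curve.append((t, survival))
        at_risk -= 1
    return curve

# Toy data: four hips, revisions at years 1 and 3, censoring at 2 and 4.
curve = kaplan_meier([1, 2, 3, 4], [1, 0, 1, 0])
```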
Abstract:
Currently, GNSS computing modes fall into two classes: network-based data processing and user receiver-based processing. A GNSS reference receiver station essentially contributes raw measurement data, either in the RINEX file format or as real-time data streams in the RTCM format; very little computation is carried out by the reference station itself. The existing network-based processing modes, whether executed in real time or post-processed, are centralised or sequential. This paper describes a distributed GNSS computing framework that incorporates three GNSS modes: reference station-based, user receiver-based and network-based data processing. Raw data streams from each GNSS reference receiver station are processed in a distributed manner, i.e., either at the station itself or at a hosting data server/processor, to generate station-based solutions, or reference receiver-specific parameters. These may include precise receiver clock, zenith tropospheric delay, differential code biases, ambiguity parameters and ionospheric delays, as well as line-of-sight information such as azimuth and elevation angles. Covariance information for estimated parameters may also optionally be provided. In this mode, nearby precise point positioning (PPP) or real-time kinematic (RTK) users can directly use the corrections from all or some of the stations for real-time precise positioning via a data server. At the user receiver, PPP and RTK techniques are unified under the same observation models; the distinction lies in how the user receiver software deals with corrections from the reference station solutions and with ambiguity estimation in the observation equations. Numerical tests demonstrate good convergence behaviour for differential code bias and ambiguity estimates derived individually with single reference stations.
With station-based solutions from three reference stations within distances of 22–103 km the user receiver positioning results, with various schemes, show an accuracy improvement of the proposed station-augmented PPP and ambiguity-fixed PPP solutions with respect to the standard float PPP solutions without station augmentation and ambiguity resolutions. Overall, the proposed reference station-based GNSS computing mode can support PPP and RTK positioning services as a simpler alternative to the existing network-based RTK or regionally augmented PPP systems.
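The way a user receiver consumes station-derived corrections can be loosely illustrated as below. This is a schematic sketch with invented values, not the paper's observation model; real processing handles many more terms (ionosphere, phase ambiguities, covariance).

```python
# Schematic sketch: applying station-derived corrections to a raw code
# pseudorange before positioning.

C = 299_792_458.0  # speed of light, m/s

def corrected_pseudorange(raw_m, sat_clock_s, tropo_m, code_bias_m):
    """Remove modeled error terms from a raw code measurement (metres):
    add back the satellite clock offset, subtract tropospheric delay and
    differential code bias."""
    return raw_m + C * sat_clock_s - tropo_m - code_bias_m

rho = corrected_pseudorange(22_000_000.0, 1e-6, tropo_m=2.4, code_bias_m=0.8)
```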
Abstract:
This paper identifies and discusses the major occupational health and safety (OHS) hazards and risks for clean-up and recovery workers. The lessons learned from previous disasters, including the Exxon Valdez oil spill, the World Trade Centre (WTC) terrorist attack, Hurricane Katrina and the Deepwater Horizon Gulf of Mexico oil spill, are discussed. The case for an increased level of preparation and planning to mitigate the health risks for clean-up and recovery workers is presented, based on recurring themes identified in the peer-reviewed literature. There are a number of important issues pertaining to the occupational health and safety of workers engaged in clean-up and recovery operations following natural and technological disasters. These workers are often exposed to a wide range of hazards, some of which may be unknown at the time. It is well established that clean-up and recovery operations involve risks of physical injury, for example from manual handling, mechanical equipment, extreme temperatures, and slips, trips and falls. In addition to these well-established physical injury risks, an increasing number of studies now highlight the risks of longer term or chronic health effects arising from clean-up and recovery work. In particular, follow-up studies from the Exxon Valdez oil spill, Hurricane Katrina and the WTC terrorist attack have documented the longer term health consequences of these events. These health effects include respiratory symptoms and musculoskeletal disorders, as well as post-traumatic stress disorder (PTSD).
In large-scale operations, many of the workers and supervisors involved have not had any specific OHS training and may not have access to the necessary instruction, personal protective equipment or other appropriate equipment; this is especially true when volunteers form part of the clean-up and recovery workforce. In general, first responders are better equipped and trained than clean-up and recovery workers, and some of the training approaches used for traditional first responders would be relevant for clean-up and recovery workers.