Abstract:
A significant issue encountered when fusing data received from multiple sensors is the accuracy of the timestamp associated with each piece of data. This is particularly important in applications such as Simultaneous Localisation and Mapping (SLAM), where vehicle velocity forms an important part of the mapping algorithms; on fast-moving vehicles, even millisecond inconsistencies in data timestamping can produce errors which need to be compensated for. The timestamping problem is compounded in a robot swarm environment due to the use of non-deterministic, readily-available hardware (such as 802.11-based wireless) and inaccurate clock synchronisation protocols (such as Network Time Protocol (NTP)). As a result, the synchronisation of the clocks between robots can be out by tens to hundreds of milliseconds, making correlation of data difficult and preventing the units from performing synchronised actions such as triggering cameras or intricate swarm manoeuvres. In this thesis, a complete data fusion unit is designed, implemented and tested. The unit, named BabelFuse, is able to accept sensor data from a number of low-speed communication buses (such as RS232, RS485 and CAN Bus) and also timestamp events that occur on General Purpose Input/Output (GPIO) pins, referencing a submillisecond-accurate wirelessly-distributed "global" clock signal. In addition to its timestamping capabilities, it can also be used to trigger an attached camera at a predefined start time and frame rate. This functionality enables the creation of a wirelessly-synchronised distributed image acquisition system over a large geographic area; a real-world application for this functionality is the creation of a platform to facilitate wirelessly-distributed 3D stereoscopic vision. A ‘best-practice’ design methodology is adopted within the project to ensure the final system operates according to its requirements. Initially, requirements are generated, from which a high-level architecture is distilled. This architecture is then converted into a hardware specification and low-level design, which is then manufactured. The manufactured hardware is then verified to ensure it operates as designed, and firmware and Linux Operating System (OS) drivers are written to provide the features and connectivity required of the system. Finally, integration testing is performed to ensure the unit functions as per its requirements. The BabelFuse system comprises a single Grand Master unit, which is responsible for maintaining the absolute value of the "global" clock. Slave nodes then determine their local clock offset from that of the Grand Master via synchronisation events which occur multiple times per second. The mechanism used for synchronising the clocks between the boards wirelessly makes use of specific hardware and a firmware protocol based on elements of the IEEE-1588 Precision Time Protocol (PTP). With the key requirement of the system being submillisecond-accurate clock synchronisation (as a basis for timestamping and camera triggering), automated testing is carried out to monitor the offsets between each Slave and the Grand Master over time. A common strobe pulse is also sent to each unit for timestamping; the correlation between the timestamps of the different units is used to validate the clock offset results.
Analysis of the automated test results shows that the BabelFuse units are almost three orders of magnitude more accurate than their requirement; the clocks of the Slave and Grand Master units do not differ by more than three microseconds over a running time of six hours, and the mean clock offset of Slaves to the Grand Master is less than one microsecond. The common strobe pulse used to verify the clock offset data yields a positive result, with a maximum variation between units of less than two microseconds and a mean value of less than one microsecond. The camera triggering functionality is verified by connecting the trigger pulse output of each board to a four-channel digital oscilloscope and setting each unit to output a 100 Hz periodic pulse with a common start time. The resulting waveform shows a maximum variation between the rising edges of the pulses of approximately 39 µs, well below its target of 1 ms.
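The abstract cites IEEE-1588 PTP without detailing the message exchange it borrows from; as context, a minimal sketch of the standard PTP offset and delay calculation is given below, with hypothetical timestamps (this is not the BabelFuse firmware).

```python
# Minimal sketch of the IEEE-1588 PTP offset/delay calculation that
# BabelFuse's firmware protocol is described as drawing on. Timestamps
# are illustrative values in microseconds, not measured data.

def ptp_offset_and_delay(t1, t2, t3, t4):
    """t1: master sends Sync, t2: slave receives Sync,
    t3: slave sends Delay_Req, t4: master receives Delay_Req.
    Assumes a symmetric network path."""
    offset = ((t2 - t1) - (t4 - t3)) / 2.0  # slave clock minus master clock
    delay = ((t2 - t1) + (t4 - t3)) / 2.0   # one-way path delay
    return offset, delay

# Example: slave clock runs 40 us ahead of the master, one-way delay 10 us.
offset, delay = ptp_offset_and_delay(t1=0.0, t2=50.0, t3=120.0, t4=90.0)
print(f"estimated offset: {offset} us, path delay: {delay} us")
```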
Abstract:
There is a growing interest in the use of megavoltage cone-beam computed tomography (MV CBCT) data for radiotherapy treatment planning. To calculate accurate dose distributions, knowledge of the electron density (ED) of the tissues being irradiated is required. In the case of MV CBCT, it is necessary to determine a calibration relating CT number to ED, utilizing the photon beam produced for MV CBCT. A number of different parameters can affect this calibration. This study was undertaken on the Siemens MV CBCT system, MVision, to evaluate the effect of the following parameters on the reconstructed CT pixel value to ED calibration: the number of monitor units (MUs) used (5, 8, 15 and 60 MUs), the image reconstruction filter (head and neck, and pelvis), reconstruction matrix size (256 by 256 and 512 by 512), and the addition of extra solid water surrounding the ED phantom. A Gammex electron density CT phantom containing EDs from 0.292 to 1.707 was imaged under each of these conditions. The linear relationship between MV CBCT pixel value and ED was demonstrated for all MU settings and over the range of EDs. Changes in MU number did not dramatically alter the MV CBCT ED calibration. The use of different reconstruction filters was found to affect the MV CBCT ED calibration, as was the addition of solid water surrounding the phantom. Dose distributions from treatment plans calculated on a 15 MU head and neck reconstruction filter MV CBCT image, using either the ED calibration curve matching those image parameters or one derived with a 15 MU pelvis reconstruction filter, showed small and clinically insignificant differences. Thus, the use of a single MV CBCT ED calibration curve is unlikely to result in any clinical differences. However, to ensure minimal uncertainties in dose reporting, MV CBCT ED calibration could be carried out using parameter-specific calibration measurements.
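As an illustration of the calibration described above, a least-squares line can be fitted through (pixel value, ED) pairs; in the sketch below the pixel values are invented placeholders, and only the ED range (0.292 to 1.707) comes from the study.

```python
# Hypothetical sketch of a CT-number-to-ED calibration: fit a straight
# line through (pixel value, electron density) pairs measured in a
# Gammex-style phantom. Pixel values here are made up for illustration;
# only the ED endpoints (0.292 and 1.707) appear in the abstract.
import numpy as np

pixel_values = np.array([310.0, 620.0, 905.0, 1000.0, 1180.0, 1660.0])
electron_density = np.array([0.292, 0.63, 0.92, 1.0, 1.2, 1.707])

slope, intercept = np.polyfit(pixel_values, electron_density, deg=1)

def pixel_to_ed(pixel):
    """Convert a reconstructed MV CBCT pixel value to relative ED."""
    return slope * pixel + intercept

print(f"ED at pixel value 1000: {pixel_to_ed(1000.0):.3f}")
```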
Abstract:
Photographic records of dietary intake (PhDRs) are an innovative method for dietary assessment and may alleviate the burden of recording intake compared with traditional methods. While the performance of PhDRs has been evaluated, no investigation into the application of this method had occurred within dietetic practice. This study examined the attitudes of dietitians towards the use of PhDRs in the provision of nutrition care. A web-based survey on practices and beliefs with regard to technology use among Dietitians Association of Australia members was conducted in August 2011. Of the 87 dietitians who responded, 86% assessed the intakes of clients as part of individualised medical nutrition therapy, with the diet history the most common method used. The majority (91%) of dietitians surveyed believed that a PhDR would be of use in their current practice to estimate intake. Information contained in the PhDR would primarily be used to obtain a qualitative evaluation of diet (84%) or to supplement an existing assessment method (69%), as opposed to deriving an absolute measure of nutrient intake (31%). Most (87%) indicated that a PhDR would also be beneficial both in the delivery of the intervention and in evaluating and monitoring goals and outcomes, while only 46% felt that a PhDR would assist in determining the nutrition diagnosis. This survey highlights the potential for the use of PhDRs within practice. Future endeavours lie in establishing resources which support the inclusion of PhDRs within the nutrition care process.
Abstract:
This paper is a response to Hoban and Neilsen's (2010) Five Rs model for understanding how learners engage with slowmation. An alternative model (the Learning MMAEPER Model) that builds on the 5Rs model is explained in terms of its use in secondary science preservice teacher education. To probe into the surface and deep learning that can occur during the creation of a slowmation, the learning and relearning model is explored in terms of learning elements. This model can assist teachers to monitor the learning of their students and direct them to a deeper understanding of science concepts.
Abstract:
Purpose: To examine the symmetry of corneal changes following near work in the fellow eyes of non-amblyopic myopic anisometropes. Methods: Thirty-four non-amblyopic, myopic anisometropes (minimum 1 D spherical equivalent anisometropia) had corneal topography measured before and after a controlled near work task. Subjects were positioned in a headrest to minimise head movements and read continuous text on a computer monitor for 10 minutes at 25 degrees of downward gaze and an accommodation demand of 2.5 D. Measures of the morphology of the palpebral aperture during primary and downward gaze were also obtained. Results: The more and less myopic eyes exhibited a high degree of interocular symmetry for measures of palpebral aperture morphology during both primary and downward gaze. Following the near work task, fellow eyes also displayed a symmetrical change in superior corneal topography (hyperopic defocus) which correlated with the position of the upper eyelid during downward gaze. Greater changes in the spherical corneal power vector (M) following reading were associated with a narrower palpebral aperture during downward gaze (p = 0.07 for more myopic and p = 0.03 for less myopic eyes). A significantly greater change in J0 (an increase in against-the-rule astigmatism) was observed in the more myopic eyes (-0.04 ± 0.04 D) compared to the less myopic eyes (-0.02 ± 0.06 D) over a 6 mm corneal diameter (p = 0.01). Conclusions: Changes in corneal topography following near work are highly symmetrical between the fellow eyes of myopic anisometropes due to the interocular symmetry of the palpebral aperture. However, the more myopic eye exhibits changes in corneal astigmatism of greater magnitude compared to the less myopic eye.
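For context on the M and J0 measures reported above, the standard power-vector decomposition of a sphero-cylindrical refraction (after Thibos et al.) can be sketched as follows; the example sphere, cylinder and axis values are hypothetical, not the study's data.

```python
# Sketch of the standard power-vector decomposition used for the M and
# J0 values reported in the abstract. Example sphere, cylinder and axis
# values are hypothetical, not taken from the study.
import math

def power_vectors(sphere, cylinder, axis_deg):
    """Return (M, J0, J45) for a sphero-cylindrical power.

    Uses the negative-cylinder convention; axis in degrees.
    """
    a = math.radians(axis_deg)
    M = sphere + cylinder / 2.0                 # spherical equivalent
    J0 = (-cylinder / 2.0) * math.cos(2 * a)    # with/against-the-rule component
    J45 = (-cylinder / 2.0) * math.sin(2 * a)   # oblique astigmatism component
    return M, J0, J45

M, J0, J45 = power_vectors(sphere=-2.50, cylinder=-0.75, axis_deg=180)
print(f"M = {M:+.2f} D, J0 = {J0:+.2f} D, J45 = {J45:+.2f} D")
```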
Abstract:
Binge drinking is an important issue in Australia and worldwide. Existing studies have shown that mobile tools provide an effective method to self-monitor drinking sessions, whereas social tools such as Facebook can be used to construct a social drinker identity (thus normalizing binge drinking); if used within a peer-support network that promotes the importance of responsible drinking, however, they can potentially be effective in moderating alcohol consumption. To combine the mobile and social tool approaches, this study involves two complementary and largely qualitative studies to inform a novel design of an engaging mobile social tool for supporting responsible drinking among young women: (1) a survey of the literature and mobile tools from alcohol-related studies and interventions; (2) an in-depth focus group interview among young women aged 18 to 24. The results and discussions provide valuable insights for future research and development in the field.
Early evidence for direct and indirect effects of the infant rotavirus vaccine program in Queensland
Abstract:
Objective: To assess the impact of introducing a publicly funded infant rotavirus vaccination program on disease notifications and on laboratory testing and results. Design and setting: Retrospective analysis of routinely collected data (rotavirus notifications [2006–2008] and laboratory rotavirus testing data from Queensland Health laboratories [2000–2008]) to monitor rotavirus trends before and after the introduction of a publicly funded infant rotavirus vaccination program in Queensland in July 2007. Main outcome measures: Age group-specific rotavirus notification trends; number of rotavirus tests performed and the proportion positive. Results: In the less than 2 years age group, rotavirus notifications declined by 53% (2007) and 65% (2008); the number of laboratory tests performed declined by 3% (2007) and 15% (2008); and the proportion of tests positive declined by 45% (2007) and 43% (2008) compared with data collected before introduction of the vaccination program. An indirect effect of infant vaccination was seen: notifications and the proportion of tests positive for rotavirus declined in older age groups as well. Conclusions: The publicly funded rotavirus vaccination program in Queensland is having an early impact, direct and indirect, on rotavirus disease as assessed using routinely collected data. Further observational studies are required to assess vaccine effectiveness. Parents and immunisation providers should ensure that all Australian children receive the recommended rotavirus vaccine doses in the required timeframe.
Abstract:
Vertical displacements are one of the most relevant parameters for structural health monitoring of bridges in both the short and long terms. Bridge managers around the globe are always looking for a simple way to measure vertical displacements of bridges. However, it is difficult to carry out such measurements. On the other hand, in recent years, with the advancement of fiber-optic technologies, fiber Bragg grating (FBG) sensors have become more commonly used in structural health monitoring due to their outstanding advantages, including multiplexing capability, immunity to electromagnetic interference, and high resolution and accuracy. For these reasons, FBG sensors are proposed here to develop a simple, inexpensive and practical method to measure vertical displacements of bridges. A curvature approach is proposed, determining vertical displacements from curvature measurements. In addition, with the successful development of FBG tilt sensors, an inclination approach is also proposed, using inclination measurements. A series of simulation tests of a full-scale bridge was conducted. It shows that both approaches can be implemented to determine vertical displacements for bridges with various support conditions and varying stiffness (EI) along the spans, without any prior known loading. These approaches can thus measure vertical displacements for most slab-on-girder and box-girder bridges. The approaches are also feasible to implement for bridges under various loadings. Moreover, with the advantages of FBG sensors, they can be implemented to monitor bridge behavior remotely and in real time. A beam loading test was conducted to determine vertical displacements using FBG strain sensors and tilt sensors. The discrepancies compared with dial gauge readings using the curvature and inclination approaches are 0.14 mm (1.1%) and 0.41 mm (3.2%), respectively. Further recommendations for developing these approaches are discussed at the end of the paper.
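The curvature approach rests on the beam relation w''(x) = κ(x): integrating measured curvature twice, with the support displacements as boundary conditions, recovers the deflected shape. A minimal numerical sketch for a simply supported span follows; the sinusoidal curvature profile is an assumed stand-in for FBG-derived measurements.

```python
# Minimal sketch of the curvature approach: vertical displacement w(x)
# follows from double integration of measured curvature, w''(x) = kappa(x),
# with w = 0 enforced at both supports of a simply supported span.
# Span length and curvature profile below are assumed, not measured data;
# the sign convention for kappa is arbitrary in this sketch.
import numpy as np

L = 30.0                                # span length (m), assumed
x = np.linspace(0.0, L, 201)
kappa = -1e-4 * np.sin(np.pi * x / L)   # assumed curvature profile (1/m)

# First integration (trapezoid rule): slope, up to a constant.
theta = np.concatenate(
    ([0.0], np.cumsum(0.5 * (kappa[1:] + kappa[:-1]) * np.diff(x))))
# Second integration: displacement, up to a linear term.
w = np.concatenate(
    ([0.0], np.cumsum(0.5 * (theta[1:] + theta[:-1]) * np.diff(x))))
# Enforce boundary conditions w(0) = w(L) = 0 by removing the linear term.
w -= w[0] + (w[-1] - w[0]) * x / L

print(f"midspan deflection: {w[len(x) // 2] * 1000:.2f} mm")
```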
Abstract:
Reporting of medication administration errors (MAEs) is one means by which health care facilities monitor their practice in an attempt to maintain the safest patient environment. This study examined the likelihood of registered nurses (RNs) reporting MAEs when working in Saudi Arabia. It also attempted to identify potential barriers to the reporting of MAEs. This study found that 63% of RNs raised concerns about reporting of MAEs in Saudi Arabia; nursing administration was the largest impediment affecting nurses' willingness to report MAEs. Changing attitudes towards a non-blame system and implementing anonymous reporting systems may encourage greater reporting of MAEs.
Abstract:
In this paper we demonstrate how to monitor smartphones running the Symbian operating system and Windows Mobile in order to extract features for anomaly detection. These features are sent to a remote server, because running a complex intrusion detection system on this kind of mobile device is still not feasible due to capability and hardware limitations. We give examples of how to compute relevant features and introduce the top ten applications used by mobile phone users, based on a study from 2005. The usage of these applications is recorded by a monitoring client and visualized. Additionally, monitoring results for public and self-written malware are shown. To improve monitoring client performance, Principal Component Analysis was applied, which led to a decrease of about 80% in the number of monitored features.
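As an illustration of the dimensionality-reduction step described above, the sketch below projects a monitored feature matrix onto the principal components that retain 95% of its variance; random low-rank data stands in for real monitoring logs, and scikit-learn's PCA is used rather than the authors' implementation.

```python
# Illustrative sketch of PCA-based feature reduction: project a monitored
# feature matrix onto enough principal components to keep 95% of the
# variance. Random low-rank data stands in for real smartphone monitoring
# logs; the paper's own implementation may differ.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
latent = rng.normal(size=(500, 5))           # hidden structure, assumed
mixing = rng.normal(size=(5, 50))
features = latent @ mixing + 0.1 * rng.normal(size=(500, 50))

pca = PCA(n_components=0.95)                 # keep 95% of the variance
reduced = pca.fit_transform(features)

kept = reduced.shape[1]
print(f"features kept: {kept} of 50 "
      f"({100 * (1 - kept / 50):.0f}% reduction)")
```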
Abstract:
Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the “gold standard” for predicting dose deposition in the patient. In this study, software has been developed that enables the transfer of treatment plan information from the treatment planning system to a Monte Carlo dose calculation engine. A database of commissioned linear accelerator models (Elekta Precise and Varian 2100CD at various energies) has been developed using the EGSnrc/BEAMnrc Monte Carlo suite. Planned beam descriptions and CT images can be exported from the treatment planning system using the DICOM framework. The information in these files is combined with an appropriate linear accelerator model to allow the accurate calculation of the radiation field incident on a modelled patient geometry. The Monte Carlo dose calculation results are combined according to the monitor units specified in the exported plan. The result is a 3D dose distribution that could be used to verify treatment planning system calculations. The software, MCDTK (Monte Carlo Dicom ToolKit), has been developed in the Java programming language and produces BEAMnrc and DOSXYZnrc input files, ready for submission on a high-performance computing cluster. The code has been tested with the Eclipse (Varian Medical Systems), Oncentra MasterPlan (Nucletron B.V.) and Pinnacle3 (Philips Medical Systems) planning systems. In this study the software was validated against measurements in homogeneous and heterogeneous phantoms. Monte Carlo models are commissioned through comparison with quality assurance measurements made using a large square field incident on a homogeneous volume of water. This study aims to provide a valuable confirmation that Monte Carlo calculations match experimental measurements for complex fields and heterogeneous media.
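The combination step described here is, in essence, a monitor-unit-weighted sum of per-beam dose grids. A hypothetical sketch of that step follows; the array shapes, per-MU normalisation and values are assumptions for illustration, not MCDTK internals.

```python
# Hypothetical sketch of the plan-combination step: per-beam Monte Carlo
# dose grids (assumed normalised to dose per monitor unit) are weighted
# by the monitor units exported in the DICOM plan and summed into one 3D
# distribution. Shapes and values are illustrative, not MCDTK internals.
import numpy as np

rng = np.random.default_rng(0)
grid_shape = (64, 64, 32)                       # voxel grid, assumed
beam_doses = [rng.random(grid_shape) for _ in range(3)]  # dose per MU
monitor_units = [120.0, 95.0, 110.0]            # MU per beam, from the plan

total_dose = np.zeros(grid_shape)
for dose_per_mu, mu in zip(beam_doses, monitor_units):
    total_dose += mu * dose_per_mu              # weight each beam by its MUs

print(f"max combined dose: {total_dose.max():.1f} (arbitrary units)")
```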
Abstract:
Articular cartilage is a complex structure in which fluid-swollen proteoglycans are constrained within a 3D network of collagen fibrils. Because of the complexity of the cartilage structure, the relationship between its mechanical behaviours at the macro-scale level and its components at the micro-scale level is not completely understood. The research objective in this thesis is to create a new model of articular cartilage that can be used to simulate, and obtain insight into, the micro-macro interaction and the mechanisms underlying its mechanical responses during physiological function. The new model of articular cartilage has two characteristics, namely: i) it does not use the fibre-reinforced composite material idealization; and ii) it provides a framework for probing the micro mechanism of the fluid-solid interaction underlying the deformation of articular cartilage, using simple rules of repartition instead of constitutive/physical laws and intuitive curve-fitting. Even though there are various microstructural and mechanical behaviours that could be studied, the scope of this thesis is limited to osmotic pressure formation and distribution and their influence on cartilage fluid diffusion and percolation, which in turn govern the deformation of the compression-loaded tissue. The study can be divided into two stages. In the first stage, the distributions and concentrations of proteoglycans, collagen and water were investigated using histological protocols. Based on this, the structure of cartilage was conceptualised as microscopic osmotic units consisting of these constituents, distributed according to the histological results. These units were repeated three-dimensionally to form the structural model of articular cartilage. In the second stage, cellular automata were incorporated into the resulting matrix (lattice) to simulate the osmotic pressure of the fluid and the movement of water within and out of the matrix, following the osmotic pressure gradient in accordance with the chosen rule of repartition of the pressure. The outcome of this study is a new model of articular cartilage that can be used to simulate and study the micromechanical behaviours of cartilage under different conditions of health and loading. These behaviours are illuminated at the micro-scale level using the so-called neighbourhood rules developed in the thesis in accordance with the typical requirements of cellular automata modelling. Using these rules and relevant boundary conditions to simulate pressure distribution and related fluid motion produced significant results that provided the following insights into the relationships between the osmotic pressure gradient, the associated fluid micromovement, and the deformation of the matrix. For example, it could be concluded that: 1. It is possible to model articular cartilage with the agent-based model of cellular automata and the Margolus neighbourhood rule. 2. The concept of 3D interconnected osmotic units is a viable structural model for the extracellular matrix of articular cartilage. 3. Different rules of osmotic pressure advection lead to different patterns of deformation in the cartilage matrix, enabling an insight into how this micromechanism influences macromechanical deformation. 4. When features such as the transition coefficient are changed, representing altered permeability due to changes in the concentrations of collagen and proteoglycans (i.e. degenerative conditions), the deformation process is impacted. 5. The boundary conditions also influence the relationship between the osmotic pressure gradient and fluid movement at the micro-scale level. The outcomes are important to cartilage research since they can be used to study micro-scale damage in the cartilage matrix. From this, we are able to monitor related diseases and their progression, leading to potential insight into drug-cartilage interaction for treatment. This innovative model is an incremental step towards further computational modelling approaches for cartilage research and other fluid-saturated tissues and material systems.
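A minimal two-dimensional sketch of a Margolus-neighbourhood update of the kind the thesis describes (the thesis model itself is 3D): the lattice is partitioned into 2x2 blocks whose alignment alternates each step, and a simple repartition rule moves fluid toward lower-content cells within each block. The rule and values are illustrative assumptions, not the thesis's actual repartition rules.

```python
# Illustrative 2D sketch of a Margolus-neighbourhood cellular automaton:
# the grid is split into 2x2 blocks (offset by one cell on alternate
# steps) and a mass-conserving repartition rule moves fluid toward the
# block mean, mimicking flow down a local pressure gradient. The rule is
# an assumed stand-in for the thesis's osmotic repartition rules.
import numpy as np

def margolus_step(fluid, offset, rate=0.25):
    """One Margolus sweep: partially equalise fluid within each 2x2 block."""
    f = np.roll(fluid, shift=(-offset, -offset), axis=(0, 1))
    n, m = f.shape
    blocks = f[:n - n % 2, :m - m % 2].reshape(n // 2, 2, m // 2, 2)
    mean = blocks.mean(axis=(1, 3), keepdims=True)
    blocks += rate * (mean - blocks)        # move fluid toward the block mean
    return np.roll(f, shift=(offset, offset), axis=(0, 1))

rng = np.random.default_rng(1)
fluid = rng.random((16, 16))                # fluid content per osmotic unit
for step in range(50):
    fluid = margolus_step(fluid, offset=step % 2)  # alternate block alignment

print(f"fluid spread: std {fluid.std():.4f} (started near 0.29)")
```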
Abstract:
Our daily lives are becoming more and more dependent upon smartphones due to their increased capabilities. Smartphones are used in various ways, from payment systems to assisting the lives of elderly or disabled people. Security threats for these devices are becoming increasingly dangerous since there is still a lack of proper security tools for protection. Android emerges as an open smartphone platform which allows modification even at the operating system level. Therefore, third-party developers have the opportunity to develop kernel-based low-level security tools, which is not usual for smartphone platforms. Android quickly gained popularity among smartphone developers and even beyond, since it is based on Java on top of an "open" Linux, in comparison to former proprietary platforms with very restrictive SDKs and corresponding APIs. Symbian OS, for example, which held the greatest market share among all smartphone OSs, closed critical APIs to common developers and introduced application certification, since this OS was the main target of smartphone malware in the past. In fact, more than 290 malware variants designed for Symbian OS appeared from July 2004 to July 2008. Android, in turn, promises to be completely open source. Together with the Linux-based smartphone OS OpenMoko, open smartphone platforms may attract malware writers to create malicious applications endangering critical smartphone applications and owners' privacy. In this work, we present our current results in analyzing the security of Android smartphones, with a focus on the Linux side. Our results are not limited to Android; they are also applicable to Linux-based smartphones such as the OpenMoko Neo FreeRunner. Our contribution in this work is three-fold. First, we analyze the Android framework and the Linux kernel to check security functionalities. We survey well-accepted security mechanisms and tools which can increase device security. We provide descriptions of how to adopt these security tools on the Android kernel, and provide an analysis of their overhead in terms of resource usage. As open smartphones are released and may increase their market share, similar to Symbian, they may attract the attention of malware writers. Therefore, our second contribution focuses on malware detection techniques at the kernel level. We test the applicability of existing signature-based and intrusion detection methods in the Android environment. We focus on monitoring events in the kernel; that is, identifying critical kernel, log file, file system and network activity events, and devising efficient mechanisms to monitor them in a resource-limited environment. Our third contribution involves initial results of our malware detection mechanism based on static function call analysis. We identified approximately 105 Executable and Linking Format (ELF) executables installed on the Linux side of Android. We performed a statistical analysis of the function calls used by these applications. The results of the analysis can be compared with those of newly installed applications to detect significant differences. Additionally, certain function calls indicate malicious activity. Therefore, we present a simple decision tree for deciding the suspiciousness of the corresponding application. Our results present a first step towards detecting malicious applications on Android-based devices.
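A toy sketch of the static function-call analysis described above: compare an application's call profile against a baseline and a list of indicator calls. In practice the profiles would come from parsing ELF symbol tables; here they are hard-coded, and all names and thresholds are hypothetical, not taken from the paper.

```python
# Toy sketch of static function-call analysis for flagging suspicious
# applications. Real use would parse ELF symbol tables; here call
# profiles are hard-coded dictionaries, and the indicator list and
# deviation threshold are hypothetical, not the paper's decision tree.
from collections import Counter

SUSPICIOUS_CALLS = {"fork", "execve", "ptrace"}   # assumed indicators

baseline = Counter({"read": 40, "write": 35, "open": 20, "close": 20})

def is_suspicious(call_profile):
    """Crude rule: any indicator call, or a large deviation from baseline."""
    indicator_hits = sum(call_profile[c] for c in SUSPICIOUS_CALLS)
    total = sum(call_profile.values()) or 1
    deviation = sum(abs(call_profile[k] - baseline[k])
                    for k in set(call_profile) | set(baseline)) / total
    return indicator_hits > 0 or deviation > 1.5

app = Counter({"read": 5, "execve": 3, "ptrace": 2, "write": 4})
print("suspicious" if is_suspicious(app) else "benign")
```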
Abstract:
In the increasingly competitive Australian tertiary education market, a consumer orientation is essential. This is particularly so for small regional campuses competing with larger universities in the state capitals. Campus management need to carefully monitor both the perceptions of prospective students within the catchment area, and the (dis)satisfaction levels of current students. This study reports the results of an exploratory investigation into the perceptions held of a regional campus, using two techniques that have arguably been underutilised in the education marketing literature. Repertory Grid Analysis, a technique developed almost fifty years ago, was used to identify attributes deemed salient to year 12 high school students at the time they were applying for university places. Importance-performance analysis (IPA), developed three decades ago, was then used to identify attributes that were determinant for a new cohort of first year undergraduate students. The paper concludes that group applications of Repertory Grid offer education market researchers a useful means for identifying attributes used by high school students to differentiate universities, and that IPA is a useful technique for guiding promotional decision making. In this case, the two techniques provided a quick, economical and effective snapshot of market perceptions, which can be used as a foundation for the development of an ongoing market research programme.
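For readers unfamiliar with IPA, the sketch below shows the basic quadrant assignment: each attribute's mean importance and performance ratings are compared against the grand means. Attribute names and scores are invented for illustration.

```python
# Small sketch of importance-performance analysis (IPA): each attribute is
# assigned a quadrant by comparing its mean importance and performance
# scores against the grand means. Attribute names and scores are invented.
attributes = {
    "teaching quality": (4.6, 4.1),
    "campus facilities": (3.9, 3.2),
    "course variety": (4.4, 3.0),
    "social life": (3.1, 4.0),
}  # attribute: (mean importance, mean performance), 5-point scale

imp_mean = sum(i for i, _ in attributes.values()) / len(attributes)
perf_mean = sum(p for _, p in attributes.values()) / len(attributes)

for name, (imp, perf) in attributes.items():
    quadrant = ("keep up the good work" if imp >= imp_mean and perf >= perf_mean
                else "concentrate here" if imp >= imp_mean
                else "low priority" if perf < perf_mean
                else "possible overkill")
    print(f"{name}: {quadrant}")
```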
Abstract:
Given the increasing investments being made in brand development by destination marketing organisations (DMO) since the 1990s, including rebranding and repositioning, more research is needed to enhance understanding of how to effectively monitor destination brand performance over time. This paper reports the results of a study of brand performance of a competitive set of destinations, in their most important market, between 2003 and 2012. Brand performance was measured from the perspective of consumer perceptions, based on the concept of consumer-based brand equity (CBBE). A structured questionnaire was administered to different samples in 2003, 2007 and 2012. The results indicated minimal changes in perceptions of the five destinations over the 10 year period. Due to the commonality of challenges faced by DMOs worldwide, it is suggested the CBBE hierarchy provides destination marketers with a practical tool for evaluating brand performance over time, in terms of measures of the effectiveness of past marketing communications as well as indicators of future performance. In addition, and importantly, CBBE also provides transparent accountability measures for stakeholders. While the topic of destination image has been one of the most popular in the tourism literature, there has been a paucity of research published in relation to the temporal aspect of consumer perceptions. This is a rare investigation into the measurement of perceptions of destinations over a 10 year period.