430 results for Display Modalities
Abstract:
In this paper we present a tutorial introduction to two important senses for biological and robotic systems — inertial and visual perception. We discuss the fundamentals of these two sensing modalities from a biological and an engineering perspective. Digital camera chips and micro-machined accelerometers and gyroscopes are now commodities, and when combined with today's available computing can provide robust estimates of self-motion as well as 3D scene structure, without external infrastructure. We discuss the complementarity of these sensors, describe some fundamental approaches to fusing their outputs and survey the field.
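The complementarity mentioned above is often illustrated with a single-axis complementary filter: the gyro is accurate over short horizons but drifts, while the accelerometer-derived tilt is noisy but drift-free. The following is a minimal sketch of that idea, not the paper's own method; the function name and blend constant are assumptions for illustration.

```python
def complementary_filter(gyro_rates, accel_angles, dt, alpha=0.98):
    """Fuse a gyro's angular rate (rad/s) with an accelerometer's tilt
    estimate (rad).  Integrating the gyro tracks fast motion but drifts;
    the accelerometer anchors the low-frequency part of the estimate."""
    angle = accel_angles[0]          # initialise from the drift-free sensor
    estimates = []
    for rate, accel_angle in zip(gyro_rates, accel_angles):
        # high-pass the integrated gyro, low-pass the accelerometer
        angle = alpha * (angle + rate * dt) + (1 - alpha) * accel_angle
        estimates.append(angle)
    return estimates
```

With a biased gyro on a stationary platform, pure integration drifts without bound, whereas the fused estimate settles close to the accelerometer's reading.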
Abstract:
The article described an open-source toolbox for machine vision called Machine Vision Toolbox (MVT). MVT includes more than 60 functions including image file reading and writing, acquisition, display, filtering, blob, point and line feature extraction, mathematical morphology, homographies, visual Jacobians, camera calibration, and color space conversion. MVT can be used for research into machine vision but is also versatile enough to be usable for real-time work and even control. MVT, combined with MATLAB and a modern workstation computer, is a useful and convenient environment for the investigation of machine vision algorithms. The article illustrated the use of a subset of toolbox functions for some typical problems and described MVT operations including the simulation of a complete image-based visual servo system.
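MVT itself is a MATLAB toolbox and its API is not reproduced here. As a language-neutral illustration of one of the listed operations, blob feature extraction reduces at its core to connected-component labelling of a binary image. A minimal sketch in Python (the function name and the 4-connectivity choice are assumptions for illustration):

```python
def count_blobs(img):
    """Count 4-connected blobs of nonzero pixels in a binary image
    (a list of rows), using an iterative flood fill."""
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    blobs = 0
    for r in range(h):
        for c in range(w):
            if img[r][c] and not seen[r][c]:
                blobs += 1                      # new component found
                stack = [(r, c)]
                seen[r][c] = True
                while stack:                    # flood-fill the component
                    y, x = stack.pop()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and img[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
    return blobs
```

Real toolboxes additionally report per-blob features such as area, centroid and bounding box, which fall out of the same traversal.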
Abstract:
Intelligent surveillance systems typically use a single visual spectrum modality for their input. These systems work well in controlled conditions, but often fail when lighting is poor, or when environmental effects such as shadows, dust or smoke are present. Thermal spectrum imagery is not as susceptible to environmental effects; however, thermal imaging sensors are more sensitive to noise and they are only grayscale, making distinguishing between objects difficult. Several approaches to combining the visual and thermal modalities have been proposed, but they are limited by the assumption that both modalities are performing equally well. When one modality fails, existing approaches are unable to detect the drop in performance and disregard the underperforming modality. In this paper, a novel middle fusion approach for combining visual and thermal spectrum images for object tracking is proposed. Motion and object detection are performed on each modality, and the object detection results for each modality are fused based on the current performance of each modality. Modality performance is determined by comparing the number of objects tracked by the system with the number detected by each mode, with a small allowance made for objects entering and exiting the scene. The tracking performance of the proposed fusion scheme is compared with the performance of the visual and thermal modes individually, and with a baseline middle fusion scheme. Improvement in tracking performance using the proposed fusion approach is demonstrated. The proposed approach is also shown to be able to detect the failure of an individual modality and disregard its results, ensuring performance is not degraded in such situations.
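The performance-weighting idea, comparing each modality's detection count against the tracker's object count with a small allowance for scene entries and exits, can be sketched as below. The paper's exact scoring formula is not given in the abstract, so the functions, the linear score and the cut-off threshold here are illustrative assumptions.

```python
def modality_score(num_tracked, num_detected, allowance=1):
    """Agreement between the tracker's object count and one modality's
    detection count, tolerating a small allowance for objects entering
    or leaving the scene.  Returns a weight in [0, 1]."""
    mismatch = max(0, abs(num_tracked - num_detected) - allowance)
    if num_tracked == 0:
        return 1.0 if mismatch == 0 else 0.0
    return max(0.0, 1.0 - mismatch / num_tracked)

def usable_modalities(num_tracked, counts, min_score=0.5):
    """Keep only modalities whose current score clears a threshold, so a
    failed modality is disregarded rather than fused into the result."""
    return {name: modality_score(num_tracked, n)
            for name, n in counts.items()
            if modality_score(num_tracked, n) >= min_score}
```

A modality whose detector collapses (e.g. thermal reporting zero objects while five are tracked) scores low and is excluded from fusion until it recovers.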
Abstract:
Six Sigma provides a framework for quality improvement and business excellence. Introduced in the 1980s in manufacturing, the concept of Six Sigma has gained popularity in service organizations. After initial success in healthcare and banking, Six Sigma has gradually gained traction in other types of service industries, including hotels and lodging. Starwood Hotels and Resorts was the first hospitality giant to embrace Six Sigma. In 2001, Starwood adopted the method to develop innovative, customer-focused solutions and to transfer these solutions throughout the global organization. To analyze Starwood's use of Six Sigma, the authors collected data from articles, interviews, presentations and speeches published in magazines, newspapers and Web sites; these sources provided corroborating detail and a basis for inference. Financial metrics can explain the success of Six Sigma in any organization, and the research uncovered no shortage of examples of Starwood's success resulting from Six Sigma project metrics.
Abstract:
Purpose of review: To examine the relationship between energy intake, appetite control and exercise, with particular reference to longer term exercise studies. This approach is necessary when exploring the benefits of exercise for weight control, as changes in body weight and energy intake are variable and reflect diversity in weight loss. Recent findings: Recent evidence indicates that longer term exercise is characterized by a highly variable response in eating behaviour. Individuals display susceptibility or resistance to exercise-induced weight loss, with changes in energy intake playing a key role in determining the degree of weight loss achieved. Marked differences in hunger and energy intake exist between those who are capable of tolerating periods of exercise-induced energy deficit, and those who are not. Exercise-induced weight loss can increase the orexigenic drive in the fasted state, but for some this is offset by improved postprandial satiety signalling. Summary: The biological and behavioural responses to acute and long-term exercise are highly variable, and these responses interact to determine the propensity for weight change. For some people, long-term exercise stimulates compensatory increases in energy intake that attenuate weight loss. However, favourable changes in body composition and health markers still exist in the absence of weight loss. The physiological mechanisms that confer susceptibility to compensatory overconsumption still need to be determined.
Abstract:
In this research I have examined how ePortfolios can be designed for Music postgraduate study through a practice-led research enquiry. This process involved designing two Web 2.0 ePortfolio systems for a group of five postgraduate music research students. The design process revolved around the application of an iterative methodology called Software Development as Research (SoDaR) that seeks to simultaneously develop design and pedagogy. The approach to designing these ePortfolio systems applied four theoretical protocols to examine the use of digitised artefacts in ePortfolio systems to enable a dynamic and inclusive dialogue around representations of the students' work. The research and design process involved an analysis of existing software and literature, with a focus upon identifying the affordances of available Web 2.0 software and the applications of these ideas within 21st Century life. The five postgraduate music students each posed different needs in relation to the management of digitised artefacts and the communication of their work amongst peers, supervisors and public display. An ePortfolio was developed for each of them that was flexible enough to address their needs within the university setting. However, in this first SoDaR iteration data-gathering phase I identified aspects of the university context that presented a negative case, which impacted upon the design and usage of the ePortfolios and prevented uptake. Whilst the portfolio itself functioned effectively, the university policies and technical requirements prevented serious use. The negative case analysis revealed that the Access and Control and the Implementation, Technical and Policy Constraints protocols were limiting user uptake.
Feedback from the semi-structured interviews carried out as part of this study revealed that, whilst the participants did not use the ePortfolio system I designed, each student was employing Web 2.0 social networking and storage processes in their lives and research. In the subsequent iterations I then designed a more ‘ideal’ system that could be applied outside of the university context and that draws upon the employment of these resources. In conclusion I suggest recommendations about ePortfolio design that consider what the applications of the theoretical protocols reveal about creative arts settings. The transferability of these recommendations is of course dependent upon the reapplication of the theoretical protocols in a new context. To address the mobility of ePortfolio design between institutions and wider settings I have also designed a prototype for a business-card-sized USB portal for the artist's ePortfolio. This research project is not a static one; it stands as an evolving design for a Web 2.0 ePortfolio that seeks to respond to users' needs, institutional and professional contexts, and the development of software that can be incorporated within the design. What it potentially provides to creative artists is an opportunity to have a dialogue about art, with artefacts of the artist's products and processes in that discussion.
Abstract:
World economies increasingly demand reliable and economical power supply and distribution. To achieve this aim the majority of power systems are becoming interconnected, with several power utilities supplying one large network. One problem that occurs in a large interconnected power system is the regular occurrence of system disturbances, which can result in the creation of intra-area oscillating modes. These modes can be regarded as the transient responses of the power system to excitation, which are generally characterised as decaying sinusoids. For a power system operating ideally, these transient responses would have a “ring-down” time of 10-15 seconds. Sometimes equipment failures disturb the ideal operation of power systems, and oscillating modes with ring-down times greater than 15 seconds arise. The larger settling times associated with such “poorly damped” modes cause substantial power flows between generation nodes, resulting in significant physical stresses on the power distribution system. If these modes are not just poorly damped but “negatively damped”, catastrophic failures of the system can occur. To ensure the stability and security of large power systems, the potentially dangerous oscillating modes generated by disturbances (such as equipment failure) must be quickly identified, and the power utility must then apply appropriate damping control strategies. In power system monitoring there exist two facets of critical interest. The first is the estimation of modal parameters for a power system in normal, stable operation. The second is the rapid detection of any substantial changes to this normal, stable operation (because of equipment breakdown, for example). Most work to date has concentrated on the first of these two facets, i.e. on modal parameter estimation. Numerous modal parameter estimation techniques have been proposed and implemented, but all have limitations [1-13].
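The decaying-sinusoid characterisation above can be made concrete: a mode with damping factor σ has envelope e^(−σt), so the time for the response to ring down to a given fraction of its initial amplitude follows directly. This is a generic sketch, not the thesis's model; the 2% settling criterion is an assumed convention.

```python
import math

def mode_response(t, amp, sigma, freq_hz, phase=0.0):
    """Decaying-sinusoid response of one oscillatory mode:
    amp * exp(-sigma*t) * cos(2*pi*f*t + phase)."""
    return amp * math.exp(-sigma * t) * math.cos(2 * math.pi * freq_hz * t + phase)

def ring_down_time(sigma, fraction=0.02):
    """Time for the envelope exp(-sigma*t) to decay to `fraction`
    of the initial amplitude."""
    return math.log(1.0 / fraction) / sigma
```

For example, a damping factor of about 0.3 s⁻¹ gives a 2% ring-down time of roughly 13 seconds, inside the 10-15 second band quoted for ideal operation; smaller σ pushes the settling time beyond 15 seconds into "poorly damped" territory.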
One of the key limitations of all existing parameter estimation methods is the fact that they require very long data records to provide accurate parameter estimates. This is a particularly significant problem after a sudden detrimental change in damping. One simply cannot afford to wait long enough to collect the large amounts of data required for existing parameter estimators. Motivated by this gap in the current body of knowledge and practice, the research reported in this thesis focuses heavily on rapid detection of changes (i.e. on the second facet mentioned above). This thesis reports on a number of new algorithms which can rapidly flag whether or not there has been a detrimental change to a stable operating system. It will be seen that the new algorithms enable sudden modal changes to be detected within quite short time frames (typically about 1 minute), using data from power systems in normal operation. The new methods reported in this thesis are summarised below. The Energy Based Detector (EBD): The rationale for this method is that the modal disturbance energy is greater for lightly damped modes than it is for heavily damped modes (because the latter decay more rapidly). Sudden changes in modal energy, then, imply sudden changes in modal damping. Because the method relies on data from power systems in normal operation, the modal disturbances are random. Accordingly, the disturbance energy is modelled as a random process (with the parameters of the model being determined from the power system under consideration). A threshold is then set based on the statistical model. The energy method is very simple to implement and is computationally efficient. It is, however, only able to determine whether or not a sudden modal deterioration has occurred; it cannot identify which mode has deteriorated. For this reason the method is particularly well suited to smaller interconnected power systems that involve only a single mode. 
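The Energy Based Detector described above can be sketched as a windowed-energy statistic with a threshold fitted to normal operation. The thesis models the disturbance energy as a random process; the simple mean-plus-k-standard-deviations threshold below is an illustrative stand-in for its statistical model, and all names are assumptions.

```python
import statistics

def windowed_energy(signal, win):
    """Energy (sum of squares) of consecutive non-overlapping windows."""
    return [sum(x * x for x in signal[i:i + win])
            for i in range(0, len(signal) - win + 1, win)]

def energy_detector(signal, win, baseline_windows, k=3.0):
    """Flag windows whose energy exceeds a threshold fitted on an
    initial stretch of normal (stable) operation."""
    energies = windowed_energy(signal, win)
    base = energies[:baseline_windows]
    mu, sd = statistics.mean(base), statistics.stdev(base)
    thresh = mu + k * sd
    return [i for i, e in enumerate(energies) if e > thresh]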
Optimal Individual Mode Detector (OIMD): As discussed in the previous paragraph, the energy detector can only determine whether or not a change has occurred; it cannot flag which mode is responsible for the deterioration. The OIMD seeks to address this shortcoming. It uses optimal detection theory to test for sudden changes in individual modes. In practice, one can have an OIMD operating for all modes within a system, so that changes in any of the modes can be detected. Like the energy detector, the OIMD is based on a statistical model and a subsequently derived threshold test. The Kalman Innovation Detector (KID): This detector is an alternative to the OIMD. Unlike the OIMD, however, it does not explicitly monitor individual modes. Rather, it relies on a key property of a Kalman filter, namely that the Kalman innovation (the difference between the estimated and observed outputs) is white as long as the Kalman filter model is valid. A Kalman filter model is set to represent a particular power system. If some event in the power system (such as equipment failure) causes a sudden change to the power system, the Kalman model will no longer be valid and the innovation will no longer be white. Furthermore, if there is a detrimental system change, the innovation spectrum will display strong peaks at frequency locations associated with the changes. Hence the innovation spectrum can be monitored both to set off an “alarm” when a change occurs and to identify which modal frequency has given rise to the change. The threshold for alarming is based on the simple Chi-Squared PDF for a normalised white noise spectrum [14, 15]. While the method can identify the mode which has deteriorated, it does not necessarily indicate whether there has been a frequency or damping change. The PPM discussed next can monitor frequency changes and so can provide some discrimination in this regard.
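The whiteness test behind the KID can be sketched as follows. This assumes the standard result that, for a white innovation sequence of variance σ², twice each periodogram ordinate divided by σ² follows a Chi-Squared distribution with two degrees of freedom, whose upper-tail threshold at level α is −2 ln α. The thesis's exact statistic is not reproduced here, and the O(N²) DFT and function names are illustrative.

```python
import cmath
import math

def periodogram(x):
    """Periodogram ordinates |X_k|^2 / N for bins 1 .. N/2-1
    (DC and Nyquist are skipped: their chi-squared dof differ)."""
    n = len(x)
    out = []
    for k in range(1, n // 2):
        s = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        out.append(abs(s) ** 2 / n)
    return out

def innovation_alarm(innov, sigma2, alpha=0.001):
    """If the Kalman model is valid the innovation is white and
    2*I_k/sigma2 ~ chi-squared(2) at every bin; a bin beyond the
    upper-tail threshold flags a model mismatch at that frequency."""
    thresh = -2.0 * math.log(alpha)     # chi2(2) tail: P(X > x) = exp(-x/2)
    return [k for k, Ik in enumerate(periodogram(innov), start=1)
            if 2.0 * Ik / sigma2 > thresh]
```

A strong sinusoidal component in the innovation (the signature of a deteriorated mode) concentrates energy in one bin and trips the alarm there, identifying the associated modal frequency.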
The Polynomial Phase Method (PPM): In [16] the cubic phase (CP) function was introduced as a tool for revealing frequency related spectral changes. This thesis extends the cubic phase function to a generalised class of polynomial phase functions which can reveal frequency related spectral changes in power systems. A statistical analysis of the technique is performed. When applied to power system analysis, the PPM can provide knowledge of sudden shifts in frequency through both the new frequency estimate and the polynomial phase coefficient information. This knowledge can then be cross-referenced with other detection methods to provide improved detection benchmarks.
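A minimal numerical sketch of the cubic phase function itself (not the thesis's generalised polynomial phase class): it correlates the lag product s[n+m]·s[n−m] against quadratic-phase kernels e^(−jΩm²), and for a polynomial-phase signal it peaks at Ω equal to twice the local frequency-rate coefficient. The search grid and names here are illustrative.

```python
import cmath

def cp_function(s, n, omegas, max_lag):
    """Cubic phase function evaluated at time index n over a grid of
    frequency-rate candidates.  For s[t] = exp(j*(a0 + a1*t + a2*t^2 + a3*t^3))
    the magnitude peaks at omega = 2*(a2 + 3*a3*n)."""
    vals = []
    for om in omegas:
        acc = sum(s[n + m] * s[n - m] * cmath.exp(-1j * om * m * m)
                  for m in range(max_lag + 1))
        vals.append(abs(acc))
    return vals
```

For a pure quadratic chirp (a3 = 0) every term aligns in phase at Ω = 2·a2, so the magnitude at the true bin equals the number of lags summed.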
Abstract:
The primary purpose of this research was to examine individual differences in learning from worked examples. By integrating cognitive style theory and cognitive load theory, it was hypothesised that an interaction existed between individual cognitive style and the structure and presentation of worked examples in their effect upon subsequent student problem solving. In particular, it was hypothesised that Analytic-Verbalisers, Analytic-Imagers, and Wholist-Imagers would perform better on a posttest after learning from structured-pictorial worked examples than after learning from unstructured worked examples. For Analytic-Verbalisers it was reasoned that the cognitive effort required to impose structure on unstructured worked examples would hinder learning. Alternatively, it was expected that Wholist-Verbalisers would display superior performances after learning from unstructured worked examples than after learning from structured-pictorial worked examples. The images of the structured-pictorial format, incongruent with the Wholist-Verbaliser style, would be expected to split attention between the text and the diagrams. The information contained in the images would also be a source of redundancy and not easily ignored in the integrated structured-pictorial format. Despite a number of authors having emphasised the need to include individual differences as a fundamental component of problem solving within domain-specific subjects such as mathematics, few studies have attempted to investigate a relationship between mathematical or science instructional method, cognitive style, and problem solving. Cognitive style theory proposes that the structure and presentation of learning material is likely to affect each of the four cognitive styles differently. No study could be found which has used Riding's (1997) model of cognitive style as a framework for examining the interaction between the structural presentation of worked examples and an individual's cognitive style.
269 Year 12 Mathematics B students from five urban and rural secondary schools in Queensland, Australia participated in the main study. A factorial (three treatments by four cognitive styles) between-subjects multivariate analysis of variance indicated a statistically significant interaction. As the difficulty of the posttest components increased, the empirical evidence supporting the research hypotheses became more pronounced. The rigour of the study's theoretical framework was further tested by the construction of a measure of instructional efficiency, based on an index of cognitive load, and the construction of a measure of problem-solving efficiency, based on problem-solving time. The consistent empirical evidence within this study that learning from worked examples is affected by an interaction of cognitive style and the structure and presentation of the worked examples emphasises the need to consider individual differences among senior secondary mathematics students to enhance educational opportunities. Implications for teaching and learning are discussed and recommendations for further research are outlined.
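The study's efficiency indices are described only at a high level above. One widely used formulation of instructional efficiency of this kind combines standardised performance with a standardised mental-effort (cognitive load) rating as E = (z_performance − z_effort) / √2; it is assumed here purely for illustration and is not necessarily the measure the study constructed.

```python
import math
import statistics

def z_scores(xs):
    """Standardise a list of scores to zero mean and unit (sample) SD."""
    mu, sd = statistics.mean(xs), statistics.stdev(xs)
    return [(x - mu) / sd for x in xs]

def instructional_efficiency(performance, effort):
    """E = (z_performance - z_effort) / sqrt(2): high performance attained
    with low rated mental effort yields high efficiency; high effort for
    low performance yields negative efficiency."""
    zp, ze = z_scores(performance), z_scores(effort)
    return [(p - e) / math.sqrt(2) for p, e in zip(zp, ze)]
```

An analogous problem-solving efficiency can be formed by substituting standardised solution time for the effort rating.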
Abstract:
Basic competencies in assessing and treating substance use disorders should be core to the training of any clinical psychologist, because of the high frequency of risky or problematic substance use in the community, and its high co-occurrence with other problems. Skills in establishing trust and a therapeutic alliance are particularly important in addiction, given the stigma and potential for legal sanctions that surround it. The knowledge and skills of all clinical practitioners should be sufficient to allow valid screening and diagnosis of substance use disorders, accurate estimation of consumption and a basic functional analysis. Practitioners should also be able to undertake brief interventions including motivational interviews, and appropriately apply generic interventions such as problem solving or goal setting to addiction. Furthermore, clinical psychologists should have an understanding of the nature, evidence base and indications for biochemical assays, pharmacotherapies and other medical treatments, and ways these can be integrated with psychological practice. Specialists in addiction should have more sophisticated competencies in each of these areas. They need to have a detailed understanding of current addiction theories and basic and applied research, be able to undertake and report on a detailed psychological assessment, and display expert competence in addiction treatment. These skills should include an ability to assess and manage complex or co-occurring problems, to adapt interventions to the needs of different groups, and to assist people who have not responded to basic treatments. They should also be able to provide consultation to others, undertake evaluations of their practice, and monitor and evaluate emerging research data in the field.
Abstract:
Purpose: The aim was to document contact lens prescribing trends in Australia between 2000 and 2009. ---------- Methods: A survey of contact lens prescribing trends was conducted each year between 2000 and 2009. Australian optometrists were asked to provide information relating to 10 consecutive contact lens fittings between January and March each year. ---------- Results: Over the 10-year survey period, 1,462 practitioners returned survey forms representing a total of 13,721 contact lens fittings. The mean age (± SD) of lens wearers was 33.2 ± 13.6 years and 65 per cent were female. Between 2006 and 2009, rigid lens new fittings decreased from 18 to one per cent. Low water content lenses reduced from 11.5 to 3.2 per cent of soft lens fittings between 2000 and 2008. Between 2005 and 2009, toric lenses and multifocal lenses represented 26 and eight per cent, respectively, of all soft lenses fitted. Daily disposable, one- to two-week replacement and monthly replacement lenses accounted for 11.6, 30.0 and 46.5 per cent of all soft lens fittings over the survey period, respectively. The proportion of new soft fittings and refittings prescribed as extended wear has generally declined throughout the past decade. Multi-purpose lens care solutions dominate the market. Rigid lenses and monthly replacement soft lenses are predominantly worn on a full-time basis, whereas daily disposable soft lenses are mainly worn part-time. ---------- Conclusions: This survey indicates that technological advances, such as the development of new lens materials, manufacturing methods and lens designs, and the availability of various lens replacement options, have had a significant impact on the contact lens market during the first decade of the 21st Century.
Abstract:
Thoroughly revised and updated, this popular book provides a comprehensive yet easy-to-read guide to modern contact lens practice. Beautifully re-designed in a clean, contemporary layout, this second edition presents relevant and up-to-date information in a systematic manner, with a logical flow of subject matter from front to back. This book wonderfully captures the ‘middle ground’ in the contact lens field … somewhere between a dense research-based tome and a basic fitting guide. As such, it is ideally suited for both students and general eye care practitioners who require a practical, accessible and uncluttered account of the contact lens field. Contents: Part 1, Introduction: Historical perspective; The anterior eye; Visual optics; Clinical instruments. Part 2, Soft contact lenses: Soft lens materials; Soft lens manufacture; Soft lens optics; Soft lens measurement; Soft lens design and fitting; Soft toric lens design and fitting; Soft lens care systems. Part 3, Rigid contact lenses: Rigid lens materials; Rigid lens manufacture; Rigid lens optics; Rigid lens measurement; Rigid lens design and fitting; Rigid toric lens design and fitting; Rigid lens care systems. Part 4, Lens replacement modalities: Unplanned lens replacement; Daily soft lens replacement; Planned soft lens replacement; Planned rigid lens replacement. Part 5, Special lenses and fitting considerations: Scleral lenses; Tinted lenses; Presbyopia; Continuous wear; Sport; Keratoconus; High ametropia; Paediatric fitting; Therapeutic applications; Post-refractive surgery; Post-keratoplasty; Orthokeratology; Diabetes. Part 6, Patient examination and management: History taking; Preliminary examination; Patient education; Aftercare; Complications; Digital imaging; Compliance; Practice management. Appendices. Index.
Abstract:
In this paper we discuss an advanced, 3D groundwater visualisation and animation system that allows scientists, government agencies and community groups to better understand the groundwater processes that affect community planning and decision-making. The system is unique in that it has been designed to optimise community engagement. Although it incorporates a powerful visualisation engine, this open-source system can be freely distributed and boasts a simple user interface, allowing individuals to run and investigate the models on their own PCs and gain intimate knowledge of the groundwater systems. The initial version of the Groundwater Visualisation System (GVS v1.0) was developed for a coastal delta setting (Bundaberg, QLD), and then applied to a basalt catchment area (Obi Obi Creek, Maleny, QLD). Several major enhancements have been developed to produce higher quality visualisations, including display of more types of data, support for larger models and improved user interaction. The graphics and animation capabilities have also been enhanced, notably the display of boreholes, depth logs and time-series water level surfaces. The GVS software remains under continual development and improvement.