925 results for "Graphic consistency"
Abstract:
Data collection using Autonomous Underwater Vehicles (AUVs) is increasing in importance within the oceanographic research community. Unlike traditional moored or static platforms, mobile sensors require intelligent planning strategies to manoeuvre through the ocean. However, the ability to navigate to high-value locations and collect data with specific scientific merit is worth the planning effort. In this study, we examine the use of ocean model predictions to determine the locations to be visited by an AUV, and to aid in planning the trajectory that the vehicle executes during the sampling mission. The objectives are: a) to provide near-real-time, in situ measurements to a large-scale ocean model to increase the skill of future predictions, and b) to utilize ocean model predictions as a component in an end-to-end autonomous prediction and tasking system for aquatic, mobile sensor networks. We present an algorithm designed to generate paths for AUVs to track a dynamically evolving ocean feature utilizing ocean model predictions. This builds on previous work in this area by incorporating the predicted current velocities into the path planning to assist in solving the 3-D motion planning problem of steering an AUV between two selected locations. We present simulation results for tracking a fresh water plume by use of our algorithm. Additionally, we present experimental results from field trials that test the skill of the model used as well as the incorporation of the model predictions into an AUV trajectory planner. These results indicate a modest, but measurable, improvement in surfacing error when the model predictions are incorporated into the planner.
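The core geometric step of incorporating predicted currents into point-to-point planning can be sketched as follows: given a desired ground track and a predicted current, solve for the heading whose through-water velocity cancels the cross-track component of the current. The function below is a minimal 2-D illustration of that idea under invented names and values, not the planner described in the abstract.

```python
import math

def heading_through_current(dx, dy, current_u, current_v, speed):
    """Heading (degrees, 0 = +y axis, clockwise positive) that makes the
    vehicle's ground track point along (dx, dy) despite a predicted current
    (current_u, current_v). `speed` is the through-water speed.
    Illustrative sketch only; not the thesis's planner."""
    d = math.hypot(dx, dy)
    tx, ty = dx / d, dy / d                  # unit vector along desired track
    cross = current_u * ty - current_v * tx  # current component across the track
    if abs(cross) > speed:
        raise ValueError("predicted current exceeds vehicle speed; track infeasible")
    sin_a = cross / speed                    # crab angle cancels cross-track drift
    cos_a = math.sqrt(1.0 - sin_a * sin_a)
    vx = speed * (cos_a * tx - sin_a * ty)   # through-water velocity components
    vy = speed * (cos_a * ty + sin_a * tx)
    return math.degrees(math.atan2(vx, vy))
```

For example, holding a due-north track against a 1 m/s eastward current at 2 m/s through-water speed requires crabbing 30 degrees west of north; adding the current to the resulting through-water velocity recovers a purely northward ground track.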
Abstract:
There are many reasons why interface design for interactive courseware fails to support quality learning experiences. The most acknowledged causes are the level of interactivity, the availability of interfaces for end users to interact with, and a lack of deep knowledge among designers about the role of interface design in the development process. As creators of interactive courseware, developers generally expect the resources they produce to be effective, accurate and robust. However, developers rarely have the opportunity to create good interfaces, given the demands on time, money and skill involved; the challenges they face in interface design development therefore cannot be underestimated. Their perspective on interactive courseware is important to ensure that both the material and the features of the courseware can facilitate teaching and learning activity. With this context in mind, this paper highlights the challenges faced by Malaysian developers, drawn from data gathered in ten face-to-face interviews. It discusses the perspectives of Malaysian developers involved in the development of interface design for interactive courseware for the Smart School Project. In particular, the challenges highlighted in creating good interfaces are presented within the constraints of time, curriculum demand, and the competencies of the development team.
Abstract:
The 27-item Intolerance of Uncertainty Scale (IUS) has become one of the most frequently used measures of intolerance of uncertainty. More recently, an abridged, 12-item version of the IUS has been developed. The current research used clinical (n = 50) and non-clinical (n = 56) samples to examine and compare the psychometric properties of both versions of the IUS. The two scales showed good internal consistency at both the total and subscale level and had satisfactory test-retest reliability. Both versions were correlated with worry and trait anxiety and had satisfactory concurrent validity. Significant differences between the scores of the clinical and non-clinical samples supported discriminant validity. Predictive validity was also supported for the two scales. Total scores, in the case of the clinical sample, and a subscale, in the case of the non-clinical sample, significantly predicted pathological worry and trait anxiety. Overall, clinicians and researchers can use either version of the IUS with confidence, given their sound psychometric properties.
Abstract:
Background: In response to the need for more comprehensive quality assessment within Australian residential aged care facilities, the Clinical Care Indicator (CCI) Tool was developed to collect outcome data as a means of making inferences about quality. A national trial of its effectiveness and a Brisbane-based trial of its use within the quality improvement context determined that the CCI Tool represented a potentially valuable addition to the Australian aged care system. This document describes the next phase in the CCI Tool's development, the aims of which were to establish validity and reliability of the CCI Tool, and to develop quality indicator thresholds (benchmarks) for use in Australia. The CCI Tool is now known as the ResCareQA (Residential Care Quality Assessment). Methods: The study aims were achieved through a combination of quantitative data analysis and expert panel consultations using a modified Delphi process. The expert panel consisted of experienced aged care clinicians, managers, and academics; they were initially consulted to determine face and content validity of the ResCareQA, and later to develop thresholds of quality. To analyse its psychometric properties, ResCareQA forms were completed for all residents (N=498) of nine aged care facilities throughout Queensland. Kappa statistics were used to assess inter-rater and test-retest reliability, and Cronbach's alpha coefficient was calculated to determine internal consistency. For concurrent validity, equivalent items on the ResCareQA and the Resident Classification Scale (RCS) were compared using Spearman's rank order correlations, while discriminative validity was assessed using the known-groups technique, comparing ResCareQA results between groups with differing care needs, as well as between male and female residents.
Rank-ordered facility results for each clinical care indicator (CCI) were circulated to the panel; upper and lower thresholds for each CCI were nominated by panel members and refined through a Delphi process. These thresholds indicate excellent care at one extreme and questionable care at the other. Results: Minor modifications were made to the assessment, and it was renamed the ResCareQA. Agreement on its content was reached after two Delphi rounds; the final version contains 24 questions across four domains, enabling generation of 36 CCIs. Both test-retest and inter-rater reliability were sound, with median kappa values of 0.74 (test-retest) and 0.91 (inter-rater); internal consistency was not as strong, with a Cronbach's alpha of 0.46. Because the ResCareQA does not provide a single combined score, comparisons for concurrent validity were made with the RCS on an item-by-item basis, with most resultant correlations being quite low. Discriminative validity analyses, however, revealed highly significant differences in total number of CCIs between high-care and low-care groups (t199=10.77, p<0.001), while the differences between male and female residents were not significant (t414=0.56, p=0.58). Clinical outcomes varied both within and between facilities; agreed upper and lower thresholds were finalised after three Delphi rounds. Conclusions: The ResCareQA provides a comprehensive, easily administered means of monitoring quality in residential aged care facilities that can be reliably used on multiple occasions. The relatively modest internal consistency score was likely due to the multi-factorial nature of quality, and the absence of an aggregate result for the assessment.
Measurement of concurrent validity proved difficult in the absence of a gold standard, but the sound discriminative validity results suggest that the ResCareQA has acceptable validity and could be confidently used as an indication of care quality within Australian residential aged care facilities. The thresholds, while preliminary due to the small sample size, enable users to make judgements about quality within and between facilities. Thus, it is recommended that the ResCareQA be adopted for wider use.
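For readers unfamiliar with the internal-consistency statistic reported above, Cronbach's alpha compares the summed item variances with the variance of the total score. The toy implementation below illustrates the formula only; the scores in the test are invented and unrelated to the ResCareQA trial.

```python
def cronbach_alpha(items):
    """Cronbach's alpha for k items scored on the same n respondents.
    `items` is a list of k lists, each of length n. Toy sketch for
    illustration; not the study's analysis code."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance, used consistently for every term
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    sum_item_vars = sum(var(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return (k / (k - 1)) * (1.0 - sum_item_vars / var(totals))
```

Perfectly parallel items give alpha = 1, while items tapping unrelated facets of a multi-factorial construct, as the authors suggest is the case for quality of care, pull alpha down even when each item is individually reliable.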
Abstract:
This article investigates virtual reality representations of performance in London’s late sixteenth-century Rose Theatre, a venue that, by means of current technology, can once again challenge perceptions of space, performance, and memory. The VR model of The Rose represents a virtual recreation of this venue in as much detail as possible and attempts to recover graphic demonstrations of the trace memories of the performance modes of the day. The VR model is based on accurate archeological and theatre historical records and is easy to navigate. The introduction of human figures onto The Rose’s stage via motion capture allows us to explore the relationships between space, actor and environment. The combination of venue and actors facilitates a new way of thinking about how the work of early modern playwrights can be stored and recalled. This virtual theatre is thus activated to intersect productively with contemporary studies in performance; as such, our paper provides a perspective on and embodiment of the relation between technology, memory and experience. It is, at its simplest, a useful archiving project for theatrical history, but it is directly relevant to contemporary performance practice as well. Further, it reflects upon how technology and ‘re-enactments’ of sorts mediate the way in which knowledge and experience are transferred, and even what may be considered ‘knowledge.’ Our work provides opportunities to begin addressing what such intermedial confrontations might produce for ‘remembering, experiencing, thinking and imagining.’ We contend that these confrontations will enhance live theatre performance rather than impeding or disrupting contemporary performance practice. Our ‘paper’ is in the form of a video which covers the intellectual contribution while also permitting a demonstration of the interventions we are discussing.
Abstract:
Louis Nowra wrote 'Radiance' especially for the three actors who performed it in the play’s premiere season at Belvoir Street Theatre in September 1993. And the Currency Press playscript / programme produced for that season foregrounds these three performers – Rachael Maza, Lydia Miller and Rhoda Roberts – in such a way that the usual distinction between dramatis personae and the actors who play them is considerably diminished. Both the blurb on the back cover and Nowra’s introduction emphasise this special relationship between text and actors, but it is the front cover shot which particularly reflects the conjunction between the two. Rather than depicting a scene from performance, or a ‘graphic’ suggesting something of the play’s thematic content, the front cover of Radiance features the three actors in a posed promotional shot. Arms joined warmly, lovingly, about each other’s waists, bodies turned away from but faces towards the camera, it is the actors we see, not their characters. It’s a very joyful image; they’re positively beaming. Radiant. They look as if they could really be the three half-sisters they portray, except that such moments of blithe sorority are just about non-existent in the play.
Abstract:
This paper examines current teaching practice within the context of the Bachelor of Design (Fashion) programme at AUT University and compares it to the approach adopted in previous years. In recent years, staff on the Bachelor of Design (Fashion) adopted a holistic approach to the assessment of design projects, similar to the successful ideas and methods put forward by Stella Lange at the FINZ conference, 2005. Prior to adopting this holistic approach, the teaching culture at AUT University was modular and divorced the development of conceptual design ideas from the technical processes of patternmaking and garment construction, thus limiting the creative potential of integrated project work. Fashion Design is not just about drawing pretty pictures; rather, it is an entire process that encapsulates conceptual design ideas and technical processes within the context of a target market. Because Fashion at AUT sits under the umbrella of a wider Bachelor of Design, the programme must encourage a more serious view of Fashion and Fashion Design as a whole. In the development of the Bachelor of Design degree at AUT, the university recognised that design education would be best served by an inclusive approach. At inception, Core Studio and Core Theory papers formed the first semester of the programme across the discipline areas of Fashion, Spatial Design, Graphic Design and Digital Design. These core papers reinforce the reality that there is a common skill set that transcends all design disciplines, with the differentiation between disciplines being determined by the techniques and processes they adopt. Studio-based teaching within the scope of a major design project was recognised and introduced some time ago for students in their graduating year; however, it was also expected that by year 3 students had amassed the basic skills required to work in this way. The prevailing opinion was that these basic skills were best taught through a modular approach.
Prior attempts to manage design project delivery leant towards deconstructing the newly formed integrated papers in order to ensure key technical skills were covered in enough depth. So, whilst design projects have played an integral part in the delivery of fashion design across the year levels, the earlier projects were timetabled by discipline and only loosely connected. This paper discusses how the holistic approach to assessment must be coupled with an integrated approach to delivery. The methods and processes used are demonstrated, and some recently trialled developments are shown to have achieved an integrated approach in both delivery and assessment.
Abstract:
This paper investigates virtual reality representations of performance in London’s late sixteenth-century Rose Theatre, a venue that, by means of current technology, can once again challenge perceptions of space, performance, and memory. The VR model of The Rose becomes a Camillo device in that it represents a virtual recreation of this venue in as much detail as possible and attempts to recover graphic demonstrations of the trace memories of the performance modes of the day. The VR model is based on accurate archeological and theatre historical records and is easy to navigate. The introduction of human figures onto The Rose’s stage via motion capture allows us to explore the relationships between space, actor and environment. The combination of venue and actors facilitates a new way of thinking about how the work of early modern playwrights can be stored and recalled. This virtual theatre is thus activated to intersect productively with contemporary studies in performance; as such, our paper provides a perspective on and embodiment of the relation between technology, memory and experience. It is, at its simplest, a useful archiving project for theatrical history, but it is directly relevant to contemporary performance practice as well. Further, it reflects upon how technology and ‘re-enactments’ of sorts mediate the way in which knowledge and experience are transferred, and even what may be considered ‘knowledge.’ Our work provides opportunities to begin addressing what such intermedial confrontations might produce for ‘remembering, experiencing, thinking and imagining.’ We contend that these confrontations will enhance live theatre performance rather than impeding or disrupting contemporary performance practice. This paper intersects with the CFP’s ‘Performing Memory’ and ‘Memory Lab’ themes. 
Our presentation (which includes a demonstration of the VR model and the motion capture it requires) takes the form of two closely linked papers that share a single abstract. The two papers will be given by two people, one of whom will be physically present in Utrecht, the other participating via Skype.
Abstract:
To analyse mechanotransduction resulting from tensile loading under defined conditions, various devices for in vitro cell stimulation have been developed. This work aimed to determine the strain distribution on the membrane of a commercially available device and its consistency with rising cycle numbers, as well as the amount of strain transferred to adherent cells. The strains and their behaviour within the stimulation device were determined using digital image correlation (DIC). The strain transferred to cells was measured on eGFP-transfected bone marrow-derived cells imaged with a fluorescence microscope. The analysis was performed by determining the coordinates of prominent positions on the cells, calculating vectors between the coordinates and their length changes with increasing applied tensile strain. The stimulation device was found to apply homogeneous (mean of standard deviations approx. 2% of mean strain) and reproducible strains in the central well area. However, on average, only half of the applied strain was transferred to the bone marrow-derived cells. Furthermore, the strain measured within the device increased significantly with an increasing number of cycles while the membrane's Young's modulus decreased, indicating permanent changes in the material during extended use. Thus, strain magnitudes do not match the system readout and results require careful interpretation, especially at high cycle numbers.
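The landmark-based strain calculation described above reduces, for any pair of tracked positions on a cell, to the relative length change of the vector between them. A minimal sketch with invented coordinates, not the study's analysis code:

```python
import math

def transferred_strain(p_before, q_before, p_after, q_after):
    """Engineering strain of the segment between two cell landmarks:
    (p_before, q_before) are their coordinates before loading,
    (p_after, q_after) after. Returns (L1 - L0) / L0."""
    l0 = math.dist(p_before, q_before)
    l1 = math.dist(p_after, q_after)
    return (l1 - l0) / l0

# A 10 % applied membrane strain that stretches the landmark pair by only
# 5 % mirrors the abstract's observation that roughly half the applied
# strain reaches the cell (coordinates invented for illustration).
applied = 0.10
measured = transferred_strain((0.0, 0.0), (10.0, 0.0), (0.0, 0.0), (10.5, 0.0))
ratio = measured / applied  # fraction of applied strain transferred
```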
Abstract:
Aurora, an illustrated novella, is a retelling of the classic fairytale Sleeping Beauty, set on the Australian coast around the grounds of the family lighthouse. Instead of following in the footsteps of tradition, this tale focuses on the long time Aurora is cursed to sleep by the malevolent Minerva; we follow Aurora as she voyages into the unconscious. Hunted by Minerva through the shifting landscape of her dreams, Aurora is dogged by a nagging pull towards the light—there is something she has left behind. Eventually, realising she must face Minerva to break the curse, they stage a battle of the minds in which Aurora triumphs, having grasped the power of her thoughts, her words. Aurora, an Australian fairytale, is a story of self-empowerment, the ability to shape destiny and the power of the mind. The exegesis examines a two-pronged question: is the illustrated book for young adults—graphic novel—relevant to a contemporary readership, and, is the graphic novel, where text and image intersect, a suitably specular genre in which to explore the unconscious? It establishes the language of the unconscious and the meaning of the term ‘graphic novel’, before investigating the place of the illustrated book for an older readership in a contemporary market, particularly exploring visual literacy and the way text and image—a hybrid narrative—work together. It then studies the aptitude of graphic literature for representing the unconscious and looks at two pioneers of the form: Audrey Niffenegger, specifically her visual novel The Three Incestuous Sisters, and Shaun Tan, specifically his graphic novel The Arrival. Finally, it reflects upon the creative work, Aurora, in light of three concerns: how best to develop a narrative able to relay the dreaming story; how to bestow a certain ‘Australianess’ upon the text and images; and the dilemma of designing an illustrated book for an older readership.
Abstract:
This research report documents work conducted by the Center for Transportation Research (CTR) at The University of Texas at Austin in analyzing the Joint Analysis using the Combined Knowledge (J.A.C.K.) program. This program was developed by the Texas Department of Transportation (TxDOT) to make projections of revenues and expenditures. This research effort was to span from September 2008 to August 2009, but the bulk of the work was completed and presented by December 2008. J.A.C.K. was subsequently renamed TRENDS, but for consistency with the scope of work, the original name is used throughout this report.
Abstract:
A trend in the design and implementation of modern industrial automation systems is to integrate computing, communication and control into a unified framework at different levels of machine/factory operations and information processing. These distributed control systems are referred to as networked control systems (NCSs). They are composed of sensors, actuators, and controllers interconnected over communication networks. As most communication networks are not designed for NCS applications, the communication requirements of NCSs may not be satisfied. For example, traditional control systems require data to be accurate, timely and lossless; however, because of random transmission delays and packet losses, the performance of a control system may deteriorate badly, and the system may even be rendered unstable. The main challenge of NCS design is to maintain and improve stable control performance, which requires both communication and control methodologies to be designed carefully. In recent decades, Ethernet and 802.11 networks have been introduced into control networks and have even replaced traditional fieldbus products in some real-time control applications, because of their high bandwidth and good interoperability. As Ethernet and 802.11 networks are not designed for distributed control applications, two aspects of NCS research need to be addressed to make these communication networks suitable for control systems in industrial environments. From the perspective of networking, communication protocols need to be designed to satisfy the communication requirements of NCSs, such as real-time communication and high-precision clock consistency. From the perspective of control, methods to compensate for network-induced delays and packet losses are important for NCS design.
To make Ethernet-based and 802.11 networks suitable for distributed control applications, this thesis develops a high-precision relative clock synchronisation protocol and an analytical model for analysing the real-time performance of 802.11 networks, and designs a new predictive compensation method. Firstly, a hybrid NCS simulation environment based on the NS-2 simulator is designed and implemented. Secondly, a high-precision relative clock synchronisation protocol is designed and implemented. Thirdly, transmission delays in 802.11 networks for soft-real-time control applications are modelled using a Markov chain in which real-time Quality-of-Service parameters are analysed under a periodic traffic pattern; the Markov chain model accurately captures the tradeoff between real-time performance and throughput performance. Furthermore, a cross-layer optimisation scheme, featuring application-layer flow rate adaptation, is designed to achieve the tradeoff between certain real-time and throughput performance characteristics in a typical NCS scenario with a wireless local area network. Fourthly, as a co-design approach for both the network and the controller, a new predictive compensation method for variable delay and packet loss in NCSs is designed, in which simultaneous end-to-end delays and packet losses during packet transmissions from sensors to actuators are tackled. The effectiveness of the proposed predictive compensation approach is demonstrated using our hybrid NCS simulation environment.
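The role of the Markov chain model in the delay analysis can be illustrated in miniature: once the chain's stationary distribution over network states is known, the expected per-packet delay is the stationary-weighted average of the per-state delays. The states, transition matrix and delay values below are invented for illustration and are far simpler than the thesis's 802.11 model.

```python
def stationary_distribution(P, iters=500):
    """Stationary distribution of an ergodic finite Markov chain by power
    iteration. P[i][j] is the transition probability from state i to j."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# Two invented medium-access states (e.g. light vs heavy contention)
# with per-state mean transmission delays in milliseconds.
P = [[0.9, 0.1],
     [0.5, 0.5]]
delays_ms = [2.0, 8.0]

pi = stationary_distribution(P)
expected_delay_ms = sum(p * d for p, d in zip(pi, delays_ms))
```

Here the chain spends 5/6 of its time in the light-contention state, giving an expected delay of 3 ms; a real-time requirement would then be checked against the delay distribution, not just this mean.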
Abstract:
The tear film plays an important role in preserving the health of the ocular surface and maintaining the optimal refractive power of the cornea. Moreover, dry eye syndrome, which is caused by abnormalities in the properties of the tear film, is one of the most commonly reported eye health problems. Current clinical tools to assess the tear film properties have shown certain limitations. The traditional invasive methods for the assessment of tear film quality, which are used by most clinicians, have been criticized for a lack of reliability and/or repeatability, and the range of non-invasive methods of tear assessment that has been investigated also presents limitations. Hence, no “gold standard” test is currently available to assess tear film integrity. Improving techniques for the assessment of tear film quality is therefore of clinical significance and is the main motivation for the work described in this thesis. In this study, tear film surface quality (TFSQ) changes were investigated by means of high-speed videokeratoscopy (HSV). In this technique, a set of concentric rings formed in an illuminated cone or bowl is projected onto the anterior cornea, and their reflection from the ocular surface is imaged on a charge-coupled device (CCD). The reflection of the light is produced in the outermost layer of the cornea, the tear film. Hence, when the tear film is smooth the reflected image presents a well-structured pattern; in contrast, when the tear film surface presents irregularities, the pattern also becomes irregular due to scatter and deviation of the reflected light. The videokeratoscope provides an estimate of the corneal topography associated with each Placido disk image. Topographical estimates, which have been used in the past to quantify tear film changes, may not always be suitable for the evaluation of all the dynamic phases of the tear film.
However, the Placido disk image itself, which contains the reflected pattern, may be more appropriate for assessing tear film dynamics. A set of novel routines has been purposely developed to quantify changes in the reflected pattern and to extract a time-series estimate of the TFSQ from the video recording. The routines extract from each frame of the video recording a maximized area of analysis, within which a metric of the TFSQ is calculated. Initially, two metrics based on Gabor filter and Gaussian gradient-based techniques were used to quantify the consistency of the pattern’s local orientation as a measure of TFSQ. These metrics have helped to demonstrate the applicability of HSV to assessing the tear film, and the influence of contact lens wear on TFSQ. The results suggest that the dynamic-area analysis method of HSV was able to distinguish and quantify the subtle but systematic degradation of tear film surface quality in the inter-blink interval during contact lens wear. It was also able to clearly show a difference between bare-eye and contact-lens-wearing conditions. Thus, the HSV method appears to be a useful technique for quantitatively investigating the effects of contact lens wear on the TFSQ. Subsequently, a larger clinical study was conducted to compare HSV with two other non-invasive techniques, lateral shearing interferometry (LSI) and dynamic wavefront sensing (DWS). Of these non-invasive techniques, the HSV appeared to be the most precise method for measuring TFSQ, by virtue of its lower coefficient of variation, while the LSI appeared to be the most sensitive method for analyzing the tear build-up time (TBUT). The capability of each of the non-invasive methods to discriminate dry eye from normal subjects was also investigated, and receiver operating characteristic (ROC) curves were calculated to assess the ability of each method to predict dry eye syndrome.
The LSI technique gave the best results under both natural and suppressed blinking conditions, followed closely by HSV; the DWS did not perform as well as LSI or HSV. The main limitation of the HSV technique identified during this clinical study was a lack of sensitivity for quantifying the build-up/formation phase of the tear film cycle. For that reason, an extra metric based on image transformation and block processing was proposed. In this metric, the area of analysis is transformed from Cartesian to polar coordinates, converting the concentric-circle pattern into a quasi-straight-line image from which a block statistics value is extracted. This metric has shown better sensitivity under low pattern disturbance and has also improved the performance of the ROC curves. Additionally, a theoretical study based on ray-tracing techniques and topographical models of the tear film was undertaken to fully comprehend the HSV measurement and the instrument’s potential limitations. Of special interest was the assessment of the instrument’s sensitivity to subtle topographic changes. The theoretical simulations have helped to provide some understanding of tear film dynamics; for instance, the model extracted for the build-up phase offers some insight into the dynamics of this initial phase. Finally, some aspects of the mathematical modeling of TFSQ time series are reported in this thesis. Over the years, different functions have been used to model such time series and to extract the key clinical parameters (i.e., timing). Unfortunately, those techniques do not simultaneously consider the underlying physiological mechanism and the parameter extraction methods, so a set of guidelines is proposed to meet both criteria.
Special attention was given to a commonly used fit, the polynomial function, and to considerations for selecting the appropriate model order so that the true derivative of the signal is accurately represented. The work described in this thesis has shown the potential of using high-speed videokeratoscopy to assess tear film surface quality. A set of novel image and signal processing techniques has been proposed to quantify different aspects of tear film assessment, analysis and modeling. The dynamic-area HSV has shown good performance across a broad range of conditions (i.e., contact lens, normal and dry eye subjects). As a result, this technique could become a useful clinical tool for assessing tear film surface quality.
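The polar-transform metric described in the abstract can be sketched in a few lines: resample the image onto an (r, θ) grid, where concentric rings become quasi-straight rows, then score pattern regularity with simple per-block statistics. This is a toy nearest-neighbour version with invented parameters, not the thesis's routines.

```python
import math

def to_polar(image, cx, cy, n_r, n_theta):
    """Nearest-neighbour resampling of a grayscale image (list of rows)
    onto an n_r x n_theta polar grid centred at (cx, cy). Concentric
    rings in the input become near-constant rows in the output."""
    h, w = len(image), len(image[0])
    r_max = min(cx, cy, w - 1 - cx, h - 1 - cy)
    polar = []
    for ir in range(n_r):
        r = (ir + 0.5) * r_max / n_r
        row = []
        for it in range(n_theta):
            th = 2.0 * math.pi * (it + 0.5) / n_theta
            x = int(round(cx + r * math.cos(th)))
            y = int(round(cy + r * math.sin(th)))
            row.append(image[y][x])
        polar.append(row)
    return polar

def pattern_disturbance(polar):
    """Mean per-row standard deviation: ~0 for a regular ring pattern,
    rising as the reflected pattern becomes irregular."""
    def std(xs):
        m = sum(xs) / len(xs)
        return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5
    return sum(std(row) for row in polar) / len(polar)
```

A perfectly regular (radially symmetric) pattern yields rows of constant intensity and a disturbance near zero; tear film break-up distorts the rings and inflates the per-row variation.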
Abstract:
Traffic oscillations are typical features of congested traffic flow that are characterized by recurring decelerations followed by accelerations (stop-and-go driving). The negative environmental impacts of these oscillations are widely accepted, but their impact on traffic safety has been debated. This paper describes the impact of freeway traffic oscillations on traffic safety. This study employs a matched case-control design using high-resolution traffic and crash data from a freeway segment. Traffic conditions prior to each crash were taken as cases, while traffic conditions during the same periods on days without crashes were taken as controls. These were also matched by presence of congestion, geometry and weather. A total of 82 cases and about 80,000 candidate controls were extracted from more than three years of data from 2004 to 2007. Conditional logistic regression models were developed based on the case-control samples. To verify consistency in the results, 20 different sets of controls were randomly extracted from the candidate pool for varying control-case ratios. The results reveal that the standard deviation of speed (thus, oscillations) is a significant variable, with an average odds ratio of about 1.08. This implies that the likelihood of a (rear-end) crash increases by about 8% with an additional unit increase in the standard deviation of speed. The average traffic states prior to crashes were less significant than the speed variations in congestion.
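The odds-ratio arithmetic behind the reported result is worth making explicit: a conditional logistic regression coefficient β maps to an odds ratio of exp(β) per one-unit increase in the predictor, and odds ratios multiply across units. The β below is back-derived from the abstract's reported odds ratio of about 1.08, purely for illustration; it is not taken from the paper's fitted model.

```python
import math

# Coefficient back-derived from the reported odds ratio of ~1.08 per
# one-unit increase in the standard deviation of speed (illustrative only).
beta = math.log(1.08)

odds_ratio_1 = math.exp(beta)            # odds ratio for a 1-unit increase
pct_increase = (odds_ratio_1 - 1) * 100  # ~8% higher crash odds per unit
odds_ratio_3 = math.exp(3 * beta)        # odds multiply: 1.08**3 for 3 units
```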
Abstract:
This protocol represents an attempt to assist in the instruction of teamwork assessment for first-year students across QUT. We anticipate that teaching staff will view this protocol as a generic resource for teamwork instruction, processes and evaluation. Teamwork has been acknowledged as a problematic practice at QUT, even as it features prominently amongst the graduate capabilities expected of all students at this institution. This protocol is not an extensive document on the complexities and dynamics of teamwork processes; rather, it presents a set of best-practice guidelines and recommendations to assist in team design, development, management, support and assessment. It is recommended that this protocol be progressively implemented across QUT, not only to attain consistency in the teaching of teamwork, but also to address the misconceptions and conflict around the importance of the teamwork experience. The authors acknowledge the extensive input and contributions of a Teamwork Steering Committee selected from academic staff and administrative members across the institution. We also welcome feedback and suggestions to fine-tune, and make more inclusive, those strategies that staff believe lead to optimal teamwork outcomes.